Algorithms for Nonparametric Estimation

A special issue of Algorithms (ISSN 1999-4893). This special issue belongs to the section "Algorithms for Multidisciplinary Applications".

Deadline for manuscript submissions: closed (30 July 2020) | Viewed by 5335

Special Issue Editor


Prof. Dr. Adam Krzyzak
Guest Editor
Department of Computer Science and Software Engineering, Concordia University, Montreal, QC H3G 1M8, Canada
Interests: nonparametric function estimation, including density and regression estimation; nonparametric classification; nonparametric modeling; multivariate analysis; nonparametric Bayes methods and decision procedures; semiparametric models and procedures; statistical methods for imaging and tomography; statistical inverse problems; statistical algorithms and machine learning; applications in image processing and computer vision, finance, bioinformatics, and medicine

Special Issue Information

Dear Colleagues,

We invite you to submit your latest research on the development of algorithms for nonparametric estimation to this Special Issue, “Algorithms for Nonparametric Estimation”. We are looking for new and innovative approaches for solving nonparametric estimation problems exactly or approximately. High-quality papers are solicited that address both theoretical and practical issues of nonparametric estimation. Submissions are welcome both for traditional nonparametric estimation problems and for new applications. Potential topics include, but are not limited to, algorithms for nonparametric density and regression estimation, nonparametric classification and decision procedures, nonparametric modeling, deep learning, dimensionality reduction and variable selection, and algorithms for nonparametric techniques in image processing, computer vision, financial statistics, bioinformatics, and medical applications.

Prof. Dr. Adam Krzyzak
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once registered, proceed to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Algorithms is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Nonparametric Regression and Density Estimation
  • Rank and Distribution-free Procedures
  • Multivariate Analysis
  • Deep Learning Algorithms
  • Dimension Reduction
  • Inverse Problems
  • Image Processing and Computer Vision
  • Algorithms for Financial Statistics
  • Algorithms for Bioinformatics and Medical Applications
  • Complexity Analysis for Nonparametric Estimation Algorithms

Published Papers (2 papers)


Research

24 pages, 1660 KiB  
Article
Variational Multiscale Nonparametric Regression: Algorithms and Implementation
by Miguel del Alamo, Housen Li, Axel Munk and Frank Werner
Algorithms 2020, 13(11), 296; https://0-doi-org.brum.beds.ac.uk/10.3390/a13110296 - 13 Nov 2020
Cited by 1 | Viewed by 2316
Abstract
Many modern statistically efficient methods come with tremendous computational challenges, often leading to large-scale optimisation problems. In this work, we examine such computational issues for recently developed estimation methods in nonparametric regression, with a specific view towards image denoising. We consider in particular certain variational multiscale estimators, which are statistically optimal in the minimax sense yet computationally intensive. Such an estimator is computed as the minimiser of a smoothness functional (e.g., the TV norm) over the class of all estimators such that none of its coefficients with respect to a given multiscale dictionary is statistically significant. The resulting multiscale Nemirovski-Dantzig estimator (MIND) can incorporate any convex smoothness functional and combine it with a suitable dictionary, including wavelets, curvelets and shearlets. Computing MIND in general requires solving a high-dimensional constrained convex optimisation problem, with a specific structure of the constraints induced by the statistical multiscale testing criterion. To solve it explicitly, we discuss three algorithmic approaches: the Chambolle-Pock, ADMM and semismooth Newton algorithms. Algorithmic details and an explicit implementation are presented, and the solutions are then compared numerically in a simulation study and on various test images. We thereby recommend the Chambolle-Pock algorithm in most cases for its fast convergence. We stress that our analysis can also be transferred to signal recovery and other denoising problems to recover more general objects whenever it is possible to borrow statistical strength from data patches of similar object structure.
(This article belongs to the Special Issue Algorithms for Nonparametric Estimation)
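The Chambolle-Pock algorithm recommended in this abstract can be illustrated in a much simpler setting than the constrained MIND problem: penalized TV denoising (the classical ROF model). The following NumPy sketch shows the characteristic iteration (dual ascent with projection, primal proximal step, extrapolation); the function names, step sizes, and the penalized rather than constrained formulation are illustrative choices, not taken from the paper.

```python
import numpy as np

def grad(u):
    """Forward-difference gradient with periodic boundary."""
    gx = np.roll(u, -1, axis=0) - u
    gy = np.roll(u, -1, axis=1) - u
    return np.stack([gx, gy])

def div(p):
    """Discrete divergence, the negative adjoint of grad."""
    dx = p[0] - np.roll(p[0], 1, axis=0)
    dy = p[1] - np.roll(p[1], 1, axis=1)
    return dx + dy

def chambolle_pock_tv(f, lam=10.0, n_iter=200):
    """Primal-dual iterations for min_u lam/2 ||u - f||^2 + TV(u)."""
    u = f.copy()
    u_bar = u.copy()
    p = np.zeros((2,) + f.shape)
    tau = sigma = 1.0 / np.sqrt(8.0)  # step sizes; ||grad||^2 <= 8
    for _ in range(n_iter):
        # dual ascent, then pixelwise projection onto the unit ball (TV dual)
        p += sigma * grad(u_bar)
        p /= np.maximum(1.0, np.sqrt(p[0] ** 2 + p[1] ** 2))
        # primal descent plus proximal step of the quadratic data term
        u_old = u
        u = (u + tau * div(p) + tau * lam * f) / (1.0 + tau * lam)
        # extrapolation step characteristic of Chambolle-Pock
        u_bar = 2.0 * u - u_old
    return u
```

In the paper the smoothness functional is minimized subject to multiscale dictionary constraints rather than a quadratic penalty, which changes the proximal operators but not the overall primal-dual structure sketched here.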

20 pages, 11681 KiB  
Article
Nonparametric Estimation of Continuously Parametrized Families of Probability Density Functions—Computational Aspects
by Wojciech Rafajłowicz
Algorithms 2020, 13(7), 164; https://0-doi-org.brum.beds.ac.uk/10.3390/a13070164 - 08 Jul 2020
Cited by 1 | Viewed by 2614
Abstract
We consider a rather general problem of nonparametric estimation of an uncountable set of probability density functions (p.d.f.’s) of the form f(x; r), where r is a non-random real variable ranging from R_1 to R_2. We put emphasis on the algorithmic aspects of this problem, since they are crucial for the exploratory analysis of the big data needed for the estimation. A specialized learning algorithm, based on the 2D FFT, is proposed and tested on observations that allow for estimating the p.d.f.’s of jet engine temperatures as a function of rotation speed. We also derive theoretical results concerning the convergence of the estimation procedure, which contain hints on selecting the parameters of the estimation algorithm.
(This article belongs to the Special Issue Algorithms for Nonparametric Estimation)
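The 2D-FFT-based smoothing this abstract refers to can be sketched generically as bivariate kernel density estimation computed by binning the paired observations and convolving with a Gaussian kernel in the Fourier domain. This is a standard FFT-KDE construction, not the paper's exact procedure; the function name, grid size, and bandwidths below are illustrative assumptions.

```python
import numpy as np

def kde_fft_2d(x, r, grid_size=128, bandwidth=(0.1, 0.1)):
    """Bivariate kernel density estimate via binning + FFT convolution.

    x, r : 1D arrays of paired observations (e.g. a temperature reading
           and the rotation speed at which it was taken).
    Returns the density estimate on the grid and the bin edges.
    """
    hx, hr = bandwidth
    # histogram the data on a regular grid (normalized to a density)
    H, x_edges, r_edges = np.histogram2d(x, r, bins=grid_size, density=True)
    dx = x_edges[1] - x_edges[0]
    dr = r_edges[1] - r_edges[0]
    # Fourier transform of a Gaussian kernel: exp(-2 pi^2 h^2 f^2)
    fx = np.fft.fftfreq(grid_size, d=dx)
    fr = np.fft.fftfreq(grid_size, d=dr)
    Kx = np.exp(-2.0 * (np.pi * hx * fx) ** 2)
    Kr = np.exp(-2.0 * (np.pi * hr * fr) ** 2)
    # smoothing = multiplication in the Fourier domain (one 2D FFT pair)
    F = np.fft.fft2(H) * np.outer(Kx, Kr)
    est = np.real(np.fft.ifft2(F))
    # clip tiny negative values caused by floating-point round-off
    return np.clip(est, 0.0, None), x_edges, r_edges
```

This replaces an O(n * m^2) direct kernel sum over an m x m grid with an O(m^2 log m) transform after binning, which is what makes FFT-based estimators attractive for big data; note the convolution here is circular, so data near the grid boundary wraps around.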
