Probabilistic Methods for Inverse Problems

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Statistical Physics".

Deadline for manuscript submissions: closed (30 September 2018) | Viewed by 23694

Special Issue Editor


Prof. Dr. Ali Mohammad-Djafari
Guest Editor
Laboratoire des Signaux et Systèmes, CNRS, CentraleSupélec, Université Paris-Saclay, 3 Rue Joliot-Curie, 91192 Gif-sur-Yvette, France
Interests: inference; inverse problems; Bayesian computation; information and maximum entropy; knowledge extraction

Special Issue Information

Dear Colleagues,

Inverse problems arise in many applications. Whatever the domain, once the unknown quantities we want to infer, the quantities we can measure, and the mathematical relations linking them have been identified, the problem becomes one of inference. Deterministic regularization methods have been developed and used successfully, but two main difficulties remain: how to choose and weight the different criteria, and how to quantify the uncertainties. In the last three decades, probabilistic methods, and in particular the Bayesian approach, have shown their efficiency. The focus of this Special Issue is original papers on these probabilistic methods in which their real advantages over regularization methods are demonstrated. Papers with real applications in areas such as biological and medical imaging, industrial nondestructive testing, and radio-astronomical and geophysical imaging are preferred.

Prof. Dr. Ali Mohammad-Djafari
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles as well as short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Inverse problems
  • Bayesian inference
  • Approximate Bayesian computation
  • Variational methods
  • Variational Bayesian approximation
  • Expectation propagation
  • Uncertainty quantification
  • MCMC
  • Computed tomography
  • Medical imaging
  • Nondestructive industrial imaging

Published Papers (6 papers)


Research

24 pages, 2071 KiB  
Article
Bayesian 3D X-ray Computed Tomography with a Hierarchical Prior Model for Sparsity in Haar Transform Domain
by Li Wang, Ali Mohammad-Djafari, Nicolas Gac and Mircea Dumitru
Entropy 2018, 20(12), 977; https://0-doi-org.brum.beds.ac.uk/10.3390/e20120977 - 16 Dec 2018
Cited by 2 | Viewed by 3240
Abstract
In this paper, a hierarchical prior model based on the Haar transformation and an appropriate Bayesian computational method for X-ray CT reconstruction are presented. Given the piecewise-continuous property of the object, a multilevel Haar transformation is used to obtain a sparse representation of the object. The sparse structure is enforced via a generalized Student-t distribution (Stg), expressed as the marginal of a normal-inverse Gamma distribution. The proposed model and corresponding algorithm are designed to adapt to specific 3D data sizes and to be used in both medical and industrial Non-Destructive Testing (NDT) applications. In the proposed Bayesian method, a hierarchically structured prior model is used, and its parameters are estimated iteratively. The iterative algorithm is initialized with the parameters of the prior distributions; a novel initialization strategy is presented and validated experimentally. We compare the proposed method with two state-of-the-art approaches, showing that our method achieves better reconstruction performance when fewer projections are considered and when projections are acquired from limited angles. Full article
(This article belongs to the Special Issue Probabilistic Methods for Inverse Problems)
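As a toy illustration of why the Haar domain is a natural place to impose sparsity here (a sketch only, not the paper's 3D pipeline or its Stg prior): a multilevel orthonormal Haar transform of a piecewise-constant signal concentrates almost all of the signal in a handful of coefficients.

```python
import numpy as np

def haar_level(x):
    """One level of the orthonormal Haar transform: approximations and details."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    return a, d

def multilevel_haar(x, levels):
    """Multilevel Haar decomposition; returns final approximation and all detail bands."""
    details, a = [], x
    for _ in range(levels):
        a, d = haar_level(a)
        details.append(d)
    return a, details

# A piecewise-constant "object": detail coefficients vanish away from the jumps.
x = np.concatenate([np.full(32, 1.0), np.full(32, 3.0), np.full(64, -2.0)])
a, details = multilevel_haar(x, levels=4)
coeffs = np.concatenate([a] + details)
sparsity = np.mean(np.abs(coeffs) < 1e-12)
print(f"fraction of zero coefficients: {sparsity:.2f}")
```

Enforcing a heavy-tailed prior on these mostly-zero detail coefficients is what drives the sparse reconstruction.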

28 pages, 880 KiB  
Article
Random Finite Set Based Parameter Estimation Algorithm for Identifying Stochastic Systems
by Peng Wang, Ge Li, Yong Peng and Rusheng Ju
Entropy 2018, 20(8), 569; https://0-doi-org.brum.beds.ac.uk/10.3390/e20080569 - 31 Jul 2018
Cited by 2 | Viewed by 3046
Abstract
Parameter estimation is one of the key technologies for system identification, and Bayesian parameter estimation algorithms are very important for identifying stochastic systems. In this paper, a random finite set based algorithm is proposed to overcome the disadvantages of existing Bayesian parameter estimation algorithms. It can estimate the unknown parameters of a stochastic system consisting of a varying number of constituent elements from measurements disturbed by false detections, missed detections, and noise. The models used for parameter estimation are constructed using random finite sets. Based on the proposed system and measurement models, the key principles and formula derivations of the algorithm are detailed. The implementation is then presented using a sequential Monte Carlo based Probability Hypothesis Density (PHD) filter and simulated-tempering based importance sampling. Finally, experiments on the estimation of systematic errors of multiple sensors demonstrate the main advantages of the proposed algorithm, and a sensitivity analysis is carried out to further study its mechanism. The experimental results verify the superiority of the proposed algorithm. Full article
(This article belongs to the Special Issue Probabilistic Methods for Inverse Problems)
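The full RFS/PHD machinery is beyond a short sketch, but the underlying idea, Bayesian parameter estimation from measurements that mix true detections with clutter, can be illustrated with a minimal importance sampler. All model choices below (detection probability, clutter window, Gaussian prior) are illustrative assumptions, not the paper's models.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated scans: with prob. 0.8 a detection of the biased target, else clutter.
true_bias, p_detect = 2.0, 0.8
n_meas = 200
is_target = rng.random(n_meas) < p_detect
y = np.where(is_target, true_bias + rng.normal(0, 1, n_meas),
             rng.uniform(-10, 10, n_meas))

def log_likelihood(b):
    """Mixture likelihood: target-originated Gaussian vs. uniform clutter on [-10, 10]."""
    gauss = p_detect * np.exp(-0.5 * (y - b) ** 2) / np.sqrt(2 * np.pi)
    clutter = (1 - p_detect) / 20.0
    return np.sum(np.log(gauss + clutter))

# Importance sampling with the prior N(0, 5^2) as proposal.
samples = rng.normal(0, 5, 5000)
logw = np.array([log_likelihood(b) for b in samples])
w = np.exp(logw - logw.max())
w /= w.sum()
posterior_mean = np.sum(w * samples)
print(f"posterior mean bias: {posterior_mean:.2f}")
```

The mixture likelihood is what lets the estimator remain consistent despite false detections; the RFS formulation generalizes this to sets of targets of unknown, varying cardinality.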

24 pages, 821 KiB  
Article
Stochastic Proximal Gradient Algorithms for Multi-Source Quantitative Photoacoustic Tomography
by Simon Rabanser, Lukas Neumann and Markus Haltmeier
Entropy 2018, 20(2), 121; https://doi.org/10.3390/e20020121 - 11 Feb 2018
Cited by 9 | Viewed by 4188
Abstract
The development of accurate and efficient image reconstruction algorithms is a central aspect of quantitative photoacoustic tomography (QPAT). In this paper, we address these issues for multi-source QPAT using the radiative transfer equation (RTE) as an accurate model for light transport. The tissue parameters are jointly reconstructed from the acoustic data measured for each of the applied sources. We develop stochastic proximal gradient methods for multi-source QPAT, which are more efficient than standard proximal gradient methods, in which a single iterative update has complexity proportional to the number of applied sources. Additionally, we introduce a completely new formulation of QPAT as a multilinear (MULL) inverse problem, which avoids explicitly solving the RTE. The MULL formulation of QPAT is again addressed with stochastic proximal gradient methods, and numerical results for both approaches are presented. Besides the introduction of stochastic proximal gradient algorithms to QPAT, we consider the new MULL formulation to be the main contribution of this paper. Full article
(This article belongs to the Special Issue Probabilistic Methods for Inverse Problems)
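A minimal sketch of the stochastic proximal gradient idea for a multi-source problem (a toy linear forward model with an l1 prior, not the RTE-based QPAT model): each iteration takes a gradient step on the misfit of one randomly chosen source only, so its cost is independent of the number of sources, and then applies the proximal (soft-thresholding) operator.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy multi-source linear model: one forward operator A_s and one data set y_s per source.
n_sources, m, n = 8, 30, 20
x_true = np.zeros(n)
x_true[[2, 7, 13]] = [1.0, -2.0, 0.5]
A = [rng.normal(size=(m, n)) for _ in range(n_sources)]
y = [As @ x_true for As in A]

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

lam, step = 0.05, 1e-3
x = np.zeros(n)
for _ in range(20000):
    s = rng.integers(n_sources)                       # pick one source at random
    grad = A[s].T @ (A[s] @ x - y[s])                 # gradient of that source's misfit only
    x = soft_threshold(x - step * grad, step * lam)   # proximal (l1) step

print(f"reconstruction error: {np.linalg.norm(x - x_true):.3f}")
```

A full (non-stochastic) proximal gradient step would sum the gradients over all sources, multiplying the per-iteration cost by the number of sources.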

35 pages, 2860 KiB  
Article
An Auxiliary Variable Method for Markov Chain Monte Carlo Algorithms in High Dimension
by Yosra Marnissi, Emilie Chouzenoux, Amel Benazza-Benyahia and Jean-Christophe Pesquet
Entropy 2018, 20(2), 110; https://0-doi-org.brum.beds.ac.uk/10.3390/e20020110 - 07 Feb 2018
Cited by 17 | Viewed by 3911
Abstract
In this paper, we are interested in Bayesian inverse problems where either the data fidelity term or the prior distribution is Gaussian or derived from a hierarchical Gaussian model. Generally, Markov chain Monte Carlo (MCMC) algorithms allow us to generate sets of samples that are employed to infer some relevant parameters of the underlying distributions. However, when the parameter space is high-dimensional, the performance of stochastic sampling algorithms is very sensitive to existing dependencies between parameters. In particular, this problem arises when one aims to sample from a high-dimensional Gaussian distribution whose covariance matrix does not present a simple structure. Another challenge is the design of Metropolis–Hastings proposals that make use of information about the local geometry of the target density in order to speed up the convergence and improve mixing properties in the parameter space, while not being too computationally expensive. These two contexts are mainly related to the presence of two heterogeneous sources of dependencies stemming either from the prior or the likelihood, in the sense that the related covariance matrices cannot be diagonalized in the same basis. In this work, we address these two issues. Our contribution consists of adding auxiliary variables to the model in order to dissociate the two sources of dependencies. In the new augmented space, only one source of correlation remains directly related to the target parameters, the other sources of correlation being captured by the auxiliary variables. Experiments are conducted on two practical image restoration problems, namely the recovery of multichannel blurred images embedded in Gaussian noise and the recovery of a signal corrupted by mixed Gaussian noise. Experimental results indicate that adding the proposed auxiliary variables makes the sampling problem simpler, since the new conditional distribution no longer contains highly heterogeneous correlations. Thus, the computational cost of each iteration of the Gibbs sampler is significantly reduced while good mixing properties are maintained. Full article
(This article belongs to the Special Issue Probabilistic Methods for Inverse Problems)
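As a toy illustration of the auxiliary-variable idea (data augmentation that preserves the target marginal while simplifying the conditionals; this is not the paper's model), consider sampling x with marginal N(0, sigma2 + tau2) by introducing u with u ~ N(0, tau2) and x | u ~ N(u, sigma2). A two-step Gibbs sampler on (x, u) then has simple Gaussian conditionals yet reproduces the correct marginal of x.

```python
import numpy as np

rng = np.random.default_rng(0)
sigma2, tau2 = 1.0, 3.0            # x | u ~ N(u, sigma2),  u ~ N(0, tau2)
cond_var = sigma2 * tau2 / (sigma2 + tau2)   # variance of u | x, by conjugacy

x, u = 0.0, 0.0
samples = []
for k in range(60000):
    # Both conditionals are simple Gaussians, unlike a direct correlated sampler.
    u = rng.normal(x * tau2 / (sigma2 + tau2), np.sqrt(cond_var))
    x = rng.normal(u, np.sqrt(sigma2))
    if k >= 1000:                  # discard burn-in
        samples.append(x)

# The marginal of x is N(0, sigma2 + tau2) = N(0, 4).
print(f"empirical variance of x: {np.var(samples):.2f}")
```

The paper's contribution is of this flavor in high dimension: the auxiliary variables absorb one source of correlation so that each Gibbs conditional is cheap to sample.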

14 pages, 1896 KiB  
Article
Particle Swarm Optimization and Uncertainty Assessment in Inverse Problems
by José L. G. Pallero, María Zulima Fernández-Muñiz, Ana Cernea, Óscar Álvarez-Machancoses, Luis Mariano Pedruelo-González, Sylvain Bonvalot and Juan Luis Fernández-Martínez
Entropy 2018, 20(2), 96; https://0-doi-org.brum.beds.ac.uk/10.3390/e20020096 - 30 Jan 2018
Cited by 18 | Viewed by 4244
Abstract
Most inverse problems in industry (and particularly in geophysical exploration) are highly underdetermined, because the number of model parameters is too high to achieve accurate data predictions and because the sampling of the data space is scarce, incomplete, and always affected by different kinds of noise. Additionally, the physics of the forward problem is a simplification of reality. All these facts mean that the inverse-problem solution is not unique; that is, there are different inverse solutions (called equivalent models), compatible with the prior information, that fit the observed data within similar error bounds. In the case of nonlinear inverse problems, these equivalent models are located in disconnected flat curvilinear valleys of the cost-function topography. Uncertainty analysis consists of obtaining a representation of this complex topography via different sampling methodologies. In this paper, we focus on the use of a particle swarm optimization (PSO) algorithm to sample the region of equivalence in nonlinear inverse problems. Although the methodology is general purpose, we show its application to the uncertainty assessment of the solution of a geophysical problem concerning gravity inversion in sedimentary basins, showing that it is possible to perform this task efficiently in a sampling-while-optimizing mode. In particular, we explain how to use and analyze the geophysical models sampled by exploratory PSO family members to infer different descriptors of nonlinear uncertainty. Full article
(This article belongs to the Special Issue Probabilistic Methods for Inverse Problems)
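A minimal sketch of the sampling-while-optimizing idea with a standard PSO (a toy 2D misfit with a flat valley of equivalent models; all parameters are illustrative, and this is not the paper's geophysical setup): besides tracking the best model, every evaluated model whose misfit falls below a tolerance is kept as a sample of the region of equivalence.

```python
import numpy as np

rng = np.random.default_rng(2)

def misfit(m):
    """Toy nonlinear misfit with a flat valley: any point on the unit circle fits the data."""
    return (m[:, 0] ** 2 + m[:, 1] ** 2 - 1.0) ** 2

# Standard PSO with inertia w and cognitive/social weights c1, c2.
n_particles, n_iter, w, c1, c2 = 30, 100, 0.7, 1.5, 1.5
pos = rng.uniform(-3, 3, size=(n_particles, 2))
vel = np.zeros_like(pos)
pbest, pbest_f = pos.copy(), misfit(pos)
gbest = pbest[np.argmin(pbest_f)]
equivalent = []                                  # models sampled below tolerance

for _ in range(n_iter):
    r1, r2 = rng.random((2, n_particles, 1))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    f = misfit(pos)
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = pos[improved], f[improved]
    gbest = pbest[np.argmin(pbest_f)]
    equivalent.extend(pos[f < 0.01])             # sampling-while-optimizing

print(f"best misfit: {misfit(gbest[None])[0]:.4f}; equivalent models kept: {len(equivalent)}")
```

The spread of the collected `equivalent` models (here, points scattered along the unit circle) is what feeds the uncertainty descriptors, rather than the single best model.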

648 KiB  
Article
An Improved Chaotic Optimization Algorithm Applied to a DC Electrical Motor Modeling
by Simone Fiori and Ruben Di Filippo
Entropy 2017, 19(12), 665; https://0-doi-org.brum.beds.ac.uk/10.3390/e19120665 - 04 Dec 2017
Cited by 11 | Viewed by 4355
Abstract
The chaos-based optimization algorithm (COA) is a method to optimize possibly nonlinear complex functions of several variables by chaos search. The main innovation behind the chaos-based optimization algorithm is to generate chaotic trajectories by means of nonlinear, discrete-time dynamical systems to explore the search space while looking for the global minimum of a complex criterion function. The aim of the present research is to investigate the numerical properties of the COA, both on complex optimization test functions from the literature and on a real-world problem, to contribute to the understanding of its global-search features. In addition, the present research suggests a refinement of the original COA algorithm to improve its optimization performance. In particular, the real-world optimization problem tackled within the paper is the estimation of six electro-mechanical parameters of a model of a direct-current (DC) electrical motor. A large number of test results show that the algorithm achieves excellent numerical precision at little computational expense, which appears extremely limited compared to the complexity of other benchmark optimization algorithms, namely the genetic algorithm and the simulated annealing algorithm. Full article
(This article belongs to the Special Issue Probabilistic Methods for Inverse Problems)
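A sketch of the basic global chaotic search underlying COA (the first, exploratory stage only; the paper's refinement is not reproduced here), using the logistic map as the chaotic generator: the map's iterates densely explore (0, 1), are rescaled to the search interval, and the best visited point is retained.

```python
import numpy as np

def logistic_map(z):
    """Chaotic generator on (0, 1): the logistic map with r = 4."""
    return 4.0 * z * (1.0 - z)

def chaos_search(f, lo, hi, n_iter=20000, z0=0.37):
    """First-stage COA sketch: scan the interval along a chaotic trajectory, keep the best."""
    z, best_x, best_f = z0, None, np.inf
    for _ in range(n_iter):
        z = logistic_map(z)
        x = lo + (hi - lo) * z          # map the chaotic state into the search range
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, best_f

# Multimodal test function with global minimum f = 0 at x = 0.
f = lambda x: x ** 2 + 10.0 * (1.0 - np.cos(2.0 * np.pi * x))
x_star, f_star = chaos_search(f, -5.0, 5.0)
print(f"found x = {x_star:.3f}, f = {f_star:.4f}")
```

The ergodicity of the chaotic trajectory plays the role that random restarts play in other global-search schemes; a second, local stage is usually added to polish the best point found.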
