Entropy

Entropy, Information Theory, Information Geometry and Bayesian Inference in Data, Signal and Image Processing and Inverse Problems
Ali Mohammad-Djafari [1]

[1] Laboratoire des Signaux et Systèmes, UMR 8506 CNRS-SUPELEC-UNIV PARIS SUD, SUPELEC, Plateau de Moulon, 3 rue Joliot-Curie, 91192 Gif-sur-Yvette, France
Keywords: Bayes; Laplace; entropy; Bayesian inference; maximum entropy principle; information theory; Kullback-Leibler divergence; Fisher information; geometrical science of information; inverse problems
DOI: 10.3390/e17063989
Source: MDPI
【 Abstract 】
This review article first surveys the main inference tools built on Bayes' rule, the maximum entropy principle (MEP), information theory, relative entropy and the Kullback-Leibler (KL) divergence, and Fisher information together with its corresponding geometries. For each of these tools, the precise context of its use is described. The second part of the paper focuses on how these tools have been applied in data, signal and image processing and in the inverse problems that arise in different physical sciences and engineering applications. A few example applications are described: entropy in independent component analysis (ICA) and blind source separation, Fisher information in data model selection, different maximum entropy-based methods in time series spectral estimation and in linear inverse problems, and, finally, Bayesian inference for general inverse problems. Some original material concerning approximate Bayesian computation (ABC) and, in particular, the variational Bayesian approximation (VBA) methods is also presented. VBA is proposed as an alternative Bayesian computational tool to the classical Markov chain Monte Carlo (MCMC) methods. We will also see that VBA encompasses joint maximum a posteriori (MAP) and expectation-maximization (EM)-type algorithms.
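To make the core quantities named in the abstract concrete, the following is a minimal sketch (not from the article itself) of Shannon entropy, the Kullback-Leibler divergence, and Bayes' rule on finite discrete distributions; the particular four-state distributions used are illustrative assumptions:

```python
import math

def entropy(p):
    """Shannon entropy H(p) = -sum_i p_i * log(p_i), in nats."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """Relative entropy KL(p || q) = sum_i p_i * log(p_i / q_i).
    Assumes q_i > 0 wherever p_i > 0."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def bayes_posterior(prior, likelihood):
    """Bayes' rule: posterior proportional to prior * likelihood, normalized."""
    unnorm = [p * l for p, l in zip(prior, likelihood)]
    z = sum(unnorm)  # evidence (normalizing constant)
    return [u / z for u in unnorm]

# The uniform distribution maximizes entropy on a finite alphabet (MEP
# with no constraints beyond normalization): H = log(4) here.
uniform = [0.25] * 4
p = [0.7, 0.1, 0.1, 0.1]  # a more concentrated distribution, lower entropy

print(entropy(uniform))           # log(4) ~ 1.3863 nats, the maximum
print(entropy(p))                 # strictly smaller
print(kl_divergence(p, uniform))  # equals log(4) - H(p) against the uniform
print(bayes_posterior(uniform, [0.9, 0.05, 0.03, 0.02]))
```

Note the identity visible in the last comparison: against a uniform reference, the KL divergence reduces to log(N) - H(p), which is one way the article's entropy and relative-entropy viewpoints connect.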
【 License 】
CC BY
© 2015 by the authors; licensee MDPI, Basel, Switzerland
【 Preview 】
| Files | Size | Format | View |
|---|---|---|---|
| RO202003190011089ZK.pdf | 347 KB | PDF | download |