Reprint

Divergence Measures

Mathematical Foundations and Applications in Information-Theoretic and Statistical Problems

Edited by
May 2022
256 pages
  • ISBN 978-3-0365-4332-1 (Hardback)
  • ISBN 978-3-0365-4331-4 (PDF)

This book is a reprint of the Special Issue “Divergence Measures: Mathematical Foundations and Applications in Information-Theoretic and Statistical Problems” that was published in Entropy.

Summary

Data science, information theory, probability theory, statistical learning, and other related disciplines greatly benefit from non-negative measures of dissimilarity between pairs of probability measures. These are known as divergence measures, and exploring their mathematical foundations and diverse applications is of significant interest. The present Special Issue, entitled “Divergence Measures: Mathematical Foundations and Applications in Information-Theoretic and Statistical Problems”, includes eight original contributions and is focused on the mathematical properties and applications of classical and generalized divergence measures from an information-theoretic perspective. It mainly deals with two key generalizations of the relative entropy: the Rényi divergence and the important class of f-divergences. We hope that readers will find this Special Issue of interest and that it will stimulate further research into the mathematical foundations and applications of divergence measures.
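
For readers who would like a concrete handle on the quantities named above, the short sketch below computes the relative entropy D(P||Q), the Rényi divergence of order α, and the χ²-divergence (the f-divergence generated by f(t) = (t − 1)²) for two discrete distributions. It is a minimal illustration added alongside this summary rather than material from the Special Issue; the function names and the example distributions p and q are hypothetical choices.

    import numpy as np

    def relative_entropy(p, q):
        # Kullback-Leibler divergence D(p||q) for discrete distributions, in nats.
        p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
        mask = p > 0  # terms with p_i = 0 contribute 0 by convention
        return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

    def renyi_divergence(p, q, alpha):
        # Renyi divergence of order alpha (alpha > 0, alpha != 1);
        # it recovers D(p||q) in the limit alpha -> 1.
        p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
        return float(np.log(np.sum(p**alpha * q**(1.0 - alpha))) / (alpha - 1.0))

    def chi_squared_divergence(p, q):
        # chi^2 divergence, the f-divergence with f(t) = (t - 1)^2.
        p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
        return float(np.sum((p - q) ** 2 / q))

    # Hypothetical example distributions on a three-point alphabet.
    p = np.array([0.5, 0.3, 0.2])
    q = np.array([0.4, 0.4, 0.2])
    print(relative_entropy(p, q), renyi_divergence(p, q, 2.0), chi_squared_divergence(p, q))

All three quantities are non-negative and vanish if and only if p = q, which is the defining property of a divergence measure referred to in the summary.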

Format
  • Hardback
License
© 2022 by the authors; CC BY-NC-ND license
Keywords
Bregman divergence; f-divergence; Jensen–Bregman divergence; Jensen diversity; Jensen–Shannon divergence; capacitory discrimination; Jensen–Shannon centroid; mixture family; information geometry; difference of convex (DC) programming; conditional Rényi divergence; horse betting; Kelly gambling; Rényi divergence; Rényi mutual information; relative entropy; chi-squared divergence; method of types; large deviations; strong data-processing inequalities; information contraction; maximal correlation; Markov chains; information inequalities; mutual information; Rényi entropy; Carlson–Levin inequality; information measures; hypothesis testing; total variation; skew-divergence; convexity; Pinsker’s inequality; Bayes risk; statistical divergences; minimum divergence estimator; maximum likelihood; bootstrap; conditional limit theorem; Bahadur efficiency; α-mutual information; Augustin–Csiszár mutual information; data transmission; error exponents; dimensionality reduction; discriminant analysis; statistical inference