
Information Theory and Information Geometry in Dynamical Systems and Machine Learning

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Information Theory, Probability and Statistics".

Deadline for manuscript submissions: closed (30 December 2022) | Viewed by 3132

Special Issue Editor


Dr. Michal Branicki
Guest Editor
1. Department of Mathematics, University of Edinburgh, Edinburgh, UK
2. The Alan Turing Institute for Data Science, London, UK
Interests: information theory and stochastics for uncertainty quantification in prediction of partially observed dynamical systems; stochastic filtering and data assimilation in high-dimensional dynamical systems; information geometry and mathematical foundations of machine learning

Special Issue Information

Dear Colleagues,

Information theory and the evolution of its entropic functionals have long been known to be crucial for understanding important properties of dynamical systems, and for providing deep links to ergodic theory and statistical physics. The analysis of dynamical systems through the prism of information-bearing processes and symbolic dynamics enabled a rigorous description of the entropic complexity and information content of the underlying dynamics. The flow of information between components of a dynamical system allows one to study its predictability and dynamic uncertainty, while quantifying the loss of information provides important tools for constructing coarse-grained approximations.
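As a concrete illustration (an aside added for context; the notation is standard and not taken from this call), the directed flow of information from a component X to a component Y of a dynamical system can be quantified by the transfer entropy, here in its simplest first-order form:

T_{X \to Y} \;=\; I(Y_{t+1};\, X_t \mid Y_t) \;=\; H(Y_{t+1} \mid Y_t) \;-\; H(Y_{t+1} \mid Y_t, X_t),

i.e., the reduction in uncertainty about the next state of Y obtained from the current state of X, beyond what the past of Y already provides; a vanishing transfer entropy indicates no directed information flow at this order.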

Information geometry, with its inherent links to information theory on manifolds of probability measures, has gained importance in the systematic analysis of statistical estimation, time series analysis, machine learning, signal processing, and optimization. Importantly, a number of these themes can also be approached through the theory of dynamical systems.
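To fix ideas (another illustrative aside, with notation ours rather than the editors'): on a parametric family of densities p(x; θ), the Riemannian structure underlying information geometry is the Fisher–Rao metric

g_{ij}(\theta) \;=\; \mathbb{E}_\theta\!\left[\,\partial_{\theta^i} \log p(x;\theta)\; \partial_{\theta^j} \log p(x;\theta)\,\right],

under which familiar statistical objects, such as the Jeffreys prior or natural-gradient updates in machine learning, acquire a geometric interpretation.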

This Special Issue aims to serve as a timely locus for bringing together modern theoretical and application-oriented developments arising from the cross-fertilisation between information theory, information geometry, and dynamical systems theory. Contemporary applications to the quantification of predictability and dynamic uncertainty, as well as various aspects of learning theory, robustness, and approximation capacity of neural networks and other information processing systems, are of particular interest.

Dr. Michal Branicki
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • information theory
  • information geometry
  • dynamical systems
  • ergodic theory
  • machine learning
  • neural networks
  • Bayesian estimation
  • data-driven models

Published Papers (2 papers)


Research

10 pages, 2287 KiB  
Article
Hellinger Information Matrix and Hellinger Priors
by Arkady Shemyakin
Entropy 2023, 25(2), 344; https://doi.org/10.3390/e25020344 - 13 Feb 2023
Cited by 2 | Viewed by 1067
Abstract
Hellinger information as a local characteristic of parametric distribution families was first introduced in 2011. It is related to the much older concept of the Hellinger distance between two points in a parametric set. Under certain regularity conditions, the local behavior of the Hellinger distance is closely connected to Fisher information and the geometry of Riemannian manifolds. Nonregular distributions (non-differentiable distribution densities, undefined Fisher information, or densities with support depending on the parameter), including the uniform, require analogues or extensions of Fisher information. Hellinger information may serve to construct information inequalities of the Cramér–Rao type, extending the lower bounds of the Bayes risk to the nonregular case. A construction of non-informative priors based on Hellinger information was also suggested by the author in 2011. Hellinger priors extend the Jeffreys rule to nonregular cases. For many examples, they are identical or close to the reference priors or probability matching priors. Most of that paper was dedicated to the one-dimensional case, but the matrix definition of Hellinger information was also introduced for higher dimensions. Conditions of existence and the nonnegative definite property of the Hellinger information matrix were not discussed. Hellinger information for the vector parameter was applied by Yin et al. to problems of optimal experimental design. A special class of parametric problems was considered, requiring the directional definition of Hellinger information, but not a full construction of the Hellinger information matrix. In the present paper, a general definition and the existence and nonnegative definite property of the Hellinger information matrix are considered for nonregular settings.
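As background for this abstract (standard definitions, not quoted from the paper): the squared Hellinger distance between two members of a dominated parametric family with densities f(x; θ) is

H^2(\theta_1, \theta_2) \;=\; \frac{1}{2} \int \Big( \sqrt{f(x;\theta_1)} - \sqrt{f(x;\theta_2)} \Big)^2 \, d\mu(x),

and in regular one-parameter families its local behavior recovers the Fisher information I(θ) via H^2(\theta, \theta + \varepsilon) = \tfrac{1}{8} I(\theta)\, \varepsilon^2 + o(\varepsilon^2) as ε → 0; Hellinger information generalizes this local rate to nonregular families in which I(θ) is undefined.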

10 pages, 259 KiB  
Article
On Almost Norden Statistical Manifolds
by Leila Samereh, Esmaeil Peyghan and Ion Mihai
Entropy 2022, 24(6), 758; https://doi.org/10.3390/e24060758 - 27 May 2022
Cited by 3 | Viewed by 1558
Abstract
We consider a statistical connection ∇ on an almost complex manifold with a (pseudo-)Riemannian metric, in particular the Norden metric. We investigate almost Norden (statistical) manifolds under the condition that the almost complex structure J is ∇-recurrent. We provide an example of a complex statistical connection.
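For context (standard definitions, not quoted from the paper): a torsion-free connection ∇ on a manifold with (pseudo-)Riemannian metric g is called statistical when ∇g is totally symmetric (the Codazzi condition), and a metric on an almost complex manifold (M, J), with J^2 = −id, is a Norden metric when J acts as an anti-isometry of g:

(\nabla_X g)(Y, Z) \;=\; (\nabla_Y g)(X, Z), \qquad g(JX, JY) \;=\; -\,g(X, Y).

Recurrence of J with respect to ∇ means \nabla J = \eta \otimes J for some 1-form η.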