Information Geometry III

A special issue of Entropy (ISSN 1099-4300).

Deadline for manuscript submissions: closed (31 August 2020) | Viewed by 20019

Special Issue Editor


Prof. Dr. Geert Verdoolaege
Guest Editor
Research Unit Nuclear Fusion, Department of Applied Physics, Ghent University, Sint-Pietersnieuwstraat 41, B-9000 Ghent, Belgium
Interests: probability theory; Bayesian inference; machine learning; information geometry; differential geometry; nuclear fusion; plasma physics; plasma turbulence; continuum mechanics; statistical mechanics

Special Issue Information

Dear Colleagues,

The mathematical field of Information Geometry originated from the observation that the Fisher information can be used to define a Riemannian metric on manifolds of probability distributions. This led to a geometrical description of probability theory and statistics, enabling the study of the invariant properties of statistical manifolds. Through the work of S.-I. Amari and others, it was later realized that the differential-geometric structure of a statistical manifold can be extended to families of dual affine connections, and that such a structure can be derived from divergence functions.
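
For readers new to the area, the two constructions just mentioned can be stated compactly. These are standard textbook formulas (following Amari and Nagaoka, and Eguchi), given here only as background:

g_{ij}(\theta) \;=\; \mathbb{E}_{p(x\mid\theta)}\!\left[\,\partial_{\theta^i}\log p(x\mid\theta)\;\partial_{\theta^j}\log p(x\mid\theta)\,\right] \qquad \text{(Fisher information metric)},

and, for a divergence D(\theta\,\|\,\theta') with D(\theta\,\|\,\theta)=0,

g_{ij}(\theta) \;=\; -\,\partial_{\theta^i}\partial_{\theta'^j} D(\theta\,\|\,\theta')\big|_{\theta'=\theta}, \qquad \Gamma_{ij,k}(\theta) \;=\; -\,\partial_{\theta^i}\partial_{\theta^j}\partial_{\theta'^k} D(\theta\,\|\,\theta')\big|_{\theta'=\theta},

with the dual connection obtained by exchanging the roles of \theta and \theta'. Taking D to be the Kullback–Leibler divergence recovers the Fisher metric together with the exponential and mixture connections.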

Since then, Information Geometry has become a truly interdisciplinary field with applications in various domains. It enables a deeper and more intuitive understanding of the methods of statistical inference and machine learning, while providing a powerful framework for deriving new algorithms. As such, Information Geometry has many applications in optimization, signal and image processing, computer vision, neural networks and other subfields of the information sciences. Furthermore, the methods of Information Geometry have been applied to a broad variety of topics in physics, mathematical finance, biology and the neurosciences. In physics, there are many links with fields that have a natural probabilistic interpretation, including (nonextensive) statistical mechanics and quantum mechanics.

For this Special Issue, we welcome submissions related to the foundations and applications of Information Geometry. We envisage contributions that aim to clarify the connection of Information Geometry with both the information sciences and the physical sciences, so as to demonstrate the profound impact of the field on these disciplines. In addition, we hope to receive original papers illustrating the wide variety of applications of the methods of Information Geometry.

Prof. Dr. Geert Verdoolaege
Guest Editor

Volume I: https://0-www-mdpi-com.brum.beds.ac.uk/journal/entropy/special_issues/information-geometry
Volume II: https://0-www-mdpi-com.brum.beds.ac.uk/journal/entropy/special_issues/information_geometry_II

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Published Papers (6 papers)


Research

18 pages, 5563 KiB  
Article
Changing the Geometry of Representations: α-Embeddings for NLP Tasks
by Riccardo Volpi, Uddhipan Thakur and Luigi Malagò
Entropy 2021, 23(3), 287; https://0-doi-org.brum.beds.ac.uk/10.3390/e23030287 - 26 Feb 2021
Cited by 2 | Viewed by 2643
Abstract
Word embeddings based on a conditional model are commonly used in Natural Language Processing (NLP) tasks to embed the words of a dictionary in a low-dimensional linear space. Their computation is based on maximizing the likelihood of a conditional probability distribution for each word of the dictionary. These distributions form a Riemannian statistical manifold, where word embeddings can be interpreted as vectors in the tangent space at a specific reference measure on the manifold. A novel family of word embeddings, called α-embeddings, has recently been introduced; it derives from a geometrical deformation of the probability simplex through a parameter α, using notions from Information Geometry. After introducing the α-embeddings, we show how the deformation of the simplex, controlled by α, provides an extra handle to improve performance on several intrinsic and extrinsic NLP tasks. We test the α-embeddings on different tasks with models of increasing complexity, showing that the advantages associated with the use of α-embeddings persist for models with a large number of parameters. Finally, we show that tuning α yields higher performance than using larger models in which a transformation of the embeddings is additionally learned during training, as experimentally verified in attention models.
(This article belongs to the Special Issue Information Geometry III)
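
As background on the α-embeddings described in the abstract above: the deformation in question builds on Amari's standard α-representation of a probability value p, recalled here only as a reminder of the notion (the paper's exact construction on tangent spaces is more refined):

\ell^{(\alpha)}(p) \;=\; \frac{2}{1-\alpha}\, p^{\frac{1-\alpha}{2}} \quad (\alpha \neq 1), \qquad \ell^{(1)}(p) \;=\; \log p,

so that \alpha = -1 gives the identity (mixture) representation and \alpha \to 1 the logarithmic (exponential) representation, with intermediate values interpolating between the two geometries of the simplex.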

41 pages, 1470 KiB  
Article
The Siegel–Klein Disk: Hilbert Geometry of the Siegel Disk Domain
by Frank Nielsen
Entropy 2020, 22(9), 1019; https://0-doi-org.brum.beds.ac.uk/10.3390/e22091019 - 12 Sep 2020
Cited by 6 | Viewed by 4352
Abstract
We study the Hilbert geometry induced by the Siegel disk domain, an open bounded convex set of complex square matrices of operator norm strictly less than one. This Hilbert geometry yields a generalization of the Klein disk model of hyperbolic geometry, henceforth called the Siegel–Klein disk model to differentiate it from the classical Siegel upper plane and disk domains. In the Siegel–Klein disk, geodesics are by construction always unique and Euclidean straight, allowing one to design efficient geometric algorithms and data structures from computational geometry. For example, we show how to approximate the smallest enclosing ball of a set of complex square matrices in the Siegel disk domains: we compare two generalizations of the iterative core-set algorithm of Badoiu and Clarkson (BC), one in the Siegel–Poincaré disk and one in the Siegel–Klein disk, and demonstrate that geometric computing in the Siegel–Klein disk allows one (i) to bypass the time-costly recentering operations to the disk origin required at each iteration of the BC algorithm in the Siegel–Poincaré disk model, and (ii) to quickly approximate the Siegel–Klein distance numerically, with guaranteed lower and upper bounds derived from nested Hilbert geometries.
(This article belongs to the Special Issue Information Geometry III)
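
As background on the Badoiu–Clarkson (BC) core-set iteration that the paper generalizes, the following is a minimal NumPy sketch of the plain Euclidean version; the paper's variants replace the straight-line step and the Euclidean norm by Siegel–Poincaré or Siegel–Klein geodesics and distances. The function name and defaults below are illustrative, not taken from the paper.

import numpy as np

def bc_enclosing_ball_center(points, iterations=1000):
    """Approximate the center of the smallest enclosing ball of `points`
    (an (n, d) array) with the Badoiu-Clarkson core-set iteration,
    written here for Euclidean geometry only."""
    center = points[0].astype(float)
    for t in range(1, iterations + 1):
        # Find the input point farthest from the current center; in the
        # Siegel settings this would use the Siegel-Poincare or
        # Siegel-Klein distance instead of the Euclidean norm.
        dists = np.linalg.norm(points - center, axis=1)
        farthest = points[np.argmax(dists)]
        # Move a 1/(t+1) fraction along the (here straight) geodesic
        # from the center toward that farthest point.
        center = center + (farthest - center) / (t + 1)
    return center

The iterate reaches a (1 + ε)-approximation of the true center after on the order of 1/ε² iterations, which is why the per-iteration cost of recentering, point (i) above, is the dominant concern.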

13 pages, 342 KiB  
Article
Lagrangian Submanifolds of Symplectic Structures Induced by Divergence Functions
by Marco Favretti
Entropy 2020, 22(9), 983; https://0-doi-org.brum.beds.ac.uk/10.3390/e22090983 - 03 Sep 2020
Cited by 1 | Viewed by 2211
Abstract
Divergence functions play an important role in Information Geometry, as they allow for the introduction of a Riemannian metric and a dual connection structure on a finite-dimensional manifold of probability distributions. They also allow one to define, in a canonical way, a symplectic structure on the square of the above manifold of probability distributions, a property that had received little attention in the literature until recent contributions. In this paper, we hint at a possible application: we study Lagrangian submanifolds of this symplectic structure and show that they are useful for describing the manifold of solutions of the Maximum Entropy principle.
(This article belongs to the Special Issue Information Geometry III)
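
As a pointer to the construction referred to above (in the spirit of the yoke construction of Barndorff-Nielsen and Jupp; the paper's signs and conventions may differ): a divergence D(p\,\|\,q) on a manifold M defines on M \times M the two-form

\omega_D \;=\; \frac{\partial^2 D(p\,\|\,q)}{\partial p^i\,\partial q^j}\; dp^i \wedge dq^j,

which is symplectic whenever this mixed Hessian is nondegenerate. On the diagonal \{(p,p)\} the mixed Hessian equals minus the induced Riemannian metric, -g_{ij}(p); being symmetric, it contributes nothing to the antisymmetric form, so the diagonal is the simplest example of a Lagrangian submanifold of (M \times M, \omega_D).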

34 pages, 1942 KiB  
Article
On Voronoi Diagrams on the Information-Geometric Cauchy Manifolds
by Frank Nielsen
Entropy 2020, 22(7), 713; https://0-doi-org.brum.beds.ac.uk/10.3390/e22070713 - 28 Jun 2020
Cited by 10 | Viewed by 5042
Abstract
We study the Voronoi diagrams of a finite set of Cauchy distributions and their dual complexes from the viewpoint of information geometry, by considering the Fisher–Rao distance, the Kullback–Leibler divergence, the chi-squared divergence, and a flat divergence derived from Tsallis entropy related to the conformal flattening of the Fisher–Rao geometry. We prove that the Voronoi diagrams of the Fisher–Rao distance, the chi-squared divergence, and the Kullback–Leibler divergence all coincide with a hyperbolic Voronoi diagram on the corresponding Cauchy location-scale parameters, and that the dual Cauchy hyperbolic Delaunay complexes are Fisher orthogonal to the Cauchy hyperbolic Voronoi diagrams. The dual Voronoi diagrams with respect to the dual flat divergences amount to dual Bregman Voronoi diagrams, and their dual complexes are regular triangulations. The primal Bregman Voronoi diagram is the Euclidean Voronoi diagram, and the dual Bregman Voronoi diagram coincides with the Cauchy hyperbolic Voronoi diagram. In addition, we prove that the square root of the Kullback–Leibler divergence between Cauchy distributions yields a metric distance which is Hilbertian for the Cauchy scale families.
(This article belongs to the Special Issue Information Geometry III)
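
For reference, the Cauchy location-scale density and the closed-form Kullback–Leibler divergence between two Cauchy densities, as reported in the related literature (note that the latter is symmetric in the two parameter pairs, a prerequisite for its square root to be a metric distance):

p_{l,s}(x) \;=\; \frac{s}{\pi\left(s^2 + (x-l)^2\right)}, \qquad (l,s) \in \mathbb{R} \times \mathbb{R}_{>0},

D_{\mathrm{KL}}\!\left(p_{l_1,s_1} \,\|\, p_{l_2,s_2}\right) \;=\; \log \frac{(s_1+s_2)^2 + (l_1-l_2)^2}{4\, s_1 s_2}.

The location-scale parameters thus live in the upper half-plane, which is where the hyperbolic Voronoi diagrams of the paper are drawn.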

16 pages, 328 KiB  
Article
A Geometric Approach to Average Problems on Multinomial and Negative Multinomial Models
by Mingming Li, Huafei Sun and Didong Li
Entropy 2020, 22(3), 306; https://0-doi-org.brum.beds.ac.uk/10.3390/e22030306 - 08 Mar 2020
Viewed by 2131
Abstract
This paper is concerned with the formulation and computation of average problems on the multinomial and negative multinomial models. It can be deduced that the multinomial and negative multinomial models admit complementary geometric structures. First, we investigate these geometric structures by providing explicit expressions of some fundamental geometric quantities, such as the Fisher–Riemannian metrics, the α-connections, and the α-curvatures. Then, we proceed to consider some averaging methods based on these geometric structures. Specifically, we study the formulation and computation of the midpoint of two points and of the Karcher mean of multiple points. In conclusion, we find parallel results for the average problems on these two complementary models.
(This article belongs to the Special Issue Information Geometry III)
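
For context, the Karcher (Fréchet) mean computed in the paper is the standard notion: for points x_1, \dots, x_N on a manifold M with geodesic distance d, it is

\bar{x} \;=\; \operatorname*{arg\,min}_{x \in M} \; \frac{1}{N} \sum_{i=1}^{N} d^2(x, x_i),

and the midpoint of two points is the N = 2 special case.
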
12 pages, 290 KiB  
Article
Global Geometry of Bayesian Statistics
by Atsuhide Mori
Entropy 2020, 22(2), 240; https://0-doi-org.brum.beds.ac.uk/10.3390/e22020240 - 20 Feb 2020
Cited by 1 | Viewed by 2752
Abstract
In previous work of the author, a non-trivial symmetry of the relative entropy in the information geometry of normal distributions was discovered. The same symmetry also appears in the symplectic/contact geometry of Hilbert modular cusps. Further, it was observed that a contact Hamiltonian flow presents a certain Bayesian inference on normal distributions. In this paper, we describe Bayesian statistics and information geometry in the language of current geometry, in order to spread interest in statistics among general geometers and topologists. Then, we foliate the space of multivariate normal distributions by symplectic leaves to generalize the above result of the author. This foliation arises from the Cholesky decomposition of the covariance matrices.
(This article belongs to the Special Issue Information Geometry III)
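
For reference, the Cholesky decomposition mentioned above writes each symmetric positive-definite covariance matrix uniquely as

\Sigma \;=\; L L^{\top}, \qquad L \ \text{lower triangular with positive diagonal entries};

this is standard linear algebra, recalled here only to fix the notation behind the foliation.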
