Information Geometry, Complexity Measures and Data Analysis

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Complexity".

Deadline for manuscript submissions: closed (30 November 2021) | Viewed by 18573

Special Issue Editors


Guest Editor
Centro de Investigación Operativa, Universidad Miguel Hernández, Avenida de la Universidad s/n, 03202 Elche, Spain
Interests: dynamical systems; nonlinear time-series analysis; ergodic theory; mathematical physics

Guest Editor
1. Departamento de Física Teórica, Facultad de Ciencias Físicas, Universidad Complutense de Madrid, 28040 Madrid, Spain
2. Instituto de Ciencias Matemáticas (ICMAT), 28049 Madrid, Spain
Interests: mathematical physics; number theory; statistical mechanics

Special Issue Information

Dear Colleagues,

In the last several years, a new approach to information theory, called information geometry, has emerged. Its main objective is the investigation of the geometrical structures that can be introduced in the manifold associated with the set of probability distributions of a statistical model. In this approach, one defines a Riemannian metric in a manifold of probability distributions, together with dually coupled affine connections. Information geometry provides a new methodology applicable to various areas of information sciences, such as statistical inference, quantum information theory, machine learning, convex optimization, and time-series analysis. It is also a key tool for other areas, such as neuro-computing (where a set of neural networks forms a neuro-manifold, a nonlinear system equipped with the Fisher metric).
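As a concrete illustration of the Fisher metric mentioned above (ours, not taken from any contribution in this issue): for the one-dimensional Gaussian family N(μ, σ²) in coordinates (μ, σ), the metric has the standard closed form diag(1/σ², 2/σ²), and the μμ component can be checked numerically as the expected squared score.

```python
import random

def fisher_metric_gaussian(sigma):
    """Fisher information metric of the family N(mu, sigma^2) in
    coordinates (mu, sigma): the standard closed form diag(1/s^2, 2/s^2)."""
    return [[1.0 / sigma**2, 0.0],
            [0.0, 2.0 / sigma**2]]

def mc_fisher_mu(mu, sigma, n=200_000, seed=1):
    """Monte Carlo estimate of g_mumu = E[(d/dmu log p(X))^2];
    for a Gaussian, the score with respect to mu is (X - mu)/sigma^2."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(mu, sigma)
        score = (x - mu) / sigma**2
        total += score * score
    return total / n
```

For σ = 2 the closed form gives g_μμ = 0.25, and the Monte Carlo estimate agrees to a few decimal places.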

“Complexity measure” is a broad concept that embraces any way of characterizing a process or a generic output according to its “complexity” in a predefined sense. Familiar examples include (i) Lempel–Ziv complexity and Shannon entropy in the case of sequences, (ii) differential entropy and permutation entropy in the case of analog signals, and (iii) Kolmogorov entropy and topological entropy in the case of measure-preserving and continuous dynamics, respectively. Further examples include other information-theoretical tools (Rényi and Tsallis entropies, as well as a long list of entropy-like measures), statistical tools (statistical complexity), and dynamical tools (recurrence plots, the correlation integral), together with a variety of complexity–causality planes, ordinal networks, and ad hoc tools based on symbolic dynamics. Such a multiplicity of approaches attests to the great interest in this topic.
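To make one of these measures concrete, the following is a minimal sketch (ours, for illustration only) of permutation entropy for a scalar time series, following the standard Bandt–Pompe ordinal-pattern counting.

```python
import math

def permutation_entropy(x, m=3):
    """Normalized permutation entropy: Shannon entropy of the ordinal
    patterns of length m, divided by its maximum log(m!). Returns a
    value in [0, 1]: 0 for a monotone series, near 1 for white noise."""
    counts = {}
    for i in range(len(x) - m + 1):
        # Ordinal pattern: the permutation of indices that sorts the window.
        pattern = tuple(sorted(range(m), key=lambda k: x[i + k]))
        counts[pattern] = counts.get(pattern, 0) + 1
    total = sum(counts.values())
    h = -sum(c / total * math.log(c / total) for c in counts.values())
    return h / math.log(math.factorial(m))
```

A strictly increasing series yields a single ordinal pattern and thus entropy 0, while an i.i.d. random series visits all m! patterns roughly uniformly and scores close to 1.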

As for data analysis, this multidisciplinary topic is going through a period of intense research in which old and new ideas are driving interesting developments. Most of the complexity measures mentioned above are thus becoming increasingly popular in nonlinear time-series analysis. At the same time, machine learning tools such as recurrent networks and reservoir computing are pushing the limits of time-series prediction further. New synergies are expected from the application of deep learning to dynamical and complex systems.

This Special Issue is meant to be a showcase of recent progress in the intersections of complex systems with information theory and data analysis. Due to the ubiquity of these topics in the hard and soft sciences, we welcome contributions in the form of research papers or reviews from all fields. Papers submitted to this Special Issue may address theoretical issues and applications.

Prof. Dr. José María Amigó
Dr. Piergiulio Tempesta
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to the website; once registered, proceed to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • complex systems
  • information geometry
  • complexity measures
  • data analysis
  • information theory
  • applications

Published Papers (8 papers)


Editorial


3 pages, 200 KiB  
Editorial
Information Geometry, Complexity Measures and Data Analysis
by José M. Amigó and Piergiulio Tempesta
Entropy 2022, 24(12), 1797; https://0-doi-org.brum.beds.ac.uk/10.3390/e24121797 - 09 Dec 2022
Viewed by 1017
Abstract
In the last several years, a new approach to information theory, called information geometry, has emerged [...]
(This article belongs to the Special Issue Information Geometry, Complexity Measures and Data Analysis)

Research


11 pages, 408 KiB  
Article
f-Gintropy: An Entropic Distance Ranking Based on the Gini Index
by Tamás Sándor Biró, András Telcs, Máté Józsa and Zoltán Néda
Entropy 2022, 24(3), 407; https://0-doi-org.brum.beds.ac.uk/10.3390/e24030407 - 14 Mar 2022
Cited by 3 | Viewed by 1695
Abstract
We consider an entropic distance analog quantity based on the density of the Gini index in the Lorenz map, i.e., gintropy. Such a quantity might be used for pairwise mapping and ranking between various countries and regions based on income and wealth inequality. Its generalization to f-gintropy, using a function of the income or wealth value, distinguishes between regional inequalities more sensitively than the original construction.
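For background on the quantities in this abstract: gintropy is built on the Gini index of an income distribution. The sketch below (our illustration; it does not reproduce the paper's gintropy construction) computes the plain Gini index from a sample.

```python
def gini(values):
    """Gini index of a sample of nonnegative incomes, computed from
    sorted data via G = 2*sum_i i*x_(i) / (n*sum x) - (n+1)/n, with
    1-based ranks i on the sorted values. G = 0 for perfect equality,
    and G -> (n-1)/n when one individual holds everything."""
    xs = sorted(values)
    n = len(xs)
    total = sum(xs)
    weighted = sum(rank * x for rank, x in enumerate(xs, start=1))
    return 2.0 * weighted / (n * total) - (n + 1.0) / n
```

For example, `gini([1, 1, 1, 1])` is 0 (perfect equality), while `gini([0, 0, 0, 1])` is 0.75, the maximum attainable with four individuals.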

16 pages, 604 KiB  
Article
Along the Lines of Nonadditive Entropies: q-Prime Numbers and q-Zeta Functions
by Ernesto P. Borges, Takeshi Kodama and Constantino Tsallis
Entropy 2022, 24(1), 60; https://0-doi-org.brum.beds.ac.uk/10.3390/e24010060 - 28 Dec 2021
Cited by 4 | Viewed by 1266
Abstract
The rich history of prime numbers includes great names such as Euclid, who first analytically studied the prime numbers and proved that there is an infinite number of them; Euler, who introduced the function $\zeta(s) \equiv \sum_{n=1}^{\infty} n^{-s} = \prod_{p\ \mathrm{prime}} \frac{1}{1-p^{-s}}$; Gauss, who estimated the rate at which prime numbers increase; and Riemann, who extended $\zeta(s)$ to the complex plane $z$ and conjectured that all nontrivial zeros lie on the $\Re(z)=1/2$ axis. The nonadditive entropy $S_q = k \sum_i p_i \ln_q (1/p_i)$ ($q \in \mathbb{R}$; $S_1 = S_{BG} \equiv -k \sum_i p_i \ln p_i$, where BG stands for Boltzmann–Gibbs), on which nonextensive statistical mechanics is based, involves the function $\ln_q z \equiv \frac{z^{1-q}-1}{1-q}$ ($\ln_1 z = \ln z$). It is already known that this function paves the way for the emergence of a $q$-generalized algebra, using $q$-numbers defined as $\langle x \rangle_q \equiv e^{\ln_q x}$, which recover the number $x$ for $q=1$. The $q$-prime numbers are then defined as the $q$-natural numbers $\langle n \rangle_q \equiv e^{\ln_q n}$ ($n=1,2,3,\dots$), where $n$ is a prime number $p=2,3,5,7,\dots$ We show that, for any value of $q$, infinitely many $q$-prime numbers exist; for $q \le 1$ they diverge for increasing prime number, whereas they converge for $q>1$; the standard prime numbers are recovered for $q=1$. For $q \le 1$, we generalize the $\zeta(s)$ function as follows: $\zeta_q(s) \equiv \langle \zeta(s) \rangle_q$ ($s \in \mathbb{R}$). We show that this function appears to diverge at $s=1+0$, $\forall q$. We also alternatively define, for $q \le 1$, $\zeta_q^{\Sigma}(s) \equiv \sum_{n=1}^{\infty} \frac{1}{\langle n \rangle_q^s} = 1 + \frac{1}{\langle 2 \rangle_q^s} + \cdots$ and $\zeta_q^{\Pi}(s) \equiv \prod_{p\ \mathrm{prime}} \frac{1}{1-\langle p \rangle_q^{-s}} = \frac{1}{1-\langle 2 \rangle_q^{-s}}\,\frac{1}{1-\langle 3 \rangle_q^{-s}}\,\frac{1}{1-\langle 5 \rangle_q^{-s}} \cdots$, which, for $q<1$, generically satisfy $\zeta_q^{\Sigma}(s) < \zeta_q^{\Pi}(s)$, at variance with the $q=1$ case, where of course $\zeta_1^{\Sigma}(s) = \zeta_1^{\Pi}(s)$.
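The q-deformed quantities in this abstract are easy to experiment with numerically. The sketch below (ours, for illustration) implements the q-logarithm and the q-numbers, and exhibits the convergence of q-primes for q > 1, which are bounded above by e^(1/(q-1)).

```python
import math

def ln_q(x, q):
    """q-logarithm: ln_q(x) = (x**(1-q) - 1)/(1 - q), with ln_1 = ln."""
    if q == 1.0:
        return math.log(x)
    return (x ** (1.0 - q) - 1.0) / (1.0 - q)

def q_number(n, q):
    """q-number <n>_q = exp(ln_q(n)); recovers n itself at q = 1."""
    return math.exp(ln_q(n, q))

primes = [2, 3, 5, 7, 11, 13]
# For q > 1 the q-primes stay bounded (by e^(1/(q-1))) as the prime grows;
# for q < 1 they grow faster than the primes themselves.
q_primes = [q_number(p, 1.5) for p in primes]
```

At q = 1.5 every q-prime stays below e² ≈ 7.39, whereas at q = 0.5 the q-prime of 13 already far exceeds 13, matching the divergence/convergence dichotomy stated in the abstract.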

17 pages, 360 KiB  
Article
The Causal Interaction between Complex Subsystems
by X. San Liang
Entropy 2022, 24(1), 3; https://0-doi-org.brum.beds.ac.uk/10.3390/e24010003 - 21 Dec 2021
Cited by 6 | Viewed by 2463
Abstract
Information flow provides a natural measure of the causal interaction between dynamical events. This study extends our previous rigorous formalism of componentwise information flow to the bulk information flow between two complex subsystems of a large-dimensional parental system. Analytical formulas have been obtained in closed form. Under a Gaussian assumption, their maximum likelihood estimators have also been obtained. These formulas have been validated using different subsystems with preset relations, and they yield causalities just as expected. In contrast, the commonly used proxies for the characterization of subsystems, such as averages and principal components, generally do not work correctly. This study can help diagnose the emergence of patterns in complex systems and is expected to have applications in many real-world problems in disciplines such as climate science, fluid dynamics, neuroscience, and financial economics.

26 pages, 6259 KiB  
Article
Application of Generalized Composite Multiscale Lempel–Ziv Complexity in Identifying Wind Turbine Gearbox Faults
by Xiaoan Yan, Daoming She, Yadong Xu and Minping Jia
Entropy 2021, 23(11), 1372; https://0-doi-org.brum.beds.ac.uk/10.3390/e23111372 - 20 Oct 2021
Cited by 17 | Viewed by 1676
Abstract
Wind turbine gearboxes operate in harsh environments; the resulting gear vibration signal is therefore strongly nonlinear and non-stationary and has a low signal-to-noise ratio, which makes it difficult to identify wind turbine gearbox faults effectively with traditional methods. To solve this problem, this paper proposes a new fault diagnosis method for wind turbine gearboxes based on generalized composite multiscale Lempel–Ziv complexity (GCMLZC). Within the proposed method, an effective technique named the multiscale morphological-hat convolution operator (MHCO) is first presented to remove the noise interference in the original gear vibration signal. The GCMLZC of the filtered signal is then calculated to extract gear fault features. Finally, the extracted fault features are input into a softmax classifier to automatically identify different health conditions of wind turbine gearboxes. The effectiveness of the proposed method was validated by experimental and engineering data analysis. The results indicate that the proposed method can accurately identify different gear health conditions. Moreover, its identification accuracy is higher than that of traditional multiscale Lempel–Ziv complexity (MLZC) and several representative multiscale entropies (e.g., multiscale dispersion entropy (MDE), multiscale permutation entropy (MPE), and multiscale sample entropy (MSE)).

14 pages, 844 KiB  
Article
On α-Limit Sets in Lorenz Maps
by Łukasz Cholewa and Piotr Oprocha
Entropy 2021, 23(9), 1153; https://0-doi-org.brum.beds.ac.uk/10.3390/e23091153 - 02 Sep 2021
Cited by 2 | Viewed by 1739
Abstract
The aim of this paper is to show that α-limit sets in Lorenz maps do not have to be completely invariant. This highlights unexpected dynamical behavior in these maps and reveals gaps in the existing literature. A similar result is obtained for unimodal maps on [0,1]. On the basis of the provided examples, we also show how the study of the structure of α-limit sets is closely connected with the calculation of the topological entropy.

7 pages, 284 KiB  
Article
Decoherence, Anti-Decoherence, and Fisher Information
by Andres M. Kowalski and Angelo Plastino
Entropy 2021, 23(8), 1035; https://0-doi-org.brum.beds.ac.uk/10.3390/e23081035 - 12 Aug 2021
Cited by 4 | Viewed by 1575
Abstract
In this work, we study quantum decoherence as reflected by the dynamics of a system that accounts for the interaction between matter and a given field. The process is described by an important information geometry tool: Fisher’s information measure (FIM). We find that it appropriately describes this concept, detecting salient details of the quantum–classical changeover (qcc). A good description of the qcc can thus be obtained; in particular, a clear insight into the role that the uncertainty principle (UP) plays in the pertinent proceedings is presented. Plotting FIM versus a system’s motion invariant related to the UP, one can also visualize how anti-decoherence takes place, as opposed to the decoherence process studied in dozens of papers. In Fisher terms, the qcc can be seen as an order (quantum)–disorder (classical, including chaos) transition.

Review


18 pages, 732 KiB  
Review
Attention Mechanisms and Their Applications to Complex Systems
by Adrián Hernández and José M. Amigó
Entropy 2021, 23(3), 283; https://0-doi-org.brum.beds.ac.uk/10.3390/e23030283 - 26 Feb 2021
Cited by 30 | Viewed by 5991
Abstract
Deep learning models and graphics processing units have completely transformed the field of machine learning. Recurrent neural networks and long short-term memories have been successfully used to model and predict complex systems. However, these classic models do not perform sequential reasoning, a process that guides a task based on perception and memory. In recent years, attention mechanisms have emerged as a promising solution to these problems. In this review, we describe the key aspects of attention mechanisms and some relevant attention techniques and point out why they are a remarkable advance in machine learning. Then, we illustrate some important applications of these techniques in the modeling of complex systems.
