Special Issue "Information Theoretic Measures and Their Applications II"

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Information Theory, Probability and Statistics".

Deadline for manuscript submissions: 15 July 2022.

Special Issue Editors

Dr. Osvaldo Anibal Rosso
Guest Editor
Instituto de Física, Universidade Federal de Alagoas, Maceió, Alagoas 57072-970, Brazil
Interests: time-series analysis; information theory; time–frequency transform; wavelet transform; entropy and complexity; non-linear dynamics and chaos; complex networks; medical and biological applications
Dr. Fernando Montani
Guest Editor
Instituto de Física La Plata, CONICET-Universidad Nacional de La Plata, Diagonal 113 entre 63 y 64, La Plata 1900, Argentina
Interests: time-series analysis; information theory; brain and neuronal dynamics; neural coding; entropy and complexity; nonlinear dynamics and chaos; complex networks; medical and biological applications

Special Issue Information

Dear Colleagues,

The evaluation of information-theoretic quantifiers presupposes some prior knowledge about the system; specifically, a probability distribution function (PDF) associated with the time series under analysis must be provided beforehand. Determining the most adequate PDF is a fundamental problem, because the PDF P and the sample space Ω are inextricably linked. Many methods have been proposed for a proper selection of the probability space (Ω, P).

Among others, we can mention frequency counting; procedures based on amplitude statistics; binary symbolic dynamics; Fourier analysis; the Gabor and wavelet transforms; and permutation patterns. The suitability of each methodology depends on characteristics of the data, such as stationarity, the length of the series, the variation of the parameters, and the level of noise contamination. In all these cases, global aspects of the dynamics can be captured to some extent, but the different approaches are not equivalent in their ability to discern all relevant physical details.
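
As a concrete illustration of the simplest of these choices, the sketch below estimates a PDF from amplitude statistics (a normalized histogram) and evaluates its Shannon entropy; the bin count and the surrogate Gaussian series are illustrative assumptions, not a prescribed recipe.

    # Minimal sketch: choose (Omega, P) by amplitude statistics (histogram),
    # then compute the Shannon entropy of the resulting PDF (in bits).
    import numpy as np

    def histogram_pdf(x, bins=32):
        """Normalized histogram of the series: a non-causal PDF estimate."""
        counts, _ = np.histogram(x, bins=bins)
        return counts / counts.sum()

    def shannon_entropy(p):
        """Shannon entropy in bits; empty bins contribute nothing."""
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    rng = np.random.default_rng(0)
    x = rng.normal(size=10_000)  # surrogate time series (illustrative)
    print(shannon_entropy(histogram_pdf(x)))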

Usual methodologies assign a symbol from a finite alphabet A to each time point of the series X(t), thus creating a symbolic sequence that can be regarded as a non-causal, coarse-grained description of the time series under consideration. As a consequence, order relations and the time scales of the dynamics are lost. The usual histogram technique corresponds to this kind of assignment. Time-causal information can be duly incorporated if information about the past dynamics of the system is included in the symbolic sequence, i.e., if symbols of the alphabet A are assigned to portions of the phase space or of the trajectory.

In particular, Bandt and Pompe (BP), in their seminal work “Permutation Entropy: A Natural Complexity Measure for Time Series” (Phys. Rev. Lett. 2002, 88, 174102), introduced a simple and robust symbolic methodology that takes into account the time causality of the series (a causal coarse-grained methodology) by comparing neighboring values in a time series. The symbolic data are (i) created by ranking the values of the series and (ii) defined by reordering the embedded data in ascending order, which is tantamount to a phase-space reconstruction with embedding dimension (pattern length) D ≥ 2, D ∊ N, and time lag τ ∊ N. In this way, it is possible to quantify the diversity of the ordering symbols (patterns) derived from a scalar time series. Note that the appropriate symbol sequence arises naturally from the time series itself, and no model-based assumptions are needed. In fact, the necessary “partitions” are devised by comparing the order of neighboring relative values rather than by apportioning amplitudes according to different levels. This technique, as opposed to most in current practice, takes into account the temporal structure of the time series generated by the physical process under study. As such, it allows us to uncover important details concerning the ordinal structure of the time series and can also yield information about temporal correlations.
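
A minimal sketch of the Bandt–Pompe procedure follows, under illustrative choices of D, τ, and test signals: each embedded vector is mapped to its ordinal pattern by argsort ranking, and the normalized permutation entropy is the Shannon entropy of the pattern frequencies divided by log D!.

    # Minimal sketch of the Bandt-Pompe (causal) symbolization.
    import math
    from collections import Counter
    import numpy as np

    def ordinal_patterns(x, D=3, tau=1):
        """Map each embedded vector of length D (lag tau) to its ordinal pattern."""
        span = (D - 1) * tau
        return [tuple(np.argsort(x[i:i + span + 1:tau])) for i in range(len(x) - span)]

    def permutation_entropy(x, D=3, tau=1):
        """Shannon entropy of the ordinal-pattern PDF, normalized by log(D!)."""
        counts = Counter(ordinal_patterns(x, D, tau))
        n = sum(counts.values())
        H = -sum((c / n) * math.log(c / n) for c in counts.values())
        return H / math.log(math.factorial(D))

    rng = np.random.default_rng(1)
    print(permutation_entropy(rng.normal(size=5000)))          # white noise: close to 1
    print(permutation_entropy(np.sin(0.1 * np.arange(5000))))  # regular signal: well below 1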

Furthermore, the ordinal patterns associated with the Bandt–Pompe PDF are invariant with respect to nonlinear monotonic transformations. Accordingly, nonlinear drifts or scalings artificially introduced by a measurement device will not modify the estimation of the quantifiers, a valuable property when dealing with experimental data.
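
This invariance is easy to verify numerically; the following sketch (with an illustrative Gaussian series and exp as the monotonic transformation) checks that the ordinal patterns are unchanged:

    # The ordinal patterns of x and of a monotonic transform of x coincide.
    import numpy as np

    def patterns(x, D=3):
        return [tuple(np.argsort(x[i:i + D])) for i in range(len(x) - D + 1)]

    rng = np.random.default_rng(2)
    x = rng.normal(size=1000)
    assert patterns(x) == patterns(np.exp(x))  # exp is monotonic: same patterns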

Recent approaches to extracting knowledge about time-series dynamics consider the creation of graphs based on the observed ordinal patterns of a given time series. These graphs are constructed after the transformation of the time series into its sequence of ordinal patterns, taking into account the transitions between consecutive patterns. Each of the D! possible ordinal patterns is a vertex in the graph, and a directed edge connects two ordinal patterns if they appear consecutively in the time series. Each edge represents a transition between patterns, hence the name “ordinal pattern transition graphs”, denoted by Gπ. The study of time series via their transformation into graphs is a very successful strategy. Notable examples are the visibility graph (VG) and the horizontal visibility graph (HVG). In these approaches, each point in the data series is a vertex in the graph, and two vertices are connected by an edge if they satisfy the visibility criterion, i.e., if a straight line can be traced between the two data points without intersecting intermediate points (or, more strictly, a horizontal line for the HVG). However, the visibility graphs grow with the length of the time series and may therefore scale poorly, whereas ordinal pattern transition graphs keep a fixed vertex set of size D!, combining the advantages of both worlds. Analysis of these graphs is usually performed on their structure, accounting for both graph measures and information quantifiers (such as the probability of self-transition for Gπ, or the node-degree distribution for the VG and HVG) that can be used for the proper characterization of time series as well as for a precise distinction among different dynamics.
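
A minimal, self-contained sketch of Gπ under the same illustrative assumptions as above (Gaussian surrogate data, D = 3, τ = 1): edges are weighted by the relative frequency of consecutive pattern pairs, and the self-transition probability is read off the diagonal.

    # Minimal sketch of an ordinal-pattern transition graph G_pi.
    from collections import defaultdict
    import numpy as np

    def ordinal_patterns(x, D=3, tau=1):
        span = (D - 1) * tau
        return [tuple(np.argsort(x[i:i + span + 1:tau])) for i in range(len(x) - span)]

    def transition_graph(x, D=3, tau=1):
        """Directed edge weights {(pattern_i, pattern_j): transition probability}."""
        pats = ordinal_patterns(x, D, tau)
        edges = defaultdict(int)
        for a, b in zip(pats, pats[1:]):
            edges[(a, b)] += 1
        total = sum(edges.values())
        return {e: w / total for e, w in edges.items()}

    rng = np.random.default_rng(3)
    G = transition_graph(rng.normal(size=5000))
    p_self = sum(p for (a, b), p in G.items() if a == b)  # probability of self-transition
    print(p_self)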

In relation to other quantifiers, we can mention those based on mutual information, which rigorously quantifies, in units known as “bits”, how much information the value of one variable reveals about the value of another. It is a dimensionless quantity that can be thought of as the reduction in uncertainty about one random variable given knowledge of another. Recent advances in machine learning have shown the relevance of deep learning methodologies, as these approaches have the capability to reshape the way we understand neuronal systems; the relevant computations are learned from experience instead of being specified by the researcher. In deep learning, the most common use of information theory is to characterize probability distributions and to quantify the similarity between two probability distributions, using concepts such as the Kullback–Leibler (KL) divergence and the Jensen–Shannon divergence. The mutual information is the KL divergence between the joint distribution P(X, Y) and the factorized distribution P(X)P(Y).

The information bottleneck method is a theoretical principle for extracting the relevant information that an input variable X contains about an output variable Y. Given their joint distribution P(X, Y), the relevant information is defined as the mutual information I(X; Y). It is assumed that there is a statistical relationship between X and Y, and that Y implicitly determines which characteristics of X are relevant. An optimal representation X* would capture all the information useful to predict Y and would discard all other, irrelevant information. Importantly, this kind of measure can be used to study deep neural networks and can help to infer the complexity of an information flow and of learning, understood as the acquisition and processing of information.
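
The identity I(X; Y) = KL(P(X, Y) ‖ P(X)P(Y)) can be computed directly for discrete variables; the sketch below does so in bits from two aligned symbol sequences (the toy sequences are illustrative).

    # Mutual information as the KL divergence between joint and product PDFs.
    from collections import Counter
    import math

    def mutual_information(xs, ys):
        """I(X;Y) = KL( P(X,Y) || P(X)P(Y) ), in bits, from paired samples."""
        n = len(xs)
        pxy = Counter(zip(xs, ys))
        px, py = Counter(xs), Counter(ys)
        return sum((c / n) * math.log2((c / n) / ((px[x] / n) * (py[y] / n)))
                   for (x, y), c in pxy.items())

    xs = [0, 0, 1, 1, 0, 1, 0, 1]
    print(mutual_information(xs, xs))             # equals H(X): full dependence
    print(mutual_information(xs, [0] * len(xs)))  # equals 0: no dependence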

Among the most recent entropy proposals, we can mention approximate entropy, sample entropy, delayed permutation entropy, the conditional entropy bottleneck method, and permutation min-entropy. That is, different methodologies have been used to understand the mechanisms behind information processing. Among them, there are also methods of frequency analysis, such as the wavelet transform (WT), which distinguishes itself from others by its high efficiency in feature extraction; wavelet analysis is an appropriate mathematical tool for analyzing signals in the time and frequency domains. All these measures have important applications not only in physics but also in quite distinct areas, such as biology, medicine, economics, the cognitive sciences, numerical and computational sciences, big data analysis, complex networks, deep learning, and neuroscience.
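
As one example from this list, the sketch below gives a compact (quadratic-time) sample entropy; the template length m = 2 and tolerance r = 0.2 times the standard deviation are conventional illustrative defaults.

    # Minimal sketch of sample entropy: -ln of the conditional probability that
    # templates matching for m points (within tolerance, Chebyshev distance)
    # also match for m + 1 points. Self-matches are excluded.
    import numpy as np

    def sample_entropy(x, m=2, r=0.2):
        x = np.asarray(x, dtype=float)
        tol = r * x.std()
        def matches(length):
            t = np.array([x[i:i + length] for i in range(len(x) - length)])
            d = np.max(np.abs(t[:, None] - t[None, :]), axis=2)  # pairwise Chebyshev
            return (np.sum(d <= tol) - len(t)) / 2  # unordered pairs, no self-matches
        return -np.log(matches(m + 1) / matches(m))

    rng = np.random.default_rng(4)
    print(sample_entropy(rng.normal(size=500)))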

Dr. Osvaldo Anibal Rosso
Dr. Fernando Montani
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1800 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Published Papers (3 papers)

Research

Article
Two-Dimensional EspEn: A New Approach to Analyze Image Texture by Irregularity
Entropy 2021, 23(10), 1261; https://doi.org/10.3390/e23101261 - 28 Sep 2021
Abstract
Image processing has played a relevant role in various industries, where the main challenge is to extract specific features from images. In particular, texture characterizes the spatial distribution of pixel intensities as a recurring pattern and has been applied in classification and segmentation tasks. Several feature extraction methods have therefore been proposed in recent decades, but few of them rely on entropy, which is a measure of uncertainty, and entropy algorithms have been little explored for two-dimensional data. Nevertheless, there is growing interest in developing algorithms that overcome current limits, since Shannon entropy does not consider spatial information and SampEn2D generates unreliable values for small sizes. We introduce a new algorithm, EspEn (Espinosa Entropy), to measure the irregularity present in two-dimensional data; the calculation requires setting three parameters: m (length of the square window), r (tolerance threshold), and ρ (percentage of similarity). Three experiments were performed: the first two on simulated images contaminated with different noise levels, and the last on grayscale images from the Normalized Brodatz Texture database (NBT). First, we compared the performance of EspEn against Shannon entropy and SampEn2D. Second, we evaluated the dependence of EspEn on variations of the parameters m, r, and ρ. Third, we evaluated the EspEn algorithm on NBT images. The results revealed that EspEn can discriminate images with different sizes and degrees of noise. Finally, EspEn provides an alternative algorithm to quantify the irregularity in 2D data; the recommended parameters for better performance are m = 3, r = 20, and ρ = 0.7.
(This article belongs to the Special Issue Information Theoretic Measures and Their Applications II)

Article
Estimating Phase Amplitude Coupling between Neural Oscillations Based on Permutation and Entropy
Entropy 2021, 23(8), 1070; https://doi.org/10.3390/e23081070 - 18 Aug 2021
Abstract
Cross-frequency phase–amplitude coupling (PAC) plays an important role in neuronal oscillation networks, reflecting the interaction between the phase of low-frequency oscillations (LFO) and the amplitude of high-frequency oscillations (HFO). We applied four methods based on permutation analysis to measure PAC: multiscale permutation mutual information (MPMI), permutation conditional mutual information (PCMI), symbolic joint entropy (SJE), and weighted-permutation mutual information (WPMI). To verify the ability of these four algorithms, a performance test covering the effects of coupling strength, signal-to-noise ratio (SNR), and data length was carried out on simulated data. The performance of SJE was similar to that of the other approaches when measuring PAC strength, but its computational efficiency was the highest among the four methods. Moreover, SJE can also accurately identify the PAC frequency range under the interference of spike noise. All in all, the results demonstrate that SJE is better suited for evaluating PAC between neural oscillations.
(This article belongs to the Special Issue Information Theoretic Measures and Their Applications II)

Article
From Continuous-Time Chaotic Systems to Pseudo Random Number Generators: Analysis and Generalized Methodology
Entropy 2021, 23(6), 671; https://doi.org/10.3390/e23060671 - 26 May 2021
Abstract
The use of chaotic systems in electronics, such as Pseudo-Random Number Generators (PRNGs), is very appealing. Among them, continuous-time systems are used less because, in addition to having strong temporal correlations, they require further computations to obtain the discrete solutions. Here, the selection of the time step and the discretization method is first studied by conducting a detailed analysis of their effect on the systems’ statistical and chaotic behavior. We employ an approach based on interpreting the time step as a parameter of the new “maps”. From our analysis, it follows that two goals should be achieved to use them as PRNGs: (i) to keep the chaotic oscillation and (ii) to destroy the inner and temporal correlations. We then propose a simple methodology to achieve chaos-based PRNGs with good statistical characteristics and high throughput, which can be applied to any continuous-time chaotic system. We analyze the generated sequences by means of quantifiers based on information theory (permutation entropy, permutation complexity, and the causal entropy × complexity plane). We show that the proposed PRNG generates sequences that successfully pass the Marsaglia Diehard and NIST (National Institute of Standards and Technology) tests. Finally, we show that its hardware implementation requires very few resources.
(This article belongs to the Special Issue Information Theoretic Measures and Their Applications II)
