
Measures of Information II

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Information Theory, Probability and Statistics".

Deadline for manuscript submissions: closed (30 September 2022) | Viewed by 19522

Special Issue Editor


Prof. Dr. Maria Longobardi
Guest Editor
Dipartimento di Biologia, Università di Napoli Federico II, 80126 Naples, NA, Italy
Interests: stochastic orders; reliability theory; measures of discrimination (in particular entropy, extropies, inaccuracy, Kullback–Leibler divergence); coherent systems; inference

Special Issue Information

Dear Colleagues,

How important is uncertainty in human life? Arguably, an existence in which everything is deterministic would not be worth living.

In 1948, while working at Bell Telephone Laboratories, Claude Shannon developed the general concept of entropy, a “measure of uncertainty” that became a fundamental cornerstone of information theory, arising from the idea of quantifying how much information a message contains. In his paper “A Mathematical Theory of Communication”, he set out to quantify mathematically the statistical nature of “lost information” in phone-line signals.

Entropy in information theory is directly analogous to entropy in statistical thermodynamics.

In information theory, the entropy of a random variable is the average level of “information”, “uncertainty”, or “surprise” inherent in the variable’s possible outcomes.

Entropy was originally part of Shannon’s theory of communication, in which a data communication system is composed of three elements: a source of data, a communication channel, and a receiver. In Shannon’s theory, the “fundamental problem of communication” is for the receiver to be able to identify what data were generated by the source, based on the signal it receives through the channel. Thus, the basic idea is that the “informational value” of a communicated message depends on the degree to which its content is surprising.

Entropy is also relevant to other areas of mathematics. Its definition can be derived from a set of axioms establishing that entropy should measure how “surprising” the average outcome of a variable is. For a continuous random variable, differential entropy plays the analogous role.
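For a discrete random variable X with probability mass function p, and for an absolutely continuous random variable with density f, these two notions read, in standard notation,

H(X) = -\sum_{x} p(x) \log p(x), \qquad h(X) = -\int_{-\infty}^{+\infty} f(x) \log f(x)\, dx.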

If an event is very probable, it is unsurprising when that event happens as expected; hence, transmission of such a message carries very little new information. However, if an event is unlikely to occur, it is much more informative to learn that the event has happened or will happen.
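As a minimal illustration of this principle (an aside added here; the function names are ours, and base-2 logarithms are used so information is measured in bits), the following Python snippet computes the self-information of a single event and Shannon entropy as its average over a distribution:

import math

# Illustrative only: self-information -log2(p) and entropy as its average.
def self_information(p):
    """Information, in bits, gained by observing an event of probability p."""
    return -math.log2(p)

print(self_information(0.99))  # ~0.014 bits: an expected event is barely informative
print(self_information(0.01))  # ~6.644 bits: a rare event is highly informative

def entropy(dist):
    """Shannon entropy: the average self-information of a distribution."""
    return sum(p * self_information(p) for p in dist if p > 0)

print(entropy([0.5, 0.5]))    # 1.0 bit: a fair coin is maximally uncertain
print(entropy([0.99, 0.01]))  # ~0.081 bits: a nearly deterministic coin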

In recent decades, several new measures of information and discrimination have been defined and studied, and many others, with applications in different fields, will certainly be introduced. This Special Issue aims to enrich the body of notions related to measures of discrimination.

Prof. Dr. Maria Longobardi
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.


Published Papers (11 papers)


Research

10 pages, 5152 KiB  
Article
EspEn Graph for the Spatial Analysis of Entropy in Images
by Ricardo Alonso Espinosa Medina
Entropy 2023, 25(1), 159; https://doi.org/10.3390/e25010159 - 12 Jan 2023
Viewed by 1346
Abstract
The quantification of entropy in images is a topic of interest with applications in agronomy, product generation, and medicine. Several algorithms have been proposed for quantifying the irregularity present in an image; however, the computational cost for large images and the reliability of measurements for small images remain open challenges. In this research we propose an algorithm, EspEn Graph, which allows the quantification and graphic representation of the irregularity present in an image, revealing where the more and less irregular textures are located. EspEn is used to calculate entropy because it gives reliable and stable measurements for small images. This allows an image to be subdivided into small sections, the entropy of each section to be calculated, and the values then converted to graphically show the regularity present in the image. In conclusion, the EspEn Graph returns information on the spatial regularity of an image containing different textures, and the average of these entropy values provides a reliable measure of the overall entropy of the image.
(This article belongs to the Special Issue Measures of Information II)
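As a hedged sketch of the windowed-entropy idea described in this abstract (this is not the authors’ EspEn measure: plain Shannon entropy of the gray-level histogram is used as a stand-in, and entropy_map, the window size, and the bin count are illustrative choices):

import numpy as np

# Not the authors' EspEn: tile an image into small windows and map a
# per-window entropy, using histogram Shannon entropy as a stand-in.
def shannon_entropy(window, bins=16):
    """Shannon entropy (bits) of a window's gray-level histogram."""
    hist, _ = np.histogram(window, bins=bins, range=(0.0, 1.0))
    p = hist / hist.sum()
    p = p[p > 0]  # drop empty bins so log2 is defined
    return float(-(p * np.log2(p)).sum())

def entropy_map(image, win=8):
    """Entropy of each non-overlapping win x win tile of a 2-D image in [0, 1]."""
    rows, cols = image.shape[0] // win, image.shape[1] // win
    out = np.empty((rows, cols))
    for i in range(rows):
        for j in range(cols):
            out[i, j] = shannon_entropy(
                image[i * win:(i + 1) * win, j * win:(j + 1) * win])
    return out

rng = np.random.default_rng(0)
img = rng.random((64, 64))   # noisy texture: high per-tile entropy
img[:32, :32] = 0.5          # flat patch: near-zero entropy
emap = entropy_map(img)
print(emap.round(2))                          # low values in the flat quadrant
print("mean entropy:", emap.mean().round(3))  # global summary, as in the abstract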

19 pages, 390 KiB  
Article
Jarzynski’s Equality and Crooks’ Fluctuation Theorem for General Markov Chains with Application to Decision-Making Systems
by Pedro Hack, Sebastian Gottwald and Daniel A. Braun
Entropy 2022, 24(12), 1731; https://doi.org/10.3390/e24121731 - 27 Nov 2022
Cited by 2 | Viewed by 1304
Abstract
We define common thermodynamic concepts purely within the framework of general Markov chains and derive Jarzynski’s equality and Crooks’ fluctuation theorem in this setup. In particular, we consider the discrete-time case, which leads to an asymmetry in the definition of work that appears in the usual formulation of Crooks’ fluctuation theorem. We show how this asymmetry can be avoided with an additional condition on the energy protocol. The general formulation in terms of Markov chains allows the results to be transferred to application areas outside of physics. Here, we discuss how this framework can be applied in the context of decision-making. This involves the definition of the relevant quantities, the assumptions needed for the different fluctuation theorems to hold, and the consideration of discrete trajectories in place of the continuous trajectories relevant in physics.
(This article belongs to the Special Issue Measures of Information II)
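For reference, in their standard thermodynamic form, Jarzynski’s equality and Crooks’ fluctuation theorem read

\langle e^{-\beta W} \rangle = e^{-\beta \Delta F}, \qquad \frac{P_F(W)}{P_R(-W)} = e^{\beta (W - \Delta F)},

where W is the work performed along a trajectory, \Delta F is the equilibrium free-energy difference, \beta is the inverse temperature, and P_F, P_R are the forward and reverse work distributions; the paper derives analogues of these identities for general Markov chains.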

25 pages, 576 KiB  
Article
Tsallis Entropy for Loss Models and Survival Models Involving Truncated and Censored Random Variables
by Vasile Preda, Silvia Dedu, Iuliana Iatan, Ioana Dănilă Cernat and Muhammad Sheraz
Entropy 2022, 24(11), 1654; https://doi.org/10.3390/e24111654 - 14 Nov 2022
Cited by 4 | Viewed by 1405
Abstract
The aim of this paper is to develop an entropy-based approach to risk assessment for actuarial models involving truncated and censored random variables, using the Tsallis entropy measure. The effects of partial insurance mechanisms, such as inflation, truncation and censoring from above, and truncation and censoring from below, upon the entropy of losses are investigated in this framework. Analytic expressions for the per-payment and per-loss entropies are obtained, and the relationships between these entropies are studied. The Tsallis entropy of losses of the right-truncated loss random variable corresponding to the per-loss risk model with a deductible d and a policy limit u is computed for the exponential, Weibull, χ2, and Gamma distributions. In this context, the properties of the resulting entropies, such as the residual loss entropy and the past loss entropy, are studied as a result of using a deductible and a policy limit, respectively. Relationships between these entropy measures are derived, and the combined effect of a deductible and a policy limit is also analyzed. By investigating residual and past entropies for survival models, the entropies of losses corresponding to the proportional hazard and proportional reversed hazard models are derived. The Tsallis entropy approach for actuarial models involving truncated and censored random variables is new and more realistic, since it allows a greater degree of flexibility and improves the modeling accuracy.
(This article belongs to the Special Issue Measures of Information II)
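For reference, the Tsallis entropy of order q (q ≠ 1) of a random variable with density f is

S_q(X) = \frac{1}{q-1}\left(1 - \int f(x)^q \, dx\right),

which recovers Shannon entropy in the limit q → 1.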

12 pages, 325 KiB  
Article
Weighted Cumulative Past Extropy and Its Inference
by Mohammad Reza Kazemi, Majid Hashempour and Maria Longobardi
Entropy 2022, 24(10), 1444; https://doi.org/10.3390/e24101444 - 11 Oct 2022
Cited by 5 | Viewed by 1254
Abstract
This paper introduces and studies a new generalization of cumulative past extropy, called weighted cumulative past extropy (WCPJ), for continuous random variables. We show that if the WCPJs of the last order statistic are equal for two distributions, then the two distributions are equal. We examine some properties of the WCPJ, and a number of inequalities providing bounds for the WCPJ are obtained. Applications to reliability theory are discussed. Finally, the empirical version of the WCPJ is considered, and a test statistic is proposed. The critical cutoff points of the test statistic are computed numerically, and the power of the test is compared to that of a number of alternative approaches. In some situations its power is superior to the rest, and in some other settings it is somewhat weaker. The simulation study shows that this test statistic can perform satisfactorily, given its simple form and the rich information content behind it.
(This article belongs to the Special Issue Measures of Information II)
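For background, the extropy of an absolutely continuous random variable X with density f, introduced as the dual of entropy, is

J(X) = -\frac{1}{2} \int_{-\infty}^{+\infty} f(x)^2 \, dx;

the cumulative and weighted variants studied in this paper are obtained, roughly speaking, by replacing the density with the distribution function and inserting a weight function.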

17 pages, 410 KiB  
Article
Stochastic Properties of Fractional Generalized Cumulative Residual Entropy and Its Extensions
by Ghadah Alomani and Mohamed Kayid
Entropy 2022, 24(8), 1041; https://doi.org/10.3390/e24081041 - 28 Jul 2022
Cited by 2 | Viewed by 1124
Abstract
The fractional generalized cumulative residual entropy (FGCRE) was introduced recently as a novel uncertainty measure that can be compared with the fractional Shannon entropy. Various properties of the FGCRE have been studied in the literature. In this paper, further results for this measure are obtained, including new representations of the FGCRE and bounds for it. We conduct a number of stochastic comparisons using this measure and identify its connections with some well-known stochastic orders and other reliability measures. We also show that the FGCRE is the Bayesian risk of the mean residual lifetime (MRL) under a suitable prior distribution function. A normalized version of the FGCRE is considered, and its properties and connections with the Lorenz curve ordering are studied. The dynamic version of the measure is considered in the context of the residual lifetime and appropriate aging paths.
(This article belongs to the Special Issue Measures of Information II)

21 pages, 1017 KiB  
Article
A Bayesian Surprise Approach in Designing Cognitive Radar for Autonomous Driving
by Yeganeh Zamiri-Jafarian and Konstantinos N. Plataniotis
Entropy 2022, 24(5), 672; https://doi.org/10.3390/e24050672 - 10 May 2022
Cited by 2 | Viewed by 2380
Abstract
This article proposes Bayesian surprise as the main methodology that drives a cognitive radar to estimate a target’s future state (i.e., velocity, distance) from noisy measurements and to execute decisions that minimize the estimation error over time. The research aims to demonstrate whether a cognitive radar, as an autonomous system, can modify its internal model (i.e., waveform parameters) to gain consecutive informative measurements based on Bayesian surprise. Assuming that the radar measurements are constructed from linear Gaussian state-space models, the paper applies Kalman filtering to perform state estimation for a simple vehicle-following scenario. According to the filter’s estimate, the sensor measures the contribution of prospective waveforms (available from the sensor profile library) to state estimation and selects the one that maximizes the expected Bayesian surprise. Numerous experiments examine the estimation performance of the proposed cognitive radar for single-target tracking in practical highway and urban driving environments. The robustness of the proposed method is compared to the state of the art for various error measures. Results indicate that Bayesian surprise outperforms its competitors with respect to the mean square relative error when one-step and multiple-step planning are considered.
(This article belongs to the Special Issue Measures of Information II)
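As a hedged illustration of the Bayesian-surprise criterion (this is not the authors’ radar pipeline: surprise is taken as the Kullback–Leibler divergence from prior to posterior, in closed form for one-dimensional Gaussians, and all numbers are invented):

import math

# Illustrative only: Bayesian surprise as KL(posterior || prior) for
# one-dimensional Gaussian beliefs, using the standard closed form.
def kl_gaussian(mu_p, sigma_p, mu_q, sigma_q):
    """KL( N(mu_p, sigma_p^2) || N(mu_q, sigma_q^2) ), in nats."""
    return (math.log(sigma_q / sigma_p)
            + (sigma_p ** 2 + (mu_p - mu_q) ** 2) / (2 * sigma_q ** 2)
            - 0.5)

prior = (20.0, 4.0)            # belief about target velocity: mean 20 m/s, std 4
posterior_dull = (20.5, 3.8)   # a measurement that barely moves the belief
posterior_sharp = (27.0, 1.5)  # a measurement that shifts and sharpens the belief

print(kl_gaussian(*posterior_dull, *prior))   # ~0.01 nats: low surprise
print(kl_gaussian(*posterior_sharp, *prior))  # ~2.08 nats: high surprise

A waveform-selection rule of the kind described above would then prefer the configuration whose expected surprise is largest.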

15 pages, 311 KiB  
Article
A Generalized Measure of Cumulative Residual Entropy
by Sudheesh Kumar Kattumannil, E. P. Sreedevi and Narayanaswamy Balakrishnan
Entropy 2022, 24(4), 444; https://doi.org/10.3390/e24040444 - 23 Mar 2022
Cited by 4 | Viewed by 2189
Abstract
In this work, we introduce a generalized measure of cumulative residual entropy and study its properties. We show that several existing measures of entropy, such as cumulative residual entropy, weighted cumulative residual entropy and cumulative residual Tsallis entropy, are all special cases of this generalized cumulative residual entropy. We also propose a measure of generalized cumulative entropy, which includes cumulative entropy, weighted cumulative entropy and cumulative Tsallis entropy as special cases. We discuss a generating-function approach through which different entropy measures can be derived. We provide residual and cumulative versions of Sharma–Taneja–Mittal entropy and obtain them as special cases of this generalized measure of entropy. Finally, using the newly introduced entropy measures, we establish some relationships between entropy and extropy measures.
(This article belongs to the Special Issue Measures of Information II)
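For reference, the cumulative residual entropy of a nonnegative random variable X with survival function F̄(x) = P(X > x) is

\mathcal{E}(X) = -\int_0^{\infty} \bar{F}(x) \log \bar{F}(x) \, dx,

and the generalizations discussed above are obtained, broadly speaking, by modifying the integrand (weights, Tsallis-type deformations) within this same template.
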
18 pages, 3041 KiB  
Article
Cumulative Residual q-Fisher Information and Jensen-Cumulative Residual χ2 Divergence Measures
by Omid Kharazmi, Narayanaswamy Balakrishnan and Hassan Jamali
Entropy 2022, 24(3), 341; https://doi.org/10.3390/e24030341 - 27 Feb 2022
Cited by 6 | Viewed by 1752
Abstract
In this work, we define cumulative residual q-Fisher (CRQF) information measures for the survival function (SF) of the underlying random variables as well as for the model parameter. We also propose the q-hazard rate (QHR) function, defined via the q-logarithmic function, as a new extension of the hazard rate function. We show that the CRQF information measure can be expressed in terms of the QHR function. We further define generalized cumulative residual χ2 divergence measures between two SFs. We then examine the cumulative residual q-Fisher information for two well-known mixture models, and the corresponding results reveal some interesting connections between the cumulative residual q-Fisher information and the generalized cumulative residual χ2 divergence measures. Further, we define the Jensen-cumulative residual χ2 (JCR-χ2) measure and a parametric version of the Jensen-cumulative residual Fisher information measure, and then discuss their properties and interconnections. Finally, for illustrative purposes, we examine a real example from image processing and provide some numerical results in terms of the CRQF information measure.
(This article belongs to the Special Issue Measures of Information II)

15 pages, 446 KiB  
Article
Weighted Relative Group Entropies and Associated Fisher Metrics
by Iulia-Elena Hirica, Cristina-Liliana Pripoae, Gabriel-Teodor Pripoae and Vasile Preda
Entropy 2022, 24(1), 120; https://doi.org/10.3390/e24010120 - 13 Jan 2022
Cited by 6 | Viewed by 1498
Abstract
A large family of new α-weighted group entropy functionals is defined, and the associated Fisher-like metrics are considered. All these notions are well-suited semi-Riemannian tools for the geometrization of entropy-related statistical models, where they may act as sensitive controlling invariants. The main result of the paper establishes a link between such a metric and a canonical one. A sufficient condition is found for the two metrics to be conformal (or homothetic). In particular, we recover a recent result established for α=1 and for non-weighted relative group entropies. Our conformality condition is “universal”, in the sense that it does not depend on the group exponential.
(This article belongs to the Special Issue Measures of Information II)

16 pages, 361 KiB  
Article
On a 2-Relative Entropy
by James Fullwood
Entropy 2022, 24(1), 74; https://doi.org/10.3390/e24010074 - 31 Dec 2021
Cited by 1 | Viewed by 1268
Abstract
We construct a 2-categorical extension of the relative entropy functor of Baez and Fritz, and show that our construction is functorial with respect to vertical morphisms. Moreover, we show that such a ‘2-relative entropy’ satisfies natural 2-categorical analogues of convex linearity, vanishing under optimal hypotheses, and lower semicontinuity. While relative entropy is a relative measure of information between probability distributions, we view our construction as a relative measure of information between channels.
(This article belongs to the Special Issue Measures of Information II)
14 pages, 420 KiB  
Article
Kernel Estimation of Cumulative Residual Tsallis Entropy and Its Dynamic Version under ρ-Mixing Dependent Data
by Muhammed Rasheed Irshad, Radhakumari Maya, Francesco Buono and Maria Longobardi
Entropy 2022, 24(1), 9; https://doi.org/10.3390/e24010009 - 21 Dec 2021
Cited by 4 | Viewed by 2348
Abstract
Tsallis introduced a non-logarithmic generalization of Shannon entropy, namely Tsallis entropy, which is non-extensive. Sati and Gupta proposed cumulative residual information based on this non-extensive entropy measure, namely cumulative residual Tsallis entropy (CRTE), and its dynamic version, namely dynamic cumulative residual Tsallis entropy (DCRTE). In the present paper, we propose non-parametric kernel-type estimators for CRTE and DCRTE where the considered observations satisfy a ρ-mixing dependence condition. Asymptotic properties of the estimators are established under suitable regularity conditions. A numerical evaluation of the proposed estimators is exhibited, and a Monte Carlo simulation study is carried out.
(This article belongs to the Special Issue Measures of Information II)
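As a hedged sketch of the plug-in idea behind such estimators (this is not the authors’ kernel-type, ρ-mixing construction: it estimates the classical Shannon-type cumulative residual entropy from the empirical survival function of an i.i.d. sample, and empirical_cre is our name):

import numpy as np

# Illustrative only: plug-in estimate of -∫ S(x) log S(x) dx using the
# empirical survival function of an i.i.d. nonnegative sample; not the
# kernel-smoothed, dependent-data estimator studied in the paper.
def empirical_cre(sample):
    """Plug-in cumulative residual entropy estimate from an i.i.d. sample."""
    x = np.sort(np.asarray(sample))
    n = len(x)
    surv = (n - np.arange(1, n)) / n  # empirical survival on [x_(i), x_(i+1))
    gaps = np.diff(x)                 # lengths of those intervals
    mask = surv > 0                   # guard the logarithm
    return float(-(surv[mask] * np.log(surv[mask]) * gaps[mask]).sum())

rng = np.random.default_rng(1)
sample = rng.exponential(scale=2.0, size=10_000)
print(empirical_cre(sample))  # for an exponential law, CRE equals the mean: ~2.0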
