Special Issue "Measures of Information II"

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Information Theory, Probability and Statistics".

Deadline for manuscript submissions: 18 March 2022.

Special Issue Editor

Prof. Dr. Maria Longobardi
Guest Editor
Dipartimento di Biologia, Università di Napoli Federico II, 80126 Napoli NA, Italy
Interests: stochastic orders; reliability theory; measures of discrimination (in particular entropy, extropies, inaccuracy, Kullback-Leibler); coherent systems; inference

Special Issue Information

Dear Colleagues,

How important is uncertainty in the life of a human being? Certainly, an existence in which everything were deterministic would hardly be worth living.

In 1948, Claude Shannon introduced the general concept of entropy, a “measure of uncertainty” and a fundamental cornerstone of information theory, arising from the idea of quantifying how much information a message carries. In his paper “A Mathematical Theory of Communication”, written while working at Bell Telephone Laboratories, he set out to quantify mathematically the statistical nature of “lost information” in phone-line signals.

Entropy in information theory is directly analogous to entropy in statistical thermodynamics.

In information theory, the entropy of a random variable is the average level of “information”, “uncertainty”, or “surprise” inherent in the variable’s possible outcomes.
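Concretely, this average can be computed directly from a probability mass function. Below is a minimal sketch in Python (the function name and example values are illustrative, not from the sources above):

```python
import math

def shannon_entropy(probs, base=2.0):
    """Shannon entropy H(X) = -sum_i p_i * log(p_i): the expected
    "surprise" -log p(x) over the variable's possible outcomes."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin is maximally uncertain among two-outcome variables:
print(shannon_entropy([0.5, 0.5]))    # 1.0 bit
# A nearly deterministic variable carries almost no surprise:
print(shannon_entropy([0.99, 0.01]))  # ~0.081 bits
```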

Entropy was originally part of Shannon’s theory of communication, in which a data communication system is composed of three elements: a source of data, a communication channel, and a receiver. In Shannon’s theory, the “fundamental problem of communication” is for the receiver to be able to identify what data were generated by the source, based on the signal it receives through the channel. Thus, the basic idea is that the “informational value” of a communicated message depends on the degree to which its content is surprising.

Entropy is also relevant to other areas of mathematics. Its definition can be derived from a set of axioms establishing that entropy should measure how “surprising” the average outcome of a variable is. For a continuous random variable, the analogous quantity is the differential entropy.
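For reference, the two quantities compared here, written in the standard notation:

```latex
% Shannon entropy of a discrete random variable X with pmf (p_i):
H(X) = -\sum_i p_i \log p_i
% Differential entropy of a continuous random variable X with density f:
h(X) = -\int_{-\infty}^{\infty} f(x)\,\log f(x)\,dx
```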

If an event is very probable, there is little interest when it happens as expected; hence, transmission of such a message carries very little new information. However, if an event is unlikely to occur, learning that it has happened (or will happen) is much more informative.
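In standard notation, the informational value of a single outcome is its self-information (surprisal), and a quick worked comparison makes the asymmetry explicit:

```latex
% Self-information of an outcome with probability p, in bits:
I(p) = -\log_2 p
% A near-certain event is barely informative; a rare one is highly so:
I(0.99) = -\log_2 0.99 \approx 0.014 \text{ bits}, \qquad
I(0.01) = -\log_2 0.01 \approx 6.64 \text{ bits}
```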

Over the last decades, several new measures of information and discrimination have been defined and studied, and many more, with applications in different fields, will surely be introduced. This Special Issue aims to enrich the body of notions related to measures of information and discrimination.

Prof. Dr. Maria Longobardi
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to the website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1800 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Published Papers (3 papers)


Research

Article
Weighted Relative Group Entropies and Associated Fisher Metrics
Entropy 2022, 24(1), 120; https://0-doi-org.brum.beds.ac.uk/10.3390/e24010120 - 13 Jan 2022
Abstract
A large family of new α-weighted group entropy functionals is defined, and associated Fisher-like metrics are considered. All these notions are well-suited semi-Riemannian tools for the geometrization of entropy-related statistical models, where they may act as sensitive controlling invariants. The main result of the paper establishes a link between such a metric and a canonical one. A sufficient condition is found for the two metrics to be conformal (or homothetic). In particular, we recover a recent result, established for α=1 and for non-weighted relative group entropies. Our conformality condition is “universal”, in the sense that it does not depend on the group exponential.
(This article belongs to the Special Issue Measures of Information II)
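For orientation only (this is classical background, not the paper’s construction): the “Fisher-like” metrics above generalize the standard Fisher information metric on a parametric statistical model p(x; θ):

```latex
% Fisher information metric on a family p(x; \theta), \theta = (\theta^1, \dots, \theta^n):
g_{ij}(\theta) = \mathbb{E}_\theta\!\left[
  \frac{\partial \log p(X;\theta)}{\partial \theta^i}\,
  \frac{\partial \log p(X;\theta)}{\partial \theta^j} \right]
```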

Article
On a 2-Relative Entropy
Entropy 2022, 24(1), 74; https://0-doi-org.brum.beds.ac.uk/10.3390/e24010074 - 31 Dec 2021
Abstract
We construct a 2-categorical extension of the relative entropy functor of Baez and Fritz, and show that our construction is functorial with respect to vertical morphisms. Moreover, we show that such a ‘2-relative entropy’ satisfies natural 2-categorical analogues of convex linearity, vanishing under optimal hypotheses, and lower semicontinuity. While relative entropy is a relative measure of information between probability distributions, we view our construction as a relative measure of information between channels.
(This article belongs to the Special Issue Measures of Information II)
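For context, the ordinary (1-categorical) relative entropy characterized by Baez and Fritz is the classical Kullback–Leibler divergence; for probability distributions p and q on a finite set it reads:

```latex
% Relative entropy (Kullback-Leibler divergence) of p with respect to q,
% finite when the support of p is contained in that of q:
D(p \,\|\, q) = \sum_i p_i \log \frac{p_i}{q_i}
```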
Article
Kernel Estimation of Cumulative Residual Tsallis Entropy and Its Dynamic Version under ρ-Mixing Dependent Data
Entropy 2022, 24(1), 9; https://0-doi-org.brum.beds.ac.uk/10.3390/e24010009 - 21 Dec 2021
Abstract
Tsallis introduced a non-logarithmic generalization of Shannon entropy, namely Tsallis entropy, which is non-extensive. Sati and Gupta proposed a cumulative residual information measure based on this non-extensive entropy, namely the cumulative residual Tsallis entropy (CRTE), together with its dynamic version, the dynamic cumulative residual Tsallis entropy (DCRTE). In the present paper, we propose non-parametric kernel-type estimators for CRTE and DCRTE, where the observations satisfy a ρ-mixing dependence condition. Asymptotic properties of the estimators are established under suitable regularity conditions. A numerical evaluation of the proposed estimators is presented, and a Monte Carlo simulation study is carried out.
(This article belongs to the Special Issue Measures of Information II)
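To make CRTE concrete: for a nonnegative random variable with survival function F̄, the Sati–Gupta measure is commonly written as η_α(X) = (1 − ∫₀^∞ F̄(x)^α dx)/(α − 1) for α > 0, α ≠ 1. Below is a minimal plug-in sketch using the empirical survival function; this is an illustrative estimator under i.i.d.-style assumptions, not the kernel estimator for ρ-mixing data studied in the paper:

```python
import numpy as np

def crte_plugin(sample, alpha):
    """Plug-in estimate of cumulative residual Tsallis entropy,
    eta_alpha = (1 - integral_0^inf Fbar(x)**alpha dx) / (alpha - 1),
    with Fbar replaced by the empirical survival function.
    Assumes a nonnegative sample and alpha > 0, alpha != 1."""
    x = np.sort(np.asarray(sample, dtype=float))
    n = x.size
    edges = np.concatenate(([0.0], x))   # integration breakpoints
    widths = np.diff(edges)              # lengths of the n intervals
    surv = (n - np.arange(n)) / n        # Fbar on each interval: 1, (n-1)/n, ...
    integral = np.sum(widths * surv ** alpha)
    return (1.0 - integral) / (alpha - 1.0)

# Sanity check: for Exp(1) and alpha = 2 the true value is
# (1 - 1/2) / (2 - 1) = 0.5.
rng = np.random.default_rng(0)
print(crte_plugin(rng.exponential(size=10_000), alpha=2.0))
```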
