
Rényi Entropy: Sixty Years Later

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Information Theory, Probability and Statistics".

Deadline for manuscript submissions: closed (28 February 2023) | Viewed by 13661

Special Issue Editors


Guest Editor
Department of Applied Physics and Astronomy, University of Sharjah, P.O. Box 27272, Sharjah, United Arab Emirates
Interests: quantum ring; Rényi entropy; Tsallis entropy; magnetic field; Aharonov-Bohm effect; mathematical physics

Guest Editor
Institute of Fundamental and Frontier Sciences, University of Electronic Science and Technology of China, Chengdu 611731, China
Interests: uncertainty measure; Shannon entropy; Tsallis entropy; Renyi entropy; Deng entropy; evidence theory; fuzzy sets; fractal; complex network; time series

Special Issue Information

Dear Colleagues,

Rényi entropy, named after the Hungarian mathematician Alfréd Rényi who introduced the measure in 1960, recently celebrated its sixtieth anniversary. This measure of uncertainty was conceived as a one-parameter generalization of its Shannon counterpart; in particular, one of the main requirements was additivity. During these decades, impressive progress was made in studying the properties of Rényi entropy and its applications in physics, mathematics, astronomy, computer science, engineering, medicine, materials science and many other disciplines. When compared to the human lifespan, Rényi entropy is approaching (or, in some countries, has already reached) the age of retirement. From a scientific point of view, however, it remains a rapidly developing field of vibrant exploration, with a fast-growing number of researchers scrutinizing the advantages it has to offer.
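As a reminder of the definitions involved: the Rényi entropy of order α is H_α(p) = (1/(1−α)) log Σᵢ pᵢ^α, which recovers the Shannon entropy as α → 1 and is additive over independent systems. A minimal numerical sketch (the distributions below are arbitrary examples):

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Rényi entropy H_alpha(p) = log(sum_i p_i**alpha) / (1 - alpha), in nats."""
    p = np.asarray(p, dtype=float)
    if np.isclose(alpha, 1.0):
        return -np.sum(p * np.log(p))   # alpha -> 1 limit: Shannon entropy
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.6, 0.4])
joint = np.outer(p, q).ravel()          # joint distribution of independent systems

for alpha in (0.5, 1.0, 2.0):
    lhs = renyi_entropy(joint, alpha)
    rhs = renyi_entropy(p, alpha) + renyi_entropy(q, alpha)
    assert np.isclose(lhs, rhs)         # additivity: H(P x Q) = H(P) + H(Q)
```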

For this Special Issue, original contributions are invited that present the most recent developments and applications of Rényi entropy, as well as its derivatives and generalizations, in all of the aforementioned disciplines and any other fields of human activity. Tentative topics include, but are by no means limited to:

  • Mathematical and physical foundations of Rényi entropy and related measures;
  • Rényi entropy perspectives in quantum information processing;
  • Rényi entropy and machine learning;
  • Multidisciplinary applications of Rényi entropy.

Dr. Oleg Olendski
Prof. Dr. Yong Deng
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to the website. Once registered, you can proceed to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and written in good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Published Papers (8 papers)


Research


18 pages, 526 KiB  
Article
Towards More Efficient Rényi Entropy Estimation
by Maciej Skorski
Entropy 2023, 25(2), 185; https://0-doi-org.brum.beds.ac.uk/10.3390/e25020185 - 17 Jan 2023
Viewed by 819
Abstract
Estimation of Rényi entropy is of fundamental importance to many applications in cryptography, statistical inference, and machine learning. This paper aims to improve the existing estimators with regard to: (a) the sample size, (b) the estimator adaptiveness, and (c) the simplicity of the analyses. The contribution is a novel analysis of the generalized “birthday paradox” collision estimator. The analysis is simpler than in prior works, gives clear formulas, and strengthens existing bounds. The improved bounds are used to develop an adaptive estimation technique that outperforms previous methods, particularly in regimes of low or moderate entropy. Last but not least, to demonstrate that the developed techniques are of broader interest, a number of applications concerning theoretical and practical properties of “birthday estimators” are discussed.
(This article belongs to the Special Issue Rényi Entropy: Sixty Years Later)
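The “birthday paradox” idea behind collision estimators can be sketched for the order-2 case: the probability that two independent samples coincide equals Σᵢ pᵢ², so H₂ = −log Σᵢ pᵢ² can be estimated from the fraction of colliding pairs in a sample. A minimal illustration (not the paper's improved adaptive estimator; the distribution and sample size below are arbitrary):

```python
import math
import random
from collections import Counter

def collision_entropy_estimate(samples):
    """Estimate H2 = -log(sum_i p_i^2) from the rate of colliding sample pairs."""
    n = len(samples)
    counts = Counter(samples)
    # Unbiased estimate of the collision probability sum_i p_i^2:
    # the fraction of equal unordered pairs among the n samples.
    coll = sum(c * (c - 1) for c in counts.values()) / (n * (n - 1))
    return -math.log(coll)

random.seed(0)
p = [0.5, 0.25, 0.125, 0.125]          # true sum_i p_i^2 = 0.34375
samples = random.choices(range(4), weights=p, k=200_000)
est = collision_entropy_estimate(samples)
true_h2 = -math.log(0.34375)           # about 1.068 nats
assert abs(est - true_h2) < 0.05
```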

13 pages, 2865 KiB  
Article
Residual and Past Discrete Tsallis and Renyi Extropy with an Application to Softmax Function
by Taghreed M. Jawa, Nahid Fatima, Neveen Sayed-Ahmed, Ramy Aldallal and Mohamed Said Mohamed
Entropy 2022, 24(12), 1732; https://0-doi-org.brum.beds.ac.uk/10.3390/e24121732 - 27 Nov 2022
Cited by 2 | Viewed by 1004
Abstract
In this paper, based on the discrete lifetime distribution, the residual and past of the Tsallis and Renyi extropy are introduced as new measures of information. Moreover, some of their properties and their relation to other measures are discussed. Furthermore, an example of a uniform distribution of the obtained models is given. Moreover, the softmax function can be used as a discrete probability distribution function with a unity sum. Thus, applying those measures to the softmax function for simulated and real data is demonstrated. Besides, for real data, the softmax data are fit to a convenient ARIMA model.
(This article belongs to the Special Issue Rényi Entropy: Sixty Years Later)

9 pages, 288 KiB  
Article
Rényi Cross-Entropy Measures for Common Distributions and Processes with Memory
by Ferenc Cole Thierrin, Fady Alajaji and Tamás Linder
Entropy 2022, 24(10), 1417; https://0-doi-org.brum.beds.ac.uk/10.3390/e24101417 - 04 Oct 2022
Cited by 2 | Viewed by 1169
Abstract
Two Rényi-type generalizations of the Shannon cross-entropy, the Rényi cross-entropy and the Natural Rényi cross-entropy, were recently used as loss functions for the improved design of deep learning generative adversarial networks. In this work, we derive the Rényi and Natural Rényi differential cross-entropy measures in closed form for a wide class of common continuous distributions belonging to the exponential family, and we tabulate the results for ease of reference. We also summarise the Rényi-type cross-entropy rates between stationary Gaussian processes and between finite-alphabet time-invariant Markov sources.
(This article belongs to the Special Issue Rényi Entropy: Sixty Years Later)
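For discrete distributions, one form of the Rényi cross-entropy appearing in the literature is H_α(P; Q) = (1/(1−α)) log Σᵢ pᵢ qᵢ^{α−1}, which recovers the Shannon cross-entropy as α → 1. A minimal sketch of this discrete form (the distributions are arbitrary; the continuous exponential-family cases treated in the paper are analogous in integral form):

```python
import numpy as np

def renyi_cross_entropy(p, q, alpha):
    """Rényi cross-entropy of order alpha, in nats:
    (1 / (1 - alpha)) * log(sum_i p_i * q_i**(alpha - 1))."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    if np.isclose(alpha, 1.0):
        return -np.sum(p * np.log(q))   # Shannon cross-entropy in the limit
    return np.log(np.sum(p * q ** (alpha - 1))) / (1.0 - alpha)

p = np.array([0.7, 0.2, 0.1])
q = np.array([0.5, 0.3, 0.2])
shannon = renyi_cross_entropy(p, q, 1.0)
# Continuity at alpha = 1: nearby orders approach the Shannon cross-entropy.
assert abs(renyi_cross_entropy(p, q, 1.0001) - shannon) < 1e-3
```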
12 pages, 1053 KiB  
Article
Rényi Entropy, Signed Probabilities, and the Qubit
by Adam Brandenburger, Pierfrancesco La Mura and Stuart Zoble
Entropy 2022, 24(10), 1412; https://0-doi-org.brum.beds.ac.uk/10.3390/e24101412 - 03 Oct 2022
Cited by 3 | Viewed by 1646
Abstract
The states of the qubit, the basic unit of quantum information, are 2 × 2 positive semi-definite Hermitian matrices with trace 1. We contribute to the program to axiomatize quantum mechanics by characterizing these states in terms of an entropic uncertainty principle formulated on an eight-point phase space. We do this by employing Rényi entropy (a generalization of Shannon entropy) suitably defined for the signed phase-space probability distributions that arise in representing quantum states.
(This article belongs to the Special Issue Rényi Entropy: Sixty Years Later)

9 pages, 276 KiB  
Article
Rényi Entropy in Statistical Mechanics
by Jesús Fuentes and Jorge Gonçalves
Entropy 2022, 24(8), 1080; https://0-doi-org.brum.beds.ac.uk/10.3390/e24081080 - 05 Aug 2022
Cited by 7 | Viewed by 2512
Abstract
Rényi entropy was originally introduced in the field of information theory as a parametric relaxation of Shannon (in physics, Boltzmann–Gibbs) entropy. This has also fuelled different attempts to generalise statistical mechanics, although mostly skipping the physical arguments behind this entropy and instead tending to introduce it artificially. However, as we will show, modifications to the theory of statistical mechanics are needless to see how Rényi entropy automatically arises as the average rate of change of free energy over an ensemble at different temperatures. Moreover, this notion is extended by considering distributions for isospectral, non-isothermal processes, resulting in relative versions of free energy, in which the Kullback–Leibler divergence or the relative version of Rényi entropy appear within the structure of the corrections to free energy. These generalisations of free energy recover the ordinary thermodynamic potential whenever isothermal processes are considered.
(This article belongs to the Special Issue Rényi Entropy: Sixty Years Later)
5 pages, 221 KiB  
Article
Rényi Entropy and Free Energy
by John C. Baez
Entropy 2022, 24(5), 706; https://0-doi-org.brum.beds.ac.uk/10.3390/e24050706 - 16 May 2022
Cited by 9 | Viewed by 2100
Abstract
The Rényi entropy is a generalization of the usual concept of entropy which depends on a parameter q. In fact, Rényi entropy is closely related to free energy. Suppose we start with a system in thermal equilibrium and then suddenly divide the temperature by q. Then the maximum amount of work the system can perform as it moves to equilibrium at the new temperature, divided by the change in temperature, equals the system’s Rényi entropy in its original state. This result applies to both classical and quantum systems. Mathematically, we can express this result as follows: the Rényi entropy of a system in thermal equilibrium is minus the ‘1/q-derivative’ of its free energy with respect to the temperature. This shows that Rényi entropy is a q-deformation of the usual concept of entropy.
(This article belongs to the Special Issue Rényi Entropy: Sixty Years Later)
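The relation between the order-q Rényi entropy of a Gibbs state and a finite-difference (q-deformed) derivative of the free energy, S_q(T) = −[F(T/q) − F(T)] / (T/q − T) with F(T) = −T log Z(T), can be checked numerically for a small classical system. A minimal sketch, assuming an arbitrary three-level spectrum and units with k_B = 1:

```python
import math

def free_energy(energies, T):
    """Helmholtz free energy F(T) = -T log Z(T) for a classical spectrum (k_B = 1)."""
    return -T * math.log(sum(math.exp(-E / T) for E in energies))

def renyi_entropy_gibbs(energies, T, q):
    """Order-q Rényi entropy of the Gibbs state at temperature T."""
    Z = sum(math.exp(-E / T) for E in energies)
    p = [math.exp(-E / T) / Z for E in energies]
    return math.log(sum(pi ** q for pi in p)) / (1.0 - q)

energies = [0.0, 1.0, 2.5]   # arbitrary three-level spectrum
T, q = 1.3, 2.0

direct = renyi_entropy_gibbs(energies, T, q)
# Minus the finite '1/q-derivative' of F with respect to temperature:
deformed = -(free_energy(energies, T / q) - free_energy(energies, T)) / (T / q - T)
assert math.isclose(direct, deformed, rel_tol=1e-9)
```

The identity follows from Σᵢ pᵢ^q = Z(T/q)/Z(T)^q for Gibbs weights, so the agreement here is exact up to floating-point error.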

Review


23 pages, 431 KiB  
Review
Rényi Entropies of Multidimensional Oscillator and Hydrogenic Systems with Applications to Highly Excited Rydberg States
by Jesús S. Dehesa
Entropy 2022, 24(11), 1590; https://0-doi-org.brum.beds.ac.uk/10.3390/e24111590 - 02 Nov 2022
Cited by 2 | Viewed by 1178
Abstract
The various facets of the internal disorder of quantum systems can be described by means of the Rényi entropies of their single-particle probability density according to modern density functional theory and quantum information techniques. In this work, we first show the lower and upper bounds for the Rényi entropies of general and central-potential quantum systems, as well as the associated entropic uncertainty relations. Then, the Rényi entropies of multidimensional oscillator and hydrogenic-like systems are reviewed and explicitly determined for all bound stationary position and momentum states from first principles (i.e., in terms of the potential strength, the space dimensionality and the states’ hyperquantum numbers). This is possible because the associated wavefunctions can be expressed by means of hypergeometric orthogonal polynomials. Emphasis is placed on the most extreme, non-trivial cases corresponding to the highly excited Rydberg states, where the Rényi entropies can be obtained in a remarkably simple, compact, and transparent form. Powerful asymptotic approaches of approximation theory have been used when the polynomial’s degree or the weight-function parameter(s) of the Hermite, Laguerre, and Gegenbauer polynomials have large values. At present, these special states are attracting increasing interest in quantum information and the associated quantum technologies, such as, e.g., quantum key distribution, quantum computation, and quantum metrology.
(This article belongs to the Special Issue Rényi Entropy: Sixty Years Later)
10 pages, 403 KiB  
Review
Rényi’s Entropy, Statistical Order and van der Waals Gas
by Flavia Pennini and Angelo Plastino
Entropy 2022, 24(8), 1067; https://0-doi-org.brum.beds.ac.uk/10.3390/e24081067 - 02 Aug 2022
Cited by 2 | Viewed by 1195
Abstract
The notion of statistical order derives from the disequilibrium concept introduced by López-Ruiz, Mancini, and Calbet thirty years ago. In this effort, it is shown that the disequilibrium is intimately linked to the celebrated Rényi entropy. One also explores this link in connection with the van der Waals gas description.
(This article belongs to the Special Issue Rényi Entropy: Sixty Years Later)
