Types of Entropies and Divergences with Their Applications

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Multidisciplinary Applications".

Deadline for manuscript submissions: closed (31 May 2022) | Viewed by 12131

Special Issue Editors


Prof. Dr. Nicuşor Minculete
Guest Editor
Faculty of Mathematics and Computer Science, Transilvania University of Brasov, Iuliu Maniu Street 50, 500091 Brasov, Romania
Interests: inequalities; generalized entropies; Euclidean geometry; operator theory

Prof. Dr. Shigeru Furuichi
Guest Editor
Department of Information Science, College of Humanities and Sciences, Nihon University, Setagaya-ku, Tokyo 156-8550, Japan
Interests: generalized entropies; inequalities; matrix analysis; operator theory

Special Issue Information

Entropy is an important concept in many fields related to communications. Entropy was originally introduced by Shannon as part of his theory of communication, in which a data communication system is composed of three elements: a source of data, a communication channel, and a receiver. Many types of entropies and divergences have been studied in the literature. The theory of entropy is a long-standing topic in many areas of mathematics, yet it remains an attractive research domain with many applications. The research results presented in this Special Issue concern the properties of different types of entropies and divergences, highlight their applications, and promote the exchange of ideas between mathematicians from many parts of the world. Entropies quantify the diversity, uncertainty, and randomness of a system. Many important types of entropies and divergences have applications in statistical mechanics, operator theory, network theory, quantum information theory, statistics, etc. For example, the Rényi entropy has been of great importance in statistics, ecology, and theoretical computer science.
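
As a point of orientation for the quantities mentioned above, the Shannon entropy of a discrete distribution p is H(p) = −Σ p_i log p_i, and the Rényi entropy of order α ≠ 1 is H_α(p) = (1/(1−α)) log Σ p_i^α, which recovers the Shannon entropy as α → 1. The following minimal Python sketch (illustrative only, not drawn from any paper in this issue) computes both:

import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum(p_i * log2(p_i)), in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def renyi_entropy(p, alpha):
    """Rényi entropy of order alpha (alpha > 0, alpha != 1), in bits.
    Approaches the Shannon entropy as alpha -> 1."""
    return math.log2(sum(pi ** alpha for pi in p)) / (1.0 - alpha)

p = [0.5, 0.25, 0.125, 0.125]
print(shannon_entropy(p))       # 1.75 bits
print(renyi_entropy(p, 1.001))  # ~1.75 bits, near the Shannon limit
print(renyi_entropy(p, 2.0))    # collision entropy, ~1.54 bits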

Please note that all submitted papers should be within the scope of the journal.

Prof. Dr. Nicuşor Minculete
Prof. Dr. Shigeru Furuichi
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Shannon entropy
  • Rényi entropy
  • generalized entropies
  • Tsallis divergence
  • Rényi divergence
  • Kullback–Leibler divergence
  • Jeffreys–Tsallis divergence
  • Jensen–Shannon–Tsallis divergence
  • Csiszár f-divergence

Published Papers (6 papers)

Editorial

3 pages, 181 KiB  
Editorial
Types of Entropies and Divergences with Their Applications
by Nicuşor Minculete and Shigeru Furuichi
Entropy 2023, 25(2), 198; https://0-doi-org.brum.beds.ac.uk/10.3390/e25020198 - 19 Jan 2023
Viewed by 1092
Abstract
Entropy is an important concept in many fields related to communications [...]
(This article belongs to the Special Issue Types of Entropies and Divergences with Their Applications)

Research

23 pages, 382 KiB  
Article
Are Guessing, Source Coding and Tasks Partitioning Birds of a Feather?
by M. Ashok Kumar, Albert Sunny, Ashish Thakre, Ashisha Kumar and G. Dinesh Manohar
Entropy 2022, 24(11), 1695; https://0-doi-org.brum.beds.ac.uk/10.3390/e24111695 - 19 Nov 2022
Cited by 1 | Viewed by 1586
Abstract
This paper establishes a close relationship among four information-theoretic problems, namely the Campbell source coding, Arıkan guessing, Huleihel et al. memoryless guessing, and Bunte–Lapidoth tasks partitioning problems in the IID lossless case. We first show that the aforementioned problems are mathematically related via a general moment minimization problem whose optimum solution is given in terms of the Rényi entropy. We then propose a general framework for the mismatched version of these problems and establish all the asymptotic results using this framework. The unified framework further enables us to study a variant of the Bunte–Lapidoth tasks partitioning problem which is practically more appealing. In addition, this variant turns out to be a generalization of the Arıkan guessing problem. Finally, with the help of this general framework, we establish an equivalence among all these problems, in the sense that knowing an asymptotically optimal solution in one problem helps us find the same in all the others.
(This article belongs to the Special Issue Types of Entropies and Divergences with Their Applications)
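
To make the Rényi-entropy connection concrete: in the matched IID-lossless setting that this paper generalizes, Arıkan's classical single-letter bounds state that the optimal ρ-th guessing moment satisfies (1 + ln M)^(−ρ) · S ≤ E[G^ρ] ≤ S, where S = (Σ p_i^(1/(1+ρ)))^(1+ρ) = exp(ρ H_{1/(1+ρ)}(p)) and M is the support size. The sketch below assumes these classical bounds (it does not reproduce the paper's mismatched framework) and checks them numerically:

import math

def guessing_moment(p, rho):
    """E[G^rho] for the optimal guesser, which tries candidates in
    decreasing order of probability."""
    ordered = sorted(p, reverse=True)
    return sum((i + 1) ** rho * pi for i, pi in enumerate(ordered))

def renyi_term(p, rho):
    """S = (sum(p_i^(1/(1+rho))))^(1+rho) = exp(rho * H_{1/(1+rho)}(p))."""
    a = 1.0 / (1.0 + rho)
    return sum(pi ** a for pi in p) ** (1.0 + rho)

p, rho = [0.4, 0.3, 0.2, 0.1], 1.0
S = renyi_term(p, rho)
lower = S / (1.0 + math.log(len(p))) ** rho  # Arıkan's lower bound
print(lower, guessing_moment(p, rho), S)     # ~1.58 <= 2.0 <= ~3.78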

10 pages, 366 KiB  
Article
Tight and Scalable Side-Channel Attack Evaluations through Asymptotically Optimal Massey-like Inequalities on Guessing Entropy
by Andrei Tănăsescu, Marios O. Choudary, Olivier Rioul and Pantelimon George Popescu
Entropy 2021, 23(11), 1538; https://0-doi-org.brum.beds.ac.uk/10.3390/e23111538 - 18 Nov 2021
Cited by 4 | Viewed by 1677
Abstract
The bounds presented at CHES 2017 based on Massey's guessing entropy represent the most scalable side-channel security evaluation method to date. In this paper, we present an improvement of this method by determining the asymptotically optimal Massey-like inequality and then further refining it for finite-support distributions. The impact of these results is highlighted for side-channel attack evaluations, demonstrating the improvements over the CHES 2017 bounds.
(This article belongs to the Special Issue Types of Entropies and Divergences with Their Applications)
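
For context: the guessing entropy of a secret X is G(X) = Σ i · p_(i), the expected number of guesses when candidates are tried in decreasing order of probability, and Massey's original inequality bounds it from below by G(X) ≥ 2^(H(X)−2) + 1 whenever the Shannon entropy H(X) is at least 2 bits. The sketch below uses these classical forms, not the refined Massey-like inequalities derived in the paper, and the candidate scores are hypothetical:

import math

def shannon_entropy(p):
    """Shannon entropy in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def guessing_entropy(p):
    """G(X) = sum_i i * p_(i), with probabilities sorted in decreasing order."""
    ordered = sorted(p, reverse=True)
    return sum((i + 1) * pi for i, pi in enumerate(ordered))

# Hypothetical side-channel scores over 16 key candidates.
p = [0.3] + [0.7 / 15] * 15
H = shannon_entropy(p)
massey = 2 ** (H - 2) + 1 if H >= 2 else float("nan")  # valid for H >= 2 bits
print(H, massey, guessing_entropy(p))  # bound ~4.07 <= G(X) ~6.6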

10 pages, 1911 KiB  
Article
Information Theory Based Evaluation of the RC4 Stream Cipher Outputs
by Evaristo José Madarro-Capó, Carlos Miguel Legón-Pérez, Omar Rojas and Guillermo Sosa-Gómez
Entropy 2021, 23(7), 896; https://0-doi-org.brum.beds.ac.uk/10.3390/e23070896 - 14 Jul 2021
Cited by 3 | Viewed by 2281
Abstract
This paper presents a criterion, based on information theory, to measure the amount of average information that the output sequences of RC4 provide about its internal state. The test statistic used is the sum of the maximum plausible estimates of the entropies $H(j_t \mid z_t)$, corresponding to the probability distributions $P(j_t \mid z_t)$ of the sequences of random variables $(j_t)_{t \in T}$ and $(z_t)_{t \in T}$, independent but not identically distributed, where $z_t$ are the known values of the outputs, while $j_t$ is one of the unknown elements of the internal state of RC4. It is experimentally demonstrated that the test statistic allows for determining the most vulnerable RC4 outputs, and it is proposed as a vulnerability metric for each RC4 output sequence with respect to the iterative probabilistic attack.
(This article belongs to the Special Issue Types of Entropies and Divergences with Their Applications)
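
The statistic itself is specific to RC4's internal state, but its basic ingredient, a plug-in estimate of a conditional entropy H(J | Z) from paired observations, can be sketched generically. The function and toy data below are hypothetical and do not reproduce the paper's estimator or attack setting:

import math
from collections import Counter

def conditional_entropy(pairs):
    """Plug-in estimate of H(J | Z) in bits from (z, j) observations."""
    z_counts = Counter(z for z, _ in pairs)
    zj_counts = Counter(pairs)
    n = len(pairs)
    h = 0.0
    for (z, j), c in zj_counts.items():
        p_zj = c / n                   # empirical P(z, j)
        p_j_given_z = c / z_counts[z]  # empirical P(j | z)
        h -= p_zj * math.log2(p_j_given_z)
    return h

# Toy observations: known output value z paired with a hidden state index j.
pairs = [(0, 1), (0, 1), (0, 2), (1, 3), (1, 3), (1, 3)]
print(conditional_entropy(pairs))  # smaller values => z reveals more about j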

22 pages, 637 KiB  
Article
Selecting an Effective Entropy Estimator for Short Sequences of Bits and Bytes with Maximum Entropy
by Lianet Contreras Rodríguez, Evaristo José Madarro-Capó, Carlos Miguel Legón-Pérez, Omar Rojas and Guillermo Sosa-Gómez
Entropy 2021, 23(5), 561; https://0-doi-org.brum.beds.ac.uk/10.3390/e23050561 - 30 Apr 2021
Cited by 8 | Viewed by 2462
Abstract
Entropy makes it possible to measure the uncertainty about an information source from the distribution of its output symbols. It is known that the maximum Shannon entropy of a discrete information source is reached when its symbols follow a uniform distribution. Such sources have great applications in cryptography, since they allow the highest security standards to be reached. In this work, the most effective estimator for entropy in short samples of bytes and bits with maximum entropy is selected. For this, 18 estimators were compared, and results concerning the comparisons between these estimators published in the literature are discussed. The most suitable estimator is determined experimentally, based on its bias and mean squared error over short samples of bytes and bits.
(This article belongs to the Special Issue Types of Entropies and Divergences with Their Applications)
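
Two estimators that typically appear in such comparisons are the maximum-likelihood (plug-in) estimator, which is biased downward on short samples, and the Miller–Madow correction, which adds (K_obs − 1)/(2N) nats for K_obs observed symbols in N observations. The sketch below illustrates the general setup on a short uniform byte sample (true entropy 8 bits); it does not reproduce the paper's 18-estimator benchmark:

import math
import random
from collections import Counter

def ml_entropy(sample):
    """Maximum-likelihood (plug-in) entropy estimate, in bits."""
    n = len(sample)
    return -sum((c / n) * math.log2(c / n) for c in Counter(sample).values())

def miller_madow(sample):
    """Miller-Madow estimate: plug-in plus (K_obs - 1)/(2N) nats, in bits."""
    n = len(sample)
    k_obs = len(set(sample))
    return ml_entropy(sample) + (k_obs - 1) / (2 * n * math.log(2))

random.seed(0)
sample = [random.randrange(256) for _ in range(200)]  # short uniform byte sample
# The plug-in estimate undershoots the true 8 bits; Miller-Madow moves it closer.
print(ml_entropy(sample), miller_madow(sample))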

15 pages, 296 KiB  
Article
Refined Young Inequality and Its Application to Divergences
by Shigeru Furuichi and Nicuşor Minculete
Entropy 2021, 23(5), 514; https://0-doi-org.brum.beds.ac.uk/10.3390/e23050514 - 23 Apr 2021
Cited by 9 | Viewed by 1755
Abstract
We give bounds on the difference between the weighted arithmetic mean and the weighted geometric mean. These imply refined Young inequalities and reverses of the Young inequality. We also study some properties of the difference between the weighted arithmetic mean and the weighted geometric mean. Applying the newly obtained inequalities, we show some results on the Tsallis divergence, the Rényi divergence, the Jeffreys–Tsallis divergence, and the Jensen–Shannon–Tsallis divergence.
(This article belongs to the Special Issue Types of Entropies and Divergences with Their Applications)
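
For orientation, the classical weighted Young inequality states that a^λ b^(1−λ) ≤ λa + (1−λ)b for a, b > 0 and λ ∈ [0, 1]. A well-known refinement of the kind studied here, due to Kittaneh and Manasrah (quoted for context; the paper obtains sharper bounds of its own), quantifies the gap:

\[
\lambda a + (1-\lambda) b - a^{\lambda} b^{1-\lambda}
\;\ge\; r \left( \sqrt{a} - \sqrt{b} \right)^{2},
\qquad r = \min\{\lambda,\, 1-\lambda\}.
\]

Applied to the components of probability distributions, bounds of this shape are the kind of tool that yields the divergence estimates mentioned in the abstract.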