Special Issue "Information and Entropy"

A special issue of Entropy (ISSN 1099-4300).

Deadline for manuscript submissions: closed (31 October 2009).

Special Issue Editor

Dr. Peter Harremoës *
Guest Editor
Copenhagen Business College, Rønne Alle 1, st., DK-2860 Søborg, Denmark
Interests: cause and effect; entropy; exponential families; graphical models; information divergence; minimum description length; quantum information; statistical mechanics
* Dr. Harremoës also serves as the Editor-in-Chief of Entropy

Keywords

  • entropy
  • information
  • information theory

Published Papers (15 papers)


Research

Jump to: Review, Other

Article
Recovering Matrices of Economic Flows from Incomplete Data and a Composite Prior
Entropy 2010, 12(3), 516-527; https://0-doi-org.brum.beds.ac.uk/10.3390/e12030516 - 12 Mar 2010
Cited by 4 | Viewed by 6208
Abstract
In several socioeconomic applications, matrices containing information on flows (trade, income or migration flows, for example) are usually not constructed from direct observation but are estimated, since compiling the required information is often extremely expensive and time-consuming. The estimation process takes as its point of departure another matrix, which is adjusted until it optimizes some divergence criterion while remaining consistent with partial information (the row and column margins) of the target matrix. Among all the possible criteria, one of the most popular is the Kullback-Leibler divergence [1], leading to the well-known Cross-Entropy technique. This paper proposes a composite Cross-Entropy approach that allows a mixture of two types of a priori information (two possible matrices) to be included as the point of departure in the estimation process. By means of a Monte Carlo simulation experiment, we show that under some circumstances this approach outperforms other competing estimators. In addition, a real-world case with a matrix of interregional trade is included to show the applicability of the suggested technique. Full article
(This article belongs to the Special Issue Information and Entropy)
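The estimation idea the abstract describes can be sketched with iterative proportional fitting (the RAS algorithm), which minimizes the Kullback-Leibler divergence from the prior matrix subject to the margin constraints. The function below is an illustrative reconstruction under that standard formulation, not code from the paper:

```python
import numpy as np

def cross_entropy_balance(prior, row_totals, col_totals, iters=500):
    """Adjust a prior matrix to match target row and column margins by
    iterative proportional fitting (the RAS algorithm), which minimizes
    the Kullback-Leibler divergence from the prior subject to the
    margin constraints."""
    x = np.asarray(prior, dtype=float).copy()
    for _ in range(iters):
        # scale rows to the target row totals, then columns to the
        # target column totals, alternating until convergence
        x *= (np.asarray(row_totals, dtype=float) / x.sum(axis=1))[:, None]
        x *= np.asarray(col_totals, dtype=float) / x.sum(axis=0)
    return x
```

For instance, balancing a uniform 2×2 prior toward row margins (1, 3) and column margins (2, 2) yields a matrix whose margins match those targets.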

Article
The Quantum-Classical Transition as an Information Flow
Entropy 2010, 12(1), 148-160; https://0-doi-org.brum.beds.ac.uk/10.3390/e12010148 - 26 Jan 2010
Cited by 5 | Viewed by 7302
Abstract
We investigate the classical limit of the semiclassical evolution with reference to a well-known model that represents the interaction between matter and a given field. This is done by recourse to a special statistical quantifier called the “symbolic transfer entropy”. We find that the quantum-classical transition is thereby described as a sign reversal of the dominant direction of the information flow between classical and quantal variables. Full article
(This article belongs to the Special Issue Information and Entropy)
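The "symbolic transfer entropy" the abstract relies on combines ordinal-pattern symbolization with a plug-in transfer-entropy estimate. A minimal sketch, assuming the standard construction of Staniek and Lehnertz rather than the paper's own implementation:

```python
import math
import random
from collections import Counter

def symbolize(series, m=3):
    # replace each length-m window by its ordinal pattern (argsort tuple)
    return [tuple(sorted(range(m), key=lambda k: series[i + k]))
            for i in range(len(series) - m + 1)]

def transfer_entropy(src, dst):
    # plug-in estimate (in bits) of the transfer entropy src -> dst
    # over two aligned symbol sequences
    triples, pairs, steps, singles = Counter(), Counter(), Counter(), Counter()
    for t in range(len(dst) - 1):
        triples[(dst[t + 1], dst[t], src[t])] += 1
        pairs[(dst[t], src[t])] += 1
        steps[(dst[t + 1], dst[t])] += 1
        singles[dst[t]] += 1
    n = sum(triples.values())
    te = 0.0
    for (d1, d0, s0), c in triples.items():
        # p(d1|d0,s0) / p(d1|d0) expressed through raw counts
        te += (c / n) * math.log2(c * singles[d0] / (pairs[(d0, s0)] * steps[(d1, d0)]))
    return te
```

On a pair of series where one drives the other with a delay, the estimate in the driving direction dominates the reverse one, which is the directionality property the paper exploits.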

Article
A Dynamic Model of Information and Entropy
Entropy 2010, 12(1), 80-88; https://0-doi-org.brum.beds.ac.uk/10.3390/e12010080 - 07 Jan 2010
Cited by 5 | Viewed by 5473
Abstract
We discuss the possibility of a relativistic relationship between information and entropy, closely analogous to the classical Maxwell electro-magnetic wave equations. Inherent to the analysis is the description of information as residing in points of non-analyticity; yet ultimately also exhibiting a distributed characteristic: additionally analogous, therefore, to the wave-particle duality of light. At cosmological scales our vector differential equations predict conservation of information in black holes, whereas regular- and Z-DNA molecules correspond to helical solutions at microscopic levels. We further propose that regular- and Z-DNA are equivalent to the alternative words chosen from an alphabet to maintain the equilibrium of an information transmission system. Full article
(This article belongs to the Special Issue Information and Entropy)

Article
Redundancy in Systems Which Entertain a Model of Themselves: Interaction Information and the Self-Organization of Anticipation
Entropy 2010, 12(1), 63-79; https://0-doi-org.brum.beds.ac.uk/10.3390/e12010063 - 06 Jan 2010
Cited by 24 | Viewed by 9263
Abstract
Mutual information among three or more dimensions (μ* = –Q) has been considered as interaction information. However, Krippendorff [1,2] has shown that this measure cannot be interpreted as a unique property of the interactions and has proposed an alternative measure of interaction information based on iterative approximation of maximum entropies. Q can then be considered as a measure of the difference between interaction information and redundancy generated in a model entertained by an observer. I argue that this provides us with a measure of the imprint of a second-order observing system—a model entertained by the system itself—on the underlying information processing. The second-order system communicates meaning hyper-incursively; an observation instantiates this meaning-processing within the information processing. The net results may add to or reduce the prevailing uncertainty. The model is tested empirically for the case where textual organization can be expected to contain intellectual organization in terms of distributions of title words, author names, and cited references. Full article
(This article belongs to the Special Issue Information and Entropy)
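Under one common sign convention, the tri-variate mutual information μ* = −Q discussed above is the co-information, computable directly from marginal and joint Shannon entropies. An illustrative sketch (ours, not the paper's code):

```python
import math
from collections import Counter

def entropy(samples):
    # Shannon entropy (bits) of the empirical distribution of `samples`
    counts = Counter(samples)
    n = sum(counts.values())
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def co_information(triples):
    # co-information of three jointly sampled variables:
    # sum of marginal entropies - sum of pairwise joint entropies
    # + the full joint entropy
    xs = [t[0] for t in triples]
    ys = [t[1] for t in triples]
    zs = [t[2] for t in triples]
    return (entropy(xs) + entropy(ys) + entropy(zs)
            - entropy(list(zip(xs, ys))) - entropy(list(zip(xs, zs)))
            - entropy(list(zip(ys, zs))) + entropy(triples))
```

The XOR triple (Z = X xor Y with X, Y fair independent bits) gives −1 bit, the classic example where the sign reflects synergy rather than redundancy, which is exactly the interpretive ambiguity the abstract discusses.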

Article
On the Spectral Entropy of Thermodynamic Paths for Elementary Systems
Entropy 2009, 11(4), 1025-1041; https://0-doi-org.brum.beds.ac.uk/10.3390/e11041025 - 07 Dec 2009
Cited by 1 | Viewed by 6256
Abstract
Systems do not elect thermodynamic pathways on their own. They operate in tandem with their surroundings. Pathway selection and traversal require coordinated work and heat exchanges along with parallel tuning of the system variables. Previous research by the author (Reference [1]) focused on the information expressed in thermodynamic pathways. Examined here is how the spectral entropy arises as a by-product of that information and depends intricately on the pathway structure. The spectral entropy has proven to be a valuable tool in diverse fields. This paper illustrates the connection between spectral entropy and the properties which distinguish ideal from non-ideal gases. The role of spectral entropy in the first and second laws of thermodynamics and in heat → work conversions is also discussed. Full article
(This article belongs to the Special Issue Information and Entropy)
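A widely used generic definition of spectral entropy, the Shannon entropy of a normalized power spectrum, can be sketched as follows; the paper's pathway-based construction differs in detail, so this is only an assumed, generic illustration:

```python
import cmath
import math

def spectral_entropy(signal):
    """Shannon entropy (bits) of the normalized power spectrum of a
    real-valued signal (generic definition, not the paper's
    pathway-specific construction)."""
    n = len(signal)
    power = []
    for k in range(n):
        # direct discrete Fourier transform coefficient at frequency k
        coeff = sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))
        power.append(abs(coeff) ** 2)
    total = sum(power)
    # normalize to a probability distribution, dropping numerical noise
    probs = [p / total for p in power if p / total > 1e-12]
    return -sum(p * math.log2(p) for p in probs)
```

A pure cosine concentrates its power in two spectral bins (entropy of 1 bit), while a constant signal concentrates it in one (entropy 0); broadband signals score higher.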

Communication
Dispersal (Entropy) and Recognition (Information) as Foundations of Emergence and Dissolvence
Entropy 2009, 11(4), 993-1000; https://0-doi-org.brum.beds.ac.uk/10.3390/e11040993 - 03 Dec 2009
Cited by 8 | Viewed by 4699
Abstract
The objective of this essay is to reflect on a possible relation between entropy and emergence. A qualitative, relational approach is followed. We begin by highlighting that entropy includes the concept of dispersal, relevant to our enquiry. Emergence in complex systems arises from the coordinated behavior of their parts. Coordination in turn necessitates recognition between parts, i.e., information exchange. What will be argued here is that the scope of recognition processes between parts is increased when preceded by their dispersal, which multiplies the number of encounters and creates a richer potential for recognition. A process intrinsic to emergence is dissolvence (aka submergence or top-down constraints), which participates in the information-entropy interplay underlying the creation, evolution and breakdown of higher-level entities. Full article
(This article belongs to the Special Issue Information and Entropy)

Article
Equiprobability, Entropy, Gamma Distributions and Other Geometrical Questions in Multi-Agent Systems
Entropy 2009, 11(4), 959-971; https://0-doi-org.brum.beds.ac.uk/10.3390/e11040959 - 02 Dec 2009
Cited by 6 | Viewed by 6674
Abstract
A set of many identical interacting agents obeying a global additive constraint is considered. Under the hypothesis of equiprobability in the high-dimensional volume delimited in phase space by the constraint, the statistical behavior of a generic agent over the ensemble is worked out. The asymptotic distribution of that statistical behavior is derived from geometrical arguments. This distribution is related to the Gamma distributions found in several multi-agent economy models. The parallelism with all these systems is established. In addition, as a collateral result, a formula for the volume of high-dimensional symmetrical bodies is proposed. Full article
(This article belongs to the Special Issue Information and Entropy)
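For a linear additive constraint, the equiprobability hypothesis amounts to sampling configurations uniformly from a simplex, whose one-agent marginal tends to an exponential law (a Gamma distribution with unit shape) as the number of agents grows. A hypothetical simulation sketch of that special case, using the exponential-spacings construction:

```python
import random

def equiprobable_agents(n, total, seed=0):
    # draw one configuration uniformly at random from the simplex
    # {x_i >= 0, sum_i x_i = total} via normalized exponential spacings
    rng = random.Random(seed)
    e = [rng.expovariate(1.0) for _ in range(n)]
    s = sum(e)
    return [total * v / s for v in e]
```

With many agents, the empirical fraction holding less than the mean wealth approaches 1 − e⁻¹ ≈ 0.632, the exponential-law signature; the paper's geometrical derivation covers more general constraints than this linear case.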

Article
A Lower-Bound for the Maximin Redundancy in Pattern Coding
Entropy 2009, 11(4), 634-642; https://0-doi-org.brum.beds.ac.uk/10.3390/e11040634 - 22 Oct 2009
Cited by 6 | Viewed by 5295
Abstract
We show that the maximin average redundancy in pattern coding is eventually larger than 1.84 (n/log n)^(1/3) for messages of length n. This improves recent results on pattern redundancy, although it does not close the gap between the known lower and upper bounds. The pattern of a string is obtained by replacing each symbol by the index of its first occurrence. The problem of pattern coding is of interest because strongly universal codes have been proved to exist for patterns, while universal message coding is impossible for memoryless sources on an infinite alphabet. The proof uses fine combinatorial results on partitions with small summands. Full article
(This article belongs to the Special Issue Information and Entropy)
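The pattern construction defined in the abstract (replace each symbol by the index of its first occurrence) is easy to state in code; this sketch is ours, not the paper's:

```python
def pattern(message):
    # replace each symbol by the index of its first occurrence
    first_seen = {}
    out = []
    for symbol in message:
        if symbol not in first_seen:
            first_seen[symbol] = len(first_seen) + 1
        out.append(first_seen[symbol])
    return out
```

For example, pattern("abracadabra") gives [1, 2, 3, 1, 4, 1, 5, 1, 2, 3, 1]; any string over any alphabet with the same repetition structure maps to the same pattern, which is why patterns admit universal codes even when messages do not.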

Article
Landauer’s Principle and Divergenceless Dynamical Systems
Entropy 2009, 11(4), 586-597; https://0-doi-org.brum.beds.ac.uk/10.3390/e11040586 - 13 Oct 2009
Cited by 4 | Viewed by 7582
Abstract
Landauer’s principle is one of the pillars of the physics of information. It constitutes one of the foundations behind the idea that “information is physical”. Landauer’s principle establishes the smallest amount of energy that has to be dissipated when one bit of information is erased from a computing device. Here we explore an extended Landauer-like principle valid for general dynamical systems (not necessarily Hamiltonian) governed by divergenceless phase space flows. Full article
(This article belongs to the Special Issue Information and Entropy)
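The baseline bound the abstract refers to is k_B T ln 2 of dissipated energy per erased bit. A one-line illustration (our own, using the exact 2019 SI value of Boltzmann's constant):

```python
import math

def landauer_bound(temperature_kelvin):
    # minimum heat dissipated when erasing one bit: k_B * T * ln 2
    k_B = 1.380649e-23  # Boltzmann constant, J/K (exact in the 2019 SI)
    return k_B * temperature_kelvin * math.log(2)
```

At room temperature (300 K) this gives roughly 2.87 × 10⁻²¹ J per bit, many orders of magnitude below the dissipation of present-day logic gates.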
Article
Scale-Based Gaussian Coverings: Combining Intra and Inter Mixture Models in Image Segmentation
Entropy 2009, 11(3), 513-528; https://0-doi-org.brum.beds.ac.uk/10.3390/e11030513 - 24 Sep 2009
Cited by 3 | Viewed by 6100
Abstract
By a “covering” we mean a Gaussian mixture model fit to observed data. Approximations of the Bayes factor can be used to judge model fit to the data within a given Gaussian mixture model. Between families of Gaussian mixture models, we propose the Rényi quadratic entropy as an excellent and tractable model-comparison framework. We exemplify this using the segmentation of an MRI image volume, based on (1) a direct Gaussian mixture model applied to the marginal distribution function, and (2) a Gaussian model fit through k-means applied to the 4D multivalued image volume furnished by the wavelet transform. Visual preference for one model over another is not immediate. The Rényi quadratic entropy allows us to show clearly that one of these modelings is superior to the other. Full article
(This article belongs to the Special Issue Information and Entropy)
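Part of what makes the Rényi quadratic entropy tractable for Gaussian mixtures is that H₂ = −log ∫ p(x)² dx has a closed form: the integral reduces to a double sum of Gaussian evaluations. A one-dimensional sketch (illustrative, not the paper's implementation):

```python
import math

def gaussian(x, mean, var):
    # normal density N(x; mean, var)
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def renyi2_gmm(weights, means, variances):
    """Closed-form Renyi quadratic entropy H2 = -log(integral of p^2)
    of a 1-D Gaussian mixture: the product of two Gaussians integrates
    to a Gaussian evaluated at the difference of the means."""
    total = 0.0
    for wi, mi, vi in zip(weights, means, variances):
        for wj, mj, vj in zip(weights, means, variances):
            total += wi * wj * gaussian(mi, mj, vi + vj)
    return -math.log(total)
```

For a single unit-variance Gaussian this reproduces the known value H₂ = ½ ln(4π), and no numerical integration is needed for any mixture, which is the tractability the abstract highlights.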

Article
Information, Deformed κ-Wehrl Entropies and Semiclassical Delocalization
Entropy 2009, 11(1), 32-41; https://0-doi-org.brum.beds.ac.uk/10.3390/e11010032 - 27 Jan 2009
Cited by 4 | Viewed by 6441
Abstract
Semiclassical delocalization in phase space constitutes a manifestation of the Uncertainty Principle, an indispensable part of the present understanding of Nature, and the Wehrl entropy is widely regarded as the foremost localization indicator. We readdress the matter here within the framework of the celebrated semiclassical Husimi distributions and their associated Wehrl entropies, suitably κ-deformed. We are able to show that it is possible to significantly improve on the extant phase-space classical-localization power. Full article
(This article belongs to the Special Issue Information and Entropy)
Article
Generalized Measure of Departure from No Three-Factor Interaction Model for 2 × 2 × K Contingency Tables
Entropy 2008, 10(4), 776-785; https://0-doi-org.brum.beds.ac.uk/10.3390/e10040776 - 22 Dec 2008
Viewed by 4900
Abstract
For 2 × 2 × K contingency tables, Tomizawa considered a Shannon entropy type measure to represent the degree of departure from a log-linear model of no three-factor interaction (the NOTFI model). This paper proposes a generalization of Tomizawa's measure for 2 × 2 × K tables. The proposed measure is expressed using the Patil-Taillie diversity index or the Cressie-Read power divergence, and includes Tomizawa's measure as a special case. It would be useful for comparing the degrees of departure from the NOTFI model in several tables. Full article
(This article belongs to the Special Issue Information and Entropy)
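The Cressie-Read power-divergence family mentioned in the abstract can be sketched directly; λ = 1 recovers the Pearson chi-square statistic and λ → 0 the likelihood-ratio (G) statistic. This is illustrative code for the generic family, not the paper's generalized measure:

```python
import math

def power_divergence(observed, expected, lam):
    """Cressie-Read power divergence between observed and expected
    counts: 2 / (lam * (lam + 1)) * sum o * ((o / e)^lam - 1).
    lam = 1 gives Pearson chi-square; lam -> 0 the G statistic."""
    if abs(lam) < 1e-12:
        # limiting case lam -> 0: likelihood-ratio statistic
        return 2.0 * sum(o * math.log(o / e)
                         for o, e in zip(observed, expected))
    return (2.0 / (lam * (lam + 1))) * sum(o * ((o / e) ** lam - 1)
                                           for o, e in zip(observed, expected))
```

For observed counts (12, 8) against expected (10, 10), λ = 1 yields the Pearson value 0.8, and nearby λ give similar values, illustrating why the family supports graded comparisons of departure.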

Review

Jump to: Research, Other

Review
Quantum Entropy and Its Applications to Quantum Communication and Statistical Physics
Entropy 2010, 12(5), 1194-1245; https://0-doi-org.brum.beds.ac.uk/10.3390/e12051194 - 07 May 2010
Cited by 19 | Viewed by 6067
Abstract
Quantum entropy is a fundamental concept of quantum information that has recently been developed in various directions. We review the mathematical aspects of quantum entropy (entropies) and discuss some applications to quantum communication and statistical physics. All topics treated here are somehow related to the quantum entropy that the present authors have studied. Many other fields recently developed in quantum information theory, such as quantum algorithms, quantum teleportation and quantum cryptography, are discussed at length in the book (reference number 60). Full article
(This article belongs to the Special Issue Information and Entropy)
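The central quantity under review, the von Neumann entropy S(ρ) = −Tr ρ log ρ, can be computed from the eigenvalues of the density matrix. A small sketch (our own illustration, in bits):

```python
import numpy as np

def von_neumann_entropy(rho):
    # S(rho) = -Tr(rho log2 rho), computed via the eigenvalues
    # of the (Hermitian) density matrix
    evals = np.linalg.eigvalsh(np.asarray(rho))
    evals = evals[evals > 1e-12]  # drop numerically zero eigenvalues
    return float(-np.sum(evals * np.log2(evals)))
```

A maximally mixed qubit gives 1 bit, while any pure state gives 0, the two extremes between which all single-qubit states fall.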
Review
Processing Information in Quantum Decision Theory
Entropy 2009, 11(4), 1073-1120; https://0-doi-org.brum.beds.ac.uk/10.3390/e11041073 - 14 Dec 2009
Cited by 48 | Viewed by 8667
Abstract
A survey is given summarizing the state of the art of describing information processing in Quantum Decision Theory, which has recently been advanced as a novel variant of decision making based on the mathematical theory of separable Hilbert spaces. This mathematical structure captures the effect of superposition of composite prospects, including many incorporated intended actions. The theory characterizes entangled decision making, non-commutativity of subsequent decisions, and intention interference. The self-consistent procedure of decision making, in the framework of quantum decision theory, takes into account both the available objective information and subjective contextual effects. This quantum approach avoids the paradoxes typical of classical decision theory. Conditional maximization of entropy, equivalent to the minimization of an information functional, makes it possible to connect the quantum and classical decision theories, showing that the latter is the limit of the former under vanishing interference terms. Full article
(This article belongs to the Special Issue Information and Entropy)

Other

Jump to: Research, Review

Comment
Comment on “Equiprobability, Entropy, Gamma Distributions and Other Geometrical Questions in Multi-Agent Systems”, Entropy 2009, 11, 959-971
Entropy 2009, 11(4), 1121-1122; https://0-doi-org.brum.beds.ac.uk/10.3390/e11041121 - 22 Dec 2009
Cited by 1 | Viewed by 5150
Abstract
The volume of the body enclosed by the n-dimensional Lamé curve defined by Σ_{i=1}^{n} x_i^b = E is computed. Full article
(This article belongs to the Special Issue Information and Entropy)
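For the full body Σ_i |x_i|^b ≤ E, the volume in question has a classical closed form via Dirichlet's integral. The sketch below illustrates that standard formula, not the comment's own derivation:

```python
import math

def lame_volume(n, b, E):
    """Volume enclosed by the n-dimensional Lame surface
    sum_i |x_i|^b = E, via the classical Dirichlet integral:
    (2 * Gamma(1/b + 1))^n * E^(n/b) / Gamma(n/b + 1)."""
    return ((2 * math.gamma(1 / b + 1)) ** n
            * E ** (n / b) / math.gamma(n / b + 1))
```

Sanity checks against familiar special cases: n = 2, b = 2, E = 1 gives the unit-disk area π, and n = 3, b = 2, E = 1 gives the unit-ball volume 4π/3.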