Editorial

Concepts of Entropy and Their Applications

by Phil Broadbridge 1,* and Anthony J. Guttmann 2
1 Australian Mathematical Sciences Institute, c/o University of Melbourne, VIC 3010, Australia
2 ARC Centre of Excellence for Mathematics and Statistics of Complex Systems, c/o University of Melbourne, VIC 3010, Australia
* Author to whom correspondence should be addressed.
Submission received: 8 February 2009 / Accepted: 9 February 2009 / Published: 12 February 2009

Introduction to the Special Issue

Since a connection was made in the 19th century between the increase of entropy and earlier expressions of the Second Law of Thermodynamics, the topic has continued to fascinate engineers, physicists, chemists, computer scientists, mathematicians and philosophers. The topic of entropy is very much alive, as witnessed by the highly cited proceedings of a lively conference on the subject held in Dresden, Germany, in 2000 [1]. Our intention in running a theme program seven years after the Dresden conference was to stimulate connections between entropy theory and broader applications. The papers in this special issue arose from a meeting of the AMSI-MASCOS Theme Program, Concepts of Entropy and their Applications, which took place in Melbourne, Australia, from November 26 to December 12, 2007.
An introduction to the general physical concepts and issues is given in the paper by Ingo Müller, Entropy and Energy – a Universal Competition [2]. A general and clear mathematical framework is developed in the paper by Derek Robinson, Entropy and Uncertainty [3]. The pillars of thermodynamics are the equilibrium distributions. The quasi-static thermodynamic theory that most of us have been taught involves mainly adiabatic processes that notionally connect different equilibrium states. However, the irreversible processes that occur in reality continue to be a source of intense speculation, mathematical modelling, experimentation and debate. At our conference, debate on this topic was lively, intense and long-lasting. In a second paper by Müller, theory and evidence are presented for a version of irreversible modelling known as Extended Thermodynamics [4]. In the paper by Tommaso Ruggeri, this theory is developed further as it applies to systems of hyperbolic conservation laws in continuum mechanics [5]. Alternative formulations of irreversible thermodynamics, each claiming some merits, are given in the papers by Phil Attard [6] and Alexander Fradkov [7].
The standard formulation has naturally led to questions about open systems, and the extension to quantum mechanical systems. These topics were given a more concrete setting with the development of kinetic theory and statistical mechanics which have been of central concern to mathematical physicists since the late 19th and early 20th Centuries.
Since the very first demonstration by Lars Onsager in 1942 of a lattice phase transition in the two-dimensional Ising model, researchers have investigated the behaviour of the free-energy derivatives of many other lattice models; these investigations invariably lead to complicated combinatorial calculations and to asymptotic analysis near critical temperatures and in the thermodynamic limit. Ludwig Boltzmann made the connection between statistical mechanics and entropy with his famous formula S = k log W, which adorns his tombstone in Vienna. Interpreted in this way, the concept of entropy allows for broad application; in particular, it helps describe combinatorial problems in the powerful language of statistical mechanics. The paper by John Dethridge and Tony Guttmann [8] gives an example of a computer algorithm to test a hypothesis in lattice combinatorics. The required level of computing power has been available only since the late 20th century. Molecular dynamics computer simulations with limited numbers of particles have given us some insights, but only after the implementation of ingenious Monte Carlo algorithms. These simulations are most trusted when equilibrium state distributions are assumed. In non-equilibrium systems, we need to choose a non-equilibrium formulation, as in the paper by Gary Morriss et al. that focuses on thermal contact phenomena [9].
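The combinatorial reading of Boltzmann's formula can be made concrete with a small illustrative sketch (our own toy example, not drawn from any paper in this issue): for a system of N independent two-state spins with n spins "up", the number of microstates W is the binomial coefficient C(N, n), and for large N the entropy per spin approaches the Shannon-like limit obtained from Stirling's formula.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(N, n):
    """S = k_B ln W, where W = C(N, n) counts the microstates
    of N two-state spins with exactly n spins up."""
    W = math.comb(N, n)
    return k_B * math.log(W)

def stirling_limit(N, n):
    """Large-N approximation via Stirling's formula:
    S ~ -N k_B [p ln p + (1-p) ln(1-p)] with p = n/N."""
    p = n / N
    return -N * k_B * (p * math.log(p) + (1 - p) * math.log(1 - p))

# The exact combinatorial entropy and its asymptotic form agree
# to within about 1% already for a thousand spins.
N, n = 1000, 400
print(boltzmann_entropy(N, n))
print(stirling_limit(N, n))
```

The same counting viewpoint underlies the lattice-combinatorics problems discussed above, where the hard part is estimating the growth rate of W for constrained configurations such as self-avoiding walks.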
A major impetus for entropy theory occurred in the mid 20th century, when it was related to communications theory, spawning the subject of information theory. The paper by Uwe Grimm applies entropy to coding theory in the spirit of lattice combinatorics [10]. Modern questions of communications theory are addressed in the papers by Terence Chan and Alex Grant [11], by Julian Sorensen [12] and by John Kitchen et al. [13]. Extraction of information by signal processing of environmental atmosphere and ocean data, a strong theme of the conference, is considered by Ian Enting [14] and in the two papers by Jørgen Frederiksen and Terence O'Kane [15, 16]. Bob Dewar uses entropy concepts to elucidate the structures of magnetohydrodynamic flow fields [17]. The scope of entropy as a diagnostic tool in higher-order partial differential equations is illustrated by Phil Broadbridge's paper [18].
It seems, therefore, that acquaintance with entropy concepts is an important part of the education of modern scientists. The varied and evolving concepts of entropy are so far-reaching that this education must continue beyond formal training, through mid-career and into late career. It is hoped that the theme program was educational for participants (many more than the authors listed here) and stimulating for future researchers.

Acknowledgements

We would like to thank all participants of the workshop for their very significant contributions, and Ms. Parvin Ahadi of AMSI as well as the editorial office of Entropy for their patience and careful production of this special issue. We gratefully acknowledge the support of the Australian Research Council Centre of Excellence for Mathematics and Statistics of Complex Systems, the Australian Mathematical Sciences Institute, and the Australian Research Council Complex Open Systems Research Network.

References

  1. Greven, A.; Keller, G.; Warnecke, G. (Eds.) Entropy; Princeton University Press: Princeton, 2003. [Google Scholar]
  2. Müller, I. Entropy and Energy – a Universal Competition. Entropy 2008, 10, 462–476. [Google Scholar] [CrossRef]
  3. Robinson, D. W. Entropy and Uncertainty. Entropy 2008, 10, 493–506. [Google Scholar] [CrossRef]
  4. Müller, I. Extended Thermodynamics: a Theory of Symmetric Hyperbolic Field Equations. Entropy 2008, 10, 477–492. [Google Scholar] [CrossRef]
  5. Ruggeri, T. The Entropy Principle from Continuum Mechanics to Hyperbolic Systems of Balance Laws: The Modern Theory of Extended Thermodynamics. Entropy 2008, 10, 319–333. [Google Scholar] [CrossRef]
  6. Attard, P. The Second Entropy: A Variational Principle for Time-dependent Systems. Entropy 2008, 10, 380–390. [Google Scholar] [CrossRef]
  7. Fradkov, A. Speed-gradient Entropy Principle for Nonstationary Processes. Entropy 2008, 10, 757–764. [Google Scholar] [CrossRef]
  8. Dethridge, J. C.; Guttmann, A. J. Prudent Self-Avoiding Walks. Entropy 2008, 10, 309–318. [Google Scholar] [CrossRef]
  9. Morriss, G. P.; Chung, T.; Angstmann, C. Thermal Contact. Entropy 2008, 10, 786–798. [Google Scholar] [CrossRef]
  10. Grimm, U.; Heuer, M. On the Entropy and Letter Frequencies of Powerfree Words. Entropy 2008, 10, 590–612. [Google Scholar] [CrossRef]
  11. Chan, T.; Grant, A. Non-linear Information Inequalities. Entropy 2008, 10, 765–775. [Google Scholar] [CrossRef]
  12. Sorensen, J. An Assessment of Hermite Function Based Approximations of Mutual Information Applied to Independent Component Analysis. Entropy 2008, 10, 745–756. [Google Scholar] [CrossRef]
  13. Kitchen, J.; Moran, B.; Howard, S. D. Intercept Capacity: Unknown Unitary Transformation. Entropy 2008, 10, 722–735. [Google Scholar] [CrossRef]
  14. Enting, I. G. Assessing the Information Content in Environmental Modelling: A Carbon Cycle Perspective. Entropy 2008, 10, 556–575. [Google Scholar] [CrossRef]
  15. Frederiksen, J. S.; O’Kane, T. J. Entropy, Closures and Subgrid Modeling. Entropy 2008, 10, 635–683. [Google Scholar] [CrossRef]
  16. O’Kane, T. J.; Frederiksen, J. S. Comparison of Statistical Dynamical, Square Root and Ensemble Kalman Filters. Entropy 2008, 10, 684–721. [Google Scholar] [CrossRef]
  17. Dewar, R. L.; Hole, M. J.; McGann, M.; Mills, R.; Hudson, S. R. Relaxed Plasma Equilibria and Entropy-Related Plasma Self-Organization Principles. Entropy 2008, 10, 621–634. [Google Scholar] [CrossRef]
  18. Broadbridge, P. Entropy Diagnostics for Fourth Order Partial Differential Equations in Conservation Form. Entropy 2008, 10, 365–379. [Google Scholar] [CrossRef]

Share and Cite

MDPI and ACS Style

Broadbridge, P.; Guttmann, A.J. Concepts of Entropy and Their Applications. Entropy 2009, 11, 59-61. https://0-doi-org.brum.beds.ac.uk/10.3390/e11010059
