Article

A Dynamic Model of Information and Entropy

by Michael C. Parker * and Stuart D. Walker
School of Computer Science and Electronic Engineering, University of Essex, Wivenhoe Park, Colchester CO4 3SQ, UK
* Author to whom correspondence should be addressed.
Submission received: 29 October 2009 / Accepted: 14 December 2009 / Published: 7 January 2010
(This article belongs to the Special Issue Information and Entropy)

Abstract: We discuss the possibility of a relativistic relationship between information and entropy, closely analogous to the classical Maxwell electromagnetic wave equations. Inherent to the analysis is the description of information as residing in points of non-analyticity; yet ultimately also exhibiting a distributed characteristic: additionally analogous, therefore, to the wave-particle duality of light. At cosmological scales our vector differential equations predict conservation of information in black holes, whereas regular- and Z-DNA molecules correspond to helical solutions at microscopic levels. We further propose that regular- and Z-DNA are equivalent to the alternative words chosen from an alphabet to maintain the equilibrium of an information transmission system.
PACS Codes:
89.70.+c; 65.40.Gr; 42.25.Bs; 03.30.+p

1. Introduction

The close relationship between information and entropy is well recognised [1,2]: e.g., Brillouin considered a negative change in entropy to be equivalent to a change in information (negentropy [3]), and Landauer considered the erasure of information I to be associated with an increase in entropy S, via the relationship $\Delta I = -\Delta S$ [1,4]. In previous work, we qualitatively indicated the dynamic relationship between information and entropy, stating that the movement of information is accompanied by an increase of entropy in time [5,6], and suggesting that information propagates at the speed of light in vacuum c [5,6,7,8,9]. Opinions on the physical nature of information have tended to be contradictory. One view is that information is inherent to points of non-analyticity (discontinuities) [5,7,9] that do not allow “prediction”, whereas other researchers consider information to be more distributed (delocalised) in nature [10,11,12]. Such considerations are akin to the paradoxes arising from wave-particle duality, and raise the question of which of the two complementary descriptions best characterises information. In the following analysis, by initially adopting the localised, point-of-non-analyticity definition of information, we ultimately find that information and entropy may indeed also exhibit wavelike properties, and may be described by a pair of coupled wave equations, analogous to the Maxwell equations for electromagnetic (EM) radiation, and a distant echo of the simple difference relationship above.
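For concreteness, the erasure statement can be quantified by the standard Landauer bound (restated here for context; it is not derived in this paper): erasing one bit of information increases the entropy of the environment by at least

$$\Delta S \geq k \ln 2$$

equivalently dissipating at least $kT \ln 2$ of heat at temperature T.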

2. Analysis

We start our analysis by consideration of a meromorphic [13] function $\psi(z)$ in a reduced 1+1 (i.e., one space dimension and one time dimension) complex space-time $z = x + ict$, with a simple, isolated pole at $z_0 = x_0 + ict_0$, as indicated in Figure 1:

$$\psi(z) = \sqrt{\frac{ct_0}{\pi}}\,\frac{1}{z - z_0} \quad (1)$$
Figure 1. Information I (integration over imaginary ct-axis) and entropy S (integration over real x-axis) due to a point of non-analyticity (pole) at the space-time position $z_0$.
The point of non-analyticity $z_0$ travels at the speed of light in vacuum c [5,6,7,8,9]. We note that $\psi(z)$ acts as an inverse-square law function, e.g., as proportional to the field around an isolated charge, or the gravitational field around a point mass. It is also square-integrable, such that $\int_{-\infty}^{\infty} \rho\big|_{t=0}\,dx = 1$ and $ic\int_{-\infty}^{\infty} \rho\big|_{x=0}\,dt = 1$, where $\rho = |\psi(z)|^2$, and it obeys the Paley-Wiener criterion for causality [14]. We now calculate the differential information [5,15] (entropy) of the function $\psi(z)$. By convention, in this paper we define the logarithmic integration along the spatial x-axis to yield the entropy S of the function, whereas the same integration along the imaginary t-axis yields the information I of the function. This can be qualitatively understood to arise from the entropy of a system being related to its spatial permutations (e.g., the spatial fluctuations of a volume of gas), whereas the information of a system is related to its temporal distribution (e.g., the arrival of a data train of light pulses down an optical fibre). Considering the differential entropy first, substituting Equation (1) with $t = 0$ and calculating the sum of the residues of the appropriate closed contour integral in the z-plane, S is given by:
$$S = k \int_{-\infty}^{\infty} \rho \log_2 \rho \; dx = k \log_2 ct_0 \quad (2)$$
where k is the Boltzmann constant. We have defined S using a positive sign in front of the integral, as opposed to the conventional negative sign [15], so that both the entropy and information of a space-time function are calculated in a unified fashion. In addition, this has the effect of aligning Equation (2) with the 2nd Law of Thermodynamics, since S increases monotonically with increasing time $t_0$. Performing the appropriate closed contour integral in the z-plane with $x = 0$, we find that the differential information is given by:
$$I = k \int_{-\infty}^{\infty} \rho \log_2 \rho \; c\,dt = k \log_2 x_0 \quad (3)$$
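As a numerical cross-check of the logarithmic dependence in Equation (2) (our sketch, not part of the original derivation: the real-axis evaluation, the test widths, and the conventional negative sign are our choices): at $t = 0$ the density $\rho$ from Equation (1) is a normalised Lorentzian of half-width $ct_0$, and its differential-entropy integral shifts by exactly $\log_2$ of the width ratio as $ct_0$ varies, reproducing the $k \log_2 ct_0$ term up to the constant background discussed next.

```python
# Numerical cross-check of Eq. (2)'s logarithmic dependence.  At t = 0 the
# density rho from Eq. (1) is a Lorentzian of half-width a = c*t0.  We
# verify the normalisation and show that the differential-entropy integral
# shifts by log2(a2/a1) when the width changes, i.e., the k*log2(c*t0) term.
# Sketch only: signs follow the conventional (negative) entropy definition.
import numpy as np
from scipy.integrate import quad

def rho(x, a, x0=0.0):
    """|psi|^2 at t = 0: a normalised Lorentzian of half-width a = c*t0."""
    return (a / np.pi) / ((x - x0) ** 2 + a ** 2)

def entropy_bits(a):
    val, _ = quad(lambda x: -rho(x, a) * np.log2(rho(x, a)), -np.inf, np.inf)
    return val

print(quad(lambda x: rho(x, 1.0), -np.inf, np.inf)[0])   # 1.0: normalisation
for a in (0.5, 2.0, 4.0):
    # Each difference equals log2(a): the log2(c*t0) dependence of Eq. (2).
    print(a, entropy_bits(a) - entropy_bits(1.0), np.log2(a))
```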
Note that in (2) and (3) some constant background values of integration have been ignored, since they disappear in the following differential analyses. We see that the entropy and information are given by surprisingly simple expressions. The information is the logarithm of the space-time distance $x_0$ of the point of non-analyticity from the temporal t-axis, whilst the entropy is simply the logarithm of the distance (i.e., time $ct_0$) of the pole from the spatial x-axis. The overall info-entropy $\Sigma$ of the function $\psi(z)$ is given by the summation of the two orthogonal logarithmic integrals, $\Sigma = I + iS$, where the entropy S is in quadrature to the information I, since I depends on the real-axis quantity $x_0$, whereas S depends on the imaginary-axis quantity $ct_0$. Since the choice of reference axes is arbitrary, we could have chosen an alternative set of co-ordinate axes to calculate the information and entropy. For example, consider a set of co-ordinate axes $x'$–$ct'$, rotated by an angle θ about the origin with respect to the original $x$–$ct$ axes. In this case, the unitary transformation describing the position of the pole $z_0$ in the new co-ordinate framework is:
$$\begin{pmatrix} ct_0' \\ x_0' \end{pmatrix} = \begin{pmatrix} \cos\theta & \sin\theta \\ -\sin\theta & \cos\theta \end{pmatrix} \begin{pmatrix} ct_0 \\ x_0 \end{pmatrix} \quad (4)$$
Using (2) and (3), the resulting values of the information and entropy for the new frame of reference are given by:
$$I' = k \log_2 x_0' \quad (5a)$$
$$S' = k \log_2 ct_0' \quad (5b)$$
with the overall info-entropy $\Sigma'$ of the function again given by the summation of the two quadrature logarithmic integrals, $\Sigma' = I' + iS'$. We next perform a dynamic calculus on equations (5), with respect to the original $x$–$ct$ axes, using (4) to calculate:
$$\frac{\partial I'}{\partial x} = \frac{k}{x_0'}\frac{\partial x_0'}{\partial x} = \frac{k}{x_0'}\frac{\partial x_0'}{\partial x_0} = \frac{k}{x_0'}\cos\theta \quad (6a)$$
$$\frac{1}{c}\frac{\partial I'}{\partial t} = \frac{k}{x_0'}\frac{1}{c}\frac{\partial x_0'}{\partial t} = \frac{k}{x_0'}\frac{1}{c}\frac{\partial x_0'}{\partial t_0} = -\frac{k}{x_0'}\sin\theta \quad (6b)$$
and:
$$\frac{1}{c}\frac{\partial S'}{\partial t} = \frac{k}{ct_0'}\frac{1}{c}\frac{\partial (ct_0')}{\partial t} = \frac{k}{ct_0'}\frac{\partial (ct_0')}{\partial (ct_0)} = \frac{k}{ct_0'}\cos\theta \quad (7a)$$
$$\frac{\partial S'}{\partial x} = \frac{k}{ct_0'}\frac{\partial (ct_0')}{\partial x} = \frac{k}{ct_0'}\frac{\partial (ct_0')}{\partial x_0} = \frac{k}{ct_0'}\sin\theta \quad (7b)$$
where we have ignored the common factor $\log_2 e$, and we have assumed the equality of the calculus operators $\partial/\partial x = \partial/\partial x_0$ and $\partial/\partial t = \partial/\partial t_0$, since the trajectory of the pole in the z-plane is a straight line with $dx = dx_0$ and $dt = dt_0$. Since the point of non-analyticity is moving at the speed of light in vacuum c [5,7,9], such that $x_0 = ct_0$, we must also have $x_0' = ct_0'$, as c is the same constant in any frame of reference [16]. We can now see that equations (6) and (7) can be equated to yield a pair of Cauchy-Riemann equations [17]:
$$\frac{\partial I}{\partial x} = \frac{1}{c}\frac{\partial S}{\partial t} \quad (8a)$$
$$\frac{\partial S}{\partial x} = -\frac{1}{c}\frac{\partial I}{\partial t} \quad (8b)$$
where we have dropped the primes, since equations (8) are true for all frames of reference. Hence, the info-entropy function Σ is analytic (i.e., holomorphic in nature [13]), and the Cauchy-Riemann equations (8) indicate that information and entropy (as we have defined them) may propagate as waves travelling at the speed c. In the Appendix we extend the analysis from 1+1 complex space-time to the full 3+1 (three-space and one-time dimensions) case, so that it can be straightforwardly shown that Equations (8) generalise to:
$$\nabla \times \underline{I} = \frac{1}{c}\frac{d\underline{S}}{dt} \quad (9a)$$
$$\nabla \times \underline{S} = -\frac{1}{c}\frac{d\underline{I}}{dt} \quad (9b)$$
where $\underline{I}$ and $\underline{S}$ are in 3D vector form. Further basic calculus manipulations reveal that the divergences of our information and entropy fields are both zero:
$$\nabla \cdot \underline{I} = 0 \quad (10a)$$
$$\nabla \cdot \underline{S} = 0 \quad (10b)$$
Together, Equations (9) and (10) form a set of equations analogous to the Maxwell equations [16] (with no sources or isolated charges, and in an isotropic medium). Equations (9) can be combined to yield wave equations for the information and entropy fields:
$$\nabla^2 \underline{I} - \frac{1}{c^2}\frac{d^2 \underline{I}}{dt^2} = 0 \quad (11a)$$
$$\nabla^2 \underline{S} - \frac{1}{c^2}\frac{d^2 \underline{S}}{dt^2} = 0 \quad (11b)$$
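Equations (9)–(11) can be checked mechanically on an explicit helical field of the kind sketched in Figure 2. The following symbolic verification is our addition (the unit amplitudes, the wavenumber $k_w$, and the particular field orientations are assumptions, not taken from the paper): it confirms the curl relations (9), the divergence conditions (10), the wave equations (11), and the orthogonality and constant squared modulus used in the Discussion, for one choice of chirality. Flipping the sign of $\underline{S}$ selects the opposite chirality and reverses the sign pairing in Equations (9).

```python
# Verify Eqs. (9)-(11) on an explicit helical (circularly polarised)
# plane wave.  phi = k_w*(x - c*t) is a right-travelling phase; unit
# amplitudes are assumed for illustration.
import sympy as sp

x, y, z, t, c, kw = sp.symbols('x y z t c k_w', positive=True)
phi = kw * x - kw * c * t

I = sp.Matrix([0, sp.cos(phi), sp.sin(phi)])    # information field
S = sp.Matrix([0, sp.sin(phi), -sp.cos(phi)])   # entropy field, in quadrature

def curl(F):
    # Cartesian curl of a 3-component field F(x, y, z, t).
    return sp.Matrix([
        sp.diff(F[2], y) - sp.diff(F[1], z),
        sp.diff(F[0], z) - sp.diff(F[2], x),
        sp.diff(F[1], x) - sp.diff(F[0], y),
    ])

def div(F):
    return sp.diff(F[0], x) + sp.diff(F[1], y) + sp.diff(F[2], z)

def wave_residual(F):
    # nabla^2 F - (1/c^2) d^2F/dt^2, componentwise.
    lap = F.applyfunc(lambda f: sp.diff(f, x, 2) + sp.diff(f, y, 2) + sp.diff(f, z, 2))
    return (lap - F.diff(t, 2) / c**2).applyfunc(sp.simplify)

print((curl(I) - S.diff(t) / c).applyfunc(sp.simplify))  # zero vector: Eq. (9a)
print((curl(S) + I.diff(t) / c).applyfunc(sp.simplify))  # zero vector: Eq. (9b)
print(div(I), div(S))                                    # 0, 0: Eqs. (10a,b)
print(wave_residual(I), wave_residual(S))                # zero vectors: Eqs. (11a,b)
print(sp.simplify(I.dot(S)), sp.simplify(I.dot(I) + S.dot(S)))  # 0 and 2
```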

3. Discussion

As defined previously, the information and entropy fields are mutually orthogonal, since $\underline{I} \cdot \underline{S} = 0$. Analogous to the Poynting vector describing energy flow of an EM wave, the direction of propagation of the wave is given by $\underline{I} \times \underline{S}$, e.g., see Figure 2. The dynamic relationship between I and S implies that information flow tends to be accompanied by entropy flux [5,6], such that the movement of data is dissipative in order to satisfy the 2nd Law of Thermodynamics. Analytic functions such as I and S only have points of inflection, so that, again, the 2nd Law with respect to the S-field is a natural consequence of its monotonicity. However, in analogy to an EM wave, Equations (9) and (10) in combination imply that the sum of the squared information and entropy field moduli, $|\underline{I}|^2 + |\underline{S}|^2$, may be a conserved quantity. Hence, the 2nd Law of Thermodynamics should perhaps be viewed as a conservation law for info-entropy, similar to the 1st Law for energy. This has implications for the treatment of information falling into a black hole.
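The conservation statement can be made explicit with a short derivation (our addition, following the standard Poynting-theorem argument): dotting Equation (9a) with $\underline{S}$ and Equation (9b) with $\underline{I}$, and using the vector identity $\nabla \cdot (\underline{A} \times \underline{B}) = \underline{B} \cdot (\nabla \times \underline{A}) - \underline{A} \cdot (\nabla \times \underline{B})$, gives a continuity equation in which $\underline{I} \times \underline{S}$ plays the role of the Poynting vector:

$$\frac{1}{2}\frac{d}{dt}\left( |\underline{I}|^2 + |\underline{S}|^2 \right) = c\left[ \underline{S} \cdot (\nabla \times \underline{I}) - \underline{I} \cdot (\nabla \times \underline{S}) \right] = c\,\nabla \cdot (\underline{I} \times \underline{S})$$

The info-entropy density therefore changes only through a flux directed along $\underline{I} \times \underline{S}$, and is globally conserved for localised fields; the overall sign of the flux term follows the chirality conventions adopted in Equations (9).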
Figure 2. (a) Right-handed polarisation helical info-entropy wave, propagating in the positive $\underline{x}_i$-direction given by $\underline{I} \times \underline{S}$. (b) Left-handed polarisation helical info-entropy wave travelling in the same direction.
The reason that the differential Equations (6) and (7) obey the laws of relativity is that they are simple spatial and temporal reciprocal quantities, with their ratios equivalent to velocities, so that the relativistic transformation laws apply to them directly. This reciprocal-space aspect implies that calculation of Fourier-space info-entropy quantities results in quantities proportional to real space, which are therefore also relativistically invariant and holomorphic.
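One way to make the "ratios equivalent to velocities" remark concrete (our reading; the original does not spell this out) is to take the ratio of the quadrature derivatives in Equations (6):

$$\frac{(1/c)\,\partial I'/\partial t}{\partial I'/\partial x} = \frac{-(k/x_0')\sin\theta}{(k/x_0')\cos\theta} = -\tan\theta$$

Under the boost-as-rotation substitutions used in the Appendix ($\cos\theta = \gamma$, $\sin\theta = i\gamma\beta_1$), $\tan\theta$ is proportional to the normalised frame velocity $\beta_1$, so the derivative ratio is indeed a velocity-like, frame-dependent quantity built from reciprocal lengths and times.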
Equation (9a) shows that a right-handed helical spatial information distribution may be associated with an increase of entropy with time (Figure 2a), in contrast to a left-handed chirality, where entropy may decrease with time (Figure 2b). A link is therefore possibly revealed between the high information density of right-handed DNA [18] and the overall increase of entropy with time in our universe [19]. A molecule such as DNA is an efficient system for information storage. However, our dynamic model suggests that information would radiate away unless a localisation mechanism were present, e.g., the existence of a standing wave. Such waves require forward and backward wave propagation of the same polarisation. We see that the complementary anti-parallel (C2 spacegroup) symmetry of DNA's double helix means that information waves of the same polarisation (chirality) would travel in both directions along the helix axis to form a standing wave, thus localising the information. Small left-handed molecules (e.g., amino acids, enzymes and proteins, as well as short sections of Z-DNA [20]) are known to exist alongside DNA within the cell nucleus. Their chirality may thus be associated with a decrease of entropy with time. In conjunction with right-handed molecules, they can be understood as possibly regulating the thermodynamic function of the DNA molecule. As is well known, highly dynamic systems require negative feedback to direct and control their output, without which they become unstable. We can draw a parallel to the high entropic potential of DNA, with left-handed molecules (as well as right-handed molecules) potentially acting as damping agents to control its thermodynamic action.
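The standing-wave localisation argument can be illustrated with a short symbolic superposition (our sketch; the unit amplitudes and the wavenumber $k_w$ are assumptions): adding forward- and backward-travelling information waves of the same polarisation yields components whose space and time dependence factorise, i.e., a non-propagating helical pattern that holds the information in place.

```python
# Superpose forward- and backward-travelling helical information waves of
# the same polarisation; each component factorises into f(x)*g(t), i.e.,
# a standing wave, which is the localisation mechanism proposed above.
import sympy as sp

x, t, c, kw = sp.symbols('x t c k_w', positive=True)
fwd = sp.Matrix([0, sp.cos(kw*x - kw*c*t), sp.sin(kw*x - kw*c*t)])
bwd = sp.Matrix([0, sp.cos(kw*x + kw*c*t), sp.sin(kw*x + kw*c*t)])

standing = (fwd + bwd).applyfunc(sp.expand_trig)
print(standing)  # [0, 2*cos(c*k_w*t)*cos(k_w*x), 2*cos(c*k_w*t)*sin(k_w*x)]
```

The sum is a "frozen" helix whose amplitude oscillates in time rather than a pattern that translates along x, which is the sense in which counter-propagation of the same chirality localises the information.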

4. Conclusions

In conclusion, our model grounds information and entropy in a physical, relativistic framework, and provides thermodynamic insights into the double-helix geometry of DNA. It offers the prediction that non-DNA-based life forms will still tend to have helical structures, and suggests that black holes conserve info-entropy.

References

  1. Bennett, C.H. Demons, engines and the second law. Sci. Am. 1987, 257, 88–96. [Google Scholar] [CrossRef]
  2. Landauer, R. Computation: A fundamental physical view. Phys. Scr. 1987, 35, 88–95. [Google Scholar] [CrossRef]
  3. Brillouin, L. Science and Information Theory; Academic: New York, NY, USA, 1956. [Google Scholar]
  4. Landauer, R. Information is physical. Phys. Today 1991, 44, 23–29. [Google Scholar] [CrossRef]
  5. Parker, M.C.; Walker, S.D. Information transfer and Landauer’s principle. Opt. Commun. 2004, 229, 23–27. [Google Scholar] [CrossRef]
  6. Parker, M.C.; Walker, S.D. Is computation reversible? Opt. Commun. 2007, 271, 274–277. [Google Scholar] [CrossRef]
  7. Garrison, J.C.; Mitchell, M.W.; Chiao, R.Y.; Bolda, E.L. Superluminal signals: Causal loop paradoxes revisited. Phys. Lett. A 1998, 245, 19–25. [Google Scholar] [CrossRef]
  8. Brillouin, L. Wave Propagation and Group Velocity; Academic: New York, NY, USA, 1960. [Google Scholar]
  9. Stenner, M.D.; Gauthier, D.J.; Neifeld, M.A. The speed of information in a “fast-light” optical medium. Nature 2003, 425, 695–697. [Google Scholar] [CrossRef] [PubMed]
  10. Wynne, K. Causality and the nature of information. Opt. Commun. 2002, 209, 85–100. [Google Scholar] [CrossRef]
  11. Carey, J.J.; Zawadzka, J.; Jaroszynski, D.A.; Wynne, K. Noncausal time response in frustrated total internal reflection. Phys. Rev. Lett. 2000, 84, 1431–1434. [Google Scholar] [CrossRef] [PubMed]
  12. Heitmann, W.; Nimtz, G. On causality proofs of superluminal barrier traversal of frequency band limited wave packets. Phys. Lett. A 1994, 196, 154–158. [Google Scholar] [CrossRef]
  13. Arfken, G.B.; Weber, H.J. Mathematical Methods for Physicists; Academic: New York, NY, USA, 1995; Chapter 6. [Google Scholar]
  14. Primas, H. The representation of facts in physical theories. In Time, Temporality, Now; Atmanspacher, H., Ruhnau, E., Eds.; Springer: Berlin, Germany, 1997; pp. 241–263. [Google Scholar]
  15. Gershenfeld, N. The Physics of Information Technology; Cambridge University Press: Cambridge, UK, 2000; Chapter 4. [Google Scholar]
  16. Jackson, J.D. Classical Electrodynamics; John Wiley & Sons: New York, NY, USA, 1999; Chapter 11. [Google Scholar]
  17. Peiponen, K.-E.; Vartiainen, E.M.; Asakura, T. Dispersion, Complex Analysis, and Optical Spectroscopy: Classical Theory; Springer: Berlin, Germany, 1999. [Google Scholar]
  18. Watson, J.D.; Crick, F.H.C. A structure for deoxyribose nucleic acid. Nature 1953, 171, 737–738. [Google Scholar] [CrossRef] [PubMed]
  19. Zurek, W.H. Complexity, Entropy and the Physics of Information; Addison-Wesley: Redwood City, CA, USA, 1989; Volume 8. [Google Scholar]
  20. Siegel, J.S. Single-handed cooperation. Nature 2001, 409, 777–778. [Google Scholar] [CrossRef] [PubMed]

Appendix

We generalise the x-dimension to the ith space-dimension ($i = 1, 2, 3$) such that for a pole travelling in the $\underline{x}_i$-direction (i.e., equivalent to a plane wave travelling in the $\underline{x}_i$-direction) the info- and entropy-fields vibrate in the mutually orthogonal $\underline{x}_j$- and $\underline{x}_k$-directions respectively, where again $j, k = 1, 2, 3$ and $i \neq j \neq k$. The vector descriptions of the I and S fields are:
[Equations (A1a,b): vector forms of the $\underline{I}$- and $\underline{S}$-fields, rendered only as images in the source and not recoverable here.]
In analogy to an EM plane-wave, two plane-wave polarisations are therefore possible:
[Equations (A2a,b): the two helical plane-wave polarisations, rendered only as images in the source and not recoverable here.]
where we have generalised the position of the point of non-analyticity (pole) to $z = x_i' + ict$, i.e., dropped the subscript “0”. We see that the information and entropy fields are mutually orthogonal, since $\underline{I} \cdot \underline{S} = 0$. Also, the direction of propagation of the wave is given by $\underline{I} \times \underline{S}$, with the sign of $I_{j,k}$ chosen so that flow is in the positive $\underline{x}_i$-direction. We make use of the 4 × 4 transformation matrix $g$, relating one relativistic frame of reference to another [16]:
$$g = \begin{pmatrix} \gamma & -\gamma\beta_1 & -\gamma\beta_2 & -\gamma\beta_3 \\ -\gamma\beta_1 & 1 + A\beta_1^2 & A\beta_1\beta_2 & A\beta_1\beta_3 \\ -\gamma\beta_2 & A\beta_1\beta_2 & 1 + A\beta_2^2 & A\beta_2\beta_3 \\ -\gamma\beta_3 & A\beta_1\beta_3 & A\beta_2\beta_3 & 1 + A\beta_3^2 \end{pmatrix} \quad (A3)$$
where $\beta_1$, $\beta_2$ and $\beta_3$ are the normalised velocities (with respect to c) in the co-ordinate directions, with $\beta^2 = \beta_1^2 + \beta_2^2 + \beta_3^2 = 1$ in this case, since the overall wave velocity is equal to c. In addition, the parameters γ and A are related by $\gamma = (1 - \beta^2)^{-1/2}$, and also $A = (\gamma - 1)/\beta^2$. We note that equation (A3) is made equivalent to (4) by substituting $\cos\theta = \gamma$ and $\sin\theta = i\gamma\beta_1$ (a rotation through an imaginary angle), with $\beta_2 = \beta_3 = 0$. Performing a dynamic calculus on equations (A2a), we derive the following expressions:
[Equations (A4a–d): the four derivative expressions for polarisation (a), rendered only as images in the source and not recoverable here.]
Consideration of the second polarisation (b) allows us to derive the additional following expressions:
[Equations (A5a–d): the corresponding four derivative expressions for polarisation (b), rendered only as images in the source and not recoverable here.]
Since the direction of propagation is in the $\underline{x}_i$-direction, we must have $\beta_j = \beta_k = 0$, so that $\beta_i = \beta$, and therefore $\gamma = 1 + A\beta_i^2$, such that $g_{ii} = g_{00}$, as well as $g_{i0} = g_{0i}$. Given $x_i' = ct'$ as usual, in the $\underline{x}_j$-direction the following pair of equations holds:
[Equations (A6a,b): the Cauchy-Riemann pair for the $\underline{x}_j$-direction field components, rendered only as images in the source and not recoverable here.]
Likewise, in the $\underline{x}_k$-direction we find the following pair of equations holds:
[Equations (A7a,b): the corresponding Cauchy-Riemann pair for the $\underline{x}_k$-direction, rendered only as images in the source and not recoverable here.]
We see that both pairs of equations (A6) and (A7) respectively obey the Cauchy-Riemann symmetries evident in Equations (8). Considering the three cyclic permutations of i,j and k, we can write:
$$\frac{\partial I_k}{\partial x_j} - \frac{\partial I_j}{\partial x_k} = \frac{1}{c}\frac{dS_i}{dt} \quad (A8a)$$
$$\frac{\partial S_k}{\partial x_j} - \frac{\partial S_j}{\partial x_k} = -\frac{1}{c}\frac{dI_i}{dt} \quad (A8b)$$
Equations (A8) can be alternatively written in the compact vector notation of equations (9).
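As a final numerical sanity check (our addition; the test velocity components are arbitrary), the boost matrix in the form reconstructed in (A3) should preserve the Minkowski metric for any subluminal β; the light-like case β = 1 used in the text above is the singular limit of this same family.

```python
# Check that the boost matrix of Eq. (A3) preserves the Minkowski metric
# diag(1, -1, -1, -1) for an arbitrary subluminal velocity beta.
import numpy as np

def boost(beta):
    b = np.asarray(beta, dtype=float)
    b2 = b @ b                       # beta^2 = beta1^2 + beta2^2 + beta3^2
    gamma = 1.0 / np.sqrt(1.0 - b2)
    A = (gamma - 1.0) / b2           # so that gamma = 1 + A*beta_i^2 on-axis
    g = np.zeros((4, 4))
    g[0, 0] = gamma
    g[0, 1:] = -gamma * b
    g[1:, 0] = -gamma * b
    g[1:, 1:] = np.eye(3) + A * np.outer(b, b)
    return g

eta = np.diag([1.0, -1.0, -1.0, -1.0])
g = boost([0.6, 0.2, 0.3])
print(np.allclose(g.T @ eta @ g, eta))   # True: the metric is preserved
```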
