Article

Information in Biological Systems and the Fluctuation Theorem

Department of Chemical and Biomolecular Engineering, University of Nebraska-Lincoln, 820 N. 16th Street, Lincoln, NE, 68588, USA
Entropy 2014, 16(4), 1931-1948; https://0-doi-org.brum.beds.ac.uk/10.3390/e16041931
Submission received: 17 January 2014 / Revised: 27 March 2014 / Accepted: 28 March 2014 / Published: 1 April 2014
(This article belongs to the Special Issue Advances in Methods and Foundations of Non-Equilibrium Thermodynamics)

Abstract

Some critical trends in information theory, its role in living systems, and its utilization in fluctuation theory are discussed. The mutual information of thermodynamic coupling is incorporated into the generalized fluctuation theorem by using information theory and nonequilibrium thermodynamics. Thermodynamically coupled dissipative structures in living systems are capable of degrading more energy and of processing complex information through developmental and environmental constraints. The generalized fluctuation theorem can quantify the hysteresis observed in the amount of irreversible work in nonequilibrium regimes in the presence of information and thermodynamic coupling.

1. Introduction

Definition and quantification of information have created broad discussions; in particular, “information theory” in living systems is an evolving field [1–14]. Evolution is an axiomatic consequence of organismic information obeying the second law of thermodynamics; as entropy increases, the information within a biological system becomes more complex and diversifies at bifurcation points [15–18]. The behavior of living systems is decomposable into functional modes, and the ensemble of modes at an agent’s disposal constitutes its functional repertoire. The modes may be subjected to operational signals and to a mechanism that sequentially selects a mode so that it controls the functional dynamics [18,19].

In conventional thermodynamics, the amount of entropy is independent of how the process is regarded as being divided into irreducible subsystems; the entropy of a system can be calculated from the entropies of its subsystems. If the subsystems are statistically independent, that is, not correlated through mutual information, the entropy is additive: Shannon’s entropy of the total system is the sum of those of the subsystems [2,3,11], and it is restricted to random variables taking discrete values. In small thermodynamic systems, such as those at the mesoscopic or nanoscopic scale, the mutual information can serve as a resource of work or free energy through information processing such as feedback control. Use of the mutual information as a source of entropy decrease is accompanied by no heat dissipation [3]. This is contrary to conventional thermodynamics, where a decrease in entropy is due only to heat dissipation.

The overdamped motion x(τ) of a system in contact with a heat bath and with a single continuous degree of freedom can be described by the Langevin equation ẋ = μF(x,λ) + ζ. The systematic force F(x,λ) can arise from a conservative potential and/or be applied to the system directly as a nonconservative force, while ζ is the stochastic force, which is not affected by the time-dependent force, and μ is a positive constant. The Langevin dynamics generates trajectories x(τ) starting at x0. For an arbitrary number of degrees of freedom, x and F become vectors. The Langevin equation is the generic equation of motion for a single fluctuating thermodynamic quantity, such as the concentration of a chemical species in the vicinity of equilibrium [4,16,17].
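A minimal numerical sketch of such overdamped Langevin dynamics is given below, using an Euler–Maruyama step; the harmonic force F(x) = −kx, the mobility μ, the noise strength D, and all numerical values are illustrative assumptions, not parameters from the article.

```python
import numpy as np

def langevin_trajectory(x0=1.0, mu=1.0, k=1.0, D=0.5, dt=1e-3, steps=5000, seed=0):
    """Euler-Maruyama integration of dx/dt = mu*F(x) + zeta for an assumed
    harmonic force F(x) = -k*x and Gaussian white noise of strength D."""
    rng = np.random.default_rng(seed)
    x = np.empty(steps + 1)
    x[0] = x0
    for t in range(steps):
        force = -k * x[t]                                 # conservative force from V(x) = k*x^2/2
        noise = np.sqrt(2.0 * D * dt) * rng.standard_normal()
        x[t + 1] = x[t] + mu * force * dt + noise
    return x

traj = langevin_trajectory()
print("final position: %.3f, late-time variance: %.3f" % (traj[-1], traj[2000:].var()))
```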

For a system in contact with a heat bath, the symmetry of the probability distribution of entropy production in the steady state is known as the fluctuation theorem. Crooks’ fluctuation theorem compares the probability distribution for the work required in the original process with that of the time-reversed process. The probabilistic approach has gained broader appeal due to advances in experimental techniques, such as atomic force microscopes and optical tweezers, for tracking and manipulating single particles and molecules [20–25]. This study discusses some emerging efforts in information theory, its role in living systems, and its utilization in fluctuation theory. The relationship of thermodynamic coupling with information theory and its consequences for the fluctuation theorem are emphasized.

2. Information Theory

Information may be defined as the capacity to reduce statistical uncertainty in the communication of messages between a sender and a receiver [3–6]. Consider the number of ways W in which N distinguishable entities can be assigned to M distinguishable states such that there are ni entities in state i:

W = \frac{N!}{n_1! \, n_2! \cdots n_M!}

The probabilities pi are related to the maximum entropy H in the limit of large N and ni; the asymptotic result from Stirling’s approximation (ln N! ≈ N ln N) yields:

\frac{1}{N} \ln W = -\sum_{i}^{M} p_i \ln p_i = H

where the occupation frequency of state i is pi = ni/N [25].

In Shannon’s theory, entropy represents the amount of uncertainty one particular observer has about the state of the system [3,5]. For a variable X with N possible states x1, x2,…, xN, the probability of finding X in state xi is pi, and Shannon’s entropy H of X is H(X) = -\sum_{i}^{N} p_i \ln p_i. If nothing is known about X, we have H(X) = ln N, the maximum value H(X) can take; this occurs if all the states are equally likely, pi = 1/N. If, on the other hand, X is known to be in a particular state, say X = x5, the uncertainty about X becomes smaller, so H(X) quantifies how far we are from a complete description of X. A probability distribution built from prior knowledge or measurements can therefore teach us something about a system. The difference between the maximal and the actual entropy after our measurements or analysis is the amount of information we have about the system. As it measures a difference of uncertainty, information is a relative quantity [5]. In the problem of ion channel thermodynamics, for example, complex thermodynamic models can be built by successively adding physical information [23].
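A minimal sketch (with an arbitrary four-state distribution, chosen only for illustration) computes Shannon’s entropy and the information gained relative to the maximum entropy ln N:

```python
import numpy as np

def shannon_entropy(p):
    """H = -sum p_i ln p_i in nats, ignoring zero-probability states."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

# Hypothetical 4-state variable X; the probabilities are illustrative only.
p_actual = [0.7, 0.2, 0.05, 0.05]
H_max = np.log(len(p_actual))           # ln N, all states equally likely
H_act = shannon_entropy(p_actual)       # actual entropy after measurement
print("H_max = %.3f nats, H = %.3f nats, information = %.3f nats"
      % (H_max, H_act, H_max - H_act))
```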

If we define another random variable Y with its states y1, y2,…,yM and probabilities p1, p2,…, pM, then the joint entropy H(X,Y) measures our uncertainty about the joint system XY in N·M states. If X and Y are somehow connected, such as two molecules that can bind to each other, the information that one molecule has about the other is:

I(X{:}Y) = H(X) + H(Y) - H(XY)

Here “:” indicates that information is symmetric: X and Y know each other equally. If the state of Y is known, then the so-called “conditional entropy” becomes:

H(X/Y) = H(XY) - H(Y)

For independent variables X and Y, we get H(XY) = H(X) + H(Y). With the conditional entropy, Equation (3) becomes:

I(X{:}Y) = H(X) - H(X/Y)

The equation above shows that information measures the deviation from independence, that is, the amount by which the entropy of X or Y is reduced by knowing the other [5]. Maximization of the information entropy (IE) determines the probability of a particular state of the system. This leads to the relation between the probability of a nonequilibrium system and the number of microscopic trajectories [23–26].
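A brief sketch, using a made-up joint distribution for two binary variables (e.g., “bound”/“unbound” states of two molecules), computes the mutual information of Equation (3) and checks the conditional-entropy relation above; the numbers are illustrative assumptions.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in nats of a (possibly multidimensional) distribution."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log(p))

# Hypothetical joint distribution p(X, Y); values are illustrative only.
p_xy = np.array([[0.40, 0.10],
                 [0.10, 0.40]])

H_X  = entropy(p_xy.sum(axis=1))   # marginal entropy H(X)
H_Y  = entropy(p_xy.sum(axis=0))   # marginal entropy H(Y)
H_XY = entropy(p_xy)               # joint entropy H(XY)

I_XY = H_X + H_Y - H_XY            # Equation (3): I(X:Y) = H(X) + H(Y) - H(XY)
H_X_given_Y = H_XY - H_Y           # conditional entropy H(X/Y)
print("I(X:Y) = %.4f nats, check H(X) - H(X/Y) = %.4f nats" % (I_XY, H_X - H_X_given_Y))
```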

2.1. Information and Thermodynamics

Maximum entropy production (MEP) may be an organizational principle applicable to physical and biological systems. Nonequilibrium thermodynamics and MEP can be used to derive overall evolutionary trends of Earth’s past; for example, they can identify the role that life plays in driving thermodynamic states far from equilibrium [3,6–13,27–35]. Various derivations of MEP using the maximum information entropy (MIE) procedure of Jaynes [36] exist in the literature [2]. In these derivations the information entropy (IE) is not defined by a probability measure on phase space, but on path space for stationary nonequilibrium systems [26].

Implementation of the MEP principle also includes the steepest entropy ascent formalism based on a quantum thermodynamics approach [37–39]. Beretta [37] has examined the steepest entropy ascent formalism for transitions between states and adopted a geometrical view of the dynamics in probability space based on identifying a unique path of locally maximal entropy increase.

Consider M sites with a variable ni(t) (i = 1,2,…,M) at each site, with t = 0,1,…,τ. The (time asymmetric) flux Jij = −Jji from i to j, occurring randomly at every time step, depends on a parameter cij = cji such that Jij(t) = ±cij with stochastic sign. A microscopic path a is a set of values ±cij so that ni,a(t+1) − ni,a(t) = −∑j Jij,a(t). The path-dependent time-averaged flux is J̄ij,a = (1/τ)∑t Jij,a(t), and ni(0) does not depend on the complete path. With the microscopic path-dependent probability pa, the path ensemble average is 〈J̄ij〉 = ∑a pa J̄ij,a. By using Jaynes’ information theory and maximizing the path information entropy [2,24,25]:

S_I = -\sum_a p_a \ln p_a

with the constraints [2,25]:

1 = \sum_a p_a
n_i(0) = \sum_a p_a \, n_{i,a}(0)
N_{ij} = -\sum_a p_a \bar{J}_{ij,a}

the most likely probability on the path space is estimated as:

p_a = \frac{1}{Z} \exp(A_a)

where Nij is the numerical value of the time and path ensemble average of the flux Jij, Aa is the path action A_a = \sum_i \lambda_i n_{i,a}(0) + \sum_{ij} n_{ij} \bar{J}_{ij,a}, in which λi and nij = −nji are the Lagrange multipliers of constraints (8) and (9), respectively, and Z is the partition function [2,24,25].

However, a trajectory of a quantity possessed by a system may fluctuate wildly (if far from equilibrium) or weakly; such trajectories would then not have the same probabilities even though they have the same initial and final states. Here the path trajectory a is a sequence of p over a time interval of M steps:

a \equiv [\,p(0), p(dt), p(2dt), \ldots, p(M\,dt)\,]

where dt is the coarse graining corresponding to the time scale of experimental observations [4].

The partition sum and the constitutive equation of motion have the relations:

\frac{\partial \ln Z}{\partial n_{ij}} = N_{ij}
X_{ij} = \frac{n_{ij}}{\tau}

The forward and backward components of the time and ensemble averaged fluxes are:

N_{ij} = N_{ij}^{f} - N_{ij}^{b} = c_{ij}\,\frac{e^{m}}{e^{m}+e^{-m}} - c_{ij}\,\frac{e^{-m}}{e^{m}+e^{-m}}

where m = X_{ij} c_{ij} and 2m = \ln(N_{ij}^{f}/N_{ij}^{b}).
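A small numeric sketch (with assumed values for cij and m, chosen only for illustration) evaluates the forward and backward flux components above and checks the relation 2m = ln(Nf/Nb):

```python
import numpy as np

# Illustrative values for the link parameter c_ij and m = X_ij * c_ij.
c_ij = 2.0
m = 0.7

N_f = c_ij * np.exp(m)  / (np.exp(m) + np.exp(-m))   # forward component
N_b = c_ij * np.exp(-m) / (np.exp(m) + np.exp(-m))   # backward component
N_ij = N_f - N_b                                     # net time/ensemble averaged flux

print("N_ij = %.4f (equals c_ij*tanh(m) = %.4f)" % (N_ij, c_ij * np.tanh(m)))
print("ln(N_f/N_b) = %.4f, 2m = %.4f" % (np.log(N_f / N_b), 2 * m))
```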

The entropy production of a microscopic path a is [2]:

\sigma_a = \sum_{ij} X_{ij} N_{ij}

By using the most likely probability on the path space pa (given in Equation (10)) in Equation (6), the maximum information entropy as a function of the force X becomes:

S_{I,\max}(X) = \ln Z(X) - \langle A(X) \rangle \approx \ln W(\langle A(X) \rangle)

where W(〈A(X)〉) is the density of paths, W being the number of paths and 〈A(X)〉 the average path action.

The entropy curvature (response) matrix and the probability distribution for the time averaged flux are [2]:

A_{ij,kl}(N) = \frac{\partial X_{ij}}{\partial N_{kl}} = -\frac{\partial^2 S_{I,\max}(X(N))}{\tau \, \partial N_{ij} \, \partial N_{kl}}
p(\bar{J}) \propto \exp\!\left( -\frac{\tau}{2} \sum_{ij,kl} \left[ \bar{J}_{ij} - N_{ij} \right] A_{ij,kl}(N) \left[ \bar{J}_{kl} - N_{kl} \right] \right)

In the near-equilibrium regime, that is, for small X, the maximum path information becomes:

S_{I,\max}(X) = \ln(W) \approx \ln 2 \, \tau (M^2 - M) + \tau \sigma / 2

The first part on the right side of the equation above is the logarithm of the total number of paths for a uniform probability distribution, while the second term is the entropy production term. In the MEP approach, the assumption is that the number of paths W should be an increasing function of the averaged action [24–26]. Here, for a higher value of entropy production, the value of SI is lowered [2].

The MEP principle states that if the thermodynamic force Xi is preset, then the true thermodynamic flow Ji, satisfying the condition for the local rate of entropy production στ:

\sigma_\tau = \sum_i J_i X_i \geq 0

yields the maximum value of στ(J). This can be written using the Lagrange multiplier λ at fixed force:

\delta_J \left[ \sigma_\tau(J_k) - \lambda \left( \sigma_\tau(J_k) - \sum_i J_i X_i \right) \right]_X = 0

In the meantime, the relationship between the fluxes and forces:

X_i = \frac{\sigma(J)}{\sum_i J_i (\partial \sigma / \partial J_i)} \, \frac{\partial \sigma}{\partial J_i}

indicates that the flux-force relationship can be either linear or nonlinear [23].
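To make the orthogonality condition above concrete, the following sketch assumes a quadratic dissipation σ(J) = JᵀRJ with an illustrative symmetric positive-definite matrix R (an assumption for the example, not from the article); in that special case the condition reduces to the linear relation X = RJ, recovering linear flux-force laws.

```python
import numpy as np

rng = np.random.default_rng(1)
R = np.array([[2.0, 0.5],
              [0.5, 1.0]])          # illustrative symmetric, positive-definite matrix
J = rng.normal(size=2)              # arbitrary flux vector

sigma = J @ R @ J                   # assumed quadratic dissipation sigma(J) = J^T R J
grad  = 2.0 * R @ J                 # d(sigma)/dJ_i
X = sigma / (J @ grad) * grad       # the orthogonality condition given above

print("X from orthogonality :", X)
print("R @ J (linear law)   :", R @ J)                     # identical for quadratic sigma
print("sum_i J_i X_i = %.6f, sigma = %.6f" % (J @ X, sigma))
```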

The same entropy production can be both a maximum and a minimum depending on the constraints used in the entropy production variation. However, it is widely argued that the MEP principle may be a critical link in explaining the direction of biological evolution under the constraints imposed by the environment [22,23,26]. If X is fixed, MEP leads to maximum J, that is, to the selection of the fastest process. The MEP principle has proved valuable for understanding and describing various nonequilibrium systems in physics, biology, and the environment. The local equilibrium hypothesis for a nonequilibrium system and the representation of the entropy production as a bilinear form of flows and forces are mandatory conditions for the use of the MEP principle [23–31].

In the cortex, populations of neurons continuously receive input signals from other neurons, interpret their ongoing activity, and generate output signals destined for other neurons. This information processing and transmission is limited by the repertoire of different activated configurations available to the population. The extent of this repertoire may be estimated by its entropy H, which characterizes the information capacity as an upper limit on the information processing of the population. When information is transmitted from input to output by a population that has only two states in its repertoire (H = 1 bit), then regardless of how much information the input contains, the output information content cannot exceed 1 bit. Therefore, a network with a low-entropy population may limit information transmission. Activity in the cortex depends on the ratio of fast excitatory Ex to inhibitory In synaptic signals to neurons. The Ex/In ratio remains fixed on average across various events despite highly fluctuating activity levels; yet a small Ex/In ratio (caused, for example, by a weak excitation drive) may reduce the correlations as well as the overall level of activity [7].

For a number of unique binary patterns, the entropy is:

H = -\sum_{i=1}^{n} p_i \log_2 p_i

where pi is the probability that pattern i occurs, estimated for each pattern from the observed activity. Maximization of entropy may be an organizing principle of neural information-processing systems [7]. The information capacity IC in binary units may be expressed as a function of the probability p:

I_C = \frac{1}{\ln 2} \left( \sum_{j=1}^{\Omega} p_j \ln p_j - \sum_{j=1}^{\Omega} p_j^{o} \ln p_j^{o} \right)

where Ω is the number of possibilities, p° is the probability at equilibrium (i.e., no knowledge), and p is the probability when some information is available about the system. Information here is used as a measure of structure [1,16].
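A small sketch evaluates the IC expression above for an assumed distribution over Ω = 8 activity patterns, with a uniform equilibrium distribution p°; both distributions are illustrative assumptions.

```python
import numpy as np

def info_capacity_bits(p, p0):
    """IC = (1/ln 2) * (sum p ln p - sum p0 ln p0), as in the expression above."""
    p, p0 = np.asarray(p, float), np.asarray(p0, float)
    return (np.sum(p[p > 0] * np.log(p[p > 0]))
            - np.sum(p0[p0 > 0] * np.log(p0[p0 > 0]))) / np.log(2)

omega = 8
p0 = np.full(omega, 1.0 / omega)                            # equilibrium: uniform, no knowledge
p  = np.array([0.5, 0.2, 0.1, 0.1, 0.05, 0.05, 0.0, 0.0])   # assumed measured distribution

H_bits = -np.sum(p[p > 0] * np.log2(p[p > 0]))
print("IC = %.3f bits (equals log2(Omega) - H = %.3f bits)"
      % (info_capacity_bits(p, p0), np.log2(omega) - H_bits))
```

For a uniform reference distribution, IC reduces to the difference between the maximal entropy log2(Ω) and the actual entropy H, consistent with the use of information as a measure of structure.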

3. Biological Systems

3.1. Genetic Information Use in Protein Synthesis

Genes are sequences of deoxyribonucleic acid (DNA) nucleotides composed of four bases (guanine G, adenine A, cytosine C, thymine T). Genes carry and transmit the information needed to specify the 20 amino acids for protein synthesis. The genome refers collectively to the total genetic information coded in a cell. Ribonucleic acid (RNA) molecules transfer information from DNA to the site of protein synthesis [8–14]:

\text{DNA} \xrightarrow{\text{transcription}} \text{mRNA} \xrightarrow{\text{translation}} \text{Protein}

During transcription, one of the strands of the DNA acts as a template and determines the sequence of RNA nucleotides. RNA contains the base uracil U in place of thymine T. The transcribed RNA binds to ribosomes in the cytoplasm, where proteins are synthesized from the encoded information. Only exon segments actually code for amino acids, and a continuous sequence of exons becomes mRNA, which binds to a ribosome. In translation, each ribosome is composed of proteins and ribosomal RNA (rRNA). Transfer RNA (tRNA) is the link between an amino acid and its mRNA codon. Each amino acid brought by a tRNA molecule is linked by a peptide bond to the end of the growing protein chain. The completed protein then undergoes folding. Signals from within or outside the cell can turn the transcription of genes on or off. Homeostatic control systems regulate compensatory responses in which the set point of a variable is controlled. In the case of a continuing perturbation, the error signal (negative or positive feedback) may lead to resetting of the set point [8–14].

A stimulus is a detectable change in the internal or external environment. A receptor detects the stimulus and produces a signal that is relayed through various pathways to the integrating center, which receives signals from many receptors. The output of the integrating center reflects the net effect of the total afferent input. The output is sent through the efferent pathway to an effector.

Some of the intercellular chemical messengers are hormones, neurotransmitters, and paracrine agents, which are synthesized by cells and diffuse to target cells upon receipt of an appropriate stimulus. Autocrine agents are local chemical messengers that act upon the same cells that secreted them. Many homeostatic systems add and remove different chemicals in response to a particular stress; it is mostly reversible without the involvement of genetic change.

Receptors are glycoproteins on the plasma membranes of target cells and bind specific messengers. The binding leads to a response through signal transduction pathways that vary in different cell types. For example, neurons generate electric signals that pass from one cell to another and release chemical messengers called neurotransmitters to communicate with other cells. A synapse is a junction between two neurons, and its influence can be either excitatory or inhibitory. If many presynaptic cells affect a single postsynaptic cell, information from many sources can influence the activity of one cell; conversely, if a single presynaptic cell affects many postsynaptic cells, one information source can affect multiple pathways [8,10].

3.2. Information and Biological Systems

DNA is a code, and a code sequence alone does not reveal information. DNA replication starts with an initiator protein, which unwinds a short stretch of the double helix where another protein known as “helicase” attaches and breaks apart the hydrogen bonds between the bases on the DNA strands. The unconditional entropy for a DNA sequence is about two bits per base, while a random protein would have log2(20) ≈ 4.32 bits of entropy per site. Due to repetitions and to pair and triplet correlations, the actual entropy is lower [5]. This entropy per symbol only allows us to estimate our uncertainty about the sequence identity; it does not reveal the “function” of the genes. Estimates of the entropy reduction and information aspects of a living cell are becoming possible in the context of the cell’s metabolic activity and hence its maintenance of the nonequilibrium state [10–14,40,41].

In equilibrium thermodynamics, isolated systems reach maximum entropy and there are no correlations between states; hence there is no information. Information, understood as the amount of correlation between two systems, stored in living systems (biological genomes) indicates that they are far from equilibrium. Consequently, information theory becomes a part of the nonequilibrium thermodynamics of living cells. Information measures the amount of entropy shared between two systems, and it enables us to make predictions about other systems, since entropy can become information only in reference to another ensemble. Therefore, it is the correlations between sequences, not the sequences themselves, that store information. On the other hand, what information a genomic sequence represents depends on the interpreting environment; if a sequence means something, it can create a function (response) necessary for its environment [6,14].

Information theory introduced “functional information”, which relates to the self-organizing capabilities of living systems, and “instructional information”, which is a physical array. However, linkages with the field of semiotics established a much more compatible approach to biological information [8]. Within this trend, “control information” is defined as the capacity to control the acquisition, disposition, and utilization of matter, energy, and information flows functionally.

Functional information I(Ex) represents the probability that an arbitrary configuration of a system (i.e., a complex biological system of many interacting components) will achieve a specific function to a specified degree; it therefore serves as a measure of system complexity. For a given system and function x (e.g., a folded RNA sequence that binds to guanosine triphosphate, GTP) and a degree of function Ex (e.g., the RNA-GTP binding energy), the functional information is I(Ex) = −log2[F(Ex)], where F(Ex) is the fraction of all possible configurations of the system that achieve at least the degree of function Ex [14,32–34]. Functional information, for example, has been used to analyze RNA structures that bind target ligands and RNA structures that catalyze specific reactions. Thus, the degree of function Ex of these linear sequences of RNA letters (the four bases adenine, cytosine, guanine, and uracil, A, C, G, and U) can be defined quantitatively as the binding energy to a particular molecule or the catalytic increase in a specific reaction rate [10,14].

A single RNA nucleotide cannot achieve any significant degree of ribozyme function; a minimum sequence length (nmin nucleotides) is required to achieve Ex > Emin. Increasing the number of nucleotides (n > nmin) will generally lead to more functional sequences. Consequently, for the maximum possible degree of a specific function, Emax, there will be an optimal RNA sequence of length nopt possessing the maximum functional information I_{\max}(E_{\max}) = -\log_2[1/(\sum_{n=1}^{n_{opt}} 4^n)]. For Ex < Emax, the functional information is intermediate, I(Ex) < Imax(Emax) [14].
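As a toy illustration of I(Ex) = −log2[F(Ex)], the sketch below draws “degrees of function” from a made-up lognormal model (an assumption for the example, not data from [14]) and shows how the functional information grows as the required degree of function Ex increases.

```python
import numpy as np

def functional_information(degrees, E_x):
    """I(E_x) = -log2 F(E_x), with F(E_x) the fraction of configurations
    whose degree of function is at least E_x."""
    degrees = np.asarray(degrees, float)
    F = np.mean(degrees >= E_x)
    return -np.log2(F) if F > 0 else np.inf

# Hypothetical degrees of function (e.g., binding energies) for a random population.
rng = np.random.default_rng(42)
degrees = rng.lognormal(mean=0.0, sigma=1.0, size=100_000)

for E_x in (1.0, 3.0, 10.0):
    print("E_x = %5.1f  ->  I(E_x) = %.2f bits" % (E_x, functional_information(degrees, E_x)))
```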

Each position on the genome is coded by one of four bases, and the uncertainty at each position is two bits; the maximum entropy then becomes:

H_{\max} = -\sum_{i=G,C,A,T} p(i) \log_2 p(i) = \log_2(4) = 2 \text{ bits}

since p(i) = 1/4. The actual entropy is obtained from the actual probabilities pj(i) for each position j on the sequence. Given N sequences, pj(i) = nj(i)/N is found by counting the number nj(i) of occurrences of nucleotide i at position j, for all positions j = 1,…,M of a sequence of length M. When correlations between positions j are ignored, the information stored in the sequence becomes:

I = H_{\max} - H = 2M - H \text{ bits}

where H = -\sum_{j=1}^{M} \sum_{i=G,C,A,T} p_j(i) \log_2 p_j(i).
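A toy computation, using a made-up alignment of N = 5 short sequences (the sequences themselves are illustrative), estimates the per-position probabilities pj(i) and evaluates the stored information I = 2M − H:

```python
import numpy as np

# Toy alignment of N sequences of length M (made-up data for illustration).
alignment = ["GATTACA",
             "GATTGCA",
             "GATAACA",
             "GATTACA",
             "CATTACA"]
M = len(alignment[0])
bases = "GCAT"

H = 0.0
for j in range(M):                                    # sum of per-position entropies
    column = [seq[j] for seq in alignment]
    p = np.array([column.count(b) for b in bases], float) / len(alignment)
    p = p[p > 0]
    H += -np.sum(p * np.log2(p))

print("H = %.3f bits, information I = 2M - H = %.3f bits" % (H, 2 * M - H))
```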

The thermodynamics of protein structures implies that sequence and structure are related. If a structural entropy of proteins H(str) is obtained for a given chain length and environment, the mutual entropy between structure and sequence becomes [8]:

I(\text{seq}{:}\text{str}) = H(\text{seq}) - H(\text{seq}/\text{str})

where H(seq) is the entropy of sequences of length M and H(seq/str) is the entropy of sequences given the structure. If the environment requires a certain structure that will be functional in that environment, then H(seq/str) ≈ H(seq/env) and I(seq:str) is approximately equal to the physical complexity. Assuming that any given sequence produces an exact structure, H(str/seq) = 0, Equation (28) becomes:

I(\text{seq}{:}\text{env}) \approx I(\text{seq}{:}\text{str}) = H(\text{str})

Therefore, thermodynamic entropy of a protein structure is limited by the amount of information about the environment coded by the sequence. This may imply that sequences that encode more information about the environment may be more functional.

One of the consequences of the Human Genome Project is the recognition that biology is an “informational science” [5,9]. Communication in living cells is based on received signals, such as electromagnetic (light), mechanical (touch), and chemical signals. In a signal-transduction pathway, a signal on a cell surface is converted into a specific cellular response in a series of functional steps [7]. This suggests that information be conceived as the communication of a form from object to interpreter through the sign. The evolution of ways of storing, transmitting, and interpreting information can be seen as a major step in the increased capacity for collective behavior and robustness in living systems [10,11,29]. Information can be intentionally introduced into, or extracted from, a system through such agencies. A sophisticated information-organizing capacity of biological origin emerges from the uncertainty inherent in the complexity of all living systems [12–14].

In the semiotic understanding of living systems, the interpretation of signs and information is often an interpreter-dependent objective process. Genes should be regarded as signs in DNA, which can have an effect on cell function only through a triadic process, that is, the communication of a form from the object to the interpretant through the mediation of the sign [8]. The object of a sign in DNA is the production of a functional, folded, and chemically configured protein; when a particular gene product is necessary, a signal from the environment activates the expression of a certain gene. The cell as an interpreter alters its internal states, triggered by a collective signal transduction pathway, to establish the boundary conditions for processes and to perform functional processes with the genetic material [8,14].

There are two aspects of molecular and supramolecular information transduction: (i) sphingomyelinase activity leading to lipid-mediated cross-communication between the sphingomyelinase and phospholipase A2 pathways, and (ii) the compositionally driven lateral organization of whole glial and neuronal membrane interfaces leading to differential responses. The interfacial properties may express and regulate the generation, transduction, and amplification of molecular and submolecular properties on temporal and structural scales, mainly at the mesoscopic level. However, the transduction of information mainly occurs between the microscopic (<10^{-6} s, <1 nm) and the macroscopic (>10^{-1} s, >500 nm) ranges [42].

A physiological mechanism may exist to ensure fast transmission of information to that hemisphere which is more efficient in its processing [43]. This shows how functional information is transmitted by parts to produce a response that serves a purpose for the whole. This is the result of coordinated awareness between the parts processing particular information and the parts receiving the transmitted information for preparing a functional output (response) for the whole.

Signal transduction includes the interrelation and information exchange between phosphohydrolytic pathways producing lipid mediators of signal transduction whose formation modifies the interfacial composition. At the molecular level, there exists cross-talk between a glial cell and the axonal membrane surface. There also exists the subtle intercommunication and response of functional living cells to information contained in the molecular organization of contacting cellular membranes [42].

3.3. Thermodynamically Coupled Biological Systems and Information

Thermodynamic coupling [16,17,32,44–49], materialized by certain coupling mechanisms (e.g., configurations, ATPase), has the ability to process information and hence to lower the entropy production σ = ∑iJiXi > 0. This may lead to self-organization, because a process with σ1 < 0 (unnatural) is thermodynamically coupled with another process with σ2 >> 0 (natural) to achieve a desired output function; the coupled system therefore satisfies the second law of thermodynamics, [(σ1 < 0) + (σ2 >> 0)] > 0. Here, thermodynamic coupling refers to a flux J occurring without its primary thermodynamic driving force X, or opposite to the direction imposed by that force. This is consistent with the second law, which states that a finite amount of organization may be obtained at the expense of a greater amount of disorganization in a series of coupled spontaneous processes [16,48,49]. The information processed in thermodynamic coupling is the functional information, which in turn may lower the total entropy [32]. The “complexity” in (biological) systems that display some of the examples of thermodynamic coupling probably refers to the mechanisms enabling the discrete parts of the whole to be aware of each other, hence leading to a coherent and collective behavior to achieve the best degree of desired function [1,3,16,44]. This also indicates that the entropy productions of various coupled processes are additive.

Biochemical reactions coupled with the diffusion of species can lead to molecular pumps and biochemical cycles, such as adenosine triphosphate (ATP) synthesis coupled to the respiratory electron transport. This shows a functional process leading to organized structures, where ATP synthesis (σ < 0) has been made possible and the coupled processes satisfy ∑σ > 0 [17,31–35]. The ATP synthesis, in turn, is matched and synchronized to cellular ATP utilization processes, since energy and matter flows must be directed by information for them to be functional and serve a purpose. For living systems, the information is the cause and the entropy reduction is the result [34–36,40–47]:

\text{Energy} + \text{Matter} + \text{Information} \rightarrow \text{Locally reduced entropy (increase of order)}

The general approach for incorporating thermodynamics into information theory has been to derive probability distributions for nonequilibrium steady states by employing a variational principle. However, composing the appropriate constraints to be used in the variational principle is not straightforward, since there is no definite extremum quantity characterizing the state space of such steady nonequilibrium states. Only in the vicinity of equilibrium may the linear phenomenological laws be useful in that respect [4,16,17]. A natural question, therefore, is how useful such an approach would be for describing the information processing in functionally coupled and self-organized biochemical cycles of living systems, which are mainly far from equilibrium. The probabilistic measure of information derived from Jaynes’ information theory formalism of statistical mechanics is mainly indifferent to meaning [30].

The unified theory of evolution attempts to explain the origin of biological order as a manifestation of the flows of energy, matter, and information on various spatial and temporal scales. Genes originate the information needed to synthesize the required enzymes and regulatory and structural proteins. The genome is the source of cellular codes; in addition, any cellular structure, such as lipids and polysaccharides, may store and transmit information. Moreover, thermodynamic forces in the form of transmembrane gradients of H+, Na+, K+, and Ca2+ and the consequent electric potential cause significant displacements from equilibrium and are, therefore, potential sources of information. The genome-protein system may be a component of a larger ensemble of cellular structures that store, encode, and transmit information [5–10].

Le Chatelier’s principle may be applied to analyze how a protein-signaling network at equilibrium returns to its equilibrium state after being slightly perturbed [20]. For a single cell or a small cell colony, cell-to-cell perturbations are small, while the unperturbed state of a single cell may be unstable in the presence of many other cells. Experiments permit observation of the covariance in the fluctuations of different proteins, and of the evolution of these fluctuations, when a single cell is perturbed in the presence of other cells. Information theory helps analyze this covariance to understand the network of interacting proteins [36]. Signal transduction inside and across cells is complex and cooperative, and occurs far from thermodynamic equilibrium; it is also driven by activation kinetics strongly dependent on the local energetics [7,20,45].

The composite immediate object of a protein-coding gene is the sequence of amino acids of a polypeptide, which can be folded in different ways in different cellular contexts and thus represents a dynamical object. The sign is therefore a sequence of nucleotides in DNA, which becomes effective information only when it is used by an active, coordinated, and coupled system of interpretation in the cell. This means that a range of amino acid sequences required by the environment (the cell) has been constructed [8].

The use of the maximum entropy formalism in biology is expanding [23–31], for instance toward detecting expression patterns in signal transduction. At maximum entropy, the probabilities of the different proteins are not equal; each protein will be present in proportion to its partition function, which is the effective thermodynamic weight of a species at thermal equilibrium. MEP may be a unifying optimization principle for living systems and ecosystem function, in which entropy production might be a general objective function [24–31,46]. It has been stated that “the nonequilibrium state of MEP is the most probable as it can be realized microscopically in a greater number of ways than any other nonequilibrium state.” This statement is not universally accepted [46]. In this sense, MEP is a statistical principle rather than a physical principle open to experimental validation. MEP may predict the behavior of an optimal living system from the perspective of natural selection, as well as offer a novel statistical reinterpretation of that behavior, that is, the survival of the likeliest [29].

For a multicomponent fluid system under mechanical equilibrium with n species and Nr chemical reactions, the rate of energy dissipation due to the local rate of entropy production στ is [16,49,50]:

T \sigma_\tau = \int_V \left( -\sum_i J_i \cdot \nabla(\mu_i)_{T,P} + \sum_{i,j} \mu_i \nu_{ij} J_{rj} \right) dV \geq 0

where Ji is the vector of mass fluxes, μi is the chemical potential of species i, and A is the chemical affinity A = −∑νiμi. The local mass balance of chemical species i is [16,49]:

\rho \frac{\partial w_i}{\partial t} = -\nabla \cdot j_i + \sum_{r} \nu_{ir} J_r

For a steady-state system, ∇·ji = ∑j νij Jrj, allowing the dissipation to be expressed fully in terms of the chemical affinities [29] of the many coupled chemical reactions of biochemical cycles.

Considering an N-dimensional linear flux-force system Ji = ∑k LikXk [49–54], expressed in matrix form as J = LX, in a biochemical cycle, the local rate of entropy production in terms of the forces X becomes [16]:

\sigma_\tau = \sum_{i,k=1}^{N} L_{ik} X_i X_k = \begin{bmatrix} X_1 & X_2 & \cdots & X_N \end{bmatrix} \begin{bmatrix} L_{11} & L_{12} & \cdots & L_{1N} \\ L_{21} & L_{22} & \cdots & L_{2N} \\ \vdots & \vdots & \ddots & \vdots \\ L_{N1} & L_{N2} & \cdots & L_{NN} \end{bmatrix} \begin{bmatrix} X_1 \\ X_2 \\ \vdots \\ X_N \end{bmatrix} \geq 0

Onsager’s reciprocal relations state that the coefficient matrix L is symmetric. L has N × N elements, and the number of elements representing the cross coefficients responsible for the induced effects due to thermodynamic coupling is (N² − N)/2. Through coupling-uncoupling mechanisms, these cross coefficients may be switched on and off depending on the biochemical path and its environment, leading to (N² − N)/2 bits of information. In the absence of pertinent symmetries or invariances, all types of cross-couplings are possible and lead to nonvanishing cross coefficients. If the structure of the system is invariant with respect to some or all of the orthogonal transformations, then the invariance will eliminate certain cross-couplings and their cross coefficients will vanish.
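A small sketch (with an arbitrarily chosen symmetric, positive-definite L, an assumption for the example) illustrates the quadratic form above: the entropy production XᵀLX is non-negative, and the number of independent cross coefficients is (N² − N)/2.

```python
import numpy as np

N = 4                                              # number of flux-force pairs (illustrative)
rng = np.random.default_rng(3)

# Build an illustrative symmetric, positive-definite Onsager-type matrix L.
A = rng.normal(size=(N, N))
L = A @ A.T + N * np.eye(N)

X = rng.normal(size=N)                             # arbitrary forces
J = L @ X                                          # linear flux-force relations J = L X
sigma = X @ L @ X                                  # quadratic-form entropy production

print("sigma = %.4f (>= 0)" % sigma)
print("independent cross coefficients: (N^2 - N)/2 =", (N * N - N) // 2)
print("L symmetric (Onsager reciprocity):", np.allclose(L, L.T))
```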

Thermodynamic coupling may thus lead to (N² − N)/2 possible coupled-uncoupled structures emerging in an N-flux-force system of a biochemical cycle, depending on the environmental interpretations. This, in turn, brings out the challenge of incorporating the trajectories belonging to two or more coupled processes (correlated through mutual information), with different initial and nonequilibrium states, into the fluctuation and information theories.

4. Fluctuation Theorem

The fluctuation theorem (FT) relates the probability p(στ) of observing a phase-space trajectory with entropy production rate στ over the time interval τ to that of observing a trajectory with entropy production rate −στ:

\frac{p(\sigma_\tau)}{p(-\sigma_\tau)} = \exp(\tau \sigma_\tau)

where the Boltzmann constant is kB = 1. The probability distribution of the rate of entropy production thus satisfies p(στ) = p(−στ)exp(τστ). This result describes how the probability of violations of the second law of thermodynamics becomes exponentially small as τ or the system size increases. The FT relates the work along nonequilibrium trajectories to thermodynamic free energy differences and is applicable to single-molecule force measurements. The FT depends on the following assumptions: the system is finite and in contact with a set of thermal baths, and the dynamics are stochastic, Markovian, and microscopically reversible. The probabilities of the time-reversed paths decay faster than the probabilities of the paths themselves, and the thermodynamic entropy production arises from the breaking of the time-reversal symmetry of the dynamical randomness. Fluctuations encode fundamental aspects of the physics of a system and are crucial to understanding irreversibility and nonequilibrium behavior, which may be unique to that system [3,21–25].
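As a numerical check (an illustration with assumed values, not from the article), a Gaussian distribution of the time-averaged entropy production rate satisfies the symmetry above only if its variance equals 2⟨στ⟩/τ; the sketch constructs such a distribution and compares ln[p(s)/p(−s)] with τs.

```python
import numpy as np

tau = 5.0
mean_rate = 0.8                        # illustrative mean entropy production rate
var_rate = 2.0 * mean_rate / tau       # variance required by the fluctuation theorem

def gauss_pdf(s, mu, var):
    return np.exp(-(s - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)

for s in (0.2, 0.5, 1.0):
    log_ratio = np.log(gauss_pdf(s, mean_rate, var_rate)
                       / gauss_pdf(-s, mean_rate, var_rate))
    print("s = %.1f : ln[p(s)/p(-s)] = %.4f, tau*s = %.4f" % (s, log_ratio, tau * s))
```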

Crooks’ FT can be used to determine free energies of folding and unfolding processes occurring in nonequilibrium systems. For that, the unfolding and refolding processes need to be related by time-reversal symmetry, i.e., the optical trap used to manipulate the molecule must be moved at the same speed during unfolding and refolding [21,22].

In processes that are microscopically reversible, Crooks’ FT predicts a symmetry relation in the work fluctuations for forward and reverse changes as the system is driven away from thermal equilibrium by the action of an external field or perturbation. A consequence of Crooks’ FT is Jarzynski’s equality, exp(−ΔG/kBT) = 〈exp(−W/kBT)〉. However, for processes that occur far from equilibrium, the applicability of Jarzynski’s equality is hampered by the large statistical uncertainty arising from the sensitivity of the exponential average to rare events [21].
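To illustrate the rare-event problem, the sketch below assumes a Gaussian work distribution, for which Jarzynski’s equality fixes ⟨W⟩ = ΔG + σW²/(2kBT); the sample-average estimator of ΔG is accurate for small work variance but degrades as the dissipation (variance) grows. All numbers are illustrative assumptions.

```python
import numpy as np

kT = 1.0
dG_true = 2.0                       # assumed free-energy difference
rng = np.random.default_rng(7)

# For a Gaussian work distribution, Jarzynski's equality fixes the mean:
# <W> = dG + var_W / (2 kT).  Larger var_W means more dissipation.
for var_W in (0.5, 5.0, 20.0):
    W = rng.normal(loc=dG_true + var_W / (2 * kT), scale=np.sqrt(var_W), size=20_000)
    dG_est = -kT * np.log(np.mean(np.exp(-W / kT)))   # Jarzynski estimator
    print("var_W = %5.1f : dG estimate = %6.3f (true %.1f)" % (var_W, dG_est, dG_true))
```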

Sagawa and Ueda [3] generalized the fluctuation theorem and the second law of thermodynamics by taking into account the mutual information between the subsystems (the entropy becomes nonadditive), where the entropy production and the mutual information are treated on an equal footing. Their results are valid for Langevin systems and also for far-from-equilibrium situations of classical stochastic dynamics.

In the absence of initial or final correlations, the entropy production satisfies the integral FT (or Jarzynski’s equality): 〈exp(−σ)〉 = 1, where 〈..〉 is the ensemble average over all microscopic trajectories.

Consider a typical information processing in a composite and stochastic system X, which is in contact with multiple heat baths (k = 1, 2,…) at inverse temperatures βk = 1/(kBTk) and interacts with a system Y, which does not evolve in time during the interactions. The time evolution of X, depending on the state of Y, from t = 0 to t = τ can be described by classical stochastic dynamics along a trajectory a. The system has initial and final phase-space points xi and xf, with initial and final probability distributions pia[xi] and pfa[xf]. The conditional probability of path a under the initial condition xi depends on y, so that it becomes pa[Xa|xi,y]. The system X may be correlated with Y, and the joint probability distributions are pia[xi,y] and pfa[xf,y], where pia[xi] = ∫ pia[xi,y]dy, pfa[xf] = ∫ pfa[xf,y]dy, and pa[y] = ∫ pia[xi,y]dx = ∫ pfa[xf,y]dx, since y does not evolve. The ensemble averages 〈Iixiy〉 and 〈Ifxfy〉 of the initial and final correlations between X and Y, I_i^{x_i y} = \ln \frac{p_{ia}[x_i,y]}{p_{ia}[x_i]\, p_a[y]} and I_f^{x_f y} = \ln \frac{p_{fa}[x_f,y]}{p_{fa}[x_f]\, p_a[y]}, represent the mutual information [3].

In the presence of information I processing with initial and final correlations, the integral FT with energy dissipation and energy cost of information exchange becomes [3]:

\langle \exp(-\sigma + \Delta I) \rangle = 1

where ΔI = Ifxfy − Iixiy is the change in the mutual information, and pia[xi,y] ≠ 0 for any (x,y). Convexity (exp〈x〉 ≤ 〈exp(x)〉) leads to 〈σ〉 ≥ 〈ΔI〉 [3]. With the correlation remaining after a feedback control (Irem) by system Y on system X, Equation (34) becomes:

\langle \exp(-\sigma - (I - I_{\mathrm{rem}})) \rangle = 1 \quad \text{so} \quad \langle \sigma \rangle \geq -\langle I - I_{\mathrm{rem}} \rangle

where 〈IIrem〉 may be an upper bound of the correlation that can be used.

The detailed FT in the presence of information processing becomes [5]:

\frac{p_b[X_b, y]}{p_f[X_f, y]} = \exp(-\sigma + \Delta I)

with the constraint p[x,y] ≠ 0 (x and y being the initial phase-space points); pb and pf are the joint probability distributions of the backward and forward processes, respectively, and (−σ + ΔI) shows the total entropy production of the composite system XY and the baths. Here, system X evolves from xi to xf along a path in a manner that depends on the information about y, which does not evolve in time. To sustain a given fluctuation, a system traverses a precise optimal path in phase space [22,34].

The FT allows a general orthogonality property of maximum information entropy (MIE) to be extended to entropy production (EP). Maximum entropy production (MEP) and the FT are generic properties of MIE probability distributions. Physically, MEP applies to those macroscopic fluxes that are free to vary under the imposed constraints, and corresponds to the selection of the most probable macroscopic flux configuration [28–31]. The constrained maximization of the Shannon information entropy H is an algorithm for constructing probability distributions from partial information; MIE is a universal method for constructing the microscopic probability distributions of equilibrium and nonequilibrium statistical mechanics. The distribution of the microscopic phase-space trajectories over a time τ satisfies pa ∝ exp(τστ/2). FTs are highly relevant for the interpretation of measurements performed on single molecules and can be applied to channel-facilitated transport of solutes through a membrane separating two reservoirs. The transport is characterized by the probability pn(t) that n solute particles have been transported from one reservoir to the other in time t. The fluctuation theorem establishes a relation between pn(t) and p−n(t): the ratio pn(t)/p−n(t) is independent of time and equal to exp(nA/kBT), where A is the affinity [13].
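A minimal sketch, assuming a toy discrete-time hopping model for the channel (forward hop with probability q_plus, backward with q_minus, so that A/kBT = ln(q_plus/q_minus) in this model), illustrates the time-independent ratio pn/p−n numerically; the model and all values are assumptions for the example.

```python
import numpy as np

# Illustrative hopping probabilities for a solute crossing a channel:
# forward with probability q_plus, backward with q_minus, otherwise no hop.
q_plus, q_minus, steps, trials = 0.30, 0.15, 50, 200_000
affinity = np.log(q_plus / q_minus)              # A/kT for this toy model

rng = np.random.default_rng(11)
u = rng.random((trials, steps))
forward = np.sum(u < q_plus, axis=1)
backward = np.sum((u >= q_plus) & (u < q_plus + q_minus), axis=1)
n = forward - backward                           # net number transported per trial

for k in (1, 2, 3):
    ratio = np.sum(n == k) / np.sum(n == -k)
    print("n = %d : ln[p_n/p_-n] = %.3f, n*A/kT = %.3f" % (k, np.log(ratio), k * affinity))
```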

Equation (36) can be further expanded to include the influence of thermodynamic coupling through the information content of the coupling (cross) coefficients (Lik, i ≠ k) given in Equation (32):

\frac{p_b[X_b, y]}{p_f[X_f, y]} = \exp(-\sigma + \Delta I + I_{\mathrm{TC}})

where ITC represents the energy cost of the information introduced because of thermodynamic coupling. In Equation (37), (−σ + ΔI + ITC) shows the total entropy production of the composite coupled system XY and the baths. This may empower the FT to play a comprehensive role in estimating the fluctuations in living systems and, eventually, in incorporating information into the physical sciences.

5. Conclusions

Various aspects of the thermodynamic relationships between information theory and living systems have been discussed, with emphasis on some new efforts toward defining and estimating information. These efforts may lead to a deeper understanding of the role played by information, particularly in living systems. Information theory can be used in investigating protein-protein interactions and the association of enzymes and proteins with their binding sites. If the subsystems are statistically dependent, that is, correlated through mutual information, then the entropy becomes nonadditive; however, the entropy productions of the subsystems are additive. By treating mutual information and entropy production on an equal footing, a generalized integral fluctuation theorem has already been suggested by Sagawa and Ueda [3]. Thermodynamic coupling, similar to the influence of information, can also lead to order. We may treat the information emerging from thermodynamic coupling and the entropy production on an equal footing and use them in the generalized fluctuation theorem, as Equation (37) suggests. Therefore, the generalized fluctuation theorem may play a comprehensive role in estimating the probability of fluctuations in systems with information processing.

Conflicts of Interest

The author declares no conflict of interest.

References

  1. Demirel, Y. Energy coupling. In Information and Living Systems in Philosophical and Scientific Perspectives; Terzis, G., Arp, R., Eds.; MIT Press: Cambridge, MA, USA, 2011. [Google Scholar]
  2. Bruers, S. A discussion on maximum entropy production and information theory. J. Phys. A 2007, 40, 7441–7450. [Google Scholar]
  3. Sagawa, T.; Ueda, M. Fluctuation theorem with information exchange: Role of correlations in stochastic thermodynamics. Phys. Rev. Lett. 2012, 109, 180602. [Google Scholar]
  4. Paquette, G.C. Comment on the information theoretic approach to the study of non-equilibrium steady states. J. Phys. A 2011, 44, 368001–368006. [Google Scholar]
  5. Adami, C. Information theory in molecular biology. Phys. Life Rev 2004, 1, 3–22. [Google Scholar]
  6. Rogers, D.M.; Beck, T.L.; Rempe, S.B. An information theory approach to nonlinear, nonequilibrium thermodynamics. Stat. Phys 2011, 145, 385–409. [Google Scholar]
  7. Shew, W.L.; Yang, H.; Yu, S.; Roy, R.; Plenz, D. Information capacity and transmission are maximized in balanced cortical networks with neuronal avalanches. J. Neurosci. 2011, 31, 55–63. [Google Scholar]
  8. El-Hani, C.N.; Queiroz, J.; Emmeche, C. A semiotic analysis of the genetic information system. Semiotica 2006, 160, 1–68. [Google Scholar]
  9. Amaral, P.P.; Dinger, M.E.; Mercer, T.R.; Mattick, J.S. The eukaryotic genome as an RNA machine. Science 2008, 319, 1787–1789. [Google Scholar]
  10. Genetic information and protein synthesis, Available online: http://www.biology-online.org/9/4_genetic_information.htm accessed on 30 December 2013.
  11. McIntosh, A.C. Functional information and entropy in living systems. In Design and Nature III: Comparing Design in Nature with Science and Engineering; WIT Press: Southampton, UK, 2006; pp. 115–126. [Google Scholar]
  12. Dunne, B.J.; Jahn, R.G. Consciousness, information, and living systems. Cell. Mol. Biol 2005, 51, 703–714. [Google Scholar]
  13. Kumar, P.; Ruddell, B.L. Information driven ecohydrologic self-organization. Entropy 2010, 12, 2085–2096. [Google Scholar]
  14. Hazen, R.M.; Griffin, P.L.; Carothers, J.M.; Szostak, J.W. Functional information and the emergence of biocomplexity. Proc. Natl. Acad. Sci. USA 2007, 104, 8574–8581. [Google Scholar]
  15. Pierce, S.E. Non-equilibrium thermodynamics: An alternative evolutionary hypothesis. Available online: http://www.academia.edu/1946079/Non-equilibrium_thermodynamics accessed on 31 March 2014.
  16. Demirel, Y. Nonequilibrium Thermodynamics: Transport and Rate Processes in Physical, Chemical and Biological Systems, 3rd ed; Elsevier: Amsterdam, The Netherlands, 2014. [Google Scholar]
  17. Demirel, Y. Nonequilibrium thermodynamics modeling of coupled biochemical cycles in living cells. J. Non-Newton. Fluid Mech 2010, 165, 953–972. [Google Scholar]
  18. Perdikis, D.; Huys, R.; Jirsa, V.K. Time scale hierarchies in the functional organization of complex behaviors. PLoS Comput. Biol 2011, 7, e1002198. [Google Scholar]
  19. Kurzynski, M. The Thermodynamic Machinery of Life; Springer: Berlin, Germany, 2006. [Google Scholar]
  20. Shin, Y.S.; Remacle, F.; Fan, R.; Hwang, K.; Wei, W.; Ahmad, H.; Levine, R.D. Protein signalling networks, from single cell fluctuations and information theory profiling. Biophys. J 2011, 100, 2378–2386. [Google Scholar]
  21. Collin, D.; Ritort, F.; Jarzynski, C.; Smith, S.B.; Tinoco, I., Jr.; Bustamante, C. Verification of the Crooks fluctuation theorem and recovery of RNA folding free energies. Nature 2005, 437, 231–234. [Google Scholar]
  22. Berezhkovskii, A.M.; Bezrukov, S.M. Fluctuation theorems in biological physics. AIP Conf. Proc 2009, 1129, 525–530. [Google Scholar]
  23. Seleznev, V.D.; Martyushev, L.M. Fluctuations, trajectory entropy, and Ziegler’s maximum entropy production. In Beyond the Second Law: Entropy Production and Non-Equilibrium Systems; Dewar, R.C., Lineweaver, C.H., Niven, R.K., Regenauer-Lieb, K., Eds.; Springer: Heidelberg, Germany, 2012. [Google Scholar]
  24. Odum, H.T. Self-organization, transformity and information. Science 1988, 242, 1132–1139. [Google Scholar]
  25. Dewar, R.C. Maximum entropy production and the fluctuation theorem. J. Phys. A: Math. Gen 2005, 38, L371–L381. [Google Scholar]
  26. Martyushev, L.M. Entropy and entropy production: Old misconception and new breakthroughs. Entropy 2013, 15, 1152–1170. [Google Scholar]
  27. Volk, T.; Pauluis, O. It is not the entropy you produce, rather, how you produce it. Philos. Trans. R. Soc. B 2010, 365, 1317–1322. [Google Scholar]
  28. Kleidon, A. Non-equilibrium thermodynamics, maximum entropy production and earth-system evolution. Philos. Trans. R. Soc. A 2010, 368, 181–196. [Google Scholar]
  29. Dewar, R.C. Maximum entropy production and plant optimization theories. Philos. Trans. R. Soc. B 2010, 363, 1429–1435. [Google Scholar]
  30. Martyushev, L.M. The maximum entropy production principle: Two basic questions. Philos. Trans. R. Soc. B 2010, 365, 1333–1334. [Google Scholar]
  31. Dewar, R.C. Maximum entropy production as an inference algorithm that translates physical assumption into macroscopic predictions: Don’t shoot the messenger. Entropy 2009, 11, 931–944. [Google Scholar]
  32. Davies, P.C.W.; Rieper, E.; Tuszynski, J.A. Self-organization and entropy reduction in a living cell. Biosystems 2013, 111, 1–10. [Google Scholar]
  33. Tseng, C-Y.; Tuszynski, J.A. Using entropy leads to a better understanding of biological systems. Entropy 2010, 12, 2450–2469. [Google Scholar]
  34. Hurtado, P.I.; Pérez-Espigares, C.; del Pozo, J.J.; Garrido, P.L. Symmetries in fluctuations far from equilibrium. Proc. Natl. Acad. Sci. USA 2011, 108, 7704–7709. [Google Scholar]
  35. Zupanović, P.; Kulic, D.; Juretić, D.; Dobovisek, A. On the problem of formulating principles in nonequilibrium thermodynamics. Entropy 2010, 12, 926–931. [Google Scholar]
  36. Jaynes, E.T. Probability Theory: The Logic of Science; Bretthorst, G.L., Ed.; Cambridge University Press: Cambridge, UK, 2003. [Google Scholar]
  37. Beretta, G.P. Modeling non-equilibrium dynamics of a discrete probability distribution: General rate equation for maximal entropy generation in a maximum-entropy landscape with time-dependent constraints. Entropy 2008, 10, 160–182. [Google Scholar]
  38. Beretta, G.P. Steepest-entropy-ascent and maximal-entropy-production dynamical models of irreversible relaxation to stable equilibrium from any non-equilibrium state. Unified treatment for six non-equilibrium frameworks 2013. arXiv:1306.3173v3:[cond-mat.stat-mech]. [Google Scholar]
  39. Beretta, G.P. Maximum entropy production rate in quantum thermodynamics. J. Phys. Conf. Ser 2010, 237, 012004. [Google Scholar]
  40. Hegde, R.S.; Kang, S.W. The concept of translocational regulation. J. Cell Biol 2008, 182, 225–232. [Google Scholar]
  41. Kurzynski, M. Statistical properties of the dichotomous noise generated in biochemical processes. Cell. Mol. Biol. Lett 2008, 13, 502–513. [Google Scholar]
  42. Maggio, B.; Fanani, M.L.; Oliveira, R.G. Biochemical and structural information transduction at the mesoscopic level in biointerfaces containing sphingolipids. Neurochem. Res 2002, 27, 547–557. [Google Scholar]
  43. Nowicka, A.; Grabowska, A.; Fersten, E. Interhemispheric transmission of information and functional asymmetry of the human brain. Neuropsychologia 1996, 34, 147–151. [Google Scholar]
  44. Dewar, R.C.; Juretić, D.; Zupanović, P. The functional design of the rotary enzyme ATP synthase is consistent with maximum entropy production. Chem. Phys 2006, 30, 177–182. [Google Scholar]
  45. Hernandez-Lemus, E. Nonequilibrium thermodynamics of cell signaling. J. Thermodyn 2012, 2012, 432143. [Google Scholar]
  46. Weaver, I.S.; Dyke, J.G.; Oliver, K.I.C. Can the principle of maximum entropy production be used to predict the steady states of a Rayleigh-Bernard convective system? In Beyond the Second Law: Entropy Production and Non-Equilibrium Systems; Springer: Berlin, Germany, 2013. [Google Scholar]
  47. Del Castillo, L.F.; Vera-Cruz, P. Thermodynamic formulation of living systems and their evolution. J. Mod. Phys 2011, 2, 379–391. [Google Scholar]
  48. Piersa, J.; Piekniewski, F.; Schreiber, T. Theoretical model for mesoscopic-level scale-free self-organization of functional brain networks. IEEE Trans. Neural Netw 2010, 21, 1747–1758. [Google Scholar]
  49. Demirel, Y. Thermodynamically coupled heat and mass flows in a reaction-transport system with external resistances. Int. J. Heat Mass Transf 2009, 52, 2018–2025. [Google Scholar]
  50. Demirel, Y.; Sandler, S.I. Linear-nonequilibrium thermodynamics theory for coupled heat and mass transport. Int. J. Heat Mass Transf 2001, 44, 2439–2451. [Google Scholar]
  51. Cortassa, S.; Aon, M.A.; Westerhoff, H.V. Linear nonequilibrium thermodynamics describes the dynamics of an autocatalytic system. Biophys. J 1991, 60, 794–803. [Google Scholar]
  52. Rothschild, K.J.; Ellias, S.A.; Essig, A.Q.; Stanley, H.E. Nonequilibrium linear behavior of biological systems. Existence of enzyme-mediated multidimensional inflection points. Biophys. J 1980, 30, 209–220. [Google Scholar]
  53. Demirel, Y.; Sandler, S.I. Thermodynamics and bioenergetics. Biophys. Chem 2002, 97, 87–111. [Google Scholar]
  54. Caplan, S.R.; Essig, A. Bioenergetics and Linear Nonequilibrium Thermodynamics—The Steady State; Harvard University Press: New York, NY, USA, 1999. [Google Scholar]
