Article

Energy Dissipation and Information Flow in Coupled Markovian Systems

by Matthew E. Quenneville 1,2,* and David A. Sivak 1,*

1 Department of Physics, Simon Fraser University, Burnaby, BC V5A 1S6, Canada
2 Department of Physics, University of California, Berkeley, CA 94720, USA
* Authors to whom correspondence should be addressed.
Entropy 2018, 20(9), 707; https://doi.org/10.3390/e20090707
Submission received: 30 June 2018 / Revised: 11 September 2018 / Accepted: 13 September 2018 / Published: 14 September 2018
(This article belongs to the Special Issue Thermodynamics of Information Processing)

Abstract

A stochastic system under the influence of a stochastic environment is correlated with both present and future states of the environment. Such a system can be seen as implicitly implementing a predictive model of future environmental states. The non-predictive model complexity has been shown to lower-bound the thermodynamic dissipation. Here we explore these statistical and physical quantities at steady state in simple models. We show that under quasi-static driving this model complexity saturates the dissipation. Beyond the quasi-static limit, we demonstrate a lower bound on the ratio of this model complexity to total dissipation, which is realized in the limit of weak driving.

1. Introduction

Information theory has long been recognized as fundamentally linked to statistical mechanics [1]. Perhaps most prominently, Landauer showed that information processing can require unavoidable dissipative costs [2]; for example, bit erasure requires that some free energy be dissipated [3,4].
A stochastic system processes information through interaction with its environment: via environment-dependent dynamics, the system responds to environmental changes and thereby gains information about the environment [5,6]. For an environment exhibiting temporal correlations, the system carries information about past, present, and future environmental states. In this way, the system implicitly implements a predictive model of future environmental states [7].
One can quantify this model’s inefficiency by the unnecessary model complexity: information the model retains about the past that does not aid in predicting the future. Recent work established the equivalence between this predictive inefficiency and thermodynamic inefficiency [7], providing another fundamental connection between information theory and statistical mechanics. This connection hints at a design principle for molecular machines operating out of equilibrium [8,9].
These results are potentially applicable to many systems. Biology (where there is a presumptive selective advantage associated with energetic efficiency) furnishes examples of organisms [10], neurons [11], and reaction networks [12] that are capable of learning statistical patterns in their respective environments.
To further illuminate this abstract connection between model complexity and thermodynamic dissipation, here we analytically and numerically explore these statistical and physical quantities in illustrative models. We compute the information learned by the system about its environment per unit energy dissipated (equivalently, the ratio of dissipation during system and environmental dynamics) in the limits of quasi-static driving (Table 1) and weak driving (Equation (8)), the latter forming a lower bound for generic driving. The dependence of these quantities on the system and environmental parameters motivates a potential guiding principle for functional performance.

2. Theoretical Background

Consider a stochastic process $\{X_t \mid t \in \{0, \Delta t, \ldots, \tau - \Delta t, \tau\}\}$ representing the dynamical evolution of some environmental variable. At a given time, the environment can occupy any of the states $\mathcal{X}$. The time evolution of the environment, $X_t$, is governed by the transition probabilities $p(x_t \mid \{x_{t'}\}_{t'=0}^{t-\Delta t}) \equiv p(X_t = x_t \mid \{X_{t'} = x_{t'}\}_{t'=0}^{t-\Delta t})$ for $x_t, x_{t'} \in \mathcal{X}$. Let another stochastic process $\{Y_t \mid t \in \{0, \Delta t, \ldots, \tau - \Delta t, \tau\}\}$ represent the system of interest, which can occupy states $\mathcal{Y}$. Take the dynamics of $Y_t$ to depend on the environmental state via the time-independent conditional transition probabilities $p(y' \mid y, x) \equiv p(Y_{t+\Delta t} = y' \mid Y_t = y, X_{t+\Delta t} = x)$, where $y, y' \in \mathcal{Y}$ and $x \in \mathcal{X}$. We model the evolution of these two stochastic processes using the alternating time-step pattern illustrated in Figure 1. For computational simplicity, we take the system and environment to evolve in discrete time steps; however, we set these time steps to be very small compared to the system and environment evolution time scales, in order to closely approximate continuous-time evolution. One complete time step is composed of two sub-steps: a work step of environmental dynamics, when the environment does work on the system, followed by a relaxation step of system dynamics, when the system exchanges heat with a thermal bath maintained at temperature $T$ and inverse temperature $\beta \equiv (k_{\mathrm{B}} T)^{-1}$.
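To make this update scheme concrete, the following minimal Python sketch samples one trajectory under the alternating work/relaxation sub-steps. All numerical values here are illustrative assumptions, not parameters taken from this paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative assumed parameters: two environment states, two system states,
# and a time step small enough that rate * dt << 1.
dt = 1e-4
kappa_env = 1.0                          # environment transition rate

# Work step: the environment hops between its two states with probability
# kappa_env * dt per time step.
p_env = np.array([[1 - kappa_env * dt, kappa_env * dt],
                  [kappa_env * dt,     1 - kappa_env * dt]])

# Relaxation step: system transition probabilities conditioned on the *new*
# environment state x; these example numbers are arbitrary but row-normalized.
p_sys = {0: np.array([[1 - 1e-4, 1e-4], [5e-4, 1 - 5e-4]]),
         1: np.array([[1 - 5e-4, 5e-4], [1e-4, 1 - 1e-4]])}

x, y = 0, 0                              # one sampled trajectory
for t in range(100_000):
    x = rng.choice(2, p=p_env[x])        # work step: environment evolves first
    y = rng.choice(2, p=p_sys[x][y])     # relaxation step: system responds to
                                         # the new environment state
```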
System dynamics $Y_t$ obey the principle of microscopic reversibility [13]. Reference [7] used such a framework to study the relationship between thermodynamic and information-theoretic quantities. One prominent information-theoretic quantity is the nostalgia $I_{\mathrm{nos}}(t) \equiv I_{\mathrm{mem}}(t) - I_{\mathrm{pred}}(t)$, where the mutual information $I_{\mathrm{mem}}(t) \equiv I[X_t, Y_t]$ [14] between the current system state and past environmental state represents the memory stored by the system about the environment, and the mutual information $I_{\mathrm{pred}}(t) \equiv I[X_{t+\Delta t}, Y_t]$ between the current system state and future environmental state represents the ability of the system to predict future environmental states. Reference [7] showed that
$$\beta \langle W_{\mathrm{diss}}(t) \rangle = I_{\mathrm{mem}}(t) - I_{\mathrm{pred}}(t) - \beta \langle \Delta F_{\mathrm{neq}}^{\mathrm{relax}}(t) \rangle, \qquad (1)$$
where $\langle W_{\mathrm{diss}}(t) \rangle$ is the average total dissipation (defined as the average work done on the system, minus the average change in nonequilibrium free energy of the system) over consecutive work and relaxation steps from $t$ to $t + \Delta t$, and $\langle \Delta F_{\mathrm{neq}}^{\mathrm{relax}}(t) \rangle$ is the average change in nonequilibrium free energy of the system over the relaxation step from $t$ to $t + \Delta t$. The angled brackets indicate that the average system energy and system entropy are calculated for a particular environmental state, then averaged over environmental states. These quantities, calculated at each time step, are combined to give the average free-energy difference, average work, and average dissipation. Since $\beta \langle \Delta F_{\mathrm{neq}}^{\mathrm{relax}}(t) \rangle \leq 0$ [15],
$$\beta \langle W_{\mathrm{diss}}(t) \rangle \geq I_{\mathrm{mem}}(t) - I_{\mathrm{pred}}(t). \qquad (2)$$
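For small state spaces, the quantities in Equations (1) and (2) can be evaluated exactly by propagating the joint distribution $p(x, y)$ through the two sub-steps instead of sampling trajectories. A sketch, reusing `p_env` and `p_sys` from the snippet above; `I_mem` and `I_pred` follow the definitions in the text:

```python
def mutual_information(J):
    """I[X, Y] in nats from a joint probability table J[x, y]."""
    px, py = J.sum(axis=1, keepdims=True), J.sum(axis=0, keepdims=True)
    m = J > 0
    return float(np.sum(J[m] * np.log(J[m] / (px * py)[m])))

def steady_state_info(p_env, p_sys, n_steps=200_000):
    """Propagate the joint p(x, y) through alternating work and relaxation
    sub-steps; returns steady-state I_mem, I_pred, and the joint itself."""
    n_x, n_y = p_env.shape[0], p_sys[0].shape[0]
    J = np.full((n_x, n_y), 1.0 / (n_x * n_y))
    I_mem = I_pred = 0.0
    for _ in range(n_steps):
        J = np.einsum('xy,xz->zy', J, p_env)   # work step: joint of (X_{t+dt}, Y_t)
        I_pred = mutual_information(J)         # I[X_{t+dt}, Y_t]
        J = np.stack([J[x] @ p_sys[x] for x in range(n_x)])  # relaxation step
        I_mem = mutual_information(J)          # I[X_{t+dt}, Y_{t+dt}]
    return I_mem, I_pred, J

I_mem, I_pred, _ = steady_state_info(p_env, p_sys)
nostalgia = I_mem - I_pred                     # lower-bounds beta * <W_diss>
```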

3. Results

We explore the tightness of the bound (2) through the ratio of nostalgia to dissipation,
$$\phi(t) \equiv \frac{I_{\mathrm{mem}}(t) - I_{\mathrm{pred}}(t)}{\beta \langle W_{\mathrm{diss}}(t) \rangle}. \qquad (3)$$
This nostalgia-dissipation ratio is bounded by $0 \leq \phi(t) \leq 1$ and (after substituting Equation (14) from [7]) can be interpreted as the fraction of dissipation that occurs during work steps,
$$\phi(t) = \frac{\langle W_{\mathrm{diss}}[x_t \to x_{t+\Delta t}] \rangle}{\langle W_{\mathrm{diss}}(t) \rangle}, \qquad (4)$$
where $\langle W_{\mathrm{diss}}[x_t \to x_{t+\Delta t}] \rangle$ is the average dissipation during the work step $x_t \to x_{t+\Delta t}$, and $\langle W_{\mathrm{diss}}(t) \rangle$ is the sum of the average dissipation during consecutive single work and relaxation ($y_t \to y_{t+\Delta t}$) steps. When the environment and system reach steady state, $\phi$ can be rewritten as
$$\phi^{\mathrm{ss}} = \frac{\ell(t)}{-\beta \langle Q(t) \rangle}, \qquad (5)$$
where $\ell(t) \equiv I[X_{t+\Delta t}, Y_{t+\Delta t}] - I[X_{t+\Delta t}, Y_t]$ is a learning rate which quantifies the information gained by the system about the current environmental state [16]. The denominator follows from the facts that at steady state $-\langle Q \rangle = \langle W \rangle$ (due to energy conservation) and $\langle W \rangle = \langle W_{\mathrm{diss}} \rangle$ [7]. References [17,18] identify the ratio in Equation (5) as an informational efficiency quantifying the rate at which the system learns about the environment, relative to the total thermodynamic entropy production. By considering (4), these results can be recast in terms of dissipative energy flows.
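As a numerical check on Equation (5), the learning rate and steady-state heat flow can be computed from the same joint-distribution propagation. The sketch below (not the authors' code) assumes system energies `betaE[x, y]`, in units of $k_{\mathrm{B}}T$, consistent with the transition matrices, and reuses the helpers defined above:

```python
def dissipation_ratio(p_env, p_sys, betaE, n_steps=200_000):
    """phi_ss = learning rate / (-beta <Q>) at steady state, per Eq. (5)."""
    n_x, n_y = betaE.shape
    J = np.full((n_x, n_y), 1.0 / (n_x * n_y))
    for _ in range(n_steps):                   # relax the joint to steady state
        J = np.einsum('xy,xz->zy', J, p_env)
        J = np.stack([J[x] @ p_sys[x] for x in range(n_x)])
    Jw = np.einsum('xy,xz->zy', J, p_env)      # joint after one more work step
    Jr = np.stack([Jw[x] @ p_sys[x] for x in range(n_x)])
    learning = mutual_information(Jr) - mutual_information(Jw)
    # average heat absorbed by the system over the relaxation step (units of kT):
    beta_Q = sum(Jw[x, y] * p_sys[x][y, y2] * (betaE[x, y2] - betaE[x, y])
                 for x in range(n_x) for y in range(n_y) for y2 in range(n_y))
    return learning / (-beta_Q)
```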
In order to explore the physical implications of (1) and (2), we investigate the behavior of the relevant information-theoretic and thermodynamic quantities in concrete models that provide physical intuition. We initially restrict our attention to a simple environment model, consisting of two states with a constant transition probability $\kappa_{\mathrm{env}} \Delta t$ in each time step.

3.1. Alternating Energy Levels

One of the simplest possible system models with non-trivial behavior is a two-state system with dynamics described by two kinetic rates, $k_+$ and $k_-$ (Figure 2a). Since we are using discrete time steps, we define the rate $k$ of a given transition to be that of a continuous-time model, which is then discretized by choosing $\Delta t$ such that $k \Delta t$ is small; the transition then occurs with probability $k \Delta t$ in each time step. This model possesses a symmetry: it is unchanged when both the system-state labels and the environment-state labels are interchanged. Due to this symmetry, we take $k_+ \geq k_-$ without loss of generality.
Given the constraint of detailed balance [13], such a model describes a two-state system with an energy gap (normalized by temperature) $\beta \Delta E = \ln(k_+/k_-)$ that flips according to the environment state. System states $y_1$ and $y_2$ are separated by $\Delta E_{12}^{A} = \Delta E$ when the environment is in state $x_A$ and $\Delta E_{12}^{B} = -\Delta E$ for environmental state $x_B$. A characteristic rate at which the system reaches equilibrium, and thus becomes correlated with the current environment (and decorrelated from past environmental states), is the minimum transition rate,
$$k_{\mathrm{sys}} \equiv k_-, \qquad (6)$$
the rate of the rate-limiting step for rearrangement of system probability among its states. The transition ratio $k_{\mathrm{sys}}/\kappa_{\mathrm{env}}$ expresses this rate relative to the environmental transition rate. Figure 3 shows the steady-state nostalgia $I_{\mathrm{nos}}^{\mathrm{ss}}$, which increases with both $k_{\mathrm{sys}}/\kappa_{\mathrm{env}}$ and $\beta \Delta E$, and tends to 0 as either $k_{\mathrm{sys}}/\kappa_{\mathrm{env}}$ or $\beta \Delta E$ approaches 0.
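For the model of Figure 2a, the transition matrices and energies follow directly from $k_+$, $k_-$, and detailed balance. A sketch scanning the trends of Figure 3, reusing the helpers above; the grid values are assumptions for illustration, not the paper's parameter values:

```python
def two_state_p_sys(k_plus, k_minus, dt):
    """Relaxation matrices for the model of Figure 2a: the uphill transition
    occurs at rate k_minus, the downhill one at k_plus, and the direction of
    the energy gap flips with the environment state."""
    up, down = k_minus * dt, k_plus * dt
    return {0: np.array([[1 - up, up], [down, 1 - down]]),
            1: np.array([[1 - down, down], [up, 1 - up]])}

def two_state_energies(k_plus, k_minus):
    """betaE[x, y] consistent with detailed balance: beta*dE = ln(k+/k-)."""
    dE = np.log(k_plus / k_minus)
    return np.array([[0.0, dE], [dE, 0.0]])

# Steady-state nostalgia grows with beta*dE and k_sys/kappa_env (cf. Figure 3)
for dE in (0.5, 2.0, 4.0):
    for ratio in (0.1, 1.0, 10.0):
        k_minus = ratio * kappa_env            # k_sys = k_minus, Eq. (6)
        k_plus = k_minus * np.exp(dE)
        I_mem, I_pred, _ = steady_state_info(
            p_env, two_state_p_sys(k_plus, k_minus, dt))
        print(f"beta*dE={dE}, k_sys/kappa_env={ratio}: "
              f"I_nos={I_mem - I_pred:.3e}")
```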
The dissipation ratio $\phi(t)$ approaches a steady-state value $\phi^{\mathrm{ss}}$ for each choice of parameters. Figure 4 shows that $\phi^{\mathrm{ss}}$ follows the same general trends as $I_{\mathrm{nos}}^{\mathrm{ss}}$, increasing with both energy gap magnitude $\beta \Delta E$ and transition ratio $k_{\mathrm{sys}}/\kappa_{\mathrm{env}}$.
In the limit of large temperature, when the energy gap is small compared to the ambient thermal energy ($\beta \Delta E \ll 1$), $\phi^{\mathrm{ss}}$ reduces to a positive function of the equilibration rates of the system ($k_{\mathrm{sys}}$) and environment ($\kappa_{\mathrm{env}}$):
$$\phi^{\mathrm{ss}} = \frac{1 - \kappa_{\mathrm{env}} \Delta t}{1 - 2\kappa_{\mathrm{env}} \Delta t + \kappa_{\mathrm{env}}/k_{\mathrm{sys}}}, \qquad \beta \Delta E \ll 1. \qquad (7)$$
This is found by explicitly calculating the steady-state probability distribution. In moving from discrete time steps to a continuous time parameter, the time step becomes small compared to system and environment transition times, reducing (7) to
$$\phi^{\mathrm{ss}} = \frac{1}{1 + \kappa_{\mathrm{env}}/k_{\mathrm{sys}}}, \qquad \kappa_{\mathrm{env}} \Delta t,\; k_{\mathrm{sys}} \Delta t,\; \beta \Delta E \ll 1. \qquad (8)$$
Thus, in the weak-driving (high-temperature) limit ($\beta \Delta E \ll 1$), if the system evolves quickly compared to the environment, most of the dissipation occurs during work steps, the learning rate approaches the total thermodynamic entropy production, and the bound (2) approaches saturation. Conversely (still restricting to high temperature), when the system evolves slowly compared to the environment, most of the dissipation occurs during relaxation steps, the learning rate is small compared to the total thermodynamic entropy production, and the nostalgia is small compared to the dissipation, so the bound (2) is far from saturation.
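As a sanity check on Equation (8), the numerical steady-state ratio can be compared against the analytic weak-driving form; a sketch with assumed parameter values, reusing the helpers above:

```python
# Weak driving: as beta*dE -> 0 (with rate * dt << 1), phi_ss should approach
# 1 / (1 + kappa_env / k_sys), with k_sys = k_minus.
dE, k_sys = 0.05, 2.0
k_minus, k_plus = k_sys, k_sys * np.exp(dE)
phi_numeric = dissipation_ratio(p_env, two_state_p_sys(k_plus, k_minus, dt),
                                two_state_energies(k_plus, k_minus))
phi_weak = 1.0 / (1.0 + kappa_env / k_sys)     # Eq. (8) prediction
print(phi_numeric, phi_weak)                   # should agree to O(beta*dE)
```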
Further, Figure 4 shows that $\phi^{\mathrm{ss}}$ increases with $\beta \Delta E$. Thus, this weak-driving limit gives a non-zero lower bound on $\phi^{\mathrm{ss}}$,
$$\frac{1 - \kappa_{\mathrm{env}} \Delta t}{1 - 2\kappa_{\mathrm{env}} \Delta t + \kappa_{\mathrm{env}}/k_{\mathrm{sys}}} \leq \phi^{\mathrm{ss}} \leq 1, \qquad (9)$$
or in the limit of small time steps,
$$\frac{1}{1 + \kappa_{\mathrm{env}}/k_{\mathrm{sys}}} \leq \phi^{\mathrm{ss}} \leq 1, \qquad \kappa_{\mathrm{env}} \Delta t,\; k_{\mathrm{sys}} \Delta t \ll 1. \qquad (10)$$
If the system evolves quickly compared to its environment, nostalgia is the dominant form of dissipation, regardless of $\beta \Delta E$. The limit of quasi-static driving is defined by $k_{\mathrm{sys}}/\kappa_{\mathrm{env}} \gg 1$. In this limit, $\phi^{\mathrm{ss}} = 1$, and therefore the nostalgia (the implicit predictive model inefficiency) is equal to the total dissipation (the thermodynamic inefficiency). The bounds in Equations (9) and (10) therefore hold beyond the quasi-static limit. The bound in Equation (2) can be looser for systems farther from the limit of quasi-static driving. These limits on $\phi^{\mathrm{ss}}$ are laid out in Table 1.
The transition ratio $k_{\mathrm{sys}}/\kappa_{\mathrm{env}}$ is also equal to the ratio of characteristic time scales $\tau_{\mathrm{env}}/\tau_{\mathrm{sys}}$. Thus the bound (10) on the steady-state dissipation ratio can be recast as
$$1 - \frac{1}{1 + N} \leq \phi^{\mathrm{ss}} \leq 1, \qquad \kappa_{\mathrm{env}} \Delta t,\; k_{\mathrm{sys}} \Delta t \ll 1, \qquad (11)$$
for $N$ independent 'measurements' the system makes during each environment state [19]. From this perspective, the deviation of this bound from saturation, $1/(1+N)$, is proportional to the Berg-Purcell lower bound on the environmental measurement precision of a single receptor [20].

3.2. Arbitrary System Rates

The results of the previous section were derived for a simple two-state system, in which the energy difference between system states flips with environment transitions, and the system's equilibration rate is independent of the environment state. We generalize this model to a two-state system with arbitrary rates and hence, by detailed balance, arbitrary energies (Figure 2b). Given the four transition rates $k_{12}^{A}$, $k_{21}^{A}$, $k_{12}^{B}$, and $k_{21}^{B}$, when the environment is in state $X = x_A$ the system has energy gap (normalized by temperature) $\beta \Delta E_{12}^{A} = \ln(k_{21}^{A}/k_{12}^{A})$ between states $y_1$ and $y_2$, and a characteristic equilibration rate $k^{A} = \min(k_{12}^{A}, k_{21}^{A})$. Similarly, when the environment is in state $X = x_B$, the corresponding parameters are $\beta \Delta E_{12}^{B} = \ln(k_{21}^{B}/k_{12}^{B})$ and $k^{B} = \min(k_{12}^{B}, k_{21}^{B})$. Let $\Delta E^{A} = |\Delta E_{12}^{A}|$ and $\Delta E^{B} = |\Delta E_{12}^{B}|$ be the magnitudes of the energy gaps in environment states $x_A$ and $x_B$, respectively. The energy gaps $\Delta E^{A}$ and $\Delta E^{B}$ are free to be aligned ($\Delta E_{12}^{A} \Delta E_{12}^{B} > 0$) or anti-aligned ($\Delta E_{12}^{A} \Delta E_{12}^{B} < 0$). A characteristic equilibration rate of the system is thus
$$k_{\mathrm{sys}} = \frac{2}{\dfrac{1}{k^{A}} + \dfrac{1}{k^{B}}}. \qquad (12)$$
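In code, the generalized characteristic rate of Equation (12) is a one-line harmonic mean; the example rates below are assumptions for illustration:

```python
# Arbitrary system rates (Figure 2b); example values are assumed
k12A, k21A = 1.0, 3.0          # transitions y1 <-> y2 in environment state x_A
k12B, k21B = 0.5, 2.0          # transitions y1 <-> y2 in environment state x_B
kA, kB = min(k12A, k21A), min(k12B, k21B)   # per-environment equilibration rates
k_sys = 2.0 / (1.0 / kA + 1.0 / kB)         # harmonic mean, Eq. (12)
```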
Equations (7) and (8) also apply in this case of arbitrary system rates. Figure 5 shows that across the explored parameter space, the steady-state dissipation ratio $\phi^{\mathrm{ss}}$ lies above the bound (9), with $\phi^{\mathrm{ss}}$ approaching the bound in the weak-driving limit, $\beta(\Delta E^{A} + \Delta E^{B}) \ll 1$. We conclude that Equations (9) and (10) apply for arbitrary system rates.

3.3. Arbitrary Environment Rates

Here we generalize our previous assumption of a fixed environmental transition rate $\kappa_{\mathrm{env}}$, independent of the present environmental state. We now allow for two different transition rates, $\kappa_{AB}$ and $\kappa_{BA}$, out of the two states $x_A$ and $x_B$ (Figure 2c).
As above, we define the characteristic system equilibration rates $k^{A}$ and $k^{B}$ when the environment is in states $X = x_A$ and $X = x_B$, respectively. A characteristic equilibration rate for the system is the harmonic mean of the system transition rates for each environment state, weighted by the rate of switching out of that environmental state:
$$k_{\mathrm{sys}} = \frac{\kappa_{AB} + \kappa_{BA}}{\dfrac{\kappa_{AB}}{k^{A}} + \dfrac{\kappa_{BA}}{k^{B}}}. \qquad (13)$$
For a uniform environmental transition rate (independent of environment state), this reduces to the previous unweighted harmonic mean (12). Here we define a characteristic environmental rate $\bar{\kappa}_{\mathrm{env}}$ as the arithmetic mean of the transition rates between the environment states,
$$\bar{\kappa}_{\mathrm{env}} = \frac{\kappa_{AB} + \kappa_{BA}}{2}. \qquad (14)$$
With these definitions, Equations (7) and (8) (replacing $\kappa_{\mathrm{env}}$ with $\bar{\kappa}_{\mathrm{env}}$) apply to this case of arbitrary transition probabilities. Figure 6 shows that across a range of system and environment parameter values, the bounds (9) and (10) hold. The proposed bound depends on the system only through $k_{\mathrm{sys}}$, and hence through the environmental-state-dependent equilibration rates $k^{A}$ and $k^{B}$.
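The corresponding characteristic rates of Equations (13) and (14) take one line each; a sketch reusing `kA` and `kB` from above, with assumed example switching rates:

```python
kappa_AB, kappa_BA = 1.8, 0.2               # switching rates out of x_A and x_B
k_sys = (kappa_AB + kappa_BA) / (kappa_AB / kA + kappa_BA / kB)   # Eq. (13)
kappa_bar_env = 0.5 * (kappa_AB + kappa_BA)                       # Eq. (14)
```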

4. Discussion

Reference [7] described a relationship between dissipation and nostalgia, a novel, abstract information-theoretic concept quantifying the information the system stores about its environment that fails to be predictive of future environmental states. Energetically efficient performance requires avoiding this nostalgia. This framework suggests applications in biology, where living things are influenced by, and thus learn about, their environments. Recent explorations of this relationship have illuminated its behavior in model neurons [21], its relation to sensor performance [18], and the variation of this and related quantities across several biophysical model systems [16].
Here we focused on a physical understanding of the relationships between the information-theoretic and thermodynamic quantities. We calculated the nostalgia and the nostalgia-dissipation ratio in some model systems. Calculating these quantities over the parameter space of simple systems helps to establish an intuitive picture: when the system is quick to relax and strongly driven by the environment (energy gaps vary strongly with environment state), the nostalgia provides a tight lower bound on the steady-state dissipation (Equation (8)); equivalently, the system learns more about the environment per unit heat dissipated.
For fixed equilibration rates, we found the ratio of nostalgia to total dissipation is minimized in the weak-driving limit. Further, the ratio of nostalgia to total dissipation is bounded from below by this weak-driving limit (10), which depends on the system only through its overall equilibration rate. If the system is driven quasi-statically by its environment, this bound dictates that the predictive inefficiency (nostalgia) is responsible for all thermodynamic inefficiency (dissipation). Contexts further from the quasi-static limit can be further from saturating the bound in Equation (2), and hence have a smaller relative contribution from model inefficiency.
One could explore more complex models than the simple Markovian two-state systems and environments described here. One could expand the system to more states [17], or expand the environmental behavior through additional states or non-Markovian dynamics, since this theoretical framework does not restrict the form of these transitions.

Author Contributions

M.E.Q. and D.A.S. conceived and designed the study; M.E.Q. performed the analytic and numerical calculations; M.E.Q. analyzed the data; M.E.Q. and D.A.S. wrote the paper.

Funding

This work is supported by a Natural Sciences and Engineering Research Council of Canada (NSERC) Discovery Grant (D.A.S.) and a Tier-II Canada Research Chair (D.A.S.).

Acknowledgments

The authors thank Volodymyr Polyakov (Physics and Technology, Igor Sikorsky Kyiv Polytechnic Institute) for insightful comments on the manuscript.

Conflicts of Interest

The authors declare no conflict of interest. The funding sponsors had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript, and in the decision to publish the results.

References

1. Jaynes, E.T. Information theory and statistical mechanics. Phys. Rev. 1957, 106, 620–630.
2. Landauer, R. Irreversibility and heat generation in the computing process. IBM J. Res. Dev. 1961, 5, 183–191.
3. Bérut, A.; Arakelyan, A.; Petrosyan, A.; Ciliberto, S.; Dillenschneider, R.; Lutz, E. Experimental verification of Landauer's principle linking information and thermodynamics. Nature 2012, 483, 187–189.
4. Jun, Y.; Gavrilov, M.; Bechhoefer, J. High-precision test of Landauer's principle in a feedback trap. Phys. Rev. Lett. 2014, 113, 190601.
5. Cheong, R.; Rhee, A.; Wang, C.J.; Nemenman, I.; Levchenko, A. Information transduction capacity of noisy biochemical signaling networks. Science 2011, 334, 354–358.
6. Mehta, P.; Schwab, D.J. Energetic costs of cellular computation. Proc. Natl. Acad. Sci. USA 2012, 109, 17978–17982.
7. Still, S.; Sivak, D.A.; Bell, A.J.; Crooks, G.E. Thermodynamics of prediction. Phys. Rev. Lett. 2012, 109, 120604.
8. Hess, H. Engineering applications of biomolecular motors. Annu. Rev. Biomed. Eng. 2011, 13, 429–450.
9. Brown, A.I.; Sivak, D.A. Toward the design principles of molecular machines. Phys. Can. 2017, 73, 61–66.
10. Tagkopoulos, I.; Liu, Y.C.; Tavazoie, S. Predictive behavior within microbial genetic networks. Science 2008, 320, 1313–1317.
11. Laughlin, S. A simple coding procedure enhances a neuron's information capacity. Z. Naturforsch. C 1981, 36, 910–912.
12. McGregor, S.; Vasas, V.; Husbands, P.; Fernando, C. Evolution of associative learning in chemical networks. PLoS Comput. Biol. 2012, 8, e1002739.
13. Chandler, D. Introduction to Modern Statistical Mechanics; Oxford University Press: Oxford, UK, 1987.
14. Cover, T.M.; Thomas, J.A. Elements of Information Theory; John Wiley & Sons: Hoboken, NJ, USA, 2012.
15. Schnakenberg, J. Network theory of microscopic and macroscopic behavior of master equation systems. Rev. Mod. Phys. 1976, 48, 571–585.
16. Brittain, R.A.; Jones, N.S.; Ouldridge, T.E. What we learn from the learning rate. J. Stat. Mech. Theory Exp. 2017, 2017, 063502.
17. Barato, A.C.; Hartich, D.; Seifert, U. Efficiency of cellular information processing. New J. Phys. 2014, 16, 103024.
18. Hartich, D.; Barato, A.C.; Seifert, U. Sensory capacity: An information theoretical measure of the performance of a sensor. Phys. Rev. E 2016, 93, 022116.
19. Govern, C.C.; ten Wolde, P.R. Energy dissipation and noise correlations in biochemical sensing. Phys. Rev. Lett. 2014, 113, 258102.
20. Berg, H.C.; Purcell, E.M. Physics of chemoreception. Biophys. J. 1977, 20, 193–219.
21. McIntosh, L. Information Processing and Energy Dissipation in Neurons. Ph.D. Thesis, University of Hawaii at Manoa, Honolulu, HI, USA, 2012.
Figure 1. Discrete-time system and environmental dynamics. The system $Y_t$ and environment $X_t$ alternate steps, with system evolution during relaxation steps and environment evolution during work steps.
Figure 2. Model kinetics. States and transition rates for models with two system states and two environment states. (a) System equilibration rate, energy gap magnitude, and environment transition rate are independent of environment state, but the direction of the energy gap switches with environment state; (b) system equilibration rate and energy gap vary with environment state, while the environment transition rate is fixed; (c) system equilibration rate, energy gap, and environment transition rate all vary with environment state.
Figure 3. Nostalgia increases with energy gap and system equilibration rate. Nostalgia $I_{\mathrm{nos}}^{\mathrm{ss}}$ as a function of the energy gap $\beta \Delta E$ and transition ratio $k_{\mathrm{sys}}/\kappa_{\mathrm{env}}$ ($\kappa_{\mathrm{env}} \Delta t = 10^{-12}$).
Figure 4. Dissipation ratio increases with energy gap and system equilibration rate. Steady-state dissipation ratio $\phi^{\mathrm{ss}} \equiv I_{\mathrm{nos}}^{\mathrm{ss}} / \beta \langle W_{\mathrm{diss}} \rangle^{\mathrm{ss}}$ as a function of the energy gap $\beta \Delta E$ and transition ratio $k_{\mathrm{sys}}/\kappa_{\mathrm{env}}$ ($\kappa_{\mathrm{env}} \Delta t = 10^{-12}$).
Figure 5. Lower bound on dissipation ratio for fixed environment transition rate. The steady-state dissipation ratio $\phi^{\mathrm{ss}}$ is lower-bounded by the black curve (9) for all values of the transition ratio $k_{\mathrm{sys}}/\kappa_{\mathrm{env}}$. Each point corresponds to a particular set of parameters $k^{A}$, $k^{B}$, $\beta \Delta E^{A}$, and $\beta \Delta E^{B}$. (a) Models in which the energy gaps $\Delta E^{A}$ and $\Delta E^{B}$ are anti-aligned; (b) models in which the energy gaps $\Delta E^{A}$ and $\Delta E^{B}$ are aligned ($\kappa_{\mathrm{env}} \Delta t = 10^{-12}$).
Figure 6. Lower bound on dissipation ratio for varying environment transition rate. The steady-state dissipation ratio $\phi^{\mathrm{ss}}$ is lower-bounded by the black curve (9) for all values of the transition ratio $k_{\mathrm{sys}}/\kappa_{\mathrm{env}}$. Each point corresponds to a particular set of parameters $k^{A}$, $k^{B}$, $\beta \Delta E^{A}$, and $\beta \Delta E^{B}$. The environment transition rates are $\kappa_{AB} = 1.8\,\bar{\kappa}_{\mathrm{env}}$ and $\kappa_{BA} = 0.2\,\bar{\kappa}_{\mathrm{env}}$. (a) Models in which the energy gaps $\Delta E^{A}$ and $\Delta E^{B}$ are anti-aligned; (b) models in which the energy gaps $\Delta E^{A}$ and $\Delta E^{B}$ are aligned ($\bar{\kappa}_{\mathrm{env}} \Delta t = 10^{-12}$).
Table 1. Limiting behavior of dissipation ratio. Steady-state dissipation ratio $\phi^{\mathrm{ss}}$ in the various limits of driving strength and speed. These limits are given by the bound in Equation (10), valid in the limit of continuous time.

| Driving Speed | Weak Driving ($\beta \Delta E \ll 1$) | Strong Driving ($\beta \Delta E \gg 1$) |
|---|---|---|
| Quasi-static ($\kappa_{\mathrm{env}} \ll k_{\mathrm{sys}}$) | $\phi^{\mathrm{ss}} = 1$ | $\phi^{\mathrm{ss}} = 1$ |
| Intermediate ($\kappa_{\mathrm{env}} \approx k_{\mathrm{sys}}$) | $\phi^{\mathrm{ss}} = (1 + \kappa_{\mathrm{env}}/k_{\mathrm{sys}})^{-1}$ | $(1 + \kappa_{\mathrm{env}}/k_{\mathrm{sys}})^{-1} \leq \phi^{\mathrm{ss}} \leq 1$ |
| Fast ($\kappa_{\mathrm{env}} \gg k_{\mathrm{sys}}$) | $\phi^{\mathrm{ss}} = k_{\mathrm{sys}}/\kappa_{\mathrm{env}}$ | $k_{\mathrm{sys}}/\kappa_{\mathrm{env}} \leq \phi^{\mathrm{ss}} \leq 1$ |
