Article

Entropy and Energy, – a Universal Competition

Technical University Berlin, Berlin, Germany
Submission received: 31 July 2008 / Revised: 9 September 2008 / Accepted: 22 September 2008 / Published: 15 October 2008

Abstract: When a body approaches equilibrium, energy tends to a minimum and entropy tends to a maximum. Often, or usually, the two tendencies favour different configurations of the body. Thus energy is deterministic in the sense that it favours fixed positions for the atoms, while entropy randomizes the positions. Both may exert considerable forces in the attempt to reach their objectives. Therefore they have to compromise; indeed, under most circumstances it is the available free energy which achieves a minimum. For low temperatures that free energy is dominated by the energy itself, while for high temperatures it is determined by the entropy. Several examples are provided for the roles of energy and entropy as competitors: planetary atmospheres; osmosis; phase transitions in gases and liquids and in shape memory alloys; and chemical reactions, viz. the Haber-Bosch synthesis of ammonia and photosynthesis. Some historical remarks are strewn through the text to make the reader appreciate the difficulties encountered by the pioneers in understanding the subtlety of the concept of entropy, and in convincing others of the validity and relevance of their arguments.

1. First and second laws of thermodynamics

The mid-nineteenth century saw the discovery of the two laws of thermodynamics, virtually simultaneously. The first law states that the rate of change of the energy of a body – internal, potential and kinetic energy – is due to heating $\dot{Q}$ and working $\dot{W}$:

$\frac{d(U + E_{\mathrm{pot}} + K)}{dt} = \dot{Q} + \dot{W}\,.$

The second law is an inequality. It states that the rate of change of the entropy of a body is at least as large as the heating divided by the homogeneous surface temperature $T_0$ of the body:

$\frac{dS}{dt} \ge \frac{\dot{Q}}{T_0}\,.$
The equality holds when the heating occurs slowly, or reversibly in the jargon of thermodynamics.
Clausius formulated both these laws more or less in the above form and he was moved to summarize them in the slogan
Die Energie der Welt ist constant.
Die Entropie der Welt strebt einem Maximum zu.
(The energy of the world is constant. The entropy of the world strives toward a maximum.)
Obviously he assumed that die Welt, the universe, is not subject to either heating or working.
The second law reveals a teleological tendency of nature in that the entropy of an adiabatic body can only grow. And, if equilibrium is reached, a stationary state, that state is characterized by a maximum of entropy. Nothing can happen in equilibrium, Clausius thought, so that he could postulate the eventual heat death of the universe. Says he:
...when the maximum of entropy is reached, no change can occur...anymore. The world is then in a dead stagnant state.
Not everybody could bring himself to like the bleak prospect, and Loschmidt deplored the situation most poignantly when he complained about
the terroristic nimbus of the second law …, which lets it appear as the destructive principle of all life in the universe.
There was quite some interest in the circles of natural philosophers at the time, but now – after a century and a half – scientists take a relaxed view of the heat death. They simply do not know whether it will occur or not, and – unlike 19th century physicists – modern physicists have become resigned to the concept of ignoramus (we do not know), and perhaps even to the concept of ignorabimus (we shall never know).

2. Available free energy

However, adiabatic systems are rare and it is appropriate to ask whether there is another teleological quantity when a body is not adiabatic. In general the answer to that question is: no! The answer is negative because, under general boundary conditions, we cannot expect an equilibrium to be approached. Yet there are special circumstances where that is the case. A necessary condition is that the boundary temperature is not only homogeneous but also constant. In that case the heating $\dot{Q}$ may be eliminated between the first and second laws and we obtain

$\frac{d(U + E_{\mathrm{pot}} + K - T_0 S)}{dt} \le \dot{W}\,.$
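Spelled out, the elimination runs as follows (using only that $T_0$ is constant and positive): the second law gives $\dot{Q} \le T_0\,\frac{dS}{dt}$, and inserting this into the first law yields

$\frac{d(U + E_{\mathrm{pot}} + K)}{dt} = \dot{Q} + \dot{W} \le T_0\,\frac{dS}{dt} + \dot{W}\,,$

so that, because $T_0$ is constant, the term $T_0\,\frac{dS}{dt}$ may be pulled under the time derivative and the inequality above results.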
Thus, before we may make a statement about growth or decay, we need to specify the role of the working $\dot{W}$, which is the power of the stress applied to the surface ($t_{ij}$ is the stress tensor, $v_i$ the velocity and $n_j$ the outer unit normal of the surface $\partial V$):

$\dot{W} = \oint_{\partial V} t_{ij}\, v_i\, n_j\, dA\,.$
The most popular case – and the one to which I restrict the attention in this presentation – is the one where the surface is at rest, so that $\dot{W} = 0$ holds.
In that case $U + E_{\mathrm{pot}} + K - T_0 S$ can obviously only decrease, so that it tends to a minimum as equilibrium is approached, a stationary state in which all change has come to an end. Generically we call this quantity the available free energy and denote it by $A$:

$A = U + E_{\mathrm{pot}} + K - T_0 S\,.$
The available free energy, or availability, $A$ is a generic expression for the quantity that becomes minimal in a body. It may be different from $U + E_{\mathrm{pot}} + K - T_0 S$ if $\dot{W} = \oint_{\partial V} t_{ij}\, v_i\, n_j\, dA$ does not vanish. An important special case is the one with $A = U + p_0 V + E_{\mathrm{pot}} + K - T_0 S$, where $p_0$ is the constant and homogeneous pressure on a movable part of $\partial V$, provided that all points of that movable part have the same gravitational potential.
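The derivation of that special case is short, under the assumption that the stress on the movable part of the surface is just the constant pressure, $t_{ij} = -p_0\,\delta_{ij}$, while the rest of the surface is at rest. Then

$\dot{W} = -p_0 \oint_{\partial V} v_i\, n_i\, dA = -p_0\,\frac{dV}{dt}\,,$

since the surface integral of the normal velocity is the rate of change of the enclosed volume. Inserting this into $\frac{d}{dt}(U + E_{\mathrm{pot}} + K - T_0 S) \le \dot{W}$ and using that $p_0$ is constant gives

$\frac{d}{dt}\bigl(U + p_0 V + E_{\mathrm{pot}} + K - T_0 S\bigr) \le 0\,.$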
Note that inside $V$ anything and everything may occur initially: turbulent motion, friction, heat conduction, phase changes, chemical reactions, etc. Temperature and pressure inside $V$ are arbitrary fields initially. However, as long as the boundary conditions – $T_0$ constant and homogeneous on $\partial V$ and $v_i = 0$ on $\partial V$ – are satisfied, the available free energy tends to a minimum. Thus we conclude that a decrease of energy is conducive to equilibrium, and so is an increase of entropy. In a manner of speaking we may say that the energy wants to reach a minimum and that the entropy wants to reach a maximum, but they have to compromise, and so it is $A$ which achieves a minimum.
If $T_0$ is small, so that the entropic part of $A$ may be neglected, the available free energy becomes minimal because the energy becomes minimal. If, however, $T_0$ is large, so that the energetic part may be neglected, the available free energy becomes minimal because the entropy approaches a maximum.
In general, i.e. for intermediate values of $T_0$, it is neither the energy that achieves a minimum, nor the entropy that becomes maximal. The two tendencies compete and find a compromise in which $A$ is minimal.
In the nineteenth century – after the formulation of the second law – there was a noisy controversy between energetics, represented by Ostwald, and thermodynamics, favoured by Boltzmann. The energeticists maintained that the entropy was not needed. They were wrong, but they did have a point, albeit only at small temperatures. Planck was involved – in a minor role – in the discussion as an unappreciated supporter of Boltzmann's thermodynamic view. It was this controversy which prompted Planck to issue his oft-quoted dictum: The only way to get revolutionary advances in science accepted is to wait for all old scientists to die.
Note added in proof (in response to a remark by a reviewer): It is tempting to think that the available free energy $A = U + E_{\mathrm{pot}} + K - T_0 S$ is the Helmholtz free energy, but it is not, or not in general. $A$ does reduce to the Helmholtz free energy when the temperature has become homogeneous inside the whole volume of the body and is then equal to the surface temperature $T_0$. That is not generally the case, but it may happen in the last stages of the approach to equilibrium. A similar remark applies to the available free energy $A = U + p_0 V + E_{\mathrm{pot}} + K - T_0 S$ and the Gibbs free energy of conventional "close to equilibrium" thermodynamics.

3. Entropic growth - the strategy of nature

So, what is entropy, that somewhat mysterious quantity which wants to grow? The energy has a good interpretation – it has become a household word – in particular the gravitational potential energy: if a body is high up, it can potentially (sic) produce a big impact upon falling. But entropy? Few people outside physics understand the concept, even well-educated ones, and therefore I shall take the time to explain it and – in doing so – reveal an important part of the strategy of nature. I apologize to the colleagues who do know.
First of all, the second law $\frac{dS}{dt} \ge \frac{\dot{Q}}{T_0}$ does not say what entropy is; it gives a property of entropy, but not its definition or interpretation. That interpretation was found by Boltzmann, who was eventually able to understand the subtle concept fully and to explain it to a skeptical world. Boltzmann's interpretation is encapsulated in the formula

$S = k \ln w\,,$

which is either the most important formula of physics or a close second in importance – next to $E = mc^2$ – depending on personal preference. $w$ is the number of possibilities in which the atoms of a body can be distributed. I proceed to give an example and then I shall discuss it.
Let there be $N$ atoms in a volume with $P$ points $\{x_1, x_2, \ldots, x_P\}$ which a particle may occupy. A distribution is the set of numbers $\{N_{x_1}, N_{x_2}, \ldots, N_{x_P}\}$ of atoms on the occupiable points and – by the rules of combinatorics – there are $w = \frac{N!}{\prod_{i=1}^{P} N_{x_i}!}$ possible realizations of that distribution.
Now, according to Boltzmann we have to proceed with a sequence of three arguments as follows.
  • In the course of the thermal motion of the atoms a realization of the gas changes trillions of times in each second.
  • Each realization occurs just as often as any other one. That is the only possible unbiased assumption about the realizations. And it implies that a distribution with many realizations occurs more frequently than a distribution with few realizations (a small numerical illustration follows right after this list). Most frequent is the distribution with the most realizations, and rather obviously – from $w = N!/\prod_{i=1}^{P} N_{x_i}!$ – that is the equi-distribution, in which each point is occupied by the equal number $N/P$ of atoms, so that the entropy reads (The Stirling formula is used here and it is assumed that $N/P$ is a large number. This is a somewhat precarious assumption, as the later development of statistical thermodynamics has shown, but I shall not go into this at this point. The interested reader is referred to the book [1].)
    $S = k N \ln P\,.$
    That is the entropy of an equilibrium state, a state corresponding to a stationary homogeneous distribution.
  • The number of points in a volume $V$ is unknown, but it should be proportional to the size of $V$, so that we have
    $P = \alpha V\,,\ \text{hence}\ S = k\,(N \ln V + N \ln \alpha)\,.$
    Therefore $S$ grows with $V$ and we may say that $S$ – in growing – "wishes" to distribute the atoms of a body homogeneously over as big a volume as possible.
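The following small Python sketch – my own illustration, not from the paper, with the arbitrary choice N = 12 atoms on P = 4 points – makes Boltzmann's counting concrete: it computes w and S = k ln w for an orderly distribution and for the equi-distribution.

from math import factorial, log, prod

k = 1.380649e-23  # Boltzmann constant [J/K]

def realizations(distribution):
    """Number of realizations w = N! / (N_x1! N_x2! ... N_xP!) of a distribution."""
    N = sum(distribution)
    return factorial(N) // prod(factorial(n) for n in distribution)

def entropy_over_k(distribution):
    """Boltzmann entropy S = k ln w, reported here in units of k."""
    return log(realizations(distribution))

orderly = [12, 0, 0, 0]   # all atoms on one point: w = 1, S = 0
uniform = [3, 3, 3, 3]    # equi-distribution: maximal w

print("orderly:  w =", realizations(orderly), " S/k =", entropy_over_k(orderly))
print("uniform:  w =", realizations(uniform), " S/k =", entropy_over_k(uniform))
# Stirling's approximation gives S/k ~ N ln P = 12 ln 4 ~ 16.6 for the
# equi-distribution; the exact value printed above is smaller because
# N/P = 3 is not really a large number, which is exactly the caveat in the text.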
So, what happens if we start with a distribution of only a few realizations, for instance when all atoms lie in one position, where there is only one realization, so that $w = 1$ and $S = 0$ hold? Under the random thermal motion this orderly distribution is very quickly messed up and replaced by another one which has more realizations. Eventually the body will find itself in the homogeneous equi-distribution, which has the most realizations and therefore the highest entropy. That in fact is the nature of entropic growth. Thus there is an element of probability – not certainty! – in the entropic growth. After all, the initial orderly distribution may still reappear. However, the probability for the entropic growth is overwhelming when the body consists of many atoms.
So, here we obtain a glimpse of the strategy of nature, or how nature can produce growth from random thermal motion. To be sure, it is not much of a strategy. It is the "strategy" of the roulette wheel, i.e. the reign of randomness and chance, but fairly well predictable. The gambling halls live well on that type of predictability and so do insurance companies. Nor is the strategy of nature limited to gambling and to the distribution of atoms; it occurs everywhere: from the evolution of species to the fate of planetary atmospheres.
The probabilistic aspects of gambling and insurance were known since time immemorial. The great discovery of the nineteenth century was the realization that the same probabilistic aspects govern physical processes, and that they are connected with the concept of entropy.
The eminent physicist J.C. Maxwell was puzzled by the stochastic nature of Boltzmann's new physics. In a letter to his friend Tait he muses that [the probability calculus], of which we usually assume that it refers to gambling, dicing, and betting, and should therefore be wholly immoral, is the only mathematics for practical people, which we should be.
True to this recommendation Maxwell was one of the first to employ probabilistic methods successfully in his kinetic theory.
I have tried to be non-technical in this presentation. But for those who have found the presentation still too cumbersome, let me summarize by saying this: The growth of entropy in nature may quite reasonably be likened to the growth of the losses of the roulette addict as he continues to gamble. In that way we really do obtain an intuitive understanding for entropy and a plausible interpretation.
Let us next look at planetary atmospheres. But first this:
Having mentioned Planck, of course quantization comes to our minds, the everlasting claim to fame of Max Planck. It is unclear where he got the idea, but Planck knew Boltzmann's work intimately. And Boltzmann – by assuming $P = \alpha V$ – was a forerunner of quantization. Indeed, obviously, if this argument holds, $1/\alpha$ is the volume of the smallest cell that can accommodate a point. Boltzmann discusses this but dismisses his idea as an "auxiliary tool" for his calculations, cf. [2]. In modern physics quantization is considered a real phenomenon and $1/\alpha$ grows with decreasing temperature. In fact $\sqrt[3]{1/\alpha}$ is the mean de Broglie wave length of the particles in their thermal motion.
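For completeness – this expression is not given in the text, but it is the standard formula for the thermal de Broglie wavelength – one has

$\sqrt[3]{1/\alpha} \approx \lambda_{\mathrm{th}} = \frac{h}{\sqrt{2\pi\, m\, k\, T}}\,,$

where $h$ is Planck's constant and $m$ the atomic mass; $\lambda_{\mathrm{th}}$ indeed grows as the temperature drops.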

4. Gravitational potential energy and entropy

4.1. Planetary atmosphere

We investigate an isothermal atmosphere of a planet which – for the purposes of the argument – we envisage under a spherical roof of height $H$ above the planetary surface. The atmosphere below the roof is in equilibrium, but the roof keeps it from unconstrained equilibrium, which occurs for the value of $H$ that makes the available free energy $A = E_{\mathrm{pot}} - T_0 S$ minimal.
The available free energy of the atmosphere is a functional of the density distribution. The “roof-argument” is a tool to reduce that functional to a function of a single variable, the height of the roof. Such a procedure is a common practice in variational calculus.
We skip all calculations and plot only the potential energy of the atmosphere and its entropy, both as functions of $H$. The calculation is a student's exercise, and Mathematica® helps with the plots of the elliptic functions which emerge. The potential energy is minimal for $H = 0$ – cf. Fig. 1a – so that energy prefers all atoms or molecules to lie at the bottom, on the planet's surface. But the entropy of the atmosphere grows monotonically with $H$ – cf. Fig. 1b – so that the entropy prefers the atmosphere to be distributed homogeneously throughout space. Who wins? The entropy wins, because the available free energy $A$ has its minimum at $H = \infty$, cf. Fig. 1c.
Figure 1. Planetary atmosphere: a. Energy; b. Entropy; c. Available free energy.
Therefore a planet with an atmosphere is not a stable system; eventually the atmospheres of all planets will "evaporate" into space. And we should not be surprised, because indeed every particle will – in the course of its irregular thermal motion – occasionally reach the escape velocity and thus has a chance to leave the planet for good.
It is true that the process of "evaporation" may take a long time. The essential parameter is

$\beta = \frac{\gamma M \mu}{k\, T\, R}$
($\gamma$ – gravitational constant, $M$ – planetary mass, $R$ – planetary radius, $\mu$ – molecular mass of the atmosphere, $k$ – Boltzmann constant, $T$ – temperature).
A small $\beta$ is a recipe for a bare planet. Thus the planet Mercury is hot because of its proximity to the sun, and it has no atmosphere; nor has the moon, which is very hot half of the time. But the big planets far from the sun – Jupiter, Saturn, Uranus – are so cold that they have a thick atmosphere. Actually those planets are so big because they are cold: they have been able to hang on even to light gases – with a small $\mu$ – like hydrogen and helium, which were overabundant when the solar planetary system was formed.
Our earth stands in the middle: it is too hot to have kept hydrogen and helium, but just cool enough to have kept the heavier gases oxygen and nitrogen. Therefore the earth hangs on to a thin atmosphere of gases of intermediate weight – for the time being!
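As a rough, order-of-magnitude illustration of this parameter – with $\beta$ as reconstructed above, and with planetary data, temperatures and dominant gases that are merely representative values of my own choosing, not the paper's – one may estimate:

k     = 1.380649e-23   # Boltzmann constant [J/K]
gamma = 6.674e-11      # gravitational constant [m^3 kg^-1 s^-2]

bodies = {
    #  name             M [kg]     R [m]    T [K]  mu [kg]  (assumed dominant gas)
    "Earth   (N2)":   (5.97e24,  6.37e6,   288,   4.65e-26),
    "Moon    (N2)":   (7.35e22,  1.74e6,   390,   4.65e-26),  # dayside temperature
    "Jupiter (H2)":   (1.90e27,  6.99e7,   165,   3.35e-27),
}

for name, (M, R, T, mu) in bodies.items():
    beta = gamma * M * mu / (k * T * R)
    print(f"{name}: beta = {beta:7.0f}")

# Large beta (Earth, and especially Jupiter even for hydrogen) means the
# gravitational well is deep compared with kT, so the atmosphere is retained
# for a very long time; small beta (the hot, light Moon) means a bare body.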

4.2. Osmosis and the Pfeffer tube

We fix a semi-permeable wall – permeable for water – to one end of a long tube and stick that end of the tube into a water reservoir. The water will then stand at the same level in the tube and in the reservoir, as shown in Fig. 2a. Now we let a little salt dissolve in the water inside the tube. The salt ions should attempt to increase their entropy by finding a homogeneous distribution in the little bit of water available to them, and that might be considered to be the end of it, since the salt ions cannot pass the semi-permeable wall.
However, nature is more clever than that! In the effort to increase its entropy, hence its volume, the salt "pulls" water into the tube. Or else we may say that water "pushes" its way into the tube in order to help the salt to increase its entropy. As a result the level of solution in the tube rises and – for reasonable data – may reach dozens of meters.
Eventually the rise of potential energy of the system brings the process of osmosis to an end. Osmosis is the passage of a fluid through a semi-permeable wall. We can calculate the potential energy and the entropy of the system as functions of the height $H_T$ of the solution in the tube. Fig. 2b shows that the energy has its minimum when the levels of the liquid are essentially equal, and that the entropy has a maximum when all the water has been pulled into the solution. The available free energy has a minimum at some intermediate value. Thus in this case neither of the tendencies wins – neither energy nor entropy – they find a true compromise. As a result the salt solution in the tube is considerably diluted and the height of the water column has grown. In the state of equilibrium the osmotic pressure – the pressure difference across the semi-permeable wall – may easily reach several bars.
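For orientation, a standard dilute-solution (van 't Hoff) estimate – my own illustration; the formula and the numbers are not from the paper – already reproduces the orders of magnitude quoted above:

R   = 8.314      # gas constant [J/(mol K)]
T   = 298.0      # temperature [K]
c   = 100.0      # salt-ion concentration [mol/m^3], i.e. 0.1 mol/l (assumed)
rho = 1000.0     # density of water [kg/m^3]
g   = 9.81       # gravitational acceleration [m/s^2]

pi_osm = c * R * T           # osmotic pressure [Pa], van 't Hoff: pi = c R T
height = pi_osm / (rho * g)  # equilibrium rise of the liquid column [m]

print(f"osmotic pressure = {pi_osm/1e5:.1f} bar, rise = {height:.0f} m")
# roughly 2.5 bar and 25 m already for a 0.1-molar solution, which is why the
# text speaks of several bars and dozens of meters.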
Figure 2. On osmosis.
Of course “pulling” and “pushing” are anthropomorphic euphemisms. What happens is the effect of random motion of the molecules which find the distribution with most realizations under the prevailing conditions. Nothing but pure chance!
Note that the water pays the price for the process, because it is essentially its potential energy that grows. And the salt profits because it is its entropy that grows. We conclude that nature does not permit the constituents to be selfish. The mixture gains in the process.
Another interesting aspect of the osmotic rise of solution in the tube is that the system is not content with the shape that we gave it at the outset. It shapes its final form in the quest for a minimum of the available free energy.
Finally I remark that life depends on this, because it is osmosis that drives the sap into the tree tops; the entropic tendency for growth is powerful enough for that.

5. Phase transitions

5.1. Solid <-> liquid <-> vapor

If energy and entropy compete in the manner described above, the energy need not be the gravitational potential energy of the body; it may be the potential energy of the intermolecular forces, called van der Waals forces. Let us consider this:
Between the atoms of a body there is attraction at a large distance and repulsion at close range. The potential energy field of these inter-atomic forces provides numerous potential wells in which the atoms of a solid may rest comfortably at low temperature, occupying a minimum of potential energy and filling a small volume, because all atoms are close together. That situation occurs in all bodies at low temperature. In a liquid – at a somewhat higher temperature – the atoms are still close together, but their thermal motion does not allow them to remain in the minima of the potential wells.
When the liquid is heated, the thermal motion becomes more virulent, and eventually it will allow the atoms to jump out of the potential wells and make use of the whole available space. The body becomes a vapor or a gas. The vapor requires much more space than the liquid so that its entropy is bigger. And indeed, a phase transition may be seen as one aspect of the competition between energy tending toward a minimum, and entropy tending toward a maximum. The evaporation of a liquid – or its opposite, the condensation – may be viewed as the sudden overpowering of one of these tendencies by the other one.
A neat demonstration of the phenomena of melting and evaporation – or of freezing and condensation – may be obtained on the computer by looking at only a few atoms, each one interacting with all others. Upon heating or cooling by contact with the wall, the atoms will simulate a gas of free-flying atoms at a high temperature, or a liquid in which the atoms cluster together at an intermediate temperature, or a hexagonal solid lattice with all nearest neighbours having the same distance; the latter case occurs at low temperature. Fig. 3 shows screen shots of the three phases for as few as seven atoms. These are taken from an animation recorded on a CD that accompanies the book by Müller and Weiss [1].
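For readers who want to experiment themselves, the following is a minimal sketch of how such a demonstration can be set up: Lennard-Jones pair forces, velocity-Verlet integration, and a crude velocity-rescaling "thermostat" in reduced units. It is my own illustration, not the simulation of [1], and all parameters are arbitrary.

import numpy as np

np.random.seed(0)
n, dt, steps = 7, 0.002, 20000
target_T = 0.05                      # low temperature -> solid-like cluster

# start on a loose grid with a little jitter so that no pair is too close
pos = np.array([[i % 3, i // 3] for i in range(n)], dtype=float) * 1.5
pos += 0.05 * np.random.randn(n, 2)
vel = np.zeros((n, 2))

def forces(pos):
    """Lennard-Jones forces between all pairs (epsilon = sigma = mass = 1)."""
    f = np.zeros_like(pos)
    for i in range(n):
        for j in range(i + 1, n):
            r = pos[i] - pos[j]
            d2 = np.dot(r, r) + 1e-12
            inv6 = 1.0 / d2**3
            fij = 24.0 * (2.0 * inv6**2 - inv6) / d2 * r   # force on i from j
            f[i] += fij
            f[j] -= fij
    return f

f = forces(pos)
for step in range(steps):
    vel += 0.5 * dt * f              # velocity Verlet: first half kick
    pos += dt * vel
    f = forces(pos)
    vel += 0.5 * dt * f              # second half kick
    # rescale velocities gently toward the target temperature (2D: T ~ <v^2>/2)
    T_now = np.mean(np.sum(vel**2, axis=1)) / 2.0
    if T_now > 1e-12:
        vel *= np.clip(np.sqrt(target_T / T_now), 0.95, 1.05)

print(pos)   # at low target_T the atoms should settle into a compact cluster;
             # raising target_T (and adding a confining box) yields the
             # liquid-like and gas-like pictures described in the text.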
Figure 3. Seven atoms simulating a gas or vapor, a liquid and a solid.

5.2. Austenite <-> martensite

Phase transitions may also occur in solids when a crystalline lattice undergoes a structural change, say from a hexagonal lattice at low temperature to a cubic lattice at high temperature. Such is the case for shape memory alloys which have been extensively studied in recent decades. In that case metallurgists speak of a martensite<->austenite transition.
Here, as always in a phase transition, we see the competition between energy and entropy at work. The potential energy of the martensite is lower than the potential energy of the austenite, and that holds true for low and high temperatures. But the entropy of the cubic lattice is bigger than the entropy of the hexagonal one, so that entropy favours the cubic lattice. At low temperature, however, the entropic preference for the cubic lattice is outweighed by the energetic preference for martensite; after all, in the available free energy $A$ the entropic effect is weakened by a low temperature. But at high temperature the entropy dominates and the body becomes cubic.
Fig. 4 shows the result of a molecular simulation. It presents screen shots of a film that may be viewed on the CD attached to the book by Müller and Weiss [1]. The simulation was carried out by Kastner [3].
Such structural transitions have only been observed in alloys, and in Fig. 4 the atoms of the constituents are black and grey. We see that in a cell of the martensitic phase the black atom is squeezed into a corner of a grey rhombus. There is little space in that corner and consequently the entropy of the martensite is small. In the austenitic phase the black atom has a whole grey square cell at its disposal; therefore the entropy is large. In fact, we may say that the tendency of the body to increase its entropy "shapes" the lattice and makes it austenitic despite the fact that the energy favours martensite. Metallurgists speak of entropic stabilization in this case and similar ones.
The figure shows an austenitic lattice in the top left picture which – upon cooling – turns into martensitic twins in the bottom right picture. Upon heating the reverse happens, albeit usually with a hysteresis.
The leaning to the left or right of the martensitic twins comes about because the black atom in a grey cell may choose the left corner or the right one to move into when austenite becomes unstable. The twin structure emerges differently in each new experiment.
Figure 4. Formation of martensitic twins in an austenite<->martensite transition upon cooling.

6. Chemical reactions

6.1. Ammonia synthesis

When entropy and energy compete, the energy need not be the gravitational potential energy nor the potential energy of the intermolecular forces. It may be the potential energy of the chemical bonds. When atoms or molecules rearrange themselves in a chemical reaction, new ones appear while the old ones disappear. And their energies and entropies appear or disappear along with the masses.
Thus in the reaction

$\mathrm{N_2 + 3\,H_2 \longrightarrow 2\,NH_3}$

a nitrogen molecule and three hydrogen molecules disappear and two ammonia molecules emerge. The changes of volume, energy, and entropy involved in the reaction are (chemical handbooks provide molar quantities; therefore we do the same and denote molar specific properties by minuscules)

$\Delta v = -44.8\ \tfrac{\ell}{\mathrm{mol}}\,, \quad \Delta e = -92.4\ \tfrac{\mathrm{kJ}}{\mathrm{mol}}\,, \quad \Delta s = -178.6\ \tfrac{\mathrm{J}}{\mathrm{mol\,K}}\,.$
Thus the volume drops and the energy drops, which is good for ammonia, but the entropy also drops, which is bad. However, at room temperature we have

$\Delta a_{RT} = \Delta e - T_{RT}\,\Delta s = -39.2\ \tfrac{\mathrm{kJ}}{\mathrm{mol}}\,,$
so that the available free energy goes down. Therefore the reaction should proceed and ammonia should be formed, which the world craves for the production of fertilizers and explosives, or explosives and fertilizers.
Chemists usually perform their experiments at fixed temperature $T_0$ and at a fixed pressure $p_0$ on the wall of the container. Under those circumstances the measured value denoted by $\Delta e$ above is really $\Delta(e + p_0 v)$, the heat of reaction.
In actual fact, however, no ammonia is produced at room temperature. The reason is that the molecules of nitrogen and hydrogen cannot react directly; they first have to be split into atoms. That may be done with the help of a catalyst which, however, requires a temperature of about 500 °C. Now, at that temperature we have

$\Delta a_{500\,^{\circ}\mathrm{C}} = \Delta e - T_{500\,^{\circ}\mathrm{C}}\,\Delta s = +50.1\ \tfrac{\mathrm{kJ}}{\mathrm{mol}}\,.$
The available free energy grows! This means that again no ammonia can be formed. The reason is obviously that the entropy drop is too big.
But chemists are clever; they know about entropy, and they know that the entropy of gases increases with increasing volume or – more appropriately here – that it decreases with decreasing volume. Thus Fritz Haber, the pioneer of ammonia synthesis, put the gases under pressure, thus decreasing the volume drop and with it the entropy drop. Together with his colleague and co-worker Carl Bosch he was thus able to produce as much ammonia as he wanted – at 200 bar and 500 °C.
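A back-of-the-envelope sketch of this reasoning – my own, treating the tabulated Δe and Δs as temperature-independent and the gases as ideal, which is not the author's calculation – shows how temperature and pressure shift the sign of Δa:

import math

de = -92.4e3     # energy of reaction [J/mol N2]
ds = -178.6      # entropy of reaction at 1 bar [J/(mol N2 K)]
R  = 8.314       # gas constant [J/(mol K)]

def delta_a(T, p_bar=1.0):
    """Available free energy change per mol N2 at temperature T and pressure p.
    Compressing the gases reduces the entropy drop: the reaction destroys two
    moles of gas, so ds gains roughly +2 R ln(p / 1 bar)."""
    ds_p = ds + 2.0 * R * math.log(p_bar)
    return de - T * ds_p

print(delta_a(298) / 1e3)          # about -39 kJ/mol: favourable at room temperature
print(delta_a(773) / 1e3)          # about +46 kJ/mol: unfavourable at 500 C
                                   # (the paper quotes +50.1 kJ/mol, presumably
                                   # with temperature-corrected data)
print(delta_a(773, 200.0) / 1e3)   # about -22 kJ/mol: favourable again at 200 bar
print("crossover near", de / ds, "K at 1 bar")   # about 517 K, i.e. 244 C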
Figure 5. Available free energy for the ammonia reaction as a function of the extent of reaction.
In 1900 all nitrates for industrial purposes were produced from guano, deposited by birds over the millennia on the west coast of South America and imported into Europe by ship. Now it was clear that Germany, in the case of a war, would be cut off from those imports by a British naval blockade. So the Haber-Bosch synthesis of ammonia came just in time for the first world war. Without that invention the war could not have lasted more than a few months for lack of explosives on the German side. As it was – and as we know – the war lasted more than four years; until Germany ran out of men, and food, and morale – but never out of explosives.
Even nowadays the ammonia synthesis by high pressure catalytic chemistry is one of the big money-makers of the chemical industry.

6.2. Photosynthesis

Plants produce glucose, $\mathrm{C_6H_{12}O_6}$, from the carbon dioxide of the air and from the water in the soil. In doing so they set free oxygen. The process is called photosynthesis, since it occurs only under light. As I understand it, not all the details of the synthesis are as yet fully understood by the biochemists, although they are getting close. Here we restrict the attention to the thermodynamics of photosynthesis and balance influxes and effluxes of energy and entropy.
The stoichiometric equation reads

$\mathrm{CO_2 + H_2O \longrightarrow \tfrac{1}{6}\,C_6H_{12}O_6 + O_2}\,,$

and, of course, the process must satisfy the first and second laws of thermodynamics. Measurements show that we have

$\Delta e = +466.3\ \tfrac{\mathrm{kJ}}{\mathrm{mol}} \quad \text{and} \quad \Delta s = -40.1\ \tfrac{\mathrm{J}}{\mathrm{mol\,K}}\,.$
This is the worst possible case: the energy goes up and the entropy comes down. Indeed, the first and second laws read

$Q = \Delta e > 0 \quad \text{and} \quad Q \le T\,\Delta s < 0\,,$

and they contradict each other: according to the first law we need to apply heat to the process, and according to the second law heat must be withdrawn. The available free energy summarizes the dilemma, because at room temperature it changes by the amount

$\Delta a = \Delta e - T\,\Delta s = 466.3 - 298 \times (-0.0401)\ \tfrac{\mathrm{kJ}}{\mathrm{mol}} = +478.3\ \tfrac{\mathrm{kJ}}{\mathrm{mol}}\,.$
It grows (!) when we know that it must decrease!
Thus, if we did not know better, we should now conclude that the process is impossible. And indeed, photosynthesis is a thermodynamically precarious process. Since it does occur, it must be accompanied by a secondary process which provides the energy and increases the entropy.
The supply of energy is the lesser problem, since energy is absorbed from radiation, primarily from the red and yellow part of the spectrum, which is why the plants are green. Actually the calculation shows that the absorbed energy from radiation is more than enough. By itself it would overheat the plant to a high temperature where no photosynthesis can occur anymore. That is the reason for the large water demand of plants – 100 to 1000 times more than needed for the stoichiometric reaction. The excess water is evaporated and cools the plant – just like animals cool themselves by sweating.
More subtle is the entropy balance. Evaporation of water does not help: it is true that the entropy grows in evaporation, but the available free energy remains constant. Actually it seems that no definite – and generally accepted – resolution is available for the paradoxical growth of available free energy.
There are at least two propositions, one of them ennobled by its author, who was Erwin Schrödinger, no less. Says he:
The plants of course have their most powerful supply of negative entropy in the sunlight.
And indeed, it is true that between absorption and emission of radiation a plant produces entropy in sufficient quantity to offset the decrease of entropy in the stoichiometric reaction. Yet the “mechanism” of the absorption of entropy and its emission has never been studied in detail, at least not to my knowledge. Therefore there remains some lingering doubt about Schrödinger´s interpretation.
Another possibility is that the necessary entropy increase is due to the entropy of mixing of the evaporated water with the surrounding air. In that interpretation dry air "pulls" the water from the leaves and helps the plant to cool itself and – at the same time – to offset the entropy decrease of the stoichiometric reaction. That proposition is my own, published in the paper by Klippel and Müller [4].
Note added in proof (in response to a remark by a reviewer): I am given to understand that Schrödinger's and my own considerations are not the only ones that deal with the thermodynamically precarious process of photosynthesis, and that I should abstain from discussing it. Well, I will not do that, but I do agree with the reviewer that there is a large body of literature on the subject. And I will be well content if the reader realizes that there is a problem. He is welcome to ignore my attempt at a solution.

7. Boltzmann's interpretation of time

Whatever the correct understanding may be of the feasibility of photosynthesis, it is certain to be connected with entropy. The tendency of the entropy to grow is a powerful force of nature, in living systems and inanimate ones alike. Yet certain questions have never been answered – questions that loomed large in the minds of the scientists of the 19th century. The doctrine of the heat death was only one of the surprising notions whose explanation was deferred and then omitted.
Another conundrum is due to Loschmidt again, a colleague of Boltzmann's and the man who deplored the terroristic nimbus of entropy. Let us consider that:
If a system of atoms runs its course toward more probable distributions and is then stopped and all its velocities are inverted, it should run backwards toward the less probable distributions from which it has come. This has to be so, because the equations of mechanics are invariant under replacement of the time t by -t. Therefore Loschmidt argued that a motion of the system with decreasing entropy should occur just as often as one with increasing entropy.
In his reply Boltzmann did not dispute, of course, the reversibility of the atomic motions. He tried, however, to make the objection irrelevant in a probabilistic sense by emphasizing the importance of initial conditions. By our argument – Boltzmann's argument – explained above, all realizations, or microstates, occur equally frequently; therefore we expect to see the distribution evolve in the direction in which it can be realized by more microstates, irrespective of initial conditions. This cannot be strictly true, however, since Loschmidt's inverted initial conditions are among the possible ones and they lead to less probable distributions, i.e. those with fewer possible realizations. So Boltzmann argued that, among all conceivable initial conditions, there are only few that lead to less probable distributions among the many that lead to more probable ones. Therefore, if we pick an initial condition at random, we nearly always pick one that leads to entropy growth and almost never one that lets the entropy decrease. Hence an increase of entropy should occur more often than a decrease.
Most people were unconvinced at the time; they thought that the argument about initial conditions merely rephrased the a priori assumption about equal probability of all microstates. Nowadays the discussion has faded away – like the discussion of the heat death – although there was never a really convincing answer to Loschmidt's reversibility objection. Boltzmann tried again later and he came up with an interesting notion when he speculated that
…in the universe, which is nearly everywhere in an equilibrium, and therefore dead, there must be relatively small regions of the dimensions of our star space (call them worlds) … which, during the relatively short period of eons, deviate from equilibrium and among these [there must be] equally many in which the entropy increases and decreases. …A creature that lives in such a period of time and in such a world will denote the direction of time toward lower entropy differently than the reverse direction: The former as the past, the latter as the future. With that convention the small regions, worlds, will “initially” always find themselves in an improbable state of low entropy
Boltzmann tried to make this mind-boggling idea acceptable by drawing an analogy to the notions of up and down on the earth: Men in Europe and her antipodes both think that they stand right-side-up, when objectively one of them is upside down. Yet, applied to time the notion was not taken seriously by anybody. It has the flair of science fiction.
Note added in proof (in response to a remark by a reviewer): Once again, I am informed that there are valid answers to Loschmidt's reversibility objection which do not involve Boltzmann's hair-raising suggestion, and I have heard lectures about them. My problem is that I understand the proposed solutions even less than Boltzmann's attempt. But I will give the references on the subject that were made known to me: [5,6,7].

References

  1. Müller, I.; Weiss, W. Entropy and Energy: A Universal Competition; Springer: Heidelberg, 2005.
  2. Boltzmann, L. Weitere Studien über das Wärmegleichgewicht unter Gasmolekülen. Sitzungsberichte der Akademie der Wissenschaften Wien (II) 1872, 66, 275–370.
  3. Kastner, O. Molecular dynamics of a 2D model for the shape memory effect. Part I (Model and simulation). Cont. Mech. Thermodyn. 2003, 15, 487–502.
  4. Klippel, A.; Müller, I. Plant growth – a thermodynamicist's view. Cont. Mech. Thermodyn. 1997, 9, 127–142.
  5. Evans, D.J.; Searles, D.J. The Fluctuation Theorem. Adv. Phys. 2002, 51, 1529–1585.
  6. Bustamante, C.; Liphardt, J.; Ritort, F. The Nonequilibrium Thermodynamics of Small Systems. Phys. Today 2005, 43–48.
  7. Sevick, E.M.; Prabhakar, R.; Williams, S.R.; Searles, D.J. Fluctuation Theorems. Annu. Rev. Phys. Chem. 2008, 59, 603–633.
