Thermodynamic Probability W and Entropy
The section on atoms, molecules and probability has shown that if we want to predict whether a chemical change is spontaneous or not, we must find some general way of determining whether the final state is more probable than the initial. This can be done using a number W, called the thermodynamic probability. W is defined as the number of alternative microscopic arrangements which correspond to the same macroscopic state. The significance of this definition becomes more apparent once we have considered a few examples.
Figure 1a illustrates a crystal consisting of only eight atoms at the absolute zero of temperature. Suppose that the temperature is raised slightly by supplying just enough energy to set one of the atoms in the crystal vibrating. There are eight possible ways of doing this, since we could supply the energy to any one of the eight atoms. All eight possibilities are shown in Fig. 1b.
Since all eight possibilities correspond to the crystal having the same temperature, we say that W = 8 for the crystal at this temperature. We must also realize that the crystal will not stay perpetually in any one of these eight arrangements. Energy will constantly be transferred from one atom to another, so that all eight arrangements are equally probable.
Let us now supply a second quantity of energy exactly equal to the first, so that there is just enough to start two atoms vibrating. There are 36 different ways in which this energy can be assigned to the eight atoms (Fig. 1c): both quanta may go to the same atom, which can happen in 8 ways, or to two different atoms, which can happen in 28 ways. We say that W = 36 for the crystal at this second temperature. Because energy continually exchanges from one atom to another, there is an equal probability of finding the crystal in any of the 36 possible arrangements.
A third example of W is our eight-atom crystal at the absolute zero of temperature. Since there is no energy to be exchanged from atom to atom, only one arrangement is possible, and W = 1. This is true not only for this hypothetical crystal, but also presumably for a real crystal containing a large number of atoms, perfectly arranged, at absolute zero.
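The three values W = 1, 8, and 36 found above are all instances of one counting problem: distributing identical energy quanta among distinguishable atoms. A minimal sketch (the binomial-coefficient formula C(N + q − 1, q) is the standard "stars and bars" counting result, supplied here rather than taken from the text) reproduces them:

```python
from math import comb

def thermodynamic_probability(n_atoms: int, quanta: int) -> int:
    """Number of ways to distribute identical energy quanta
    among distinguishable atoms ("stars and bars" counting)."""
    return comb(n_atoms + quanta - 1, quanta)

# The eight-atom crystal of Fig. 1:
print(thermodynamic_probability(8, 0))  # absolute zero, no quanta: W = 1
print(thermodynamic_probability(8, 1))  # one quantum:             W = 8
print(thermodynamic_probability(8, 2))  # two quanta:              W = 36
```

The same function shows why W explodes so quickly: even a few dozen quanta among a few hundred atoms already give astronomically many arrangements.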
The thermodynamic probability W enables us to decide how much more probable certain situations are than others. Consider the flow of heat from crystal A to crystal B, as shown in Fig. 2. We shall assume that each crystal contains 100 atoms. Initially crystal B is at absolute zero. Crystal A is at a higher temperature and contains 64 units of energy, enough to set 64 of the atoms vibrating. If the two crystals are brought together, the atoms of A lose energy while those of B gain energy until the 64 units of energy are evenly distributed between both crystals.
In the initial state the 64 units of energy are distributed among 100 atoms. Calculations show that there are 1.0 × 10⁴⁴ alternative ways of making this distribution. Thus W1, the initial thermodynamic probability, is 1.0 × 10⁴⁴. The 100 atoms of crystal A continually exchange energy among themselves and transfer from one of these 1.0 × 10⁴⁴ arrangements to another in rapid succession. At any instant there is an equal probability of finding the crystal in any of the 1.0 × 10⁴⁴ arrangements.
When the two crystals are brought into contact, the energy can distribute itself over twice as many atoms. The number of possible arrangements rises enormously, and W2, the thermodynamic probability for this new situation, is 3.6 × 10⁶⁰. In the constant reshuffle of energy among the 200 atoms, each of these 3.6 × 10⁶⁰ arrangements will occur with equal probability. However, only 1.0 × 10⁴⁴ of them correspond to all the energy being in crystal A. Therefore the probability of the heat flow reversing itself and all the energy returning to crystal A is

W1/W2 = (1.0 × 10⁴⁴)/(3.6 × 10⁶⁰) = 2.8 × 10⁻¹⁷
In other words the ratio of W1 to W2 gives us the relative probability of finding the system in its initial rather than its final state.
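The constant reshuffling of energy can be mimicked with a toy simulation: repeatedly pick one of the 64 quanta at random and move it to a randomly chosen atom among the 200 in the combined crystals. This is an illustrative sketch of the idea only (the update rule and all names are invented here, not the calculation behind the figures in the text), but it shows the energy spreading from crystal A until it is shared roughly evenly:

```python
import random

random.seed(42)

N = 200  # atoms 0-99 form crystal A, atoms 100-199 form crystal B
# quanta[i] records which atom holds quantum i; all 64 units start in A
quanta = [random.randrange(100) for _ in range(64)]

for _ in range(10_000):  # reshuffle: move a random quantum to a random atom
    i = random.randrange(len(quanta))
    quanta[i] = random.randrange(N)

in_A = sum(1 for atom in quanta if atom < 100)
print(in_A, 64 - in_A)   # a roughly even split between A and B
```

Running the loop longer never drives all 64 quanta back into A; as the ratio W1/W2 above indicates, that reversal is overwhelmingly improbable.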
This example shows how we can use W as a general criterion for deciding whether a reaction is spontaneous or not. Movement from a less probable to a more probable molecular situation corresponds to movement from a state in which W is smaller to a state where W is larger. In other words W increases for a spontaneous change. If we can find some way of calculating or measuring the initial and final values of W, the problem of deciding in advance whether a reaction will be spontaneous or not is solved. If W2 is greater than W1, then the reaction will occur of its own accord. Although there is nothing wrong in principle with this approach to spontaneous processes, in practice it turns out to be very cumbersome. For real samples of matter (as opposed to 200 atoms in the example of Fig. 2) the values of W are on the order of 10^(10²⁴), so large that they are difficult to manipulate. The logarithm of W, however, is only on the order of 10²⁴, since log 10^x = x. This is more manageable, and chemists and physicists use a quantity called the entropy S, which is proportional to the logarithm of W.
This way of handling the extremely large thermodynamic probabilities encountered in real systems was first suggested in 1877 by the Austrian physicist Ludwig Boltzmann (1844 to 1906). The equation
S = k ln W (1)
is now engraved on Boltzmann’s tomb. The proportionality constant k is called, appropriately enough, the Boltzmann constant. It corresponds to the gas constant R divided by the Avogadro constant NA:
k = R/NA (2)
and we can regard it as the gas constant per molecule rather than per mole. In SI units, the Boltzmann constant k has the value 1.3805 × 10⁻²³ J K⁻¹. The symbol ln in Eq. (1) indicates a natural logarithm, i.e., a logarithm taken to the base e. Since base 10 logarithms and base e logarithms are related by the formula
ln x = 2.303 log x
it is easy to convert from one to the other. Equation (1), expressed in base 10 logarithms, thus becomes
S = 2.303k log W (1a)
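The relation between k and R can be checked numerically. The sketch below uses the common rounded values R = 8.314 J K⁻¹ mol⁻¹ and NA = 6.022 × 10²³ mol⁻¹ (these figures are supplied here, not quoted from the text):

```python
R = 8.314        # gas constant, J K^-1 mol^-1
N_A = 6.022e23   # Avogadro constant, mol^-1

k = R / N_A      # Boltzmann constant: the gas constant per molecule
print(k)         # ~ 1.38e-23 J K^-1, matching the value quoted in the text
```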
EXAMPLE 1 The thermodynamic probability W for 1 mol propane gas at 500 K and 101.3 kPa has the value 10^(10²⁵). Calculate the entropy of the gas under these conditions.
Solution Since
W = 10^(10²⁵)

log W = 10²⁵

Thus S = 2.303k log W = 1.3805 × 10⁻²³ J K⁻¹ × 2.303 × 10²⁵ = 318 J K⁻¹
Note: The quantity 318 J K⁻¹ is obviously much easier to handle than 10^(10²⁵).
Note also that the dimensions of entropy are energy/temperature.
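Example 1's arithmetic works with log W directly, because W itself is far too large for any floating-point representation. Eq. (1a) in code form, a sketch using the rounded constants from the text:

```python
k = 1.3805e-23   # Boltzmann constant, J K^-1
log10_W = 1e25   # log10 of W = 10^(10^25); W itself cannot be stored directly

S = 2.303 * k * log10_W   # Eq. (1a): S = 2.303 k log W
print(S)                  # ~ 318 J K^-1
```

Note that computing W first and then taking its logarithm would overflow immediately; keeping everything in log form is what makes entropy a practical quantity.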
One of the properties of logarithms is that if we increase a number, we also increase the value of its logarithm. It follows therefore that if the thermodynamic probability W of a system increases, its entropy S must increase too. Further, since W always increases in a spontaneous change, it follows that S must also increase in such a change.
The statement that the entropy increases when a spontaneous change occurs is called the second law of thermodynamics. (The first law is the law of conservation of energy.) The second law, as it is usually called, is one of the most fundamental and most widely used of scientific laws. In this book we shall only be able to explore some of its chemical implications, but it is of importance also in the fields of physics, engineering, astronomy, and biology. Almost all environmental problems involve the second law. Whenever pollution increases, for instance, we can be sure that the entropy is increasing along with it.
The second law is often stated in terms of an entropy difference ΔS. If the entropy increases from an initial value of S1 to a final value of S2 as the result of a spontaneous change, then
ΔS = S2 – S1 (3)
Since S2 is larger than S1, we can write
ΔS > 0 (4)
Equation (4) tells us that for any spontaneous process, ΔS is greater than zero. As an example of this relationship and of the possibility of calculating an entropy change, let us find ΔS for the case of 1 mol of gas expanding into a vacuum. We have already argued for this process that the final state is 10^(1.813 × 10²³) times more probable than the initial state. This can only be because there are 10^(1.813 × 10²³) times more ways of achieving the final state than the initial state. In other words, taking logs, we have
log (W2/W1) = 1.813 × 10²³

Thus

ΔS = S2 – S1 = 2.303 × k × log W2 – 2.303 × k × log W1

= 2.303 × k × log (W2/W1)

= 2.303 × 1.3805 × 10⁻²³ J K⁻¹ × 1.813 × 10²³

ΔS = 5.76 J K⁻¹
As entropy changes go, this increase in entropy is quite small. Nevertheless, it corresponds to a gargantuan change in probabilities.
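The arithmetic above can be cross-checked. The number 1.813 × 10²³ is NA log 2, so the expansion doubles the volume available to the gas, and the same ΔS should come out of the familiar ideal-gas expression ΔS = nR ln 2 for n = 1 mol (a check added here, not made in the text):

```python
from math import log

k = 1.3805e-23   # Boltzmann constant, J K^-1
R = 8.314        # gas constant, J K^-1 mol^-1

# Route used in the text: Delta S = 2.303 k log(W2/W1)
delta_S = 2.303 * k * 1.813e23
print(delta_S)        # ~ 5.76 J K^-1

# Independent check: Delta S = R ln 2 for doubling the volume of 1 mol
print(R * log(2))     # ~ 5.76 J K^-1, agreeing to three significant figures
```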