Measuring the Entropy

Submitted by ChemPRIME Staff on Thu, 12/16/2010 - 15:22

Up to this point we have often quoted values of the entropy without giving any indication of how such values may be obtained. Alas, there is no convenient black box labeled entropy meter into which we can put a substance and read off its entropy value on a dial. Determining the entropy turns out to be both difficult and laborious. In the case of a simple gas, if we know enough about its molecular structure and enough quantum mechanics, we can actually calculate its entropy. For most substances, though, we are forced to derive the entropy from a series of calorimetric measurements, most of them at very low temperatures.

This method for determining the entropy centers around a very simple relationship between q, the heat energy absorbed by a body, the temperature T at which this absorption takes place, and ΔS, the resultant increase in entropy:

\Delta S=\frac{q}{T}      (1)

It is possible to derive this relationship from our original definition of entropy, namely, S = k ln W, but the proof is beyond the level of this text.

It is easy to see how Eq. (1) can be used to measure the entropy. We start with our substance as close to absolute zero as is technically feasible and heat it in many stages, measuring the heat absorbed at each stage, until we arrive at the desired temperature, say 298 K. The initial value of the entropy is zero, and we can calculate the entropy increase for each stage by means of Eq. (1); the sum of all these increases is then the entropy value at 298 K. In the case of simple gases, values of entropy measured in this way agree very well with those calculated from a knowledge of molecular structure.
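The stepwise summation described above can be sketched in a few lines of code. This is a minimal illustration only: the (q, T) pairs below are hypothetical values invented for the sketch, not real calorimetric data.

```python
# Stepwise entropy determination: starting from S = 0 at absolute zero,
# sum the increase q_i / T_i (Eq. 1) over many small heating stages.
# The (q, T) pairs are hypothetical illustrative values, not measurements.
stages = [
    (0.5, 5.0),     # heat q absorbed in joules, at mean temperature T in kelvin
    (2.0, 20.0),
    (10.0, 100.0),
    (30.0, 298.0),
]

entropy = 0.0  # third-law starting point: S = 0 at absolute zero
for q, T in stages:
    entropy += q / T  # Eq. (1) applied to each small stage

print(f"S(298 K) ≈ {entropy:.3f} J/K")  # → S(298 K) ≈ 0.401 J/K
```

In a real measurement the stages would be made very small, so that each q/T ratio is evaluated at an essentially constant temperature.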

Equation (1) was discovered long before the statistical nature of entropy was realized. Scientists and engineers began to appreciate the importance of the quantity q/T very early in the nineteenth century because of its connection with the efficiency of steam engines. These arguments were developed by both Lord Kelvin in England and Rudolf Clausius (1822 to 1888) in Germany. It was Clausius who first formulated the second law in terms of the entropy S, but Clausius had only a vague idea that entropy was in any way connected with molecules or probability. The statistical nature of entropy was first suggested by Boltzmann in 1877 and then developed into an elegant system in 1902 by Josiah Willard Gibbs (1839 to 1903), one of the real giants among American scientists.

An important feature of Eq. (1) is the inverse relationship between the entropy increase and the temperature. A given quantity of heat energy produces a very large change of entropy when absorbed at a very low temperature but only a small change when absorbed at a high temperature.

EXAMPLE 1 Calculate the increase in entropy which a substance undergoes when it absorbs 1 kJ of heat energy at the following temperatures: (a) 3 K; (b) 300 K; (c) 3000 K.

Solution

a) At 3 K we have

            \Delta S=\frac{1000\text{ J}}{3\text{ K}}=333.3\text{ J K}^{-1}

b) At 300 K, similarly,

            \Delta S=\frac{1000\text{ J}}{300\text{ K}}=3.33\text{ J K}^{-1}

c) At 3000 K, likewise,

            \Delta S=\frac{1000\text{ J}}{3000\text{ K}}=0.33\text{ J K}^{-1}
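The three results above can be reproduced with a one-line application of Eq. (1) at each temperature; a minimal sketch:

```python
# Delta S = q / T (Eq. 1) for 1 kJ of heat absorbed at three temperatures,
# showing the inverse dependence of the entropy increase on T.
q = 1000.0  # heat absorbed, in joules
for T in (3.0, 300.0, 3000.0):
    print(f"At {T:g} K: ΔS = {q / T:.2f} J/K")
# → At 3 K: ΔS = 333.33 J/K
# → At 300 K: ΔS = 3.33 J/K
# → At 3000 K: ΔS = 0.33 J/K
```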

An amusing analogy to this behavior can be drawn from everyday life. If a 10-year-old boy is allowed to play in his bedroom for half an hour, the increase in disorder is scarcely noticeable because the room is already disordered (i.e., at a higher “temperature”). By contrast, if the boy is let loose for half an hour in a neat and tidy living room (i.e., at a lower “temperature”), the effect is much more dramatic.