Why does entropy increase toward a maximum in a thermally insulated system?
Two iron blocks of equal size are to be heated. The initial temperature is 100 K. One is heated to 300 K, the other to 500 K.
That is approximately 27 °C and 227 °C, respectively.
The entropy change of a system is defined as
dS = δQ/T, where δQ is the heat exchanged through the surface and T is the absolute temperature.
The heat capacity is assumed to be constant over this range. For simplicity, the product of mass and specific heat capacity is set to 1, and temperatures are measured in units of 100 K (an overall scaling factor of 1/100), so the integrals come out as pure logarithms.
The entropy change is then obtained by integrating from 100 K up to 300 K or 500 K.
The increases are thus ΔS = ∫ mc·dT/T. The antiderivative of 1/T is ln T, and because ln 1 = 0, the entropies lie ln 3 and ln 5 above the entropy at 100 K.
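A minimal numeric check of these values (a sketch; the choice mc = 1 and the temperatures are the assumptions stated above):

```python
import math

def delta_s(t_initial, t_final, mc=1.0):
    """Entropy change for heating at constant heat capacity:
    Delta S = m*c * ln(T_final / T_initial)."""
    return mc * math.log(t_final / t_initial)

print(delta_s(100.0, 300.0))  # ln 3 ~ 1.0986
print(delta_s(100.0, 500.0))  # ln 5 ~ 1.6094
```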
Then both blocks are brought into thermal contact with each other and completely insulated from the outside.
Because the blocks are identical, the mixing temperature is the arithmetic mean of the two: (300 K + 500 K)/2 = 400 K.
Because entropy is a state variable, the final entropy is the same as if both blocks had been heated directly from 100 K to 400 K, giving 2·ln 4.
The logarithm, however, is a concave function: its graph always lies above any chord connecting two of its points, and the mean value lies on the chord. Numerically, ln 3 + ln 5 = ln 15 < ln 16 = 2·ln 4. During temperature equalization the entropy increased, without any heat input.
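The concavity argument can be checked directly (same unit assumptions as above, with mc = 1):

```python
import math

mc = 1.0  # assumed product of mass and heat capacity

# Entropies relative to the 100 K reference state.
s_before = mc * math.log(300 / 100) + mc * math.log(500 / 100)  # ln 3 + ln 5 = ln 15
s_after = 2 * mc * math.log(400 / 100)                          # 2 ln 4 = ln 16

print(f"before equalization: {s_before:.4f}")            # ~2.7081
print(f"after equalization:  {s_after:.4f}")             # ~2.7726
print(f"entropy produced:    {s_after - s_before:.4f}")  # ln(16/15) > 0
```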
Simpler explanation:
At lower temperatures, a given amount of heat exchanged with the environment causes a greater entropy change than at higher temperatures, because of the factor 1/T.
The warmer block loses less entropy during temperature equalization than the colder one gains.
Thus, the overall entropy of the system increases.
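The same conclusion follows block by block. A short sketch (again assuming mc = 1) computes each block's entropy change during equalization from 500 K and 300 K to the common 400 K:

```python
import math

mc = 1.0  # assumed product of mass and heat capacity

ds_hot = mc * math.log(400 / 500)   # negative: the warm block loses entropy
ds_cold = mc * math.log(400 / 300)  # positive: the cold block gains entropy

print(f"hot block:  {ds_hot:+.4f}")            # ~ -0.2231
print(f"cold block: {ds_cold:+.4f}")           # ~ +0.2877
print(f"net change: {ds_hot + ds_cold:+.4f}")  # ~ +0.0645 = ln(16/15)
```

The cold block's gain exceeds the warm block's loss, so the total entropy rises by ln(16/15).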
Note:
Entropy does not reach its maximum when an activation-energy barrier acts as a wall.
Sometimes, however, a spark is enough to break through that wall.