A.35 The third law

In the simplest formulation, the third law of thermodynamics says that the entropy at absolute zero temperature is zero.

The original theorem is due to Nernst. A more recent formulation is

“The contribution to the entropy of a system due to each component that is in internal equilibrium disappears at absolute zero.” [D. Ter Haar (1966) Elements of Thermostatistics. Holt, Rinehart & Winston.]
A more readable version is
“The entropy of every chemically simple, perfectly crystalline, body equals zero at the absolute zero of temperature.” [G.H. Wannier (1966) Statistical Physics. Wiley.]
These formulations allow for the existence of metastable equilibria. The third law in its simple form assumes that, strictly speaking, the ground state is unique and that the system is in true thermal equilibrium. Experimentally, however, many substances do not appear to approach zero entropy; random mixtures and ice are examples. They may not be in true equilibrium, but if true equilibrium is never observed, the distinction is academic.

The zero of entropy is important for mixtures, in which you need to add the entropies of the components together correctly. It also has implications for the behavior of various quantities at low temperatures. For example, it implies that the specific heats become zero at absolute zero. To see why, note that in a constant volume or constant pressure process the entropy changes are given by

\begin{displaymath}
\int \frac{C}{T}{ \rm d}T
\end{displaymath}

If the specific heat $C$ did not become zero at $T=0$, this integral would give an infinite entropy at that temperature instead of zero.
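This divergence is easy to check numerically. The sketch below is an illustration, not part of the original text; the units, the cutoff temperatures, and the logarithmic integration grid are arbitrary choices. It integrates $C/T$ down to a small cutoff temperature: a specific heat that stays constant makes the entropy change grow without bound as the cutoff shrinks, while a specific heat proportional to $T$ gives a finite result.

```python
import math

def entropy_change(C, eps, T0, n=100_000):
    """Numerically integrate dS = C(T)/T dT from eps up to T0 (trapezoid rule)."""
    # A logarithmic grid resolves the behavior near T = 0.
    ts = [eps * (T0 / eps) ** (k / n) for k in range(n + 1)]
    s = 0.0
    for t1, t2 in zip(ts, ts[1:]):
        s += 0.5 * (C(t1) / t1 + C(t2) / t2) * (t2 - t1)
    return s

T0 = 1.0
for eps in (1e-2, 1e-4, 1e-6):
    # Constant C does not vanish at T = 0; C = T does.
    dS_const = entropy_change(lambda T: 1.0, eps, T0)
    dS_linear = entropy_change(lambda T: T, eps, T0)
    print(f"eps={eps:.0e}: constant C gives {dS_const:8.3f}, linear C gives {dS_linear:.3f}")
```

For constant $C$ the result grows like $\ln(T_0/\epsilon)$, so it has no finite limit as the cutoff $\epsilon$ goes to zero; for $C$ proportional to $T$ it approaches the finite value $T_0$.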

Another consequence of the third law is that it is not possible to bring a system to absolute zero temperature completely even in ideal processes. That seems pretty self-evident from a classical point of view, but it is not so obvious in quantum terms. The third law also implies that isothermal processes become isentropic when absolute zero temperature is approached.

It may seem that the third law is a direct consequence of the quantum expression for the entropy,

\begin{displaymath}
S = - k_{\rm B}\sum P_q \ln(P_q)
\end{displaymath}

At absolute zero temperature, the system is in the ground state. Assuming that the ground state is not degenerate, there is then only one nonzero probability $P_q=1$, and for that probability $\ln(P_q)$ is zero. So the entropy is zero.
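The argument can be illustrated with a small computation. The sketch below is illustrative only: the energy levels are made up, and units with $k_{\rm B}=1$ are assumed. It evaluates $S=-k_{\rm B}\sum P_q\ln(P_q)$ for canonical probabilities over a nondegenerate ground state; as the temperature drops, all probability concentrates in the ground state and the entropy goes to zero.

```python
import math

def gibbs_entropy(probs, kB=1.0):
    """S = -kB * sum p ln p, with the convention that 0 ln 0 = 0."""
    return -kB * sum(p * math.log(p) for p in probs if p > 0.0)

def boltzmann_probs(energies, T, kB=1.0):
    """Canonical probabilities P_q = exp(-E_q / kB T) / Z."""
    weights = [math.exp(-E / (kB * T)) for E in energies]
    Z = sum(weights)
    return [w / Z for w in weights]

# A nondegenerate ground state at E = 0 plus a few excited levels (made-up values).
energies = [0.0, 1.0, 2.0, 3.0]
for T in (2.0, 0.5, 0.1, 0.02):
    S = gibbs_entropy(boltzmann_probs(energies, T))
    print(f"T = {T:5.2f}:  S/kB = {S:.6f}")
```

At the lowest temperature shown, the ground state probability is essentially 1 and the entropy is zero to machine precision.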

Even if the ground state is not unique, often it does not make much of a difference. For example, consider the case of a system of $I$ noninteracting spin 1 bosons in a box. If you could really ignore the effect of all particle interactions on the energy, the $I$ spin states would be arbitrary in the ground state. But even then there would be only about $\frac12I^2$ different system states with the ground state energy, chapter 5.7. That produces an entropy of only about $-k_{\rm B}\ln(2/I^2)$. It would make the specific entropy proportional to $\ln(I)/I$, which is zero for a large-enough system.

On the other hand, if you ignore electromagnetic spin couplings of nuclei in a crystal, it becomes a different matter. Since the nuclear wave functions have no measurable overlap, to any conceivable accuracy the nuclei can assume independent spatial states. That gets rid of the (anti)symmetrization restrictions on their spin. And then the associated entropy can be nonzero. But of course, if the nuclear spin does not interact with anything, you may be able to ignore its existence altogether.

Even if a system has a unique ground state, the third law is not as trivial as it may seem. Thermodynamics deals not with finite systems but with idealized systems of infinite size. A very simple example illustrates why it makes a difference. Consider the possibility of a hypothetical system whose specific entropy depends on the number of particles $I$, temperature $T$, and pressure $P$ as

\begin{displaymath}
s_{\rm h.s.}(I,T,P) = \frac{IT}{1+IT}
\end{displaymath}

This system is consistent with the formulations of the third law given above: for a given system size $I$, the entropy becomes zero at zero temperature. However, the idealized infinite system always has entropy 1; its entropy does not go to zero at zero temperature. The third law should be understood to say that this hypothetical system does not exist.

If infinite systems seem unphysical, translate the idea into real-life terms. Suppose your test tube has, say, $I=10^{20}$ particles of the hypothetical system in it instead of infinitely many. Then to reduce the specific entropy from 1 to 0.5 would require the temperature to be reduced to a completely impossible $10^{-20}$ K. And if you double the number of particles in the test tube, you would need another factor two reduction in temperature. In short, while formally the entropy for the finite hypothetical system goes to zero at absolute zero, the temperatures required to do so have no actual meaning.
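The arithmetic can be sketched as follows; the function simply inverts the hypothetical entropy expression $s=IT/(1+IT)$ above, giving $T=s/\bigl((1-s)I\bigr)$, and the numbers are illustrative only.

```python
def temperature_for_entropy(s, I):
    """Invert s = I*T / (1 + I*T) for the hypothetical system: T = s / ((1 - s) * I)."""
    return s / ((1.0 - s) * I)

I = 10**20
# Temperature needed to cut the specific entropy from 1 to 0.5:
T_half = temperature_for_entropy(0.5, I)
print(f"I = {I:.0e}: reaching s = 0.5 needs T = {T_half:.1e} K")
# Doubling the particle count halves the required temperature again:
print(f"I = {2 * I:.0e}: reaching s = 0.5 needs T = {temperature_for_entropy(0.5, 2 * I):.1e} K")
```

For $s=0.5$ the required temperature is simply $1/I$, which for a macroscopic particle count is far below anything achievable.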