In the Debye model the specific heat capacity goes as $T^3$ at low temperatures (in the Einstein model it falls off exponentially instead), and both approach the Dulong-Petit law at higher temperatures. I understand that some molecular motions (rotation, vibration, etc.) are quantized and thus require a certain amount of energy in order to become excited, and that this is the primary reason why the heat capacity increases as the temperature increases (the energy is distributed over more degrees of freedom).
What I don't understand physically is why the heat capacity goes to zero as the temperature goes to zero. There is still zero-point motion, and translational degrees of freedom are not quantized, so surely there should still be some "contribution" from these. This leads me to a second question: if the energy required to excite the vibrational or rotational modes of the molecules is quantized, why does the heat capacity not increase in discontinuous "steps" as a function of temperature, rather than being a continuous function?
Answer
Why does the heat capacity not jump at an energy threshold?
To understand why the heat capacity rises continuously rather than step-wise, consider a simple harmonic oscillator with eigenenergies $\epsilon_n = \hbar \omega \left(n + \frac 1 2\right)$. In the following, $k_B = 1$ and $\beta = 1/T$.
The partition function is given by: $$ Z = \sum_{n=0}^\infty e^{-\beta \epsilon_n} = e^{-\frac 1 2 \beta \hbar \omega} \sum_{n=0}^\infty e^{-n\beta\hbar\omega} = \frac{e^{-\frac 1 2 \beta \hbar \omega}}{1 - e^{-\beta\hbar\omega}}. $$
The energy in the system is then given by: $$E = \frac 1 Z \sum_n \epsilon_n e^{-\beta \epsilon_n} = -\partial_\beta \ln(Z) = \frac 1 2 \hbar \omega + \frac{\hbar \omega e^{-\beta\hbar\omega}}{1 - e^{-\beta\hbar\omega}} = \frac 1 2 \hbar \omega + \frac{\hbar\omega}{e^{\beta\hbar\omega} - 1}.$$
The heat capacity is given by $C = \frac{dE}{dT}$: $$C = \partial_T E = (\partial_T \beta) \partial_\beta E = \frac 1 {T^2} \frac{\hbar^2\omega^2e^{-\beta\hbar\omega}}{(1 - e^{-\beta\hbar\omega})^2}.$$ This expression may look complicated, so you could (a) plot it or (b) consider limiting cases (and argue with continuity and monotonicity in between). But one can already see that it is a continuous function of $T$, so there is no discrete jump in the heat capacity.
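If you actually want to do (a), here is a minimal Python sketch (assuming numpy and matplotlib, with $\hbar\omega = k_B = 1$, so $T$ is measured in units of $\hbar\omega/k_B$); it plots the closed-form expression above and, as a sanity check, a numerical derivative of $E(T)$ obtained from a truncated partition sum. Both curves are smooth.

```python
# Minimal sketch: heat capacity of one harmonic oscillator, with hbar*omega = k_B = 1.
import numpy as np
import matplotlib.pyplot as plt

def C_closed(T):
    """Closed-form C = (1/T^2) e^{-1/T} / (1 - e^{-1/T})^2 from above."""
    x = np.exp(-1.0 / T)
    return x / (T**2 * (1.0 - x)**2)

def E_truncated(T, n_max=200):
    """Energy from a truncated sum over the eigenstates eps_n = n + 1/2."""
    eps = np.arange(n_max) + 0.5
    w = np.exp(-eps[:, None] / T[None, :])      # Boltzmann weights
    return (eps[:, None] * w).sum(axis=0) / w.sum(axis=0)

T = np.linspace(0.02, 3.0, 400)
plt.plot(T, C_closed(T), label="closed form")
plt.plot(T, np.gradient(E_truncated(T), T), "--", label="numerical dE/dT")
plt.xlabel("T  [hbar*omega / k_B]")
plt.ylabel("C  [k_B]")
plt.legend()
plt.show()
```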
If $T \ll \hbar\omega$, the exponential $e^{-\beta\hbar\omega}$ becomes very small, so we can neglect it compared to the 1 in the denominator; this gives $$C \approx \frac 1 {T^2} \hbar^2 \omega^2 e^{-\hbar\omega/T},$$ which is exponentially small.
If $T \gg \hbar\omega$, the argument of the exponential is close to zero, so we can Taylor expand the exponential, giving: $$C \approx \frac 1 {T^2} \hbar^2 \omega^2 \frac{1 - \beta\hbar\omega}{\big(1-(1 - \beta\hbar\omega) \big)^2} \approx 1,$$ which coincides with the well-known equipartition result: in the high-temperature limit each oscillatory degree of freedom stores an energy of $T$ (i.e. $k_B T$), so its heat capacity is 1 (i.e. $k_B$).
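For completeness, a quick numerical check of both limits, in the same units ($\hbar\omega = k_B = 1$):

```python
# Numerical check of the two limits, with hbar*omega = k_B = 1.
import numpy as np

def C(T):
    x = np.exp(-1.0 / T)
    return x / (T**2 * (1.0 - x)**2)

for T in (0.05, 0.1, 0.2):                     # low T: compare with (1/T^2) e^{-1/T}
    print(f"T = {T:5.2f}   C = {C(T):.3e}   low-T approx = {np.exp(-1.0/T)/T**2:.3e}")

for T in (5.0, 20.0, 100.0):                   # high T: C should approach 1
    print(f"T = {T:6.1f}   C = {C(T):.6f}")
```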
Why does this hold in general? An excitation with threshold $\Delta E$ contributes a term of the form $\frac 1 Z e^{-\beta \Delta E} \Delta E$ to the energy. This contribution does not drop to zero the moment $T < \Delta E$; rather, it is exponentially suppressed for $T \ll \Delta E$, and the suppression factor $e^{-\beta \Delta E}$ smoothly approaches one as $T$ reaches and surpasses $T \approx \Delta E$.
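As a minimal illustration of this single-gap contribution (essentially a two-level system with levels $0$ and $\Delta E$), the following sketch computes $E = \frac 1 Z e^{-\beta\Delta E}\Delta E$ with $Z = 1 + e^{-\beta\Delta E}$ and its temperature derivative; the crossover around $T \approx \Delta E$ is smooth, with no jump:

```python
# Single-gap contribution: two levels at 0 and Delta_E, with k_B = 1.
import numpy as np

Delta_E = 1.0
T = np.linspace(0.02, 5.0, 500)
w = np.exp(-Delta_E / T)                       # suppression factor e^{-Delta_E/T}
E = Delta_E * w / (1.0 + w)                    # (1/Z) e^{-beta Delta_E} Delta_E
C = np.gradient(E, T)                          # continuous in T, no jump at T = Delta_E

for Ti, Ci in zip(T[::100], C[::100]):
    print(f"T = {Ti:4.2f}   C = {Ci:.3e}")
```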
As a final remark, there are special situations where the heat capacity can jump: Phase transitions.
Why does the heat capacity go to zero?
In the following we will always normalize the ground state energy of the system to 0 for simplicity.
Explicitly, in the case of a solid-state system there is no continuous translation or rotation that can be excited; the only places you can store energy are phonons and electrons. (In metals the electrons actually dominate for $T \to 0$, with a heat capacity $\propto T$.) That phonons lead to the $\propto T^3$ behaviour is simple to show: the lowest-lying phonon branch is acoustic, that is, it has linear dispersion and no gap (this can be seen as a direct consequence of the Goldstone theorem: phonons are the Goldstone modes of the broken translation symmetry). In 3D this means that the density of states has the form $z(\epsilon) \propto \epsilon^2$ for small $\epsilon$, and the heat capacity is then given by \begin{align*} C &= \partial_T \int_0^\infty d\epsilon\, z(\epsilon)\, \epsilon\, n_B(\beta \epsilon) \propto \partial_T \int_0^\infty d\epsilon\, \epsilon^2\, \epsilon\, n_B(\beta \epsilon) \\ &\propto \partial_T \left( T^4 \int_0^\infty dx\, x^3\, n_B(x) \right) \propto T^3, \end{align*} where $n_B$ is the Bose distribution and the significant step was the substitution $x = \beta \epsilon$ (that is, $\epsilon = Tx$). This also shows that for a system without an excitation gap the heat capacity still goes to zero as $T \to 0$, but it is not exponentially small; it follows a power law instead. A similar argument works for the translations of gas molecules as well (they have a spectrum $\epsilon \propto k^2$, so $z(\epsilon) \propto \epsilon^{1/2}$, which gives $C \propto T^{3/2}$). With bosons there are some complications due to Bose-Einstein condensation (which can be fixed by treating the ground state separately), and with fermions you get into trouble when reaching the degenerate Fermi gas regime (there the heat capacity behaves as $\propto T$, since only electrons close to the Fermi level can be excited).
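To see the $T^3$ scaling emerge numerically, here is a small sketch (assuming a gapless density of states $z(\epsilon) \propto \epsilon^2$ with an ad-hoc Debye-like cutoff $\epsilon_D$, and $k_B = 1$); it evaluates $E(T) = \int z(\epsilon)\, \epsilon\, n_B(\beta\epsilon)\, d\epsilon$ on a grid and checks that doubling $T$ multiplies $C$ by roughly $2^3 = 8$ for $T \ll \epsilon_D$:

```python
# Sketch: C(T) from a gapless DOS z(eps) ~ eps^2 with an assumed Debye-like
# cutoff eps_D; k_B = 1.  E(T) = integral of z(eps)*eps*n_B(eps/T) over eps.
import numpy as np

eps_D = 1.0                                    # cutoff energy scale (assumption)
eps = np.linspace(1e-6, eps_D, 20000)
deps = eps[1] - eps[0]
z = eps**2                                     # acoustic-phonon density of states

def energy(T):
    n_B = 1.0 / np.expm1(eps / T)              # Bose distribution
    return np.sum(z * eps * n_B) * deps        # simple Riemann sum

def heat_capacity(T, dT=1e-4):
    return (energy(T + dT) - energy(T - dT)) / (2 * dT)

T_vals = [0.01, 0.02, 0.04, 0.08]
C_vals = [heat_capacity(t) for t in T_vals]
# doubling T should multiply C by roughly 2^3 = 8 in the low-T regime
for t, c_lo, c_hi in zip(T_vals[:-1], C_vals[:-1], C_vals[1:]):
    print(f"C({2*t:.2f}) / C({t:.2f}) = {c_hi / c_lo:.2f}")
```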
As a final remark, the requirement that the heat capacity goes to zero for $T \to 0$ follows from the third law of thermodynamics, so it should not be too surprising that it holds.