I'm trying to understand Planck's law for blackbody radiation. It states that a black body at a given temperature emits with maximum intensity at a certain wavelength, and that the intensity drops steeply for shorter wavelengths. In contrast, the classical theory predicted that the intensity would keep increasing without bound at shorter wavelengths.
I'm trying to understand the reason behind this law, and I suspect it has to do with the vibrations of the atoms of the black body and the energy they can emit in the form of photons.
Could you explain the reason in qualitative terms?
Answer
The Planck distribution has a more general interpretation: it gives the statistical distribution of non-conserved bosons (e.g. photons and phonons). That is, it is the Bose-Einstein distribution with zero chemical potential.
With this in mind, note that, in general, in thermal equilibrium without particle-number conservation, the number of particles $n(E)$ occupying states with energy $E$ is proportional to a Boltzmann factor. To be precise: $$ n(E) = \frac{g(E) e^{-\beta E}}{Z} $$ Here $g(E)$ is the number of states with energy $E$, $\beta = \frac{1}{kT}$ where $k$ is the Boltzmann constant, and $Z$ is the partition function (i.e. a normalization factor).
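As a rough numerical illustration (not in the original answer): for a visible photon with $h\nu \approx 2.5\ \text{eV}$ at $T = 300\ \text{K}$, where $kT \approx 0.026\ \text{eV}$, the Boltzmann factor is $$ e^{-\beta E} = e^{-h\nu/kT} \approx e^{-96} \sim 10^{-42}, $$ so high-energy states are exponentially suppressed, provided $g(E)$ does not grow fast enough to compensate.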
The classical result for $n(E)$ or equivalently $n(\lambda)$ diverges despite the exponential decrease of the Boltzmann factor because $g(E)$ grows unrealistically when the quantization of energy levels is not accounted for. This is the so-called ultraviolet catastrophe.
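To see the classical divergence in its textbook (Rayleigh-Jeans) form, which the answer above does not spell out: the number of electromagnetic modes per unit volume with frequency between $\nu$ and $\nu + d\nu$ is $\frac{8\pi\nu^2}{c^3}\,d\nu$, and classical equipartition assigns an average energy $kT$ to every mode, so the spectral energy density is $$ u(\nu)\,d\nu = \frac{8\pi\nu^2}{c^3}\,kT\,d\nu, $$ which grows without bound as $\nu \to \infty$ and gives an infinite total energy when integrated over all frequencies.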
When the energy of the photons is assumed to be quantized, so that $E = h\nu$, the degeneracy $g(E)$ no longer outstrips the Boltzmann factor $e^{-\beta E}$, and $n(E) \longrightarrow 0$ as $E \longrightarrow \infty$, as it should. This result is of course due to Planck, hence the name of the distribution. It is straightforward to work this out explicitly for photons in a closed box or with periodic boundary conditions (e.g. see Thermal Physics by Kittel).
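To make the quantization step concrete (this is the standard single-mode calculation, using the same Boltzmann factors as above): for one mode of frequency $\nu$ whose allowed energies are $E_n = n h\nu$ with $n = 0, 1, 2, \dots$, the thermal average occupation is $$ \langle n \rangle = \frac{\sum_{n=0}^{\infty} n\, e^{-n\beta h\nu}}{\sum_{n=0}^{\infty} e^{-n\beta h\nu}} = \frac{1}{e^{\beta h\nu} - 1}, $$ which is exactly the Planck distribution. Multiplying by the mode density $\frac{8\pi\nu^2}{c^3}$ and the energy $h\nu$ per photon gives the spectral energy density $$ u(\nu) = \frac{8\pi h \nu^3}{c^3}\,\frac{1}{e^{h\nu/kT} - 1}, $$ which falls off exponentially at high frequency instead of diverging.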
I hope this was not too technical. To summarize, the fundamental problem in the classical theory is that the number of accessible states at high energies (short wavelengths) is unrealistically large because the energy levels of a "classical photon" are not quantized. Without this quantization, the divergence of $n(E)$ (equivalently, of $n(\lambda)$) would imply that the energy density of a box of photons is infinite at thermal equilibrium. This is of course nonsensical.
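Not part of the original answer, but here is a short Python sketch comparing Planck's spectral radiance with the classical Rayleigh-Jeans expression at an illustrative temperature of 5000 K (the temperature, the sample wavelengths, and the function names are just example choices):

```python
import math

# Physical constants in SI units
h = 6.62607015e-34   # Planck constant (J s)
c = 2.99792458e8     # speed of light (m/s)
kB = 1.380649e-23    # Boltzmann constant (J/K)

def planck(lam, T):
    """Planck spectral radiance B_lambda in W sr^-1 m^-3."""
    return (2.0 * h * c**2 / lam**5) / math.expm1(h * c / (lam * kB * T))

def rayleigh_jeans(lam, T):
    """Classical Rayleigh-Jeans spectral radiance in W sr^-1 m^-3."""
    return 2.0 * c * kB * T / lam**4

T = 5000.0  # illustrative temperature in kelvin
for lam_nm in (100, 300, 580, 1000, 3000, 10000):
    lam = lam_nm * 1e-9  # convert nm to m
    print(f"{lam_nm:6d} nm   Planck: {planck(lam, T):10.3e}   "
          f"Rayleigh-Jeans: {rayleigh_jeans(lam, T):10.3e}")
```

At long wavelengths the two expressions roughly agree, while at short wavelengths the Rayleigh-Jeans value keeps growing and the Planck value turns over and falls off, which is the qualitative difference described above.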