I read a book in which one chapter discussed the fundamental constants of the Universe, and I remember it stated something like this:
If the mass of the electron, the Planck constant, the speed of light, or the mass of the proton were even slightly different (smaller or larger) from its actual value, then the whole Universe would not exist as we know it. Maybe none of us would exist.
This argument works for every known fundamental constant of the Universe but one: the Boltzmann constant. Its value is well known, but even if it were $10$ times bigger, or exactly $1$, or $45.90$, or $10^6$... well, the Universe would remain the same as it is now. The Boltzmann constant is not really fundamental to the existence of the Universe.
Maybe those weren't the exact words, but that was the concept.
Now I ask: is that true, and why?
Answer
We can understand all of this business if we visit the statistical mechanics notion of temperature, and then connect it to experimental realities.
First we consider the statistical mechanics way of defining temperature. Given a physical system with some degree of freedom $X$, denote the number of possible different states of that system when $X$ takes the value $x$ by the symbol $\Omega(x)$. From statistical considerations we can show that modestly large systems strongly tend to sit in states such that $\Omega(x)$ is maximized. In other words, to find the equilibrium state $x_\text{eq}$ of the system you would write $$ \left. \left( \frac{d\Omega}{dx} \right) \right|_{x_\text{eq}} = 0$$ and solve for $x_\text{eq}$. It's actually more convenient to work with $\ln \Omega$ so we'll do that from now on.
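To make this concrete, here is a small worked example of my own (not part of the original text): take $X$ to be the number $n$ of $N$ distinguishable, non-interacting particles sitting in the left half of a box. Then $$\Omega(n) = \binom{N}{n}, \qquad \frac{d \ln \Omega}{d n} \approx \ln\!\left(\frac{N-n}{n}\right) = 0 \implies n_\text{eq} = \frac{N}{2} \, ,$$ so maximizing $\ln \Omega$ correctly predicts that the gas spreads out evenly, and for large $N$ the peak of $\Omega(n)$ around $n_\text{eq}$ is overwhelmingly sharp.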
Now suppose we add the constraint that the system has a certain amount of energy $E_0$. Denote the energy of the system when $X$ has value $x$ by $E(x)$. In order to find the equilibrium value $x_\text{eq}$, we now have to maximize $\ln \Omega$ with respect to $x$ while keeping the constraint $E(x)=E_0$. The method of Lagrange multipliers is the standard mathematical tool for such problems. One constructs the function $$\mathcal{L}(x, t) \equiv \ln \Omega(x) + t (E_0 - E(x))$$ and extremizes $\mathcal{L}$ with respect to $x$ and $t$. The parameter $t$ is the Lagrange multiplier; note that it has dimensions of inverse energy. The condition $\partial \mathcal{L} / \partial x = 0$ leads to $$\frac{\partial \ln \Omega}{\partial x} = t \, \frac{\partial E}{\partial x} \implies t = \frac{\partial \ln \Omega}{\partial E} \, .$$ Now remember the thermodynamic relation $$\frac{1}{T} = \frac{\partial S}{\partial E} \, .$$ Since the entropy $S$ is defined as $S \equiv k_b \ln \Omega$ we see that the temperature is actually $$T = \frac{1}{k_b t} \, .$$ In other words, the thing we call temperature is just the reciprocal of the Lagrange multiplier which comes from having fixed energy when you try to maximize the entropy of a system, rescaled by a constant $k_b$.
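As a sanity check on the relation $T = 1/(k_b t)$, here is a small numerical sketch of my own (it uses an Einstein solid as the example system; the oscillator count $N$, quanta count $q$, and quantum size `eps` are arbitrary choices, not anything from the original answer). It counts the microstates $\Omega$ directly, estimates $t = \partial \ln \Omega / \partial E$ with a finite difference, and compares $1/(k_b t)$ with the known analytic temperature of that model.

```python
from math import comb, log

# Einstein solid: N oscillators sharing q energy quanta, each of size eps.
# Number of microstates: Omega(q) = C(q + N - 1, q).
k_b = 1.380649e-23   # J/K
eps = 1.0e-21        # J, size of one energy quantum (arbitrary choice)
N = 500              # number of oscillators (arbitrary choice)

def ln_omega(q):
    return log(comb(q + N - 1, q))

q = 2000                                      # total quanta, so E = q * eps
t = (ln_omega(q + 1) - ln_omega(q)) / eps     # t = d(ln Omega)/dE, finite difference

T_from_multiplier = 1.0 / (k_b * t)           # T = 1 / (k_b * t)
T_analytic = eps / (k_b * log(1.0 + N / q))   # standard Einstein-solid result

print(T_from_multiplier, T_analytic)          # both give roughly 325 K here
```

The point of the sketch is only that the Lagrange-multiplier recipe produces a sensible temperature without ever mentioning thermometers; $k_b$ enters only at the last step, to convert the result into kelvin.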
If not for the $k_b$, temperature would have dimensions of energy! You can see from the discussion above that $k_b$ is just an extra constant that doesn't need to be there. Entropy could have been defined as a dimensionless quantity, i.e. $S \equiv \ln \Omega$ without the $k_b$, and everything would be fine. You'll notice in calculations that $k_b$ and $T$ almost always show up together; that's no accident, and it's basically because, as we said, $k_b$ is just a dummy factor which converts between temperature and energy.
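A quick numerical illustration of that pairing (standard textbook numbers, not from the original answer): at room temperature, $T \approx 300\ \mathrm{K}$, the combination $k_b T \approx 4.1 \times 10^{-21}\ \mathrm{J} \approx 1/40\ \mathrm{eV}$, and it is this energy, not $T$ by itself, that shows up in expressions like the Boltzmann factor $e^{-E/k_b T}$.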
Folks figured out thermodynamics before statistical mechanics. In particular, we had thermometers. People measured the "hotness" of stuff by looking at the height of a liquid in a thermometer. That height was the definition of temperature; it had no relation to energy. Entropy was defined as heat transfer divided by temperature. Therefore, entropy has dimensions of $[\text{energy}] / [\text{temperature}]$.$^{[a]}$
We measured the temperatures $T$, pressures $P$, volumes $V$, and numbers of particles $N$ of some gases and found that they always obeyed the ideal gas law$^{[b,c]}$
$$P V = N k_b T \, .$$
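For a feel of the numbers, here is a tiny sketch of my own (the room conditions are chosen just for illustration) that uses the law to count the molecules in a litre of gas:

```python
# Count the molecules in one litre of an ideal gas at room conditions
# using P V = N k_b T  =>  N = P V / (k_b T).
k_b = 1.380649e-23   # J/K
P = 101325.0         # Pa, about 1 atm
V = 1.0e-3           # m^3, one litre
T = 300.0            # K

N = P * V / (k_b * T)
print(f"{N:.3e}")    # roughly 2.4e22 molecules
```

Note that, once again, only the combination $k_b T$ enters the formula.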
This law was known from experiment long before Boltzmann realized that entropy is actually proportional to the logarithm of the number of available microstates, a dimensionless quantity. However, since entropy was already defined and already carried those funny temperature dimensions, he had to inject a dimensionful constant for "backwards compatibility". He was the first to write $$ S = k_b \ln \Omega$$ and this equation is so important that it is engraved on his tombstone.
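As a tiny worked example of that formula (my own illustration, not Boltzmann's): a system of $N$ independent two-state units (say, $N$ coins that can each show heads or tails) has $\Omega = 2^N$ microstates, so $$S = k_b \ln 2^N = N k_b \ln 2 \, ,$$ and the only role of $k_b$ is to dress the dimensionless count $N \ln 2$ in the historical units of $[\text{energy}]/[\text{temperature}]$.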
In practice, it is actually rather difficult to measure temperature and energy in the same system over many orders of magnitude. I think that it's for this reason that we still have independent temperature and energy standards and units.
Boltzmann's constant is just a conversion factor between energy and a made-up dimension we call "temperature". Logically, temperature should have dimensions of energy, and Boltzmann's constant is just a dummy that converts between the two for historical reasons. It carries no physical meaning whatsoever. Note that the value of $k_b$ isn't the real issue; the values of constants depend on the unit system you use. The important point is that, unlike the speed of light or the mass of the proton, $k_b$ doesn't refer to any unit-independent physical thing in Nature.
Temperature is (the reciprocal of) the Lagrange multiplier that comes from imposing fixed energy on the problem of maximizing entropy. As such, it logically has dimensions of energy.
Boltzmann's constant $k_b$ only exists because people defined temperature and entropy before they understood statistical mechanics.
You will always see $k_b$ and $T$ together because the only logically relevant parameter is $k_b T$, which has dimensions of energy.
$[a]$: Note that if temperature had dimensions of energy then under this definition entropy would have been dimensionless (as it "should" be).
$[b]$: Actually, this law was originally written as $PV = n R T$ where $n$ is the number of moles of the substance and $R$ is the ideal gas constant. That's not really important, though, because you can absorb Avogadro's number $N_A$ into $R$ to get $k_b = R / N_A$. $R$ and $k_b$ have equivalent "status".
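(Numerically, with the standard values $R \approx 8.314\ \mathrm{J\,mol^{-1}\,K^{-1}}$ and $N_A \approx 6.022 \times 10^{23}\ \mathrm{mol^{-1}}$, this gives $k_b = R/N_A \approx 1.381 \times 10^{-23}\ \mathrm{J/K}$, the familiar value.)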
$[c]$: Note again how $k_b$ and $T$ show up together.