I would like to ask if anyone has found a tight enough way to define the term "quantum fluctuation" so that it can become a useful rather than a misleading piece of physics terminology.
Terminology in science serves, I guess, two main purposes: 1. as an aid to understanding as we learn; 2. to enable us to communicate fluently with one another. It is the first that concerns me here.
Let me first deal with the second issue, to get it out of the way. If one expert in quantum theory uses the phrase "quantum fluctuation" in the course of a conversation with another expert, then they probably both have in mind something about the non-zero value of kinetic energy in the ground state of motion, or the non-zero range of values of some observable, or the mathematics of quantum field theory, or something like that. They may be interested in symmetry-breaking, or phase transitions. The terminology "quantum fluctuation" doesn't matter since they both know what they are talking about in much more precise mathematical terms. They won't be misled by the phrase any more than someone working in QCD would be misled by terminology such as "charm" and "colour".
Now let's return to the first issue, which is the heart of my question. Is this phrase well-chosen? Does it help people to get good understanding, or does it tend to mislead? Does it give good physical intuition? Does it mean something other than "quantum spread"? Can we find a sufficiently tight definition of "quantum fluctuation" so that it becomes more helpful than it is misleading?
The following web article illustrates the use of "quantum fluctuation" as an attempt to describe quantum field theory for the non-expert: https://profmattstrassler.com/articles-and-posts/particle-physics-basics/quantum-fluctuations-and-their-energy/ Language such as "jitter" is freely employed. Such attempts show that the term invites the student to form a physical picture of something dynamically moving around.
Countless books and lectures talk freely about "bubbling and rippling" etc. etc.
Here are my reservations:
In so-called "zero-point motion" there is no motion. There is kinetic energy, admittedly, and $\langle \psi | \hat{x}^2 | \psi \rangle \ne 0$, but in my mind's eye I don't see the oscillator "fluctuating" so much as just sitting there, not moving to and fro but simply spread out. It is not like a classical standing wave, where the oscillating string or whatever moves up and down, and it is not like a thing moving to and fro in a potential well. The electron in the ground state of hydrogen is not fluctuating.
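For concreteness, take the harmonic-oscillator ground state (standard textbook results, quoted here only to sharpen the point):
$$ \psi_0(x,t) = \left(\frac{m\omega}{\pi\hbar}\right)^{1/4} e^{-m\omega x^2/2\hbar}\, e^{-i\omega t/2}, \qquad \langle \hat{x} \rangle = 0, \qquad \langle \hat{x}^2 \rangle = \frac{\hbar}{2m\omega}. $$
The probability density $|\psi_0(x,t)|^2$ is independent of $t$: there is spread, and kinetic energy, but nothing changes in time.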
In ordinary English the word "fluctuation" refers to something dynamic, involving change as a function of time. But in the case of a time-independent Hamiltonian, there is no dynamic change for a system in an energy eigenstate, except for the unobservable global phase. This assertion applies as much to field states in quantum field theory as it does to non-relativistic quantum mechanics. So where are these famous "vacuum fluctuations"? Note again, my question does not concern correct mathematical treatment of field theory. It concerns whether the term "fluctuation" is well-chosen to give good physical understanding.
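To spell the point out: if $\hat{H} |\psi\rangle = E |\psi\rangle$, then $|\psi(t)\rangle = e^{-iEt/\hbar} |\psi\rangle$, and for any observable $\hat{A}$
$$ \langle \psi(t) | \hat{A} | \psi(t) \rangle = \langle \psi | e^{iEt/\hbar}\, \hat{A}\, e^{-iEt/\hbar} | \psi \rangle = \langle \psi | \hat{A} | \psi \rangle, $$
so every moment of every observable, and hence every "spread", is strictly constant in time.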
The virtual particles that appear in Feynman diagrams are not fluctuations; they are terms in a series of integrals whose sum describes a smooth and non-fluctuating scattering process.
Classical phase transitions may be said to be "driven" by thermal fluctuations. That's fine; it is an example of a dynamic change, a cause and effect. But if someone says that a quantum phase transition is "driven" or "caused" by quantum fluctuation (and I think people do say that), then what exactly are they saying?
Spontaneous emission by atoms manifests the coupling between the atom and the electromagnetic field in its vacuum state, and demonstrates that the field is not absent nor without physical effect. Since there is a stochastic nature to the clicks if the emitted photons are detected by ordinary detectors, one might loosely ascribe the randomness in the timing of the detection events to a randomness in the behaviour of the field in its vacuum state. Is this perhaps what "quantum fluctuation" means?
I do want an answer if there is one; I don't intend merely to generate discussion. But I think my question may be hard to answer because it touches on unresolved interpretational issues to do with the quantum measurement problem, and on the physics of symmetry breaking.
Related questions.
This question, "Quantum Fluctuations of the vacuum", gives a link to a video showing a computer simulation of vacuum fluctuations in a lecture by David Tong. What exactly is being plotted in that simulation?
This question, "Understanding quantum fluctuations", is similar to mine but different enough to warrant my new one.
I found this question and its discussion helpful:
What is spontaneous symmetry breaking in QUANTUM systems?
Answer
I understand your concern. I believe that the reason for this terminology has to be understood historically: it is meant to denote something different from classical (thermal) fluctuations. Once one remembers this, I think the term achieves its purpose (i.e., your point 1).
The one thing that one has to realize is that there are no "fluctuations" classically at zero temperature. Consider a classical spin model on a graph: $\sigma$ is a configuration of spins ($0$ or $1$) on this graph, and the classical Hamiltonian is a function of $\sigma$, $E(\sigma)$. At zero temperature the system is in the state of minimum energy; call it $\sigma_0$. In other words, we can say the system is in the state $\rho$ with
$$ \rho_{\sigma,\sigma'}= \delta_{\sigma,\sigma_0} \delta_{\sigma',\sigma_0}. $$
(I'm using a notation that is also valid quantum mechanically.) Clearly the state is diagonal (it's classical), but it's also pure, or in other words an extremal (a Kronecker delta function). The state is "frozen" in the configuration $\sigma_0$. Intuitively there are no "fluctuations", i.e., no other configurations contributing to the state. How do we measure this?
Any classical observable $A$ is also diagonal in $\sigma$. Computing averages with the above state one has
$$ \Delta A^2 := \langle A^2 \rangle - \langle A \rangle^2 = 0. \qquad (1) $$
Indeed the two facts are equivalent (being an extremal and having zero fluctuations for any observable).
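To make this concrete, here is a minimal numerical sketch (a toy three-spin chain of my own choosing, with spins written as $\pm 1$ rather than $0$/$1$ for convenience):

```python
# Toy check of Eq. (1) for the frozen zero-temperature state:
# three Ising spins on a chain, E(sigma) = -(s1*s2 + s2*s3).
import itertools
import numpy as np

configs = list(itertools.product([1, -1], repeat=3))
E = np.array([-(s[0] * s[1] + s[1] * s[2]) for s in configs], dtype=float)

# rho is concentrated on one minimum-energy configuration sigma_0.
p = np.zeros(len(configs))
p[np.argmin(E)] = 1.0

# A classical observable is any function A(sigma); take total magnetization.
A = np.array([sum(s) for s in configs], dtype=float)

variance = p @ (A ** 2) - (p @ A) ** 2
print(variance)  # 0.0 -- no fluctuations in the frozen state
```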
If we now raise the temperature, at equilibrium the state of the system is
$$ \rho_{\sigma,\sigma'}= \delta_{\sigma,\sigma'} e^{-\beta E(\sigma)}/Z, $$
with $Z$ the partition function. Clearly Eq. (1) will now fail to hold in general, and we can have a phase transition as we raise the temperature. We say (colloquially) that this phase transition is due to thermal fluctuations.
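The same toy chain at finite temperature shows Eq. (1) failing (a sketch under the same illustrative assumptions; $\beta = 1$ is an arbitrary value):

```python
# Same toy chain at T > 0: Gibbs weights spread over configurations,
# and the variance in Eq. (1) becomes nonzero.
import itertools
import numpy as np

configs = list(itertools.product([1, -1], repeat=3))
E = np.array([-(s[0] * s[1] + s[1] * s[2]) for s in configs], dtype=float)
A = np.array([sum(s) for s in configs], dtype=float)  # total magnetization

beta = 1.0                   # arbitrary illustrative inverse temperature
w = np.exp(-beta * E)
p = w / w.sum()              # diagonal of rho = exp(-beta * E) / Z

variance = p @ (A ** 2) - (p @ A) ** 2
print(variance)  # > 0: thermal fluctuations
```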
Now you see the reason for the term "quantum fluctuations": quantum mechanically, Eq. (1) is in general violated even at zero temperature.
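A minimal quantum illustration (my own choice: a single spin-$1/2$ with Hamiltonian $H = -\sigma_x$): the ground state is pure, the system is at zero temperature, and yet a diagonal observable has nonzero variance:

```python
# Quantum version of the check: a single spin-1/2 with H = -sigma_x.
# The ground state is pure ("zero temperature"), yet the diagonal
# observable sigma_z violates Eq. (1).
import numpy as np

sx = np.array([[0.0, 1.0], [1.0, 0.0]])
sz = np.array([[1.0, 0.0], [0.0, -1.0]])

eigvals, eigvecs = np.linalg.eigh(-sx)
g = eigvecs[:, 0]            # ground state: (|0> + |1>) / sqrt(2)

variance = g @ (sz @ sz) @ g - (g @ sz @ g) ** 2
print(variance)  # 1.0 -- nonzero variance at zero temperature
```

In this sense the term records the fact that a pure ground state can still assign nonzero variance to observables that do not commute with the Hamiltonian.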