Conceptually, we can understand thermal fluctuations as emerging from the constant jiggling of neighbouring atoms in an environment at non-zero temperature. In other words, as the temperature increases, the atoms/particles gain kinetic energy and bounce off one another more frequently, randomly exploring their degrees of freedom and in turn fluctuating around their equilibrium state. Thermal fluctuations are also at the core of the Brownian motion of particles, which we describe mathematically through Wiener processes: the particles are modeled as undergoing a continuous random walk, homogeneous in time and with independent increments, with the vital feature that the probability distribution of the increments is a Gaussian with mean 0. This is, for example, implemented in the Langevin equation as a Gaussian-distributed noise term $\eta(t)$.
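To make the thermal picture concrete, here is a minimal numerical sketch of the kind of model I have in mind: an overdamped Langevin equation $\gamma\,\dot{x} = \eta(t)$ with $\langle \eta(t)\eta(t')\rangle = 2\gamma k_B T\,\delta(t-t')$, integrated with a simple Euler-Maruyama step (the parameter values are arbitrary, purely for illustration):

```python
import numpy as np

# Sketch: overdamped Langevin dynamics, gamma * dx/dt = eta(t), with Gaussian
# white noise <eta(t) eta(t')> = 2 * gamma * kB * T * delta(t - t').
# All parameter values below are arbitrary illustration choices.

kB, T, gamma = 1.0, 1.0, 1.0
D = kB * T / gamma              # diffusion constant (Einstein relation)
dt, n_steps, n_walkers = 1e-3, 10_000, 2_000

rng = np.random.default_rng(0)
x = np.zeros(n_walkers)

for _ in range(n_steps):
    # Euler-Maruyama step: the Wiener increment is Gaussian with mean 0
    # and variance dt, so the thermal kick scales as sqrt(2 * D * dt).
    x += np.sqrt(2.0 * D * dt) * rng.normal(size=n_walkers)

# Thermal fluctuations appear as diffusive spreading: <x^2(t)> ~ 2 * D * t.
t_final = n_steps * dt
print(f"measured <x^2> = {np.mean(x**2):.3f}, expected 2*D*t = {2*D*t_final:.3f}")
```

Here the randomness has a clear microscopic origin: each Gaussian kick stands in for the net effect of many collisions with the surrounding thermal bath.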
Already from the above we have a good physical and mathematical picture of what thermal fluctuations consist of. I am trying to contrast this understanding with quantum fluctuations, which I find very confusing. The hope is to understand them better, at least at a conceptual level first.
In the quantum thermodynamics literature, one often comes across sentences such as: "Quantum fluctuations may cause a phase transition..." or "the origin of fluctuations is quantum rather than thermal." And of course, often the only bit of explanation offered is that this is due to the Heisenberg uncertainty relation for energy and time, $\Delta E\,\Delta t \gtrsim \hbar/2$. But what is the underlying process? I imagine this is entirely different from thermal motion, which is a completely classical concept, so I don't expect a same-spirited explanation for the quantum case. But take, say, a system of many electrons where quantum fluctuations are relevant: how do we define the $\Delta t$ that enters the Heisenberg uncertainty relation, so that we can then speak of the corresponding energy fluctuations? That is, to what time scale in the system does it correspond? Since the fluctuations do not originate from thermal motion, any time scale related to how frequently the electrons bounce off one another is irrelevant here. What process/interaction is then defining the relevant $\Delta t$, which I imagine has to be very short so that the quantum fluctuations $\Delta E$ are significant?
Often we say that, since time $t$ is not an operator in QM, one is to interpret the time-energy uncertainty relation as follows: $\Delta t$ is the average time taken for an arbitrary observable $A$ to change by its standard deviation, starting from a state $\Psi$. Then we say "it's the shortest time scale over which changes in the system's observables are noticeable." But this is no explanation here, as it immediately begs the question: what physical process/interaction of the system is defining that "average time"? Let us forget for a moment what is experimentally noticeable and ask: if a many-body system (an electron gas, for example) is left on its own, i.e. we perform no measurements, and quantum fluctuations are relevant (because the system has not fully decohered towards a fully mixed state), to the extent that they may cause a phase transition due to the large fluctuations in the energy of the electrons, then what are the relevant time scales behind these fluctuations if they are to be quantum in nature?
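For concreteness, the interpretation I have in mind is, I believe, the Mandelstam-Tamm form of the relation (with $\Delta E$ and $\Delta A$ the standard deviations in the state $\Psi$):

$$\Delta E \,\tau_A \;\ge\; \frac{\hbar}{2}, \qquad \tau_A \;=\; \frac{\Delta A}{\left|\,d\langle A\rangle/dt\,\right|},$$

so that $\tau_A$ is precisely the time needed for $\langle A\rangle$ to shift by one standard deviation $\Delta A$. My question is what, physically, sets this time scale in an undisturbed many-body system.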
In other words, are there physical processes through which we can understand these quantum fluctuations, in analogy to how we understand thermal motion? I understand that the answer here may just be: "Well, quantum phenomena are weird in nature and inexplicable in terms of classical concepts." I would be completely fine with that; in fact, I don't mean to downgrade the underlying explanation to a classical version. I am just asking this question in the hope that there is a more physical way of understanding quantum fluctuations within the framework of quantum mechanics. Please note that the electron gas in the above text is only taken as a dummy example, so feel free to choose another system more suitable to your explanation.