I am trying to better understand the current scientific consensus (to the extent that such a thing exists) on the interpretation of quantum physics. I understand that this is still very much an active area of research but it seems to me that there is a general belief that decoherence is some sort of holy grail?
When I was first introduced to QT, we were taught around the formalism of the Copenhagen interpretation (vague and unsatisfactory concepts of observations/measurements, with a totally arbitrary line in the sand between quantum and classical), and it was my understanding that the so-called "measurement problem" was still one of the big unsolved problems in physics. More recently, however, I have gained the impression that a lot of people regard decoherence as the solution to this problem, but I have also seen specific claims that decoherence does not attempt to resolve the measurement problem at all. Which is it? Is the measurement problem still a thing?
I have come across this paper, which very explicitly claims that decoherence does not solve the measurement problem and never claimed to. So why, then, are so many physicists behaving as though the problem has been solved?
A short outline of the paper:
Why Decoherence has not Solved the Measurement Problem: A Response to P. W. Anderson
It has lately become fashionable to claim that decoherence has solved the quantum measurement problem by eliminating the necessity for von Neumann's wave function collapse postulate. For example, in a recent review in Studies in History and Philosophy of Modern Physics, Anderson (2001) states: "The last chapter ... deals with the quantum measurement problem ... My main test, allowing me to bypass the extensive discussion, was a quick, unsuccessful search in the index for the word 'decoherence' which describes the process that used to be called 'collapse of the wave function'. The concept is now experimentally verified by beautiful atomic beam techniques quantifying the whole process." And again, in his response to the author's response (Anderson, 2001): "Our difference about 'decoherence' is real."
In a somewhat similar vein, Tegmark and Wheeler (2001) state in a recent Scientific American article discussing the "many-worlds" interpretation of quantum mechanics and decoherence: "... it is time to update the quantum textbooks: although these infallibly list explicit non-unitary collapse as a fundamental postulate in one of the early chapters, ... many physicists ... no longer take this seriously. The notion of collapse will undoubtedly retain great utility as a calculational recipe, but an added caveat clarifying that it is probably not a fundamental process violating Schrödinger's equation could save astute students many hours of frustrated confusion."
2 bonus questions!
It is my understanding that decoherence is a gradual process: something that occurs quite rapidly but not instantaneously (is that correct?). So then, what does this say about the old notion of the discontinuous "quantum leap"? Does the mainstream physics community still regard this "collapse" process as something discontinuous, or is my understanding out of date?
This may not be worded correctly, but here goes... When does decoherence occur? Take, for example, the double-slit experiment: if "collapse" occurs before the two waves interfere, then there will be no interference pattern; if "collapse" occurs after the interference, then there will be one. This tells me that the process of "collapse" has a definite time of occurrence. So then, why does it happen when it does? What is preventing decoherence from occurring before the particle/wave passes through the slits? What is so different about a "measurement apparatus" compared to a molecule of nitrogen in the air or some other such thing? If I am to believe that it's simply a matter of having a large number of hidden degrees of freedom, then where do you draw the line? Precisely how many free particles does an object need to have before we call it a "classical" object?!
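For concreteness, here is the rough textbook-style sketch I have in mind for the two-slit case (my own schematic, with illustrative environment states $|E_1\rangle$, $|E_2\rangle$ and a decoherence timescale $\tau_D$; none of this is taken from the papers above). With no environment, the two path amplitudes interfere:

$$P(x) = |\psi_1(x) + \psi_2(x)|^2 = |\psi_1(x)|^2 + |\psi_2(x)|^2 + 2\,\mathrm{Re}\left[\psi_1^*(x)\,\psi_2(x)\right].$$

If each path becomes entangled with the environment, $|\Psi\rangle \sim \psi_1(x)\,|E_1\rangle + \psi_2(x)\,|E_2\rangle$, then tracing out the environment multiplies the interference term by the overlap of the environment states,

$$P(x) = |\psi_1(x)|^2 + |\psi_2(x)|^2 + 2\,\mathrm{Re}\left[\psi_1^*(x)\,\psi_2(x)\,\langle E_1|E_2\rangle\right],$$

and that overlap is usually said to decay continuously, roughly like $|\langle E_1|E_2\rangle| \sim e^{-t/\tau_D}$. If this picture is right, the visibility of the fringes depends on how much which-path information has leaked into the environment, which is exactly why I don't see where a sharp quantum/classical line could be drawn.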
Answer
You can find a comprehensive review of decoherence and how it fits in the QM interpretation debate in Decoherence, the measurement problem, and interpretations of quantum mechanics (Schlosshauer, 2005).
From the "concluding remarks" section:
We have argued that, within the standard interpretation of quantum mechanics, decoherence cannot solve the problem of definite outcomes in quantum measurement
As for your bonus questions: they pertain to the dynamics of decoherence while relating it to the concept of "collapse", but these are mutually exclusive pictures: in the decoherence program there is no collapse. This is precisely what the above paper calls "the problem of definite outcomes":
Phase coherence between macroscopically different pointer states is preserved in the state that includes the environment, and we can always enlarge the system so as to include (at least parts of) the environment. In other words, the superposition of different pointer positions still exists, coherence is only “delocalized into the larger system” [...] Much of the general criticism directed against decoherence with respect to its ability to solve the measurement problem (at least in the context of the standard interpretation) has been centered on this argument.
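To make the quoted point explicit, here is the standard schematic (my own notation: system pointer states $|s_n\rangle$, apparatus states $|a_n\rangle$, environment states $|e_n\rangle$). Unitary evolution takes the initial product state into an entangled one,

$$|\Psi\rangle = \sum_n c_n\,|s_n\rangle|a_n\rangle|e_n\rangle,$$

and tracing out the environment gives a reduced density matrix that is approximately diagonal in the pointer basis, because $\langle e_m|e_n\rangle \approx \delta_{mn}$:

$$\rho_{SA} = \mathrm{Tr}_E\,|\Psi\rangle\langle\Psi| \approx \sum_n |c_n|^2\,|s_n\rangle\langle s_n| \otimes |a_n\rangle\langle a_n|.$$

This looks like a classical mixture of outcomes, but the global state $|\Psi\rangle$ is still a superposition of all of them; nothing in the unitary dynamics picks out one term. That is the "problem of definite outcomes" referred to above.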
Another interesting paper providing an unusual view of what decoherence is all about is Decoherence without decoherence (Weinstein, 2009). Abstract:
It has been claimed that decoherence of open quantum systems explains the tendency of macroscopic systems to exhibit quasiclassical behavior. We show that quasiclassicality is in fact an unremarkable property, characterizing generic subsystems of environments even in the absence of dynamical decoherence. It is suggested that decoherence is best regarded as explaining the persistence of true classicality, rather than the emergence of quasiclassicality.
and from the conclusion:
What “decoherence” does to pointer states is in fact to maintain their classicality by precluding a loss of coherence. Decoherence does not explain the emergence of classicality, but its persistence. It does so by preventing the loss of coherence in the basis of one or more observables. The emergence of classicality, on the other hand, appears to await a resolution of the so-called “measurement problem” – only when physical properties take on definite values does one have something resembling a classical world.