This question is similar to previously asked questions, but the responses to them are confusing and I think it may be better covered by listing out all the potential answers for clarity.
It's a simple and common question: why does light seem to travel more slowly in media that are transparent to its wavelength than it does in vacuum? I have seen responses all over the web from PhD professors at major universities whose answers are completely different. Here are all of the general categories of answers I have seen professional physicists put forth:
1. Light actually does move slower through transparent media. We don't really know why.
2. Light actually does move slower through transparent media. The reason is that light's EM effects induce nearby charged particles (electrons and nuclei) to alter the EM field with a harmonic vibration that "cancels out" some of the velocity of the light wave.
3. Light does not move slower. We don't know why it seems to.
4. Light does not move slower. It bounces around in the media, which causes it to progress more slowly.
5. Light does not move slower. It is absorbed and emitted by electrons in the media, which causes it to progress more slowly.
My thoughts on each of these:
1. If light actually moves slower but we haven't figured out why, I would expect it to behave relativistically like a bradyon (a particle with invariant mass that cannot reach the speed of light); but this is inconsistent with a form of energy that does not experience time. I don't see how any explanation for "slowed" light, other than 2, can be consistent.
2. I am currently leaning toward this answer, even though it is the rarest one I have seen. However, I don't understand the mechanics of how a light wave can be cancelled out or slowed by EM induction. My strong suspicion is that quantum effects are necessary: that is, light wouldn't be slowed at all unless the environment were always entangled with it (if you're one of those Copenhagen oddballs, this means: if the wavefunction were continuously collapsed such that the light behaves as individual photons).
3. This seems pretty likely. I don't expect physicists to talk out their asses, but I have a hard time understanding why so many qualified physicists have completely different explanations for this basic principle.
4. This seems very unlikely to me, despite being the second-most common explanation I have found. If light were scattered, it wouldn't progress in the same direction through the medium: it would disperse (to slow appreciably it would need to ricochet off billions of atoms along the way). But we can see a beam of light refract through transparent media, and it doesn't diffuse much at all.
5. This is the most common explanation, yet I find it the least convincing! Not only do the issues from 4 apply here, but we are also talking about material which is almost completely transparent to the wavelength of light being refracted. EDIT: I previously asserted here that the slowing effect does not depend upon the frequency of light, which is incorrect. See below.
Is anybody who actually does physics for a living certain that they understand this phenomenon? Or are we all spitting blind in the dark? It's very frustrating to see physicists giving incompatible explanations (with an air of certainty!) for a phenomenon known since antiquity, but I suppose it may be possible that more than one explanation is true...
EDIT: I believe I have the answer! I have answered my own question below.
Answer
After a lot more searching, I have found the answer to my question! :D
Below is a summary of the information I found. There is no specific webpage I can link to because I relied on sources who quoted other sources which no longer exist, but maybe this information can be useful to someone else someday. Most of what I learned comes from Professor Lou Bloomfield, who currently teaches physics at the University of Virginia.
EDIT: None of this is quoted material: all information posted below has been completely reworded, and the analogies (aside from the guitar string) are mine.
When surrounded by normal matter, a light wave's electric field will cause electrons to jiggle at a rate equal to the frequency of the light wave: the electric component of the light wave will alternately attract and repel charged particles.
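For those who want this jiggling in symbols: the standard classical picture is the Lorentz oscillator model, which treats each bound electron as a damped harmonic oscillator driven by the wave's electric field. This is a textbook sketch, not something specific to my sources; $\omega_0$ is the electron's natural (resonant) frequency and $\gamma$ its damping rate:

$$ m\ddot{x} + m\gamma\dot{x} + m\omega_0^2 x = -eE_0\cos(\omega t) $$

In steady state the electron oscillates at the driving frequency $\omega$ of the light, not at its own natural frequency, which is exactly the behavior described above.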
When electrons in a material transparent to a certain frequency are excited by a light wave of that frequency, this takes energy away from the light wave. But surprisingly, no photons are absorbed: since the material is transparent to the frequency of the wave, there is no higher orbital which matches exactly the energy level an individual photon would impart to an electron. This means the energy transfer can't involve a real particle interaction.
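To state the transparency condition precisely: a photon of frequency $\nu$ can only be truly absorbed if its energy matches the gap between two allowed electron energy levels,

$$ h\nu = E_{\text{upper}} - E_{\text{lower}}. $$

In a transparent material, no pair of levels has a spacing equal to $h\nu$ for the frequencies that pass through, so no real absorption can occur.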
So what happens? Instead of absorbing one or more photons, the electrons enter a virtual quantum state: a temporary excitation that doesn't exactly match one of the states that the electron can occupy. This is very much like vibrating a guitar string by aiming sound at the string. If the sound you aim at the string matches a frequency that the string can vibrate at, it will cause the string to vibrate. If the sound you use is the wrong frequency, the string will wiggle a little bit as though trying to vibrate, then stop when the sound passes. That's what happens to the electrons: they borrow energy from the light wave, wiggle a little, and then return the energy.
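The guitar-string analogy corresponds to the off-resonance response of the driven oscillator written out earlier. Its steady-state amplitude (a standard result, same symbols as before) is

$$ x_0 = \frac{eE_0/m}{\sqrt{(\omega_0^2 - \omega^2)^2 + \gamma^2\omega^2}}, $$

which is large only near resonance ($\omega \approx \omega_0$). Far from resonance the electron wiggles only a little, stays bound, and hands the energy straight back, just like the string driven at the wrong pitch.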
A virtual quantum state is very limited in duration, and doesn't count as a particle interaction. The light wave and the electron remain unentangled and continue to act as probability waves. The electron can only play with the light wave's energy for a brief period before returning it. The characteristics of the light wave remain unchanged because there was no real particle interaction. So the light does not ricochet off of atoms, nor does it get emitted in the usual sense by the electrons which play with it.
Even though the interactions are all virtual, electrons are matter and they take time to jiggle. As this happens over and over and over again, it slows the progress of the wave.
You might think of this like a kind of friction which acts against the progress of the wave. Consider a car whose wheels turn at a constant speed, and imagine it encounters a series of large bumps that slow it down slightly. The speedometer is based on the wheel rotation, so it would say the car has not changed speed at all: it is just as fast as it was on flat terrain. The car will, however, cover less ground per time interval, because some of the wheel-turning is used to surmount the bumps. These bumps are akin to the process of electrons temporarily borrowing energy from the light wave.
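Pushing the classical model one step further makes the slowed progress quantitative. Neglecting damping and assuming a single resonance with $N$ bound electrons per unit volume (a textbook idealization, not a claim about any particular material), the refractive index comes out as

$$ n^2(\omega) \approx 1 + \frac{Ne^2}{\varepsilon_0 m}\,\frac{1}{\omega_0^2 - \omega^2}, \qquad v_{\text{phase}} = \frac{c}{n(\omega)}, $$

so the wave's crests advance at $c/n$ even though every individual interaction happens at $c$.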
So is the light wave truly slowed, or is the light still moving at c and only its progress is slowed? This isn't actually a well-formed question, and for all practical purposes the answer doesn't matter. However, I find it easier to think about it as slowing the wave's progress. This means the characteristic that "light moves at speed c in all reference frames" still holds true, which makes it much easier for me to reason about relativistic effects.
Additionally, I was incorrect about different frequencies slowing by the same amount: lower frequencies are slowed less than higher frequencies. When the frequency is lower, even though the wave has less energy, the electrons will need to jiggle over a wider area (they are pulled for a longer period, then pushed for a longer period). Since the electrons remain bound to their atoms in this interaction, they can't be pulled out of the atom by a virtual excitation. So the lower the frequency, the "more virtual" the excitation must be, and the less time the electrons have to play with the light.
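To see that the single-resonance formula above really does slow higher frequencies more (for visible light below a UV resonance), here is a minimal numeric sketch in Python. The resonance frequency and electron density are illustrative assumptions chosen to give a glass-like index, not values taken from any source:

    import numpy as np

    # Minimal sketch of the damping-free, single-resonance (Lorentz) model.
    # omega0 and N below are illustrative assumptions, not fitted to real glass.
    e = 1.602e-19      # electron charge (C)
    m = 9.109e-31      # electron mass (kg)
    eps0 = 8.854e-12   # vacuum permittivity (F/m)
    c = 2.998e8        # speed of light in vacuum (m/s)

    omega0 = 2 * np.pi * 2.0e15   # assumed UV resonance (rad/s)
    N = 5.0e28                    # assumed bound-electron density (1/m^3)

    def n(omega):
        """Refractive index n(omega) from the Lorentz oscillator model."""
        return np.sqrt(1 + N * e**2 / (eps0 * m) / (omega0**2 - omega**2))

    for label, lam in [("red (700 nm)", 700e-9), ("blue (450 nm)", 450e-9)]:
        w = 2 * np.pi * c / lam            # angular frequency of the light
        print(f"{label}: n = {n(w):.3f}, progress speed c/n = {c / n(w):.3e} m/s")

With these made-up parameters it prints n ≈ 1.43 for red and n ≈ 1.46 for blue: the blue light's progress is slowed more, which is the normal dispersion that makes a prism work.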
Is this information useful? If so, is there a way I could make it more accessible? Just curious, as I am very new to SE.