I haven't yet gotten a good answer to this: if two rays of light of the same wavelength and polarization (just to keep it simple for now; it generalizes easily to any range of wavelengths and all polarizations) meet at a point 180 degrees out of phase (due to a path length difference, or whatever), we all know they interfere destructively, and a detector at exactly that point wouldn't read anything.
So my question is: since such an insanely huge number of photons are constantly coming out of the sun, why isn't every photon hitting a detector matched up with another photon that happens to be exactly out of phase with it? With an enormous number of randomly produced photons traveling random distances (random with respect to their wavelength, anyway), that seems like it should happen, in much the same way that the sum of a huge number of randomly chosen 1's and -1's never strays far from 0 relative to the number of terms. Mathematically, summing the contributions over all phases would give:
$$\int_0 ^{2\pi} e^{i \phi} d\phi = 0$$
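Just to sanity-check that averaging intuition numerically, here's a quick sketch (NumPy assumed; the photon count N and the number of trials are arbitrary choices) that adds up N unit phasors with uniformly random phases and looks at how big the result typically is:

```python
import numpy as np

rng = np.random.default_rng(0)

N = 100_000   # number of "photon" phasors arriving at one point
trials = 200  # repeat to see the typical size of the sum

amps = []
for _ in range(trials):
    # N unit phasors e^{i*phi}, with phi drawn uniformly from [0, 2*pi)
    phi = rng.uniform(0.0, 2.0 * np.pi, N)
    s = np.exp(1j * phi).sum()
    amps.append(abs(s))

amps = np.array(amps)
print("typical |sum|       :", amps.mean())           # grows like sqrt(N)
print("typical |sum| / N   :", (amps / N).mean())     # tiny: the average phasor is ~ 0
print("typical |sum|^2 / N :", (amps**2).mean() / N)  # ~ 1: scales like the summed intensities
```

As far as the math goes, the average phasor (|sum|/N) does shrink toward zero as N grows, but the sum itself typically has magnitude around sqrt(N) rather than exactly zero, so |sum|^2 grows like N, the same way the sum of N random 1's and -1's wanders out to about sqrt(N).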
Of course, the same argument applies to any given polarization and any given wavelength.
I'm pretty sure I can see the sun, though, so I suspect my assumption that an effectively infinite number of photons, each with a random phase, hits any given spot is flawed somewhere... are they locally in phase or something?