In almost every physics book you can find a statement like "diffraction gets stronger when the size of the slit is comparable to the wavelength." Say we have a wall in a bathtub with a slit in it. When water waves reach the slit, the books usually invoke Huygens' principle to explain that the points on the wavefront near the edges interfere in some fancy way, so that the water waves spread out radially. However, I do not see any connection between the size of the slit and the wavelength.

I do understand that if we have a really tiny slit, there are very few points on the wavefront that produce radial secondary waves, and that if the slit gets bigger, there are more points producing the same secondary waves, which eventually interfere in an interesting way. But what does this have to do with the wavelength? If we have a tiny slit and an even tinier wavelength, how does that change the game?
So, for example, if we have a long wavelength, the diffraction would be:
But if the wavelength gets smaller while the slit stays the same, I'd expect exactly the same diffraction pattern, with the only difference being that the resulting wave has a shorter wavelength:
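As a rough way to see what the Huygens summation actually predicts, here is a minimal numerical sketch (the slit width, the two wavelengths, and the number of secondary sources are arbitrary choices for illustration, not taken from any book): each point across the slit emits a unit wavelet, and the wavelets are summed in the far field.

```python
import numpy as np

# Minimal Huygens-style sketch (illustrative only): sample the slit with point
# sources, add up their wavelets in the far field, and see how the angular
# spread depends on wavelength for a fixed slit width.
def far_field_intensity(slit_width, wavelength, angles, n_sources=200):
    k = 2 * np.pi / wavelength                                    # wavenumber
    xs = np.linspace(-slit_width / 2, slit_width / 2, n_sources)  # Huygens source positions
    # A source at position x acquires an extra path length x*sin(theta) toward
    # angle theta, i.e. a phase k*x*sin(theta); every source emits a unit wavelet.
    phases = k * np.outer(np.sin(angles), xs)
    field = np.exp(1j * phases).sum(axis=1)
    intensity = np.abs(field) ** 2
    return intensity / intensity.max()                            # normalize to the forward peak

angles = np.linspace(-np.pi / 2, np.pi / 2, 2001)
a = 1.0                                                           # slit width, arbitrary units

for lam in (1.0, 0.1):   # wavelength comparable to the slit vs. ten times smaller
    intensity = far_field_intensity(a, lam, angles)
    # Width of the central lobe: the side lobes never reach half maximum,
    # so thresholding at 0.5 isolates the central peak.
    central = np.degrees(angles[intensity > 0.5])
    print(f"lambda/a = {lam/a:.1f}: central lobe width ~ {np.ptp(central):.1f} degrees")
```

Keeping the slit fixed and shrinking the wavelength makes the central lobe narrower roughly in proportion to lambda/a, which suggests the spreading is set by the ratio of wavelength to slit width rather than by either one alone.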