Ray12
I've searched online and on the forum but still can't find an explanation or mechanism behind why diffraction is dependent upon wavelength.
For example, consider a water wave diffracting around a small boat (smaller than the wavelength). The degree of diffraction decreases as the boat gets bigger, becoming negligible once the boat is much larger than the wavelength.
Why is this? Is it just an empirical observation that's taken as an axiom, or is there an explanation for it?
(Note: any explanations can involve Newtonian Mechanics and Vector Calculus, as I am already familiar with them.)
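To make the trend I mean concrete, here is a rough numeric sketch (not an answer, just the observation restated quantitatively). It assumes the textbook single-slit relation $a \sin\theta = \lambda$ for the first diffraction minimum; an obstacle of the same size gives a comparable angular spread by Babinet's principle. The wavelength and sizes below are arbitrary illustrative values.

```python
import numpy as np

# Illustrative values only (water wave, arbitrary units).
wavelength = 1.0  # wavelength lambda, in metres

# Obstacle / opening sizes, from smaller than lambda to much larger.
for a in [0.5, 1.0, 2.0, 5.0, 10.0]:
    ratio = wavelength / a
    if ratio >= 1.0:
        # lambda >= a: the first-minimum condition sin(theta) = lambda/a
        # has no solution, and the wave spreads into essentially the
        # whole half-space behind the obstacle (strong diffraction).
        print(f"a = {a:5.1f} m -> spreads over ~180 degrees")
    else:
        # Angular position of the first minimum: narrower spread as a grows.
        theta = np.degrees(np.arcsin(ratio))
        print(f"a = {a:5.1f} m -> first minimum at ~{theta:5.1f} degrees")
```

The point of the sketch is only that the angular spread seems to be controlled by the ratio λ/a, and it is exactly that dependence I would like to understand mechanistically rather than take as given.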