Homework Statement
I'm studying scattering from a rough surface. My textbook defines h(x,y) as a small vertical deviation from a flat reference plane, and then carries out the calculation assuming that the height difference between two points can be written as h(x,y) - h(x',y') = f(x-x', y-y'), i.e. that it depends only on the relative position of the two points. In other words, the surface looks the same under translations (it is statistically homogeneous).
2. Question
I don't see what kind of surface could ever fulfill this condition, except a very special one such as a plane of constant inclination. If there are any bumps or dips, Δh will obviously change as we slide a bar of fixed separation (x-x', y-y') around the surface. Could anybody explain this assumption?
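
Here is a quick numerical check of what I mean (a minimal sketch; the surfaces h_plane and h_bumpy are toy examples I made up, not from the textbook):

```python
import numpy as np

# Hypothetical toy surfaces (my own examples, not from the textbook).
def h_plane(x, y):
    return 0.1 * x + 0.05 * y   # constant inclination: h is affine in (x, y)

def h_bumpy(x, y):
    # same plane plus a single Gaussian bump
    return 0.1 * x + np.exp(-((x - 3.0)**2 + (y - 2.0)**2))

# Fixed separation vector: the "bar" of length sqrt(dx**2 + dy**2).
dx, dy = 1.0, 0.5

# Slide the bar to several random base points and record h(x+dx, y+dy) - h(x, y).
rng = np.random.default_rng(0)
points = rng.uniform(0.0, 6.0, size=(5, 2))

for name, h in [("plane", h_plane), ("bumpy", h_bumpy)]:
    diffs = [h(x + dx, y + dy) - h(x, y) for x, y in points]
    print(name, np.round(diffs, 4))
```

For the plane all five differences come out identical (0.1*dx + 0.05*dy = 0.125), so there h(x,y) - h(x',y') really is a function of (x-x', y-y') alone. For the bumpy surface the differences vary from base point to base point, which is exactly why I don't see how the identity can hold pointwise for a generic rough surface.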