Icheb
Light falls on the flat side of a thin plano-convex glass lens. The radius of curvature is |r| = 50 cm. I have to calculate the distance from the focal point to the curved surface of the lens, which to my understanding is just the focal length, since it's a thin lens. Additionally, I am supposed to consider only rays that are close to the axis, and I'm not sure what exactly that means.
From my understanding, the only equation I need is the lensmaker's equation, [tex]\frac{1}{f} = \left(\frac{n}{n_M} - 1\right) \left(\frac{1}{r_1} + \frac{1}{r_2}\right)[/tex]. Here n would be 1.491, n_M would be 1, r_1 would be 50 cm, and r_2 would be infinite (since the lens is plano-convex).
Would that be sufficient to solve it? If so, what does it mean that I'm only supposed to look at rays that are close to the axis? What does that change?
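If that equation does apply here, then plugging in my values (with r_1 = 0.5 m, and the 1/r_2 term vanishing as r_2 goes to infinity) would give something like:

[tex]\frac{1}{f} = (1.491 - 1) \cdot \frac{1}{0.5\,\mathrm{m}} = 0.982\,\mathrm{m}^{-1} \quad \Rightarrow \quad f \approx 1.02\,\mathrm{m} \approx 102\,\mathrm{cm}[/tex]

Does that look right?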