How Can a Lens Maintain Constant Thickness and Still Focus Light?

In summary, a lens of constant thickness is a flat disk that focuses light not with curved surfaces but with a refractive index that varies with radial distance, n(r) -- a graded-index (GRIN) lens, which can be made by diffusing impurities into glass. Because the optical path length n(r)·d differs across the disk, different parts of an incoming plane wave are delayed by different amounts, and the emerging wavefront converges to a focus just as it would behind a conventional thin lens. Equating the optical path lengths from each point on the disk to the focal point yields the required index profile.
  • #1
greendog77

Homework Statement



The index of refraction of glass can be increased by diffusing in impurities. It is then possible to make a lens of constant thickness. Given a disk of radius a and thickness d, find the radial variation of the index of refraction, n(r), which will produce a lens with focal length F. You may assume a thin lens (d<<a). Note that there are two approaches to image focusing problems. You may be more familiar with one, where you would trace light rays using Snell’s law, and two rays converge to a point. This approach will lead to a very complex solution.

Homework Equations



Snell's law: n1 * sin(θ1) = n2 * sin(θ2)

The Attempt at a Solution



At first the problem seemed simple enough: just apply Snell's law to figure out how to make the rays converge to a focal point F. However, I realized a slab doesn't actually change the angle of a ray (the angle of deviation is 0). Although the index of refraction can vary from point to point, the lens is thin, so in the region a single ray traverses the index is essentially constant. I'm not quite sure how this lens would actually work. Also, what are the "two approaches" that the problem mentions? Thanks!
 
  • #2
Normally you think of light parallel to the optic axis refracted through the focus ... but you can turn that picture the other way around too: light diverging from the focus is refracted so it does what?
 
  • #3
In that case the emerging rays would all be parallel, correct?
 
  • #4
Well done ... however: have you covered the concept of a "phase shift" yet?
Graded index materials?

Also - I don't think you should assume the lens is all that thin.
 
  • #5
I've learned about phase shifts but not graded index materials. To me that sounds like a method to solve a specific set of similar problems?
 
  • #6
Yah - you can use phase-shift concepts to solve this problem quite simply.
Have you done any work on the phase shift in relation to a thin lens and in relation to refractive index?
 
  • #7
Hm, no I haven't. What is the method?
 
  • #8
Well, if a plane wave passes, at normal incidence, through a slab of a certain refractive index, it travels at a different speed through the slab than it would through the air. This means when it emerges, its phase is different from what it would be if the slab weren't there.

The thicker the slab, the bigger the phase difference.
The bigger the refractive index, the bigger the phase difference.

I'm pretty sure this is the alternate method your teacher is alluding to.
You want to find the refractive index variation that has the same effect on the phase shift that a thin lens has.

What were your phase-shift lessons about? Did you learn the term "optical path difference"?
It is this knowledge you need to apply and it is difficult to advise you properly without knowing where you stand.
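To make the slab phase shift concrete, here is a minimal numeric sketch. The slab thickness, index, and wavelength below are made-up illustration values, not numbers from the problem:

```python
import math

def slab_phase(n, d, wavelength):
    """Phase (radians) accumulated by a plane wave crossing a slab of
    thickness d and refractive index n at normal incidence."""
    return 2 * math.pi * n * d / wavelength

# Hypothetical numbers: a 1 mm glass slab, 500 nm light.
wavelength = 500e-9
d = 1e-3
n_glass = 1.5

phi_glass = slab_phase(n_glass, d, wavelength)
phi_air = slab_phase(1.0, d, wavelength)

# The extra phase relative to the same path in air comes from the
# optical path difference (n - 1) * d.
opd = (n_glass - 1.0) * d
assert math.isclose(phi_glass - phi_air, 2 * math.pi * opd / wavelength)
```

Thicker slab or bigger index: bigger phase difference, exactly as stated above.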
 
  • #9
I've learned about path difference and all the basic concepts about waves, so I think it's a matter of actually applying them to this method I've never heard about. Based on what you've told me it sounds like what I should do is equate the path difference of the plane waves to zero when they emerge from the lens? In that case,

##n(r)\,d + \sqrt{F^2+r^2} = n(0)\,d + F##

I think that does give me the correct answer! But I'm not too sure why it does. If we can assume the wave to be a plane wave, then what happened to tracing light rays using Snell's law? If light rays enter the lens, won't they emerge virtually undisturbed? Why does this change when we use wave fronts? I do understand that by Huygens's principle we can treat every light source as emitting a spherical front, with points on the primary front expanding as secondary spherical fronts, so we can use this plane-wave model for the problem. But then again this seems to contradict the ray model. If you could help me clarify this, that'd be extremely helpful. Thanks!
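Solving the optical-path condition ##n(r)\,d + \sqrt{F^2+r^2} = n(0)\,d + F## for n(r), and comparing it with the paraxial approximation ##\sqrt{F^2+r^2} \approx F + r^2/2F##, can be checked numerically. The lens parameters below are made-up illustration values:

```python
import math

def n_exact(r, n0, F, d):
    # From equal optical path lengths: n(r)*d + sqrt(F^2 + r^2) = n0*d + F
    return n0 - (math.sqrt(F**2 + r**2) - F) / d

def n_paraxial(r, n0, F, d):
    # Using sqrt(F^2 + r^2) ~ F + r^2/(2F) for r << F
    return n0 - r**2 / (2 * F * d)

# Hypothetical parameters: center index, focal length, thickness, disk radius (m)
n0, F, d, a = 1.8, 0.5, 2e-3, 0.01

r = a  # worst case: the edge of the disk
exact = n_exact(r, n0, F, d)
approx = n_paraxial(r, n0, F, d)
# The index must decrease away from the axis, and the paraxial form
# should agree closely with the exact profile when a << F.
```

The index is highest on the axis and falls off quadratically, which is what delays the central part of the wavefront the most and bends the emerging wavefront into a converging sphere.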
 
  • #10
...it sounds like what I should do is equate the path difference of the plane waves to zero when they emerge from the lens?
... not exactly - the optical path difference is what gives you the phase difference.

It should not be zero. But what you got does look good ... so perhaps I misunderstood what you were trying to say?

Have a look at:
http://www.iue.tuwien.ac.at/phd/kirchauer/node51.html
... and [pdf format documents]:
http://www-inst.eecs.berkeley.edu/~ee119/sp10/Exams/finalsoutions10.pdf
(Q3 p6 - example: compare with your notes.)
http://www.physics.byu.edu/faculty/bergeson/physics571/notes/L06ABCDGaussian.pdf
(It's a bit daunting, skip to appendix 2 where they derive the phase shift for a lens - provided in case it is not in your notes already.)
http://home.comcast.net/~szemengtan/LinearSystems/waves.pdf
(this last is a detailed treatment using Fourier analysis - included for completeness)

Per your question:
Incoming plane waves are parallel light rays - what does a thin lens do to parallel light rays?
 
  • #11
Sorry, I meant that the phase shift of the emerging waves should have the same value?
 
  • #12
I meant that the phase shift of the emerging waves should have the same value?
... as each other you mean?

But you can readily see that this cannot be the case: since the emerging rays converge (or diverge), there must be a different phase shift for rays traversing different parts of the lens. The test for what should or should not be the case is always real life, not the math.

Perhaps you are thinking that since the thin lens is infinitely thin, each part of the incoming plane wave is only infinitesimally phase shifted - too small to have an effect. If so, then you have also realized that exactly the same argument applies in ray optics!

If the optic axis is z, and a lens is centered in the x-y plane, looking only at the z-y plane:
The thickness of the lens varies with y: ##d(y)=d_0-f(y)## where ##f(y)## usually depends on the curvature.
[The lens extends from -Y to Y so d(y>Y)=0.]

With a thin lens, ##d_0## is very small in such a way that ##f(y)## still has an effect.
If it didn't have an effect, then the result would not be a lens.
Technically ##d_0## does not need to be small - a thin lens is, technically, one whose surface radius of curvature is large compared with Y. So thin lenses don't have to be, literally, "thin".

[note: the thickness of a spherical lens varies with ##r=\sqrt{x^2+y^2}## ... I just looked at the y-axis for simplicity.]
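The claim that "thin" really means "large radius of curvature" can be checked directly: the sag of a spherical face, ##f(y) = R - \sqrt{R^2 - y^2}##, is well approximated by ##y^2/2R## when R is large compared with the half-aperture Y. A quick sketch with made-up numbers:

```python
import math

def thickness(y, d0, R):
    """Thickness of a plano-convex lens along y: d(y) = d0 - f(y),
    with f(y) = R - sqrt(R^2 - y^2) the sag of the curved face."""
    return d0 - (R - math.sqrt(R**2 - y**2))

# Hypothetical numbers: half-aperture Y small compared with radius of curvature R
d0, R, Y = 5e-3, 0.2, 0.01

# Quadratic (paraxial) approximation of the sag: f(y) ~ y^2 / (2R)
exact_f = R - math.sqrt(R**2 - Y**2)
approx_f = Y**2 / (2 * R)
# For R >> Y the two agree to a fraction of a percent, so the quadratic
# thickness variation - and hence the quadratic phase of a thin lens -
# is an excellent approximation.
```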

Aside:
With ray optics it is easy to think of the rays tracing out the trajectories of corpuscles of light that get deflected by interfaces etc. In the phase picture you are dealing more with a wave model - in this picture, the effect of an optical component is to hold parts of the incident wave back.

The third link in post #10 handles the math.

There's a wave simulator here:
http://www.falstad.com/ripple/
... in the top menu, select "setup: biconvex" and next one down select "plane waves".

With a bit of fiddling you can get something like this:

[Attachment: wave-lens.png - ripple-tank simulation of a plane wave focused by a biconvex lens]


I've changed it around so it is oriented like your ray diagrams - the plane wave comes in from the left.
Equal shades of blue-grey have equal phase, shaded so that the lightest and the darkest blue-grey are 180° out of phase.

You can see that the emerging wave-fronts converge towards the focal point - form a waist there - then diverge from it.

A vertical straightedge can be used to compare phases at equal horizontal coordinates.
Before the lens the phases are all the same at a given horizontal position, but after the lens they differ.

Note: The simulation also includes diffraction and reflection effects - which are small-ish but still visible.

See how the phase picture gives you a much more complete idea of what is going on?
You should have a play with the simulator and satisfy yourself that it can reproduce the optics that you already know from drawing ray diagrams and then just generally play around.
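If you'd rather reproduce the focusing numerically than in the ripple tank, the standard Fourier-optics recipe is: multiply an incoming plane wave by the thin-lens phase factor exp(-ik r²/2F) over the aperture, then Fresnel-propagate a distance F using the angular-spectrum transfer function. The sketch below (all parameter values are made-up for illustration; it needs NumPy) checks that the intensity peaks on the axis at the focal plane:

```python
import numpy as np

# Made-up illustration parameters (SI units)
wavelength = 500e-9   # 500 nm light
F = 0.05              # focal length: 50 mm
a = 0.5e-3            # aperture radius: 0.5 mm
N, L = 512, 2e-3      # grid points and physical grid width

k = 2 * np.pi / wavelength
dx = L / N
x = (np.arange(N) - N // 2) * dx
X, Y = np.meshgrid(x, x)
R2 = X**2 + Y**2

# Unit plane wave through a thin lens: circular aperture times exp(-i k r^2 / 2F)
U = np.where(R2 <= a**2, 1.0, 0.0) * np.exp(-1j * k * R2 / (2 * F))

# Fresnel propagation to z = F with the angular-spectrum transfer function
fx = np.fft.fftfreq(N, d=dx)
FX, FY = np.meshgrid(fx, fx)
H = np.exp(-1j * np.pi * wavelength * F * (FX**2 + FY**2))
U_focus = np.fft.ifft2(np.fft.fft2(U) * H)

intensity = np.abs(U_focus) ** 2
peak = np.unravel_index(np.argmax(intensity), intensity.shape)
# The brightest point should sit on the optical axis (the grid centre),
# far brighter than the average over the plane.
```

This is the same physics as the ripple-tank picture: the lens only reshapes the phase of the wavefront, and propagation does the rest.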
 


FAQ: How Can a Lens Maintain Constant Thickness and Still Focus Light?

1. What is the lens of constant thickness?

A lens of constant thickness is a flat disk whose refractive index varies with distance from the axis, n(r), rather than a lens with curved surfaces. Lenses of this kind, called graded-index (GRIN) lenses, can be made by diffusing impurities into glass so that the index decreases radially.

2. How does the lens of constant thickness work?

A uniform slab does not deviate a light ray, but a slab whose index varies radially imposes a radially varying optical path length n(r)·d. Different parts of an incoming plane wave are delayed by different amounts, so the emerging wavefront is curved and converges to a focal point, just as it would behind a conventional thin lens.

3. What are the applications of the lens of constant thickness?

Graded-index lenses are widely used where flat end faces are an advantage: coupling light into and out of optical fibers, compact imaging elements in endoscopes, and rod-lens arrays in photocopiers and scanners.

4. What are the advantages of using a lens of constant thickness?

The flat faces are easy to polish, mount, and bond directly to other components such as fiber ends, and significant focusing power can be packed into a very short element.

5. What are the disadvantages of using a lens of constant thickness?

The index change achievable by diffusion is small, which limits the optical power for a given thickness, and producing a precise index profile is more demanding than grinding a spherical surface. The simple profile derived in this thread is also exact only in the thin-lens (paraxial) limit.
