How Do Different Wavefront Shapes Appear to the Human Eye?

  • #1
JasmineMasown
TL;DR Summary
How do different shapes of wavefronts appear to the human eye?
I learnt that wavefronts can be spherical, cylindrical, or planar. But I wonder: if one were to observe light waves with different wavefront shapes with the naked eye, how would they appear? Would they all appear the same, or would there be some difference in how we see them?
Just a basic question I felt curious about. Would love to know your thoughts :p

Thank you!
 
  • #2
What shape is the wavefront coming to your eye from a distant star? What shape is the wavefront coming to your eye from an LED a few centimetres away? How do they look different if they are both in your visual field at once?
 
  • Like
Likes vanhees71
  • #3
Ibix said:
What shape is the wavefront coming to your eye from a distant star? What shape is the wavefront coming to your eye from an LED a few centimetres away? How do they look different if they are both in your visual field at once?
So basically you're trying to convey that all of it will look the same to our eyes... Just as light... Right?
 
  • #4
JasmineMasown said:
So basically you're trying to convey that all of it will look the same to our eyes... Just as light... Right?
Sort of. I'm actually asking you questions whose answers will lead you to what you want to know. The general aim of this site is to promote understanding of science, not just give answers.

If you look at an LED and a star and focus on one or other, how will it look? How will the other look? And if you think about the wavefronts from each, how would you describe them?
 
  • Like
Likes vanhees71
  • #5
Ibix said:
Sort of. I'm actually asking you questions whose answers will lead you to what you want to know. The general aim of this site is to promote understanding of science, not just give answers.

If you look at an LED and a star and focus on one or other, how will it look? How will the other look? And if you think about the wavefronts from each, how would you describe them?
Okay. Thanks for your former clarification :)

Regarding the questions asked in the latter section: if I look at an LED and a star, they both look alike. The light from both sources appears the same.
Now, talking about their wavefronts: the LED has a spherical one, since it's just centimetres away from me. Meanwhile, the wavefront of the star is effectively planar, owing to its large distance from the viewer (me).
 
  • Like
Likes vanhees71 and PeroK
  • #6
JasmineMasown said:
Okay. Thanks for your former clarification :)

Regarding the questions asked in the latter section: if I look at an LED and a star, they both look alike. The light from both sources appears the same.
Now, talking about their wavefronts: the LED has a spherical one, since it's just centimetres away from me. Meanwhile, the wavefront of the star is effectively planar, owing to its large distance from the viewer (me).
Aha! So that leads to the observation that all forms of wavefronts look akin to us.
 
  • #7
JasmineMasown said:
Regarding the questions asked in the latter section: if I look at an LED and a star, they both look alike.
I don't think so. Unless you have incredible depth of focus, one will appear blurred.
 
  • Like
Likes vanhees71
  • #8
Ibix said:
I don't think so. Unless you have incredible depth of focus, one will appear blurred.
Wouldn't both of them appear blurry? I mean, we can't see individual light rays. Do you mean in any other way?
 
  • #9
JasmineMasown said:
Wouldn't both of them appear blurry? I mean, we can't see individual light rays. Do you mean in any other way?
Stars don't usually appear blurry if you focus on them (and are wearing glasses if necessary). They do appear blurry if you focus on something right in front of your nose. Try it! Or use a couple of small candles, one near you and one on the other side of the room. Focus on one, but think about what the other looks like.
 
  • Like
Likes vanhees71 and JasmineMasown
  • #10
JasmineMasown said:
Wouldn't both of them appear blurry? I mean, we can't see individual light rays. Do you mean in any other way?
Let us first consider looking at the far away star.

There is no particular problem getting tight focus on an image coming in from infinity. The light from that star enters our eyes. (A portion of the plane wave a few millimeters in diameter comes in through our pupils). It is focused by the lens so that a real image is produced on the retina. Light that enters from the left and right sides of the pupil is deflected to arrive at the same spot. Light that enters from the top and bottom sides of the pupil is also deflected to arrive at that spot.

One rod or cone (or maybe a tight cluster of a few) is illuminated. A pixel on our retina is activated. That is what it means for the image to be sharp.

There are muscles in the eye that can change the shape of the lens to adjust its focal length slightly. In most humans, they can relax sufficiently to get proper focus for an image at infinity. Some near-sighted individuals may need correction (glasses) to achieve this.

Let us switch to consider an LED at close range.

There is a limit to how hard the muscles can pull and how close a real image can be so that focus is attained. In my youth, I could get down to perhaps 3 or 5 centimeters. These days it is more like 15 to 30 centimeters.

For me, an LED held a few centimeters from my eyes is sure to appear blurry while the star will not.

My insufficiently stretched lens is not strong enough. Its focal plane would be behind the retina. Light from the LED that enters on the left side of my pupil illuminates a spot that is a bit leftward on the retina. Light from the right illuminates a spot a bit right. Top illuminates top. Bottom illuminates bottom. Instead of illuminating a single rod or cone or a tight cluster of rods and cones, a fuzzy region is illuminated instead. The LED will appear blurry.

You do understand that what we "see" is the result of activation of an array of illuminated pixels on the retina. That image is then heavily post-processed by the visual cortex, where we get things such as edge detection, auto focus, auto binocular tracking, color scaling, motion detection, and various more sophisticated things.
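The focusing arithmetic above can be sketched with the thin-lens equation, 1/f = 1/d_o + 1/d_i. This is a minimal numerical sketch assuming a simplified "reduced eye" with a fixed lens-to-retina distance of 17 mm; the function name and all numbers are illustrative assumptions, not physiological data.

```python
# Thin-lens sketch of accommodation: how much optical power (in dioptres,
# 1/f with f in metres) the eye's lens needs to image an object at a given
# distance onto the retina. Assumed reduced-eye model: the lens-to-retina
# (image) distance is fixed at 17 mm.

RETINA_DISTANCE_M = 0.017  # assumed lens-to-retina distance

def required_power_dioptres(object_distance_m):
    """Lens power needed so the image of an object at object_distance_m
    falls on the retina: P = 1/f = 1/d_o + 1/d_i."""
    return 1.0 / object_distance_m + 1.0 / RETINA_DISTANCE_M

star = required_power_dioptres(float("inf"))  # plane wave from a distant star
led = required_power_dioptres(0.05)           # spherical wave from an LED at 5 cm
print(f"star: {star:.1f} D, LED at 5 cm: {led:.1f} D, "
      f"extra accommodation: {led - star:.1f} D")
```

With these numbers the LED asks for about 20 D of accommodation beyond the relaxed (focused-at-infinity) state, far more than a typical adult eye can supply, which is consistent with the near LED blurring while the star stays sharp.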
 
Last edited:
  • Like
Likes vanhees71 and JasmineMasown
  • #11
JasmineMasown said:
So that leads to the observation that all forms of wavefronts look akin to us.
What does "look akin" mean? Spherical vs planar wavefronts require different lens geometry to be focused, which our brain can interpret to judge distance. So the perception is not the same.
 
  • Like
Likes vanhees71 and JasmineMasown
  • #12
Detectors, like the photosensitive cells in your eye, can only detect intensity, which is independent of the phase of the wave. So it doesn't matter what the phase is at your retina. However, as others have said, the wave's phase has everything to do with how it propagates, through lenses and such, to get to the detector.

Huygens' principle is basically just a description of how light waves can be represented as a sum of individual fields from point sources. This sum is done at the field level and phase information is critical to the proper summation.
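As a toy illustration of that field-level sum, the sketch below (my own construction, not from this thread) adds spherical-wavelet fields exp(ikr)/r from two point sources and compares detected intensities |E|²; the geometry and the 500 nm wavelength are arbitrary assumptions.

```python
# Toy Huygens-style superposition: a detector only measures intensity |E|^2,
# but E itself is a phased sum of wavelet contributions exp(i k r)/r.
import cmath
import math

WAVELENGTH = 500e-9              # assumed: green light, 500 nm
K = 2 * math.pi / WAVELENGTH     # wavenumber

def field_at(obs, sources):
    """Sum the complex fields of spherical wavelets from point sources
    at an observation point; obs and sources are (x, y) pairs in metres."""
    total = 0j
    for sx, sy in sources:
        r = math.hypot(obs[0] - sx, obs[1] - sy)
        total += cmath.exp(1j * K * r) / r
    return total

# Two sources a quarter-wavelength either side of the axis. On the axis the
# path lengths are equal, so the wavelets arrive in phase and reinforce;
# off-axis the path difference introduces a phase difference and the sum
# is partially cancelled.
sources = [(0.0, WAVELENGTH / 4), (0.0, -WAVELENGTH / 4)]
on_axis = abs(field_at((1.0, 0.0), sources)) ** 2
off_axis = abs(field_at((1.0, 1.0), sources)) ** 2
print(f"intensity on axis: {on_axis:.3f}, off axis: {off_axis:.3f}")
```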
 
  • Like
Likes Klystron, vanhees71 and JasmineMasown
  • #13
Okay, thank you so much, everyone. Now I have understood what you are all getting at.

A mistake on my side was assuming that the LED was beyond our 'near distance' (which counts as a few centimetres too, I guessed), which led me to think that the light from both sources (the LED as well as the star) would appear the same (i.e. both would be sharp, so to say).

And another correction I'd like to make is about both of them being 'blurry'. Let me try to phrase things better. By 'blurry' I didn't mean to imply that my eyes wouldn't be able to focus on either of them. Instead, what I meant was that the light waves coming from both are basically just light waves, and we can't see individual rays of light in either case (sure, we can see clusters of rays in some cases, such as when the sun is partly covered by clouds). So in that way, they must be the same.

But now I've understood, thanks to the answers given by you all (in particular, jbriggs444 and A.T.), that since we are considering the LED to be very close to us (approx. less than 25 cm), the light rays emitted by the LED would illuminate a wider region of cone and rod cells, and that would be perceived as a 'blurry' (not sharp) image. Meanwhile, the light rays from the star, which is distant from us, would illuminate a narrower region of cones and rods, leading to the formation of a sharp image. From here, one can infer that a spherical wavefront (generated by a near source) would appear unfocused, while a planar wavefront (generated by a distant object) would appear sharp.

If there's any other mistake in my understanding, feel free to correct :)

Thank you again!
 
  • #14
JasmineMasown said:
Instead, what I meant was that light waves coming from both are basically light waves and we can't see individual rays of light
Waves and rays are just representations of the same thing. Rays are perpendicular to wavefronts and indicate their propagation direction.

JasmineMasown said:
From here, one can infer that a spherical wavefront (generated by a near source) would appear unfocused, while a planar wavefront (generated by a distant object) would appear sharp.
Which one appears sharp depends on how your eye is focused.
 
  • Like
Likes jbriggs444 and vanhees71
  • #15
A.T. said:
Which one appears sharp depends on how your eye is focused.
Yeah that's true. I've experimented with that thing a lot since my childhood.
 
  • #16
A.T. said:
Which one appears sharp depends on how your eye is focused.
But basically, here we are focusing on the scenario where both of the objects under consideration (the star and the LED) will be our centre of attention (at one moment one object, at the next moment the other)... Taking that into account, my assertion is still relevant, ain't it?
 
  • #17
JasmineMasown said:
But basically, here we are focusing on the scenario where both of the objects under consideration (the star and the LED) will be our centre of attention (at one moment one object, at the next moment the other)... Taking that into account, my assertion is still relevant, ain't it?
What assertion is that? Would it be:
JasmineMasown said:
From here, one can infer that a spherical wavefront (generated by a near source) would appear unfocused, while a planar wavefront (generated by a distant object) would appear sharp.
No, taking into account the "center of attention" of the eye does not make this inference correct.
 
  • #18
jbriggs444 said:
What assertion is that? Would it be:
No, taking into account the "center of attention" of the eye does not make this inference correct.
So what's the right answer??
 
  • #19
JasmineMasown said:
So what's the right answer??
A.T. said:
Which one appears sharp depends on how your eye is focused.
 
  • Like
Likes jbriggs444
  • #20
So, both of the wavefronts would appear the same depending on whether our eye is focused or not... Is that so?
 
  • #21
I'm totally perplexed now
 
  • #22
JasmineMasown said:
So, both of the wavefronts would appear the same depending on whether our eye is focused or not... Is that so?
What does "appear" mean here? Just the sharpness of the image, or also the depth perception done by the brain?
 
  • #23
With only one pointlike object to observe and with you concentrating on that object you will automatically focus your eyes to obtain maximum sharpness for the resulting image. The result will be a single illuminated point on your retina(s) regardless of whether the actual object is at 15 cm or at 15 light years.

You should be able to use binocular depth perception to recognize that the nearby object is nearby. No other good depth cues would be available with only a single bright point in the field of view.

You will not be able to see some faint illuminated spherical surface or any faint flat surface. What you do see will be due to the single illuminated point on your retina in either case.

Based on experimentation with my own eyes, the initial autofocus cue comes from binocular vision. The two eyes will slew to bring their respective views into coincidence so that the illuminated points coincide. This provides a distance measure to work with. Both eyes then proceed by focussing to this image depth. If sharpness is not obtained, they will "hunt" until a sharper image is obtained. If one interferes with the binocular depth phase (e.g. by looking through a chain-link fence, vertical slats on a fence, or at a piece of pegboard, so that aliasing is possible), the hunting behavior becomes noticeable. Being able to do this on demand is very handy for "spot the difference" picture pairs: just cross your eyes so that the two pictures coincide, then look for the shimmery/flickering areas where differences exist. I learned all this in a couple of days when I was twelve or thirteen and my dad put up pegboard on the wall next to my bed.
 
Last edited:
  • Like
Likes vanhees71 and JasmineMasown
  • #24
A.T. said:
What does "appear" mean here? Just the sharpness of the image, or also the depth perception done by the brain?
I meant the sharpness as well as the depth perception by the brain
 
  • #25
JasmineMasown said:
I meant the sharpness as well as the depth perception by the brain
You can make either one look sharp (not at the same time but alternating), but the distance perception can be different between the two states.
 
  • Like
Likes JasmineMasown
  • #26
jbriggs444 said:
You do understand that what we "see" is the result of activation of an array of illuminated pixels on the retina.

I have to jump in here: what you wrote above is really misleading and is barely a spherical-cow approximation. Individual rods and cones don't function like electronics, they are not in a regular array, and photoreceptors are not individually addressed by individual retinal ganglia. Conversely, retinal ganglia can attach to multiple photoreceptors, and those photoreceptors need not be neighboring.

The essential irreducible element of vision at the retinal ganglion level is the "receptive field", and there are both "on-center" and "off-center" fields.

There are approximately 7 layers of processing just at the retina (spatial averaging, temporal averaging, edge detection, movement detection etc), before the signals travel through the optic nerve to area V1 of the visual cortex. V1 performs another 5-9 layers of processing (orientation detection, binocular disparity, movement direction, etc) before the signal moves onwards towards other parts of the cortex- we have more than 30 distinct post-processing regions (the extrastriate cortex).

An excellent source for this topic is "Basic Vision" by Snowden, Thompson, and Troscianko.
 
  • Informative
  • Like
Likes JasmineMasown, difalcojr, Lord Jestocost and 3 others
  • #27
JasmineMasown said:
I learnt that wavefronts can be spherical, cylindrical or planar. But i wonder, if one were to observe light waves of different wavefronts with naked eyes, how would they appear?
The human eye has variable focus thanks to its flexible lens, which the eye adjusts for fine focus. So a wave emitted from a point-like source will appear identical no matter its 'shape' (which could be more accurately termed its curvature), as long as it can be brought to focus. All light waves reaching the eye will be either close to flat and planar, or curved in a spherical or near-spherical shape. As long as your eye can accommodate to bring the wave into focus, it will look like a dot.

Waves that cannot be brought to focus will look like a blurry dot, with the degree of degradation proportional to how far out of focus the wave is.
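That "proportional to how far out of focus" point can be made quantitative with simple thin-lens geometry. A minimal sketch, assuming a 4 mm pupil, a 17 mm lens-to-retina distance, and the lens held relaxed (focused at infinity); all names and numbers here are illustrative assumptions.

```python
# Blur-circle sketch: if the lens stays relaxed (focused at infinity), light
# from a near point source would come to focus BEHIND the retina, so on the
# retina it covers a disc. By similar triangles through the pupil:
#   blur_diameter = pupil_diameter * |d_image - d_retina| / d_image.

PUPIL_D = 0.004      # assumed 4 mm pupil diameter
RETINA = 0.017       # assumed lens-to-retina distance (m)
F_RELAXED = RETINA   # relaxed lens: parallel (planar) waves focus on the retina

def blur_circle(object_distance_m, f=F_RELAXED):
    """Diameter (m) of the blur disc on the retina for a point source,
    with the lens held at focal length f (thin-lens geometry)."""
    d_image = 1.0 / (1.0 / f - 1.0 / object_distance_m)  # where focus forms
    return PUPIL_D * abs(d_image - RETINA) / d_image

print(f"LED at 5 cm:  {blur_circle(0.05) * 1e3:.2f} mm blur disc")
print(f"lamp at 10 m: {blur_circle(10.0) * 1e6:.1f} micrometre blur disc")
```

The blur disc shrinks rapidly with distance (roughly 1.4 mm at 5 cm versus a few micrometres at 10 m with these numbers), matching the observation that the unaccommodated eye sees a very near source as badly blurred and a distant one as essentially a point.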
 
  • Like
Likes sophiecentaur, JasmineMasown and difalcojr
  • #28
Andy Resnick said:
. Individual rods and cones don't function like electronics, they are not in a regular array, and photoreceptors are not individually addressed by individual retinal ganglia.
The multiple interconnections between receptors and the brain are (in my view) a form of spatial filter. Given enough processing power, a digital processor could do a similar job; but of course our vision is adaptive and very smart, so we do pretty well, considering our optical hardware is, in many ways, pretty second-rate.
 
  • Like
Likes JasmineMasown
  • #29
Drakkith said:
The human eye has variable focus thanks to its flexible lens, which the eye adjusts for fine focus. So a wave emitted from a point-like source will appear identical no matter its 'shape' (which could be more accurately termed its curvature), as long as it can be brought to focus. All light waves reaching the eye will be either close to flat and planar, or curved in a spherical or near-spherical shape. As long as your eye can accommodate to bring the wave into focus, it will look like a dot.

Waves that cannot be brought to focus will look like a blurry dot, with the degree of degradation proportional to how far out of focus the wave is.
Thanks for that 😊
Got it!
 

Related to How Do Different Wavefront Shapes Appear to the Human Eye?

What is Huygens' principle and how does it explain the propagation of wavefronts?

Huygens' principle states that every point on a wavefront acts as a source of secondary spherical wavelets, and the wavefront at any subsequent time is the envelope of these secondary wavelets. This principle helps explain the propagation of waves by showing how wavefronts evolve over time as they encounter obstacles or openings.

How do different shapes of wavefronts appear to the human eye?

Different shapes of wavefronts from point-like sources do not look intrinsically different to the eye. A nearby source emitting spherical wavefronts and a distant source emitting effectively planar wavefronts both appear as sharp points of light once the eye accommodates to bring them to focus. What differs is the amount of accommodation the eye needs, and, for near sources, depth cues such as binocular convergence; a wavefront the eye cannot bring to focus appears as a blurred spot instead.

What happens to wavefronts when they encounter obstacles?

When wavefronts encounter obstacles, they can bend around the edges (diffraction), reflect off surfaces (reflection), or pass through openings and spread out (diffraction through apertures). The shape of the wavefront changes based on the size and shape of the obstacle relative to the wavelength of the wave.

Can Huygens' principle be applied to all types of waves?

Yes, Huygens' principle can be applied to all types of waves, including light waves, sound waves, and water waves. The principle is a general description of wave propagation and is not limited to any specific type of wave. However, the effects and patterns observed may vary depending on the medium and the wavelength of the waves.

How does the wavelength of a wave affect the appearance of wavefronts to the human eye?

The wavelength of a wave significantly affects the appearance of wavefronts. For light waves, shorter wavelengths (blue light) tend to diffract less and produce sharper images, while longer wavelengths (red light) diffract more and produce more blurred images. The human eye is more sensitive to certain wavelengths, which can also influence how wavefronts are perceived.
