RhysGM
- TL;DR Summary
- I'm trying to work out what "resolution" my telescope will have at various magnifications, and what the practical limit on magnification is, based on how much light there is and the size of the object.
Let's assume I have a telescope with an aperture of 200 mm, and that the light has a wavelength λ of 500 nm.
According to
http://labman.phys.utk.edu/phys222core/modules/m9/resolving_power.htm
θ_min = 1.22 λ/d, where θ_min is the minimum angular separation that can be resolved.
1.22 × 500 nm / 200 mm = 3.05
Assuming Jupiter has an apparent diameter of 49 arcseconds, and that we can resolve detail down to 3 arcseconds (which seems very low):
49 / 3.05 = 16
Assuming nm needs to be converted to mm?
16 × 1,000,000 ≈ 16,000,000, which seems way too high.
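To keep myself honest about the units, here is a minimal Python sketch of the same arithmetic with everything converted explicitly (the 206,265 arcseconds-per-radian factor is the conversion I think I'm missing above):

```python
# Rayleigh criterion: theta_min = 1.22 * wavelength / aperture gives an
# angle in RADIANS when both lengths are in the same unit.
wavelength_mm = 500e-6        # 500 nm expressed in mm (1 nm = 1e-6 mm)
aperture_mm = 200.0           # 200 mm aperture

theta_rad = 1.22 * wavelength_mm / aperture_mm   # ~3.05e-6 rad
theta_arcsec = theta_rad * 206265                # ~0.63 arcsec

jupiter_arcsec = 49.0
elements_across_jupiter = jupiter_arcsec / theta_arcsec   # ~78

print(f"theta_min = {theta_arcsec:.2f} arcsec, "
      f"{elements_across_jupiter:.0f} resolution elements across Jupiter")
```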
Let's now assume I have a telescope with a focal length of 2,000 mm and a 5 mm eyepiece with an apparent field of view of 68°. That gives a magnification of 400× and a true field of view of 0.17° (68 / (2000 / 5)). At 49 arcseconds, Jupiter has an apparent diameter of 0.0136° and would therefore fill about 8% of the field of view.
I can get a Barlow, a lens placed between the objective and the focal point that changes the angle of the light cone, effectively simulating a longer focal length and so increasing the magnification. However, as the aperture stays the same, the telescope's resolving power is unchanged. For instance, if I use a 4× Barlow to increase the effective focal length to 8,000 mm, the magnification becomes 1,600× and Jupiter will fill about 33% of the field of view. (At this magnification the exit pupil, 200 mm / 1,600 ≈ 0.13 mm, will be too small for the eye to use.)
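Here is a small Python sketch of the field-of-view numbers for both cases, the bare 5 mm eyepiece and the 4× Barlow (the magnification, true-field and exit-pupil formulas are the standard ones; calling Jupiter's apparent diameter divided by the true field of view its "fill" of the eyepiece is just my shorthand):

```python
def eyepiece_view(focal_length_mm, aperture_mm, eyepiece_mm, afov_deg,
                  barlow=1.0, target_arcsec=49.0):
    """Magnification, true field of view, exit pupil, and the fraction of
    the field spanned by a target of the given apparent diameter."""
    magnification = focal_length_mm * barlow / eyepiece_mm
    true_fov_deg = afov_deg / magnification        # approximate true field
    exit_pupil_mm = aperture_mm / magnification
    fill = (target_arcsec / 3600.0) / true_fov_deg
    return magnification, true_fov_deg, exit_pupil_mm, fill

for barlow in (1.0, 4.0):
    mag, tfov, pupil, fill = eyepiece_view(2000, 200, 5, 68, barlow)
    print(f"{barlow:.0f}x Barlow: {mag:.0f}x, TFoV {tfov:.4f} deg, "
          f"exit pupil {pupil:.2f} mm, Jupiter spans {fill:.0%} of the field")
# -> 400x:  TFoV 0.1700 deg, exit pupil 0.50 mm, ~8%
# -> 1600x: TFoV 0.0425 deg, exit pupil 0.12 mm, ~32%
```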
Ultimately I want to add some formulas to my astronomy spreadsheet that give a number I can compare across all my telescopes, eyepieces, and Barlows. I can then use particular combinations to understand what that number means in practice, and when I want to buy a new telescope, eyepiece, or Barlow, I can plug its values in and find out what I would expect to see. That way I can optimise the hardware for every deep-sky object I wish to view.
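As a sketch of the per-telescope numbers the spreadsheet could carry alongside the eyepiece figures above: the Rayleigh limit plus the usual rules of thumb for the lowest and highest useful magnification (the ~7 mm maximum exit pupil and the "2× aperture in mm" ceiling are rough rules of thumb, not hard limits):

```python
def telescope_summary(aperture_mm, focal_length_mm, wavelength_nm=500):
    """Per-telescope comparison numbers (rules of thumb, not hard limits)."""
    theta_arcsec = 1.22 * (wavelength_nm * 1e-6) / aperture_mm * 206265
    return {
        "rayleigh_limit_arcsec": round(theta_arcsec, 2),
        "lowest_useful_mag": round(aperture_mm / 7.0),  # ~7 mm exit pupil
        "highest_useful_mag": round(2 * aperture_mm),   # ~2x aperture (mm)
        "focal_ratio": round(focal_length_mm / aperture_mm, 1),
    }

print(telescope_summary(200, 2000))
# -> {'rayleigh_limit_arcsec': 0.63, 'lowest_useful_mag': 29,
#     'highest_useful_mag': 400, 'focal_ratio': 10.0}
```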
Does this make sense? Can anyone see a better way to understand how blurry or faint an object will be perceived?