Can better sensors extract unlimited info from an old lens?

In summary, the conversation revolves around whether a higher-resolution sensor can extract more information from existing lenses. The opponent claims this is possible, citing Fourier transforms of modulation transfer functions, while the original poster argues that there are limits to lens performance regardless of the recording medium. Another member references a book on sampled imaging systems that disproves the opponent's claim. The conversation also touches on resolving power in astrophotography.
  • #1
nugat
I placed this in the quantum physics forum, but since no comments came, either it belongs here or it is too trivial.
Please comment.

MY POST IN QUANTUM PH. FORUM
...
"
This problem I encountered on my photography forum.
A new camera model comes out with a much higher-megapixel sensor (Nikon D800, 36 MP). I argue that such a sensor will need new lenses to be meaningful. Others claim that a given lens's performance (here, resolution) is always fully utilized by a new, higher-resolution sensor. They call up Fourier transforms, modulation transfer functions, etc. I am not an engineer, more of an amateur philosopher. To me it smells of a Zeno paradox (a "quantum Zeno effect"?). Surely there must be a limit on the amount of information an ever-improving recording medium (sensor) can extract from the same old lens? Or not?
Or does this issue belong to classical physics?
I mean, in my mind I see a lemon running out of juice at some point of the squeeze.
Isn't the information coming through any given lens limited, then? No more photons to be converted into electrons? On one hand it's a question of the quantum efficiency of the CMOS sensor, but isn't there also a physical information limit of the light field?
So are those Fourier transforms of MTFs accurate descriptions, or are we at the point where technology stumbles on quanta?


OPPONENT'S ORIGINAL POST IN PHOTO FORUM

"... As I wrote the formula is an approximation. For accurate results one would need to multiply the MTFs by means of Fourier transforms.
Every image will show more resolution on the D800 with any lens at any aperture than on previous Nikons.
The resolution of a system is a function of the resolution of its parts, not only a single component.
The resolution of a lens plus sensor system can be approximated by:

1/Total resolution = 1/lens resolution + 1/sensor resolution. Or

Total resolution = (lens resolution)x(sensor resolution) / [ (lens resolution) + (sensor resolution) ]. "...
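As a quick sanity check, the reciprocal-sum approximation quoted above can be sketched in Python. The lens and sensor figures here are hypothetical, chosen only to illustrate the behavior:

```python
def total_resolution(lens_lpmm, sensor_lpmm):
    """Reciprocal-sum approximation for system resolution, in lp/mm:
    1/total = 1/lens + 1/sensor
    """
    return (lens_lpmm * sensor_lpmm) / (lens_lpmm + sensor_lpmm)

# Hypothetical figures: a 70 lp/mm lens on a 100 lp/mm sensor
print(total_resolution(70, 100))   # ~41.2 lp/mm: less than either component
# A finer sensor (150 lp/mm) on the same lens:
print(total_resolution(70, 150))   # ~47.7 lp/mm: better, but still below 70
```

Note that under this formula the total is always below the weaker component, and it only approaches the lens figure asymptotically as sensor resolution grows; it never exceeds it. That asymptote is exactly the limit being debated in this thread.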




MY POSITION IN PHOTO FORUM

"Resolution figures are meaningless without accompanying contrast data.
One has to put the cutoff line somewhere: what is acceptable. In cinematography the Zeiss Master Prime, delivering 70 lp/mm at 70% contrast, is the standard-bearer (at $25k and several pounds).
In photography I am willing to accept 50% contrast as a minimum performance measure.
If somebody wants to go down to MTF 25 or even 10, that is his choice. He will be able to point to "extra resolution" on his cranked-up monitor viewed at 100%.
The concept of a lens delivering an ever-increasing stream of visual information as the recording medium becomes more and more capable, is philosophically intriguing, to say the least. A "quantum Zeno effect" of sorts."



OPPONENT'S ANSWER IN PHOTO FORUM

"If you had read the thread I linked to further, you would have found that the approximation is based on multiplication of MTFs by means of Fourier transforms. You can pick any frequency (resolution) you want. Higher frequencies are generally used as a measure of resolution and lower ones as a measure of contrast.

The approximate formula is still valid and serves as a good first approximation to explain what can be expected from a system of lens plus sensor. The optical performance of a system of two components (lens and sensor) is not determined by just a single component but by both.

You can think of it as two MTFs stacked on top of each other."
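The "stacked MTFs" idea can be sketched numerically. The two curves below are assumptions, not measured data: a Gaussian-ish falloff standing in for a lens MTF, and the ideal box-detector sinc for a sensor of assumed 5 µm pixel pitch. The point is only that the system MTF is the frequency-by-frequency product of the component MTFs:

```python
import numpy as np

freqs = np.linspace(0, 200, 401)                 # spatial frequency, lp/mm
lens_mtf = np.exp(-(freqs / 90.0) ** 2)          # assumed lens MTF falloff
pixel_pitch_mm = 0.005                           # assumed 5 µm pixel pitch
sensor_mtf = np.abs(np.sinc(freqs * pixel_pitch_mm))  # ideal detector MTF

system_mtf = lens_mtf * sensor_mtf               # MTFs multiply per frequency
```

Since both factors are at most 1, the product can never exceed the lens MTF alone: a finer-pitch sensor pushes the system curve up toward the lens curve, but never past it. This is consistent with both sides of the argument: every pitch reduction improves the system somewhat, yet the lens curve remains a hard ceiling.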
 
  • #2
nugat said:
I placed this in the quantum physics forum, but since no comments came, either it belongs here or it is too trivial.
Please comment.

<snip>

OPPONENT'S ORIGINAL POST IN PHOTO FORUM

"... Every image will show more resolution on the D800 with any lens at any aperture than on previous Nikons.
<snip>

It's hard for me to figure out what you are asking, but the sentence I pulled out of the post is false. A blurry image will still look blurry at increased sampling. I haven't seen any side-by-side comparisons of the new D800 and (say) the D7000, but the claim is that some high-quality lenses will show improved performance on the D800.

What exactly are you asking?
 
  • #3
There is a strong belief among some photo pundits that a higher-resolution sensor will always extract extra resolution from all existing lenses. This is supposedly based on Fourier transforms of modulation transfer functions of the imaging chain.
Is it true?
Do math models of imaging sciences give support to such claims?

I have had the impression that there is something like a lens performance limit, independent of the measuring (recording) component of the imaging chain. If a lens delivers 70 lp/mm at MTF 70 as measured on MTF equipment, that should be it. No sensor should extract 90 lp/mm at MTF 70 from this lens.
Am I wrong?
TIA
 
  • #4
nugat said:
There is a strong belief among some photo pundits that a higher-resolution sensor will always extract extra resolution from all existing lenses. This is supposedly based on Fourier transforms of modulation transfer functions of the imaging chain.
Is it true?

No.

nugat said:
Do math models of imaging sciences give support to such claims?

On the contrary, the results demonstrate the claims are false. See, for example, "Analysis of Sampled Imaging Systems":

https://www.amazon.com/dp/0819434892/?tag=pfamazon01-20

nugat said:
I have had the impression that there is something like a lens performance limit, independent of the measuring (recording) component of the imaging chain. If a lens delivers 70 lp/mm at MTF 70 as measured on MTF equipment, that should be it. No sensor should extract 90 lp/mm at MTF 70 from this lens.
Am I wrong?
TIA

If I understand you, then you are correct- the lens has become the limiting factor in your system performance.
 
  • #5
Thanks. I ordered the book. Time to brush up on optoelectrical engineering.
 
  • #6
Note, this is a common problem in astrophotography. There are online and downloadable calculators for matching a camera to a telescope based on resolution. The short version, though, is that it does you no good to have a camera with a much higher resolution than your telescope.
 
  • #7
Thank you for the info. Could you pass me any link to such calculators?
It's good to know that the issue is understood and long resolved among professionals.
Out of curiosity, how do you define the resolving power of astrophotography glass? In lp/mm?
TIA.
 
  • #8
There's no single best metric for 'resolving power', because the concept is ill-defined. One could specify the point spread function over the exit pupil, but in the context of adaptive optics it likely makes more sense to specify the actual wavefront at the exit pupil.
 
  • #9
Like counting all the photons and their energies?
 
  • #10
nugat said:
Like counting all the photons and their energies?

I don't understand what you mean; typically, photon counting is not used as a wide-field imaging method (although it is used as a confocal-type imaging method).
 
  • #11
nugat said:
Thank you for the info. Could you pass me any link to such calculators?
It's good to know that the issue is understood and long resolved among professionals.
Out of curiosity, how do you define the resolving power of astrophotography glass? In lp/mm?
TIA.

In a "perfect" system it would be the smallest spot the telescope or camera is capable of making, based on the diameter of the objective and the wavelength of light. Attempting to decrease pixel sizes to less than about half the size of the Airy disc would be a waste, as it would not capture any more detail. Note that real systems have imperfections and aberrations, such as dispersion, coma, and astigmatism, that degrade the image even further.

http://en.wikipedia.org/wiki/Airy_disc
http://en.wikipedia.org/wiki/Optical_resolution
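As a rough sketch of the diffraction limit described above, using the standard Airy-disc formula (the f/10 focal ratio and 550 nm wavelength here are just example values):

```python
def airy_disc_diameter_um(wavelength_nm, f_ratio):
    """Diameter to the first minimum of the Airy pattern at the focal
    plane: d = 2.44 * lambda * N, where N is the focal ratio."""
    return 2.44 * wavelength_nm * 1e-3 * f_ratio

# Example: an f/10 telescope in green light (550 nm)
d = airy_disc_diameter_um(550, 10)   # ~13.4 µm
# Pixels much smaller than about half that (~6.7 µm) add little detail
print(d, d / 2)
```

For this example system, a camera with 3 µm pixels would be oversampling the optics; the extra pixels record blur, not detail.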
 
  • #12
Andy Resnick said:
I don't understand what you mean; typically, photon counting is not used as a wide-field imaging method (although it is used as a confocal-type imaging method).

Excuse my unscientific language, I am just an amateur.
I was thinking of a model that describes resolution in terms of photons registered vs. photons sent out from a strict pattern: how many photons travel through the lens and arrive at the proper positions in space.
Or maybe it's total nonsense.
 
  • #13
nugat said:
Thank you for the info. Could you pass me any link to such calculators?
http://www.newastro.com/downloads/index.php
http://www.astro.shoregalaxy.com/index_010.htm
http://www.ccd.com/ccd113.html
http://geogdata.csun.edu/~voltaire/pixel.html

There is some debate in these links about just how many pixels you need to cover the telescope's resolution limit, but they all agree that the resolution limit is not dependent on the number of pixels. In other words, one common method for assessing resolution is being able to "split" a double star (see both stars). No amount of additional pixels (beyond a well-matched camera/scope) will change whether a certain telescope can split a certain double star.

Also, in astrophotography there is a very serious downside to too many pixels: the pixels don't actually touch each other, so the more pixels you have, the more of the imaging surface is covered with borders between pixels instead of pixels. That reduces the sensitivity of the camera, in addition to the loss in sensitivity caused by the smaller pixels themselves. This issue of course also exists for regular photography; it just isn't as important.
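The double-star test above corresponds to the Rayleigh criterion, which depends only on the aperture and the wavelength, not on the sensor. A minimal sketch (the 200 mm aperture is just an example value):

```python
import math

def rayleigh_limit_arcsec(wavelength_nm, aperture_mm):
    """Rayleigh angular resolution limit: theta = 1.22 * lambda / D."""
    theta_rad = 1.22 * (wavelength_nm * 1e-9) / (aperture_mm * 1e-3)
    return math.degrees(theta_rad) * 3600.0

# A 200 mm aperture at 550 nm resolves ~0.69 arcsec;
# adding sensor pixels cannot change this number.
print(rayleigh_limit_arcsec(550, 200))
```

A double star separated by less than this angle stays merged no matter how finely the focal plane is sampled, which is the point made above.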
 
  • #14
There is a counter-argument for more pixels, as a way of reducing the effect of noise in the sensor: spreading the noise over a wider spatial-frequency range and filtering out the part that can't possibly be "signal" because it is beyond the resolving power of the optics.

But for "normal" photography I don't think that has much relevance, especially considering that many digital images are filtered through lossy compression schemes (e.g. JPEG) anyway.

FWIW, this principle is applied in other digital data acquisition systems; e.g., pro audio recording and processing often uses sampling rates much higher than human hearing can process, and much higher than what is finally stored on CDs or DVDs, to minimize noise.
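That oversampling argument can be demonstrated with a toy numpy experiment: white noise spread across an 8x-oversampled band, then low-pass filtered down to the fraction of the band the "optics" could actually pass. All the numbers here are illustrative only:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 8192
oversample = 8                       # sample 8x faster than the signal band needs

noise = rng.standard_normal(n)       # white noise fills the whole sampled band

# Low-pass filter: keep only the lowest 1/oversample of the spectrum
spectrum = np.fft.rfft(noise)
cutoff = len(spectrum) // oversample
spectrum[cutoff:] = 0
filtered = np.fft.irfft(spectrum, n)

# Noise power in the kept band drops by roughly the oversampling ratio
print(noise.var() / filtered.var())  # ~8
```

Because white noise spreads its power evenly across the sampled band, discarding the 7/8 of the band that is beyond the optics removes roughly 7/8 of the noise power while (for a band-limited signal) leaving the signal intact.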
 

FAQ: Can better sensors extract unlimited info from old lens?

What is the purpose of using better sensors with old lenses?

The purpose of using better sensors with old lenses is to extract more detailed and accurate information from the lens, resulting in higher quality images.

Are there any limitations to using better sensors with old lenses?

While better sensors can improve the performance of old lenses, they cannot completely compensate for any physical limitations or defects in the lens itself.

Do all old lenses benefit from better sensors?

Not all old lenses will benefit from better sensors. The effectiveness of using better sensors depends on the quality and condition of the lens.

Can better sensors extract unlimited information from old lenses?

No, better sensors have their own limitations and cannot extract unlimited information from old lenses. The amount of information that can be extracted depends on the capabilities of the sensor and the quality of the lens.

How do better sensors improve the performance of old lenses?

Better sensors have higher resolution and sensitivity, which allows them to capture more detailed information from the lens. This results in sharper and more detailed images.
