Have sensors really passed the resolution of lenses? NO!

Discussion in 'Digital Cameras' started by RichA, Feb 11, 2012.

  1. RichA

    RichA Guest

    I keep reading that statement, that they have.
    Does this mean that putting a good lens on a 24-megapixel camera (not
    to mention a 36 MP one) rather than a 16-megapixel camera will yield no
    extra resolution? Has this been anyone's experience, even with some
    marginal lenses, like Sony's 16mm pancake? I'd urge people to
    actually do some tests rather than accept it at face value.
    The reason is that there are tiny-sensored cameras out there that will
    resolve a decent amount of detail with their lenses, and therefore
    scaling those sensors up to the size of larger sensors would disprove
    the idea that sensors have now out-resolved lenses.
    This could be proven quite handily by mounting a DSLR lens on a
    diminutive Pentax Q with an adapter, or even by using Nikon's V1 as both
    cameras: their sensors, scaled up to, say, APS size, would exceed the
    pixel count of any DSLR (including the D800), and it would be shown
    that the lenses are capable of delivering even more detail.
     
    RichA, Feb 11, 2012
    #1

  2. The answer is, as you well know, "it depends". With some DSLR lenses, 10
    MP may well be enough to capture all they can offer when wide open. When
    viewing a "normal size" print at "normal" distance, or looking at an image
    on a computer screen or CRT display (not pixel peeping), could you even
    tell the difference between a 16 MP and a 24 MP print? I doubt it.

    To get the full resolution out of a 36 MP sensor will likely require the
    best of lenses and some of the best photographic technique (tripod,
    careful focusing, reducing mirror slap, etc.).
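
    A rough back-of-the-envelope sketch of the print argument (a Python toy,
    not a measurement; the 3:2 aspect ratio and the ~1 arc-minute
    visual-acuity figure are my own assumptions):

    import math

    def print_ppi(megapixels, print_width_in, aspect=3 / 2):
        """Pixels per inch across the long edge of a print of the given width."""
        px_wide = math.sqrt(megapixels * 1e6 * aspect)
        return px_wide / print_width_in

    def eye_limit_ppi(viewing_distance_in, acuity_arcmin=1.0):
        """Rough ppi the eye can resolve at this distance (~1 arcmin acuity assumed)."""
        detail_in = viewing_distance_in * math.tan(math.radians(acuity_arcmin / 60.0))
        return 1.0 / detail_in

    for mp in (16, 24, 36):
        print(f"{mp} MP on a 12-inch-wide print: {print_ppi(mp, 12):.0f} ppi")
    print(f"Eye limit at 12 inches: about {eye_limit_ppi(12):.0f} ppi")

    On those assumptions all three prints already exceed what the eye resolves
    at that viewing distance, which is why the difference only shows up when
    pixel peeping.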
     
    David J Taylor, Feb 11, 2012
    #2

  3. Consider that many people put the resolution of Kodachrome 35mm film at
    around 100-250 megapixels.
     
    Kenneth Scharf, Feb 11, 2012
    #3
  4. RichA

    Guest Guest

    Consider that many people are stupid. There's absolutely *nothing* that
    can support that nonsense.
     
    Guest, Feb 11, 2012
    #4
  5. RichA

    Ray Fischer Guest

    According to the modulation-transfer diagram for Kodachrome 25, a good
    estimate for an equivalent digital image is about 20 megapixels.

    http://www.kodak.com/global/en/professional/support/techPubs/e55/e55.pdf

    (70 cycles/mm * 2 pixels/cycle * 36 mm) * (70 * 2 * 24 mm) ~ 17 megapixels
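
    For anyone who wants to check or vary the numbers, the arithmetic is
    trivial to reproduce (a sketch only; 36 x 24 mm is the standard full
    frame, and 2 pixels/cycle is bare Nyquist sampling of the quoted cutoff):

    cycles_per_mm = 70             # read off Kodak's published MTF curve above
    px_per_mm = cycles_per_mm * 2  # 2 pixels per cycle (Nyquist)
    megapixels = (px_per_mm * 36) * (px_per_mm * 24) / 1e6
    print(megapixels)              # about 17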
     
    Ray Fischer, Feb 11, 2012
    #5
  6. RichA

    Vance Guest

    This is as deeply and well thought out as any post from you I have
    read. Of course, I only read your posts when I am in the mood, so you
    may have done better and I wouldn't know.

    It's an interesting question. It seems from your heading that your
    answer is "no", while David Taylor's more qualified opinion in his post
    is that it all depends, which is intuitively more appealing.
    Postings to newsgroups understandably limit how full an explanation
    can be, which is a good reason to link to outside resources. Luminous
    Landscape, using their greater resources and probably coming from a
    more knowledgeable perspective than most of us have, reached the same
    conclusion as David Taylor, and they definitely aren't alone.

    Quoting Luminous Landscape's conclusion in full:

    'So, do sensors outresolve lenses? It depends on the lens you use, the
    properties of the light, the aperture and the format. Small format
    sensors may have surpassed the limit, this is, in most cases they are
    lens-limited in terms of resolution. It is easier to correct
    aberrations for a smaller light circle though, so you can approach
    diffraction-limited resolutions for lower f-numbers. The signal-to-
    noise ratio, however, imposes an inflexible limit to the effective
    resolution of the whole system, mostly due to photon shot noise.

    Sensors for larger formats are approaching the diffraction limit of
    real lenses, and it is more difficult to get high levels of aberration
    suppression for them. The point is that you cannot fully exploit the
    resolution potential of high-resolution sensors with regular mass-
    produced lenses, particularly for larger formats.

    You cannot compare the limits of two different photographic systems
    looking at a print because the variables that determine the subjective
    perception come into play. Different systems can provide comparable
    results on paper under certain conditions (the circle of confusion
    reasoning explains how that is possible), but the limit of a system
    must be evaluated considering the pixel as the minimum circle of
    confusion.'

    For those who want a deeper understanding of the question, or want to
    see if the conclusion is sound, the link is:

    http://luminous-landscape.com/tutorials/resolution.shtml
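
    As a rough sanity check on the diffraction side of that argument, the
    ideal-lens cutoff can be compared with a sensor's Nyquist frequency (a
    sketch only; the 4.9 um D800-class pitch, the 550 nm wavelength and the
    aberration-free lens are my assumptions, and real lenses sit well below
    this ideal cutoff):

    def diffraction_cutoff_lp_mm(f_number, wavelength_nm=550):
        """Cutoff frequency of an ideal, aberration-free lens, in line pairs/mm."""
        return 1.0 / (wavelength_nm * 1e-6 * f_number)

    def sensor_nyquist_lp_mm(pixel_pitch_um):
        """Nyquist frequency of the sensor sampling grid, in line pairs/mm."""
        return 1000.0 / (2.0 * pixel_pitch_um)

    print(sensor_nyquist_lp_mm(4.9))      # ~102 lp/mm for a ~4.9 um pitch
    print(diffraction_cutoff_lp_mm(8))    # ~227 lp/mm at f/8, 550 nm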

    Your approach to 'proving' that sensors haven't outstripped lens
    resolution seems weak and way too much bother. In deference to
    people who know more than I do, which seems reasonable, I have to go
    with a less categorical answer than either 'yes' or 'no' and say
    'maybe', in spite of how commonly you hear 'sensors have outstripped
    lens resolution.'

    Vance
     
    Vance, Feb 11, 2012
    #6
  7. RichA

    Rich Guest

    Well, that stuff has been around for a while, but in truth they are
    referring mostly to scans of the film and the relative size of the files
    that result from those scans. That isn't resolution, it's the recording of
    the grain structure of the film, which is why a coarse-grained film will
    often require even more memory than a higher-resolution, fine-grained film.
    Apart from specialty films meant for ultra-high (by 35mm film standards)
    resolution, films max out at about 8 megapixels' worth of detail compared
    to equivalent digital images.
     
    Rich, Feb 12, 2012
    #7
  8. You want more than 2 pixels/cycle[1]. You want to seriously
    oversample[1]. You want 0% response, not 10%[2].

    (130 cycles/mm * 3 pixels/cycle * 3 (oversampling))^2 * 24 mm * 36 mm
    => more than 1 GPix. See? :)
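
    To see how much the answer swings with those sampling choices, the same
    arithmetic can be parameterised (a sketch; the two example lines simply
    mirror the 70 cycles/mm figure quoted earlier in the thread and the
    oversampled figures above):

    def equivalent_pixels(cycles_per_mm, pixels_per_cycle, oversample=1,
                          frame_w_mm=36, frame_h_mm=24):
        """Pixel count implied by a resolution criterion plus a sampling policy."""
        px_per_mm = cycles_per_mm * pixels_per_cycle * oversample
        return px_per_mm ** 2 * frame_w_mm * frame_h_mm

    print(equivalent_pixels(70, 2) / 1e6)      # ~17 MP  (10% response, bare Nyquist)
    print(equivalent_pixels(130, 3, 3) / 1e9)  # ~1.2 GP (0% response, 3x oversampled)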

    -Wolfgang

    [1] http://clarkvision.com/articles/sampling1/index.html
    [2] http://clarkvision.com/articles/scandetail/index.html#modulation
     
    Wolfgang Weisselberg, Feb 23, 2012
    #8
  9. Yes, and you can extend the MTF to the whole system, including any
    filtering on the sensor (e.g. anti-alias filtering), the effects of any
    processing, the MTF of the screen or printer, and the MTF of the eye, to
    predict the image on the retina.

    However, it can be convenient to take some defined point of the MTF (e.g.
    30%), or the integrated area under the MTF curve, as a single equivalent
    "resolution" measure, but as this discards the shape of the MTF curve it
    doesn't tell the whole story.
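
    A minimal sketch of that cascade idea (the component values below are
    made-up illustrative numbers at one chosen spatial frequency, not
    measurements; the point is only that the stages multiply, assuming each
    one is linear and shift-invariant):

    components = {
        "lens": 0.55,
        "aa_filter": 0.80,
        "sensor_sampling": 0.64,
        "printer": 0.70,
    }

    system_mtf = 1.0
    for stage, mtf in components.items():
        system_mtf *= mtf          # system MTF is the product of the stages

    print(f"system MTF at that frequency: {system_mtf:.2f}")   # ~0.20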

    David
     
    David J Taylor, Feb 25, 2012
    #9
  10. RichA

    Alfred Molon Guest

    If you take 30% as the limit, the lens resolution will be an extremely
    low one. Published MTF charts typically only go up to 40 lp/mm, and
    already at 40 lp/mm the MTF is very, very low.
    On the other hand, with 4 micrometer pixels the Nyquist limit is 125
    lp/mm, and you can imagine what kind of MTF you get at 125 lp/mm -
    probably just a few percent.
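
    For a feel for the ceiling involved, the MTF of a perfect, diffraction-
    limited lens can be computed directly (a sketch; f/8 and 550 nm are my
    assumptions, and aberrations pull real lenses well below this ideal
    curve):

    import math

    def diffraction_mtf(lp_mm, f_number, wavelength_nm=550):
        """MTF of an ideal aberration-free lens - an upper bound for any real lens."""
        cutoff = 1.0 / (wavelength_nm * 1e-6 * f_number)   # lp/mm
        s = lp_mm / cutoff
        if s >= 1.0:
            return 0.0
        return (2.0 / math.pi) * (math.acos(s) - s * math.sqrt(1.0 - s * s))

    for freq in (40, 125):
        print(f"{freq} lp/mm at f/8: {diffraction_mtf(freq, 8):.2f}")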
     
    Alfred Molon, Feb 25, 2012
    #10
  11. RichA

    Alfred Molon Guest

    ?????

    I really don't understand what you mean.
     
    Alfred Molon, Feb 25, 2012
    #11
  12. RichA

    Bruce Guest



    Don't worry, Alfred. Browne doesn't understand either.
     
    Bruce, Feb 25, 2012
    #12
  13. RichA

    Alan Browne Guest

    What you said:
    QUOTE
    Lenses do not have a resolution. There is something called
    modulation transfer function (MTF) which tells you what
    the response is at a certain spatial frequency.
    /QUOTE

    However, as you go up in spatial frequency, the contrast ratio falls
    off. When you can no longer usefully distinguish between two closely
    spaced lines, you're at the limit for that lens. MTF is the means of
    establishing the lens resolution.

    Ref: http://en.wikipedia.org/wiki/Modulation_transfer_function

    Q The optical transfer function (OTF) of an imaging system (camera,
    video system, microscope etc.) is the true measure of resolution
    (image sharpness) that the system is capable of.
    /Q

    And: http://en.wikipedia.org/wiki/Optical_resolution#Lens_resolution
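
    A small numerical illustration of the relationship in question (a sketch
    with a made-up Gaussian line spread function; the point is only that the
    OTF is the Fourier transform of the line spread function, the MTF is its
    magnitude, and a "resolution" figure is just the frequency where that
    curve drops below some chosen threshold):

    import numpy as np

    x_um = np.arange(-200, 200)                  # sample positions, 1 um apart
    lsf = np.exp(-0.5 * (x_um / 4.0) ** 2)       # illustrative ~4 um blur

    otf = np.fft.rfft(lsf / lsf.sum())           # OTF = transform of the LSF
    mtf = np.abs(otf)                            # MTF discards the phase
    freq_lp_mm = np.fft.rfftfreq(len(x_um), d=1e-3)  # 1 um = 1e-3 mm spacing

    threshold = 0.30                             # e.g. the MTF30 criterion
    resolution = freq_lp_mm[np.argmax(mtf < threshold)]
    print(f"MTF30 'resolution': about {resolution:.0f} lp/mm")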
     
    Alan Browne, Feb 25, 2012
    #13
  14. RichA

    Alfred Molon Guest

    Now it's getting clearer, but the way you initially put it ("higher
    spatial-frequency contrast _is_ lens resolution") means nothing. You
    meant something else. By the way, this is the first time I've heard of
    this OTF.
     
    Alfred Molon, Feb 25, 2012
    #14
  15. RichA

    Alan Browne Guest

    What set me off was when you said "lenses do not have a resolution".
    (They do.) And then you brought up MTF, which is how you define/find the
    resolution. The phase information isn't useful to photography - as we
    do it, anyway.

    MTF is a component of OTF (see the equation). We're concerned with
    magnitude (which drives exposure), not phase - MTF "discards" the phase
    info from the OTF.

    If a phase sensor could be made - and made very small - I guess
    resolutions could be orders of magnitude higher and no lens at all would
    be needed.
     
    Alan Browne, Feb 25, 2012
    #15
  16. RichA

    Joe Kotroczo Guest

    On 25/02/2012 19:55, Alan Browne wrote:
    (...)
    If you record phase as well as modulation, in essence you'd have digital
    holography, no?
     
    Joe Kotroczo, Feb 25, 2012
    #16
  17. RichA

    Alan Browne Guest

    You certainly have the information to focus at any distance and resolve
    for any desired DOF. Not sure about holography.

    The Wikipedia article is a good start; from it:

    [1] A hologram represents a recording of information regarding the light
    that came from the original scene as scattered in a range of directions
    rather than from only one direction, as in a photograph. This allows the
    scene to be viewed from a range of different angles, as if it were still
    present.

    [3] A holographic recording requires a second light beam (the reference
    beam) to be directed onto the recording medium.

    and much more, of course.
     
    Alan Browne, Feb 25, 2012
    #17
  18. RichA

    Ray Fischer Guest

    Absolute accuracy is neither the goal nor is it possible.
    This is a useful guideline.

    If you want to indulge yourself in pointless pedantry then go ahead.
     
    Ray Fischer, Feb 26, 2012
    #18
  19. RichA

    Joe Kotroczo Guest

    Plenoptic camera maybe?
    Yes, but that's for analog holograms, with a physical recording medium
    such as photopolymers or photorefractives.

    If you look at <http://en.wikipedia.org/wiki/Digital_holography> you'll
    find "The phase-shifting digital holography process entails capturing
    multiple interferograms that each indicate the optical phase
    relationships between light returned from all sampled points on the
    illuminated surface and a controlled reference beam of light that is
    collinear to the object beam"

    Not that I truly understand any of it... ;-)
     
    Joe Kotroczo, Feb 26, 2012
    #19
  20. RichA

    Alan Browne Guest

    ? (Okay, I'll go look it up...)

    Sure, like the Lytro. Though the way I read the paragraph on Wikipedia,
    it's not as described in our flight of fancy above.
    Me neither - but I believe it entails multiple POVs, not a single POV.
     
    Alan Browne, Feb 26, 2012
    #20
