Has Foveon future?

Discussion in 'Digital Cameras' started by ThomasH, Oct 17, 2003.

  1. Rob Davison Guest

    In the context we are using it, it does (or, more accurately, it usually
    does) if the samples are taken from different physical locations,
    as they are with Bayer-pattern sensors.

    If a hick from the backwoods of New Zealand can understand this, then
    surely a suave, sophisticated pro can grasp it too?
    That's right!
    That's wrong, however... and here is how you prove it:

    Take a full-size 10D image (one of those on dpreview will do).
    Scale it to 1500x1000 pixels in the image editor of your choice.
    Scale that back to 3072x2048.
    Compare with the original at the pixel level.

    The (very considerable) difference you will notice is the difference
    between the truth and the line of bovine excreta George is pushing.
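    The round-trip test above can be sketched in a few lines of NumPy. This is an illustrative stand-in, not the exact recipe: random noise replaces a real 10D frame, and an integer 2x scale replaces 1500x1000, since the conclusion is the same; downscaling discards detail that upscaling cannot restore.

```python
import numpy as np

# Stand-in for a full-size 10D frame: random fine detail (assumption;
# any real image with pixel-level detail behaves the same way).
rng = np.random.default_rng(0)
orig = rng.integers(0, 256, size=(2048, 3072), dtype=np.uint8).astype(np.float64)

# Downscale by 2x block averaging (a crude stand-in for an editor's resample)...
small = orig.reshape(1024, 2, 1536, 2).mean(axis=(1, 3))
# ...then scale back up by pixel repetition.
back = small.repeat(2, axis=0).repeat(2, axis=1)

# Compare with the original at the pixel level: the detail removed by
# the downscale is simply gone.
rms = np.sqrt(((orig - back) ** 2).mean())
print(f"RMS pixel difference: {rms:.1f}")
```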


    It's not the fact that you're talking bullshit, George.
    It's the fact that you damn well *know* you're talking bullshit.

    Unlike some of the other folks hereabouts, I don't find much amusement in
    baiting you, and discourse with someone as intellectually dishonest as
    you are is simply going to be a waste of time.

    Enjoy your camera.
     
    Rob Davison, Nov 14, 2003

  2. You really don't want to understand, do you?
    The Sigma software sharpens, and you don't sharpen the Canon file.
    So much for a fair comparison, but then you never wanted a fair comparison,
    did you?
    The Raw file is 2048x3072 pixels (6MP remember?), your JPEG example is
    1024x1536. I know math can be challenging for some, but this is something
    even you must be able to figure out. But then, I'm an optimist...
    I had a look, and properly sharpened it already looks much better than an
    SD9 image, and when you print it the difference is even clearer. Or do you
    also deny that you need to interpolate the 1512x2268 pixel SD9 file by a
    factor of 135% to reach the same size as the 10D?
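    A quick check of the arithmetic here, assuming the published output sizes for both cameras:

```python
# Pixel counts and the linear scale factor needed to match the
# SD9 output size to the 10D output size.
sd9_w, sd9_h = 2268, 1512      # SD9 native output
c10d_w, c10d_h = 3072, 2048    # Canon 10D output

print(sd9_w * sd9_h)           # ~3.43 million pixels
print(c10d_w * c10d_h)         # ~6.29 million pixels
print(f"{c10d_w / sd9_w:.3f}") # ~1.354, i.e. roughly the 135% mentioned
```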
    So it seems it's you that's wrong, again, on all counts.

    Bart
     
    Bart van der Wolf, Nov 14, 2003

  3. No it doesn't. You'd have the same or better optical resolution if the 3
    sensors were at the same location. What makes you think taking a sample at
    the wrong place increases the number of optical samples, or the value of
    each?
    I think a hick would laugh at the notion that spreading an R, G, and B sensor
    farther and farther apart increases the amount of optical data they capture
    when ultimately assembled into a single pixel, or in Bayer's case, 4 pixels
    centered on the same color (minus artifact variance).
    No kidding!
    Now you must be kidding. So 1 sensor is worth more than 1 sensor as long as
    you move it away from the pixel location it is trying to sense.
    Let's see it.
     
    George Preddy, Nov 14, 2003
  4. The software has a sharpness slider, I don't use it.
    Image sizes are
    .. 3072 x 2048
    .. 2048 x 1360
    .. 1536 x 1024
    Now where have I seen that before?
    Even I can figure it out. You can't, however.
    Obviously you didn't. Ooops.
    I think you should reconsider posting again.
     
    George Preddy, Nov 14, 2003
  5. Chris Brown Guest

    RTFM - the default setting is to apply significant sharpening.

    You are comparing sharpened images with non-sharpened images, and concluding
    the sharpened image looks, um, sharper. Amazing!

    In next week's "George Teaches Digital Photography", we see how SD9 images
    taken with a 50mm lens are much sharper than 10D images taken with no lens
    at all.
     
    Chris Brown, Nov 14, 2003
  6. Ray Fischer Guest

    And you have not.
    That statement shows your ignorance. Digital interpolation has
    nothing to do with optics.
    Which makes you pretty dishonest for continuing to insist that it is
    meaningful.
    So you divide 6.3 by 3 and get 1.5.

    You _are_ a stupid little idiot.
     
    Ray Fischer, Nov 14, 2003
  7. Ray Fischer Guest

    Typical coward: You lie and then run away when challenged.
     
    Ray Fischer, Nov 14, 2003
  8. Sure I can, you deliberately chose the lowest quality to favor the SD9. So
    predictable.
    Why, don't you like the truth?

    Bart
     
    Bart van der Wolf, Nov 14, 2003
  9. No - but 2 sensors moved apart are worth more than
    2 sensors in the same location.

    Oops - I said goodbye to George earlier and here I go
    again - answering the nonsense. But this one was
    so obvious that I could not resist.


    Roland
     
    Roland Karlsson, Nov 14, 2003
  10. Kevin Guest

    ooops. Must have been reading this through the sigma ccd.
     
    Kevin, Nov 14, 2003
  11. More precisely, 2 sensors moved apart are worth more than 2 in the same
    location for resolving spatial detail. Two sensors in the same place,
    sensitive to different colours, are better for resolving colour without
    improving spatial detail. You can pick one or the other.

    The problem is that George wants to count the extra sensors as *both*
    improving colour and improving spatial resolution. They only do the
    former, not the latter.

    Dave
     
    Dave Martindale, Nov 14, 2003
  12. Does Sigma make spectacle lenses?

    Dave
     
    Dave Martindale, Nov 14, 2003
  13. Really, how many colors do 2 sensors "really" sense, if I move them farther
    and farther from the pixel origin?
     
    George Preddy, Nov 14, 2003
  14. If you had full color capability at each location, you'd actually be right,
    but the scenario is 1/3rd color capability at 3 displaced locations, vs full
    color capability at one proper location.

    The former is simply a less accurate way to sense the exact amount of
    mutually exclusive (so no benefit whatsoever to spreading out) optical
    information.
     
    George Preddy, Nov 14, 2003
  15. Yes, I agree, somewhat depending on the interpolation algorithm, I suppose.
    I've said this all along. The perfect example is a fully black image: in
    this special case (no light) the "6MP" Bayer is going to get 100% of its 6
    million interpolated pixels 100% correct, while the Foveon, dumbly, is going
    to combine RGB sets to make 1 black pixel from 3 black sensors.

    Problem is, we are talking about digital color sensors sensing color targets
    (so there is light); even fully gray-scale subjects present an RGB-dependent
    problem. And don't get me wrong, I'm not saying the Foveon is always going
    to outperform Bayer to the theoretical full-color spectrum ratio; no sensor
    "knows" in advance, during design, what the image color distribution is
    going to be. But this is true of all designs, so pointing to a specific
    color distribution and saying design X is better there, and design Y is
    better here, is a futile and useless exercise; what matters is average, or
    "full color" performance. Or, put another way, full-spectrum performance.

    Bayer proponents like to say that green is all that matters. It's not; that
    is purely a scam originating in their absolute requirement to double (read:
    throw away) one primary, any primary, in a 2x2 scalable grid.
    Disproportionately more green info is not embedded in the final image, or it
    would be disproportionately more green. What is embedded is more luminance
    info: if the extra sensors picked up any green component, this helps to
    guess luminance a tiny bit better in the 4 exactly-the-same-color pixels (if
    there were no artifacts) that a single Bayer RG(G)B set outputs for each
    discrete color sensed. Problem is, it's a false assumption: luminance only
    needs to be "estimated" at all because the RGB set isn't co-located and thus
    it isn't known.

    It is the quintessential self-licking ice cream cone.

    In fact, the "green is most important" argument only seals Bayer's fate, even
    if you accept it wholesale: Foveon embeds 2X as much green as Bayer in
    every pixel, as they so love to point out.
     
    George Preddy, Nov 14, 2003
  16. So post a 6MP image from his RAW file, how hard is that? Just do it.
     
    George Preddy, Nov 14, 2003
  17. JPS Guest

    In message <bp239o$9j9$>,
    No, everyone here always knew that you could create a 100mp image from
    any other image.
    Standard Bayer de-mosaicing does not interpolate resolution. It
    interpolates full RGB data to satisfy the format of graphics programs
    and displays. It outputs the same resolution as the sensor has unique
    pixel locations. Fuji super CCD interpolates extra pixels to become a
    grid.
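    That point can be illustrated with a toy bilinear demosaic (a rough sketch only; real raw converters use far more sophisticated algorithms, and the RGGB layout and plain neighbour-averaging here are simplifying assumptions). The output has exactly as many pixel locations as the sensor has photosites; demosaicing only fills in the two missing colour channels at each site.

```python
import numpy as np

def demosaic_bilinear(raw):
    """Toy demosaic of an RGGB mosaic: output grid == sensor grid."""
    h, w = raw.shape
    planes = np.zeros((3, h, w))
    known = np.zeros((3, h, w))
    # Scatter the measured samples into their colour planes.
    planes[0, 0::2, 0::2] = raw[0::2, 0::2]; known[0, 0::2, 0::2] = 1  # R
    planes[1, 0::2, 1::2] = raw[0::2, 1::2]; known[1, 0::2, 1::2] = 1  # G
    planes[1, 1::2, 0::2] = raw[1::2, 0::2]; known[1, 1::2, 0::2] = 1  # G
    planes[2, 1::2, 1::2] = raw[1::2, 1::2]; known[2, 1::2, 1::2] = 1  # B
    out = np.empty((h, w, 3))
    for c in range(3):
        # Fill each missing sample with the average of the known samples
        # in its 3x3 neighbourhood (edges wrap via np.roll; good enough
        # for a sketch).
        num = sum(np.roll(planes[c], (dy, dx), axis=(0, 1))
                  for dy in (-1, 0, 1) for dx in (-1, 0, 1))
        den = sum(np.roll(known[c], (dy, dx), axis=(0, 1))
                  for dy in (-1, 0, 1) for dx in (-1, 0, 1))
        out[:, :, c] = np.where(known[c] > 0, planes[c], num / den)
    return out

mosaic = np.arange(16.0).reshape(4, 4)
rgb = demosaic_bilinear(mosaic)
print(rgb.shape)  # (4, 4, 3): same pixel grid as the 4x4 sensor
```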
    Wrong. The Bayer's pixels have a basis in optical reality. The 13.7mp
    SD9 files do not.

    You can represent everything that the SD9 captures with 3.43mp. You can
    *NOT* represent everything a 6.3mp bayer image captures with anything
    less than 6.3mp. There is a *big* difference in resolution when you
    output 1.58mp and 6.3mp images from a 6.3mp bayer sensor.
    It would be a miracle, if you ever got it, or if you ever became honest,
    whichever would be the case.
    --
     
    JPS, Nov 15, 2003
  18. Colors?
     
    Roland Karlsson, Nov 15, 2003
  19. Which is why it adds nothing. I suppose you don't even need optics to see
    if you have enough digital interpolation, right?
    Neither interpolated Bayer nor interpolated Foveon resolutions are useful
    for describing optical resolution, as I've said all along. Glad you finally
    agree.
     
    George Preddy, Nov 15, 2003
    Foveon interpolation is actually more accurate, not that an SD-9 user would
    care after outputting both. Foveon interpolates precisely between known
    colors with known luminance; Bayer has to guess between guesses, which is
    why there are any number of ways to do it and all produce widely varying
    results.
    Why not? 10.3M RGB sensors = 3.43M full-color pixels. Foveon doesn't
    recycle already-used data in multiple output pixels in various output
    locations; each pixel color is optically discrete.
    You'll have to do better. The fact is the 10D is sensor poor, all the
    mathematicians on God's green Earth can't fix that. Here are the facts:
    http://www.pbase.com/image/22273598 .

    The SD-9 has 230% of the Canon 10D's red sensor count.
    The SD-9 has 230% of the Canon 10D's blue sensor count.
    The SD-9 has 110% of the Canon 10D's green sensor count.
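    For what it's worth, the arithmetic behind those percentages can be redone from the published sensor dimensions, assuming a standard RGGB Bayer layout on the 10D (25% red, 50% green, 25% blue photosites). Against the 10D's actual 6.29M photosites the ratios come out closer to 218%/109%/218%; the rounder figures quoted above appear to use a rounded-down pixel count.

```python
# Assumption: standard RGGB Bayer layout on the 10D.
sd9_per_layer = 2268 * 1512   # sensors per colour layer on the SD9 (3.43M)
bayer_total = 3072 * 2048     # photosites on the 10D (6.29M)

red = bayer_total // 4        # 25% of the mosaic
green = bayer_total // 2      # 50% of the mosaic
blue = bayer_total // 4       # 25% of the mosaic

print(f"red:   {sd9_per_layer / red:.0%}")    # ~218%
print(f"green: {sd9_per_layer / green:.0%}")  # ~109%
print(f"blue:  {sd9_per_layer / blue:.0%}")   # ~218%
```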
     
    George Preddy, Nov 15, 2003
