Discussion in 'Photography' started by GassMan, Sep 29, 2007.

  1. OG Guest

    Hah! 'Loss of information' can have unexpected effects.
    I recently had some screenshots that I needed to save. Saving as jpegs gave
    a file size of 250KB, and when I looked at them the reason was the copious
    'artefacts' that had been introduced during compression.
    I then repeated the screenshot and saved it as a lossless PNG file - file
    size about 87KB.

    Yes, the jpegs had lots of information, but not all of it had been there when
    I started.

    Given that Ken saved all his old document files as .txt, I think we can
    assume that he does not treat all 'information' as being of equal value.
    OG, Dec 22, 2007
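The screenshot result makes sense: screenshots are mostly flat colour, which the deflate compression inside PNG squeezes very well, while JPEG spends its bits on (and mangles) the sharp edges. A rough Python sketch of the effect, using synthetic made-up data (not the actual screenshots) and raw zlib as a stand-in for PNG's deflate stage:

```python
import zlib
import random

random.seed(0)
w, h = 640, 480

# "Screenshot-like" data: two big flat regions, like a window over a desktop.
screenshot = bytes([200] * (w * h // 2) + [30] * (w * h // 2))

# "Photo-like" data: per-pixel noise, which lossless coders can't squeeze.
photo = bytes(random.randrange(256) for _ in range(w * h))

sizes = {name: len(zlib.compress(data, 9))
         for name, data in [("screenshot", screenshot), ("photo", photo)]}
print(sizes)  # the flat "screenshot" shrinks dramatically; the "photo" barely
```

Which of the two wins, lossless or lossy, depends entirely on the content; screenshots are close to the lossless best case.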

  2. OG Guest

    You seem to have the idea that it's OK to let the 'RAW converter' decide
    when the result is 'right' - which is what happens in the camera.
    Were you to save the RAW on the camera, you'd know that the perfect 'RAW
    converter' is RAW software plus a suitably sensitive person.

    I've just realised that the above may imply that you are worse than your
    camera's software at converting RAW into jpegs. Not a deliberate affront.
    OG, Dec 22, 2007

  3. Actually that statement is not wrong either. It happens
    that companies like Nikon go to extremes to determine
    what would be more acceptable to more people, and in
    fact the same model cameras may have slight differences
    in the raw conversion parameters when manufactured for
    different regions. For example, a Nikon camera sold in
    Japan may produce slightly different results for the exact
    same settings as a camera sold in Europe, and that may be
    different again from one sold in the US.

    While Joel may or may not be a "suitably sensitive
    person", it isn't an affront to suggest that Nikon is
    assuming, correctly, that *most* customers in fact are
    not nearly as capable as the Nikon people are at making
    such decisions!
    Floyd L. Davidson, Dec 22, 2007
  4. Joel Guest

    Yes, just from the magic number 250kb we can see a magical loss,
    because we are talking about several megabytes, not a few hundred KBs.
    Joel, Dec 22, 2007
  5. Joel Guest

    I don't trust any software but my own eyes and judgement; I don't have time
    left to believe the fairy tale <bg>
    Joel, Dec 22, 2007
  6. Scott W Guest

    You don't seem to have time to learn about raw either, you might find
    it time well spent.

    Scott W, Dec 22, 2007
  7. Joel Guest

    No I don't! The RAW world is too small for both of us, so I'll let you
    enjoy the whole world of RAW <bg>
    Joel, Dec 22, 2007
  8. Not4wood Guest

    Actually Joel:

    Look at it this way. JPG at fine quality and large size records what the
    sensor on your camera sees at 8 bits per channel. Shooting in Raw,
    depending on your camera (I'm just joining this thread and missed what you
    are using), you get between 12 bits and 16 bits per pixel. This means,
    first, that the JPG compression is cutting off some detail, and you're
    losing it before you even start to process. Next, when you process a JPG
    for cropping or some slight editing, you aren't saving any steps over
    doing the same thing in Raw.

    Now, take this 8 bits for argument's sake. Compare an 8bit JPG to an
    exact same well-exposed shot in 12bit Raw. They both look the same at
    first glance. But if you have very dark areas of shadow and need to bring
    out the details, you will find them in Raw but not in JPG, and the same
    goes for the very extreme highlights. The detail is there in Raw but,
    alas, not in JPG.

    I have just started to shoot in Raw and am still in the testing stages. I
    have only shot Raw in one session, when I was out looking for Fall shots
    on the morning of Thanksgiving. Instead of making up your mind with a
    prejudice against Raw, just make a few tests, keeping an open mind, and
    see for yourself. I am not asking you to do this if you're on assignment
    or working in any way for a customer, but just doing it for your own
    artwork and walking around. Like everything else your camera offers, like
    all of your camera gear, it's another tool in your camera bag to be used
    when and if you need it. Why would you not use something if it can give
    you any help in trying to get a great exposure, a great quality piece of
    Art? I'm not saying go out and marry it; just take it on a date and see
    what she might offer.

    Not4wood, Dec 23, 2007
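The shadow-detail point can be sketched numerically: quantise the same dim gradient at 8-bit and 12-bit depth, lift the shadows as an editor might, and count how many distinct tones survive. A rough Python sketch with an idealised linear sensor and made-up numbers, not any particular camera:

```python
# Bit depths of the two captures (maximum code values).
levels_12, levels_8 = 4095, 255

# A dim gradient occupying only the bottom 1/16 of the sensor's range.
scene = [i / 1024 / 16 for i in range(1024)]  # linear values in [0, 1/16)

def capture(scene, levels):
    """Quantise linear scene values onto an integer code grid."""
    return [round(v * levels) for v in scene]

def lifted_tones(codes, levels, gain=16):
    """Lift the shadows by `gain` and return the distinct surviving tones."""
    return sorted(set(min(levels, c * gain) for c in codes))

shadow_steps_12 = len(lifted_tones(capture(scene, levels_12), levels_12))
shadow_steps_8 = len(lifted_tones(capture(scene, levels_8), levels_8))
print(shadow_steps_12, shadow_steps_8)  # many distinct tones vs very few
```

After the lift, the 8-bit capture has only a handful of distinct tones left in that region (visible posterisation), while the 12-bit capture still has hundreds.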
  9. There are 16bit sensors? That's 64bits overall, but most computer displays
    are only 32bit. That's a serious amount of information loss. I have a D80
    which is 12bit and tone mapping can be bad enough. Anyone think we may see
    48bit then 64bit graphics cards and displays for computers?
    No Person known, Dec 30, 2007
  10. Scott W Guest

    Well first off 16 bits * 3 comes out to 48 bits not 64.

     I know of 14 bit per color cameras; I don't know of 16 bit ones, but I
     would not argue that there are none.

    The thing to remember is that the computer display is very non-linear
    whereas the camera is very linear, so you need a lot of linear bits to
    make use of 8bits/color, just how many depends on the color space you
    are working in.

     And then there is the fact that many people will selectively lighten
     areas of a photo; this takes even more dynamic range, and thus even
     more bits.

    Scott W, Dec 30, 2007
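The linear-versus-non-linear point can be made concrete: under a display gamma of about 2.2, the step between the darkest 8-bit display codes is tiny in linear terms, so a linear capture needs many more bits to resolve it. A rough Python sketch, using a plain power-law gamma of 2.2 as a stand-in for a real display curve rather than any particular colour space:

```python
import math

# 8-bit display codes decode to linear light roughly as (code/255)^gamma.
gamma = 2.2
linear = [(code / 255) ** gamma for code in range(256)]

# The smallest linear step between adjacent codes sits at the dark end.
smallest_gap = min(b - a for a, b in zip(linear, linear[1:]))

# Bits a linear encoding would need to resolve that step across [0, 1].
bits_needed = math.ceil(math.log2(1 / smallest_gap))
print(bits_needed)
```

Under this toy model the answer comes out well above 8 bits, which is why linear sensor data carries 12 or more bits per site even though the display only needs 8 per channel.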
  11. Digital cameras use 4 colour channels, not 3. The sensor rows are read
     as R-G and G-B, so green tends to get doubled up for better results.
    No Person known, Jan 16, 2008
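That read-out pattern is the Bayer mosaic: a 2x2 repeating tile in which half the photosites are green. A small Python sketch of the layout, using RGGB ordering as a typical example (the actual ordering varies by camera):

```python
# RGGB Bayer tile: even rows alternate R,G; odd rows alternate G,B.
def filter_at(x, y):
    return ["RG", "GB"][y % 2][x % 2]

# Lay out a small 4x4 patch of the mosaic.
mosaic = [[filter_at(x, y) for x in range(4)] for y in range(4)]
for row in mosaic:
    print("".join(row))

counts = {c: sum(row.count(c) for row in mosaic) for c in "RGB"}
print(counts)  # green sites outnumber red and blue two to one
```

The doubled green sampling matches the eye's greater sensitivity to green, which is where most perceived luminance detail lives.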
  12. That is *not* 14 bits per color though! It's 14 bits
    per pixel/sensor location.

    The information from *nine* such pixel locations is used
    to determine the actual color of a location, hence it is
    entirely possible to generate 16 bit per color RGB image
    files from even a 12 bit RAW file (and all of those 16
    bits will be useful data).

    Keep in mind also that while the computer may well use
    24 bits per pixel in the graphics device, the monitor
    you display it on almost certainly uses either 6 bits or
    8 bits per RGB channel for display.
    But each of those pixel/sensor locations is just one
    color, and the bit depth of the RAW file is for each
    sensor, as explained above.
    Floyd L. Davidson, Jan 16, 2008
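The claim that a 16-bit-per-channel file can hold useful data from a 12-bit RAW follows from demosaicing combining several sensor sites: the combined value can land between two 12-bit codes. A tiny Python illustration with made-up sample values, using simple averaging as a stand-in for a real demosaic algorithm:

```python
# Nine neighbouring 12-bit sensor codes feeding one output pixel.
samples = [2000] * 8 + [2001]

avg = sum(samples) / len(samples)  # 2000.111..., between two 12-bit codes

# Scale the 0..4095 range up by 16x toward 16 bits: the fraction survives
# as a distinct code instead of being rounded back onto the 12-bit grid.
scaled_16bit = round(avg * 16)
on_12bit_grid = round(avg) * 16
print(scaled_16bit, on_12bit_grid)  # the two codes differ
```

Rounding first (staying on the 12-bit grid) throws the fractional information away; scaling first keeps it, which is the sense in which the extra output bits are "useful data".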
