JPEG 9: new lossless JPEG standard

Discussion in 'Digital Cameras' started by Alfred Molon, Jan 22, 2013.

  1. Alfred Molon

    Alfred Molon Guest

    There is a new lossless JPEG standard released just now:
    http://www.infai.org/jpeg

    Does anybody know more (performance, compression etc.)?
     
    Alfred Molon, Jan 22, 2013
    #1

  2. Alfred Molon

    Alfred Molon Guest

    No idea. What compression ratio did you get with lossless JPEG 2000?
     
    Alfred Molon, Jan 22, 2013
    #2

  3. nick c

    nick c Guest

    nick c, Jan 22, 2013
    #3
  4. There were questions about whether patents covered some of the details,
    and as a result it was not widely implemented and never became common.
     
    David Dyer-Bennet, Jan 23, 2013
    #4
  5. Martin Brown

    Martin Brown Guest

    Performance will be better than the original lossless JPEG (which
    was pretty terrible and in practice almost never used outside a
    handful of niche markets). PsPro8 will write a lossless JPEG file
    and give it the .JPG extension, thus crashing almost every other
    JPEG decoder.

    JPEG came to mean lossy JPEG in common usage, and I hope the IJG
    are giving their new format a distinctive extension like .JPGL to
    avoid crashing codecs that can't make sense of the stream, a la
    PsPro8.

    The new standard probably gives compression for 24-bit RGB images
    broadly comparable with, or if they have done it right slightly
    better than, PNG. I have not had time to play with the new release
    yet.
     
    Martin Brown, Jan 23, 2013
    #5
  6. Joe Kotroczo

    Joe Kotroczo Guest

    JPEG2000 is used in the digital cinema DCP format.
     
    Joe Kotroczo, Jan 23, 2013
    #6
  7. Joe Kotroczo

    Joe Kotroczo Guest

    Any application that crashes because a file's content doesn't
    match what its filename extension led it to expect is broken.

    As is an operating system that relies solely on filename
    extensions to figure out the file type.
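
    For what it's worth, a robust decoder sniffs the stream's magic
    bytes instead of trusting the extension. A minimal Python sketch
    (the signatures below are the published ones for each format):

        def sniff_image_type(path):
            # Read enough bytes to cover the longest signature (JP2, 12 bytes).
            with open(path, "rb") as f:
                header = f.read(12)
            if header.startswith(b"\xff\xd8\xff"):
                return "jpeg"      # JPEG SOI marker
            if header.startswith(b"\x89PNG\r\n\x1a\n"):
                return "png"       # PNG eight-byte signature
            if header == b"\x00\x00\x00\x0cjP  \r\n\x87\n":
                return "jp2"       # JPEG 2000 signature box
            return "unknown"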
     
    Joe Kotroczo, Jan 23, 2013
    #7
  8. Martin Brown

    Martin Brown Guest

    I agree, but Microsoft's codecs and the commonly used IJG codec
    both baulk at lossless JPEG streams with a .JPG extension.
    Malformed JPG files have been used as a vector for hostile
    executable code in the past. It confused end users no end, since
    they had .JPG files that the decoder refused to decode and that in
    some cases actually crashed older codecs.

    Relevant prior art is the JPEG-LS scheme from HP, called LOCO:
    http://www.hpl.hp.com/loco/

    Full paper at http://www.hpl.hp.com/loco/HPL-98-193R1.pdf
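
    For reference, the predictor at the heart of LOCO / JPEG-LS is the
    median edge detector (MED); a minimal sketch, following the HP
    paper above (the prediction residuals are then entropy-coded):

        def med_predict(a, b, c):
            # a = left neighbour, b = above, c = above-left of the pixel
            if c >= max(a, b):
                return min(a, b)   # edge above or left: clamp low
            if c <= min(a, b):
                return max(a, b)   # edge the other way: clamp high
            return a + b - c       # smooth region: planar prediction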

    I hope JPEG 9 shows how it compares on the same test data. These
    schemes are used for some scientific image telemetry:

    http://www.hpl.hp.com/news/2004/jan-mar/hp_mars.html
     
    Martin Brown, Jan 23, 2013
    #8
  9. bugbear

    bugbear Guest

    bugbear, Jan 23, 2013
    #9
  10. Alan Browne

    Alan Browne Guest

    Frankly, I don't care if JPG is slightly lossy. If it's important
    I have it in another lossless (and higher-DR) format (TIFF, raw,
    DNG, ...).

    Hopefully a tool to turn JPEG 9 lossless files into ordinary lossy
    JPG files will soon emerge. A simple command-line tool would be
    fine...
     
    Alan Browne, Jan 24, 2013
    #10
  11. Most lossless compression algorithms only work on data words of
    about 8 bits. That's why high-efficiency lossless compression of
    high-dynamic-range images is uncommon: a lot of work has to be
    done to convert the 16-, 24-, or 32-bit data into fewer bits in a
    way that enhances compression rather than hinders it.
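
    A rough sketch of the kind of work meant here: split 16-bit
    samples into high and low byte planes before deflating. The data
    below is a synthetic smooth random walk, purely illustrative;
    exact numbers will vary, but the split tends to help because the
    high bytes form long repetitive runs.

        import random
        import zlib

        random.seed(1)
        samples, v = [], 30000
        for _ in range(1 << 16):
            # Smooth 16-bit signal: a bounded random walk.
            v = max(0, min(65535, v + random.randint(-40, 40)))
            samples.append(v)

        interleaved = b"".join(s.to_bytes(2, "little") for s in samples)
        hi_plane = bytes(s >> 8 for s in samples)
        lo_plane = bytes(s & 0xFF for s in samples)

        print("interleaved bytes:", len(zlib.compress(interleaved, 9)))
        print("split byte planes:",
              len(zlib.compress(hi_plane, 9)) + len(zlib.compress(lo_plane, 9)))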
     
    Kevin McMurtrie, Jan 26, 2013
    #11
  12. Logic failure. UCS16 can easily be losslessly compressed by common
    lossless compressors, for example. Or one simply uses a
    compression algorithm that has no problem with words larger than
    an octet. (The "typical" image is 24-bit, by the way, if it has
    colour.) Not to mention that a compressor would only care about
    how large a data word is if it needed to understand the data.

    -Wolfgang
     
    Wolfgang Weisselberg, Jan 27, 2013
    #12
  13. Martin Brown

    Martin Brown Guest

    The JPEG 9 codec still includes all the original JPEG standard
    stuff *and* in addition a new lossless encoder and a colourspace
    it calls RGB1 that allows better lossless compression of RGB
    images. I haven't tried it out yet. Roundtuit problem.

    That is somewhat misleading. The original draft lossy JPEG
    standard provided for images using 8-bit or 12-bit input data, and
    the IJG codec can be compiled for the latter case. 12-bit lossy
    JPEG is seldom seen.

    The original lossless JPEG standard also allowed for lossless
    encoding of data of any bit length from 2 through 16 bits. The
    problem was that there were already other lossless encoders about
    that were as good or better, whereas the lossy high-compression
    JPEG encoding was new and extremely useful, with almost no
    perceptual loss and *much* smaller file sizes.

    The IJG making a free implementation publicly available made it
    the de facto standard for (pre-)web images in the days when a
    really fast dialup modem could manage up to 2 kB/s on a good day
    with a trailing wind.

    Variations on the theme of JPEG-LS, extended by HP and other
    researchers, are used for lossless compression of image data from
    space probes and for archiving digital X-rays, but are seldom
    (never?) seen in consumer kit.

    The problem for lossless algorithms in general is that they spend
    an inordinate amount of their space and time budget faithfully
    preserving exactly the thermal noise of the imaging system.
     
    Martin Brown, Jan 28, 2013
    #13
  14. Alfred Molon

    Alfred Molon Guest

    Will JPEG still be widespread in 50 years?
     
    Alfred Molon, Jan 28, 2013
    #14
  15. Savageduck

    Savageduck Guest

    Maybe. However, there is one thing I am certain of: the only way I
    will be widespread in 50 years is if somebody takes the time to
    empty the Folgers can into the wind.
     
    Savageduck, Jan 28, 2013
    #15
  16. Martin Brown

    Martin Brown Guest

    Impossible to say, but there is no reason why it should not survive.

    The IJG JPEG codec is freely available in source code form, and
    the nasty blocking patents on things that would improve it will
    time out.

    Wavelets could in principle do slightly better in terms of higher
    fidelity at a smaller size, but JPEG is basically good enough for
    all consumer-grade imaging. The extent to which JPEG has spread
    across the web pretty much ensures that there will be decoders for
    the foreseeable future - although 50 years is perhaps a bit of a
    stretch.

    My instinct is that if J2k was going to take off on the web it
    would have done so by now. If you look back in time, I was an
    advocate for it:

    http://www.nezumi.demon.co.uk/photo/j2k/j2k_v_jpeg.htm

    J2k works significantly better than JPEG at the highest quality
    settings, but the gains were not sufficient to overcome the
    inertia and the various patent litigation barriers. Various image
    apps do ship with a J2k codec today.
     
    Martin Brown, Jan 28, 2013
    #16
  17. The final compression stage is usually something simple like
    deflate. It very much cares what the bytes look like. If you were
    to dump an interleaved RGB image with 16 bits per channel to a
    file and compress it with bzip2 or gzip, you'd find that not much
    happens. It might even get a few bytes larger. PNG puts a simple
    predictive filter in front of deflate so that typical images
    produce simpler patterns at the byte level. Lossless JPEG attempts
    to create simple patterns representing error correction for the
    lossy conversion.

    Deflate and bzip2 only work well at reducing 8-bit words. Getting
    a 16- or 24-bit-per-channel image down to clean patterns of 8 bits
    takes some work that's very specific to image processing.
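
    The PNG idea can be sketched in a few lines: run a trivial
    predictor over a row so deflate sees small residuals instead of
    raw values. This is PNG's "Sub" filter (predict each byte from its
    left neighbour); real PNG picks one of five filters per row. The
    noisy gradient row is made up for illustration.

        import random
        import zlib

        def sub_filter(row):
            # Replace each byte with its difference from the left neighbour.
            return bytes((row[i] - (row[i - 1] if i else 0)) & 0xFF
                         for i in range(len(row)))

        random.seed(0)
        row = bytes(min(255, i // 16 + random.randint(0, 3))
                    for i in range(4096))
        print("raw deflate:     ", len(zlib.compress(row, 9)))
        print("filtered deflate:", len(zlib.compress(sub_filter(row), 9)))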
     
    Kevin McMurtrie, Jan 29, 2013
    #17
  18. Deflate doesn't care at all. Only the achievable compression rate
    may be suboptimal.

    So we are now at "only work well" from your previous "only work".
    That's at least the right direction.

    For example, deflate (the algorithm) is nowhere dependent on
    "8 bit words". The symbols can be of arbitrary size.

    The same is true for the bzip2 algorithm.

    What work would that be?
    Tell the guy who's pointing a pistol at you, forcing you to filter
    posts from Google, to go away. Then it will be *your* choice
    whether you want to filter posts from Google.

    -Wolfgang
     
    Wolfgang Weisselberg, Feb 4, 2013
    #18
  19. OK, you should call up the GIF, PNG, JPEG, MPEG, and FLAC folks to tell
    them of your brilliant discovery. I bet they'll feel silly for all the
    work they did coming up with algorithms to prepare data for compression.
    Thanks for saving the Internet.

    Refresh your meds.
     
    Kevin McMurtrie, Feb 5, 2013
    #19
  20. Martin Brown

    Martin Brown Guest

    It cares about the patterns in the source data if you want to
    actually get useful compression. If you don't mind the file
    getting bigger, the algorithm works, but it does not compress the
    image.

    Some patterns necessarily grow in size when compressed with a
    given lossless compression algorithm, and that problem is more
    likely to be encountered if you feed RGB interleaved image data,
    or YCC for that matter, into a simple lossless encoder.

    Shuffling the data to RRRRR...GGGGG...BBBBB, either physically or
    in effect by altering the indexing, and then using a simple
    forward predictor is the most common solution for lossless these
    days.
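
    A minimal sketch of that shuffle-then-predict scheme, on a made-up
    smooth RGB ramp with a little noise (illustrative only):

        import random
        import zlib

        random.seed(2)
        N = 4096

        def channel(base):
            # Smooth ramp per channel plus small noise.
            return [min(255, max(0, base + i // 32 + random.randint(-2, 2)))
                    for i in range(N)]

        def delta(plane):
            # Simple forward predictor: difference from previous sample.
            return bytes((plane[i] - (plane[i - 1] if i else 0)) & 0xFF
                         for i in range(len(plane)))

        r, g, b = channel(10), channel(60), channel(120)
        interleaved = bytes(v for px in zip(r, g, b) for v in px)
        planar = delta(r) + delta(g) + delta(b)

        print("interleaved: ", len(zlib.compress(interleaved, 9)))
        print("planar+delta:", len(zlib.compress(planar, 9)))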

    Your claim that the "compression rate may be suboptimal" needs
    replacing with the statement that in the worst case the file size
    will grow. 24-bit colour interleaved noisy photographic image data
    is not well suited to the sorts of lossless compression that were
    originally heavily optimised for text and binary executables, both
    cases where lossless is absolutely essential. In many photographic
    images you can afford to drop the least significant bit or
    quantise JPEG coefficients to gain extra compression and still
    have adequate signal to noise. (This would be disastrous for
    executable binaries.)

    It isn't a bad idea unless you are interested in forged copy
    watches and other fashion dross. Google injects far too much spam
    into Usenet.
     
    Martin Brown, Feb 5, 2013
    #20
