BREAKING NEWS: The end of JPEG is in sight

Discussion in 'Photoshop Tutorials', started Sep 30, 2005.

  1. Lorem Ipsum Guest

    If you managed to find a general method for losslessly compressing
    And further, methods can be patented (if they meet the prerequisites of not
    being public earlier, etc. etc.) so look to the patent office. Regardless,
    patents do not require that the method be proven to be better, just unique.
    I can check that out from work later.

    The so-called white paper is only a topical overview, not adequate to tell
    what the author is really doing. The references to symbolic representations
    look like nothing more than adaptive compression schemes within the same
    file. Nothing new there for the research community.

    I look forward to authoritative reviews in the journals.

    Now it's time to go to the day job with its T2 line, 100 Mbps desktop
    machines, fiber-optic backbones and one fast courier who can carry a few
    terabytes of images in his arms up the elevator faster than God.
     
    Lorem Ipsum, Sep 30, 2005
    #21

  2. Absolutely correct!
     
    Thomas T. Veldhouse, Sep 30, 2005
    #22

  3. At work, I can get 100 Mbps when I need it, and the backbone tends to be
    fast enough. If that sort of traffic is a problem, then you have a
    completely obsolete backbone.

    (At home, I have about 3 Mbps down (and the ISP's network has enough
    capacity), so a single 10 MByte image takes about 30 seconds.)
    There is not going to be any lossless compression that works better than
    JPEG (without using domain-specific knowledge). If you have images that are
    suitable for lossless compression, compress them with PNG. For grayscale
    images, compressing a TIFF with bzip2 may also work.
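
    For illustration, here is a minimal Python sketch of that kind of size
    comparison, assuming Pillow is installed; the file names are placeholders:

        # Compare lossless options: PNG vs. an uncompressed TIFF that is then
        # bzip2-compressed (the latter mainly pays off for grayscale images).
        import bz2
        import os

        from PIL import Image

        src = "scan.tif"                      # placeholder input file
        img = Image.open(src)

        img.save("scan.png", format="PNG", optimize=True)          # lossless PNG

        img.save("scan_raw.tif", format="TIFF", compression=None)  # uncompressed
        with open("scan_raw.tif", "rb") as f:
            raw = f.read()
        with open("scan_raw.tif.bz2", "wb") as f:
            f.write(bz2.compress(raw))

        for name in ("scan.png", "scan_raw.tif.bz2"):
            print(name, os.path.getsize(name), "bytes")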
     
    Philip Homburg, Sep 30, 2005
    #23
  4. Lorem Ipsum Guest

    Will you people who reply with lame one-liners please SNIP THE ARTICLE?
     
    Lorem Ipsum, Sep 30, 2005
    #24
  5. That's being kind. It looks like snake oil to me. For example, the
    references to the OSI model say to me that the author neither understands
    the OSI model nor wants to.

    There was a similar case of somebody making ludicrous compression claims
    in the Netherlands a year or so back. I don't think anyone managed to
    discover whether the author was a con-man or merely deluded, but he
    certainly didn't have anything workable. (He is now dead, so I guess
    we'll never know.) It seems to be this decade's perpetual motion
    machine.
     
    Stephen Poley, Sep 30, 2005
    #25
  6. Matt Ion Guest

    That's a perfect example: even the wonderful free IrfanView has only very
    limited support for JPEG2000 because the plugin must be paid for. 99.9% of
    users have no need for the format's extra features and capabilities that
    would make paying for the support worthwhile, especially when regular JPG
    is more than sufficient.


     
    Matt Ion, Sep 30, 2005
    #26
  7. toby Guest

    toby, Sep 30, 2005
    #27
  8. toby Guest

    toby, Sep 30, 2005
    #28
  9. BTW, how can anything be more than 100% smaller? 100% smaller = 0.

    Gerrit
     
    Gerrit 't Hart, Sep 30, 2005
    #29
  10. DNG uses lossless JPEG compression. That indicates it can't compress
    nearly as much as lossy JPEG compression does, so it is unlikely to be able
    to compete with this new format, assuming the statement about it is
    accurate. But I am sceptical about whether this is really lossless
    compression.
     
    Barry Pearson, Sep 30, 2005
    #30
  11. Err, sort of; it gives the most efficient result only if you are
    constrained to map each input symbol to a fixed output bit pattern.
    That's hardly a universal constraint.
    This, however, is true.
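
    A concrete (made-up) illustration of that constraint, sketched in Python:

        # A per-symbol code such as Huffman must spend at least one whole bit
        # per symbol, while the source entropy (which coders without that
        # constraint, e.g. arithmetic coding, can approach) may be far lower.
        from math import log2

        probs = {"a": 0.95, "b": 0.05}     # made-up, heavily skewed source

        entropy = -sum(p * log2(p) for p in probs.values())
        per_symbol = 1.0                   # two symbols: codes "0" and "1"

        print(f"entropy        : {entropy:.3f} bits/symbol")   # about 0.286
        print(f"per-symbol code: {per_symbol:.3f} bits/symbol")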
     
    Richard Kettlewell, Sep 30, 2005
    #31
  12. Which means you get a negative image. Quite common in photography. :cool:
     
    Roger Whitehead, Sep 30, 2005
    #32
  13. The port coming out of your cable modem is 10 Mbps, but the cable's
    maximum useful bandwidth (and the modem's maximum capability) is a
    fraction of that. And you do have to share it with your neighbours
    because there's only one wire.

    Any hospital installing bargain-basement equipment today would get at least
    100 Mbps hardware, and switches rather than hubs. That can actually sustain
    at least 50 Mbps of data transfer, and many transfers can be in progress at
    the same time thanks to the switches, as long as they use different paths.
    10 MB images are not a problem.
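
    A rough, illustrative calculation with the rates mentioned above:

        # Back-of-the-envelope transfer times for a single 10 MB image.
        IMAGE_BITS = 10 * 1_000_000 * 8    # 10 MB expressed in bits

        rates = [("3 Mbps home link", 3),
                 ("50 Mbps sustained on a switched 100 Mbps LAN", 50)]
        for label, mbps in rates:
            seconds = IMAGE_BITS / (mbps * 1_000_000)
            print(f"{label}: about {seconds:.0f} s")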

    Dave
     
    Dave Martindale, Sep 30, 2005
    #33
  14. You beat me to it. This is sheer nonsense, along the same lines as
    journalists writing "three times smaller" when they mean (I think)
    "one-third as big", or "300% bigger" when there's only a 200% increase.
    And another thing... (rant, mutter, mumble)
     
    Peter Twydell, Sep 30, 2005
    #34
  15. Ken Weitzel Guest


    I'm giving 110% effort here, but still confused :)

    Ken
     
    Ken Weitzel, Sep 30, 2005
    #35
  16. PcB Guest

    <<The data-compression algorithm he invented shrinks images into a format
    called a MatrixView Universal, or MVU, which is 15 to 300 percent smaller
    than a JPEG.>>
    Er, how can you make something 300% smaller? 100% is all of it ....
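
    If the intended claim is the reverse, i.e. that the JPEG is 15 to 300
    percent bigger than the MVU file, the arithmetic does work out. A made-up
    example in Python:

        # "300 % smaller" is meaningless, but "the JPEG is 300 % bigger than
        # the MVU file" would make the MVU a quarter of the JPEG's size.
        jpeg_size = 4_000_000                # hypothetical 4 MB JPEG
        mvu_size = jpeg_size / (1 + 3.00)    # JPEG is 300 % bigger than MVU

        print(mvu_size)                                   # 1000000.0 bytes
        print(f"{1 - mvu_size / jpeg_size:.0%} smaller")  # 75% smaller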

    --
    Paul ============}
    o o

    // Live fast, die old //
    PaulsPages are at http://homepage.ntlworld.com/pcbradley/
     
    PcB, Sep 30, 2005
    #36
  17. toby Guest

    Huffman is not particularly effective except for bilevel (1-bit) images.
    The LZW family of algorithms in particular performs better in general. TIFF
    uses LZW, ZIP and varieties of RLE (such as Apple PackBits), in addition to
    the CCITT Huffman-based methods defined for faxes.

    References:
    TIFF standard,
    http://www.digitalpreservation.gov/formats/fdd/fdd000022.shtml
    LZW Explained, http://www.danbbs.dk/~dino/whirlgif/lzw.html
    Intro to Data Compression,
    http://www.faqs.org/faqs/compression-faq/part2/section-1.html
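
    A quick way to try those TIFF codecs side by side, sketched in Python with
    Pillow (the input file name is a placeholder):

        # Save the same image with several of the TIFF codecs mentioned above
        # and compare sizes.  Group 4 (CCITT) only applies to bilevel images.
        import os

        from PIL import Image

        img = Image.open("input.tif")        # placeholder file name
        codecs = ["tiff_lzw", "tiff_adobe_deflate", "packbits"]  # LZW, ZIP, RLE
        if img.mode == "1":
            codecs.append("group4")

        for codec in codecs:
            out = f"out_{codec}.tif"
            img.save(out, format="TIFF", compression=codec)
            print(f"{codec:20s} {os.path.getsize(out):>10,d} bytes")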
     
    toby, Sep 30, 2005
    #37
  18. eawckyegcy Guest

    Because there have been a large number of "data compression" claims that
    were later shown to be bullshit (or, usually, the claimant simply failed to
    back up his claim). They are the perpetual motion machines of computation.
    More likely is that you are just ignorant of the history of these things.
     
    eawckyegcy, Sep 30, 2005
    #38
  19. Gormless Guest

    If 10 megabytes a minute can cripple a hospital network, then I don't think
    much of their networks.
    And since when was a 10 MB image 'enormous'?
     
    Gormless, Sep 30, 2005
    #39
  20. Rich Guest

    B.S.
     
    Rich, Oct 1, 2005
    #40