Best scanning manager program?

Discussion in 'Scanners' started by T. Wise, Sep 6, 2005.

  1. T. Wise

    Father Kodak Guest

    Need, no. Nice to have, YES!
    There is always a bottleneck somewhere. Fix one bottleneck, and then
    another one "appears." That's life.
    Sure. But that suggests that with dual cores or processors, overall
    system utilization is higher, and therefore you're getting more done,
    faster.


    I don't doubt it. After the first thousand or so, I may get smarter
    about my workflow.
    Please note. I keep some backups for months at a minimum. That's a
    lot of backup drives. And if I do ever digitize all those slides and
    negs, plus shoot with digital cameras, I'll need terabytes online.
    Do I really want to have backup hard drives for terabytes? In
    several backup sets? I would fill up a closet.

    While hard drive backup may be OK for short-term transactional backup,
    it won't serve well for weeks or months or longer. Do you think any
    business that has to comply with Sarbanes-Oxley or HIPAA or
    Gramm-Leach-Bliley is doing all its backups to disk? I seriously
    doubt it.
    I'm aware of dual-layer DVDs. Yes, but still a lot less than my
    current network backup load, and I'm trying to get away from media
    switching.
     
    Father Kodak, Sep 19, 2005
    #41

  2. I also have 8x12 and 13x19 prints from 35mm using Vuescan. And the
    quality of these prints has been deemed very good by a panel of
    professional photographers. So the conclusion I gather is that
    quality can be obtained from Vuescan.

    For the record: I use both Vuescan and Nikonscan. I find each has
    strengths and weaknesses and some of my slides scan better with one
    and not the other.

    David
     
    David Blanchard, Sep 19, 2005
    #42

  3. No. Speedup is only partially dependent on the software's multiple-processor
    awareness. The times when operating systems ran one and only one task are
    long gone. These days you're going to run many programs (virus checker,
    IM, mp3 player, web browser and all that flash crap) at the same time,
    and while one processor is crunching on your 'main task' the other one
    will happily handle the rest (which would take time away from that one
    processor on a uniprocessor system). This speedup is not insignificant: I've
    yet to encounter a single-processor system that _feels_ smoother than even
    an old SMP machine.
     
    Pasi Savolainen, Sep 20, 2005
    #43
  4. T. Wise

    Father Kodak Guest

    Hecate,

    I see that you and your company go to a lot of toil and trouble in
    your backup approach. Perhaps someone's brain was bubbling over with
    ideas that have the effect of doubling the workload.

    For me, as a home network user, high quality tape has the advantages
    of simplicity and compactness.

    Father Kodak.
     
    Father Kodak, Sep 20, 2005
    #44

  5. I think it's inappropriate to mention the names of these photographers
    in this public newsgroup since they are not involved in this
    discussion (and probably never visit this newsgroup). Having said
    that, the panel includes photographers who routinely publish in
    Arizona Highways and National Geographic. Good enough panel for you?

    -db-
     
    David Blanchard, Sep 20, 2005
    #45
  6. I don't want to take sides about VueScan, as I haven't done testing to
    check whether what Don says, or whatever is relevant, is true of VueScan
    or not.

    But let me express some general opinions about "judging criteria". On
    this, I basically agree with Don: it's the technicalities that count.
    That the final result is pleasing, good or whatever you want to call it
    does not, IMHO, say anything about the qualities of the programs used --
    except that, of course, when the "technicalities" are very, very wrong,
    the final result is unlikely to be much good.
    Well, if it scans at 8-bit, that's an obvious limitation of FilmGet.
    VueScan scans at 16-bit so there will obviously (*) be more information
    in the raw scan, but I see that more as a matter of "input data quality"
    than "processing quality".

    (*) Of course it's not really obvious, as I have argued with Don a
    little about this, as you have read. But let's just assume that
    scanners, or at least decent film scanners, do have more than 8 bits per
    channel of meaningful information. Under this assumption, of course, a
    program like FilmGet that, as you say, only scans at 8 bpc will not even
    be considered by someone who wants to get full quality from the scanner.
    But then you shouldn't even care that FilmGet creates "gaps and spikes"
    in the histogram, while VueScan doesn't: it's still "just" the histogram
    that we're talking about.

    Anyway. Now, Don says that VueScan produces "smooth" histograms because
    it deliberately introduces noise.

    Let's assume for a moment that it does *not*, and that the smooth
    histogram is simply the result of smart processing that minimizes
    information loss: what would you prefer at this point, a program like
    this, or a program that creates "gaps and spikes"?

    And between the "gaps and spikes" creating program and another program
    that does not show gaps and spikes because it hides them in noise, what
    would you prefer?

    Me, I'd prefer the "smart processing" program over the "noise hidden
    gaps and spikes" program, and the "noise hidden gaps and spikes"
    programs over the "gaps and spikes" program.

    That's because the "smart processing" program does what I think it's
    supposed to do: is smart enough to discard the least possible amount of
    valid information.
    The "noise hidden gaps and spikes" is then definitely worse than it, but
    it's still better than the "gaps and spikes" program, because it tries,
    at least, to process the image so that the loss of information is as
    invisible as possible.

    All of this is *independent* of final image quality: perhaps the three
    programs would all give final images that, for me, are indistinguishable
    from one another.
    But this doesn't matter. Who knows? Someday I might want to crank the
    levels on those images, or heavily change the gamma, or whatever.
    Won't the three images differentiate then? Surely, after a point, they
    will; and at that point, having used the "best" program will pay.
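    (Just to make the "crank the levels" point concrete, here is a minimal
    C sketch; the 40..200 input range is an arbitrary assumption, not taken
    from any real scan. It shows how a levels stretch on 8-bit data
    necessarily leaves empty histogram bins, i.e. "gaps":

        /* Stretch the levels of an 8-bit channel and count the histogram
         * gaps that the stretch alone must create. */
        #include <stdio.h>

        int main(void)
        {
            int lo = 40, hi = 200;      /* hypothetical black/white points */
            int used[256] = {0};

            /* map every possible input value through the stretch */
            for (int v = lo; v <= hi; v++) {
                int out = (v - lo) * 255 / (hi - lo);  /* integer division truncates */
                used[out] = 1;
            }

            int gaps = 0;
            for (int i = 0; i < 256; i++)
                if (!used[i]) gaps++;

            /* 161 input levels spread over 256 output levels, so at
             * least 95 output bins must stay empty: those are the gaps. */
            printf("empty output bins (gaps): %d\n", gaps);
            return 0;
        }

    The more valid information the program preserved to begin with, the
    less such an edit hurts.)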
    A smooth histogram doesn't mean much unless you *know* its smoothness
    comes from valid information.
    For example, imagine a really bad scanner, which has a 16-bit A/D but is
    so bad that it only carries data in the four most significant bits, and
    all the rest is drowned in noise.
    A scan from it will show a smooth histogram. So...?

    Sure, in such a case, there probably *will* be "visual corruption upon
    inspection". But even in a case where there isn't visual corruption,
    can you say that it will remain that way upon playing with levels,
    applying sharpening, or doing whatever you might fancy doing in the future?
    "Desired"? Desired by whom? I have a thermometer right in front of me.
    It reads 22 C. Unless it's way off, which it isn't, it produces "the
    desired result"; I couldn't possibly want more from it.

    But, is this an argument against the production of high-precision
    temperature measuring devices?
    You know, what's "desired" may differ from person to person, and even
    change for the same person, depending on various factors.

    This doesn't prevent doing a decent, scientific, objective or
    near-objective analysis showing that the high-precision thermometer is
    definitely, unarguably better than the one I have.
    Then perhaps Don is wrong about VueScan. This, however, takes nothing
    away from the value of scientific testing.
    Oh well, it convinces *me*.
    Since the "subjective" judgement is subjective, I'd rather make the
    relative processing myself, i.e. in Photoshop or something.
    I'd like the scanner program to do the "objective" part for me, though:
    things like adjusting for film curves, setting exposure times, even
    sharpening (if the scanner's exact need for sharpening is known), in as
    mathematically exact a way as possible -- and losslessly.


    by LjL
     
    Lorenzo J. Lucchini, Sep 22, 2005
    #46
  7. You're right of course. But what I was thinking of is a program that
    makes the histogram "as smooth as possible", without using "tricks" like
    noise.

    For example, take a program that applies gamma 2.2 to a raw scan:
    certainly, the resulting histogram will be better if the program
    applies gamma 2.2 in one go than if it iterates the application of
    multiple, smaller gammas.

    OK, this is a stupid example, and no remotely sane program would apply
    gamma like that; but still (since there is not only gamma in the real
    world, and other things also need to be done to the image), a program
    that applies transformations "smartly" has an edge over one that
    applies them naively.

    Often, it's simply the *order* in which you apply the transformations
    that makes a difference!
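    (A toy illustration of this, if you'll forgive more C: apply gamma 2.2
    in one step, and as two smaller gamma steps whose product is 2.2, with
    8-bit rounding in between, as a naive program working on 8-bit buffers
    would have to do. The 1.1 x 2.0 split is just an arbitrary example:

        #include <stdio.h>
        #include <math.h>

        /* apply gamma g to an 8-bit value, rounding back to 8 bits */
        static int gamma8(int v, double g)
        {
            return (int)(pow(v / 255.0, 1.0 / g) * 255.0 + 0.5);
        }

        int main(void)
        {
            int diffs = 0;
            for (int v = 0; v < 256; v++) {
                int one_step  = gamma8(v, 2.2);
                int two_steps = gamma8(gamma8(v, 1.1), 2.0); /* 1.1 * 2.0 = 2.2 */
                if (one_step != two_steps)
                    diffs++;
            }
            printf("values where the two pipelines disagree: %d of 256\n",
                   diffs);
            return 0;
        }

    The intermediate rounding is exactly the kind of avoidable loss I mean.)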
    As I said previously, I don't quite agree on this, *as long as* the
    noise is *only* applied so that it fills in the gaps, *without* touching
    the existing data.

    That's because this noise addition is lossless, and the "original" image
    without noise can be precisely reconstructed, as long as one also has
    its histogram.

    It's not "pure data", that's for sure, but the "pure data" can be
    reconstructed (with the "save the histogram" caveat), and the result
    tends to be more pleasing to the eye, since it hides posterization.
    Indeed we may.
    And you're not wrong in saying that it is mathematically impossible;
    however, the mathematical impossibility falls away if we assume that the
    histogram of the original image is not discarded. If it's kept -- and
    the noise is applied in the correct way, i.e. *only* filling the gaps --
    there should be no loss of information at all.
    But a "smart" application of noise to smooth the histogram, like I
    described above, would be welcome IMHO.
    Of course, it would have to be done correctly, and should be a
    user-selectable option (tooltip: "Helps hide excessive posterization;
    a histogram taken before application must be kept for this operation to
    be reversible"); that goes without saying.
    by LjL
     
    Lorenzo J. Lucchini, Sep 22, 2005
    #47
  8. T. Wise

    Roger Guest

    No one knows for sure how long even the best optical disks will last.
    Accelerated life testing has been extrapolated to show several disks
    *should* have very long lives, but no one knows for sure. Some have
    turned out to be dismally shy of what had been predicted. For long
    life you go with the best you can find and treat them nice.

    Archived data can be rendered useless by a number of causes, of which
    the normal storage life of the disk is only one. Gold, as in the
    original Kodak disks, was listed as potentially lasting 100 years. It's
    far more likely the medium will become obsolete long before it
    actually fails.

    For me, CDs are already obsolete. I shoot about 80 gigs worth of
    digital images a year, or have been. I've also been scanning 35mm
    slides and negatives which generate 60 to 128 meg files for each image
    at 4000 dpi. For me, CDs are just too small to be practical. I still
    have a lot of archiving to do, and since I burn two DVDs per set, one
    kept locally and one remote, I can go through a lot of them in a hurry.
    I also have over 3 terabytes of on-line storage using 250 and 300 gig
    Ultra ATA HDs in USB-2 enclosures. Two external drives plus two or
    three internal ones is a lot of storage when you take into account 4
    computers. I'm running a gigabit network over Cat5e cable, but would
    prefer something a tad faster.
    Moderator? No such thing. It's just up to the users to either guide
    the conversation back, or start a new thread.

    Roger Halstead (K8RI & ARRL life member)
    (N833R, S# CD-2 Worlds oldest Debonair)
    www.rogerhalstead.com
     
    Roger, Sep 23, 2005
    #48
  9. Sorry, I still stand by what I said.

    And what I said was: adding noise to an image whose histogram has
    "gaps", so that such gaps are filled, is a reversible operation provided
    that:
    - the original histogram is not discarded
    - noise is applied in an appropriate way
    - the algorithm used is known (but this is obvious, and holds for every
    "reversible", or "lossless", operation)

    Instead of going to great lengths trying to explain why I still think
    so, I'll provide an example.
    If you download http://ljl.150m.com/scans/snoise.tar.gz , you will find
    a small (and ugly) C program that does what I'm talking about.

    There is a Linux executable included; sorry, no Windows compiler
    available at the moment.

    You should get a posterized ("gapped histogram") grayscale image and
    save it to "raw" format (Photoshop's raw is fine).
    Or you can use the included "test.raw".

    Using "test.raw", run

    ../snoise -add 300 465 test.raw test_noise.raw histogram.txt

    Now load the output image in Photoshop or something: you'll see that
    noise has been added, and that the histogram has no gaps anymore. The
    histogram is still anything but "smooth" (you'll see what I mean), but
    there are no gaps in any case. I'm sure a decent algorithm could make
    it smooth for real.

    Now run

    ../snoise -rem 300 465 test_noise.raw test_denoised.raw histogram.txt

    You can check with Photoshop that "test_denoised.raw" is perfectly
    identical to the original "test.raw" (save bugs, but it works with
    the included test image).


    Just in case, the general syntax is

    ../snoise [-add|-rem] ImageWidth ImageHeight InputImage.raw
    OutputImage.raw HistogramFile.txt
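    For anyone who doesn't want to grab the tarball, here is a rough sketch
    of the idea (this is *not* the posted snoise source; the midpoint-split
    rule and the function names are just assumptions for illustration).
    Each value present in the original image gets spread over the empty
    bins around it, and the saved list of occupied bins is enough to snap
    every pixel back:

        #include <stdlib.h>

        /* Spread pixel values over the histogram gaps. occupied[v] is
         * nonzero for every value present in the original image, i.e.
         * the saved histogram. */
        static void add_noise(unsigned char *px, size_t n,
                              const int occupied[256])
        {
            for (size_t i = 0; i < n; i++) {
                int v = px[i], lo = 0, hi = 255;
                int prev = v - 1, next = v + 1;
                while (prev >= 0 && !occupied[prev]) prev--;
                while (next <= 255 && !occupied[next]) next++;
                if (prev >= 0)   lo = (prev + v) / 2 + 1;  /* midpoint down */
                if (next <= 255) hi = (v + next) / 2;      /* midpoint up   */
                px[i] = (unsigned char)(lo + rand() % (hi - lo + 1));
            }
        }

        /* Undo it: snap every value back to the nearest occupied bin. */
        static void remove_noise(unsigned char *px, size_t n,
                                 const int occupied[256])
        {
            for (size_t i = 0; i < n; i++) {
                int best = 0, bestdist = 256;
                for (int v = 0; v < 256; v++)
                    if (occupied[v] && abs(v - px[i]) < bestdist) {
                        bestdist = abs(v - px[i]);
                        best = v;
                    }
                px[i] = (unsigned char)best;
            }
        }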
    by LjL
     
    Lorenzo J. Lucchini, Sep 24, 2005
    #49
  10. It all depends on what you mean by "corruption", and by "uncontrollably".
    Sure, as I'm adding noise to the data, I'm "corrupting" them. Also, the
    final result would possibly be more pleasing with some interpolation.

    However, right now I wasn't focusing on the pleasantness of the final
    result, but just on its reversibility.

    The method I quickly implemented does not give results that are
    exceptionally pleasing to the eye; on the other hand, it is completely
    reversible.

    So while I can understand "corruption" from an aesthetic point of
    view, I cannot understand "uncontrollability".

    Just feed the program the "corrupted" image together with the original
    histogram, and you'll get back precisely the original image.
    *Not just* one having the original histogram, but the *same* image.
    Please note that I'm not claiming to *remove posterization*.
    That's also why I snipped what you wrote at the beginning of the
    article: with your black-and-white example images, applying "my method"
    wouldn't remove posterization by any stretch of the imagination.

    It will just add noise, the result perhaps not being very good-looking,
    but reversible.

    http://ljl.150m.com/scans/bw_orig.tif - a B/W image like one you suggest
    http://ljl.150m.com/scans/bw_noise.tif - that image with noise applied
    http://ljl.150m.com/scans/bw_back.tif - that image with noise removed
    (change the extensions to .gif if you prefer)

    You can see that the third image is identical to the first (not just
    that the *histograms* are identical).
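    (If you'd rather not rely on eyeballing or on Photoshop's difference
    tools, a trivial byte-for-byte comparison settles it; "rawcmp" below is
    just a hypothetical helper name, assuming plain headerless raw files:

        #include <stdio.h>

        int main(int argc, char **argv)
        {
            if (argc != 3) {
                fprintf(stderr, "usage: rawcmp a.raw b.raw\n");
                return 2;
            }
            FILE *a = fopen(argv[1], "rb"), *b = fopen(argv[2], "rb");
            if (!a || !b) {
                fprintf(stderr, "cannot open input files\n");
                return 2;
            }
            int ca, cb;
            do {                          /* compare byte by byte */
                ca = fgetc(a);
                cb = fgetc(b);
                if (ca != cb) {
                    printf("files differ\n");
                    return 1;
                }
            } while (ca != EOF);          /* both ended together */
            printf("files are identical\n");
            return 0;
        }

    Under Unix, "cmp" does the same job, of course.)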

    That's all I wanted to demonstrate: that noise can be applied in a way
    that is reversible, as long as the histogram is considered a legitimate
    part of the data needed for reversing.
    By the way, the new http://ljl.150m.com/scans/snoise.tar.gz I've just
    uploaded contains also a Windows executable.

    by LjL
     
    Lorenzo J. Lucchini, Sep 25, 2005
    #50
  11. T. Wise

    Roger Guest

    You will probably find my name heavily involved in the discussion on
    RPD. The question and discussion resurfaces over there every few
    months.

    I'm glad to hear they are starting up some groups devoted to archiving
    data, but I'm afraid that the questions and discussions will most
    likely stay with the photography and scanning groups where the
    interest lies. It's difficult to get discussions pertaining to what
    people are doing in a group to move to a specific group even if it is
    devoted to that item. Sorta like RPD and RPD.slr-systems. There is
    still more SLR stuff on RPD than on the systems group many months after
    the creation of the new group. Enough that if you are looking for
    digital SLR stuff, you still need to read RPD in addition to the digital
    slr-systems group.

    Nothing shows up on a search of the newsgroup index using
    archiving.digital. Actually, only one group shows up under archiving
    and that is of no use. Nothing under digital or archive pertaining to
    data archiving either.
    OTOH, I'm on Charter and the newsgroup service they use ... well...
    stinks is the most polite way I can put it. <:)) It's slow, it's
    throttled, connections are limited, and it's been my experience it
    takes forever to get something added even if you have the correct and
    complete name.

    It looks like I'm going to have to find one of the subscription news
    services with fast turnaround and a logical layout.
    I would if I could, but I can't.
    Absolutely zip on Charter.

    Roger Halstead (K8RI & ARRL life member)
    (N833R, S# CD-2 Worlds oldest Debonair)
    www.rogerhalstead.com
     
    Roger, Sep 26, 2005
    #51
  12. T. Wise

    Father Kodak Guest

    Also why Intel has had to rush dual core CPUs into the market. "Intel
    Inside" but "AMD Leading" these days.
    Even for non-dual processor aware tasks. Windows does run "smoother"
    with dual processors, even for "normal" tasks.
     
    Father Kodak, Sep 26, 2005
    #52
  13. T. Wise

    Father Kodak Guest

    Depends which technology.

    Tape is very entrenched in the enterprise. No way would that
    technology survive if it weren't reliable. Too many requirements
    these days for data retention.

    I'm not sure if you are in the US or not. If you are, have you heard
    of Sarbanes-Oxley? (or Sarbox?) If tape couldn't perform the
    required data backup for data integrity purposes, then there would be
    a wholesale rush to the corporate exits for something better.
    Yes, more than once with excellent results. But I've never skimped on
    the drives or media.
     
    Father Kodak, Sep 26, 2005
    #53
  14. Well, I'm not just applying noise randomly (otherwise the operation
    wouldn't be reversible anyway): I *am* taking the data into account, by
    taking the histogram into account.
    I'm not really so concerned with reversibility. In fact, I have no
    problem believing that many *irreversible* transforms would show much
    better results than my simplistic algorithm.

    It's just that what you said here...

    --- CUT ---

    [myself]
    [you]
    The problem is that's just physically impossible. There's only a fixed
    amount of pixels in an image, and the histogram shows them all.

    --- CUT --

    .... is not really true.

    Whether this fact has any useful practical applications, I'll leave to
    more skillful eyes to judge. But it's not impossible.
    That's certainly a reasonable thing to do. Still, if one could find a
    posterization-hiding algorithm that is *both* nice-looking and
    reversible (it might not exist of course), there would be no need at all
    to keep a copy.

    Well, as long as that's the *only* transformation applied to the
    original image... so I guess this is all more theoretical than anything.
    But who knows, theory sometimes has applications one doesn't think of
    offhand.
    by LjL
     
    Lorenzo J. Lucchini, Sep 26, 2005
    #54
  15. T. Wise

    HvdV Guest

    Indeed. The DAT tapes I used to have were not good in this respect. Rumour
    has it that this extends to all helical-scan drives. Linear tape technology
    (DLT, LTO) is far more reliable, in any case better than CDs. To get back
    to the original topic for a bit: as good as B&W film.

    -- Hans
     
    HvdV, Sep 26, 2005
    #55
  16. That's fine. But then you're saying that VueScan is good *because* of
    objective (if a bit vague, as they've been defined in this subthread)
    facts like the histogram etc.
    That's the way I see it, too.
    I'm not qualified to give you a decent reply on this. By scientific or
    objective testing I mostly mean taking *measurements* (not visual
    observations) to discover things about
    - how much information (in an information-theory sense) the scanning
    software loses from the "raw" scan
    - how capable the scanning program is of restoring important parameters of
    the scanned picture (colors, sharpness, etc.); I suppose this can be
    measured objectively with things like test targets
    - how good the scanning software is at instructing the scanner to
    extract as much information as it is physically capable of (things like
    setting exposure, setting focus, multi-sampling, possibly decent
    multi-pass multi-sampling...)

    Then, I don't really know which tests are good for determining what.
    Actually, I mostly don't know what the available tests do at all.

    But I just had the impression that you were rejecting the idea of
    "objective testing" itself (well, in the "scanner software" domain at
    least).
    I disagree wholeheartedly with such a position: even though I'm not
    competent enough to know the specifics, I definitely know that the
    parameters I listed above *are*, mostly, measurable. And ought to be
    measured.
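    Just as one concrete example of the first item in my list above (a
    sketch only, with a made-up helper name, not a proposal for a complete
    test suite): the Shannon entropy of a channel, estimated from its
    histogram, is a number you can actually measure and compare between the
    raw scan and the program's output, instead of judging by eye.

        #include <stddef.h>
        #include <math.h>

        /* Estimate the Shannon entropy (bits per pixel, 0..8) of an
         * 8-bit channel from its histogram. A crude measure: it ignores
         * spatial structure, but it is at least a measurement. */
        double channel_entropy(const unsigned char *px, size_t n)
        {
            unsigned long hist[256] = {0};
            for (size_t i = 0; i < n; i++)
                hist[px[i]]++;

            double bits = 0.0;
            for (int v = 0; v < 256; v++) {
                if (hist[v] == 0)
                    continue;
                double p = (double)hist[v] / (double)n;
                bits -= p * log2(p);      /* this bin's contribution */
            }
            return bits;
        }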


    The answer to your second question is easy: I *don't* know that Don has
    done any objective testing. He might have, or he might have not.

    But please understand that I wasn't specifically defending Don or
    accusing VueScan (I'd have to know both better in order to "defend" or
    "accuse" either).
    I was just taking a stance against what I understood to be your idea of
    "testing a scanner program".


    by LjL
     
    Lorenzo J. Lucchini, Sep 26, 2005
    #56
  17. T. Wise

    Roger Guest

    Prints are not usually considered anywhere near as long-lived as
    negatives and slides. OTOH they usually receive far rougher treatment
    than slides or negatives.
    Well... in general, but those negatives and slides don't carry any
    guarantees either. I have a number of Kodachrome and Ektachrome
    slides I scanned and restored whose originals were on their last
    legs. Most, but not all, of these were well taken care of, so the only
    reason for the color shifts and fading that I can think of would be the
    quality of the processing.

    I do think it's safe to say *most* slides and negatives will be
    available in a form we can use far longer than any current digital
    media, with those few exceptions where the image is fading due to
    either poor processing or abuse in storage.

    Roger Halstead (K8RI & ARRL life member)
    (N833R, S# CD-2 Worlds oldest Debonair)
    www.rogerhalstead.com
     
    Roger, Sep 27, 2005
    #57
  18. T. Wise

    Roger Guest

    I thought you were referring to a new news group. I have most of the
    posts in the previous thread/discussion on RPD.

    I also have one of the reference pages
    http://www.rogerhalstead.com/scanning.htm
    Somewhere around here I have an e-mail from a college that is using
    this page in one of their photo courses.

    It does need updating, but not a lot has changed.

    Roger Halstead (K8RI & ARRL life member)
    (N833R, S# CD-2 Worlds oldest Debonair)
    www.rogerhalstead.com
     
    Roger, Sep 27, 2005
    #58
  19. I think we understand what each other is saying; it's just a matter of
    terminology.

    I'm not so terribly strong in information theory, but as far as I
    understand, "lossless" and "reversible" are synonyms in this context: a
    "lossless transformation" is one where the original input can be exactly
    reconstructed from the output; and that's the same as a "reversible
    transformation".

    Just think of how these terms are used about compression: LZW
    compression is lossless, as it is reversible. JPEG compression is lossy,
    as it is irreversible.
    Yes, I see your point.

    But! The transformation is still lossless, even though when you apply LZW
    compression (to continue the example above) you destroy pixels: you
    actually *remove* some of them (which is the whole purpose of
    compressing).

    It's just that your image viewer automatically applies the inverse
    transformation to get the original input back to you.

    It seems that, in the end, your definition of "lossless" is "don't touch
    the image".

    Think of it, even if you could somehow *add* pixels (so that your "all
    pixels are taken" wouldn't hold true anymore), it still wouldn't be
    lossless under your definition: how would you know which pixels are
    "original" and which ones were added?


    Also, you talk about "destroying some of the existing pixels".
    But wait a moment: how could I add new data (that is, random noise)
    while maintaining reversibility without taking up more bytes than the
    original image?

    The fact that I can shows that some of the stuff that was in the image
    *wasn't information to begin with*.
    In fact, the "existing pixels" in the original image conveyed much less
    information than they would be able to.

    Why? Precisely because of posterization. For each given pixel whose
    position in the histogram is in the middle of a gap, you can't really
    say what its real value would have been: for some reason, the image we
    have is posterized, and thus one single pixel value can map to a *range*
    of values in the original picture.

    If you take each of these pixels and give it any random value *in that
    range*, you aren't really taking away anything from the image, or
    "corrupting" it.
    That's because each of the pixels *really could* have had any of those
    values (in the original picture), and we have no way to know which.

    This is precisely what my program does.


    Take a posterized image. Feed it to my program. Then take a lossless
    compression algorithm, and compress both the original image and the
    "noised" one.
    The original image will compress much better, mostly because it never
    contained the information that my program supposedly "destroyed" (and
    which, in fact, it didn't destroy).
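    (A quick way to try this, assuming zlib is available; link with -lz.
    "deflated_size" is just an illustrative helper, and error checking is
    skipped:

        #include <stdlib.h>
        #include <zlib.h>

        /* Return the size of a buffer after zlib compression at the
         * default level. The posterized original should come out smaller
         * than the "noised" version, since the added noise is extra
         * entropy the compressor has to store. */
        static unsigned long deflated_size(const unsigned char *buf,
                                           unsigned long len)
        {
            uLongf out_len = compressBound(len);
            unsigned char *out = malloc(out_len);
            compress(out, &out_len, buf, len);
            free(out);
            return out_len;
        }
    )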
    See, now you're using what I take to be the correct definition of
    "lossless"!

    You say, "curves editing isn't lossless because applying the inverse
    curve won't recover the image [because of rounding errors, given a small
    enough number of bits per channel]".

    This is correct.
    On the other hand, it would obviously *not* be correct to say that
    "curves editing isn't lossless because it changes the original pixel
    values" -- duh, of course it does!
    But this looks like precisely the objection you make to my rant about
    adding noise.
    No, not more importantly, but much less importantly.

    In practice, sure, this will be an important factor.
    But in theory, it's implicit in the terms "lossless" and "reversible"
    that you have to know the full algorithm (that is, in this case, the
    various editing steps) that transformed the data.
    Come on, no, not really for images larger than a hundred squared pixels
    or so!
    Unless Photoshop macros are *really* bloated. But even then, there are
    other ways than Photoshop macros.
    I'm sure that this is true in practice.
    Yes, agreed.

    But it wasn't so much to suggest that "my method" could be used instead
    of good old archival; it was more to convey the idea that my "noised"
    images contain the same information as the originals (and more, i.e. the
    noise, but that doesn't matter to us).


    Last example: take an image that is all black on the left and all white
    on the right, with no grays - again like your own example.
    If you know that this is not a faithful reproduction of the original
    picture, but rather a result of posterization/quantization, then you may
    suppose that the original picture could have been a continuous shade of
    gray... or something else, you can't really know.

    So what happens if you add noise to the black/white image with my
    program? My program will fill the black part with random values 0..127,
    and the white part with values 128..255.

    What have you lost in the process? Nothing, as you didn't really know
    that the "black" part was black (instead of varying from 0 to about 127)
    and that the "white" part was white (instead of varying from about 128
    to 255).


    by LjL
     
    Lorenzo J. Lucchini, Sep 27, 2005
    #59
  20. The only way not to lose information (in the sense you use this term!)
    would be not to change the input at all.
    A lossy operation can't be reversible.

    Can you name a "lossless" operation (i.e. where "no information is
    lost"), which doesn't need to be reversed to get back the input?

    The only one I can think of is the null operation.
    Cool. So, I don't destroy pixels when I apply my noise addition,
    otherwise I would not be able to get them back.

    And I do get them back. Try and see! Where *is* the difference between
    this and LZW? They both get back *exactly the same input image with no
    changes at all*, so why is my noise addition "destroying data" while LZW
    is not?
    Nor is information destroyed with my noise addition.
    Well, no information is lost.

    You have a set A of data, and you transform it to a set B of data by
    applying an operation O. If there exists an operation O' that can take B
    as input and give back A, then O was lossless/reversible.

    I still can't understand *where* you disagree with this concept.

    (My noise addition operation is obviously only lossless when the
    histogram of the original image "A" is considered part of the output
    "B", but I've made that clear multiple times. This is the only caveat I
    can see, though)
    That's why I said "even if you could". What I wanted to say is that the
    point is moot, since under your definition, it wouldn't be lossles *even
    if* this could be done.
    But I can. Just run the program. You'll see that you'll get back your
    original input every time, save bugs (i.e. "maintain reversibility"),
    and yet, noise will have been added in the output.
    Yes, it is lossy by definition.
    But no, that's not what I'm doing.

    I'm not removing marginal data; I'm removing *no* data.

    Answer this please: if you take an image.tif, then compress it to
    image.jpeg, then convert image.jpeg to image2.tif, will
    image.tif=image2.tif ever hold true?

    I bet it never will. Precisely because JPEG is lossy.

    On the other hand, if you take an image.tif, apply noise with my program
    and get an image_noise.tif, then use my program to remove the noise and
    get an image_denoised.tif, it will be true that
    image.tif=image_denoised.tif (though, again, you must feed my program
    the histogram of image.tif as well).
    This may be sound advice, but it's not the point.
    I'm not trying to demonstrate that my program is the best way to remove
    posterization, but only that it performs a lossless operation.
    I think the program can be improved to make the image look less
    terrible, hopefully to a point where the result will look better than
    the original posterization.

    But, for the moment, I realize perfectly that the patient died; I still
    think I can show that the operation was a success.
    The data does not exist *in the input*! Obviously, I'm talking about the
    input to my noise adder, that is the original scan.

    If the scan was made in 8-bit, then the data does not exist in it.

    If you scan at 16-bit, it's all different. And, granted, scanning at
    16-bit is the best way to go in many cases.
    Of course! But as you say, "for whatever reasons" we have an 8-bit file.
    Clearly, all the arguments about reversibility and lossless operations
    must be about *that* file.

    Otherwise, it's like saying, "no, LZW isn't lossless because you could
    have gotten a more genuine image using a better scanner [or scanning in
    16-bit]". Half of this sentence is true, but the other half doesn't make
    any sense. Guess which is which?
    "Recreate"? I've never claimed I can "recreate" useful 16-bit channels
    from a single scan of 8-bit channels.

    (That is, of course, unless the scanner *really* only outputs 8
    meaningful bits; but you said that even a low-end scanner has meaningful
    data beyond the 8th bit, and I'll take your word for it)

    In regard to the actual 16-bit data, *the* 8-bit file itself is
    corruption! But that wasn't the point of the current discussion, which
    was about "information", "corruption" and "lossless/reversible
    operations" relative to *the data you have*.
    If you can get better data by sampling the target picture again, that's
    another matter.
    Sure. Please don't get the impression that I'm rejecting this possibility.
    I currently scan at 8-bit mainly because of file size and bus speed, but
    I'm absolutely not trying to claim that scanning at 16-bit is worthless.

    The case is simply: you have a posterized image, and you can't or won't
    scan it again at a higher bit depth. What can be said about such an
    image? What reversible operations can be applied on it? Is artificial
    noise perceptually better than posterization? etc.
    Yes. By "something else" I didn't mean "anything else"... but there is a
    wide range of possibilities; all involve a transition, sure.
    Exactly! I've gained nothing, as far as information theory is concerned.
    But, let me stress once again that I've also lost nothing.

    So, what's the purpose in adding grain to a clean (but posterized) image?

    Well, if there is a purpose, it is perceptual. I think the human eye
    simply likes noise more than posterization; after all, quantization is
    a concept probably foreign to our brain, while noise, on the other hand,
    is everywhere in what we see.

    So, if we have an excessively low-bit-depth image (which is essentially
    what a posterized image is), I think our eyes prefer it displayed with
    noise in the (otherwise empty) lower bits rather than with those bits at zero.

    This is essentially the whole point of my test program. It fills the
    lower bits with random data, where these lower bits would otherwise be
    consistently at zero.

    (In reality, it's a little more complicated than just "filling in the
    lower bits", as the gaps in the histogram might not be uniformly
    spaced, in which case the situation can't be precisely compared with a
    low-bit-depth condition; but the overall concept keeps working.)
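    In the uniformly spaced case, the whole thing boils down to something
    like this (a toy sketch, not the test program itself):

        #include <stdlib.h>

        /* Only the top four bits carry data: the "noise" is just random
         * low bits, and undoing it is a plain mask. No histogram is even
         * needed in this uniform case. */
        unsigned char fill_low_bits(unsigned char v)
        {
            return (unsigned char)((v & 0xF0) | (rand() & 0x0F));
        }

        unsigned char mask_low_bits(unsigned char v)
        {
            return (unsigned char)(v & 0xF0);
        }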
    Amen!
    But this doesn't always make it useless to discuss "fixing the
    consequences". If you can, you solve the root problem, but if for some
    reason you can't, it can be helpful.
    You know, you can't just solve everything with "go take a better scan".
    The software tricks are useful sometimes.

    by LjL
     
    Lorenzo J. Lucchini, Sep 28, 2005
    #60