VGA vs DVI connection to monitor

Discussion in 'Digital Cameras' started by DaveS, Jan 21, 2011.

  1. DaveS

    DaveS Guest

    The stuff I find using Google doesn't seem to be authoritative in any
    way on this question.

    I am receiving a new monitor (Dell U2311h) next week, and it can be
    connected by various types of cable. My graphics card can use VGA or
    DVI. The question is, will I experience any benefit by paying for and
    using a DVI connection over using the included VGA connection?

    I'm interested specifically in responses relating to photo editing.

    Dave S.
     
    DaveS, Jan 21, 2011
    #1

  2. DaveS

    David J Taylor Guest
    Dave,

    Many Dell monitors are supplied with both VGA and DVI cables, so you may
    be able to test for yourself with no extra cost. Here, I've not noticed
    any significant difference - perhaps even /any/ difference - between VGA
    and DVI cables. I have heard that some monitors don't allow control over
    digital inputs, but I don't think that will apply to the Dell.

    What I /would/ urge you to try is to clear some desk space, keep both
    old /and/ new monitors, and use the two outputs on your graphics card to
    create a two-monitor setup. Going from one monitor to two was one of the
    most productive changes I made on my PC.

    Cheers,
    David
     
    David J Taylor, Jan 21, 2011
    #2

  3. DaveS

    DaveS Guest

    Thanks for the advice.

    I'm afraid I'm far behind you on the second part of your message,
    though. My previous (current, as of today) monitor is a CRT. I had
    resisted replacing the CRT with an LCD after reading that LCDs were not
    as colour accurate. Then, this week, my brother's view of a photo
    differed so drastically from mine that I checked a calibration web
    site and found I was missing several distinctions at the dark end.

    One step at a time.

    Dave S.
     
    DaveS, Jan 21, 2011
    #3
  4. DaveS

    Eric Stevens Guest

    I'm now on my second Dell monitor using DVI and I wouldn't go back to
    the less stable analogue connection.

    What I would highly recommend is that if you are doing photo editing
    you should invest in screen calibration equipment (I use Spyder).
    Small and almost subtle changes in the screen can result in large
    changes in what you print. It's highly frustrating when you can't get
    the same print results as you did several months previously.



    Eric Stevens
     
    Eric Stevens, Jan 21, 2011
    #4
  5. DaveS

    N Guest

    Why do you think the DVI connection will cost you more money? There'll
    be a DVI cable in the box.
     
    N, Jan 21, 2011
    #5
  6. DaveS

    DaveS Guest

    I set out to prove you wrong, but I stand corrected:
    What's in the Box

    Monitor with stand
    Power Cable
    DVI Cable
    VGA Cable (attached to the monitor)
    Drivers and Documentation media
    USB upstream cable
    Quick Setup Guide
    Safety Information

    I believe that I have purchased and set up LCD monitors for others where
    there was a DVI connector but no cable. Clearly, there is no cost for me
    to find out for myself if there is any visible difference with this monitor.

    Dave S.
     
    DaveS, Jan 22, 2011
    #6
  7. DaveS

    N Guest

    My new Dell PC at work has a video card with 3 DisplayPort connections.

    http://en.wikipedia.org/wiki/List_of_display_interfaces
     
    N, Jan 22, 2011
    #7
  8. DaveS

    Floyd L. Davidson Guest

    Whether you think you can see it on any given displayed
    image or not, use the DVI connection.

    I won't go so far as to say digital data is vastly
    better than analog data, but it is certainly better and
    you get significantly improved precision. Another point
    is that with age the VGA interface will drift far more
    than will the DVI interface.
     
    Floyd L. Davidson, Jan 22, 2011
    #8
  9. DaveS

    Robert Coe Guest

    : > Clearly, there is no cost for me
    : >to find out for myself if there is any visible difference with this monitor.
    :
    : Whether you think you can see it on any given displayed
    : image or not, use the DVI connection.
    :
    : I won't go so far as to say digital data is vastly
    : better than analog data, but it is certainly better and
    : you get significantly improved precision. Another point
    : is that with age the VGA interface will drift far more
    : than will the DVI interface.

    DVI might be slightly more resistant to RF interference, especially if the
    cable is long. But in normal use, it's very unlikely that you'll be able to
    see any difference in image quality. That said, there's no reason not to take
    Floyd's advice: if your card supports DVI, you might as well use it.

    I have two dual-monitor setups at work, one of which uses one monitor on DVI
    and one on VGA. On that setup, I can see a slight color difference between the
    two monitors, but not enough to be annoying. On the setup with two DVI
    monitors connected to the same video card, the colors look identical (given
    identical settings of the monitors, of course).

    Bob
     
    Robert Coe, Jan 24, 2011
    #9
  10. DaveS

    Floyd L. Davidson Guest

    In normal use it should be an obvious difference. A
    digital interface sends a specific discrete value to the
    monitor. It is the exact same value each time, and is
    calculated from the value in the digital image file. It
    doesn't change, and has the same accuracy each time.

    The VGA interface has to convert the digital value to an
    analog value, and then the monitor has to use the
    timing of a dot clock to pick out the precise time that
    the right value is made available. It is not nearly as
    precise as the process used by the digital interface.
    It can never be as accurate. The received value *is*
    different, and that difference is error; the digital
    interface introduces no such error.
     
    Floyd L. Davidson, Jan 24, 2011
    #10
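Floyd's point about conversion error can be sketched numerically. This is an editor's illustration, not code from the thread: a DVI-style link carries the 8-bit code itself, while a VGA-style link converts it to a voltage (the 0-700 mV swing matches the VGA convention; the noise figure is an assumed value) and the monitor quantizes the voltage back.

```python
import random

def send_digital(value):
    # A DVI-style link carries the 8-bit code itself; barring link failure,
    # the monitor receives exactly the value that was sent.
    return value

def send_analog(value, noise_mv=5.0):
    # A VGA-style link converts the 8-bit code to a voltage (0-700 mV),
    # which picks up noise before the monitor samples it at the dot clock.
    volts = value / 255.0 * 700.0
    volts += random.gauss(0.0, noise_mv)
    # The monitor quantizes the received voltage back to an 8-bit level.
    return max(0, min(255, round(volts / 700.0 * 255.0)))

random.seed(1)
pixel = 128
digital = [send_digital(pixel) for _ in range(1000)]
analog = [send_analog(pixel) for _ in range(1000)]
print(all(v == pixel for v in digital))  # digital path: value reproduced exactly
print(len(set(analog)))                  # analog path: several distinct levels
```

One LSB here is about 2.7 mV, so even a few millivolts of assumed noise makes the received analog level wander across adjacent codes, which is exactly the "difference is error" claim above.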
  11. DaveS

    Eric Stevens Guest

    Unless the monitors are calibrated, it might be two different errors.



    Eric Stevens
     
    Eric Stevens, Jan 24, 2011
    #11
  12. DaveS

    Floyd L. Davidson Guest

    He specified "identical settings of the monitors, of course". They will
    by definition be the same.
     
    Floyd L. Davidson, Jan 24, 2011
    #12
  13. DaveS

    David J Taylor Guest

    : In normal use it should be an obvious difference.

    Maybe it /should/, but in practice it does not (at least on correctly
    adjusted monitors).

    Cheers,
    David
     
    David J Taylor, Jan 24, 2011
    #13
  14. DaveS

    Floyd L. Davidson Guest

    I don't agree with your statement at all. In practice,
    a digital interface sends *exactly* the same
    value every time.

    The problem with the analog interface is that it isn't
    exactly the same every time.

    And that of course is precisely the distinction between
    digital and analog when it is affected by noise. The
    digital system can function with a much lower SNR than
    can an analog system. It's fundamental.
     
    Floyd L. Davidson, Jan 24, 2011
    #14
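The SNR argument can be illustrated the same way: an analog receiver reproduces whatever level arrives, noise included, while a digital receiver only has to decide which side of a threshold the level falls on. A minimal sketch (editor's illustration; the noise level and unit signal swing are assumed values, not measurements):

```python
import random

def analog_error(noise_sigma, trials=2000):
    # Analog: every bit of noise on the line shows up directly
    # as error in the reproduced level.
    errs = [abs(random.gauss(0.0, noise_sigma)) for _ in range(trials)]
    return sum(errs) / trials

def digital_error(noise_sigma, trials=2000, swing=1.0):
    # Digital: the receiver only decides between two levels, so noise
    # causes no error until it exceeds half the signal swing.
    wrong = 0
    for _ in range(trials):
        rx = swing + random.gauss(0.0, noise_sigma)  # a "1" is sent
        if rx < swing / 2:                           # decided as "0"
            wrong += 1
    return wrong / trials

random.seed(2)
print(analog_error(0.1))   # nonzero average level error
print(digital_error(0.1))  # noise well below the decision threshold
```

With the assumed noise at a tenth of the swing, the analog path degrades continuously while the digital path makes essentially no bit decisions wrongly, which is the "much lower SNR" point made above.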
  15. DaveS

    David J Taylor Guest

    Yes, you can get the "right" value into the monitor, but the issues of
    drift and calibration inside the monitor are just the same as with an
    analogue input monitor. I find that, in practice, drift of the analogue
    components in a VGA interface isn't an issue, and neither have I seen VGA
    signals affected by electrical noise even on moderate cable runs. Perhaps
    I've been lucky!

    Cheers,
    David
     
    David J Taylor, Jan 24, 2011
    #15
  16. DaveS

    DaveS Guest

    OK, so theory says there is a difference in the signal received by the
    monitor, depending on whether it's coming from a digital or an analogue
    source. Experience shows there is no noticeable difference.

    Does anyone know how laptops connect video processor with display?

    Dave S.
     
    DaveS, Jan 24, 2011
    #16
  17. DaveS

    Savageduck Guest

    My MacBook Pro has a proprietary "Mini DisplayPort" (MDP), which
    supports VGA, DVI, and HDMI via Apple's MDP adapters.
    The same MDP is found on the current iMacs and Mac Mini.
     
    Savageduck, Jan 24, 2011
    #17
  18. DaveS

    Eric Stevens Guest

    It all depends what you mean by 'settings'. Even though they are set
    identically for brightness, contrast, etc., they may still need two
    slightly different ICC color profiles. Even slightly different
    lighting of the respective work areas may be sufficient to require two
    different ICC profiles. I accept I am close to quibbling.



    Eric Stevens
     
    Eric Stevens, Jan 24, 2011
    #18
  19. DaveS

    Eric Stevens Guest

    For a given definition of 'noticeable'.


    Eric Stevens
     
    Eric Stevens, Jan 24, 2011
    #19
  20. DaveS

    DaveS Guest

    I don't know Macs, but surely you're talking about an external port. My
    question was about the internal connection between the video
    processor and the display.

    Dave S.
     
    DaveS, Jan 24, 2011
    #20