Which ATI (or other? better?) card 1) has DVI and 2) will work with an OLDER AGP chip?

Discussion in 'Amateur Video Production' started by MISS CHIEVOUS, Oct 20, 2006.

  1. Benjamin & Frodo . . . you guys are JUST THE GREATEST!! Honestly, I
    can't thank you enough for this intelligence! --and it will be an
    excellent resource for anyone else who faces this issue.

    I'd like to focus this discussion on the "end game": what result is it
    exactly that I'd like to achieve if I buy this monitor. In the
    best-case scenario, two results would be possible:
    - - - - - - - - - - - - - - - - - - - - - - - - -
1. I would turn this monitor 90° and the graphics card driving it
would be capable of IMMEDIATELY self-adjusting to the (10:16) 1200 x
1920 format. I wouldn't need to access the video card's software
(control panel) in any way to instruct the screen to adjust; the
hardware __alone__ would be smart enough to reset automatically.

    and (very important!)

    2. The video card (and/or monitor itself) would -->>REMEMBER MY
    SETTINGS for each of the two respective modes I would be using (16:10
    and 10:16). I would not have to reset the display via the graphic
    card's control panel each time I physically rotated the monitor; it
    would remember the layout and display it automatically.
    - - - - - - - - - - - - - - - - - - - - - - - - -

    Some of this is beyond the reasonable scope of your ability to advise
    (you almost need to have this monitor before you to see if these
    features are available or not), so I'm not expecting an answer for this
    specific monitor.

    "Generally" speaking -- to the extent you know about monitors that
    PIVOT (can be physically rotated) -- are these features automatic? or
    are they dependent upon the video card that is driving the monitor?
    Are they limited by the generation of processor on the motherboard? or
    is this an issue of the OS?

    As I see it, I have exactly five outcomes to consider:

    1. Windows 2000 Pro + ATI Radeon 7200 = Monitor PIVOTS 90°
    2. Windows 2000 Pro + **NEW AGP GRAPHICS CARD** = Monitor PIVOTS 90°
    3. Windows XP + ATI Radeon 7200 = Monitor PIVOTS 90° automatically
    4. Windows XP + **NEW AGP GRAPHICS CARD** = Monitor PIVOTS 90°
    --------. . . and STILL this won't be automatic? = Doubtful I'll buy
    this monitor
    5. Windows XP + **NEW AGP GRAPHICS CARD** = Monitor PIVOTS 90° with
    software intervention by User

My current ATI Radeon 7200 has the "Rotate" feature. The question is,
is it automatic?


    MISS CHIEVOUS, Oct 25, 2006

If I remember right the 2407FPW doesn't have a rotation sensor, which
means the computer can't know whether you have rotated the display or
not. So you always have to change the setting manually.
If the monitor doesn't tell the computer that it has been rotated,
this simply is not possible. However, setting up pivot is only one
setting to change. You don't have to change the resolution; just select
pivot mode in the display properties or in the tray utility.
On most monitors, no. There are a few models that detect the display
rotation and tell the computer to change settings, but these usually
are very expensive professional models.
Whether pivot is done automatically is dependent on the monitor. Whether
pivot can be done at all is dependent on the gfx card (every Radeon or
Geforce does fine).
No, but it also is an issue of the driver; the OS (Windows 2k/XP/Vista)
usually isn't the limiting factor. Whether it's automatic depends on
the monitor, not on the gfx card.
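The point above - that without a rotation sensor, pivot is just one
user-toggled setting the driver applies - can be sketched in a few
lines. This is Python purely as an illustration; the class and
method names are hypothetical, not any driver's actual API:

```python
# Toy model of a pivot toggle: without a rotation sensor the OS cannot
# detect the physical turn, so orientation is simply user-set state.
from dataclasses import dataclass

@dataclass
class DisplayMode:
    width: int
    height: int
    orientation_deg: int = 0  # 0 = landscape, 90 = portrait

    def pivot(self) -> "DisplayMode":
        # One setting changes: the driver swaps the axes. The pixel
        # count stays the same, so no new resolution is needed.
        return DisplayMode(self.height, self.width,
                           (self.orientation_deg + 90) % 360)

mode = DisplayMode(1920, 1200)  # Dell 2407FPW native resolution
portrait = mode.pivot()
print(portrait)  # width and height swapped, orientation now 90
```

The takeaway matches Benjamin's answer: the toggle is trivial, but
something (here, the user calling `pivot()`) still has to trigger it.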

    Benjamin Gawert, Oct 25, 2006

  3. * Frodo:
Just be aware that some card makers no longer key the AGP connector for
1x/2x, to avoid having to support these cards in old systems.

    Benjamin Gawert, Oct 25, 2006
  4. Benjamin Gawert wrote:
    ~~~~~CAPPED and edited for Quick Reference + Summary~~~~~
    Once again Benjamin, this is very helpful and I can't thank you enough.

    This intelligence informs 4 significant purchasing decisions:
    - - - - - - - - - - - - - - - - - - - - - - - - -
The ATI Radeon "ROTATE" feature is virtually identical to the feature I
would see on even the most expensive, state-of-the-art ATI or GEFORCE card.
    - - - - - - - - - - - - - - - - - - - - - - - - -
PORTRAIT MODE capability is identical to the capability I would see on
an XP (or Vista) OS interface.
    - - - - - - - - - - - - - - - - - - - - - - - - -
    The graphics slot on the motherboard does not interface with the
    monitor to enable its "ROTATE" (or "PIVOT") mode -- only the graphics
    card + monitor itself control this feature.
    - - - - - - - - - - - - - - - - - - - - - - - - -
    and finally,
    .. . .
    I'll have to MANUALLY CHANGE the monitor settings, through my video
    card's control panel, each time I want to rotate the monitor.
    - - - - - - - - - - - - - - - - - - - - - - - - -

    It looks like the purchase decision is going to turn on whether I want
    to right-click my ATI card each time I rotate the monitor.

    Do you by any chance happen to know Benjamin: are the display settings
    at least SAVED so that each time I right-click to adjust the mode I
    don't __also__ have to completely reconfigure my desktop? I have
    numerous shortcuts displayed on my desktop, and I am wondering if those
    icons will be incapable of remembering their respective positions.

Thanks again guys. This is an incredible education in new monitors.

    MISS CHIEVOUS, Oct 26, 2006
  5. Oh wait, I need to clarify one other factor:

    - - - - - - - - - - - - - - - - - - - - - - - - -
    With earlier monitors this may have been useful; but there is now no
    significant difference in image quality between VGA and DVI.
    - - - - - - - - - - - - - - - - - - - - - - - - -

My only question here is: Does this hold true even with my older ATI
Radeon 7200 AGP card, as well? In other words, is picture quality __in
fact__ driven by the monitor . . . and I'll have the same quality with
this new 24" Widescreen using a standard VGA cable with my older AGP card?


    MISS CHIEVOUS, Oct 26, 2006

    Ray S Guest

Speaking off the cuff here, but I do know that even my old Matrox G450
allows me to create custom configurations that I can switch to. I've
never used them, but it might even be true that you can assign hot keys
to them. Worth checking the manual for any specific card you're thinking
of buying. You can usually d/l them from the manufacturer's site.
    Ray S, Oct 26, 2006

    Frodo Guest

    What's the make and model of your current CRT (monitor)?

Picture quality on VGA is affected by the DAC chips (digital-to-analog
converters) on the graphics card, which take the digital information
provided by the GPU (graphics processing unit) and change it to an
analog signal that is sent to the VGA monitor.

Each company that makes graphics cards decides how much money it is
willing to spend on the DAC.
The more money spent, the better the DAC.
DACs are a small cost for the graphics card, but companies will try to
save a few dollars by using poor-quality RAMDACs.
Matrox uses really good RAMDACs; ATI-based cards are a close second.
In the past Nvidia let card manufacturers use whatever quality they
wanted, but Nvidia started pushing card makers to use better RAMDACs
starting around the time the 5200 came out.

With DVI, there is no need for a DAC; the digital signal is sent
straight to the monitor.

My Hitachi CM 810 21" CRT (VGA) monitor had comparable quality to a
Viewsonic 922 19" (DVI) LCD.
    Frodo, Oct 26, 2006
    This is wrong.
No. Whoever says that is telling you BS. Image quality is a product of
gfx card, video cable and monitor. With VGA the gfx card has to convert
the digital image into analog signals, and the monitor again converts
the analog signals into digital data. Besides the fact that every
conversion affects image quality, analog signals are much more
sensitive to disturbances than digital signals. How much quality is
lost with VGA depends not only on the monitor but also on the video
cable and the gfx card. Of course the quality loss is not always
visible, maybe because the conditions are good (the gfx card provides
good signal quality, the video cable is good, no disturbances etc.),
because the monitor itself has an average image at best (especially
cheap noname monitors), because the monitor is small (where differences
are hardly noticeable), because the resolution is low (1280x1024 and
below), and sometimes simply because the person in front of the monitor
wouldn't notice the difference even if it bit him in the arse.
    Let me tell you something: I have a Dell 2005FPW (20" widescreen) and
    also had the 2405FPW for some time. I have several computers with gfx
    cards ranging from a cheap old Geforce2MX up to two 1700USD a piece
    Quadro FX4500 professional gfx boards. I can see a difference between
    VGA and DVI even on the 20" monitor with all gfx cards. With some cards
    the difference is hardly noticeable, with other cards it's more
    noticeable. But it's noticeable - always.

    With smaller monitors and/or lower resolutions the difference can be
    small enough to be unnoticeable, though. But you are looking at a 24"
    display with 1920x1200 resolution, don't forget that.

    And besides the image quality factor, DVI also has the advantage that
    there is no image centering or adjusting necessary. The image is always
    centered and clear and sharp, with VGA you often have to make adjustments.

    IMHO using a 24" TFT with 1920x1200 without DVI would be silly.

    Benjamin Gawert, Oct 26, 2006
  9. * Frodo:
    Nope. "DAC chips" (correctly called "RAMDAC") have been integrated into
    the graphics processors since RivaTNT/Rage Pro times, so for around a
    decade now. All RAMDACs in gfx cards of the last 8 years or so have a
    very high bandwidth (at least 320MHz, today 400+MHz is standard) and
    provide excellent signals.

However, the difference in image quality doesn't come from the RAMDAC
but from the output filters. Some card manufacturers tend to save a few
cents by using cheap filters that allow them to fulfill EMI standards
but which also limit the bandwidth. This results in a degradation of
the signal quality and thus also the image quality.
    Right. That's the main reason why DVI provides a better image quality.
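The effect of a bandwidth-limiting output filter can be illustrated
numerically. The toy model below (an assumption, not real hardware
data; the pixel clock and cutoff frequencies are round illustrative
values) runs the worst-case VGA pattern - alternating black and white
pixels - through a first-order low-pass filter and shows how a narrow
filter smears adjacent pixels together, cutting the contrast:

```python
import math

PIXEL_CLOCK_MHZ = 154.0  # roughly 1920x1200@60Hz (assumed value)

def lowpass(samples, cutoff_mhz, pixel_clock_mhz=PIXEL_CLOCK_MHZ):
    """First-order RC low-pass, sampled once per pixel."""
    dt = 1.0 / pixel_clock_mhz                # microseconds per pixel
    rc = 1.0 / (2 * math.pi * cutoff_mhz)     # RC time constant
    alpha = dt / (rc + dt)
    out, y = [], samples[0]
    for x in samples:
        y += alpha * (x - y)                  # exponential smoothing
        out.append(y)
    return out

# Worst case for analog bandwidth: alternating black/white pixels.
row = [1.0 if i % 2 == 0 else 0.0 for i in range(200)]

for cutoff in (400.0, 40.0):  # generous filter vs. cheap narrow filter
    filtered = lowpass(row, cutoff)
    contrast = max(filtered[100:]) - min(filtered[100:])
    print(f"{cutoff:5.0f} MHz filter -> remaining contrast {contrast:.2f}")
```

The wide filter leaves the pattern nearly intact; the narrow one blurs
neighbouring pixels into grey, which is exactly the softness a cheap
VGA output shows at high resolutions.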

    Benjamin Gawert, Oct 26, 2006
  10. Ah! Okay, so I __would__ need a new graphics card with DVI. Got it.

I'll have to think about what I want to do here. Unless the card can
remember my desktop (or rather, more to the point, the layout of my
shortcuts) this really is going to oblige a huge change in the way I
work. I'm so used to just having my shortcut icons right there at all times.


    MISS CHIEVOUS, Oct 26, 2006
  11. What should I make of Yousuf's post, here?

    Yousuf Khan wrote:
    Yeah, DVI doesn't improve your picture one iota. About the only
    advantage I've seen from it is that with an lcd monitor, it allows you
    to scale the non-native resolutions a little better, closer to a CRT
    monitor's scaling.
    MISS CHIEVOUS, Oct 26, 2006
Don't know. It's up to you. He didn't even write what setup (gfx card,
monitor) he has and where he didn't notice any difference. My experience
tells me otherwise.

    Benjamin Gawert, Oct 26, 2006

    Frodo Guest

Output filters, that sounds right.

    Frodo, Oct 26, 2006

    Bob Myers Guest

I realize this is showing up as a reply to the wrong posting, but I
missed the following comments when they were first posted, and want to
correct a couple of misconceptions:

    Actually, "DAC" (digital to analog converter) is correct; a "RAMDAC"
    was simply a DAC chip which included the color look-up tables
    ("color map" memory, as RAM), before BOTH functions were
    integrated into the graphics chips. There's no sense in keeping the
    term "RAMDAC" around at all any more, since the two are completely
    separate functions.

    This is a common misconception, but it IS a misconception. The
    main reason that any "digital" interface provides improved image
    quality with LCDs or other fixed-format displays is that such interfaces
    provide an explicit pixel clock, so that the data can always be properly
    mapped to the physical pixels of the screen. Analog interfaces such
    as the "VGA" connector do not provide such information, and instead
    the sampling clock has to be derived from other timing information
    (typically the horizontal sync signal) provided by the interface.
Creating a sampling clock in this manner, though, can lead to some
errors (and it's why many analog-input LCD monitors include controls
which permit the user to fine-tune the sampling clock frequency and
phase).
    The notion that avoiding a digital-to-analog conversion is responsible
    for whatever quality improvement occurs comes from the common
    (but also mistaken) notion that LCDs are themselves somehow
    "digital." Fundamentally, though, the LCD is an analog-drive
    device, and a digital-to-analog conversion occurs within the LCD
    panel, at the drivers. In fact, LCDs have been made which preserve
    an analog video input all the way through to the pixel level - these were
    sometimes used back when analog monitor interfaces were all there were.
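The sampling-clock argument above is easy to demonstrate with a toy
model (Python, purely illustrative): reconstruct a line of alternating
pixels with a slightly mis-derived sampling clock, and watch the error
accumulate across the line, so that pixels far from the left edge land
on the wrong panel pixel - the classic unstable analog image:

```python
# Model of analog sampling without an explicit pixel clock: the monitor
# reconstructs pixels by sampling at a guessed rate. A small clock error
# accumulates across the line, so later pixels map to the wrong place.

WIDTH = 1920  # one line of a 1920x1200 panel

def sample_row(row, drift_ppt=0):
    """Resample a line with a clock error of drift_ppt parts-per-thousand.
    Integer arithmetic keeps the drift exact for the demonstration."""
    out = []
    for i in range(WIDTH):
        src = (i * (1000 + drift_ppt)) // 1000  # accumulated timing drift
        out.append(row[min(src, WIDTH - 1)])
    return out

row = [i % 2 for i in range(WIDTH)]   # alternating test pattern

exact = sample_row(row, drift_ppt=0)
drifted = sample_row(row, drift_ppt=1)  # only a 0.1% clock error

errors = sum(a != b for a, b in zip(exact, drifted))
print(f"wrong pixels from a 0.1% clock error: {errors} of {WIDTH}")
```

Note that the left part of the line is fine and everything past the
point where the drift reaches one full pixel is wrong; this is why the
"Clock" and "Phase" controls on analog-input LCDs matter so much.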

    Bob M.
    Bob Myers, Oct 27, 2006
  15. * Bob Myers:
It makes sense because the CLUT is still there. Most modern GPUs work
with a fixed color depth (usually 24bit, some also with 32bit). The
CLUT is still needed and used when modes with lower color depth (16bit,
12bit, 8bit) are used.
    No, it isn't.
This is of course correct, but it doesn't change an iota of the fact
that the main reason for the improved image quality of digital
connections via DVI simply is the absence of A/D and D/A conversion,
and the fact that while with analog video signals the image quality is
directly proportional to the signal degradation of the transmission
line, with DVI's digital TMDS signalling the image quality remains
constant until degradation reaches a certain point.
LCDs are pixel-mapped devices with fixed resolution while CRTs can be
pixel-mapped, line-mapped (i.e. TVs) or vector-mapped (i.e. data
displays in most aircraft). It simply doesn't matter that the LCD
pixels are driven analog. The fact that digital signals provide an
image that is "pixel-matching" while the resulting signal after
conversion into an analog signal doesn't is one of the main factors
influencing the image quality the user notices.

    Benjamin Gawert, Oct 27, 2006

    Bob Myers Guest

    Yes, it is. How long shall we keep this up? :)
    Not at all. This is far from the MAIN reason for improved
    image quality, as evidenced by the large number of current
    analog-interfaced LCD monitors (and other fixed-format display
    devices which provide analog inputs) in which the resulting image
    quality is indistinguishable from the same situation but with a
    "digital" input. The impact of instantaneous noise in the analog
    channel on the quality of the resulting image is generally negligible,
    unless it gets REALLY noisy - owing to the fact that the display and
    the eye will average out noise-induced errors over multiple frames.
    The classic problem with analog-connected LCDs, etc., has
    always been instability in the video data with respect to the display
    panel's physical pixels - or, in other words, incorrect and/or unstable
    sampling. The digital interface in this case has the distinct advantage
    of providing unambiguous pixel-level timing information, and so
avoids such problems. (For a fairer comparison, try disconnecting
the clock signal in a digital interface - and then try to regenerate
THAT clock from, say, the horizontal sync signal, which is exactly
what analog inputs on LCDs are doing. Let me know how well
that works out for you... :))
    If by the above, you mean LCDs are fixed-format (i.e., possessing
    a fixed physical array of pixels), that's precisely what I said above.
    However, CRTs have NEVER been "pixel mapped," in that there
    has never been a CRT-based display made (well, with the exception
    of some extremely low-volume, niche-market designs) where the
    "pixels" of the video data were in any way constrained to map to any
    physical structures (the phosphor dot triads, say) of the screen.
    Pixels as such simply don't exist for the CRT in the sense meant in
    this discussion.

    Bob M.
    Bob Myers, Oct 28, 2006
