DVI vs. D-Sub

Discussion in 'The HTPC, HDTV & Ultra High Definition section' started by addict131, Feb 28, 2006.

  1. addict131

    addict131 New Member

    Messages:
    6
    Likes Received:
    0
    GPU:
    XFX 6800 GS xXx 256MB
     hi guys, i was just wondering... would switching to a DVI-D cable increase the image quality on my LCD screen? i'm currently using D-Sub... both my vid card and my LCD have both plugs, but my monitor only came with a D-Sub cable...

     so i'm here wondering if i should go out and buy a DVI-D cable and expect a noticeable quality improvement.

    im currently using a:
    Samsung 740B (600:1, 8ms, 300cd/m2)
    XFX 6800 GS xXx
     
  2. RustDust

    RustDust Active Member

    Messages:
    69
    Likes Received:
    0
    GPU:
    Sapphire 5770
     I have tried both and I can't tell a difference between the two on my LCD monitor. I also tried a DLP projector and still couldn't tell any difference. Do a search on the subject and you will find the same results. Save your money; I didn't.
     
  3. addict131

    addict131 New Member

    Messages:
    6
    Likes Received:
    0
    GPU:
    XFX 6800 GS xXx 256MB
     thanks, 'cause i asked around in some other forums and heard the same thing... DVI-D cables are pretty expensive! 20 bucks for a frikken wire!!!
     
  4. chazly413

    chazly413 Maha Guru

    Messages:
    1,136
    Likes Received:
    2
    GPU:
    MSI GeForce 970 4GB
     On my monitor it made the text a lot sharper and I get fewer glitches, though that may be down to a somewhat faulty VGA cable, so I dunno. It supposedly also gives you less latency: with VGA, the signal has to be converted from digital to analog by the card and back to digital in the LCD, cuz LCDs are natively digital. With the DVI cable, the signal stays digital the whole time, so you get less latency apparently.
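     (A purely illustrative Python sketch of that point, not how any driver or monitor actually works: push pixel values through an analog round trip with some made-up noise, versus a digital path that hands them over untouched.)

```python
# Illustrative only: why an analog (VGA) round trip can disturb pixel values
# while a digital (DVI) path reproduces them exactly. The noise level is made up.
import random

def vga_round_trip(pixels, noise=1.5):
    """Digital -> DAC -> analog cable (noise) -> monitor ADC -> digital again."""
    out = []
    for p in pixels:
        analog = p + random.gauss(0, noise)          # DAC output plus analog noise
        out.append(max(0, min(255, round(analog))))  # monitor re-quantizes the level
    return out

def dvi_path(pixels):
    """Digital end to end: the receiver gets the exact same values."""
    return list(pixels)

source = [0, 1, 2, 128, 254, 255]
print("source:", source)
print("vga   :", vga_round_trip(source))
print("dvi   :", dvi_path(source))
```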
     

  5. sbn

    sbn New Member

    Messages:
    2
    Likes Received:
    0
    On my Dell 2005FPW with X800XL, there's a HUGE difference - the text is sharper, the image is brighter and the contrast is better with DVI...

     On my dad's PC with a ViewSonic (VX930b i think) the difference was even bigger: with VGA, the 3 darkest shades in the greyscale at the bottom of the following page looked identical, while with DVI it was easy to see the difference:
    http://www.dpreview.com/reviews/sonydscr1/
     
  6. cDreem

    cDreem New Member

    Messages:
    1
    Likes Received:
    0
    GPU:
    XFX GeForce 7900GS XT Ed.
    It all depends on the resolution you’re running at.

     Up to roughly 1440x900 or so there's no noticeable difference between VGA D-Sub and DVI-D. The only difference would be latency: with VGA you'd see about what your monitor is rated at, and DVI-D would be a bit faster (so, less ghosting in-game).
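     (For a rough sense of the numbers behind that resolution threshold, a back-of-the-envelope Python sketch; the 25% blanking overhead is only an assumed ballpark, real timings differ.)

```python
# Rough pixel-clock estimate for a 60 Hz mode. Real CVT/GTF timings differ;
# the 1.25 blanking factor is only a ballpark assumption.
def approx_pixel_clock_mhz(width, height, refresh_hz=60, blanking=1.25):
    return width * height * refresh_hz * blanking / 1e6

for w, h in [(1280, 1024), (1440, 900), (1680, 1050), (1920, 1200)]:
    print(f"{w}x{h}@60 Hz: ~{approx_pixel_clock_mhz(w, h):.0f} MHz")

# Single-link DVI tops out at a 165 MHz pixel clock (1920x1200@60 only fits
# with reduced-blanking timings), and on VGA these same frequencies are what
# the analog chain has to carry cleanly.
```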

     However, above this resolution, DVI-D is king. A friend of mine had a 19'' Sceptre that he used VGA with and it looked stunning. I recently hooked up my BenQ FP202W 20'' widescreen (1000:1, 300 cd/m2, 5ms), which is a 1050p panel (1680x1050), to an XFX 7900GS XT, and it looked like sin over the VGA D-Sub cable. I went to Best Buy and bought a pair of gold-plated DVI-D cables (the plating increases conductivity, just like the stuff on most high-end mice) for about 25 dollars after tax, and I noticed an immediate difference.
     On VGA D-Sub, the text was distorted and looked as if it had been blown up to a higher resolution by stretching it out in MS Paint. The display overall also looked a bit grainy if you looked very closely. Since I'm obsessive about things being perfect, I couldn't let it slide. I also had some minor ghosting in Counter-Strike: Source (the only thing I tried with VGA D-Sub; I literally left my house 15 minutes after seeing the monitor on D-Sub to get DVI).
     As soon as I turned on the monitor with DVI, I was beside myself at the improvement. Everything looked PERFECT. No more grainy look, all text is sharp, and no ghosting in ANY game (and I'm a twitch FPS gamer, so I move the mouse very quickly).

     All in all, DVI does improve your monitor's image quality. If you're at a lower desktop resolution you won't notice much difference there, but if you are getting ghosting in games, you will definitely see a reduction if you switch to DVI.

    NOTE: If your monitor does not support DVI, using a DVI cable and a DVI to D-Sub adapter on your monitor will NOT do anything for you. Only splurge on DVI if your monitor supports it.
     
    Last edited: Aug 1, 2007
  7. ipods_are_gay

    ipods_are_gay Active Member

    Messages:
    73
    Likes Received:
    0
    GPU:
    BFG 8800gts 640mb OC2
    my 19" looked ALOT sharper on DVI than D-sub

    and the whole *does an expensive cable make picture better* is a whole different can of worms we shall leave for the AVS guys to deal with :smoke:
     
  8. Joey

    Joey Guest

    Messages:
    4,144
    Likes Received:
    0
    GPU:
    2600XT + Panasonic S10
     There is no such thing as "1050p", and gold plating on DVI cables adds nothing... much the same as gold plating on any wiring. On a purely digital connection it's doubly pointless.

     Save your money and get the cheapest DVI cable you can find... digital cabling either works or it doesn't. The signal can't be improved or degraded by anything.

     The reason to use DVI over VGA is perfect pixel mapping... that makes text look pin-sharp, as has been said, and it generally makes everything look solid.
     There is a little picture here for checking... if it shows no signs of flickering then you're set with VGA; if it does, go for DVI.

     [image: pixel-mapping test pattern]
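     (The original image is gone, but a pattern like it is easy to generate yourself; a minimal sketch using the Pillow imaging library, assuming it is installed:)

```python
# Generate a 1-pixel alternating column pattern, the classic VGA phase/clock
# test. If the monitor's analog sampling is off, this shimmers or shows
# vertical banding; over DVI it should look perfectly uniform.
from PIL import Image

def make_test_pattern(width=1280, height=1024, path="vga_test.png"):
    img = Image.new("L", (width, height))
    pixels = img.load()
    for y in range(height):
        for x in range(width):
            pixels[x, y] = 255 if x % 2 == 0 else 0  # alternate black/white columns
    img.save(path)

make_test_pattern()  # view at 100% zoom, at the monitor's native resolution
```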
     
  9. greengiant

    greengiant Master Guru

    Messages:
    346
    Likes Received:
    0
    GPU:
    ATI 5770
     the thing about vga is that it has to be calibrated.

     most people just use the auto-adjust feature and call it done (some don't even do that), but a lot of the time it doesn't work perfectly, so the phase and clock have to be adjusted manually.

     it can be done by looking at patterns, like the one joeydoo posted, and adjusting until all the pixels are lined up.

     On some monitors it is impossible because they use cheap analog circuitry. Since VGA is analog, cable quality is also an issue, and the cables that come with monitors are usually pretty poor.
     
  10. NvidiaFreak

    NvidiaFreak Ancient Guru

    Messages:
    4,769
    Likes Received:
    0
    GPU:
    ATI Radeon HD 7660D
     in the pc world there's what's called XHD, aka 16:10; XHD for pc means 1050p, 1200p and 1600p, while consoles use 720p and 1080p. on a 16:10 1050 panel you see two black bars, top and bottom, when watching a 720p movie, while on 16:9 you see none. the best 16:10 is 1920x1200, which is arguably better than 1080p, though i've never tested a 1920x1200 lcd myself.
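     (A quick sketch of the arithmetic behind those black bars, assuming the player scales 16:9 content to the full panel width:)

```python
# Black-bar height when 16:9 content is scaled to fill a 16:10 panel's width.
def letterbox_bars(panel_w, panel_h, content_aspect=16 / 9):
    content_h = round(panel_w / content_aspect)   # height the 16:9 image occupies
    bar = max(0, (panel_h - content_h) // 2)      # bar at the top and at the bottom
    return content_h, bar

for w, h in [(1680, 1050), (1920, 1200), (1920, 1080)]:
    ch, bar = letterbox_bars(w, h)
    print(f"{w}x{h}: image {w}x{ch}, {bar}px bars top/bottom")
```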
     
    Last edited: Aug 6, 2007

  11. Psytek

    Psytek Ancient Guru

    Messages:
    3,367
    Likes Received:
    0
    GPU:
    2x 260 GTX 216 SLI
    Will people please stop talking about 1050p and 1600p. Monitors are not TVs.
     
  12. eRa`

    eRa` Ancient Guru

    Messages:
    1,823
    Likes Received:
    1
    GPU:
    Palit GeForce GTX 570
     I can see some fairly bad flickering in joeydoo's picture, so I'm gonna get a DVI cable soon for my new 22" monitor.
     My graphics card has two DVI-I outputs and my monitor has one DVI-D input; are they compatible, and what cable do I need?

     Am I right that I need a DVI-D cable? Or won't it fit my 8800?

    Thanks in advance.

    EDIT: Never mind, just googled a bit and the DVI-D will work.
     
    Last edited: Aug 6, 2007
  13. GXGamer

    GXGamer Banned

    Messages:
    47
    Likes Received:
    0
    GPU:
    2 7800GTX's SLI
    I was wondering this too.

     I have a Samsung with D-Sub VGA and it goes up to 1366x768 and looks good, but when I switched to RGB or DVI-to-HDMI it flickers around fonts and I get overscan issues.

     I would like to settle on RGB, cuz VGA isn't HD. Is there a fix for overscan when using Catalyst drivers and an ATI X800 PE with a Samsung 40" LCD TV that supports up to 1080i?
     
  14. Year

    Year Ancient Guru

    Messages:
    11,592
    Likes Received:
    2
    GPU:
    EVGA GTX 690
    the walking dead... resurrecting the thread lol

    on my samsung LED monitor, the D-Sub is superior to DVI.

     first of all colors are purer, banding is pretty much gone, and shades are much more precise; i can now see proper shades of grey etc.

     also text doesn't murder the eyes anymore

    and finally the little bleeding that was on the right side is gone.

    i came to 2 conclusions

    1- the DVI-D cable i'm using is crap

    2- D-Sub handles colors better.

     performance-wise i couldn't notice any difference: no additional latency or lag, and fps is exactly the same.

    in the end i decided to use the D-sub cable.

    but i will investigate further, i suspect the quality of the DVI cable actually makes a difference, i got mine from Radioshack, cheapy. :D

    will buy a quality one and see if it makes a difference, but so far D-Sub wins.
     
  15. rflair

    rflair Don Coleus Staff Member

    Messages:
    4,854
    Likes Received:
    1,725
    GPU:
    5700XT
     DVI is a digital signal, so cable quality is a moot point unless your cable somehow has a broken wire.

    D-Sub looking better is strange, but it could be the analog conversion is creating a smoother color grade whereas the DVI signal is more precise to the content (digital to digital).
     

  16. dcx_badass

    dcx_badass Guest

    Messages:
    9,965
    Likes Received:
    1
    GPU:
    Palit GTX 1060 6GB
     I second the above: first, you can't really get a "crap" DVI cable, it will basically either work or not (and when it doesn't, the errors are obvious).

     DVI is way better than D-Sub: sharper, crisper, better colours. The only way I think D-Sub could seem better is, as mentioned, that it's softer, so it doesn't make a lack of quality as obvious.
     
  17. Anarion

    Anarion Ancient Guru

    Messages:
    13,599
    Likes Received:
    386
    GPU:
    GeForce RTX 3060 Ti
     I have one TN display which does not seem to work correctly when I use DVI. Dithering is the problem: it is static instead of moving, which looks rather horrible. I dunno if it is some kind of NVIDIA-only issue on that particular display, but it works correctly when a PS3 is hooked up to it. It also looks correct when using an analogue cable, though not as sharp and crisp.
     
  18. HeavyHemi

    HeavyHemi Guest

    Messages:
    6,952
    Likes Received:
    960
    GPU:
    GTX1080Ti
     P stands for progressive scanning; I stands for interlaced, another type of scanning. Monitors, and LCD TVs to a large extent, support both.

     240p
     288p
     480p
     576p
     720p
     1080p
     1440p
     for example...
     
  19. HeavyHemi

    HeavyHemi Guest

    Messages:
    6,952
    Likes Received:
    960
    GPU:
    GTX1080Ti
     True to an extent. However, at higher resolutions (higher frequencies), a better-quality cable will allow you more distance before having trouble. With a cheapo HDMI cable @ 1920x1080 you can start having issues beyond 15'.
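     (A rough Python sketch of why higher resolutions mean higher frequencies on the wire; the blanking overhead is only a ballpark assumption.)

```python
# Rough per-lane TMDS bit rate for a 60 Hz mode. DVI/HDMI encode 8 data bits
# into 10-bit symbols, so the line rate is roughly 10x the pixel clock.
# The 1.25 blanking factor is only a ballpark assumption.
def tmds_gbps(width, height, refresh_hz=60, blanking=1.25):
    pixel_clock_hz = width * height * refresh_hz * blanking
    return pixel_clock_hz * 10 / 1e9  # bits per second on each TMDS pair

for w, h in [(1280, 720), (1920, 1080), (2560, 1440)]:
    print(f"{w}x{h}@60 Hz: ~{tmds_gbps(w, h):.2f} Gbps per lane")

# Higher bit rates attenuate faster in a cable, which is why a marginal cable
# that is fine at 720p can start dropping out at 1080p over longer runs.
```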
     
  20. Year

    Year Ancient Guru

    Messages:
    11,592
    Likes Received:
    2
    GPU:
    EVGA GTX 690
     well i discovered what the problem was; believe it or not, it was the cable!

    i got an "Eforcity" DVI-D (dual link) cable.

    http://www.amazon.com/Eforcity-Blac...7?s=electronics&ie=UTF8&qid=1335488275&sr=1-7

     now everything looks great, and yes, better than vga indeed. the first thing i noticed was the quality of the desktop; you were right, vga was blurring the text and colors slightly, hence the placebo "improvements" i reported earlier.

     now that i think about it, i used to hear a faint whooshing/static sound from my speakers whenever i moved my mouse while the radiocrap cable was connected. :bugeye:

     so either the radioshack dvi cable was faulty (even though it was brand new) or it was made of some very crappy unshielded material or something; the radioshack cable wasn't even gold plated. the effect of this cable on my monitor was similar to having an unshielded speaker sitting next to it, with part of the screen suffering from what looked like gamma shifting.

    thankfully it was only the cable and not the actual dvi connector on the monitor or the video card acting up. ;)
     
    Last edited: Apr 27, 2012
