Guru3D.com Forums

DVI vs. D-Sub
  (#1)
addict131
Newbie
 
Videocard: XFX 6800 GS xXx 256MB
Processor: Pentium 4 630 @ 3.0 GHz
Mainboard: Asus P5LD2-VM
Memory: KingMAX DDR2-533 1028MB x 2
Soundcard: Onboard
PSU: Generic 450 Watts
Default DVI vs. D-Sub - 02-28-2006, 02:43 | posts: 6 | Location: Toronto, Canada

Hi guys, I was just wondering: would switching to a DVI-D cable improve the image quality on my LCD screen? I'm currently using D-Sub. Both my video card and my LCD have both connectors, but my monitor only came with a D-Sub cable...

So I'm here wondering if I should go out and buy a DVI-D cable and expect a noticeable quality improvement.

I'm currently using a:
Samsung 740B (600:1, 8ms, 300cd/m2)
XFX 6800 GS xXx
   
  (#2)
RustDust
Member Guru
 
RustDust's Avatar
 
Videocard: Sapphire 5770
Processor: X3 720 BE Unlocked to X4
Mainboard: Gigabyte MA790GPT-UD3H
Memory: G.Skill RipJaws 1600
Soundcard:
PSU: Sparkle ATX-400PN
Default 02-28-2006, 18:28 | posts: 69 | Location: Asheville, NC USofA

I have tried both, and I cannot tell a difference between the two connected to my LCD monitor. I also have a DLP projector that I have tried, and still couldn't tell any difference. Do a search on the subject and you will find the same results. Save your money; I didn't.
   
  (#3)
addict131
Newbie
 
Videocard: XFX 6800 GS xXx 256MB
Processor: Pentium 4 630 @ 3.0 GHz
Mainboard: Asus P5LD2-VM
Memory: KingMAX DDR2-533 1028MB x 2
Soundcard: Onboard
PSU: Generic 450 Watts
Default 02-28-2006, 23:41 | posts: 6 | Location: Toronto, Canada

Thanks. I asked around on some other forums and found that out too... DVI-D cables are pretty expensive! 20 bucks for a frikken wire!!!
   
  (#4)
chazly413
Maha Guru
 
chazly413's Avatar
 
Videocard: XFX 6870 2GB
Processor: Athlon II X4 630
Mainboard: ASUS M4N98TD EVO
Memory: 2x4GB DDR31333
Soundcard: Realtek
PSU: Cooler Master 600W
Default 03-01-2006, 00:42 | posts: 1,136 | Location: Maryland, USA

On my monitor it made the text a lot sharper and I get fewer glitches, but that may be down to a somewhat faulty VGA cable, so I don't know. It's also supposed to give you less latency: with VGA, the signal has to be converted from digital to analog and then back to digital again at the LCD, because LCDs are natively digital. With the DVI cable, the signal stays digital the whole time, so you apparently get less latency.
   
  (#5)
sbn
Newbie
 
Videocard:
Processor:
Mainboard:
Memory:
Soundcard:
PSU:
Default 03-01-2006, 01:51 | posts: 2

On my Dell 2005FPW with an X800XL, there's a HUGE difference: the text is sharper, the image is brighter and the contrast is better with DVI...

On my dad's PC, with a ViewSonic (VX930b, I think), the difference was even bigger. With VGA, the three darkest shades in the greyscale at the bottom of the following page looked identical; with DVI it was easy to see the difference:
http://www.dpreview.com/reviews/sonydscr1/
   
  (#6)
cDreem
Newbie
 
Videocard: XFX GeForce 7900GS XT Ed.
Processor: Pentium Core 2 Duo E6600
Mainboard: Gigabyte P35-DS3R
Memory: 2GB (2x1GB) Crucial DDR2 800
Soundcard: Creative SoundBlaster Audigy SE
PSU: Corsair 520W CMPSU-520HX
Default 08-01-2007, 16:29 | posts: 1

It all depends on the resolution you're running at.

Up to about 720p-ish (that's 1440x900) there's no noticeable difference between VGA D-Sub and DVI-D. The only difference would be in the latency, where VGA would have a latency equal to that of your monitor rating, and DVI-D would be a bit faster (so, less ghosting in-game).

However, above this resolution, DVI-D is king. A friend of mine had a 19'' Sceptre that he used VGA with and it looked stunning. I recently hooked up my BenQ FP202W 20'' widescreen (1000:1, 300 cd/m2, 5ms), which is 1050p (res. 1680x1050), to an XFX 7900GS XT, and it looked like sin with the VGA D-Sub cable. I went to Best Buy and bought a pair of gold-plated DVI-D cables (the plating increases conductivity, just like the stuff on most high-end mice) for about 25 dollars after tax, and I noticed an immediate difference.
On VGA D-Sub, the text was distorted and looked as if it had been blown up to a higher resolution by stretching it out in MS Paint. The display overall also looked a bit grainy if you looked very closely. Since I'm obsessive about things being perfect, I couldn't let it slide. I also had minimal ghosting in Counter-Strike: Source (the only thing I tried with VGA D-Sub; I literally left my house 15 minutes after seeing the monitor on D-Sub to get DVI).
As soon as I turned on the monitor with DVI, I was beside myself at the improvement. Everything looked PERFECT. No more grainy look, all text is sharp, and no ghosting in ANY game (and I'm a twitch FPS gamer, so I move the mouse very quickly).

All-in-all, DVI does increase your monitor's output. If you're on a lower desktop resolution, you won't notice a difference there, but if you are getting ghosting in game, you will definitely see a reduction if you switch to DVI.

NOTE: If your monitor does not support DVI, using a DVI cable and a DVI to D-Sub adapter on your monitor will NOT do anything for you. Only splurge on DVI if your monitor supports it.

Last edited by cDreem; 08-01-2007 at 16:42.
   
  (#7)
ipods_are_gay
Member Guru
 
Videocard: BFG 8800gts 640mb OC2
Processor: Core2Duo e6600 with Tuniq Tower
Mainboard: ASUS P5N-E-SLI
Memory: 2gb Crucial Ballistix 5300
Soundcard: Audigy 2 ZS
PSU: OCZ gameXstream 600w
Default 08-06-2007, 00:08 | posts: 73

My 19" looked a LOT sharper on DVI than D-Sub.

And the whole "does an expensive cable make the picture better" question is a different can of worms we shall leave for the AVS guys to deal with.
   
  (#8)
Joey
Ancient Guru
 
Joey's Avatar
 
Videocard: 2600XT + Panasonic S10
Processor: Intel C2D 2.4Ghz Q6600
Mainboard: Foxconn G33M µATX
Memory: 4GB (2x2GB) 800Mhz Corsai
Soundcard: X-fi ExMusic + Logitech Z
PSU: Dell 350Watt.
Default 08-06-2007, 00:41 | posts: 4,156 | Location: Cambridge, England

Quote:
Originally Posted by cDreem View Post
However, above this resolution, DVI-D is king. A friend of mine had a 19'' Sceptre that he used VGA with and it looked stunning. I recently hooked up my BenQ FP202W 20'' widescreen (1000:1, 300 cd/m2, 5ms), which is 1050p (res. 1680x1050), to an XFX 7900GS XT, and it looked like sin with the VGA D-Sub cable. I went to Best Buy and bought a pair of gold-plated DVI-D cables (the plating increases conductivity, just like the stuff on most high-end mice) for about 25 dollars after tax, and I noticed an immediate difference.
On VGA D-Sub, the text was distorted and looked as if it had been blown up to a higher resolution by stretching it out in MS Paint. The display overall also looked a bit grainy if you looked very closely. Since I'm obsessive about things being perfect, I couldn't let it slide. I also had minimal ghosting in Counter-Strike: Source (the only thing I tried with VGA D-Sub; I literally left my house 15 minutes after seeing the monitor on D-Sub to get DVI).
As soon as I turned on the monitor with DVI, I was beside myself at the improvement. Everything looked PERFECT. No more grainy look, all text is sharp, and no ghosting in ANY game (and I'm a twitch FPS gamer, so I move the mouse very quickly).
There is no such thing as "1050p", and gold-plated DVI cables add nothing... much the same as gold plating on any wiring. On a pure digital connection it's doubly pointless.

Save your money and get the cheapest DVI cable you can find... digital cabling either works or it doesn't. The signal can't be improved or degraded by anything.

The reason to use DVI over VGA is perfect pixel mapping... that makes text look pin sharp, as has been said, and it generally makes everything look solid.
There is a little picture here for checking... If it shows no signs of flickering then you are set with VGA; if not, go for the DVI.
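[Editor's note: the test image attached to this post is lost. As a stand-in (an assumption, not the original image), the sketch below generates the classic pattern used for this check: a 1-pixel black/white checkerboard, written as a binary PGM in pure Python. Displayed at native resolution over VGA, phase/clock errors show up as shimmer or vertical banding; over DVI it should be rock solid.]

```python
# Generate a 1-pixel black/white checkerboard test pattern as a binary PGM.
# View it full-screen at the monitor's native resolution: any flicker or
# banding over VGA indicates phase/clock miscalibration.

WIDTH, HEIGHT = 640, 480  # adjust to your native resolution

def checkerboard_pgm(width=WIDTH, height=HEIGHT):
    # P5 = binary greyscale PGM; 255 = maximum pixel value
    header = f"P5 {width} {height} 255\n".encode("ascii")
    rows = bytearray()
    for y in range(height):
        for x in range(width):
            # alternate full white / full black every single pixel
            rows.append(255 if (x + y) % 2 == 0 else 0)
    return header + bytes(rows)

if __name__ == "__main__":
    with open("checker.pgm", "wb") as f:
        f.write(checkerboard_pgm())
```

Most image viewers (and tools like IrfanView or GIMP) open PGM files directly.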

   
  (#9)
greengiant
Master Guru
 
Videocard: ATI 5770
Processor: Core 2 Quad Q8200
Mainboard: Asus
Memory: 6GB DDR2
Soundcard:
PSU: Corsair 750W
Default 08-06-2007, 03:41 | posts: 346

The thing about VGA is that it has to be calibrated.

Most people just use the auto-adjust feature and are done with it (some don't even do that), but a lot of the time it doesn't work perfectly, so the phase and clock have to be adjusted manually.

It can be done by looking at patterns, like the ones joeydoo posted, and adjusting until all the pixels are lined up.

On some monitors it is impossible because they use cheap DACs. Since VGA is analog, the cable quality is also an issue, and the cables that come with the monitor are usually pretty poor.
   
  (#10)
NvidiaFreak
Ancient Guru
 
NvidiaFreak's Avatar
 
Videocard: ATI Radeon HD 7660D
Processor: A10-5800K 4.2ghz Turbo
Mainboard: MSI FM2-A75MA-E35
Memory: 8GB DDR3 1600Mhz XMP
Soundcard: Z-5500 5.1
PSU: 380W
Default 08-06-2007, 06:32 | posts: 4,777 | Location: 〓ω〓

In the PC world there are what are called XHD (aka 16:10) resolutions, and the XHD ones for PC are 1050p, 1200p and 1600p, while consoles have 720p and 1080p. On a 1050p screen you can see two black bars when watching a 720p movie, while on 16:9 you see none. So the best 16:10 is 1920x1200, which may be better than 1080p in one way, though I've never tested a 1920x1200 LCD.

Last edited by NvidiaFreak; 08-06-2007 at 06:39.
   
  (#11)
Psytek
Ancient Guru
 
Psytek's Avatar
 
Videocard: 2x 260 GTX 216 SLI
Processor: Core 2 E5200 @3.36GHz
Mainboard: Asus P5N-T deluxe
Memory: PC2-6400 4GB
Soundcard: Sennheiser PC-166
PSU: Corsair HX620W
Default 08-06-2007, 09:12 | posts: 3,370 | Location: UK

Quote:
Originally Posted by NvidiaFreak View Post
1050p, 1200p and 1600p, when console has 720p and 1080p. pretty much 1050p
Will people please stop talking about 1050p and 1600p. Monitors are not TVs.
   
  (#12)
eRa`
Maha Guru
 
eRa`'s Avatar
 
Videocard: Palit GeForce GTX 570
Processor: Core i7 920
Mainboard: Asus P6T Deluxe V2
Memory: 6GB G.Skill DDR3-1333
Soundcard: Realtek HD Audio
PSU: be quiet! 850W
Default 08-06-2007, 20:37 | posts: 1,825 | Location: Germany

I can see some pretty bad flickering in the picture joeydoo posted, so I'm going to get a DVI cable soon for my new 22" monitor.
My graphics card has two DVI-I outputs and my monitor one DVI-D input. Are they compatible, and what cable do I need?

Am I right that I need a DVI-D cable? Or won't it fit my 8800?

Thanks in advance.

EDIT: Never mind, just googled a bit; the DVI-D will work.

Last edited by eRa`; 08-06-2007 at 21:12.
   
  (#13)
GXGamer
Banned
 
Videocard: 2 7800GTX's SLI
Processor: Intel Pentium D 940
Mainboard: ASUS P5N32 Deluxe
Memory: 2 Gigs Of Dominator 800
Soundcard: XFI Fatility Gamer
PSU: Silverstone Zeus 750
Default 08-08-2007, 19:37 | posts: 47 | Location: Home

I was wondering this too.

I have a Samsung with D-Sub VGA and it goes to 1366x768 and looks good, but when I went to RGB or DVI-to-HDMI it flickers around fonts and I get overscan issues.

I would like to settle on RGB, because VGA isn't HD. Is there a fix for overscan when using Catalyst drivers and an ATI X800 PE with a Samsung 40" LCD TV that supports up to 1080i?
   
  (#14)
Year
Ancient Guru
 
Year's Avatar
 
Videocard: EVGA GTX 690
Processor: Intel® i7 2600
Mainboard: Asus P67 Evo
Memory: G.Skill Sniper DDR3 16GB
Soundcard: Auzentech Bravura 7.1
PSU: Enermax Galaxy 850W
Default 04-25-2012, 01:10 | posts: 11,696 | Location: ♫

the walking dead... resurrecting the thread lol

on my samsung LED monitor, the D-Sub is superior to DVI.

first of all colors are purer, banding is pretty much gone and colors are much more precise, i can now see proper shades of grey etc.

also the text doesn't murder the eyes anymore

and finally the little bleeding that was on the right side is gone.

i came to 2 conclusions

1- the DVI-D cable i'm using is crap

2- D-Sub handles colors better.

performance wise i couldn't notice any difference, no additional latency or lag and fps is exactly the same.

in the end i decided to use the D-sub cable.

but i will investigate further, i suspect the quality of the DVI cable actually makes a difference, i got mine from Radioshack, cheapy.

will buy a quality one and see if it makes a difference, but so far D-Sub wins.
   
  (#15)
rflair
Don Commisso
 
rflair's Avatar
 
Videocard: GTX 980/290X
Processor: 3770K@4.4/Q9550@3.83
Mainboard: Asus P877V-Pro/Asus P5K-E
Memory: 16G 2GHz/8G 900MHz
Soundcard: Onboard/Fiio E10
PSU: Corsair HX850/PPC 860
Default 04-25-2012, 01:55 | posts: 2,979 | Location: Canada

Quote:
Originally Posted by Year View Post
the walking dead... resurrecting the thread lol

on my samsung LED monitor, the D-Sub is superior to DVI.

first of all colors are purer, banding is pretty much gone and colors are much more precise, i can now see proper shades of grey etc.

also the text doesn't murder the eyes anymore

and finally the little bleeding that was on the right side is gone.

i came to 2 conclusions

1- the DVI-D cable i'm using is crap

2- D-Sub handles colors better.

performance wise i couldn't notice any difference, no additional latency or lag and fps is exactly the same.

in the end i decided to use the D-sub cable.

but i will investigate further, i suspect the quality of the DVI cable actually makes a difference, i got mine from Radioshack, cheapy.

will buy a quality one and see if it makes a difference, but so far D-Sub wins.
DVI is a digital signal, cable quality is a moot point, unless your cable somehow has a broken wire.

D-Sub looking better is strange, but it could be the analog conversion is creating a smoother color grade whereas the DVI signal is more precise to the content (digital to digital).
   
  (#16)
dcx_badass
Ancient Guru
 
dcx_badass's Avatar
 
Videocard: Gigabyte GTX570 1280mb
Processor: I5 4670k
Mainboard: Gigabyte GA-Z87X-D3H
Memory: 8GB DDR3 1333mhz
Soundcard: Realtek ALC892
PSU: 600w OCZ StealthXstream
Default 04-25-2012, 06:16 | posts: 9,869 | Location: Yorkshire [UK]

I second the above. First, you can't really get a crap DVI cable; it will basically either work or not (with noticeable errors).

DVI is way better than D-Sub: sharper, crisper, better colours. The only way I can think of that D-Sub could look better is, as mentioned, that it's softer, so it doesn't make a lack of quality as obvious.
   
  (#17)
Anarion
Ancient Guru
 
Anarion's Avatar
 
Videocard: Gigabyte GeForce GTX 970
Processor: Intel Core i7 3770K
Mainboard: ASUS P8Z77-V
Memory: G.SKILL RipjawsX 16 GB
Soundcard: Sound Blaster Zx + HD 595
PSU: Corsair AX760
Default 04-25-2012, 13:41 | posts: 10,984 | Location: Finland

I have one TN display which does not seem to work correctly when I use DVI. Dithering is the problem: it is static instead of moving, which looks rather horrible. I don't know if it is some kind of NVIDIA-only issue on that particular display, but it works correctly when a PS3 is hooked to it. It also looks correct when using an analogue cable, though not as sharp and crisp.
   
  (#18)
HeavyHemi
Ancient Guru
 
HeavyHemi's Avatar
 
Videocard: SLI TITAN SC @ 1097/3105
Processor: i7 980x 4.3 Ghz 1.35 v
Mainboard: EVGA X58 E758
Memory: 12Gb Corsair Dom 2000
Soundcard: Asus Xonar Phoebus
PSU: CORSAIR AX1200
Default 04-26-2012, 08:55 | posts: 3,557 | Location: Wooing whilst wearing only socks.

Quote:
Originally Posted by Psytek View Post
Will people please stop talking about 1050p and 1600p. Monitors are not TVs.
The "p" stands for progressive scan; "i" stands for interlaced, another type of scanning. Monitors, and LCD TVs to a large extent, support both.

240p
288p
480p
576p
720p
1080p
1440p
for example...
   
  (#19)
HeavyHemi
Ancient Guru
 
HeavyHemi's Avatar
 
Videocard: SLI TITAN SC @ 1097/3105
Processor: i7 980x 4.3 Ghz 1.35 v
Mainboard: EVGA X58 E758
Memory: 12Gb Corsair Dom 2000
Soundcard: Asus Xonar Phoebus
PSU: CORSAIR AX1200
Default 04-26-2012, 08:59 | posts: 3,557 | Location: Wooing whilst wearing only socks.

Quote:
Originally Posted by rflair View Post
DVI is a digital signal, cable quality is a moot point, unless your cable somehow has a broken wire.

D-Sub looking better is strange, but it could be the analog conversion is creating a smoother color grade whereas the DVI signal is more precise to the content (digital to digital).
True to an extent. However, at higher resolutions (higher frequencies), a better-quality cable will allow you more distance before having trouble. With a cheapo HDMI cable at 1920x1080 you can start having issues beyond 15 feet.
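[Editor's note: to put rough numbers on "higher resolution = higher frequency", the sketch below computes the pixel clock as total pixels per frame (active plus blanking) times refresh rate. The timing totals are standard published VESA/CEA/CVT-RB figures, and single-link DVI is specified up to a 165 MHz pixel clock; treat this as illustrative arithmetic, not a cable-quality guide.]

```python
# Pixel clock = horizontal total x vertical total x refresh rate.
# Totals include blanking, so they exceed the visible resolution.

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    return h_total * v_total * refresh_hz / 1e6

# mode name: (h_total, v_total, refresh) from published timing standards
modes = {
    "1280x1024@60 (VESA DMT)": (1688, 1066, 60),  # ~108.0 MHz
    "1920x1080@60 (CEA-861)":  (2200, 1125, 60),  # 148.5 MHz
    "2560x1600@60 (CVT-RB)":   (2720, 1646, 60),  # ~268.6 MHz
}

SINGLE_LINK_LIMIT_MHZ = 165.0  # single-link DVI specification limit

for name, (ht, vt, hz) in modes.items():
    clk = pixel_clock_mhz(ht, vt, hz)
    verdict = "single link OK" if clk <= SINGLE_LINK_LIMIT_MHZ else "dual link required"
    print(f"{name}: {clk:.1f} MHz ({verdict})")
```

The higher the clock, the less slack a marginal cable has, which is why a cheap cable that is fine at 1280x1024 can fail at 1920x1080 over a long run.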
   
  (#20)
Year
Ancient Guru
 
Year's Avatar
 
Videocard: EVGA GTX 690
Processor: Intel® i7 2600
Mainboard: Asus P67 Evo
Memory: G.Skill Sniper DDR3 16GB
Soundcard: Auzentech Bravura 7.1
PSU: Enermax Galaxy 850W
Default 04-27-2012, 02:13 | posts: 11,696 | Location: ♫

Quote:
Originally Posted by rflair View Post
DVI is a digital signal, cable quality is a moot point, unless your cable somehow has a broken wire.

D-Sub looking better is strange, but it could be the analog conversion is creating a smoother color grade whereas the DVI signal is more precise to the content (digital to digital).
Well, I discovered what the problem was. Believe it or not, it was the cable!

I got an "Eforcity" DVI-D (dual link) cable.

http://www.amazon.com/Eforcity-Black...5488275&sr=1-7

Now everything looks great, and yes, better than VGA indeed. The first thing I noticed was the quality of the desktop. You were right: VGA was blurring the text and colors slightly, hence the placebo "improvements".

Now that I think about it, I used to hear a faint whooshing/static sound on my speakers whenever I moved my mouse while the radiocrap cable was connected.

So either the RadioShack DVI cable was faulty (even though it was brand new), or it was made of some very crappy unshielded material or something; the RadioShack cable wasn't even gold plated. The effect of this cable on my monitor was similar to having an unshielded speaker close to a monitor, with part of the screen suffering from what appeared to be gamma shifting.

Thankfully it was only the cable, and not the actual DVI connector on the monitor or the video card acting up.

Last edited by Year; 04-27-2012 at 02:22.
   
  (#21)
viren
Maha Guru
 
viren's Avatar
 
Videocard: GTX 570
Processor: I7 870
Mainboard: P55
Memory: 4GB
Soundcard: 7.1 PCI-E
PSU: 800W
Default 04-27-2012, 02:54 | posts: 2,357 | Location: Pune, India

I have noticed some minor flickering on the screen with the D-Sub cable, but that's because, as some said, I did not do the right calibration. I wouldn't waste time on calibration, though; just get a DVI cable and forget the rest.
   
Powered by vBulletin®
Copyright ©2000 - 2014, Jelsoft Enterprises Ltd.
vBulletin Skin developed by: vBStyles.com
Copyright (c) 1995-2014, All Rights Reserved. The Guru of 3D, the Hardware Guru, and 3D Guru are trademarks owned by Hilbert Hagedoorn.