Hey, currently I'm using what I'm guessing is still called VGA, it's the one with the white connector, not the blue one. I don't have an HDMI cable, but does it matter which one I use?
AFAIK the only downside to VGA is that it isn't bit-perfect like the digital options. Generally speaking, though, you wouldn't notice a difference at resolutions at or below 1080p.
I'd rather use VGA than HDMI on my HDTV, which is one reason I still have my old HDTV, since all the new ones are HDMI-only. I'd also choose DVI over HDMI on monitors, and DP over HDMI.
VGA is usually better on HDTVs because using HDMI makes some TVs switch to a weird interlaced mode or something, and the picture gets fuzzy as hell.
The only difference between DVI and HDMI is the connector. The signaling is identical and both are capable of carrying audio. A lot of HDTVs have dual-mode HDMI inputs. 30 Hz is the default on many TVs; a lot of them let you rename the HDMI input and will then switch to 60 Hz. 720p or 1080i is what a lot of HDTVs use for PC connections.
Just to be clear, it's DVI, since it's the white one. And I'm using it on an LED monitor (60 Hz), not a TV.
Actually, with current video cards and monitors the decision should be between HDMI and DisplayPort. VGA is being phased out from what I can tell.
VGA is really sensitive to EMI and to the quality of the cable. I prefer DVI over HDMI because it lets you overclock the monitor with custom refresh rates.
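To show why the refresh rate you can push depends on the link, here's a rough back-of-the-envelope sketch in Python. The blanking values are CVT-reduced-blanking-style approximations I'm assuming for illustration, not your monitor's real timings; the ~165 MHz figure is the standard single-link DVI pixel clock limit.

```python
# Rough sketch: estimate the pixel clock a custom ("overclocked") mode needs,
# and check it against single-link DVI's ~165 MHz limit. The blanking numbers
# are CVT-reduced-blanking-style guesses, not exact timings for any monitor.

SINGLE_LINK_DVI_MHZ = 165.0  # max TMDS pixel clock for single-link DVI

def pixel_clock_mhz(h_active, v_active, refresh_hz, h_blank=160, v_blank=31):
    """Pixel clock = total pixels per frame * frames per second."""
    h_total = h_active + h_blank
    v_total = v_active + v_blank
    return h_total * v_total * refresh_hz / 1e6

for hz in (60, 75, 96):
    clk = pixel_clock_mhz(1920, 1080, hz)
    fits = "fits" if clk <= SINGLE_LINK_DVI_MHZ else "exceeds"
    print(f"1920x1080 @ {hz} Hz -> ~{clk:.1f} MHz ({fits} single-link DVI)")
```

With these assumed timings, 1080p60 lands around 139 MHz, while 75 Hz and above blows past the single-link limit, which is roughly the arithmetic you're doing when you overclock a monitor.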
I think all of you are ignoring the OP and talking about VGA for no reason. @OP DVI and HDMI are exactly the same; the difference with HDMI is that it also carries an audio signal. So it doesn't matter which one you use.
DVI actually can carry audio. I think it just depends on both devices at each end of the cable supporting audio over that connection.
With HDMI the signal is sent digitally: what is displayed on the screen is exactly what the video card outputs. VGA is analogue; what is displayed comes from variations in the signal itself. Digital (HDMI/DisplayPort) should give a noticeably clearer image than analogue (VGA). The reason is that the analogue waveform is imperfect and imprecise, and the timing isn't exact either; since the waveform itself carries the picture, rather than just acting as a carrier for digital data, fine details won't be sharp. This is a gross oversimplification, but it will do. The others are right about outputting HDMI to TVs, HOWEVER it would have to be a pretty old or crappy TV not to accept and display 1920x1080p @ 60 Hz (better ones do things like 12 bpc, RGB 4:4:4, etc.).
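Here's a toy Python sketch of that idea. It is purely illustrative; real VGA and TMDS encoding work nothing like this, but it shows why noise on the wire shifts an analogue value while a digital receiver, which only has to decide 0 or 1 per bit, throws the same noise away.

```python
# Toy illustration (not how VGA/HDMI actually encode anything): the same
# brightness value sent as an analogue level vs. as digital bits, with
# similar noise added on the wire.
import random

random.seed(1)

def send_analogue(value, noise=0.03):
    """Brightness encoded directly as a 0..1 level; wire noise changes the picture."""
    v = value + random.uniform(-noise, noise)
    return min(max(v, 0.0), 1.0)

def send_digital(value, noise=0.03, bits=8):
    """Brightness encoded as bits; the receiver re-decides each bit, rejecting noise."""
    level = round(value * (2**bits - 1))
    received_bits = []
    for i in range(bits):
        bit = (level >> i) & 1
        wire = bit + random.uniform(-noise, noise)    # noise on the wire
        received_bits.append(1 if wire > 0.5 else 0)  # threshold decision
    received = sum(b << i for i, b in enumerate(received_bits))
    return received / (2**bits - 1)

value = 0.5
print("analogue:", send_analogue(value))  # slightly off every time
print("digital: ", send_digital(value))   # same quantised level back, noise rejected
```

The analogue copy comes back a little different on every run; the digital copy returns the exact quantised level as long as the noise stays below the decision threshold, which is the "bit-perfect" property people mean.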