I have a PC with two Nvidia 8800GTS cards (512 MB each) in SLI mode, using a Samsung monitor, quite good. Now I read about people using a high-definition TV as an everyday computer monitor. How does that work? Each of my Nvidia cards has two HDMI connections, and those TVs also use HDMI connections. Will that work fine for my work in 3D applications? Note I'm not playing games; I'm talking about my work computer. I just want to know, since these high-definition TVs are everywhere now and the prices are super competitive compared to real PC monitors. Today a 37" Samsung HDTV with 15,000:1 contrast is 30% less expensive than a 27" Samsung monitor with 3,000:1 contrast! For a person who needs ONLY one monitor (because SLI is enabled on the graphics cards), that is a godsend. Now, will it really work as well as a monitor, or do I have to be aware of troubles to come? Thanks for the help here... cheers, guys!
Depends on whether you are willing to work at 1366×768 stretched over the 37", or 1920×1080 stretched over the 37" in the case of Full HD.
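A quick back-of-the-envelope sketch of what "stretched over 37 inches" means in pixel density, using plain Pythagoras. The diagonal sizes and resolutions come from the posts in this thread; the 27" 1920×1200 figure is the monitor resolution mentioned below and is used here just for comparison:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# The two candidate TV resolutions on a 37" panel, vs. a 27" PC monitor.
print(round(ppi(1366, 768, 37), 1))   # 720p-class 37" TV
print(round(ppi(1920, 1080, 37), 1))  # Full HD 37" TV
print(round(ppi(1920, 1200, 27), 1))  # 27" 1920x1200 PC monitor
```

Even at Full HD, the big TV has noticeably coarser pixels than the smaller monitor, which is exactly the clarity trade-off being discussed (though coarser pixels can actually help if small text is the problem).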
http://global.msi.com.tw/index.php?func=proddesc&prod_no=1371&maincat_no=130&cat2_no=136 my cards are here. Right now I'm at 1920×1200 and it's pretty tiny for me... I have myopia... so...
It also matters sometimes whether you get screen glare when you're working. Overall, computer monitors have better picture clarity than TVs...
I use, as my only monitor, a 47" Westinghouse 1080p LCD HDTV. Haven't had a single problem. Although the HDMI input is preferred, most also come with the regular D-sub connector. Go HDMI if you can, though. I drive mine with an 8800GT 512 and it's wonderful. I don't have any glare or reflection issues, and because I'm running the full 1080 from the card, clarity isn't an issue. With my trusty recliner, trackman wheel, and illuminated keyboard, I wonder how I did it all those years without what I have now. Simply breathtaking, I'll tell ya! -V- (pretty happy camper)
For normal viewing like web surfing on an HDTV, you need to run it at 720; when playing games or watching movies, run at 1080. Running at 1080 you just can't read anything, but games and movies are perfect. I run my computer on a 64" HDTV and that's how I set my resolutions. Oh, and a side note: because all my sound goes to a receiver, I use component instead of HDMI.
It depends on the TV. A 720p TV will run at 1280×720 (most plasmas and DLPs) or 1366×768 (LCDs) native resolution, so that's what you need to set your display adapter to for the clearest picture. For desktop apps, some people find this resolution too low for a big screen (huge icons and text). A 1080p set runs at 1920×1080, which is great for desktop apps and browsing the internet. For most games you'll need a pretty good video card to play at those resolutions; however, most TVs can scale down to 720p, so it shouldn't be much of an issue. I have an LG 42" LCD (1366×768) and have played games on it.

If your TV doesn't have a DVI port, I recommend getting a DVI-to-HDMI cable. It looks noticeably better than the standard 15-pin D-sub connection. Audio needs to be hooked up separately. You'll probably need to adjust your display settings a bit differently for PC use as well. I have multiple picture settings saved on mine, so I've got one set up for the PC whenever I feel like moving it into the living room.

Also, some TVs have noticeable input lag. Basically, there is a short gap between when the signal comes out of the PC (or other source) and when it is displayed on the screen, because the TV's software has to "process" it first. Some TVs may have almost half a second (or more) of lag between what your PC/source is doing and when it shows up on the screen. This is bad for gaming, very bad. There's not much you can do about it, other than research the TVs you are interested in before buying. My gf's Toshiba has a "gaming mode" which is supposed to cut input lag down to a max of 30 milliseconds. I haven't tested it out, though.
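To put those input-lag numbers in perspective, here's a small sketch converting lag in milliseconds into refresh cycles. The 60 Hz refresh rate is an assumption (it's the typical rate for these sets); the 30 ms and half-second figures are from the post above:

```python
def frames_of_lag(lag_ms, refresh_hz=60):
    """How many refresh cycles a given input lag corresponds to."""
    frame_time_ms = 1000.0 / refresh_hz  # ~16.7 ms per frame at 60 Hz
    return lag_ms / frame_time_ms

print(round(frames_of_lag(30), 1))   # "gaming mode" spec: under 2 frames
print(round(frames_of_lag(500), 1))  # half a second: dozens of frames behind
```

Under two frames of lag is hard to notice even in games; a half-second TV would feel sluggish even for ordinary desktop mousing.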