It looks sharp to me with these settings (global profile), especially now with nvinspector 1.9.6.5+ and trilinear optimization disabled.
Why wouldn't it be? Colour is a very big part of an image, and how good or bad it is makes a very noticeable difference to the overall quality.
Yes, true, but the monitor has final control over colour, not the video card. What you see on your screen is different to what I see on mine, even if we're looking at the same thing. The OP even said this when bitching about the quality of TN monitors (after he blamed the video card)... he contradicted himself without even realising it. Reviewers don't take it into account when analysing IQ... just have a read of Hilbert's article. ^This isn't really directed at you... more towards the OP.
Yeah, I agree about that part; I was meaning the video card though, as I didn't want to get dragged into a monitor tech discussion, since those tend to go round in circles with so much of it based on personal preference. I meant the video card (drivers) selecting the wrong colour space for the monitor when using an HDMI connection. It's something that appears to happen to a fair few people using Nvidia cards, and unlike AMD, Nvidia don't seem to want to add an option to change it manually. Easily fixed though, albeit a bit fiddly. I'm not saying this is the reason people claim Nvidia has worse quality, but when it's wrong the difference is quite drastic in black level, contrast, colours and even perceived sharpness, so it might be.
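To show why the mismatch is so drastic, here's a rough sketch of the level maths involved. This is just illustrative Python of my own, not anything from the driver, and the function names are made up:

```python
# Why a limited/full range mismatch crushes blacks and washes out the image.
# Assumes 8-bit video levels; purely illustrative, not an Nvidia API.

def limited_to_full(y: int) -> int:
    """Expand a limited-range (16-235) code value to full range (0-255)."""
    return round((y - 16) * 255 / 219)

def full_to_limited(y: int) -> int:
    """Compress a full-range (0-255) code value into limited range (16-235)."""
    return round(y * 219 / 255) + 16

# If the driver outputs limited range to a monitor expecting full range,
# "black" arrives as code 16 and "white" as 235 -- dark grey and dim white,
# so contrast drops and everything looks washed out:
print(full_to_limited(0))    # 16  -> displayed as dark grey, not black
print(full_to_limited(255))  # 235 -> displayed as slightly dim white
```

That pedestal of roughly 16/255 sitting under black is exactly the "greyish blacks" people complain about.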
Not true. You may need to adjust your TV's settings to suit the different outputs, same as when moving from Nvidia to ATI. But once that's done, they are as good as each other. I'm using my 580 on an LED-backlit TN LCD and a plasma TV. It looks great on both. I'm a previous ATI CrossFire user; you are being conned by someone.
Indeed, many people around are claiming that, and that's the reason I'm worried. I used to have the full RGB bug when I had a GTX 295 and was connecting it to my projector. I don't understand why nvidiots are getting so defensive.
They're defensive because you came in with a question that is normally troll bait. And having an ATI card in your specs doesn't help. Personally, I love my 570. Great IQ, 3D Vision ready, FXAA, now TXAA for future titles, and at the risk of sounding fanboyish, developers seem to support Nvidia cards more and better than ATI, be it compatibility, speed or features. Nvidia marketing knows how to push its products on the right people. Something ATI has NEVER been good at, even back in the days of the 9800 Pro when they were actually the top dog for a brief time.
Yep. LED backlighting is mandatory for an IPS. :thumbup: :3eyes: I don't think nVidia has any problems with image quality nor do my two 24" Dell Ultrasharp IPS panels. CCFL backlighting and 102%+ of NTSC color space.
As far as I know, Nvidia has at least two issues at the moment (concerning image quality). One is the aforementioned HDMI black level bug, which can be fixed easily (although it shouldn't have been there in the first place). The second is color profiles. It's somewhere between hard and impossible to keep a color profile applied when switching to 3D mode. It may be a minor issue for some, but the more you rely on that profile, the worse your 3D image will be in comparison to desktop use.
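For anyone who wants to tinker, here's a rough workaround sketch: save the calibrated desktop gamma ramp and re-apply it after the driver resets it. It uses the Win32 Get/SetDeviceGammaRamp calls via Python's ctypes, and it only restores the gamma-ramp part of a calibration, not the full ICC profile transform, so treat it as a band-aid:

```python
import ctypes
from ctypes import wintypes

user32 = ctypes.windll.user32
gdi32 = ctypes.windll.gdi32

# One 256-entry 16-bit lookup table per channel (R, G, B).
GammaRamp = (wintypes.WORD * 256) * 3

user32.GetDC.restype = wintypes.HDC
gdi32.GetDeviceGammaRamp.argtypes = [wintypes.HDC, ctypes.c_void_p]
gdi32.SetDeviceGammaRamp.argtypes = [wintypes.HDC, ctypes.c_void_p]

hdc = user32.GetDC(None)  # device context for the whole screen
saved = GammaRamp()
if not gdi32.GetDeviceGammaRamp(hdc, ctypes.byref(saved)):
    raise OSError("could not read the current gamma ramp")

# ...switch into 3D/fullscreen mode here; the driver resets the ramp...

if not gdi32.SetDeviceGammaRamp(hdc, ctypes.byref(saved)):
    raise OSError("could not restore the saved gamma ramp")
user32.ReleaseDC(None, hdc)
```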
The trilinear optimization is not active when texture filtering is set to High Quality in the control panel, and it hardly affects image quality anyway. The fact that you believe a third-party tool made any difference to this setting shows it's placebo. For as many times as you've been banned for trolling and posting bad info, you're one persistent sob, I'll give you that. :flip:
What a laugh... for me Nvidia IQ has always been superior to ATI's, no matter what monitor. Colors always seemed washed out on ATI... and the black levels? Might as well call it gray.
Don't you mean CCFL backlighting is mandatory? =-P I think colour and gamma accuracy are important, especially if you want to play games on a computer monitor. Video game devs use IPS monitors; they do colour proofing and gamma correction on IPS monitors. If you want to get as close as possible to what the developers intended you to see, then an IPS is the only solution. That being said, playing games on a 40+ inch TV trumps any monitor for immersiveness imo. tl;dr - I have a 580GTX and an ASUS P246Q - 100% aRGB, 98% NTSC - colours look fantastic. On my Panasonic P50ST30 - just as amazing.
You can solve that problem very easily (for videos): in the Nvidia Control Panel, under Video (I forget the exact name of the option, it's the last one or next to last), use custom settings and change from Limited (16-235) to Full (0-255). Then movies have almost the same quality as on AMD/ATI cards. There's still a difference though: on Nvidia it's a bit brighter, and that comes down to personal preference. If you have raw (uncompressed) video, that setting doesn't affect it, so you can compare it against the encoded version and still see a very small difference even on the 0-255 setting. On the other side, AMD cards play uncompressed and encoded the same (as it should be), so if you're some sort of designer working on scenes etc., I wouldn't recommend Nvidia. Sure, there's a workaround for that, I'm 99% sure, but again, why bother.
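If you want to check which range your playback path is actually using, a quick test is to render a greyscale step ramp and look at the extremes. Here's a rough sketch of my own (it assumes Pillow is installed, and the filename is arbitrary):

```python
# Generate a greyscale step ramp to eyeball limited vs. full range output.
from PIL import Image

STEPS = 32
W, H = STEPS * 32, 128
img = Image.new("L", (W, H))
for i in range(STEPS):
    level = round(i * 255 / (STEPS - 1))  # 0, 8, ..., 255
    # Fill one vertical bar per step with a flat grey level.
    img.paste(level, (i * (W // STEPS), 0, (i + 1) * (W // STEPS), H))
img.save("ramp.png")
# On a correct full-range (0-255) pipeline every step is distinguishable;
# on a mismatched limited (16-235) path the darkest and brightest steps
# merge into solid black and white bars.
```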
The only bad thing in Nvidia's IQ that I don't see mentioned is that DX10/11 "LOD bias" adjustment isn't available.
Wow, you've got some issues, get over it. I don't care what you think about "placebo", and please don't tell me what I do and don't see. :bang: Thanks.
Hell has frozen over. Congrats, hope it serves you well. Just kind of surprised; I thought most 7970 (single-GPU) owners were happy with their cards, and surely you knew it would be a side-grade?