Discussion in 'Frontpage news' started by cowie, Apr 3, 2010.
Those pictures look exactly the same; I don't know what the big fuss is about.
Same. I think this counts more as a good optimization: I got better performance out of it, and I didn't see any IQ loss whatsoever.
I agree with you, Copey.
And even if you lose IQ...
Do you stop using JPEGs because you lose quality you almost don't notice?
If I don't notice it, I don't care, and that went both for when I was on Nvidia and now that I'm on ATI. (As opposed to stopping like a bitch to stare at a column and saying, "hey, this is optimized", when you'd never notice it otherwise.)
I do dislike it if they don't tell me they're doing it, but oh well :infinity:
While I'm not sure if those pics are legit or not, they definitely don't look the same; it's obvious the second you look at them. But I am viewing on a 46-inch TV.
Nvidia and ATI are as bad as each other when it comes to things like this; I remember Nvidia doing it a few years back. To be fair, I'm really past caring, I'll just get the card that suits my wants/needs at the time.
ATI has more texture noise than Nvidia, but ATI has 360-degree (perfect circle) AF.
It's a matter of taste which one looks better
New drivers improving performance? Outrageous.
Perhaps some of you would be better off with consoles; it could improve your blood pressure...
Call me whatever you want, but I can't see a difference; I even reinstalled 10.2 to check whether it's true.
If they somehow degraded IQ, it gives more performance, and I can't see the difference,
I DON'T care.
I have both brands, and I think ATI IQ is better, especially with SSAA ... :nerd:
Turn brightness down in the ATI screenshot and they look the same ...lol
I'm sure Nvidia has had reference-quality AF since the 8 series, so picture quality should be the same, depending on the settings.
I think if anyone remembers the Quake 3 ATI driver tweaking, then this isn't anything new. I'm not saying Nvidia doesn't cheat; I'm saying ATI has this as part of its track record.
Bringing fermi into it doesn't negate what's actually going on.
Also, regarding benchmarks: it does f***ing matter when there's a 5-10 fps difference without the reader realising it, because that means someone is being conned. Times are different, though, and I think for most gamers any optimizations are welcome as long as they're user-changeable, like in this case.
All I'm saying is that it's a slippery slope when you spend time posting assumptions about what 'those' people would do instead of discussing actual facts.
It's not an assumption, and I obviously don't mean everyone, but you only have to look at the existing Fermi threads to see that is exactly what would happen.
Will go check the other Fermi threads to see if there is less trolling.
I did discuss the facts: I said that while there is a very noticeable difference in image quality, I would like to see this confirmed by more places before I believe ATI lowered image quality in the latest drivers.
No, it's the same.
The so-called evil optimizations are applied to the "normal" (default) Catalyst AI setting in this case. The review does not mention using "advanced" anywhere; as a matter of fact, the test-system page clearly states the use of default settings with both drivers.
Keep in mind that disabling Catalyst AI will disable all application-specific optimizations AND CrossFire. Disabling AI is meant for troubleshooting purposes, such as when you notice image corruption in particular games.
To sum it up, aggressive optimizations could be expected in "advanced" mode, but they've applied them to "normal" mode this time.
I don't know if you don't see it or you don't want to see it, but entering a thread with a post like yours makes you look like the troll here, not the (imaginary) people you're (hypothetically) blaming for bashing NVIDIA over a (nonexistent) similar situation.
I've tried to point out that my impression is that you're better than that, though I certainly won't press the issue.
Exodite, nah, I'm probably not better than that. Yes, I agree it does appear troll-like to start it before any of the ATI guys even said a thing, but you'll have to forgive my pessimism about the situation; the senseless insults and fanboy branding of anyone who doesn't agree with a certain point of view have gotten out of hand around here recently.
applejack, interesting, I had missed that part. I have to say well done ATI, nice to see them being a bit more aggressive, and if the 5800 owners didn't notice it, there's not a huge amount of harm done to the existing customer base.
I don't see a big difference, but I still find it funny.
Wow, the apocalypse is happening right here, on Guru3d. Graphics card nuclear war. I'm honestly surprised this thread hasn't been locked yet considering how much random flaming, trolling, and ridiculous arguments there are.
Disabling AI also disables CrossFire; "normal" is the default without any tweaks, and "advanced" adds tweaks.
You obviously did not read the rest of my post. As I said, if the drivers took out quality you did not notice, and by notice I mean not being able to see it even if you took a screenshot and zoomed in 100x, then that's fine. But the simple fact is, if it was Nvidia doing this, everyone would be all over them about how they're a horrible company cheating their way to performance and blah blah blah, yet when ATI does it, it's met with "oh well, I don't notice it, so it doesn't matter". Your post even proves that.
And no, I am not being a fanboy here, I am stating the facts.