Discussion in 'Videocards - AMD Radeon Drivers Section' started by zer0_c0ol, Mar 3, 2009.
Nice, it's like collecting monthly rent. I like ATI, it's a good tenant lol
I don't really care that it's in 2nd place, as I play with VSync on, so 100+ fps has a care factor of 0 for me; I was simply stating a fact. And a quick Google search can instantly shoot down your assumption that the IQ has no difference. It's a well-known fact that Nvidia lowers IQ to gain more performance out of their cards. Now, I'm not saying that if Nvidia didn't do this the 4870X2 would be on top; I'm simply stating that they are well known for lowering IQ to gain extra fps. The ignorance here is really overwhelming.
Hmmm guys, pls don't make this topic a flame one plsss.
You had to replace 20% of your GPUs...I get the impression you may be doing it wrong.
Those benchmarks are total BS; I play most of those games with the 4870X2 and get much better frames than the results posted there. I'm not saying the X2 is better than the 295 or 280/285 SLI, but it definitely does a lot better than it shows in those marks.
If it's so obvious and everyone on the internet knows it, then why don't you prove it instead of just telling us what you believe? Let's see all these Google hits (not from a forum) that talk about the difference in IQ between the RV770 and the GT200/GT200b. I don't know if you remember or not, but ATI fanboys have been arguing this same thing since the 1k series and even before that in the 9000 series. It's a pretty lame tactic to try and explain away lower performance, especially without any proof of superior IQ except your word.
First of all, I didn't have to replace anything. The head of the NC Dept of IT decided to pull all the ATI cards that were giving the BSOD, indefinitely. When your phone is ringing off the hook because the brand-new Dell machines at work BSOD on start-up, it is kind of an issue..... Also, I don't know if you read what I typed or not, but Dell installed these cards and their drivers, not me.... I simply had to troubleshoot the little f*ckers, and that is when the dept pulled them.
We simply cut our losses and pulled the 4830s because stability means a hell of a lot more than anything else in a work environment. Although it is nice to see you are still under the impression that nothing is ever ATI's fault and only the user is to blame. Don't you find it odd that I've never had any problems with any other piece of hardware in my system or the systems at work, except for ATI-based products? Hell, I even volt-modded my last 3 NVIDIA cards and they are all still running to this day, and still perform flawlessly.
I'm not saying everyone will experience issues with ATI or that you won't with NVIDIA. I'm simply saying that I've seen a lot of issues with ATI drivers and I'm choosing to stay away from them.
It's with an unfair combination of transparency AA settings: with ATI's drivers they use the quality setting, and with Nvidia's they use multisampling. According to numerous sources, ATI's quality setting looks significantly better than Nvidia's multisampling. I have pointed this out to him before, but he refuses to acknowledge it, I guess.
Your words aren't proof; show me some review sites comparing IQ on the RV770 vs. the GT200/GT200b and I'll take a look.
I linked you to numerous sources in the other thread; I thought I made it clear. The card's design doesn't change what AA looks like, just how fast it's done. The actual AA algorithms are what determine it.
And I'll find something, give me a bit.
Haha you never linked anything and you still haven't.....waiting....
Yeah I did, here.
And here's your new information:
nVidia’s transparency super-sampling (TrAA) and ATi’s quality adaptive anti-aliasing (AAA) are functionally equivalent in that both use sample patterns identical to the base multi-sampling pattern currently in operation, and the samples are super-samples.
The scheme works by resubmitting the texture in question multiple times to the pipeline and shifting the sample position, later averaging the samples into one. Thus the performance hit will be proportional to the amount of alpha textures in the scene. The odd grating or fence won’t impact performance much, but large open areas with masses of vegetation can cause a significant performance impact, even more so than regular super-sampling.
nVidia’s super-sampling setting is equivalent to ATi’s quality setting, providing both vendors are set to the same multi-sampling level.
All of this is from the linked review, which compares how a GTX 260 performs against a 4850 with AA (specifically transparency AA). What is important here is the last line: ATI's quality setting is equivalent to Nvidia's super-sampling setting, not multi-sampling. Nvidia's multisampling is vastly inferior to ATI's quality. For an accurate comparison, CCC needs to be set to performance, not quality, or Nvidia's needs to be set to super-sampling.
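To make the mechanics in that quote concrete, here is a minimal C sketch of the resubmit-and-average idea. Everything in it is an illustrative assumption rather than vendor code: the striped "grate" texture, the 4x sample offsets, and the 0.5 alpha threshold are all invented for the example.

```c
/* Conceptual sketch of transparency super-sampling: re-sample an
 * alpha-tested texture at shifted sub-pixel positions and average
 * the pass/fail results into one coverage value. Not vendor code. */
#include <stdio.h>

#define NUM_SAMPLES 4

/* Toy "alpha texture": a vertical grate, opaque in alternating stripes. */
static float sample_alpha(float u, float v)
{
    (void)v; /* stripes only vary along u */
    return ((int)(u * 8.0f) % 2 == 0) ? 1.0f : 0.0f;
}

/* Sub-pixel offsets; real hardware would reuse the positions of the
 * base multi-sampling pattern, these four are made up for the demo. */
static const float offsets[NUM_SAMPLES][2] = {
    { -0.125f, -0.375f }, {  0.375f, -0.125f },
    { -0.375f,  0.125f }, {  0.125f,  0.375f },
};

/* Resubmit the texture lookup once per shifted sample position,
 * alpha-test each sample, then average into a single coverage value. */
static float transparency_ss_coverage(float u, float v, float texel)
{
    float covered = 0.0f;
    for (int i = 0; i < NUM_SAMPLES; i++) {
        float a = sample_alpha(u + offsets[i][0] * texel,
                               v + offsets[i][1] * texel);
        if (a >= 0.5f) /* alpha test at this sub-sample */
            covered += 1.0f;
    }
    return covered / NUM_SAMPLES;
}

int main(void)
{
    /* A pixel sitting on a stripe edge gets fractional coverage
     * (a soft edge) instead of the hard 0/1 of a plain alpha test. */
    printf("coverage at stripe edge: %.2f\n",
           transparency_ss_coverage(0.25f, 0.5f, 0.05f));
    return 0;
}
```

The performance behaviour the quote describes falls out directly: the loop runs once per alpha-textured fragment, so a screen full of vegetation multiplies the texture lookups, while a lone fence barely registers.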
Oh, don't give me that BS, it's not true because you're too lazy to search for it? Are you actually that ignorant or are you trying to get a rise out of me? :heh:
Yes, they are BS, at least today, since the original 8.12s were pretty much owned already by the 8.12 Hotfix (also called 9.1 beta). The X2 took a huge bite out of the GTX295's lead with the hotfix, and ended up on average faster than the GTX295 with AA applied at high res...
Now, the pro-Nvidia sites continue to use old ATI drivers when reviewing, and only fools believe them or use them for comparison in March. Catalyst 9.2 is now the current driver and represents ATI's performance today, so 295 vs. X2 is only accurate using these drivers for ATI. And it seems every site always uses the latest officials at review time for Nvidia; odd, isn't it...
Here is a review of the X2 vs. GTX295 using the 9.1 beta/8.12 hotfix: http://translate.google.com/transla...rce_gtx_295/#abschnitt_einleitung&sl=de&tl=en
Looks kinda different, doesn't it? Things will sway one way or the other depending on the drivers or the games tested. Some will desperately look for the ones that suit their view, no matter how wrong it might be...
From your own source:
Did you not bother to read the whole article? NVIDIA IQ wins overall
Don't feed the trolls
And you completely miss the point.
He very clearly states they are different but equal. He prefers Nvidia's AA but recognizes strengths in ATI's as well, and recommends you pick the one that's best in the "areas you care the most about".
But this is beside the point. The source I linked you compared Nvidia transparency super-sampling anti-aliasing to ATI quality anti-aliasing and they came up more or less equal with a slight edge to Nvidia in the reviewer's opinion.
Your benchmark, on the other hand, compares ATI's quality AA to Nvidia's multi-sampling. If ATI's quality setting is a wash with Nvidia's super-sampling, then why would you compare ATI's quality setting with Nvidia's multi-sampling? It makes no sense and artificially reinforces the lead Nvidia has in your benchmarks.
As for overall AA quality, my judgement is ATI. I prefer when the trees look fuller. I choose r_useedgeAA=2 in Crysis over FSAA because I like the way it makes the trees look; Nvidia gives them a blurry appearance. Again, this is all personal preference and I'm not debating it; that's not my purpose.
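For what it's worth, that tweak is a single console variable; a minimal sketch of applying it, assuming the usual convention of dropping the line into an autoexec.cfg in the game's root folder (it can also be entered straight into the in-game console):

```
r_useedgeAA=2
```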
My purpose was solely to show you that your bench was flawed, because flaunting flawed, biased benchmarks does nothing constructive around here.
I never said that benchmark was perfect; I simply said that the IQ between the GT200 and the RV770 has no VISIBLE difference, and I see that you agree with me.
Your using it to gauge performance implies that you think it is fair. The only thing I can take from that statement is that you do not intend to give ATI a fair chance and I think that makes you the very definition of a fanboy and completely negates most of the opinions and ideas you have and will give on this forum.
Immense performance increase? Sigh.
Believe it when I see it.
Hehe, not top dog, but second best satisfies me + I got a big fat d*ck so I don't have to buy every new piece of hw to build my e-penis.
I'm not like the Nvidia guys:
"Hey, there's a new card? Wow!!! Not actually new, just 2 cards stacked together... but it's the fastest around :banana: WOW, AWESOME DUDE!!! But wait, who's gonna buy that when "regular" SLI will work and often give better results, with no microstutter and no waiting a couple of months for drivers?"
A fanboy will buy it 'cause it has a bigger number than the last one.
P.S. Why don't u play something with your uber card? Something like Crysis for the Nth time. I c u spend too much time here bashing ATI; I hope that's not the only reason why u bought the GTX295.