Exploring ATI's Image Quality Optimizations [Guru3D.com]

Discussion in 'Frontpage news' started by Guru3D News, Dec 2, 2010.

  1. Skiddywinks

    Skiddywinks Ancient Guru

    Messages:
    4,559
    Likes Received:
    0
    GPU:
    Asus Matrix 285
    Can't say I'd argue with any of that. I don't ever touch driver settings either, with a few exceptions like Fallout: New Vegas before Patch 3, or forcing AA on games that don't offer it, etc.

    It's unfortunate that ATI didn't mention the optimisations; hopefully they won't do that again. As for leaving them on by default, that's entirely a matter of personal preference. The default may as well benefit those less versed in PC specifics, since the people most likely to care are also the ones most likely to know how to change it.
     
  2. mameira

    mameira Guest

    Messages:
    1,425
    Likes Received:
    1
    GPU:
    eVGA 470 GTX
    It's not only ATI fanboys; it's fanboys on both sides trying to prove to each other that they made the better buy by getting brand XXX's GPU.
     
  3. GhostXL

    GhostXL Guest

    Messages:
    6,081
    Likes Received:
    54
    GPU:
    PNY EPIC-X RTX 4090
    I noticed this crap going on with my HD 5870. It wasn't like this at first; I just didn't care for the tricks. Who's to say they don't pull other tricks... it's more than probable that they do.
     
  4. Ryu5uzaku

    Ryu5uzaku Ancient Guru

    Messages:
    7,551
    Likes Received:
    608
    GPU:
    6800 XT
    TBH, both companies do optimizations with driver updates...
     

  5. getsuga12

    getsuga12 Ancient Guru

    Messages:
    4,313
    Likes Received:
    0
    GPU:
    Geforce GTX870M
    Mmm, I've always had mine set to High Quality; guess I should give Quality a whirl and see how big a difference it makes. Oh wait, I run everything with Vsync on, so I'm not gonna notice a damn difference in performance (so I'll stick with High Quality).
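
    Since Vsync caps the displayed frame rate at the monitor's refresh rate, any raw gain above that ceiling is invisible. Here's a minimal sketch of that reasoning, with hypothetical numbers and an assumed 60 Hz display (the displayed_fps helper is made up for illustration):

        REFRESH_HZ = 60  # assumed 60 Hz display

        def displayed_fps(raw_fps: float, vsync: bool = True) -> float:
            """Frames actually shown per second; Vsync caps at the refresh rate."""
            return min(raw_fps, REFRESH_HZ) if vsync else raw_fps

        quality_fps = 80.0                     # hypothetical raw fps on "Quality"
        high_quality_fps = quality_fps / 1.08  # ~8% slower on "High Quality"

        print(displayed_fps(quality_fps))       # 60.0 -- capped
        print(displayed_fps(high_quality_fps))  # 60.0 -- capped, no visible gap

    With both settings rendering faster than the refresh rate, the on-screen frame rate is identical either way.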
     
  6. inklimited

    inklimited Ancient Guru

    Messages:
    6,117
    Likes Received:
    0
    GPU:
    Gigabyte 6850 900MHz
    I preferred ATI's AF optimisations back when I owned ATI cards.
     
  7. ivan.winata

    ivan.winata Guest

    Messages:
    425
    Likes Received:
    0
    GPU:
    GTX2080Ti Zotak
    I can't see the difference!!! That's good - I've changed my setting to Quality now, muahahahahaha :D
    8% more performance is welcome for me.
     
  8. Jadawin

    Jadawin New Member

    Messages:
    8
    Likes Received:
    0
    Just one point I want to make: if the AMD standard settings had such a bad effect on image quality, it would be logical to expect reviewers and testers to notice it. But 99.9% of all reviewers did not notice it.

    The fact that it takes "knowing how and where to look" and experienced experts in 3D graphics to find the difference does seem to prove AMD's point: it "works for almost everyone without any noticeable image quality loss".

    It gets totally theoretical anyway, because basically no one compares such things. Most people have only one card at a time, and if I play a game and the graphics look good to me and the performance is fine too... do I need to check whether a competitor is 10 pixels per image worse or better?

    The only thing AMD needs to do is explain the settings clearly, directly under the box. It's their business to set the standard settings, and if the image quality really is worse than with Nvidia's standard settings, then that's something the reviewers have to mention - IF they notice it. If they don't, then the optimizations work as intended, I'd say.

    Oh, and I've had ATI and Nvidia cards basically alternating every 1-2 years, so I don't care about "brands".
     
  9. Evasive

    Evasive Member

    Messages:
    18
    Likes Received:
    0
    GPU:
    EVGA 580 GTX
    I just sold my 5870 a couple of weeks ago for a 580 GTX.

    Yes, it's true, I had to change the setting from Quality to High Quality in the ATI drivers... this is something I have known since the X1900 XT days...

    But when I installed my 580 GTX, I ALSO HAD TO SET IT FROM QUALITY TO HIGH QUALITY IN THE NVIDIA DRIVERS.

    Why isn't this in the article as well?

    Seems biased to me.

    And this is coming from someone who just moved from an AMD product to Nvidia...
     
  10. WhiteLightning

    WhiteLightning Don Illuminati Staff Member

    Messages:
    30,788
    Likes Received:
    3,959
    GPU:
    Inno3d RTX4070
    Indeed, it's the never-ending battle. Perhaps we should start a pee contest to declare the winner.
     

  11. Toli001

    Toli001 Member

    Messages:
    11
    Likes Received:
    0
    GPU:
    Geforce RTX 3080
  12. ScaryClown

    ScaryClown Guest

    Messages:
    686
    Likes Received:
    0
    GPU:
    3090 Zotac
    I hate IQ optimizations!
    It took me 3 months to figure out why dirt roads get blurry in the distance while playing Crysis.
    This did not happen on my old NV 8800.
    ATI should stop enabling these optimizations by default.
    It's not OK to sacrifice IQ for the sake of, what, 8% more performance?!
    Or if they do use optimizations, they should clearly state this.

    Also, I find CCC very difficult to use. Everything seems hidden and obscure.
    The NV control panel was easier to use, at least for me.
    I say this as an ATI fanboy!
     
  13. Darkasantion

    Darkasantion Master Guru

    Messages:
    982
    Likes Received:
    0
    GPU:
    Club 3D HD5870 1035/1325
    I did know about this "change" from ATi when the HD 6870 was released, and I never noticed a "downgrade" in graphics, yet I gained 8% in performance. So why should I complain?
    Sure, if you look very closely and know where to look you can spot the difference, but if you're not looking for it and not noticing it, yet gain +8% performance, nobody is disadvantaged. All I can say is hats off to ATi for finding a good way to further increase performance without real IQ loss.

    I did notice the change with Morphological filtering, however - it screws up your text and stuff :p
     
    Last edited: Dec 3, 2010
  14. alanm

    alanm Ancient Guru

    Messages:
    12,272
    Likes Received:
    4,474
    GPU:
    RTX 4080
    In summation, this is how it seems...

    What ATI did is good for their customers who want the extra 8% perf gain, and I'm sure Nvidia's users would not mind something similar if IQ was not noticeably impacted either.

    But that is NOT the issue.

    The issue is: was it done for the benefit of the user, or was it more of a marketing tactic to sell more cards by giving the impression their cards had 8% more grunt? If this had happened with older cards and drivers, no problem. But it happened with the entry of NEW cards into the market, where reviewers' first impressions are vital. It may have swayed users toward the 6870, since it appeared to be very slightly ahead of the GTX 470 in many benches.

    If you are happy with the premise that ATI was thinking of end users' interests more than their own, and that profit is ultimately a lesser priority to them in this business, then so be it. I'm sure AMD shareholders might raise an eyebrow, but so be it.

    And for those who seem to think these optimizations were done in the interest of end users: you would think AMD would blog about it and shout to the world, "Look what we've done! Significant performance gains, negligible IQ impact!" But they didn't. Could it be they were keenly aware of the potentially embarrassing implications, and that's why they kept it quiet?
     
  15. Exodite

    Exodite Guest

    Messages:
    2,087
    Likes Received:
    276
    GPU:
    Sapphire Vega 56
    Nvidia applies optimizations at default settings as well - several of them, in fact.

    What's hilarious, and sad, is that this is somehow newsworthy. Was this last little optimization, which is still all but impossible to detect, the straw that broke the camel's back?

    Nvidia doesn't get more flak than AMD because they're Nvidia; they get more flak because they're more prone to asshattery than AMD. Cue this entire 'image quality' debacle.

    They could have implemented similar optimizations of their own, and people would have cheered at the extra performance. Instead they resorted to a mudslinging contest over something the vast majority of users can't even notice under all but exceptional, and artificial, circumstances.

    That's also disregarding the fact that they too use optimizations, at the cost of a similarly unnoticeable image quality penalty.

    That the first thread on this topic wasn't immediately locked was disappointing enough; that we now have an official trolling thread for this is just sad.
     

  16. buddybd

    buddybd Master Guru

    Messages:
    827
    Likes Received:
    1
    GPU:
    EVGA GTX 1070 FTW
    I really don't mind them using optimisations, but what I do care about is how they affect benchmarks. If the stock ATI settings produce lower image quality than nVidia's stock settings, then obviously ATI has an unfair advantage. This is a generation of cards where 5-10% is all the difference there is!

    So let's all hope for a G3D 'High Quality' settings game benchmark review :). In fact, since this analysis is already done, it would be a shame not to do such a review.
     
  17. GREGIX

    GREGIX Master Guru

    Messages:
    856
    Likes Received:
    222
    GPU:
    Inno3d 4090 X3
    Heck, just rename a commonly used game's .exe to something non-game-related and you'll see Nvidia's driver optimisations switch off - so who's the cheater then? (A rough sketch of the trick follows below.)

    BTW - I've got an 8800 AND a 4870 in two PCs, so I'm not a fanboy; just saying that neither company is clean here.
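
    A hedged sketch of that renaming trick, in Python (the path and the neutral name here are hypothetical; the idea is that driver application profiles match on the executable's file name, so a copy under a non-game name sidesteps the per-game profile):

        import shutil
        from pathlib import Path

        # Hypothetical game binary; substitute your own path.
        game_exe = Path(r"C:\Games\SomeGame\game.exe")
        # Any non-game-related name the driver has no profile for.
        neutral_exe = game_exe.with_name("editor.exe")

        # Keep the original intact; launch/benchmark the copy instead.
        shutil.copy2(game_exe, neutral_exe)
        print(f"Run {neutral_exe} to test without the driver's per-game profile.")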
     
    Last edited: Dec 3, 2010
  18. sutyi

    sutyi Member Guru

    Messages:
    106
    Likes Received:
    0
    GPU:
    Gainward GTX 660 OC 2GB
    Thanks, I see we got nice uncompressed BMPs too. ^^
     
  19. Evasive

    Evasive Member

    Messages:
    18
    Likes Received:
    0
    GPU:
    EVGA 580 GTX
    G3D should have been using High Quality for years - this AF optimization issue has been around for about 5-6 years now, hasn't it?

    I am actually shocked that a GOOD website like G3D would use Quality... This shows me that the reviewers have not been putting these cards at the maximum High Quality setting.

    Remember, BOTH Nvidia and AMD default to Quality over High Quality. I just expected G3D to know better... to me it seems like they got caught being amateurish...
     
  20. Ryu5uzaku

    Ryu5uzaku Ancient Guru

    Messages:
    7,551
    Likes Received:
    608
    GPU:
    6800 XT
    ROFL, that's cool stuff. You can see that both images have their strong points, which is kinda funny; in the distance Nvidia looks really poor by comparison - the text isn't clear at all :D
     
