Exploring ATI's Image Quality Optimizations [Guru3D.com]

Discussion in 'Frontpage news' started by Guru3D News, Dec 2, 2010.

  1. MAD-OGRE

    MAD-OGRE Ancient Guru

    Messages:
    2,905
    Likes Received:
    0
    GPU:
    SLI EVGA 780 Classifieds
    Thanks Hilbert for giving us valid info that we can see for ourselves.
     
  2. The Postman

    The Postman Ancient Guru

    Messages:
    1,773
    Likes Received:
    0
    GPU:
    MSI 980 TI Gaming
    I don't understand the point of this. It's like it is concluding that ONLY AMD uses optimizations to gain fps.

    This article needs more depth and a better comparison. Showing benchmarks from only one side doesn't look fair. The default settings are fine; not everyone owns a 5870, and those extra fps for no "visible" image quality loss are more than welcome.

    If we are going to be fair with both sides, let's do reviews or benchmarks with all optimizations off.
     
  3. IPlayNaked

    IPlayNaked Banned

    Messages:
    6,555
    Likes Received:
    0
    GPU:
    XFire 7950 1200/1850
    I blame primarily the reviewers here, and to a lesser extent both AMD and Nvidia.

    I mean, review websites exist to make objective comparisons.

    If we accept that review websites exist to make those comparisons, then the review websites need to ensure that they are testing things at similar settings. While I realize this is impossible in all circumstances, I don't think any real effort is made by any big review website to keep settings similar between vendors.

    The vendors could of course make this simpler by offering thorough explanations of all their settings, and of the comparable settings in the competition. But they are the competitors here; it's expected that either of them will try to play slightly dirty by playing up their own settings and choices and playing down the competition's.

    The websites need to be the referee, and ensure that these two companies play fair. If we can't trust the websites to find out the testing flaws, who is going to?

    That said, I prefer AMD's approach. I would prefer to have optimizations available that I can enable or disable than not have them at all for moral reasons. I just think AMD should have been more forthcoming about the driver defaulting to the optimizations, and more forthcoming with reviewers that, if they want an objective comparison, they should enable HQ AF.
     
    Last edited: Dec 2, 2010
  4. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    48,388
    Likes Received:
    18,559
    GPU:
    AMD | NVIDIA

    Hey man, yes I know, trilinear filtering was enforced in the BioEngine.ini

    Guys, due to the JPG remark I've replaced the Mass Effect JPG screenshots with lossless BMP files (12 MB apiece); it doesn't get any better than that.
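    For anyone who wants to replicate it: forcing trilinear in Mass Effect is a one-line ini tweak. A minimal sketch of what that looks like in BioEngine.ini (section and key names quoted from memory, so treat them as an assumption and verify against your own file):

    [SystemSettings]
    ; force trilinear filtering instead of the default bilinear (assumed key name)
    Trilinear=True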
     

  5. k3vst3r

    k3vst3r Ancient Guru

    Messages:
    3,702
    Likes Received:
    177
    GPU:
    KP3090
    The whole point is that both sides should have 100% max image quality set as the default, without any messing about, with the end user then choosing whether they would like to drop quality for extra performance.
    As HH pointed out, he uses default driver settings. Sure, he could change the ATI drivers to HQ so they're at the same level as the Nvidia drivers for testing and reviewing, but why should he when it's not the default? :bang:
     
  6. Ryu5uzaku

    Ryu5uzaku Ancient Guru

    Messages:
    7,547
    Likes Received:
    608
    GPU:
    6800 XT
    Last edited: Dec 2, 2010
  7. alanm

    alanm Ancient Guru

    Messages:
    12,234
    Likes Received:
    4,436
    GPU:
    RTX 4080
  8. shane_p

    shane_p Member

    Messages:
    46
    Likes Received:
    0
    GPU:
    590GTX HYDRO COPPER CLASS
    Wow, now I am upset, WITH NVIDIA!

    Why don't they do the same thing and get my 480 some more performance, for a minor tweak that 99.99% of you guys cannot even see?

    LOL
    Come on Nvidia, do the same thing here. I see people are upset with ATI, but Nvidia is behind the ball, I say... lol
     
  9. WhiteLightning

    WhiteLightning Don Illuminati Staff Member

    Messages:
    30,766
    Likes Received:
    3,934
    GPU:
    Inno3d RTX4070
    Yeah, I bet Nvidia's espionage tactics of infiltrating someone into ATI's driver team really paid off now :)
     
  10. Anarion

    Anarion Ancient Guru

    Messages:
    13,599
    Likes Received:
    386
    GPU:
    GeForce RTX 3060 Ti
    Good! ;)
     

  11. Omagana

    Omagana Guest

    Messages:
    2,532
    Likes Received:
    1
    GPU:
    RTX 2070 Super
    Personally,

    An 8-10% increase in performance from removing bits of fine detail I can't even notice without comparing two screenshots and looking very closely sounds like intelligent design. Plus, it can be turned off if knowing about it bothers you. When I start to visually notice something, then I'll be "up in arms".

    The only folk bitching about this are Nvidia users who want a reason to claim a bigger e-peen lol... sad
     
  12. Kohlendioxidus

    Kohlendioxidus Guest

    Messages:
    1,399
    Likes Received:
    13
    GPU:
    Sapphire Vega 56 Pu
    Seems... you're right!
     
  13. PinguX

    PinguX Maha Guru

    Messages:
    1,123
    Likes Received:
    303
    GPU:
    Sapphire RX580 8GB
    How do AMD users lose out if they're getting extra performance without a noticeable penalty on image quality?
     
  14. alanm

    alanm Ancient Guru

    Messages:
    12,234
    Likes Received:
    4,436
    GPU:
    RTX 4080
    I see nothing wrong with ATI's little trick from a practical standpoint for its users. In fact, I wouldn't mind if Nvidia did this themselves, for an 8-10% perf increase vs a negligible IQ penalty. But sadly, it doesn't look like it's being done for the benefit of its users, but rather as an advantage in marketing cards that appear to be 8-10% faster vs the competition.

    I'm glad that this happened so both sides are now aware that intense scrutiny will be on all future driver releases.
     
  15. Omagana

    Omagana Guest

    Messages:
    2,532
    Likes Received:
    1
    GPU:
    RTX 2070 Super
    It's an advantage in marketing because it's a great "trick"; you just said yourself you wouldn't mind if Nvidia did it.

    Obviously it's done to benefit users.
     

  16. Ryu5uzaku

    Ryu5uzaku Ancient Guru

    Messages:
    7,547
    Likes Received:
    608
    GPU:
    6800 XT
    For sure. But if ATI doesn't have such things while Nvidia does, blocking them means you're effectively testing ATI at, say, 8xAA when Nvidia is at 4xAA... or so. Really, both do optimizations: Nvidia does it in game profiles, and ATI does it with that arse Catalyst A.I. You can turn everything off and get crappier performance with next to no change in IQ unless you go checking really hard.
     
  17. deltatux

    deltatux Guest

    Messages:
    19,040
    Likes Received:
    13
    GPU:
    GIGABYTE Radeon R9 280
    Yes, lowering IQ for faster FPS could be considered cheating, but really, if you're in a fast-paced battle, would you really notice the difference? I think I'd be more focused on not getting my ass killed online than on saying "there are missing textures here, the AF is off here".

    I think ATi set it to "Quality" by default since most people don't give a crap because they don't usually notice it.

    deltatux
     
  18. ClaymoreMD

    ClaymoreMD Member

    Messages:
    25
    Likes Received:
    0
    GPU:
    Asus 4870x2
    My opinion is that both cards should be tested under the same visual quality conditions. If that means ATI on High Quality and Nvidia on Quality, then so be it, assuming both look the same. Another option would be to put a note under every benchmark that there is a slight but noticeable difference in visual quality between the cards.

    Even if this is more of a marketing issue than a real issue, I believe the competition should be fair and transparent. It is possible that the evolution of graphics cards will lead to a situation where the competitors offer such different solutions and optimizations that it will be impossible to compare them under the same settings. I say test at the same visual quality, even if that means different settings, or at default settings with a note in every test and review, not just this article. People need to know what they are buying.

    I still like ATI, will buy it again and leave this optimization on because with anything but these preselected images, it will be nearly impossible to notice. But the difference is there and competition should be as fair as possible.
     
  19. |Ano|

    |Ano| Master Guru

    Messages:
    275
    Likes Received:
    0
    GPU:
    GTX560Ti Twin FrozR II
    Well, since they compare a 6870 (budget card) to a GTX 580 (premium enthusiast card), I think the comparison fails before it has even started.

    BUT!

    I do not like what I'm hearing. If this is true, my next card will be from Nvidia, since AMD pretty much scammed every single customer by releasing the 6850 and 6870 with tweaked drivers. Those numbers were not accurate since they lowered the IQ.
     
  20. Vistixx

    Vistixx Member

    Messages:
    16
    Likes Received:
    0
    GPU:
    2x GTX 460
    And since when does image quality differ between a budget card and a "premium enthusiast card"?
    Also, I wouldn't call a 6870 a budget card really; it's as fast as a 5850 and costs about 200 bucks :p
     
