Exploring ATI's Image Quality Optimizations [Guru3D.com]

Discussion in 'Frontpage news' started by Guru3D News, Dec 2, 2010.

  1. Guru3D News

    Guru3D News Ancient Guru

    Messages:
    6,462
    Likes Received:
    0
    A somewhat heated topic amongst graphics card manufacturers is how to get as much performance out of a graphics card with as little as possible image quality loss. In the past both ATI and NVIDIA have...

    More...
     
  2. Mkilbride

    Mkilbride Banned

    Messages:
    8,057
    Likes Received:
    2
    GPU:
    EVGA GTX470 1.2GB
    So for every ATi review, basically I'll take 10% performance off and that'll be the real performance.
     
  3. k3vst3r

    k3vst3r Ancient Guru

    Messages:
    3,702
    Likes Received:
    177
    GPU:
    KP3090
    pretty much
     
  4. Matt26LFC

    Matt26LFC Ancient Guru

    Messages:
    3,123
    Likes Received:
    67
    GPU:
    RTX 2080Ti
    Unless the review states that the image quality setting has been changed to HQ. For reviews on this site you'd deduct around 8%, as I believe Hilbert said he leaves it on its default setting.

    Don't think it bothers me too much that they've done this; I know I can manually change the setting to HQ if I want to. Not that I own an AMD card anyway :)
     

  5. JonasBeckman

    JonasBeckman Ancient Guru

    Messages:
    17,564
    Likes Received:
    2,961
    GPU:
    XFX 7900XTX M'310
    Hasn't it been like this for a really long time, though? I don't know much about this, and as an older-generation ATI card user I don't have the new AI options, but isn't it the same technique as with the Mip-Map Filtering Quality option? That defaults to High rather than Very High, and similarly ATI's default has been "Balanced" instead of "Very High" in the default, non-advanced view of the CCC for a long time, hasn't it?
    (Can't say how it's changed for the ATI 5K and 6K series, but I imagine the defaults are comparable.)

    EDIT: NVIDIA is similar in a way too, no? Balanced defaults in the non-advanced view, though with more control over whether you let the application decide, override it with the driver defaults, or use a mix of both, along with application-profile settings which can also apply optimizations.
    (Mostly related to trilinear and AF optimizations, if the view on the second computer with its 7800 GTX is still accurate: they're disabled and greyed out when switched to High Quality instead of the default Quality option.)
     
    Last edited: Dec 2, 2010
  6. Ven0m

    Ven0m Ancient Guru

    Messages:
    1,851
    Likes Received:
    31
    GPU:
    RTX 3080
    The problem arises when we compare two cards, one from AMD and one from NVIDIA, and AMD marginally wins. The first impression for quite a lot of people is that AMD is faster, which is not the case in this scenario.

    Because of that, readers should be explicitly informed that these tests are performed with different image quality settings, as we're not really comparing the cards in a 100% fair way. The other solutions are to decrease quality settings for NV cards (not cool) or to let NV get worse scores because they care more about IQ (not cool either).

    We may say that it's a rare case. It's not really visible in many third-person-perspective or strategy games, but in games with a low camera angle it can be annoying. If you play racing games, MMOs, the FIFA series, etc., then comparing AMD and NV cards at default settings without an IQ notice is just unfair.
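
    A rough way to sanity-check such a result is to normalize the default-setting score before comparing, as in the minimal sketch below (Python, with made-up fps numbers; the ~8% gain is just the figure quoted earlier in this thread, not a measured value):

        # Hypothetical illustration: estimate what a Radeon score measured at the
        # default "Quality" filtering would be with optimizations disabled, so it
        # can be compared against an NVIDIA card tested at its defaults.
        ASSUMED_GAIN = 0.08  # assumed speedup from default optimizations (thread figure)

        def normalized_radeon_score(measured_fps: float) -> float:
            """Estimate the fps the card would get with optimizations disabled."""
            return measured_fps / (1.0 + ASSUMED_GAIN)

        nvidia_fps = 60.0   # hypothetical review result at driver defaults
        radeon_fps = 62.0   # hypothetical review result at driver defaults

        print(f"Radeon, estimated HQ: {normalized_radeon_score(radeon_fps):.1f} fps")
        print(f"NVIDIA, default:      {nvidia_fps:.1f} fps")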
     
  7. nicugoalkeper

    nicugoalkeper Master Guru

    Messages:
    927
    Likes Received:
    46
    GPU:
    ASUS GTX 1060 DUAL OC 6GB
    Sorry to say it, but ATI is doing this for speed, not quality! :bang::bang::bang:
    So +1 to NVIDIA (higher price, but better quality and other nice features) :banana::banana::banana:
     
  8. cold2010

    cold2010 Member Guru

    Messages:
    180
    Likes Received:
    0
    GPU:
    HD 4870
  9. John Dolan

    John Dolan Ancient Guru

    Messages:
    2,245
    Likes Received:
    0
    GPU:
    2x GTX 780 SLI
    I've used half a dozen of each brand over the last decade or so, and I've always thought that the ATI cards gave better IQ. The older NVIDIA 7 series used to look particularly bad in comparison.
     
  10. WhiteLightning

    WhiteLightning Don Illuminati Staff Member

    Messages:
    30,737
    Likes Received:
    3,898
    GPU:
    Inno3d RTX4070
    OK, it's clear: I've been a cheater for about 2 years now, aargh...
     

  11. Undying

    Undying Ancient Guru

    Messages:
    25,207
    Likes Received:
    12,611
    GPU:
    XFX RX6800XT 16GB
    No one is a cheater; people just get what they pay for.
     
  12. k3vst3r

    k3vst3r Ancient Guru

    Messages:
    3,702
    Likes Received:
    177
    GPU:
    KP3090
    HH said recently:

    Currently, with the Radeon HD 6000 series release, the Catalyst drivers (Catalyst AI) have a new setting which allows control over texture filtering, with settings for 'High Quality', 'Quality' and 'Performance'.

    High Quality turns off all optimizations and lets the software run exactly as it was originally intended to. Quality, which is now the default setting, applies some optimizations that AMD believes remain objective and keep the integrity of the image quality at a high level while gaining some performance. The last setting is Performance, which applies supplementary optimizations to gain even more performance.


    What HH is explaining is that the newest drivers have optimizations enabled by default, whereas NVIDIA's don't.
     
  13. Exodite

    Exodite Guest

    Messages:
    2,087
    Likes Received:
    276
    GPU:
    Sapphire Vega 56
    This is getting tiresome.

    Both vendors use optimizations in their default settings that can adversely affect image quality. Both vendors allow such optimizations to be disabled.

    It's completely inane to apply some kind of scaling pulled entirely from the backside to 'compensate' for this when testing different cards. Testing should obviously be done at default settings and the image quality compared.

    If there are obvious differences in image quality, they have to be accounted for when reviewing a product, of course, but reading quite a lot of reviews both here and on other sites, it's clear there are no such differences.

    Bar the odd bug, obviously.
     
  14. VultureX

    VultureX Banned

    Messages:
    2,577
    Likes Received:
    0
    GPU:
    MSI GTX970 SLI
    The first comparison shot is not clear to me.

    Why does it say NV (16xQ) in the left screenshot (that looks like an NVIDIA-only anti-aliasing mode to me) and 16xAF in the right screenshot?

    The article does not make it any clearer:
    And further down:
    The left screenshot looks better, so I guess that would be NVIDIA... but do they use exactly the same AA and AF settings? Otherwise this would not be a fair comparison :s

    Another point would be the use of JPG compression... I'd take the comparison screenshots and save them as lossless PNG to do it right.
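
    Lossless captures would also let you diff the shots numerically instead of eyeballing them. A minimal sketch (Python with Pillow and NumPy, assuming two same-resolution PNG captures; the filenames are placeholders):

        # Sketch: quantify the difference between two same-size screenshots.
        import numpy as np
        from PIL import Image

        a = np.asarray(Image.open("radeon.png").convert("RGB"), dtype=np.float64)
        b = np.asarray(Image.open("geforce.png").convert("RGB"), dtype=np.float64)

        mse = np.mean((a - b) ** 2)
        psnr = float("inf") if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)

        print(f"MSE:  {mse:.2f}")
        print(f"PSNR: {psnr:.2f} dB")  # higher = closer; identical captures give infinity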
     
    Last edited: Dec 2, 2010
  15. TDurden

    TDurden Guest

    Messages:
    1,981
    Likes Received:
    3
    GPU:
    Sapphire R9 390 Nitro

  16. wolvy

    wolvy Member

    Messages:
    14
    Likes Received:
    0
    GPU:
    4570
    And what about users with lower-end video cards? I'm not a big quality freak and I don't play games a lot, but when I do, I want them to be playable, not running at 25 fps on ULTRA HIGH settings. That extra 8% of performance is more than welcome for me, "cheating" or not...
     
  17. Lane

    Lane Guest

    Messages:
    6,361
    Likes Received:
    3
    GPU:
    2x HD7970 - EK Waterblock
    I have one question anyway: have you done the FPS tests using an older driver as well as the new one with the new settings option? It looks like you only used HQ and Performance on the same driver to determine the difference. It's the same for NVIDIA: if you remove all optimizations in the driver (the per-game optimizations), you will see a similar drop in performance.

    It was also the case if you set Cat AI off with the old drivers (because then there are no longer any per-game profiles, so bug fixes, other fixes and game-specific optimizations will not work, including CrossFire profiles).

    If you want to compare the FPS lost or gained, you need to test Catalyst 10.9 or 10.10 set to the Cat AI "standard" setting, and then the new Catalyst set to "Quality", "High Quality", etc. Then you can see how much you gain or lose between "Quality" on the new driver and "standard" on the old one. I really doubt the difference is 8%; more likely 1% (so at most 1-2 fps, nothing that lets you say they cheat). A sketch of that comparison follows at the end of this post.

    Trackmania on the HD 5870 is the worst example, as that game exposes a problem in the AF algorithm that shows up on noisy textures. You can't use this game for a comparison, because whatever the reason, it is not due to an optimization or anything like that...

    Go read the 3DCenter article claiming AMD has a hardware AF problem on the HD 5870, HD 4870 and HD 3870, and you will see they already used TM...

    In reality, 3DCenter has been using TM and HL2 for three years now to claim there is a difference in AF between ATI and NVIDIA...

    This is one concern I have about all of this: why not use BC2, DiRT 2, F1 2010 or COD to check quality differences, instead of games like HL2 (and, strangely enough, not L4D), Quake 3, TM or Oblivion? For how many years have you seen a review that uses those games?
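
    The cross-driver check asked for above boils down to percentage deltas between runs. A minimal sketch (Python, with placeholder fps numbers standing in for the old driver's Cat AI "standard" and the new driver's "Quality"/"High Quality" results):

        # Sketch: compare fps across driver/setting combinations.
        # All numbers are placeholders, not measurements from the article.
        runs = {
            "old driver, Cat AI standard":   58.0,
            "new driver, Quality (default)": 58.5,
            "new driver, High Quality":      54.0,
        }

        baseline = runs["old driver, Cat AI standard"]
        for name, fps in runs.items():
            delta = (fps - baseline) / baseline * 100
            print(f"{name:31s} {fps:5.1f} fps  ({delta:+.1f}% vs old default)")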
     
    Last edited: Dec 2, 2010
  18. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    48,325
    Likes Received:
    18,416
    GPU:
    AMD | NVIDIA
    Fixed; the right one is the Radeon, obviously.

    On the PNG/JPG question: a full-blown 24-bit PNG or even BMP is 13 MB per image, so I opted for JPG at maximum quality (100%). With the screenshots at 2560x1600 pixels you will not notice the difference.
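
    For anyone who wants to verify that on their own captures, one quick check is to re-encode a lossless grab at maximum-quality JPEG and measure the worst per-pixel deviation. A minimal sketch (Python with Pillow/NumPy; "capture.png" is a placeholder filename):

        # Sketch: check how far a maximum-quality JPEG deviates from the PNG original.
        import io
        import numpy as np
        from PIL import Image

        png = Image.open("capture.png").convert("RGB")

        buf = io.BytesIO()
        png.save(buf, format="JPEG", quality=100)   # re-encode at maximum JPEG quality
        jpg = Image.open(io.BytesIO(buf.getvalue())).convert("RGB")

        a = np.asarray(png, dtype=np.int16)
        b = np.asarray(jpg, dtype=np.int16)
        diff = np.abs(a - b)

        print(f"JPEG (quality=100) size: {len(buf.getvalue()):,} bytes")
        print(f"max per-channel delta:   {diff.max()}")
        print(f"mean per-channel delta:  {diff.mean():.3f}")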
     
  19. Darren Hodgson

    Darren Hodgson Ancient Guru

    Messages:
    17,181
    Likes Received:
    1,500
    GPU:
    NVIDIA RTX 4080 FE
    When I had my HD 5870 CFX setup, I would always up the (I think they were) texture filtering settings from Quality to High Quality so I was always aware that by not doing that I was compromising the image quality. Now I'm back with NVIDIA I noticed that they too default to a texture filtering setting of Quality and, again, I have raised that to High Quality at the cost of a little performance.

    So, unless I'm misunderstanding this article, both NVIDIA and AMD seem to apply the same texture filtering optimisations and neither use the best quality settings, it is up to the end-user to select them.
     
  20. k3vst3r

    k3vst3r Ancient Guru

    Messages:
    3,702
    Likes Received:
    177
    GPU:
    KP3090
    Quality on NVIDIA doesn't have texture filtering optimizations enabled.

    What HH is saying is that by default you get 100% image quality at the Quality setting on NVIDIA cards, but 99% image quality on ATI cards with the latest drivers.
     
