Exploring ATI's Image Quality Optimizations [Guru3D.com]

Discussion in 'Frontpage news' started by Guru3D News, Dec 2, 2010.

  1. chanw4

    chanw4 Guest

    Messages:
    2,362
    Likes Received:
    26
    GPU:
    NITRO+ RX6800XT SE
    A user said CrossFire and Catalyst AI seem to be separate in the latest 10.10e driver. As I don't run CrossFire I can't confirm it. That user also said it could be RadeonPro that makes it work.

    Anyway, both parties use optimizations in their drivers, but AMD's level of optimization causes 'very little' image quality degradation, which can't be clearly detected in normal gameplay unless you know where to look. And no image quality degradation was found with Nvidia's default optimizations.

    HH made it clear that he will use the drivers' defaults only, and that it is up to the driver makers to come clean about their optimizations.

    The 8% performance gain mentioned in the article compares AMD Quality with AMD High Quality. I'm not sure why people keep applying that 8% to Nvidia, as the numbers are clearly not directly comparable when we don't know how much optimization Nvidia uses at its Quality setting, or whether there is any optimization at all at AMD's or Nvidia's High Quality settings.
     
    Last edited: Dec 3, 2010
  2. MrBozack

    MrBozack Master Guru

    Messages:
    670
    Likes Received:
    0
    GPU:
    Gigabyte 7850OC Windforce
    Thanks for the information. It might explain why my machine hated the Catalyst 10.10 driver. I haven't seen a BSOD since Vista! :3eyes:

    I might try the 10.10e now.... hmm
     
  3. getsuga12

    getsuga12 Ancient Guru

    Messages:
    4,313
    Likes Received:
    0
    GPU:
    Geforce GTX870M
    I had an 8800GT before (which broke down and was replaced with a 9800GT), and I can honestly say that with certain games I felt the performance wasn't enough (e.g., Crysis). My gripe may be that most games are console ports with sub-par texture quality, but for games that have exclusive PC extras, like Metro 2033, I wouldn't want to miss out.
     
  4. Bo_Fox

    Bo_Fox Active Member

    Messages:
    57
    Likes Received:
    0
    GPU:
    4870 1GB, 8800GTX
    Crysis!! Well, maybe an HD 6990 could finally play Crysis with all of the goodies, heh! I'm waiting and waiting... to get an ideal card that could play the latest games in 3D on my 65" DLP HDTV that is 3D Vision-ready. The GTX 560 seems to be it.

    The next console generation will probably be DX12 and Blu-ray, finally. Then IQ would really matter for the next generation of PC games shipped on Blu-ray discs with 30+ GB of HQ textures. If only MS would require reference-quality 16x AF as part of the DX12 specification, with true trilinear filtering and zero shimmering or LOD tweaks, this whole IQ fuss would be a thing of the past. I can't believe there are still brilinear optimizations after 10-12 years of games offering trilinear filtering in their in-game menu options.
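    As an aside on the brilinear point, here is a minimal numeric sketch of my own (not from the article or any actual driver): true trilinear blends the two nearest mip levels across the whole fractional LOD range, while a "brilinear" shortcut stays on a single mip level near integer LODs and only blends inside a narrow band, which is exactly what produces a visible mip transition line. The band width here is a hypothetical tuning knob; real drivers choose their own thresholds.

        # Minimal sketch: true trilinear vs a "brilinear" shortcut.
        # 'lod' is the fractional level-of-detail; mip_a/mip_b stand in for
        # real bilinear texture taps from the two nearest mip levels.

        def trilinear_weight(lod):
            # True trilinear: the blend weight is simply the fractional LOD.
            return lod - int(lod)

        def brilinear_weight(lod, band=0.5):
            # Brilinear: pure bilinear near integer LODs, blending only inside
            # a narrow band around the 0.5 crossover (fewer texture taps).
            # 'band' is a made-up knob, not a real driver setting.
            f = lod - int(lod)
            lo, hi = 0.5 - band / 2, 0.5 + band / 2
            if f <= lo:
                return 0.0                   # finer mip only
            if f >= hi:
                return 1.0                   # coarser mip only
            return (f - lo) / (hi - lo)      # compressed blend region

        def filtered_sample(mip_a, mip_b, lod, weight_fn):
            # Blend two (fake) bilinear samples with the chosen scheme.
            w = weight_fn(lod)
            return mip_a * (1.0 - w) + mip_b * w

        for lod in (2.1, 2.4, 2.5, 2.6, 2.9):
            print(lod,
                  filtered_sample(1.0, 0.0, lod, trilinear_weight),
                  filtered_sample(1.0, 0.0, lod, brilinear_weight))

    The flat stretches in the brilinear output are where you would see plain bilinear on screen, and the narrower the band, the sharper the transition line between them.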
     

  5. morbias

    morbias Don TazeMeBro

    Messages:
    13,444
    Likes Received:
    37
    GPU:
    -
    Well, tbh I can see a clear difference between the Quality and High Quality texture filtering settings in the NVCP.
     
  6. sounar

    sounar Guest

    Messages:
    706
    Likes Received:
    1
    GPU:
    EVGA 1080
    Now, as another poster spotted (not me), I took the liberty of highlighting the difference in red: the lettering on the billboards on both the ATI and Nvidia sides. Clearly the ATI lettering is much sharper and actually readable, unlike the Nvidia lettering. Likewise, the background stadium, the highlighted stadium lights, and the metal grid bars running along the highlighted part of the stadium are on the whole much blurrier on the Nvidia side than on the ATI side... Nvidia optimizations, could it be? :wanker:
    My point is that both sides use optimizations, and I'm sure there are other things we simply don't know about. This whole thread is pointless.
    [IMG]
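    For anyone who wants to reproduce this kind of red-highlight comparison programmatically, here is a rough sketch of my own (not the method used above), assuming two same-size screenshots saved under the placeholder names ati.png and nvidia.png, with Pillow installed:

        # Rough sketch: mark in red every pixel where two same-size
        # screenshots differ by more than a threshold. Filenames are
        # placeholders, and the threshold is an arbitrary choice.
        from PIL import Image, ImageChops  # pip install Pillow

        THRESHOLD = 16  # per-channel difference (0-255) considered "noticeable"

        a = Image.open("ati.png").convert("RGB")
        b = Image.open("nvidia.png").convert("RGB")

        # Per-pixel absolute difference, collapsed to one channel.
        diff = ImageChops.difference(a, b).convert("L")
        mask = diff.point(lambda px: 255 if px > THRESHOLD else 0)

        # Paint the differing pixels red on top of the first screenshot.
        overlay = Image.new("RGB", a.size, (255, 0, 0))
        Image.composite(overlay, a, mask).save("diff_highlight.png")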

     
    Last edited: Dec 4, 2010
  7. Bo_Fox

    Bo_Fox Active Member

    Messages:
    57
    Likes Received:
    0
    GPU:
    4870 1GB, 8800GTX
    No, it's not pointless. It's high time we put a stop to undue optimizations that cannot be disabled. IMHO, the default setting should be ideal reference AF quality with true trilinear filtering (i.e., High Quality) for high-end cards that cost $500. When people spend $500 on a video card, I think they want the best IQ, at the very least the true trilinear filtering that has been one of the more common in-game menu options for more than 10 years now.

    Maybe the NV shot is also at the default Quality setting, or maybe the slightly blurry parts are at angles where the 16x AF is not perfectly angle-independent. I'd still choose NV's IQ over ATI's any time, due to shimmering that doesn't show up in still screenshots. Plus the mipmap line that marks the bilinear stage of the transition always stays about 10 feet in front of you, reminding you of the Doom 2 days. Well, Doom 2 was fun anyway, wasn't it!
     
  8. TitusTroy

    TitusTroy Guest

    Messages:
    121
    Likes Received:
    2
    GPU:
    Gigabyte GTX 1070
    I don't understand...the Catalyst AI setting has always defaulted to 'Standard'...it was never turned off by default in any previous driver...isn't the new AI setting of 'Quality' the same as the previous default of 'Standard'?...so why the big uproar now?
     
    Last edited: Dec 4, 2010
  9. alanm

    alanm Ancient Guru

    Messages:
    12,270
    Likes Received:
    4,472
    GPU:
    RTX 4080
    Not sure if this was addressed, but why is 16xQ (AA) being used vs 16xAF in the pics?
     
  10. kapu

    kapu Ancient Guru

    Messages:
    5,418
    Likes Received:
    802
    GPU:
    Radeon 7800XT
    Awesome spot.

    Anyone wanna comment on that?

    Hilbert, what do you think?
     

  11. alanm

    alanm Ancient Guru

    Messages:
    12,270
    Likes Received:
    4,472
    GPU:
    RTX 4080
    16xQ vs 16xAF?
     
  12. Evasive

    Evasive Member

    Messages:
    18
    Likes Received:
    0
    GPU:
    EVGA 580 GTX

    I think it's the 16x AF Quality setting on Nvidia vs the 16x AF Quality setting on AMD.

    Which proves my point: both companies do it, and have been doing it.

    Which is why I don't understand why an article was made to specifically single out ATI/AMD when Nvidia does it too.

    Is someone's pocket being lined here?
     
  13. kapu

    kapu Ancient Guru

    Messages:
    5,418
    Likes Received:
    802
    GPU:
    Radeon 7800XT
    I wouldn't jump to such conclusions that fast.

    I'd rather wait for Hilbert's comment.

    Also, is it confirmed that the default setting for nVidia is Quality, not High Quality?

    If it is, then the article should be updated to reflect how things really stand.
     
  14. alanm

    alanm Ancient Guru

    Messages:
    12,270
    Likes Received:
    4,472
    GPU:
    RTX 4080
    Four separate German sites reported the ATI thing. I am quite sure that if someone had evidence Nvidia was still doing it at the same level ATI just was, it would be a much bigger story. Scandal seekers are more attracted to negative Nvidia stories imo. :nerd:

    Btw, I would also appreciate some clarification from HH on those 2 screenies.
     
  15. kapu

    kapu Ancient Guru

    Messages:
    5,418
    Likes Received:
    802
    GPU:
    Radeon 7800XT
    Aren't those two screenshots big proof already?

    Just look at the roof there, it's so much sharper on ATI.

    There is something going on there.
     

  16. Omagana

    Omagana Guest

    Messages:
    2,532
    Likes Received:
    1
    GPU:
    RTX 2070 Super
    I think some of you need to go outside more often.
     
  17. Redemption80

    Redemption80 Guest

    Messages:
    18,491
    Likes Received:
    267
    GPU:
    GALAX 970/ASUS 970
    On those two screenshots, sure, that one small section of the roof looks a little sharper, but the TrackMania banner looks like it's in a slightly different place; is it moving?

    But the difference on the grid is instantly noticeable rather than something you have to look for. I wonder which "optimisation" has the bigger performance hit, lol.
     
  18. TDurden

    TDurden Guest

    Messages:
    1,981
    Likes Received:
    3
    GPU:
    Sapphire R9 390 Nitro
    Yes, very easy to spot. Just load the image up in the free IrfanView software and press the H key; you can see the difference immediately, the details on the nVidia side look more blurred. Maybe it's motion blur??
     
  19. kapu

    kapu Ancient Guru

    Messages:
    5,418
    Likes Received:
    802
    GPU:
    Radeon 7800XT
    Good question. Let's test Quality vs Quality... oh wait, these are regular benchies.
     
  20. jskyg

    jskyg Master Guru

    Messages:
    434
    Likes Received:
    0
    GPU:
    powercolor 5850@775/1125
    I don't know about anyone else, but I've never used default settings; I just max all the settings and set AA and AF to application-controlled.

    Personally, I lower shadow detail and AF if I need a few more frames.

    I agree that all benches should be done at max settings; that way we remove the ATI/Nvidia "cheats". It's the only way to get a real result.
     
    Last edited: Dec 4, 2010
