Testing NVIDIA vs. AMD Image Quality

Discussion in 'Frontpage news' started by MAD-OGRE, Nov 20, 2010.

  1. MAD-OGRE

    MAD-OGRE Ancient Guru

    Messages:
    2,905
    Likes Received:
    0
    GPU:
    SLI EVGA 780 Classifieds
    Saw this today, I have no idea if this is true.

    11/19/2010: Testing NVIDIA vs. AMD Image Quality
    By Nick Stam, posted Nov 19 2010 at 12:00:00 PM



    PC gaming enthusiasts understand image quality (IQ) is a critical part of the PC gaming experience. They frequently upgrade their GPUs to play the latest games at high frame rates, while also dialing up the display resolution and graphical IQ effects to make their games both look and play great. Image quality is important, and if it were not important, we’d all be playing at 10x7 with no AA!

    Important Benchmarking Issues and Questionable Optimizations
    We are writing this blog post to bring broader attention to some very important image quality findings uncovered recently by top technology Web sites including ComputerBase, PC Games Hardware, Tweak PC, and 3DCenter.org. They all found that changes introduced in AMD’s Catalyst 10.10 default driver settings caused an increase in performance and a decrease in image quality. These changes in AMD’s default settings do not permit a fair apples-to-apples comparison to NVIDIA default driver settings. NVIDIA GPUs provide higher image quality at default driver settings, which means comparative AMD vs. NVIDIA testing methods need to be adjusted to compensate for the image quality differences.

    What Editors Discovered
    Getting directly to the point, major German Tech Websites ComputerBase and PC Games Hardware (PCGH) both report that they must use the “High” Catalyst AI texture filtering setting for AMD 6000 series GPUs instead of the default “Quality” setting in order to provide image quality that comes close to NVIDIA’s default texture filtering setting. 3DCenter.org has a similar story, as does TweakPC. The behavior was verified in many game scenarios. AMD obtains up to a 10% performance advantage by lowering their default texture filtering quality according to ComputerBase.

    AMD’s optimizations weren’t limited to the Radeon 6800 series. According to the review sites, AMD also lowered the default AF quality of the HD 5800 series when using the Catalyst 10.10 drivers, such that users must disable Catalyst AI altogether to get default image quality closer to NVIDIA’s “default” driver settings.

    Going forward, ComputerBase and PCGH both said they would test AMD 6800 series boards with Cat AI set to “High”, not the default “Quality” mode, and they would disable Cat AI entirely for 5800 series boards (based on their findings, other 5000 series boards do not appear to be affected by the driver change).

    Filter Tester Observations
    Readers can observe AMD GPU texture shimmering very visibly in videos posted at TweakPC. The popular Filter Tester application from 3DCenter.org was used with its “ground2” texture (located in the Program Files/3DCenter Filter Tester/Textures directory), and texture movement parameters were set to -0.7 in both X and Y directions with 16xAF enabled. Each video shows the split-screen rendering mode of the Filter Tester application, where the GPU under test is on the left side, and the “perfect” software-based ALU rendering is on the right side. (Playing the videos with Firefox or Google Chrome is recommended). NVIDIA GPU anisotropic quality was also tested and more closely resembles the perfect ALU software-based filtering. Problems with AMD AF filtering are best seen when the textures are in motion, not in static AF tests, thus the “texture movement” settings need to be turned on in the Filter Tester. In our own testing with Filter Tester using similar parameters, we have seen that the newly released Catalyst 10.11 driver also has the same texture shimmering problems on the HD 5870. Cat 10.11 does not work with HD 6000 series boards as of this writing.
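
    As an aside, the kind of split-screen comparison described above can be approximated offline by diffing a captured GPU frame against the software-reference frame. The following is a minimal sketch, not part of the Filter Tester tool itself; the file names and the mean-absolute-difference metric are illustrative assumptions.

    ```python
    # Illustrative sketch only: quantify how far a GPU-rendered frame deviates
    # from a software ("ALU") reference frame, in the spirit of the Filter
    # Tester's split-screen comparison. File names are hypothetical.
    import numpy as np
    from PIL import Image

    def mean_abs_diff(gpu_frame_path: str, reference_frame_path: str) -> float:
        """Mean absolute per-channel difference between two same-sized frames."""
        gpu = np.asarray(Image.open(gpu_frame_path).convert("RGB"), dtype=np.float32)
        ref = np.asarray(Image.open(reference_frame_path).convert("RGB"), dtype=np.float32)
        if gpu.shape != ref.shape:
            raise ValueError("Frames must have identical dimensions")
        return float(np.abs(gpu - ref).mean())

    # Over a sequence of moving-texture frames, consistently larger values would
    # indicate filtering that strays further from the software reference.
    # print(mean_abs_diff("gpu_frame_0001.png", "alu_reference_0001.png"))
    ```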

    AF Tester Observations
    ComputerBase also says that AMD drivers appear to treat games differently than the popular “AF Tester” (anisotropic filtering) benchmark tool from 3DCenter.org. They indicate that lower quality anisotropic filtering is used in actual games, but higher quality anisotropic filtering is displayed when the AF Tester tool is detected and run. Essentially, the anisotropic filtering quality highlighted by the AF Tester tool on AMD GPUs is not indicative of the lower quality of anisotropic filtering seen in real games on AMD GPUs.

    NVIDIA’s own driver team has verified specific behaviors in AMD’s drivers that tend to affect certain anisotropic testing tools. Specifically, AMD drivers appear to disable texture filtering optimizations when smaller window sizes are detected, like the AF Tester tool uses, and they enable their optimizations for larger window sizes. The definition of “larger” and “smaller” varies depending on the API and hardware used. For example with DX10 and 68xx boards, it seems they disable optimizations with window sizes smaller than 500 pixels on a side. For DX9 apps like the AF Tester, the limit is higher, on the order of 1000 pixels per side. Our driver team also noticed that the optimizations are more aggressive on RV840/940 than RV870, with optimizations performed across a larger range of LODs for the RV840/940.
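
    To make the claimed heuristic concrete, here is a minimal sketch of the window-size check described above. The thresholds come from the paragraph (roughly 500 pixels per side for DX10 on 68xx boards, roughly 1000 for DX9); the function is purely illustrative and is not actual driver code.

    ```python
    # Purely illustrative model of the claimed window-size heuristic; the
    # thresholds are the approximate figures quoted above, not verified values.
    def optimizations_active(api: str, width: int, height: int) -> bool:
        """True if the texture filtering optimizations would stay enabled."""
        threshold = 1000 if api == "DX9" else 500  # DX9 (e.g. AF Tester) vs. DX10
        return min(width, height) >= threshold

    # A small DX9 AF Tester window falls below the limit (optimizations off),
    # while a full-screen game stays above it (optimizations on):
    # optimizations_active("DX9", 512, 512)     -> False
    # optimizations_active("DX10", 1920, 1080)  -> True
    ```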

    FP16 Render Observations
    In addition to the above recent findings, for months AMD had been performing a background optimization for certain DX9 applications where FP16 render targets are demoted to R11G11B10 render targets, which are half the size and less accurate. When recently exposed publicly, AMD finally provided a user-visible control panel setting to enable or disable it, but the demotion is enabled by default. Reviewers and users testing DX9 applications such as Need for Speed Shift or Dawn of War 2 should uncheck the “Enable Surface Format Optimization” checkbox in the Catalyst AI settings area of the AMD control panel to turn off FP16 demotion when conducting comparative performance testing.
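
    For context, the difference between the two formats is easy to see numerically: an FP16 (RGBA16F) target stores 64 bits per pixel with 10 explicit mantissa bits per channel, while R11G11B10 stores 32 bits per pixel with only 6 (R, G) or 5 (B) mantissa bits and no alpha. The sketch below is a rough illustration of the precision loss; the quantization helper ignores the formats' exponent-range limits and the sample value is arbitrary.

    ```python
    # Rough illustration of per-channel precision in FP16 vs. R11G11B10 render
    # targets. Only mantissa precision is modeled; exponent range and the lack
    # of a sign bit in R11G11B10 are ignored for simplicity.
    import math

    def quantize(x: float, mantissa_bits: int) -> float:
        """Round x to a float with the given number of explicit mantissa bits."""
        if x == 0.0:
            return 0.0
        m, e = math.frexp(x)                 # x = m * 2**e, with 0.5 <= |m| < 1
        scale = 1 << (mantissa_bits + 1)
        return math.ldexp(round(m * scale) / scale, e)

    x = 0.7137                               # arbitrary channel value
    print(quantize(x, 10))  # FP16 channel (10 mantissa bits)   -> 0.7138671875
    print(quantize(x, 6))   # R11/G11 channel (6 mantissa bits) -> 0.7109375
    print(quantize(x, 5))   # B10 channel (5 mantissa bits)     -> 0.71875
    ```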

    A Long and Winding Road
    For those with long memories, NVIDIA learned some hard lessons when GeForce FX and 3DMark03 optimizations went bad, and vowed to never again perform any optimizations that could compromise image quality. During that time, the industry agreed that any optimization that improved performance but did not alter IQ was in fact a valid “optimization”, and any optimization that improved performance but lowered IQ, without letting the user know, was a “cheat”. Special-casing of testing tools should also be considered a “cheat”.

    Both NVIDIA and AMD provide various control panel knobs to tune and tweak image quality parameters, but there are some important differences -- NVIDIA strives to deliver excellent IQ at default control panel settings, while also ensuring the user experiences the image quality intended by the game developer. NVIDIA will not hide optimizations that trade off image quality to obtain faster frame rates. Similarly, with each new driver release, NVIDIA will not reduce the quality of default IQ settings, unlike what appears to be happening with our competitor, per the stories recently published.

    We are glad that multiple top tech sites have published their comparative IQ findings. If NVIDIA published such information on our own, without third-party validation, much of the review and technical community might just ignore it. A key goal in this blog is not to point out cheats or “false optimizations” in our competitor’s drivers. Rather it is to get everyone to take a closer look at AMD’s image quality in games, and fairly test our products versus AMD products. We also want people to beware of using certain anisotropic testing tools with AMD boards, as you will not get image quality results that correspond with game behavior.

    AMD promotes “no compromise” enthusiast graphics, but it seems multiple reviewers beg to differ.

    We have had internal discussions about whether we should abandon our position of never reducing image quality behind your back, as AMD is doing. We believe our customers would rather we focus our resources to maximize performance and provide an awesome, immersive gaming experience without compromising image quality, than engage in a race to the IQ gutter with AMD.

    We’re interested to know what you think here in the comments or on the NVIDIA forums.



    http://blogs.nvidia.com/ntersect/2010/11/testing-nvidia-vs-amd-image-quality.html
     
  2. TheHunter

    TheHunter Banned

    Messages:
    13,404
    Likes Received:
    1
    GPU:
    MSi N570GTX TFIII [OC|PE]
    lol.. it's been posted in 3 threads, I think we don't need a fourth
     
  3. Indeo

    Indeo Ancient Guru

    Messages:
    1,552
    Likes Received:
    0
    GPU:
    XFX 5970@950\1200 + GT240
    Idk if I should really comment on this... Probably not.
     
  4. Mannerheim

    Mannerheim Ancient Guru

    Messages:
    4,915
    Likes Received:
    95
    GPU:
    MSI 6800XT
    So. Someone found something you cannot see without using a magnifying glass. lol

    This is nothing compared to the XGI Volari cheating =)
     

  5. Sash

    Sash Ancient Guru

    Messages:
    6,947
    Likes Received:
    0
    GPU:
    video
    Nick Stam
    Technical Marketing Director

    no comment
     
  6. Undying

    Undying Ancient Guru

    Messages:
    25,330
    Likes Received:
    12,743
    GPU:
    XFX RX6800XT 16GB
    Indeed. I never had an ATI card, but if you can't see/tell the difference in image quality and you get better performance, who cares?
     
  7. mameira

    mameira Guest

    Messages:
    1,425
    Likes Received:
    1
    GPU:
    eVGA 470 GTX
  8. TheHunter

    TheHunter Banned

    Messages:
    13,404
    Likes Received:
    1
    GPU:
    MSi N570GTX TFIII [OC|PE]
    nvidia is using the same bilinear anisotropic filtering to speed things up, except they're all quiet about it..

    FarCry2, Fear 1 and 2, a few NFS games, Call of Juarez 2: here it's painfully obvious that even after you force trilinear in the cfg, it's still the same bilinear aniso filtering..


    this only popped up so they could turn people around and get them to buy more nv gpus lol..
     
  9. morbias

    morbias Don TazeMeBro

    Messages:
    13,444
    Likes Received:
    37
    GPU:
    -
    Isn't this irrelevant now that the CCC allows you to disable the optimisations separately from Crossfire?

    Maybe when Crossfire was linked to the AI slider it would have been more valid.
     
  10. Tat3

    Tat3 Ancient Guru

    Messages:
    11,863
    Likes Received:
    238
    GPU:
    RTX 4070 Ti Super
    But hey, that was confirmed by a few review sites too, like 3DCenter.org, TweakPC, ComputerBase and PC Games Hardware (PCGH)...
     

  11. chaotic1

    chaotic1 Ancient Guru

    Messages:
    2,841
    Likes Received:
    0
    GPU:
    PowercolorHD6970 2GB dead
    here we go again, the pot calling the kettle black lol
     
  12. TheHunter

    TheHunter Banned

    Messages:
    13,404
    Likes Received:
    1
    GPU:
    MSi N570GTX TFIII [OC|PE]
    so what, if they put nv under the magnifying glass you would be surprised that they aren't any better, if not even more back-stabbing.

    Like this Transparency (Tr) MSAA: instead of traditional multisampling on older gpus prior to GeForce 400, it's now some kind of halfway TrMSAA with a worse image. It started with the cuda 3.x.x / 256+ drivers; when the driver detects a non-Fermi gpu it switches to this uglier technique, but Fermi uses normal multisampling.. if you test older 190+ drivers with cuda 2.x.x, it uses normal transparency multisampling aa..


    normal TrMSAA is the first one that's disabled; now with cuda 3.x.x it's the 2nd
    - ALL_MODE_REPLAY_MODE_ALPHA_TEST, and that has a worse image.. normal Tr supersampling is untouched.


    But you wouldn't know about it if you didn't check with nv inspector, and nv was all quiet about it.. all they claimed was that with 256+ it would be faster (and I'm talking about the whole driver in general), and that was it..

    so yea this article is plain anti-AMD bs.
     
    Last edited: Nov 20, 2010
  13. MAD-OGRE

    MAD-OGRE Ancient Guru

    Messages:
    2,905
    Likes Received:
    0
    GPU:
    SLI EVGA 780 Classifieds
    Yea, sorry if this was already posted, I guess I did not word it the same way.
    I 100% agree it was to throw off on ATI, I just had not seen or heard anything about it, guess I had not looked hard enough.

    Feel free to lock or delete it
     
  14. alanm

    alanm Ancient Guru

    Messages:
    12,233
    Likes Received:
    4,435
    GPU:
    RTX 4080
    It wasn't "confirmed" by these other sites. These sites tested and reported it BEFORE Nvidia had said anything about it.
     
  15. nizzen

    nizzen Ancient Guru

    Messages:
    2,414
    Likes Received:
    1,149
    GPU:
    3x3090/3060ti/2080t
    Pictures or it did not happen.

    We need many comparison pictures, not BS text...
     

  16. Konrad321

    Konrad321 Banned

    Messages:
    259
    Likes Received:
    0
  17. Omagana

    Omagana Guest

    Messages:
    2,532
    Likes Received:
    1
    GPU:
    RTX 2070 Super
    Sigh...another IQ thread.

    Personally I'd lock them on sight.
     
  18. sykozis

    sykozis Ancient Guru

    Messages:
    22,492
    Likes Received:
    1,537
    GPU:
    Asus RX6700XT
    Image Quality is dependent on the user. It's impossible to actually compare 2 screen caps and determine which has better image quality. All this fanboy **** seriously needs to stop....
     
  19. Zareph

    Zareph Ancient Guru

    Messages:
    2,626
    Likes Received:
    0
    GPU:
    Sapphire NITRO+ RX480 4GB
    Nick Stam

    This was posted by an Nvidia employee. On an official Nvidia blog. This is not credible news, it's biased as hell.
     
  20. vbetts

    vbetts Don Vincenzo Staff Member

    Messages:
    15,140
    Likes Received:
    1,743
    GPU:
    GTX 1080 Ti
    I'm ashamed of you guys. I enjoy playing video games under a magnifying glass.
     
