Anti-aliasing question: big difference vs Radeon?

Discussion in 'Videocards - NVIDIA GeForce' started by AzzKickr, Sep 3, 2015.

  1. AzzKickr

    AzzKickr Guest

    Messages:
    141
    Likes Received:
    6
    GPU:
    Vega64
    Hi guys,

    I have something strange I cannot explain, probably because I don't quite understand the technique behind it.

Situation: an OpenGL-based game, two nearly identical machines. One runs a GTX 580, the other the more recent R9 290 from team red.

The R9 produces an absolutely stunning image with supersampled anti-aliasing enabled. It's razor sharp and free of jaggies.

The GTX 580, on the other hand, turns the whole scene into a blurry mess when supersampled, almost like looking under water.

So what is the reason behind this? Does the newer Radeon use a newer technique to achieve that result?

For the GTX 580 I use NVIDIA Inspector to enable transparency supersampling; for the R9 I use RadeonPro and set the anti-aliasing method to supersampling.
     
  2. Guzz

    Guzz Member Guru

    Messages:
    171
    Likes Received:
    66
    GPU:
    RTX 4080
    What game is this?

    Screenshots?
     
  3. AzzKickr

    AzzKickr Guest

    Messages:
    141
    Likes Received:
    6
    GPU:
    Vega64
In this specific case it's the Sikkmod version of Doom³ (I'm an all-time Doom/Quake fan). I only mentioned that it's an OpenGL-based game because I've noticed some AA techniques are sometimes not available in OpenGL.

    I'll post some screenshots from my own machines as soon as I get home.

In the meantime I did some more research and found out that as of Catalyst 12.11 the Radeons are capable of something called "SGSSAA" in OpenGL. According to what I can read (it's in German and I'm Dutch; the two languages are fairly similar), this should be the reason why the Radeon has such astonishing image quality over the GTX 580 when doing supersampling.

    http://www.pcgameshardware.de/AMD-R...s/Radeon-Supersample-AA-unter-OpenGL-1032919/

I believe SGSSAA stands for Sparse Grid Supersampled Anti-Aliasing, but I'm not that knowledgeable about 3D techniques and don't fully understand why Radeons can pull it off and GeForce cards can't.
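For anyone curious about the name: "sparse grid" refers to where the sub-pixel samples sit. Here is a toy sketch of the difference between an ordered grid and a sparse (rotated) grid; this is my own illustration of the idea, not actual driver code, and the sample positions are schematic:

```python
# Toy sketch of sample placement inside one pixel (a unit square).
# My own illustration of the concept, not actual driver code.

def ordered_grid(n_per_axis):
    """n_per_axis^2 samples on a regular sub-grid (classic ordered-grid SSAA)."""
    step = 1.0 / n_per_axis
    return [((i + 0.5) * step, (j + 0.5) * step)
            for j in range(n_per_axis) for i in range(n_per_axis)]

def sparse_grid_4x():
    """4 samples where no two share a row or column (rotated-grid style)."""
    cols = [0, 1, 2, 3]
    rows = [2, 0, 3, 1]  # a permutation: every row is used exactly once
    return [((c + 0.5) / 4.0, (r + 0.5) / 4.0) for c, r in zip(cols, rows)]

og = ordered_grid(2)   # 4 samples, but only 2 distinct x positions
sg = sparse_grid_4x()  # 4 samples covering 4 distinct x and y positions
print(len({x for x, _ in og}), len({x for x, _ in sg}))  # 2 4
```

With the same four samples, the sparse pattern covers four distinct horizontal and vertical positions instead of two, which is why it resolves near-vertical and near-horizontal edges better than an ordered grid of the same cost.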
     
    Last edited: Sep 3, 2015
  4. AzzKickr

    AzzKickr Guest

    Messages:
    141
    Likes Received:
    6
    GPU:
    Vega64
Found some more interesting stuff on the topic:

    "At this point you are pretty much set up for amazing IQ but we're not done yet. You must set the Sparse Grid Sample number to the exact same amount of multi-samples. In other words if you are using 4XSparse Gride you also MUST use 4xMSAA in the profile. Otherwise you will end up with a blurry mess."

    This comes from:

    http://www.overclock.net/t/1250100/nvidia-sparse-grid-supersampling
     

  5. AzzKickr

    AzzKickr Guest

    Messages:
    141
    Likes Received:
    6
    GPU:
    Vega64
I'm not sure if the above is correct, because I just tried it; with the two AA modes matched it's just as blurry as when they're not matched.

    I'm uploading screenshots right now.
     
  6. AzzKickr

    AzzKickr Guest

    Messages:
    141
    Likes Received:
    6
    GPU:
    Vega64
    The R9:

[screenshot]

    The GTX 580:

[screenshot]
     
  7. Veteran

    Veteran Ancient Guru

    Messages:
    12,094
    Likes Received:
    21
    GPU:
    2xTitan XM@1590Mhz-CH20
Looks like FXAA to me on the 580, not SGSSAA, which is far superior.
FXAA has heavy blur, as you can see here. The peasant's version of anti-aliasing.
     
  8. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,103
    Likes Received:
    2,606
    GPU:
    3080TI iChill Black
^
Nah, that's the normal TrAA supersampling effect in OpenGL id Tech, and yeah, it's ugly. I've seen it myself even on a GTX 780.

I would suggest you use downsampling instead, or try a custom TrAA flag.


Or a normal MSAA override with this flag. But that GTX 580 will be a bit weak at high values; try 4x max.
Or maybe even the old Wolfenstein TrAA - SGSS.

New "0x000010C1": no blur, more FPS
http://www.forum-3dcenter.org/vbulletin/showpost.php?p=8703440
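Since downsampling keeps coming up: the idea is simply to render at a higher resolution and filter the result down to the display resolution. A minimal sketch of the resolve step (a pure-Python toy with made-up names; a real driver does this on the GPU with a proper filter):

```python
# Toy sketch of what driver downsampling effectively does: render at 2x
# the target resolution, then average each 2x2 block down to one pixel.
# Pure-Python illustration, not how a driver actually implements it.

def downsample_2x(img):
    """img: list of rows of grayscale values; both dimensions divisible by 2."""
    h, w = len(img), len(img[0])
    out = []
    for y in range(0, h, 2):
        row = []
        for x in range(0, w, 2):
            block = (img[y][x] + img[y][x + 1] +
                     img[y + 1][x] + img[y + 1][x + 1])
            row.append(block / 4.0)
        out.append(row)
    return out

# A hard black/white edge at 2x resolution...
hi_res = [[0, 0, 0, 255]] * 4
# ...comes out with an intermediate gray at the edge, i.e. anti-aliased:
print(downsample_2x(hi_res))  # [[0.0, 127.5], [0.0, 127.5]]
```

The averaging is what smooths the jaggies; unlike the broken TrAA path, it blurs only at geometry edges, not across the whole frame.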
     
    Last edited: Sep 4, 2015
  9. GanjaStar

    GanjaStar Guest

    Messages:
    1,146
    Likes Received:
    2
    GPU:
    MSI 4G gtx970 1506/8000
Tried this?

[screenshot]
     
  10. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,103
    Likes Received:
    2,606
    GPU:
    3080TI iChill Black
    ^
That's how it was in my case too; TrAA 2x supersampling blurred the least, but it still blurred.
     

  11. AzzKickr

    AzzKickr Guest

    Messages:
    141
    Likes Received:
    6
    GPU:
    Vega64
No, it's not FXAA; FXAA doesn't blur that much. That's what I had been using until I saw the image quality of the Radeon's SGSSAA and wanted to give that a go on the 580 instead.
     
  12. AzzKickr

    AzzKickr Guest

    Messages:
    141
    Likes Received:
    6
    GPU:
    Vega64
I doubt it has anything to do with id Tech 4; the Radeon pulls off astonishing quality, not even slightly blurred. Either the GeForce is configured incorrectly, which I'm still testing, or its drivers are just not capable.
     
  13. AzzKickr

    AzzKickr Guest

    Messages:
    141
    Likes Received:
    6
    GPU:
    Vega64
That is what the card was set to when taking the screenshot. Both 2x and 4x settings create a huge blur, like looking under water. Not even playable, imo.
Both the Radeon and the GeForce take roughly the same percentage performance hit, though, so they are doing similar work.
     
  14. AzzKickr

    AzzKickr Guest

    Messages:
    141
    Likes Received:
    6
    GPU:
    Vega64
Did some more tests; sparse grid is definitely not working on the GTX, no matter what guidelines I follow or compatibility bits I use. I tried different drivers, even different machines, but no success.

There seems to be a conflicting post-processing shader, because when I turn all of those off in-game, there is only a very slight blur with SSAA.
But that also turns off all the eye candy, so that's not an option either.

I guess I'll have to play on the machine with the R9 for maximum quality and revert to FXAA with some negative LOD on the GTX 580 :bang:
     
  15. Phoenix VII

    Phoenix VII Guest

    Messages:
    3
    Likes Received:
    0
    GPU:
    EVGA GTX 970
    From what I read in the NVIDIA compatibility bits master thread and IQ guide it looks like you need to set the Antialiasing - Mode to Enhance instead of Override, set Texture filtering - Driver Controlled LOD Bias to Off, and set Texture filtering - LOD Bias (OGL) to -1.0000.
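On the LOD bias part of that suggestion: it matters because texture sharpness depends on which mip level the sampler picks. Very roughly (a simplified sketch with made-up names, not actual hardware or driver code), the level is the log2 of the texel footprint plus the bias, so a negative bias shifts sampling toward sharper mips:

```python
import math

# Simplified sketch of mip-level selection with an LOD bias.
# Not actual hardware/driver code; real GPUs compute the footprint
# from screen-space texture-coordinate derivatives.

def mip_level(footprint, bias=0.0, max_level=10):
    """log2(texel footprint) plus LOD bias, clamped to the mip range."""
    level = math.log2(footprint) + bias
    return max(0.0, min(float(max_level), level))

print(mip_level(4.0))             # 2.0 -> a blurrier mip
print(mip_level(4.0, bias=-1.0))  # 1.0 -> one level sharper
```

That is why a -1.0 bias can visibly counteract supersampling blur: every texture fetch is pushed one mip level toward the sharp end.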
     

  16. AzzKickr

    AzzKickr Guest

    Messages:
    141
    Likes Received:
    6
    GPU:
    Vega64
    Thanks for your suggestion.

It didn't work though; the result is the same.
     
  17. Veteran

    Veteran Ancient Guru

    Messages:
    12,094
    Likes Received:
    21
    GPU:
    2xTitan XM@1590Mhz-CH20
You can use a negative LOD bias to eliminate the blur, I would have thought... around -0.750 for 2x SS.
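For reference, the rule of thumb usually quoted in the guides is a LOD bias of -0.5 × log2(sample count). A quick sketch (this is the community rule of thumb, not an official NVIDIA figure):

```python
import math

# Community rule of thumb for SSAA texture LOD bias: -0.5 * log2(N),
# where N is the number of samples. Not an official vendor formula.

def ssaa_lod_bias(samples):
    return -0.5 * math.log2(samples)

print(ssaa_lod_bias(2))  # -0.5
print(ssaa_lod_bias(4))  # -1.0
print(ssaa_lod_bias(8))  # -1.5
```

That gives -0.5 for 2x and -1.0 for 4x; values around -0.75 also circulate on the forums, so some experimentation per game is normal.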
     
  18. AzzKickr

    AzzKickr Guest

    Messages:
    141
    Likes Received:
    6
    GPU:
    Vega64
It should, and probably does, but there's a conflicting post-processing shader in the mod.
There is no conflict when SGSSAA is used, but no matter what I do or try, I cannot get SGSSAA to work on the GTX 580, while on the R9 it's just a matter of enabling it in the control panel.
     
  19. Veteran

    Veteran Ancient Guru

    Messages:
    12,094
    Likes Received:
    21
    GPU:
    2xTitan XM@1590Mhz-CH20
For NVIDIA cards you need a custom AA flag for it to work.
     
  20. AzzKickr

    AzzKickr Guest

    Messages:
    141
    Likes Received:
    6
    GPU:
    Vega64
I thought so too, but I guess I'm a bit disappointed that the AA flag "advertised" as the one for Doom 3 (id Tech 4 engines) is not working :(
     
