ATI's SGSSAA better than Nvidia's? I don't think so...

Discussion in 'Videocards - AMD Radeon Drivers Section' started by evil_religion, Jul 5, 2011.

  1. evil_religion

    evil_religion Active Member

    Messages:
    94
    Likes Received:
    0
    GPU:
    Radeon 5870 Vapor-X
    Hello guys,
    in this thread I was confronted with the claim that ATI's SSAA is better than Nvidia's.
    I assumed this was simply wrong, but I wasn't content with just an assumption, so I temporarily replaced my Radeon 5870 with an old passively cooled Geforce 9600 GT.
    Unfortunately, it doesn't support DX11, so I couldn't take any DX11 SGSSAA screenshots.
    But DX9 can also be interesting.
    SGSSAA was set to 8x, AF to 16x and the texture LOD to -1.5.
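    For reference, the -1.5 follows the usual SGSSAA rule of thumb: with N samples, set the texture LOD bias to -0.5 * log2(N). A minimal sketch of that rule (the helper name is mine, not from any driver API):
    [CODE]
    import math

    def sgssaa_lod_bias(samples: int) -> float:
        # Rule of thumb: each doubling of the SGSSAA sample count allows
        # another -0.5 of LOD bias to restore texture sharpness.
        return -0.5 * math.log2(samples)

    for n in (2, 4, 8):
        print(f"{n}x SGSSAA -> LOD bias {sgssaa_lod_bias(n):+.1f}")
    # prints: 2x -> -0.5, 4x -> -1.0, 8x -> -1.5
    [/CODE]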

    Crysis x64:
    [IMG]
    I think the image quality is OK, taking into account that AA is usually a pain with this game. With my ATI I can change the AA settings in every possible manner; it doesn't help, the picture is still jaggy and flickering like hell.
    As you can see in the picture, with Nvidia SGSSAA almost all edges are processed and nothing is blurrier than without it.
    If you think there would still be many jaggies, just play it without SGSSAA and you will see the difference. It looks like sh*t without SGSSAA.

    I wish I could also show you a Crysis 2 screenshot, but the FPS is too low with 8x SGSSAA; it would take ages to get a screenshot at a decent position.
    But let me tell you: Crysis 2 with SGSSAA is no problem with a Nvidia card:
    http://translate.google.de/translat...rsampling-AA-im-PC-Shooter/Action-Spiel/Test/
    With ATI, no AA is possible at all, except for the in-game blur thingy...

    Far Cry x64 (with shader 3.0, HDR):
    [IMG]
    Nothing is jaggy, nothing is blurry. It couldn't look any better.
    With my ATI I can't get this game to work on Windows 7 x64; typical ATI driver quality...

    Left 4 Dead 2:
    [IMG]
    Nothing is jaggy, nothing is blurry. It couldn't look any better.
    With my ATI it looks OK too, but it's a bit blurry.

    Risen:
    [IMG]
    The blur comes from the depth of field; I don't think it's any blurrier than without SGSSAA. Only very few jaggies, and all the alpha-blended vegetation looks nice and smooth.
    With ATI, AA isn't possible at all in this game, except for MLAA. That's better than nothing, but compared to SGSSAA it looks awful...
    With ATI there are also small shadow artifacts when moving the camera; it doesn't look so nice...

    Apart from SGSSAA, my subjective impression is that the whole picture looks a bit better with Nvidia; maybe it's because of the better AF, I can't say...
     
  2. perosmct

    perosmct Banned

    Messages:
    1,072
    Likes Received:
    0
    GPU:
    unknown
    Every PC expert, like me, knows that Nvidia is better on quality because of AF... (and hardware without optimizations)

    ATI has had broken/cut AF at the hardware level... for many years now...
    ATI has hardware-level optimizations that cannot be undone...

    Do you know that if ATI hadn't used those hardware optimizations over the last years, they couldn't keep up with Nvidia...??? ATI has been losing since the HD series... the "HD" means they were meant for High Definition video playback and NOT gaming...

    Nvidia has even better AF than ALU... AMD has fake AF... with a permanent line scan at the end... you cannot have a clean DoF... and the textures are always worse...

    I know many people who switched from AMD to Nvidia and they all say the same... Nvidia has better texture filtering...

    ATI is the cheap solution by all means... so the thread is pointless... the whole universe and world knows that Nvidia also has better AA + AF + textures... crisp images and a clean DoF...
     
  3. evil_religion

    evil_religion Active Member

    Messages:
    94
    Likes Received:
    0
    GPU:
    Radeon 5870 Vapor-X
    It's not pointless, I showed the reason for it in the second line.
     
  4. perosmct

    perosmct Banned

    Messages:
    1,072
    Likes Received:
    0
    GPU:
    unknown
    +1, OK, I got it; I meant it was a bit pointless to answer him and prove anything...

    (because it is well known in developer circles which company has better IQ...) :)
     

  5. JonasBeckman

    JonasBeckman Ancient Guru

    Messages:
    17,564
    Likes Received:
    2,961
    GPU:
    XFX 7900XTX M'310
    ATI has improved AF over NVIDIA as far as I am aware; how the actual sample quality compares in-game can't really be measured accurately, however.

    As for AA, ATI tends to lack profiles and to be cautious, disabling AA if there are problems with it, which led to the MLAA implementation in the drivers and to how NVIDIA countered with their own FXAA as a slightly sharper alternative, though I have no idea about the performance.
    (MLAA for ATI was dependent on DX11 hardware, though that might have made it pretty fast in turn.)

    ATI has also tried other AA algorithms like Edge Detect and the awful "tent"-type blur-based filters (Edge Detect is neat but really hits performance and requires 4x AA as a base). I'm unsure how the transparency-type AA settings compare, but what I read long ago had NVIDIA as the better implementation.

    ATI is improving, but driver development definitely needs more resources and time; ideally they'd cut that monthly schedule to avoid rushed early releases and make quarterly releases with many beta versions in between, or something.
    (Also, re-write CCC in C++ or whatever, or upgrade it to .NET 4 instead of the mostly .NET 2 set it uses now, as I've understood it; get memory usage down and response time up.)

    EDIT: Oh right, SSAA: ATI scales the LOD automatically there, so pre-setting a negative value on top would over-sharpen things and introduce shimmering, wouldn't it?
    http://www.pcgameshardware.de/aid,6...1-bringt-mehr-Bildqualitaet/Grafikkarte/Test/
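
    To illustrate the double-biasing (purely a hypothetical sketch; the numbers are assumptions, not measured driver behaviour):
    [CODE]
    import math

    samples = 8
    auto_bias = -0.5 * math.log2(samples)  # bias the driver may already apply itself (-1.5)
    manual_bias = -1.5                     # bias pre-set by hand, as in the test above
    effective = auto_bias + manual_bias    # -3.0: over-sharpened, prone to shimmering
    print(f"effective LOD bias: {effective}")
    [/CODE]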

    EDIT: Found one AF article.
    http://www.rage3d.com/articles/catalyst_2011_image_quality_investigation/index.php?p=9
    (EDIT: Oh, ATI defaults to Quality now instead of High Quality as earlier, so there are some optimizations there, plus the surface format option and the lack of a Catalyst AI disable on newer hardware; NVIDIA does this too, though, with various AF and AA optimizations outside of the High Quality filtering setting.)

    EDIT: Also, I forgot about the Mjolnir project for larger optimizations, though based on feedback 11.7 is actually a bit better in image quality than that beta release, plus it has improved OpenCL performance.
     
    Last edited: Jul 5, 2011
  6. Alpha

    Alpha Master Guru

    Messages:
    225
    Likes Received:
    0
    GPU:
    MSI 6950
    Pero, are you retarded? :)
     
  7. kcuestag

    kcuestag Master Guru

    Messages:
    905
    Likes Received:
    0
    GPU:
    Gigabyte GTX980Ti G1
    Pretty much yeah. :wanker:
     
  8. GREGIX

    GREGIX Master Guru

    Messages:
    855
    Likes Received:
    222
    GPU:
    Inno3d 4090 X3
    dumb thread...
     
  9. Passion Fruit

    Passion Fruit Guest

    Messages:
    6,017
    Likes Received:
    6
    GPU:
    Gigabyte RTX 3080
    That pretty much ends my interest in perosmct's input on this topic. Just "lol". :bang:
     
  10. Isbre

    Isbre Guest

    Messages:
    196
    Likes Received:
    0
    GPU:
    290X Matrix H2O
    I think it would be nice if you could at least show side-by-side comparisons of image quality.

    And as far as I can remember, I have no trouble using any given AA mode in Crysis and Far Cry while overriding the application's settings.

    Also, in my experience, different AA modes more often have a greater chance of working in various games with ATI. In fact, I have just made a thread in the Nvidia driver section asking for help getting supersample AA working in GTA: San Andreas, which does not work no matter what I do so far. On ATI, supersampling worked with no trouble and the image quality is astounding.

    Tests on the subject at tomshardware also came to the same conclusion.
     

  11. PrEzi

    PrEzi Master Guru

    Messages:
    723
    Likes Received:
    585
    GPU:
    XFX MERC310 7900XTX
    Clearly a 'trolling/flame' thread.

    evil_religion:
    -1.5 LOD will give you corruption/shimmering.
    What do you want to prove with it?
     
  12. nexu

    nexu Maha Guru

    Messages:
    1,182
    Likes Received:
    0
    GPU:
    HD4870 512MB (@795/1085)
    I was just thinking the same: "Where are the ATI screenshots for comparison, when you're bashing ATI for having poor AA?"
     
  13. Scyphe

    Scyphe Guest

    Messages:
    428
    Likes Received:
    2
    GPU:
    ASUS 2060
    -1.5 LOD, and not a word about frame rates with/without AA/AF, the monitor, the resolution used, etc.

    I can't take that comparison seriously.
     
  14. Plug2k

    Plug2k Ancient Guru

    Messages:
    1,560
    Likes Received:
    34
    GPU:
    3090
    LMFAO, you, sir, are an idiot.
     
  15. evil_religion

    evil_religion Active Member

    Messages:
    94
    Likes Received:
    0
    GPU:
    Radeon 5870 Vapor-X
    My subjective impression is the opposite.
    Here is the test by computerbase.de of the AF of the Radeon 6800 with Catalyst 10.10 (dated 30.10.2010):
    http://translate.google.de/translat...l/grafikkarten/2010/bericht-radeon-hd-6800/5/
    They came to the conclusion that ATI's AF is still worse than Nvidia's.
    I doubt that there has been progress since that test, e.g. because of my "little test".

    This would be great. Process/application profile detection is also needed.
    The community tool Nvidia Inspector is really the best program of its kind.

    Dunno. I tested with Nvidia; there is no auto-LOD and it has to be adjusted manually. When I test with ATI, I use the auto-LOD, of course.

    There is no official option to turn off Catalyst A.I.; with Nvidia there is one.

    Ah, yes. My brother told me it's the same as with Nvidia: you have to enable MSAA in-game plus SSAA via the driver, then it works. Sorry, I didn't know this.
    As requested, here are the comparison screenshots:
    Crysis:
    [IMG]
    The palms are a bit blurry.

    Far Cry:
    It simply doesn't work...
    [IMG]

    Left 4 Dead 2:
    [IMG]
    The little house in the background is blurry.

    Risen:
    Forcing AA via the driver doesn't work at all:
    [IMG]
    I also tried disabling DOF, but that didn't help.

    So, you've seen the comparison; it's exactly as I said.

    LOL
    Didn't I just show you the opposite?

    Unfortunately I don't have the Nvidia card installed anymore, so I can't verify your claims. But it sounds very unlikely to me that such an old game without post-processing wouldn't work with Nvidia SGSSAA.
    Maybe you tried OGSSAA or something else...
    And if it really doesn't work: there's still a good chance that you have a display that allows downsampling, which can't be done that easily with ATI on Windows x64; you need a third-party driver for it. Nvidia can do this without such "hacks":
    http://translate.google.de/translat...-in-Crysis-2-Mit-Bildbeweis/Grafikkarte/Test/

    Link/source?

    Before getting rude, you should inform yourself about SGSSAA. The right negative LOD won't corrupt anything with SGSSAA; it fights the blur and effectively increases the AF. Here is a table about it:
    http://naturalviolence.webs.com/sgssaa.htm

    - FPS aren't the issue here; it's about image quality.
    - The monitor is not the issue with SGSSAA.
    - The resolution can be read from the size of the screenshots (1280x1024).
    I can't take you seriously because you obviously don't know what you are talking about.

    I'm not a Nvidia fanboy; I even have an ATI card.
    I am just realistic.
    But obviously all those ATI fanboys aren't...
     
    Last edited: Jul 5, 2011

  16. V1tol

    V1tol Guest

    Messages:
    284
    Likes Received:
    0
    GPU:
    MSI 1070 Gaming X
    Yeah, I still remember when I bought a 22" CRT monitor in 2004, and how the Geforce 2 blurred in 2D at 1600x1200 while the old Rage 128 Pro didn't.
     
  17. evil_religion

    evil_religion Active Member

    Messages:
    94
    Likes Received:
    0
    GPU:
    Radeon 5870 Vapor-X
    LOL again.
    What do grandmother's stories have to do with today's hardware?
    :puke2:
     
  18. nexu

    nexu Maha Guru

    Messages:
    1,182
    Likes Received:
    0
    GPU:
    HD4870 512MB (@795/1085)
    Is it such an unrealistic request to ask for side-by-side comparison screenshots
    when someone claims X is better than Y?

    How does asking for fair one-on-one comparison material suddenly make people "fanboys"?
     
  19. evil_religion

    evil_religion Active Member

    Messages:
    94
    Likes Received:
    0
    GPU:
    Radeon 5870 Vapor-X
    I didn't mean that.
    I meant the people who don't bring any arguments at all, just stupid pro-ATI, contra-Nvidia flaming.
    It's OK to ask for a fair comparison. And don't you agree that it's fair now?
    I may be blunt in arguing, but not unfair, I think. If I am, then tell me why, please.
     
  20. V1tol

    V1tol Guest

    Messages:
    284
    Likes Received:
    0
    GPU:
    MSI 1070 Gaming X
    Unlike many of you, I have an explanation for why nVidia is sh*t for me. Just because you cannot describe your own point of view, you have no right to judge my posts.
     
