NVidia Anti-Aliasing Guide (updated)

Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by Cyberdyne, Jan 29, 2012.

  1. KainXVIII

    KainXVIII Active Member

    Messages:
    72
    Likes Received:
    0
    GPU:
    MSI GTX 970
    It's just an unoptimized game (I can't maintain 1080p/60 fps all the time, even with FXAA).
     
  2. Tizoc

    Tizoc Active Member

    Messages:
    59
    Likes Received:
    7
    GPU:
    GTX970M
    Isn't A Hat in Time a Unity game?

    What AA bits would you recommend for Telltale's earlier games, namely Sam & Max and the games around and prior to that series?
    Looking at the sheet on the first page, Back to the Future has
    MSAA: 0x00000041
    SGSSAA: 0x000000C1

    I'm gonna test the second one and see what results I get (bits noted again below).
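    If it helps with bookkeeping while testing, here is a minimal scratch-pad sketch in Python that just records the bits quoted above. The values are copied from the sheet mentioned in this post; the dict layout, file name, and any extra games you add are purely my own note-taking, nothing from Inspector itself:

    Code:
        # aa_bits.py - personal notes for per-game AA compatibility bits (sketch).
        # The values below are the Back to the Future entries quoted above; the
        # bits still have to be applied by hand in Nvidia Profile Inspector.
        AA_BITS = {
            "Back to the Future": {
                "MSAA":   0x00000041,
                "SGSSAA": 0x000000C1,
            },
        }

        for game, modes in AA_BITS.items():
            for mode, bits in modes.items():
                print(f"{game}: {mode} bits = 0x{bits:08X}")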

    Edit: here are screenshots from the first Bone game using the SGSSAA bits I mentioned.
    [Screenshot: no SGSSAA]

    [Screenshots: SGSSAA enabled] (I always notice that leaves on trees in games look so nice with SGSSAA.)
     
    Last edited: Jan 24, 2020
  3. KainXVIII

    KainXVIII Active Member

    Messages:
    72
    Likes Received:
    0
    GPU:
    MSI GTX 970
    UE3.
     
  4. MrBonk

    MrBonk Ancient Guru

    Messages:
    3,145
    Likes Received:
    144
    GPU:
    MSI RTX 2080
    The game is horribly optimized. Even just running at a high resolution without AA, the performance can be pretty awful.
    Not to mention there are some other visual quirks that come with UE3 which drag performance way down when using SGSSAA (the blur when bringing up the wheel/menu, for instance). I asked for a way to disable that when the game was first released, but got nothing.


    Added, thanks.
     
    Last edited: Jan 25, 2020

  5. Tizoc

    Tizoc Active Member

    Messages:
    59
    Likes Received:
    7
    GPU:
    GTX970M
    I will try it with my Sam & Max games too; hopefully I'll see some improvement in visual clarity at least.

    EDIT: Oh, Remember Me is a UE3 game? It will be fun to fiddle with some of its settings while injecting SGSSAA. XP
    I disabled bloom when I played Enslaved, so I'll see how the game looks with and without it.
    Bloom generally seems to slap a 'colourful light beam' on everything in these kinds of games :v
     
  6. Terepin

    Terepin Master Guru

    Messages:
    753
    Likes Received:
    45
    GPU:
    MSI 2070S GAMING X
    Why they didn't use UE4 to begin with is beyond me.
     
  7. MrBonk

    MrBonk Ancient Guru

    Messages:
    3,145
    Likes Received:
    144
    GPU:
    MSI RTX 2080
    Be careful with UE3 games and post-processing. A lot of later engine versions tie all post-processing together, so you can either have all of it on (including FXAA/MLAA) or none of it at all.
    You often have to find one specific setting in the ini to disable things, as the menu toggles often won't actually work (it depends on the game). See the sketch below for one way to script that.
    It can make UE3 games irritating to work with.
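    As a rough illustration of that kind of ini edit, here is a minimal Python sketch, assuming the common UE3 layout where the toggles live under a [SystemSettings] section in one of the game's *Engine.ini files. The path and the key names (Bloom, MotionBlur, DepthOfField) are placeholders that vary per game, so check your game's Config folder and back the file up first:

    Code:
        # ue3_postfx_off.py - force post-processing toggles in a UE3 ini (sketch).
        # The INI path and key names are hypothetical examples, not from any
        # specific game; verify them against your own config files.
        from pathlib import Path

        INI = Path(r"C:\Games\SomeUE3Game\Engine\Config\MyGameEngine.ini")  # placeholder
        OVERRIDES = {"Bloom": "False", "MotionBlur": "False", "DepthOfField": "False"}

        lines = INI.read_text().splitlines()
        out, in_section, seen = [], False, set()
        for line in lines:
            stripped = line.strip()
            if stripped.startswith("["):                 # a section header
                if in_section:                           # leaving [SystemSettings]
                    out += [f"{k}={v}" for k, v in OVERRIDES.items() if k not in seen]
                in_section = stripped == "[SystemSettings]"
            elif in_section and "=" in stripped:
                key = stripped.split("=", 1)[0].strip()
                if key in OVERRIDES:                     # rewrite an existing toggle
                    line = f"{key}={OVERRIDES[key]}"
                    seen.add(key)
            out.append(line)
        if in_section:                                   # section was last in the file
            out += [f"{k}={v}" for k, v in OVERRIDES.items() if k not in seen]
        INI.write_text("\n".join(out) + "\n")

    If the file has no [SystemSettings] section at all, the script leaves it untouched; some games keep these keys in a different ini, which is part of what makes UE3 irritating here.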
     
    Tizoc likes this.
  8. Tizoc

    Tizoc Active Member

    Messages:
    59
    Likes Received:
    7
    GPU:
    GTX970M
    Does Half-Life 1 (Source) need any specific bits, or can I just use the preset one in Inspector? Also, should I enhance or override?
     
    KainXVIII likes this.
  9. MrBonk

    MrBonk Ancient Guru

    Messages:
    3,145
    Likes Received:
    144
    GPU:
    MSI RTX 2080
    Half-Life 1 has been updated to use OpenGL, so forcing doesn't actually work, but by default it has 4xMSAA. You can set Inspector to "Enhance Application Setting" and override it with 8xMSAA if you want. If you want to use SGSSAA, just remember to set Transparency Supersampling (which is SGSSAA in OGL).
     
    Tizoc likes this.
  10. NeoChaos

    NeoChaos Member

    Messages:
    25
    Likes Received:
    0
    GPU:
    GeForce GTX 1070 8 GB
    Doesn't OpenGL SGSSAA still have the blurry textures bug, though? Also, he's asking about HL1 Source, not vanilla HL1.
     

  11. TheDeeGee

    TheDeeGee Ancient Guru

    Messages:
    6,747
    Likes Received:
    1,025
    GPU:
    NVIDIA GTX 1070
    Why play HL1 Source to begin with?

    It's such a messy port.
     
  12. MrBonk

    MrBonk Ancient Guru

    Messages:
    3,145
    Likes Received:
    144
    GPU:
    MSI RTX 2080
    It manifests in different ways depending on the game; in some games it may not even be noticeable.
    In vanilla Doom 3, for example, it shows up only on alpha-test textures like grates.
    I didn't really notice it until later. It changes depending on the angle (it's worse at oblique angles, but that doesn't come up often with the problem surfaces).
    http://u.cubeupload.com/MrBonk/shot00085.jpg

    http://u.cubeupload.com/MrBonk/shot00133.jpg
    http://u.cubeupload.com/MrBonk/shot00134.jpg
    http://u.cubeupload.com/MrBonk/shot00316.jpg
    http://u.cubeupload.com/MrBonk/shot00317.jpg
    And then on certain displays with projected textures it can cause an issue, but only depending on the angle you look at them from. That only happens on a small number of objects overall, and it doesn't stand out as obviously wrong until you look closely. It was a non-issue for me when I played last year.

    Also: I forgot HL: Source is different from vanilla, sorry. I'd have to assume that, if it runs on DX9 like HL2, the same flag I mentioned a page or so back would work fine.
     
  13. xvt

    xvt Member

    Messages:
    43
    Likes Received:
    0
    GPU:
    MSI GTX 1070 Z
    Try running the game with DXVK (on Windows, yes); that thing is literally black magic.
    And since it's a wrapper, Nvidia Inspector settings should still take precedence.
     
    Last edited: Feb 3, 2020
  14. MrBonk

    MrBonk Ancient Guru

    Messages:
    3,145
    Likes Received:
    144
    GPU:
    MSI RTX 2080
    I wonder if that would really change the performance at all?
     
  15. xvt

    xvt Member

    Messages:
    43
    Likes Received:
    0
    GPU:
    MSI GTX 1070 Z
    It does, oddly enough.



    I've personally tested it with Borderlands GOTY Enhanced and in certain scenarios gained up to 25+ fps.
    In a couple of other games I've tried it on, there's no gain at all, or it performs worse due to the wrapper's overhead.
    Not a silver bullet, but still worth a shot for badly performing games.

    https://github.com/doitsujin/dxvk/releases

    Edit: all you have to do is find out which architecture the game runs on (either x86 or x64) and copy the corresponding DLLs into the same folder as the game's executable.
    Be sure to copy all of the DLLs to the game's folder, since sometimes a game might use a legacy/deprecated call that gets caught by one of the other API wrappers in DXVK; this helps prevent the game from crashing.
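    As a rough sketch of that copy step (assuming the usual DXVK release layout with x32/ and x64/ subfolders containing the D3D/DXGI DLLs; exact file names differ a little between DXVK versions), something like this works; the paths are placeholders:

    Code:
        # dxvk_copy.py - copy the DXVK DLLs next to a game's executable (sketch).
        # The release folder layout (x32/x64 subfolders full of DLLs) is an
        # assumption about current DXVK builds; adjust the paths to your setup.
        import shutil
        import sys
        from pathlib import Path

        def install_dxvk(dxvk_dir: Path, game_dir: Path, arch: str = "x64") -> None:
            """Copy every DLL from the matching DXVK subfolder into the game folder."""
            for dll in (dxvk_dir / arch).glob("*.dll"):   # d3d9.dll, d3d11.dll, dxgi.dll, ...
                shutil.copy2(dll, game_dir / dll.name)
                print(f"copied {dll.name} -> {game_dir}")

        if __name__ == "__main__":
            # usage: python dxvk_copy.py <extracted_dxvk_dir> <game_exe_dir> [x86|x64]
            dxvk_dir, game_dir = Path(sys.argv[1]), Path(sys.argv[2])
            arch = "x32" if len(sys.argv) > 3 and sys.argv[3].lower() == "x86" else "x64"
            install_dxvk(dxvk_dir, game_dir, arch)

    To undo it, just delete those DLLs from the game's folder again.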
     
    Last edited: Feb 3, 2020

  16. NeoChaos

    NeoChaos Member

    Messages:
    25
    Likes Received:
    0
    GPU:
    GeForce GTX 1070 8 GB
    I guess it varies on a game-by-game basis, because it's REALLY noticeable in Jedi Academy.
    https://imgur.com/a/5jlYDjf
    It's also incredibly bad in Hexen II, but I can't seem to take screenshots in that game for some reason.
     
  17. MrBonk

    MrBonk Ancient Guru

    Messages:
    3,145
    Likes Received:
    144
    GPU:
    MSI RTX 2080
    Yikes. Yeah, you might try using one of the HSAA (hybrid) modes instead, like 16xS, which should work in OGL. That can be useful for games where you can't downsample because the UI becomes too small to read.
    Also: does it do that if you enhance the in-game MSAA instead of forcing it?
    The OGL AA fix doesn't work anymore, right? I can't remember.


    Well dang that is pretty interesting. I'll have to try this out sometime. It might make higher framerates at higher resolutions more feasible in this game.
     
    Last edited: Feb 4, 2020
  18. MrBonk

    MrBonk Ancient Guru

    Messages:
    3,145
    Likes Received:
    144
    GPU:
    MSI RTX 2080
  19. NeoChaos

    NeoChaos Member

    Messages:
    25
    Likes Received:
    0
    GPU:
    GeForce GTX 1070 8 GB
    The Enhance AA setting is the same as disabling it; it seems you have to use Override to get AA to work in OpenGL. That said, I just tried the HSAA settings (16xS, as well as 32xS for the hell of it) and that seems to do the job just as well: the distant trees in that Jedi Academy level show a hell of a lot less shimmering when my character is moving.

    As for the OpenGL AA fix, that has been broken for about a year and a half now, and the relevant setting is no longer visible in newer versions of the drivers/Inspector. It's rather sad that nVidia doesn't seem to care about fixing this feature, especially as SGSSAA overrides the Transparency AA setting for OGL games. I'd hate to be the unsuspecting gamer playing these older games, enabling TrSSAA in NVCP expecting it to clean up the transparencies, only to be greeted with a blurry mess of the game's textures.
     
    Last edited: Feb 4, 2020
  20. MrBonk

    MrBonk Ancient Guru

    Messages:
    3,145
    Likes Received:
    144
    GPU:
    MSI RTX 2080
    OK, yeah, thanks for jogging my memory. That's what I thought. Which sucks if you have a newer GPU and can't roll back drivers. That makes a certain case for keeping a relevant Maxwell-series card around for just such occasions, but that's a pretty darn niche-of-niche use case. (Interestingly, in PPSSPP, forcing AA in OGL doesn't seem to cause any texture-blurring issues in my limited testing. But it's not a very useful scenario either, because you have to use unbuffered rendering.)

    I wonder if problematic OGL titles should be listed in the document too (I added a note about Jedi Academy to start). Testing all the OGL games out there would probably take a while, though the number that still work these days is probably pretty low. It seems like only older OGL versions, prior to 3.0, can have AA forced (Doom 3 vs. BFG, for example: D3 used OGL 2.0 and BFG used 3.2).

    Nvidia probably doesn't care because it's basically a driver hack. But I don't know if anyone ever opened a ticket with Nvidia to see if they would fix it either.
     
    Last edited: Feb 5, 2020
