Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by Cyberdyne, Jan 29, 2012.
It's just an unoptimized game (I can't maintain 1080p/60fps all the time even with FXAA).
Isn't A Hat in Time a Unity game?
What AA bits would you recommend for Telltale's earlier games, namely Sam & Max and games around and prior to that series?
Looking at the sheet in the first page, Back to the Future has
I am gonna test the 2nd one and see what results I get.
edit: here are screenshots from the first Bone game using the SGSSAA bits I mentioned
^No SGSSAA implemented
SGSSAA implemented (I always notice leaves from trees in games look so nice with SGSSAA implemented)-
Game is horribly optimized. Even just running at a high resolution without AA the performance can be pretty awful.
Not to mention there are some other visual quirks that come with UE3 that drag performance way down when using SGSSAA (blur when bringing up the wheel/menu thing). I asked for a way to disable that when the game was first released, but nothing came of it.
I will try it with my Sam & Max games too, hopefully I can see some improvement in visual clarity at least.
EDIT: Oh, Remember Me is a UE3 game? Will be fun to fiddle with some of its settings while injecting SGSSAA XP
I disabled bloom when I played Enslaved, so I will see how the game looks with and without it.
Bloom generally seems to invoke 'colourful light beam' on everything with these kinda games :v
Why they didn't use UE4 to begin with is beyond me.
Be careful with UE3 games and post-processing. A lot of later engine versions tie all post-processing together completely, so you can either have all of it on (including FXAA/MLAA) or none of it at all.
You often have to find one specific setting in the ini to disable stuff, as the menu toggles often won't actually work (it depends on the game).
It can make UE3 games irritating to work with.
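Since those menu toggles often don't stick, the usual workaround is to flip the value straight in the ini file. As a rough sketch (the `[SystemSettings]` section and `MotionBlur` key are just illustrative examples; the actual file and key names vary per UE3 game), a simple line-based edit like this avoids touching duplicate keys in other sections, which generic ini parsers tend to mangle:

```python
import re

def set_ini_value(path, section, key, value):
    """Set key=value inside [section] of a UE3-style ini, editing lines
    in place so identical keys in other sections are left untouched."""
    with open(path, encoding="utf-8") as f:
        lines = f.readlines()
    out, in_section, replaced = [], False, False
    for line in lines:
        stripped = line.strip()
        if stripped.startswith("[") and stripped.endswith("]"):
            # Track which section we're currently inside
            in_section = stripped[1:-1].lower() == section.lower()
        elif in_section and re.match(rf"\s*{re.escape(key)}\s*=", line, re.I):
            line = f"{key}={value}\n"
            replaced = True
        out.append(line)
    with open(path, "w", encoding="utf-8") as f:
        f.writelines(out)
    return replaced  # False means the key wasn't found in that section

# Hypothetical usage - path and key depend on the specific game:
# set_ini_value("BaseEngine.ini", "SystemSettings", "MotionBlur", "False")
```

Back up the ini first; some UE3 games regenerate or overwrite these files on launch, and a few mark them read-only to persist tweaks.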
Does Half Life 1 Source need any specific bits, or can I just use the pre-set one in Inspector? Furthermore should I enhance or override?
Half-Life 1 has been updated to use OpenGL, so forcing actually doesn't work, but by default it has 4xMSAA. You can set Inspector to "Enhance Application Setting" and override it with 8xMSAA if you want. If you want to use SGSSAA, just remember to set Transparency Supersampling (which is SGSSAA in OGL).
Doesn't OpenGL SGSSAA still have the blurry textures bug, though? Also, he's asking about HL1 Source, not vanilla HL1.
Why play HL1 Source to begin with?
It's such a messy port.
It manifests in different ways depending on the game.
Some games it may not even be noticeable.
In Vanilla Doom 3 for example, it shows up only on alpha test textures like grates
I didn't even notice this really until later. It changes depending on the angle. (Worse at oblique angles. But that doesn't happen often with the problem surfaces)
And then on certain displays with projected textures it can cause an issue, but only depending on the angle you look at it from. It only happens on a small number of objects overall and doesn't stand out as obviously wrong until you look closer. It was a non-issue for me when I played last year.
Also: I forgot HL: Source is different from vanilla, sorry. I'd have to assume that if it runs on DX9 like HL2, the same flag I mentioned a page or so back would work fine.
Try running the game with DXVK (on Windows, yes), that thing is literally black magic.
And since it's a wrapper, nvidia inspector settings should still take precedence.
I wonder if that would really change the performance at all?
It does, oddly enough.
I've personally tested it with Borderlands GOTY Enhanced and gained 25+ fps in certain scenarios.
In a couple of other games I've tried it on, there's no gain at all, or it performs worse due to the wrapper's overhead.
Not a silver bullet, but still worth a shot for poorly performing games.
edit: all you have to do is find out which architecture the game runs on (x86 or x64) and copy the corresponding DLLs into the same folder as the game's executable.
Be sure to copy all of the DLLs to the game's folder, since sometimes a game might use a legacy/deprecated API or whatnot that gets caught by one of the other API wrappers in DXVK; this helps prevent the game from crashing.
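If you're not sure whether a game's executable is x86 or x64, you can read the Machine field of its PE header instead of guessing. A minimal sketch (the `x32`/`x64` folder names in the comment follow the layout of DXVK's release archives):

```python
import struct

# IMAGE_FILE_MACHINE constants from the PE/COFF specification
MACHINE_NAMES = {0x014C: "x86", 0x8664: "x64"}

def exe_architecture(path):
    """Return 'x86' or 'x64' for a Windows PE executable, telling you
    which set of DXVK DLLs (from its x32/ or x64/ folder) to copy."""
    with open(path, "rb") as f:
        if f.read(2) != b"MZ":                        # DOS header magic
            raise ValueError("not a PE executable")
        f.seek(0x3C)                                  # e_lfanew: PE header offset
        (pe_offset,) = struct.unpack("<I", f.read(4))
        f.seek(pe_offset)
        if f.read(4) != b"PE\0\0":                    # PE signature
            raise ValueError("missing PE signature")
        (machine,) = struct.unpack("<H", f.read(2))   # COFF Machine field
    return MACHINE_NAMES.get(machine, f"unknown (0x{machine:04X})")

# Hypothetical usage:
# print(exe_architecture(r"C:\Games\SomeGame\game.exe"))
```

A 32-bit game gets the DLLs from DXVK's x32 folder, a 64-bit game the ones from x64; mixing them up just means the game won't load the wrapper (or won't start).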
I guess it is on a game-by-game basis, because it's REALLY noticeable in Jedi Academy.
It's also incredibly bad in Hexen II, but I can't seem to take screenshots in that game for some reason.
Yikes, yeah, you might try using one of the HSAA modes instead, like 16xS, which should work in OGL. It may be useful for games where you can't downsample because the UI becomes too small to be readable.
Also: Does it do that if you enhance the in game MSAA instead of forcing it?
The OGL AA fix doesn't work anymore, right? I can't remember.
Well dang that is pretty interesting. I'll have to try this out sometime. It might make higher framerates at higher resolutions more feasible in this game.
Enhancing the AA setting is the same as disabling it; it seems like you have to use Override to get AA to work in OpenGL. That said, I just tried the HSAA settings (16xS, as well as 32xS for the hell of it) and that seems to do the job just as well - the distant trees in that Jedi Academy level show a hell of a lot less shimmering when my character is moving.
As for the OpenGL AA fix, that has been broken for about a year and a half now, and the relevant setting is no longer visible in newer versions of the drivers/Inspector. It's rather sad that nVidia doesn't seem to care about fixing this feature, especially as SGSSAA overrides the Transparency AA setting for OGL games. I'd hate to be the unsuspecting gamer playing these older games, enabling TrSSAA in NVCP expecting it to clean up the transparencies, only to be greeted with a blurry mess of the game's textures.
Ok, yeah, thanks for jogging my memory. That's what I thought. Which sucks if you have a newer GPU and can't roll back drivers. That makes a certain case for keeping around a relevant Maxwell-series card for just such occasions, but that's a pretty darn niche-of-niche use case. (Interestingly, forcing AA in OGL in PPSSPP doesn't seem to cause any texture blurring issues in my limited testing. But it's not a very useful scenario either, because you have to use unbuffered rendering.)
I wonder if problematic OGL titles should be listed in the document too (I added a note about Jedi Academy to start). Testing all the OGL games out there would probably take a while, though the number that still work these days is likely pretty low. It seems like AA can only be forced on OGL versions prior to 3.0 (Doom 3 vs. BFG, for example: D3 used OGL 2.0 and BFG used 3.2).
Nvidia probably doesn't care because it's basically a driver hack. But I don't know if anyone ever opened a ticket with Nvidia to see if they would fix it either.