Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by zandathrax, Nov 15, 2017.
So far so good for my games. Will see tomorrow when I play Battlefront II.
Let me elaborate. I hope I'm not derailing this 388.31 driver discussion thread into a DSR scaling discussion.
What I meant was that using 4K DSR on a 1440p native display could look worse than running 2560x1440 natively in terms of overall image quality, including anti-aliasing (not necessarily the HUD and/or text).
Ideally, you want to scale to an integer multiple of your current resolution. If you are on a 1080p screen, your only clean option is to scale to 4K (3840x2160), since then every screen pixel maps to exactly four rendered pixels and downsampling the image is easy as pie.
If you're on a 1440p screen, you would want to scale to 5K (5120x2880) with 0% DSR smoothness.
Those are DSR factor 4x (2x horizontal x 2x vertical = 2x2 = 4):
for 1080p screen, 1920*2 = 3840
for 1440p screen, 2560*2 = 5120
for 1080p screen, 1080*2 = 2160
for 1440p screen, 1440*2 = 2880
If the scale is not an integer on each axis (horizontal and vertical), the downsample has to go through heavy interpolation, which degrades image quality in both detail and aliasing. This is why NVIDIA included the DSR smoothness factor: to blur the pixel information across the entire screen and counter the scaling artifacts caused by non-integer scaling.
To run 4K (3840x2160) on a 2560x1440 screen, you need a DSR factor of 2.25x, which is 1.5x horizontal and 1.5x vertical (1.5 x 1.5 = 2.25). By doing this, scaling artifacts (and aliasing) are introduced, which defeats the purpose of raising the resolution from 1440p to 4K. To counter this, you need to use DSR smoothness to hide the uneven pixel information and blur the entire image.
I tend to avoid DSR factors other than 4x (2x H, 2x V) with DSR smoothness at 0%. This means, in my opinion, that the optimal DSR resolution on a
1080p screen is 4k (3840x2160) and
1440p screen is 5k (5120x2880).
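The integer-vs-fractional scaling argument above can be sketched in a few lines of Python (an illustrative helper of my own, not an NVIDIA API; DSR factors apply to the total pixel count, so the per-axis scale is their square root):

```python
import math

def dsr_resolution(native_w, native_h, dsr_factor):
    """Rendered resolution for a given DSR factor, plus whether the
    per-axis scale is an integer (i.e. a clean 2x2-style downsample)."""
    scale = math.sqrt(dsr_factor)  # DSR factor applies to pixel count
    w = round(native_w * scale)
    h = round(native_h * scale)
    return w, h, scale.is_integer()

# 4x DSR on a 1440p screen: 2x per axis -> clean 5120x2880
print(dsr_resolution(2560, 1440, 4))     # (5120, 2880, True)

# 2.25x DSR on a 1440p screen: 1.5x per axis -> 3840x2160, interpolated
print(dsr_resolution(2560, 1440, 2.25))  # (3840, 2160, False)
```

The True/False flag is just the post's point in code form: 4x maps each screen pixel to an exact 2x2 block of rendered pixels, while 2.25x (1.5x per axis) forces interpolation, which is what DSR smoothness then has to blur over.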
For 1080p I found 2880x1620 (yes, exactly x2.25) most ideal, then 4K if my OC'ed 980 Ti can handle it.
I use 17% smoothness; it looks like ClearType smoothing if I use DSR on the desktop.
About 388.31, it's OK. I didn't have any stutter with 388.13 either, even with power monitoring, so I'll just keep these since they have more fixes.
This set seems good for CoD: BO3, no more weird green textures or text bugs.
Is there a way to disable G-Sync for a certain game and enable it for others?
I use 1620p (3K, x2.25 DSR) in games that run well, like FIFA 18, Metro 2033, Doom etc., on a 23" 1080p monitor at 26% smoothness with ReShade LumaSharpen. Combined with some decent AA, it takes care of jaggies and shimmering (which I hate with a passion) really well. 4K is pointless on a 23" monitor and just wastes fps. In more demanding games I stick with 1440p (x1.78) DSR if I can get close to a stable 60 fps. In strategy games I don't even bother with DSR because most of them don't have a scalable UI.
Nah, it's enough to look at newly released titles, how ridiculous their system requirements are compared to the graphics quality on screen, and how different games looked in the past, to know that we are being robbed of performance because planned obsolescence matters most. Remember the invisible tessellated oceans? They are still there.
@akbaar Thanks for sharing, and yeah, this driver doesn't perform well in LatencyMon.
I'm using v18.104.22.168, but manually installed. GFE automatically logged in with "Remember me on this PC" ticked.
Good drivers, no problems.
Has anyone else had issues with 8xMSAA in old OpenGL games like Star Wars: Knights of the Old Republic and Star Wars: Knights of the Old Republic 2? The quality is absolutely horrendous, more like 2xMSAA.
It may be my PC, or it might be the game, or the graphics drivers; it could be all three, but DX12 continues to be incredibly lacklustre on my GTX 1080 Ti.
Well, yet another game has been released with DX12 support, Star Wars: Battlefront II, that runs poorly compared with DX11. In this case, I am playing at maxed-out 2560x1440 settings with 150% resolution scaling and dynamic resolution enabled, yet the Arcade Mode map, which is available while the game downloads, stutters, jerks, and even pauses for split seconds, from the opening X-Wing flyby and Imperial ship landing through to the gameplay, giving an all-round unpleasant experience. It is the same even if I disable dynamic resolution and lower the resolution scaling to 100%. It lessens the more you play but never goes away completely.
Switch to DX11 on the *exact* same settings and it's a night-and-day difference. The opening flyby plays smoothly and the game instantly feels smooth to play, with little to no stuttering at the start (any that does appear is very minor and, IMO, the result of the game still downloading assets as I start to play, since the hard drive light is still showing; the game is installed to a 7,200 rpm SATA III hard drive, by the way) and very smooth gameplay for the duration of the match thereafter.
I just don't understand this at all. It was exactly the same in the beta: DX12 a juddery, stuttery mess but DX11 perfectly smooth and playable. Of course, you don't lose anything in terms of image quality using DX11 over DX12 (one of the many disappointments of this, IMO, overhyped new API), but you do lose the dynamic scaling option, which was useful for running the game above the native 2560x1440 resolution of my display at up to 3840x2160 (which is 150% of that resolution as set in the graphics options). Is it any wonder that so many people are underwhelmed by DX12? AMD graphics card owners are likely the only ones happy with DX12, but that's really only because AMD's DX11 driver was not as well optimised as NVIDIA's (as many tech sites have pointed out, including Digital Foundry).
Not only that, but this is the *first* game I've played since getting my GTX 1080 Ti that drops the GPU clock below 1,900 MHz (to 1,861 MHz) due to what looks like high power usage (I saw it hit 119% at one point). Every other game I've played, including all other recent ones, has always run at 1,950+ MHz, so I'm not sure quite why this game uses so much power that it causes the clocks to drop. I do have a pre-overclocked EVGA card, the SC2 model, using a +75 MHz offset on the GPU and +500 MHz on the VRAM. Has anyone else noticed this on their 1080 Ti?
P.S. I'm using the latest v388.31 WHQL driver for this game, supposedly "Game-Ready" for Star Wars: Battlefront II!!!
@Darren Hodgson DX12 in FIFA 18 is unplayable, so what do you expect from SW:BFII? So far, the only title I know of that's playable with DX12 on my side is RotTR.
P.S. Every Origin game that includes DX12 is crippled.
FYI, every EVGA Pascal VBIOS has a tendency to hit its power limit, and the same goes for my card, so install Precision XOC to enable K-Boost or flash a custom VBIOS from an Asus Strix.
Makes no difference.
8xMSAA (300% zoom):
4xMSAA (300% zoom):
AC:O runs like a charm with this driver!
@Anarion Does forced MFAA make any difference?
Maybe NVIDIA put less effort into DX12 optimization? When I had an R9 390, in BF1 the difference was 10 fps between DX11 and DX12, with DX12 working best. It's just a theory.
You can force SGSSAA or something like 32xS in such old games.
It's hard to tell what the game is even doing when you ask for 8x AA, since I think that back when it launched, MSAA topped out at 6x on Radeons.