Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by cucaulay malkin, Nov 16, 2021.
This will be interesting to test on older games, to see whether it works alongside forced AA in DX9 titles; it could help out a lot with resolution-locked games on a 4K display. (As they say, better input = better output, i.e. a lower-resolution game with very high pixel quality. Typing of the Dead: Overkill, for example, is locked to 720p IIRC if you want to use the post-processing, but the game looks great with forced AA.)
If they are forcing a lower resolution at the driver level, it makes you wonder why they couldn't do the inverse and force a higher resolution for downsampling purposes. (For resolution-locked games, for example, i.e. anything made by Tecmo Koei.)
This is pretty amazing, but I was wondering about a real-time AI upscaler for MOVIES. My friend upscaled his Star Wars Blu-rays and wow... what a huge difference. That alone would drive mass adoption of the technology. I'm not talking about shaders that sharpen movies, but ones that add real detail via AI in real time.
No, it isn't interesting; it's an upscaler.
I turned it on in RotR and hated it.
Spatial upscalers only make textures blurry and shimmery.
It's a piece of crap in response to the turd that is FSR, just for marketing purposes, that's all.
It only makes you wonder why AMD can't get FSR to work globally via the driver, since apparently it's possible.
Comparing OPTION 1 - 4K (3840x2160), 10-bit, max eye candy except no vignette, no DLSS, no motion blur, no depth of field and no film grain (yes, max visuals without MS's fubar HDR), using maybe variant 2 of TAA -
against OPTION 2 - the DLSS or upscaled image... well, none of the upscalers, DLSS included (even set to Quality), look as good as the image in option 1, period.
So why do these even exist if they make things look worse?
For example, every time I see "DLSS", the first thing that comes to mind is "reduced resolution".
Why do we even have film noise/grain?
The original 4K/8K resolution with max candy always, always looks better than any DLSS implementation or upscaler.
I'm so puzzled why folks would buy 4K and 8K monitors just to then reduce the visual quality!
A lot of those settings reduce resolution just as marketing FUD, because RTX is such a poor performer and not worth the performance penalty - and they wanted to push a new product using RTX as the lure.
Please stop buying 4K and 8K monitors if you're just going to dumb down the images.
It might, but not everyone will enjoy playing at ultra-low FPS while still wanting high res available for programs, movies and such.
As for better looks: both yes and no. While DLSS can remove some detail, it can also do the complete opposite, and it often looks better than the blur that comes with other built-in AA options.
I AGREE with you for low res, but I have not seen that at 4K and 8K.
Cyberpunk: DLSS at 4K only softens textures and removes detail, even at the Quality setting; I even tried 2.3.4.
Chernobylite: same thing.
Necromunda: same thing.
Metro Exodus: also the same reduction in detail.
And Succubus. All of them have reduced fine detail.
Detail is reduced; it's easy to see the difference.
I understand that FPS is important, but above 60-100 FPS I'd rather preserve the detail.
Well... not identical, but nearly.
- Reposted from b3d -
4K monitors are great for non-gaming applications as well. Different priorities.
Luckily they can still have an enjoyable gaming experience too due to DLSS.
Dumb opinions; 4K DLSS Performance looks better than 1440p native.
Let's take CP2077 for example.
4K DLSS Performance should look about as good as a 90% resolution scale from 4K, which is very little visual loss compared to 4K native but miles better than 1440p native, while offering much higher FPS.
By your standard everyone should be buying 1440p panels to enjoy all those native images, LMAO. What next? Are you going to suggest people must play at Ultra settings because High settings aren't pure gaming?
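To put some numbers on that comparison, here is a minimal sketch. The per-axis scale factors are the commonly cited ones for DLSS 2.x modes (my assumption, not stated in this thread), and the point is just the pixel arithmetic: DLSS Performance renders fewer pixels than native 1440p, but reconstructs to a 4K output.

```python
# Rough internal render resolutions for DLSS 2.x modes at 4K output.
# Scale factors per axis are the commonly cited values (assumption,
# not from this thread): Quality ~66.7%, Balanced ~58%, Performance 50%.
DLSS_SCALE = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.50}

def internal_resolution(out_w, out_h, mode):
    """Internal render resolution for a given DLSS mode and output size."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

for mode in DLSS_SCALE:
    w, h = internal_resolution(3840, 2160, mode)
    print(f"4K DLSS {mode}: renders {w}x{h}, reconstructs to 3840x2160")

# Performance mode renders 1920x1080 internally - fewer pixels than
# native 1440p (2560x1440) - yet the temporal reconstruction targets 4K,
# which is why the perceived quality debate above is not settled by
# pixel counts alone.
```

Whether the reconstructed 4K image beats native 1440p is the perceptual claim being argued in this thread; the arithmetic only shows what each mode actually rasterizes.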
I find the best use for NIS is combined with DLAA; it offers better temporal stability and less ghosting than DLSS Quality.
I concede, you're absolutely right! ...Your comparison of 1440p with 4K DLSS... is correct.
Yeah, that is when I stopped reading and you lost this debate.
Wouldn't NIS + DLAA just mean there was no scaling going on at all, so NIS wouldn't be doing anything? Correct me if I'm wrong, but I figure if you turn on FSR or NIS while running at native resolution internally, nothing should happen. DLAA is native + anti-aliasing, unlike DLSS, which reconstructs up to native from a lower internal resolution.
NIS is just a simplified solution for creating custom resolutions + upscaling.
You could do upscaling without NIS by creating your own custom resolution in NVCP and ticking "Perform scaling on GPU", but not many people know of this method.
In the case of DLAA + NIS, it is the same as me creating a custom resolution of 1836p (85% of 2160p): DLAA outputs the game at 1836p, which is sharpened with Lanczos plus Nvidia's own sharpening filter and then scaled to full screen.
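The 85% step above works out exactly, and the resampling filter named there can be sketched generically. A minimal illustration, assuming the textbook Lanczos window (NVIDIA's actual NIS filter is a tuned variant, not this exact kernel):

```python
import math

def scaled_resolution(width, height, factor):
    """Scale a native resolution by a factor, rounding to whole pixels."""
    return round(width * factor), round(height * factor)

def lanczos_kernel(x, a=3):
    """Textbook Lanczos window: sinc(x) * sinc(x/a) for |x| < a, else 0.

    This is the generic filter shape; NIS uses its own tuned variant.
    """
    if x == 0:
        return 1.0
    if abs(x) >= a:
        return 0.0
    px = math.pi * x
    return a * math.sin(px) * math.sin(px / a) / (px * px)

# 85% of 4K (3840x2160) gives the 1836p custom resolution described above.
print(scaled_resolution(3840, 2160, 0.85))  # -> (3264, 1836)
```

The kernel is 1 at the sample itself and 0 at every other integer offset, which is what makes Lanczos a sharp (if ringing-prone) resampling filter compared to bilinear scaling.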
Interesting, so if NIS is on in the control panel, it will output the game at 85% and upscale even if the game is set to native resolution + fullscreen? That is not what I expected. I thought enabling NIS in the control panel just added the 85% (etc.) output options to the resolution dropdown, and that if you don't select them, or select 100%/native, it wouldn't do anything.
Wait, I'm saying the same thing you are: NIS is active when you choose a resolution lower than native, in fullscreen mode (windowed or borderless won't work).
If NIS is not enabled in NVCP, there won't be additional screen resolutions in the game menu, and the upscaling may be done by the display itself rather than the GPU (some displays have graphics processors that can do better upscaling than the GPU).
I find the image produced by DLSS much more stable in motion; this is the main reason I like it more than native + AA. No shimmer, no flicker. It's absolutely incredible. Spatial upscaling only makes things worse, and current AA is mediocre to begin with.
I'm not gonna pixel-peep screenshots at 1440p, but you go ahead at 8K.
Ah, gotcha, thanks -- my bad, I misunderstood your meaning there before.
I get what you're saying; it's just that some people don't have, or can't afford, a card that can hold a stable 60 FPS at 4K/8K, so DLSS does help out there in games, even with some detail gone.
Personally I stick to 2.5K myself; for 4K and 8K I would need too big a monitor to suit my needs, and I'm not a fan of desktop scaling beyond 100% since that brings some issues along.
Another downside with most monitors that have 4K or 8K as their native resolution is that lowering the resolution for games makes them look even worse than using DLSS; I think the only consumer monitor I've seen that fairly handles both 2.5K and 4K so far is one Corsair released recently.
Hopefully we will see more displays that handle sub-native resolutions better in the future.
HP Reverb G2 at roughly 2700x2700 (per eye). SkyrimVR with 300+ mods, ENB enabled with AMD CAS, OpenVR FSR with the Nvidia scaling. The end result is a steady 90 FPS - and yes, this looks better than running at ~2100x2100 without the Nvidia scaler. But on my 1440p 144 Hz IPS monitor I rarely bother to play around with DLSS, though I could imagine someone in a 3060 world might want to use it.