Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by RealNC, Mar 15, 2021.
That's DLSS 1.0, which was quite bad. With 2.0 they admitted it's a big improvement.
Nvidia Freestyle was a thing before DLSS was released and included sharpening, which was never a substitute for image reconstruction.
Games had their own sharpening sliders even before that.
Hm. "Navi's secret weapon for combating nvidia." So if you don't have a Navi GPU, you can't use a sharpening shader.
Right. That channel makes some outright ridiculous statements, it seems to me.
they must be shrooming if they think people are gonna drop resolution and sharpen the crap out of the image then call it as good as native.
Granted, DLSS 1.0 was bad, but their proposition is ridiculous. I personally avoid sharpening or use as little as possible; it ruins the image quality. You're also not getting back any of the resolution you're dropping, so it's not like DLSS, which tries to reconstruct it with varying degrees of success. And you're still gonna have to use crap TAA for AA, whereas DLSS handles that better.
It bugs the crap out of me that the Nvidia profile for Division 2 has sharpening enabled by default. It doesn't look better or improve performance over using the game's native sharpening slider at native resolution with the game's resolution scaler. One thing that does seem slightly better is Nvidia's AF versus the game's: Nvidia's 4x AF looks as nice as the game's 16x AF with less of an FPS hit.
Could be Susan's doing as well. YT is authoritarian IngSoc crap nowadays.
Happens to me too. Your comment probably got autodeleted.
To be fair, those DLSS videos were done a long time ago for the initial version of DLSS, which frankly wasn't ready and wasn't good enough. In its original state, DLSS deserved much of the flak it received, it seems to me.
Later they walked it back when DLSS 2.0 came out, I recall, and now they've reversed their stance since DLSS has improved dramatically.
As for the Nvidia sharpening filter, I tend to agree that they should've done more thorough testing like they did for AMD's CAS sharpening. As it is, their video on Nvidia's more or less just says "eh, looks about the same to me as AMD's" and leaves it at that, without covering any of the negative side effects or noting any differences.
For example, I’ve noticed that the Nvidia Control Panel sharpening makes aliasing look much worse/more noticeable even at modest levels, whereas in the cases where I’ve seen AMD’s CAS sharpening implemented at the game level, such as RAGE 2 or The Medium, the results have looked noticeably better to my eye. Though perhaps that’s because it was implemented at the game level? I’m unable to test AMD’s control-panel forced version as I have an Nvidia GPU.
So, back to the subject.
Is anyone testing Threaded Optimization?
Yes, I've started enabling it; it makes Ghost Recon Wildlands run a bit smoother.
I don't know how they can make three videos on this subject without testing this setting. Actual pain.
Wasn't Threaded Optimization for OpenGL only?
As far as I know it works in all APIs.
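For what it's worth, on Linux the OpenGL side of this setting is exposed as an environment variable (`__GL_THREADED_OPTIMIZATIONS`, documented in NVIDIA's Linux driver README), which makes per-application A/B testing easy; `glxgears` below is just a stand-in for any OpenGL game:

```shell
# Launch one OpenGL app with the NVIDIA driver's threaded optimizations
# forced on for that process only; leaving the variable unset keeps the
# driver's default behaviour.
__GL_THREADED_OPTIMIZATIONS=1 glxgears
```

Whether the Windows control panel toggle reaches the other APIs is harder to verify from the docs, but this at least lets you isolate the OpenGL effect.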
I could be missing something, but it was my understanding that the whole point of AMD’s CAS Sharpen was that it was a “better” Sharpening solution than other available options. HU did compare in their videos to some older Sharpening solutions iirc (such as vs Nvidia’s older sharpening option) and they found the newer CAS Sharpen looked a lot better by comparison with fewer visual artifacts.
Given that was the case I figured that the ReShade solution would probably be inferior overall, no? Or did ReShade quickly incorporate AMD’s own approach?
Sadly I’ve yet to see it tested. I betcha some outlet will get around to it eventually though, eh?
I know it affects game performance; I did a rudimentary "test" with it in HZD, in which I actually got higher highs and lows with it on, but I don't know what that means, and I have neither the equipment nor the time to check it.
I also know it's been an item for tweaking for years now, especially in the simulation communities.
HZD can be tricky when it comes to CPU tests due to it possibly compiling/validating shaders in the background. I suggest trying e.g. Hitman 2/3 or Shadow of the Tomb Raider.
I can't be sure, but I think AMD's solution is able to apply the sharpening shader after the GPU upscaling step, while nvidia has to sharpen before the GPU upscale step? If that's true, then it wouldn't be a surprise that AMD has the better results.
ReShade can only sharpen the original frame of the game. Meaning pre-upscale. So even if you use the ReShade port of CAS, it is applied before the GPU upscales the frame.
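A toy 1D sketch of why that ordering matters (pure Python, just an unsharp mask and a linear 2x upscale, nothing like the actual driver shaders): when the sharpen pass runs before the upscale, the halo it adds around each edge gets stretched by the upscale as well, so it spans more output pixels than a halo added after the upscale.

```python
# Toy illustration: sharpen-then-upscale vs. upscale-then-sharpen.
# All values and filters are made up for the example.

def sharpen(sig, amount=0.5):
    """3-tap unsharp mask: push each sample away from its local average."""
    n = len(sig)
    out = []
    for i in range(n):
        blur = (sig[max(i - 1, 0)] + sig[i] + sig[min(i + 1, n - 1)]) / 3.0
        out.append(sig[i] + amount * (sig[i] - blur))
    return out

def upscale2x(sig):
    """Linear 2x upscale: keep each sample and insert midpoints."""
    out = []
    for a, b in zip(sig, sig[1:]):
        out.extend([a, (a + b) / 2.0])
    out.append(sig[-1])
    return out

def halo_samples(sig):
    """Count samples pushed outside the original [0, 1] range (ringing)."""
    return sum(1 for v in sig if v < 0.0 or v > 1.0)

edge = [0.0, 0.0, 0.0, 1.0, 1.0, 1.0]  # one hard edge
pre = upscale2x(sharpen(edge))          # ReShade-style: sharpen, then upscale
post = sharpen(upscale2x(edge))         # sharpen applied at output resolution

print(halo_samples(pre), halo_samples(post))  # prints: 4 2
```

In this toy case the pre-upscale pipeline leaves twice as many ringing samples around the edge, because the upscale interpolates through the overshoot the sharpener created.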
Here's a comparison of Threaded Optimization in an old single-threaded game, C&C Generals Zero Hour (DX8 running through a DX9 wrapper).
I like their monitor reviews, which are done by the second guy, but the first guy seems a bit pro-AMD. If you look at their historical content, he has tended to bash Intel CPUs more than AMD's, and what I find odd about these Nvidia overhead videos is that instead of treating both vendors equally, he seems to be looking specifically for Nvidia problems.
Part 2 still hasn't tested any DX9 games; it's still dominated by DX12 titles and newer multi-core-CPU-friendly DX11 games, which are known to minimise Nvidia's past benefits. It's starting to feel like he's on a mission.
If they are deleting comments that simply provide technical info that makes the video look bad, then he is going down a bad path and needs to rethink what he is doing here.
Yes, let's artificially cut down a completely different architecture from the G4560, compare it to an FX-8350, show no charts of your own, and call it a day.