Why isn't Nvidia's CAS Sharpen a separate setting from NIS in the Control Panel?

Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by BlindBison, Feb 16, 2022.

  1. cucaulay malkin

    cucaulay malkin Ancient Guru

    Messages:
    9,236
    Likes Received:
    5,208
    GPU:
    AD102/Navi21
    numbers and pictures please
     
  2. Mineria

    Mineria Ancient Guru

    Messages:
    5,540
    Likes Received:
    701
    GPU:
    Asus RTX 3080 Ti
    Did you read this?
    The key point is that it's globally enabled: disable it and you have your normal performance, enable it and set the fitting resolution in-game when needed, then disable it when done.
    To put it nicely, that implementation isn't user-friendly at all.

    Astyanax posted a workaround that reverts to the old scaling:
    Code:
    Windows Registry Editor Version 5.00
    
    [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\nvlddmkm\FTS]
    "EnableGR535"=dword:00000000
    @ManuelG
    With these issues, why does Nvidia's development team hold back on giving us the option to choose between the two methods?
    I mean, both options are there, so why not let people choose one over the other?
    It would be for the best even if they can't figure out how to assign them per game profile, you get what I'm saying?
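    For anyone who'd rather not hand-edit the registry, here's a rough Python sketch (my own, not part of Astyanax's post) that writes the same EnableGR535 value through winreg. It assumes an elevated prompt, and a reboot or driver restart is presumably needed before it takes effect; setting the value back to 1 (or deleting it) should restore the new behaviour.
    Code:
    # Minimal sketch of the workaround above, done via Python's winreg instead of
    # importing a .reg file. Run from an elevated (administrator) prompt.
    import winreg

    FTS_KEY = r"SYSTEM\CurrentControlSet\Services\nvlddmkm\FTS"

    def set_enable_gr535(value: int) -> None:
        # Create the FTS subkey if it doesn't exist yet, then write the DWORD.
        with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, FTS_KEY, 0,
                                winreg.KEY_SET_VALUE) as key:
            winreg.SetValueEx(key, "EnableGR535", 0, winreg.REG_DWORD, value)

    if __name__ == "__main__":
        set_enable_gr535(0)  # 0 = revert to the old scaling path, per the workaround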
     
  3. RealNC

    RealNC Ancient Guru

    Messages:
    5,090
    Likes Received:
    3,374
    GPU:
    4070 Ti Super
  4. RealNC

    RealNC Ancient Guru

    Messages:
    5,090
    Likes Received:
    3,374
    GPU:
    4070 Ti Super
    Yes, I saw it. Still doesn't excuse nvidia making these GPU generations slower through a driver update. Who the heck is supposed to know about undocumented, arcane registry tweaks like that? Most people will instead consider buying a new GPU. Make your GPU slower so you go buy a new one.

    People with older GPUs depend on sharpening for modern games to get good frame rates with lower resolutions and not have the image look like a blurry mess. Let's make that use case slower then, shall we?

    How convenient for nvidia.
     
    Smough likes this.

  5. Mineria

    Mineria Ancient Guru

    Messages:
    5,540
    Likes Received:
    701
    GPU:
    Asus RTX 3080 Ti
    Let's get one thing straight, the driver does NOT make your older GPU slower.
    Enabling scaling on a global level should make you think twice as well.

    But as I tried to point out to Nvidia's community manager above, in the hope that he forwards it to development: the way it is implemented, with no option to toggle between the two, is far from ideal.
    In the long run, bad decisions can harm future sales; I'm not saying the alternatives are any better, but they are still there.
    So no, it's not so convenient for Nvidia.
     
  6. NoLikes

    NoLikes Member Guru

    Messages:
    163
    Likes Received:
    58
    GPU:
    1080TI
    There's a chance I'm witnessing something along these lines.
    The issue I'm seeing with the 980TI roughly translates to a loss of CPU performance on the desktop and a chunk of score reduction in the CPU-Z benchmark and other applications.
    The 2080TI is not affected by this unswitchable CPU performance hit past the 472.12 driver.
    The EnableGR535 setting doesn't revert the CPU performance hit on the desktop.
     
  7. Mineria

    Mineria Ancient Guru

    Messages:
    5,540
    Likes Received:
    701
    GPU:
    Asus RTX 3080 Ti
    So you are saying that there is a CPU performance difference if you switch between 472.12 and 511.79 today on the same Windows installation with the exact same patches/updates?
     
  8. NoLikes

    NoLikes Member Guru

    Messages:
    163
    Likes Received:
    58
    GPU:
    1080TI
    Nope, the underlying hardware is different, same OS version. The cards cannot be swapped due to the water cooling setup.

    EDIT:
    By just switching the driver between 472.12 and 511.79 right now on the same Windows installation (currently 19044.1526) on the 980TI system, there is a CPU performance hit!
    The performance hit seems to appear from 496.13 onward.
    Reverting to an older driver removes the CPU performance loss!
     
    Last edited: Feb 20, 2022
    Smough likes this.
  9. Mineria

    Mineria Ancient Guru

    Messages:
    5,540
    Likes Received:
    701
    GPU:
    Asus RTX 3080 Ti
    ? I meant only the 980Ti system, where the only thing changed between benchmark runs is the driver, nothing else.
    Aside from that, how many runs did you do to rule out margin of error, and how large a difference are we talking about?
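    For context, here's a rough sketch (with placeholder numbers, not actual measurements) of how I'd judge whether a drop like that is outside run-to-run noise: collect several CPU-Z multi-thread scores per driver and compare the gap against the spread.
    Code:
    from statistics import mean, stdev

    # Placeholder scores only -- replace with your own CPU-Z multi-thread results.
    old_driver_scores = [2281, 2279, 2283, 2280, 2278]   # e.g. runs on 472.12
    new_driver_scores = [2241, 2238, 2244, 2240, 2236]   # e.g. runs on 511.79

    def summarize(label, scores):
        m, s = mean(scores), stdev(scores)
        print(f"{label}: mean={m:.1f}, stdev={s:.1f}, n={len(scores)}")
        return m, s

    m_old, _ = summarize("old driver", old_driver_scores)
    m_new, _ = summarize("new driver", new_driver_scores)

    # A gap several times larger than either stdev is unlikely to be noise.
    drop = m_old - m_new
    print(f"drop: {drop:.1f} points ({100 * drop / m_old:.1f}%)")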
     
  10. NoLikes

    NoLikes Member Guru

    Messages:
    163
    Likes Received:
    58
    GPU:
    1080TI

    There is a system-wide CPU performance hit on the 980TI from changing just the driver to anything past 472.12.
    The CPU performance loss is noticeable by eye and realistically reproducible with anything that runs on the CPU.
    The loss on the 980TI system with anything newer than 472 is almost never less than 40 points on multi-core; to give you a sense of the metric, having Afterburner enabled on that system reduces the score by about 10.
    Sometimes, past 472, on those CPU-Z runs I saw the score dip as low as with the card actually disabled in Device Manager.
    The system is very stable at around 460/2280 pretty much every run with 472 and older drivers.
    On the 2080TI I did just 3 CPU-Z runs after the system settled, before reverting to a previous system image anyway.
     
    Last edited: Feb 19, 2022
    Smough likes this.

  11. Mineria

    Mineria Ancient Guru

    Messages:
    5,540
    Likes Received:
    701
    GPU:
    Asus RTX 3080 Ti
    40 is quite a noticeable drop with 2280 as the multi-core baseline score (close to 2%), is that with an i7 4770/4790K?
     
  12. BlindBison

    BlindBison Ancient Guru

    Messages:
    2,419
    Likes Received:
    1,146
    GPU:
    RTX 3070
    If true, that sucks :( I have no idea what's possible on Nvidia's end (I would "hope" changes like what we're suggesting would be possible), but I appreciate you providing your rationale in any case.
     
  13. BlindBison

    BlindBison Ancient Guru

    Messages:
    2,419
    Likes Received:
    1,146
    GPU:
    RTX 3070
    I really hope what you're saying isn't true, since it sucks running up against CPU limits and getting stuttering as a result. If this video is to be believed:



    And, assuming I've understood it right, AMD seems to be using a hardware scheduler and typically gets superior CPU-side performance in Vulkan/DX12 games right now, while Nvidia uses a software scheduler and does worse / has a higher CPU overhead in DX12/Vulkan titles (while having a better driver for DX11 ones). Again, I'm not an expert, so correct me if I'm wrong, but Hardware Unboxed was talking about this some time ago when their CPU benchmarks came out painting Nvidia in a bad light.
     
    Smough likes this.
  14. dr_rus

    dr_rus Ancient Guru

    Messages:
    3,930
    Likes Received:
    1,044
    GPU:
    RTX 4090
    All of this is pure BS on part of HUB.
     
  15. BlindBison

    BlindBison Ancient Guru

    Messages:
    2,419
    Likes Received:
    1,146
    GPU:
    RTX 3070
    That's a relief to me as I have an Nvidia GPU, but how do we know that? At least at the time of the video, their testing did show Vulkan/DX12 games with worse CPU performance on Nvidia cards, and if we assume the video I linked above is accurate (perhaps it's not), then I could see why, if AMD uses a hardware scheduler while Nvidia only has a software one. My understanding of this is pretty limited though, so I could definitely be wrong/missing something.
     

  16. NoLikes

    NoLikes Member Guru

    Messages:
    163
    Likes Received:
    58
    GPU:
    1080TI
    Considering it's a semi-permanent reduction of the CPU's capability without any user-facing toggle or explanation, something is definitely off. I agree.
    It's a Skylake 6700 system, by the way, and I don't know whether this issue is related to the new NIS or to something else changed or added after 472.12 and up to 496.13.

    In the only game I've tested, with a simple run the performance impact was negligible and within the error margin; even the CPU scored on the low side of the error margin in percentile total time.
    So hopefully it's an overlooked bug in the driver, since it's observable on the desktop.
     
    Last edited: Feb 19, 2022
  17. Smough

    Smough Master Guru

    Messages:
    984
    Likes Received:
    303
    GPU:
    GTX 1660
    If we can disable sharpening from the NVCP with a registry edit and use something like ReShade while leaving the NIS upscaler on, I think this could solve the performance drop, because apparently the problem is that NIS sharpening is still active even at 0%. Or just enable the old sharpening but keep NIS on; that could work, but I don't think it can be done.
     
  18. Smough

    Smough Master Guru

    Messages:
    984
    Likes Received:
    303
    GPU:
    GTX 1660
    Means Nvidia's CPU driver overhead problem is even worse now. Amazing. I also had a similar problem with a GTX 1060: the newer the driver, the more my CPU would suffer. This was with all the "mitigations" disabled (Spectre/Meltdown via InSpectre, plus the registry keys to remove them, so any CPU slowdown from those is taken out of the question). The only drivers that worked normally were 446.14 and earlier; anything after that would give me a CPU hit for no reason (more usage in games, 15 to 20% for no reason), and even after 2 days on any of those drivers the issue remained, so it's not some sort of cache being built up.

    Obviously, the newer your CPU, the less noticeable this is; how convenient, don't you think? At first, a lot of us never believed the Spectre/Meltdown fiasco, so we got our performance back by disabling the "security" stuff that gimps the CPU, and we were happy with older but good CPUs. Now we have Nvidia drivers that make those CPUs work much harder for no logical reason (which means, in a way, we are paying a CPU performance penalty) and probably pushing people to upgrade. If this isn't a possible example of planned obsolescence, then I don't know what is.
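    For reference, as far as I know these are the documented registry values that InSpectre flips under the hood (Microsoft's FeatureSettingsOverride mechanism); a rough Python sketch that writes them is below. It needs admin rights plus a reboot, re-enabling means deleting both values again, and disabling the mitigations is of course a security trade-off you take at your own risk.
    Code:
    import winreg

    # Widely documented values that disable the Spectre v2 / Meltdown kernel
    # mitigations (to my knowledge, the same switch InSpectre toggles).
    # Run elevated; a reboot is required; delete both values to restore defaults.
    MM_KEY = (r"SYSTEM\CurrentControlSet\Control"
              r"\Session Manager\Memory Management")

    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, MM_KEY, 0,
                        winreg.KEY_SET_VALUE) as key:
        winreg.SetValueEx(key, "FeatureSettingsOverride", 0, winreg.REG_DWORD, 3)
        winreg.SetValueEx(key, "FeatureSettingsOverrideMask", 0, winreg.REG_DWORD, 3)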
     
    BlindBison likes this.
  19. NoLikes

    NoLikes Member Guru

    Messages:
    163
    Likes Received:
    58
    GPU:
    1080TI
    Around 457 Nvidia did change something about the rendering pipeline. Anyway, this being a device driver, the performance changes on the desktop are the giveaway that something is wrong with it.
    To me, since it comes without any feature enhancement, until it's fixed it's a direct sign on my 980TI that the driver isn't safe to use.
    For all-around performance on the working driver with the 980TI I'll just stick with ReShade 4.8.2.931 and FidelityFX, take advantage of FSR if the application supports it, and tweak the profile with NVprofileinspector, like it has always been. I wasn't going to use the DCH versions on those systems anyway, so not a big deal.
     
    Last edited: Feb 20, 2022
  20. BlindBison

    BlindBison Ancient Guru

    Messages:
    2,419
    Likes Received:
    1,146
    GPU:
    RTX 3070
    Is there a guide you used to disable those CPU-performance-ruining security patches (Spectre/Meltdown, etc.)? EDIT: never mind, found one Googling around, will try it later: https://www.howtogeek.com/339559/ho...pectre-patches-from-slowing-down-your-pc/amp/

    People seem to hate on HUB (perhaps deservedly, I really don't know) for bringing the Nvidia CPU overhead issue to light, but it does seem to be a real problem going off the tests I've seen.
     
