Nvidia Has a Driver Overhead Problem

Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by RealNC, Mar 15, 2021.

  1. RealNC

    RealNC Ancient Guru

    Messages:
    4,959
    Likes Received:
    3,235
    GPU:
    4070 Ti Super
    Of course they can. But they have to then accept the fact that they are shills. A reviewer shouldn't be promoting anything, unless being a shill is OK with them.
     
    BlindBison likes this.
  2. BlindBison

    BlindBison Ancient Guru

    Messages:
    2,414
    Likes Received:
    1,141
    GPU:
    RTX 3070
    Except I’m not — one can go look at the much more comprehensive tests from GN or HU in this case and see that Rocket Lake sucks balls. Meanwhile DF showed only 2 games with very minimal gains and focused on Flight Sim. Gamers Nexus and HU did not cherry pick in their test suite for 11th gen, DF did in that round table.
    On the subject of 11th gen Intel, I think we’re just gonna have to agree to disagree.

    It’s my position that 11th gen is even worse than Bulldozer; at least BD had a niche in heavily multithreaded workloads on release. What niche does 11th gen have? It doesn’t have one at all: it loses the productivity front to Ryzen and often loses gaming to 10th gen and/or Ryzen 5000 depending on the title. All Intel can do is compete on price, but even then 10th gen will perform almost identically in most cases for less money. Unless you think the Gamers Nexus tests are faked?

    11th gen is a disaster
     
    Last edited: Apr 6, 2021
    cucaulay malkin likes this.
  3. cucaulay malkin

    cucaulay malkin Ancient Guru

    Messages:
    9,236
    Likes Received:
    5,208
    GPU:
    AD102/Navi21
    Well, I don't think Bulldozer was on par with that generation's best the way RKL is with Zen 3. Isn't high refresh gaming more of a niche than buying Bulldozer for the cores was back then?
    But I agree with the disaster part, this should never have come out on 14nm.

    Performance is very good, the same as or higher than the 5800X, and the price is still lower than the 5800X's.
    The 11600KF is 80 EUR cheaper than the 5600X, and that is pretty significant for a mid-range CPU.

    But those power draw numbers, good grief, a 50-70W difference in gaming. Intel should just EOL these chips after the reviews.

     
    BlindBison likes this.
  4. BlindBison

    BlindBison Ancient Guru

    Messages:
    2,414
    Likes Received:
    1,141
    GPU:
    RTX 3070
    Unrelated, but I love your username
     

  5. cucaulay malkin

    cucaulay malkin Ancient Guru

    Messages:
    9,236
    Likes Received:
    5,208
    GPU:
    AD102/Navi21
    It was a lot funnier in my head; written down it looks lame.
     
    BlindBison likes this.
  6. dr_rus

    dr_rus Ancient Guru

    Messages:
    3,886
    Likes Received:
    1,015
    GPU:
    RTX 4090
    It doesn't lose gaming to 10th gen, that's the point. There are some games where a 10C/20T CPU fares better than an 8C/16T one, but that's to be expected. When comparing equally "cored" parts, 11th gen is generally faster than 10th, sometimes by enough to beat the Zen 3 competition, which is good since it means there's an alternative to Ryzen 5000 now.
    Gamers Nexus CPU tests are limited, but even in them there are enough cases showing that 11th gen is better than 10th. It's just not "good enough" for what is generally expected from Intel, and the timing of its launch is weird since the successor is expected this year already. Is that enough to call the lineup "crap"? Nah.
    The comparison to BD is a weird one since RKL doesn't try to compete on core count at all. The 11900K is a pointless product for sure, but it's not the only product in the family. Which is precisely the point DF was making.
     
    cucaulay malkin likes this.
  7. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,125
    Likes Received:
    969
    GPU:
    Inno3D RTX 3090
    Right, it is understood that, setting aside opinions on the quality etc. or the personal opinion we might have about NVIDIA, DLSS and RTX are actually the largest steps for consumer graphics since T&L, right? I actually don't understand how people can be "reserved" about them.

    I would rather have reviewers that are less reserved both in praise and in criticism. The prices are atrocious, and have been for some time. AMD sold a pipe cleaner that nobody should have bought and which will age terribly (Navi), and NVIDIA sold an overpriced card that wouldn't reach its potential until it was almost too late (Turing).

    Currently, NVIDIA doesn't offer enough VRAM on its mainstream offerings, which will render these cards dead much faster than their price would dictate, and AMD is using RDNA 2.0 as an architectural pipe cleaner for their first all-around well-performing architecture, which is going to be the next one.

    I would love to see more reviewers say things like that.
     
    BlindBison likes this.
  8. RealNC

    RealNC Ancient Guru

    Messages:
    4,959
    Likes Received:
    3,235
    GPU:
    4070 Ti Super
    This is a point I very much disagree with. Hardware T&L was a huge step. It suddenly allowed us to go from low framerates to a 60FPS minimum. RTX and other things NVidia has been pushing in the past (PhysX, Hairworks, etc) went the other direction. PhysX and *Works were always a kind of "stamp of crap" in my eyes. As soon as I enabled any of these features, my framerate tanked. It's the same with RTX.

    DLSS is the only good feature in my opinion. It doesn't tank your framerate. It increases it.
     
    BlindBison likes this.
  9. aufkrawall2

    aufkrawall2 Ancient Guru

    Messages:
    4,361
    Likes Received:
    1,822
    GPU:
    7800 XT Hellhound
    Yet it infests PC gaming with vendor lock-in via CUDA/NGX in D3D/Vk. As a Linux user it might also interest you that DLSS can't run in Wine due to it being a blackbox blob.
    Sadly tech media fails miserably when it comes to pointing out such things.
     
  10. RealNC

    RealNC Ancient Guru

    Messages:
    4,959
    Likes Received:
    3,235
    GPU:
    4070 Ti Super
    True, but DLSS is not going to be a requirement in any game, so it's fine.
     

  11. aufkrawall2

    aufkrawall2 Ancient Guru

    Messages:
    4,361
    Likes Received:
    1,822
    GPU:
    7800 XT Hellhound
    I really doubt that tempting developers to be lazy by relying on (not to be confused with being dependent on) 3rd party tech they can't modify themselves is a good direction for this industry.
    Not to speak of what it would look like if AMD and soon Intel did the same, causing a totally fragmented mess of features with vendor lock-ins...
     
    AsiJu likes this.
  12. kman

    kman Master Guru

    Messages:
    497
    Likes Received:
    89
    GPU:
    3080 tuf OC edition
    You reckon this is related to the driver overhead issues?
    https://www.reddit.com/r/buildapc/comments/mduotq/3080_tuf_oc_unexplained_micro_stutters_on_1440p/

    Been trying to wrap my head around why I get these micro stutters at 1440p 144Hz/165Hz but not (or greatly reduced) at 1080p 144Hz. Kinda dumbfounded.

    Been speaking with Nvidia support back and forth for 2 months, and 3 weeks ago I got directed to a level 2 tech who is supposedly trying to re-create the issue (has yet to reply back).

    The only thing left for me to do is legit get a new rig (besides the 3080) or sell my current 1440p 165Hz monitor and get a 1080p 144Hz one (where the issue is mostly gone).
     
    BlindBison likes this.
  13. EdKiefer

    EdKiefer Ancient Guru

    Messages:
    3,128
    Likes Received:
    394
    GPU:
    ASUS TUF 3060ti
    No, IMO. The driver overhead issue being talked about is one where your CPU usage is very high (high 90s to 100%), which then causes lower framerates.
    It would probably manifest more at a lower resolution, i.e. at 1080p rather than 1440p.
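    To put rough, made-up numbers on it (these are not measurements, just an illustration): whichever of the CPU or GPU takes longer per frame sets your framerate, so dropping the resolution shrinks the GPU's share and is exactly what exposes CPU/driver overhead.

        # Hypothetical per-frame costs in milliseconds -- illustrative only, not measured.
        cpu_ms = 7.0                              # game logic + driver submission work, roughly resolution independent
        gpu_ms = {"1080p": 5.0, "1440p": 9.0}     # GPU render time grows with pixel count

        for res, g in gpu_ms.items():
            frame_ms = max(cpu_ms, g)             # the slower side of the pipeline sets the pace
            bound = "CPU/driver-bound" if cpu_ms >= g else "GPU-bound"
            print(f"{res}: ~{1000 / frame_ms:.0f} fps, {bound}")

        # 1080p: ~143 fps, CPU/driver-bound
        # 1440p: ~111 fps, GPU-bound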
     
  14. BlindBison

    BlindBison Ancient Guru

    Messages:
    2,414
    Likes Received:
    1,141
    GPU:
    RTX 3070
    Well, I have to concede I was overly harsh before. My mistake; it would seem that at the lower end to midrange, 11th gen is alright.

    The 11400F and the 11600K are decent at MSRP and consistently beat their predecessors.

    I still think this gen from Intel is quite disappointing, but I was too harsh before. Apologies
     
  15. BlindBison

    BlindBison Ancient Guru

    Messages:
    2,414
    Likes Received:
    1,141
    GPU:
    RTX 3070
    One thing you could try just to totally rule out a CPU bottleneck would be to enable DSR in the control panel and see what happens when you render the game at 4K or 5K. This will lower framerates, but you’ll definitely be GPU bound. If the stutter persists then you know it has nothing to do with the CPU end (barring a busted CPU or something, at least).

    Another thing to try would be an FPS limiter at a consistently achievable value (RTSS or the driver-level limiter, for example). The RTSS overlay will tell you your current FPS and show drops, so you can try something very aggressive and then work your way up to see if it helps/stops the stutters.
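    If you want hard numbers instead of eyeballing the overlay, most capture tools (RTSS, CapFrameX, PresentMon) can export frametimes to a CSV, and a tiny script can count the spikes. This is just a sketch; the column name and the 1.5x threshold are assumptions, so adjust them to whatever your export actually contains.

        import csv

        FRAMETIME_COLUMN = "MsBetweenPresents"   # assumed column name -- exports differ, check your CSV header
        SPIKE_FACTOR = 1.5                       # flag frames that take 1.5x longer than the average

        with open("frametimes.csv", newline="") as f:
            times = [float(row[FRAMETIME_COLUMN]) for row in csv.DictReader(f)]

        avg = sum(times) / len(times)
        spikes = [(i, t) for i, t in enumerate(times) if t > avg * SPIKE_FACTOR]

        print(f"average frametime: {avg:.2f} ms (~{1000 / avg:.0f} fps)")
        print(f"frames over {SPIKE_FACTOR}x the average: {len(spikes)} of {len(times)}")

    Run it against captures taken at different limiter values and you can see whether the spike count actually drops as you raise the cap.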

    People will say that with a 9900K there’s no chance of being CPU bottlenecked at 1440p, but Digital Foundry found with a 3900X and 2080 Super they were still getting some CPU-bound microstutter at 1440p depending on the game, and you are playing at very high framerates, so ... who knows? Could also maybe be some part failure or something overheating, unsure.

    The 3080 is a better fit for native 4K, it seems to me; even at 1440p with a 3080 I’d want the best CPU I could get to ensure the GPU got the maximum utilization possible. Still, the 9900K is a very fast gaming CPU, so constant stuttering seems odd; it’s not like you’re at 1080p or something. Going off the tests I’ve seen, it does incur more of a perf hit than a 3900X would if you’re running a lot of other apps in the background, though. Are you streaming or anything?

    The other question I’d have is whether this is happening in all games or just some. DOOM Eternal basically never stuttered for me, while other games stutter tons and there’s not much one can do about it, in my experience.

    The other thing could be the VSync implementation you’re using. Unless you’re using G-Sync/FreeSync, VSync on can look really juddery/stuttery when drops occur (since frames will persist for varying amounts of time). Fast Sync will look really stuttery basically all the time due to the nature of how it works (see the Digital Foundry video on VSync types for more on this).
     
    Last edited: Apr 11, 2021

  16. kman

    kman Master Guru

    Messages:
    497
    Likes Received:
    89
    GPU:
    3080 tuf OC edition
    From what I’ve noticed, they start to happen once GPU usage hits the 80% mark at 1440p. WoW seems to be the least affected because GPU usage never goes above 60%.


    there's more info in the video description.

    I did try to limit the fps to 120 with RivaTuner at 1440p and the micro stutters were gone in Warhammer. I also tried 144 and they were reduced but still there. The hell? I could have 144 fps at 1080p with no issues tho.

    It happens in most games that push the GPU over 80%. The random unexplained micro stutter, I mean.

    Besides Discord/Steam and occasionally Chrome, nothing is running (I also tried with nothing in the background and it made no difference).

    Also, no parts are overheating. I've checked with HWiNFO/MSI Afterburner.

    Think I'm just gonna sell my 1440p monitor and get a 1080p 144Hz monitor.
     
    Last edited: Apr 11, 2021
  17. RealNC

    RealNC Ancient Guru

    Messages:
    4,959
    Likes Received:
    3,235
    GPU:
    4070 Ti Super
    This is reproducible 100% of the time here, but only for games that either don't support fullscreen mode (almost every Unity engine game), or when I use borderless instead of fullscreen in games that do otherwise support true fullscreen.
     
    BlindBison likes this.
  18. kman

    kman Master Guru

    Messages:
    497
    Likes Received:
    89
    GPU:
    3080 tuf OC edition
    I tend to play all my games in fullscreen tho, and most of the games I play don't use Unity. :/ Also, why would it go away simply by playing at a lower resolution of 1080p 144Hz or by putting an fps limit of 120 on 1440p?

    Hmm... the only reason I could see a 120 fps limit on 1440p working is because it lowers GPU usage in general. That extra 20 fps difference is enough of a GPU usage difference to be the difference between micro stutters and no micro stutters.
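    Rough math seems to back that up (the ~6.9 ms render time below is just an assumed number, not something I measured):

        # Back-of-envelope: how much headroom an fps cap gives the GPU at 1440p.
        render_ms_1440p = 6.9                # assumed GPU render time per 1440p frame (hypothetical)

        for cap in (144, 120):
            budget_ms = 1000 / cap           # time available per frame at this cap
            usage = render_ms_1440p / budget_ms * 100
            print(f"{cap} fps cap: {budget_ms:.2f} ms budget -> ~{usage:.0f}% GPU usage")

        # 144 fps cap: 6.94 ms budget -> ~99% GPU usage
        # 120 fps cap: 8.33 ms budget -> ~83% GPU usage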
     
  19. BlindBison

    BlindBison Ancient Guru

    Messages:
    2,414
    Likes Received:
    1,141
    GPU:
    RTX 3070
    Are you running V-Sync ON or OFF? If it's on, what kind? (traditional double buffering/linear triple buffering/adaptive V-Sync/Fast Sync). Or do you use G-Sync/FreeSync?

    If the microstuttering occurs only when the GPU is being taxed 80%+ and goes away when the GPU is NOT being taxed (sub 80%) then that sounds wonky as hell/I'd worry if there's a defective component or something.

    Low GPU utilization would normally indicate that you're running up against some other limitation beyond the GPU, i.e. that the CPU can't keep the GPU fed well enough for the GPU to be the limiting component, which usually isn't great. Digital Foundry talked about this in their 3900X and 2080 Super reviews, where they discussed how being bound by the CPU can cause you to run into some part of the simulation and get microstutters; the GPU basically "stalls" since it has nothing to work on while it waits on the CPU to feed it more instructions.

    Are you using any settings like Ultra Low Latency or modifying the default pre-render queue? If you're CPU bound/your GPU utilization is low, then I'd recommend leaving those at the default, since they help keep the GPU fed with stuff to do if the CPU can't keep up periodically.

    I'll repeat my recommendation that you should try using Nvidia DSR in the control panel as a test -- try downsampling the game (something like Apex perhaps -- RTS games are notorious for thrashing the crap out of CPUs from what I gather) from something ludicrous like 5K, then check your GPU usage -- it should be at 99-100% ish then. Do the microstutters stop? Or do they get worse/only happen when the GPU is taxed? (that would be weird). Of course overall framerates will be lower doing something like DSR, I'm just trying to rule out a defective GPU or something.
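    If you'd rather log it than watch an overlay, something like this dumps GPU utilization once a second while you play, so you can line the timestamps up with when the stutters hit. It assumes nvidia-smi is on your PATH and that these query fields exist on your driver version (run nvidia-smi --help-query-gpu to check).

        import subprocess

        # Assumed nvidia-smi query fields -- verify with `nvidia-smi --help-query-gpu` on your system.
        cmd = [
            "nvidia-smi",
            "--query-gpu=timestamp,utilization.gpu,clocks.sm,power.draw,temperature.gpu",
            "--format=csv",
            "-l", "1",                       # repeat the query every second
        ]

        with open("gpu_log.csv", "w") as log:
            subprocess.run(cmd, stdout=log)  # Ctrl+C to stop logging, then compare timestamps to stutter moments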
     
    Last edited: Apr 11, 2021
  20. BlindBison

    BlindBison Ancient Guru

    Messages:
    2,414
    Likes Received:
    1,141
    GPU:
    RTX 3070
    What the heck? Is this a known issue then? Lord do I wish games would just all have an exclusive fullscreen mode.
     
