EVGA RTX 2070 Super Stuttering Issues when using Optimal Power Mode

Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by mr1hm, Apr 12, 2020.

  1. mr1hm

    mr1hm Active Member

    Messages:
    97
    Likes Received:
    2
    GPU:
    2070 Super XC Ultra
    Hello,

    I picked up an open box EVGA RTX 2070 Super Black yesterday from my local shop and I'm getting a lot of weird FPS drops while playing Rust. I haven't been able to check out other games yet.

    If I set my Power Management Mode to "Prefer Maximum Performance", the issue seems to go away, but this doesn't allow the GPU to downclock itself when idle or when the extra power isn't necessary (is this normal? I remember my GTX 970 downclocking even when it was set to "Prefer Maximum Performance" and I was just on the desktop).

    Here's my GPU Usage and Clock Speeds when running Rust today with power management mode set to "Optimal Power":

    https://ibb.co/MV4gHz1

    What do you guys think? If this doesn't seem normal, I'd like to return it ASAP and pick up a brand new one instead.

    My PC:
    - CPU: i7 3770K @ 4.4 GHz
    - MOBO: Asus Maximus V Extreme
    - RAM: 2x8 GB 1600 MHz CL11
    - PSU: Cooler Master V700

    EDIT: I'm currently using 442.59 WHQL
     
    Last edited: Apr 12, 2020
  2. janos666

    janos666 Ancient Guru

    Messages:
    1,653
    Likes Received:
    407
    GPU:
    MSI RTX3080 10Gb
    mr1hm likes this.
  3. mr1hm

    mr1hm Active Member

    Messages:
    97
    Likes Received:
    2
    GPU:
    2070 Super XC Ultra
    Thanks for the quick response. I read through the post, and setting "Prefer Maximum Performance" on a per-game basis seems to work well. Clock speeds drop back down properly after the game process has ended.

    It's still a bit concerning, though, considering some other people seem to be running games fine without having to set the specific game to "Prefer Maximum Performance". Do you personally have a similar problem as well?
     
  4. dezo

    dezo Member Guru

    Messages:
    196
    Likes Received:
    128
    GPU:
    RTX 4090
    In my experience, Optimal mode is the worst for games. I play mostly older games and they were full of weird FPS drops, stuttering and other anomalies in this mode. For years I used global Adaptive mode, but the last few drivers produced some hiccups on my setup by constantly changing clocks in less demanding games. So I switched to Max Performance mode per game and left global at the default (Optimal). Everything is now smooth as butter and downclocking on the desktop works fine. The drawback is maybe slightly more power consumption in games that don't need it, but whatever - this is a desktop PC, not some weak-ass overheating notebook.
     
    bobblunderton, Smough and mr1hm like this.

  5. mr1hm

    mr1hm Active Member

    Messages:
    97
    Likes Received:
    2
    GPU:
    2070 Super XC Ultra
    Hmm, that's pretty much the conclusion I'm coming to... Do you ever have times when your RTX 2080s doesn't boost the core clock to its max speed?

    The EVGA RTX 2070 Super's advertised boost clock speed is 1770 MHz, but it never hits that. I tested in Rust, Overwatch (not too demanding anyway), and Metro Exodus (on High settings + Advanced PhysX and HairWorks [1920x1080]).

    I felt like Metro Exodus should've definitely used the full potential of the GPU, but it didn't... It stayed solid at 1605 MHz instead. I can't tell if this is normal or if something is wrong.
     
  6. dezo

    dezo Member Guru

    Messages:
    196
    Likes Received:
    128
    GPU:
    RTX 4090
    It depends on GPU utilization - mine boosts in the 1650-2050 MHz range. Less demanding games with 30% or lower utilization usually stay around 1650 MHz the whole time; modern ones with complex 3D engines (like Dying Light, Doom Eternal, etc.) shoot up to 2050 and then settle around 1950 MHz at 50-90% utilization. It also depends on temperature, I guess. I don't use a manual overclock; things may change with one.
     
  7. janos666

    janos666 Ancient Guru

    Messages:
    1,653
    Likes Received:
    407
    GPU:
    MSI RTX3080 10Gb
    Max boost is rarely achieved. There is usually insufficient workload for turbo clocks (low GPU utilization usually results in a stable base clock, potentially lower, with no turbo at all), or some other limiting factor kicks in, like temperature (GPU and/or VRM), power draw (usually the advertised TDP limit by default), or potentially something else (whatever sensor inputs and safety limits the boost algorithm might use). These cards can easily hit their default TDP limit even without any frequency overclock, and they almost certainly start hitting it if you manually increase the voltage limit. You can even run out of the maximum allowed power limit (manually increased) without any manual voltage increase (so the maximum allowed power draw is rarely enough for a significant voltage increase). And the boost algorithm is very sensitive to temperature; it can skip the last few frequency steps even with full-cover water blocks (it might start losing some "bins" around 50 °C or so).
    With that said, strangely enough, I often observe higher turbo clocks than whatever GPU-Z reports or even what's advertised on the manufacturer's site.
    This is more of a hit-and-miss (or, in the case of manual OC, trial-and-error) thing.

    Try increasing your TDP limit (literally to the maximum allowed), undo any voltage limit increase (if present), and see what happens. Keep monitoring your power draw (as a percentage relative to the set TDP).
    Now try to over-cool your GPU: set the fans to a fixed 100% manually for a test and see what happens with the clocks (keep the TDP at max).
    (Don't use ray tracing or DLSS for now. I have no experience with how those parts of the GPU affect the total power draw.)
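    A minimal monitoring sketch along those lines - illustrative only, not something posted in the thread - assuming the nvidia-ml-py (pynvml) Python package and the GPU at index 0. It prints the core clock, power draw relative to the currently enforced power limit, temperature and utilization once per second while the game runs:

[CODE]
# Sketch: log core clock, power draw vs. enforced power limit, temperature and
# GPU utilization once per second. Assumes the nvidia-ml-py (pynvml) package
# and a single NVIDIA GPU at index 0. Stop with Ctrl+C.
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

try:
    while True:
        clock_mhz = pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_GRAPHICS)
        power_w = pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000.0          # mW -> W
        limit_w = pynvml.nvmlDeviceGetEnforcedPowerLimit(gpu) / 1000.0  # mW -> W
        temp_c = pynvml.nvmlDeviceGetTemperature(gpu, pynvml.NVML_TEMPERATURE_GPU)
        util = pynvml.nvmlDeviceGetUtilizationRates(gpu).gpu
        print(f"{clock_mhz:4d} MHz | {power_w:5.1f}/{limit_w:5.1f} W "
              f"({100 * power_w / limit_w:4.1f}% of limit) | {temp_c} C | {util}% util")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
[/CODE]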
     
    Last edited: Apr 12, 2020
    mr1hm likes this.
  8. christian bouchard

    christian bouchard Guest

    Messages:
    1
    Likes Received:
    2
    GPU:
    RTX 2070 SUPER
    It's easy to see you're probably suffering from a BIG CPU bottleneck.
     
    BlindBison and mr1hm like this.
  9. mbk1969

    mbk1969 Ancient Guru

    Messages:
    15,605
    Likes Received:
    13,614
    GPU:
    GF RTX 4070
    Set "Adaptive" power mode globally, and then set "Prefer Performance" mode per game. Solved.
     
    evgenim, Smough and mr1hm like this.
  10. mr1hm

    mr1hm Active Member

    Messages:
    97
    Likes Received:
    2
    GPU:
    2070 Super XC Ultra
    Wow, 50C? That seems so low. I increased my power limit yesterday in MSI Afterburner to the max of 111%, and since the temp limit was linked to it, that was automatically set to 88C (from the default 83C when the power limit is set to 100%). I gave it +15 on the core clock and tested in Metro Exodus. The card was still solid @ 1605 MHz throughout the 30 min I played the game. The game was set to the "High" preset and I turned on Advanced PhysX and HairWorks (1920x1080).

    Metro Exodus is the only game I currently have that's pushing the GPU's temperature above 60C. The GPU hit a maximum of 63C in the 30min that I was able to play it.

    I made sure GPU-Z was showing the correct boost clock speed of 1785 MHz (since I did +15 on the core clock). Does this mean that the GPU thinks it still doesn't need to boost the core clock?

    I agree that in certain games my CPU will be the limiting factor, but I just don't have the means to upgrade my CPU at the moment. I saw my CPU being the bottleneck in Call of Duty: Warzone (especially at the start with 200 players) @ 1920x1080 on High/Normal settings with Resolution Scale set to 110%.

    Yep, that did the trick. I still don't see why my core clock would be stuck at a solid 1605 MHz though... My power limit and temp limit were maxed out and my temperatures never went over 63C.


    EDIT: I'm gonna slowly bring my CPU back to stock settings and see if that changes anything.

    EDIT #2: Not seeing a difference with my CPU at stock settings. Also ran Metro Exodus on Ultra settings with Advanced PhysX and HairWorks (left ray tracing and DLSS off). That seemed to put more stress on the card, since I noticed FPS drops here and there, but the core clock still seems to be stuck at 1605 MHz.
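    A small sketch for checking which limiter the driver itself reports while the clock sits at 1605 MHz - again illustrative, assuming the nvidia-ml-py (pynvml) package and GPU index 0; run it while the game is loaded. The labels are just friendly names for the standard NVML throttle-reason bits:

[CODE]
# Sketch: ask the driver which limiter GPU Boost currently reports (power cap,
# thermal, idle/low utilization, ...). Assumes pynvml and the GPU at index 0.
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

reasons = {
    pynvml.nvmlClocksThrottleReasonGpuIdle: "idle / low utilization",
    pynvml.nvmlClocksThrottleReasonSwPowerCap: "software power cap (power limit)",
    pynvml.nvmlClocksThrottleReasonHwSlowdown: "hardware slowdown",
    pynvml.nvmlClocksThrottleReasonSwThermalSlowdown: "software thermal slowdown",
    pynvml.nvmlClocksThrottleReasonHwThermalSlowdown: "hardware thermal slowdown",
    pynvml.nvmlClocksThrottleReasonApplicationsClocksSetting: "application clocks setting",
}

# The call returns a bitmask; any set bit is a limiter currently in effect.
mask = pynvml.nvmlDeviceGetCurrentClocksThrottleReasons(gpu)
active = [label for bit, label in reasons.items() if mask & bit]
print("Active limiters:", ", ".join(active) if active else "none reported")
pynvml.nvmlShutdown()
[/CODE]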
     
    Last edited: Apr 13, 2020

  11. HeavyHemi

    HeavyHemi Guest

    Messages:
    6,952
    Likes Received:
    960
    GPU:
    GTX1080Ti

    The power limit and temp limit raise what is allowed. By themselves, they will not increase your clocks. They can decrease them if you have them set too low; some folks do that intentionally for folding. But, as was mentioned, Adaptive globally and Max Performance set per profile is generally accepted as best practice.
     
  12. The Goose

    The Goose Ancient Guru

    Messages:
    3,057
    Likes Received:
    375
    GPU:
    MSI Rtx3080 SuprimX
    Have you tried adding your games under system/settings/graphics settings and setting the game profile to performance mode?
     
  13. loracle

    loracle Master Guru

    Messages:
    442
    Likes Received:
    212
    GPU:
    GTX 1660 super
    First, EVGA is a shitty brand; better change to Gigabyte or Asus. Then, just leave the 3770K at default clocks; overclocking it to 4.4 GHz is not a good idea and the FPS gain is minimal. Also, don't mess with the driver power management; it's way better to leave it at the default. In general, overclocking is just a placebo: maybe you can get some 3-5 FPS or a little bit more, but you will experience a lot of problems like FPS drops, throttling and so on (don't listen to people who push you to overclock). And Rust is not a very demanding game, so the first thing to do is change your GPU brand, since maybe it's faulty. Your rig can run all demanding games on ultra without problems; maybe you need to add some RAM, and verify the power supply. That's all.
    NB: and I don't think it's a CPU bottleneck, because a 3770K with a good fan is still good for all games.
     
  14. janos666

    janos666 Ancient Guru

    Messages:
    1,653
    Likes Received:
    407
    GPU:
    MSI RTX3080 10Gb
    OC is not placebo, LOL. In the case of the CPU, a ~10% increase in core clocks is pretty much a ~10% increase in processing power. Of course, that doesn't necessarily mean a ~10% increase in game frame rates (but it potentially could if the system is very CPU-limited in that game). GPU OC is a bit less linear, but unless a game engine or a GPU design is stupid (or two weird cases meet each other), you can expect nice gains until you hit power or heat limits (mind that even stock boost clocks can hit those; it depends on many things).

    I am not sure about the 3770K, but my 2500K was clocked to something around 4400 MHz (the highest it could go on water) and it wasn't enough for either my older 290X or its replacement GTX 1070 in some single-player Frostbite 3 games (and they say multiplayer is even more CPU-intensive) at 1080p60. Now the 8600K at 4900 MHz is in a similar spot for 1080p120 (barely enough), or even 1440p120 at times (although I started using that resolution after switching from the GTX 1070 to a slightly faster RTX 2060).
    Sure, in some games the CPU is barely used and hovers around ~10% or so. In other games, the GPU behaves similarly. But in some games they both hit ~100% at the same time (like AC: Unity on my PC).
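    A purely illustrative sketch of that "slowest stage wins" reasoning, with made-up frame times rather than measurements from this thread: if the frame time is roughly the longer of the CPU and GPU work per frame, a ~10% CPU overclock only shows up in FPS when the CPU is the limiter.

[CODE]
# Crude frame-time model: the slower of the CPU and GPU per-frame work decides
# the frame rate. The millisecond values are invented purely for illustration.
def fps(cpu_ms: float, gpu_ms: float) -> float:
    return 1000.0 / max(cpu_ms, gpu_ms)

# CPU-bound scene: a ~10% higher CPU clock gives roughly ~10% more FPS.
print(fps(12.0, 8.0), fps(12.0 / 1.10, 8.0))   # ~83 FPS -> ~92 FPS

# GPU-bound scene: the same CPU overclock changes essentially nothing.
print(fps(6.0, 14.0), fps(6.0 / 1.10, 14.0))   # ~71 FPS -> ~71 FPS
[/CODE]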
     
  15. loracle

    loracle Master Guru

    Messages:
    442
    Likes Received:
    212
    GPU:
    GTX 1660 super
    I would advise buying, for Intel, an i7 x700K-series CPU (a little bit expensive, but you'll be OK for a very long time) and a factory-overclocked GPU, rather than overclocking yourself (even with Intel's K CPUs, you can guess that the factory wants you to kill your CPU quickly so you buy the new one next year). And even if "OC" is not a complete placebo, it gives you a minimal gain in FPS but can cause weird system behavior, instability in specific parts, and overheating that will kill your rig prematurely. But it's up to you.
     

  16. janos666

    janos666 Ancient Guru

    Messages:
    1,653
    Likes Received:
    407
    GPU:
    MSI RTX3080 10Gb
    And why is that? Hyper-Threading rarely ever helped game performance (or most workloads in general, save for a few highly multi-threaded and even fewer SMT-optimized programs like HD video encoders), and its efficiency got crippled by the Spectre/Meltdown patches (effectively rendering it useless). If I had an i7 already anyway, I would probably disable HT on it, effectively converting it into an i5 (save for the bigger cache, but that's again not overly important for most game engines or generic user software, with only a few exceptions).
    Thus, buying an i5 and setting it to i7 clocks yields significantly better performance per price. Right now it's a no-brainer, actually (in the past you could argue HT might be worth it in a few cases, but not anymore, since it's also a security risk now).
    If you don't know what you are doing (how to run some meaningful stability tests to check your OC, how to handle the potential dangers of intermittent instability, etc.), then sure. But that thing about the K series is simply stupid. They charge extra because it allows you to get more performance out of it. You get what you paid for. No conspiracies there. I always ran my CPUs at "insane" voltages and nothing bad ever happened (I had my 2500K for 5+ years, always overclocked with high voltages, regularly checked with LinX/Prime95, and it never degraded under water cooling). Intel themselves specify really high absolute peak operating voltages (if you care to read the spec sheets).
     
  17. bobblunderton

    bobblunderton Master Guru

    Messages:
    420
    Likes Received:
    199
    GPU:
    EVGA 2070 Super 8gb
    A-men to THAT. I could never do content creation or lengthy intensive work on a laptop without inadvertently having a wiener roast goin' down. God bless the desktop PC.

    Keep in mind that older processors lack newer instruction sets. It might be time to upgrade, not just for processor speed, but for memory bandwidth and newer drive ports / USB too. 4.4 GHz is pretty decent for Ivy Bridge, but even my 3700X shamed my old 4.4 GHz i7 4790K (delidded / liquid metal / air cooled), with much better minimum FPS marks and no performance degradation from security-flaw patches. So the grass definitely is greener, but I'd try to fix the existing system first if just changing the power options doesn't entirely resolve it.
    Overvolt + overheat = degradation, the severity of which depends on how much and for how long. If you keep it cool enough, most processors will readily handle increased voltage.
    However, provided he keeps his system cool and doesn't exceed the commonly known maximum voltages for the CPU cores for 24/7 usage, he'll be okay with it for quite a while. Often it's the motherboard going up in smoke in the VRM area before the CPU dies.
    An Intel i7 is not generally recommended at this time for any budget-conscious buyer. If it's a more-money-than-sense build, then feel free to get an i9 9900K/KF/KS. It's very close to a new generation release for Intel, so it's best to wait if you're upgrading the processor soon.
     
  18. loracle

    loracle Master Guru

    Messages:
    442
    Likes Received:
    212
    GPU:
    GTX 1660 super
    janos666 said:
    And why is that? Hyper-Threading rarely ever helped game performance (or most workloads in general, save for a few highly multi-threaded and even fewer SMT-optimized programs like HD video encoders), and its efficiency got crippled by the Spectre/Meltdown patches (effectively rendering it useless). If I had an i7 already anyway, I would probably disable HT on it, effectively converting it into an i5 (save for the bigger cache, but that's again not overly important for most game engines or generic user software, with only a few exceptions).
    Thus, buying an i5 and setting it to i7 clocks yields significantly better performance per price. Right now it's a no-brainer, actually (in the past you could argue HT might be worth it in a few cases, but not anymore, since it's also a security risk now).


    @janos666

    I tried disabling HT and I can assure you it's not the same thing; a CPU with Hyper-Threading enabled is way better. I bet you never had an i7 CPU, and I advise you to never install the Spectre/Meltdown patches; it's just a hoax. With my 5-year-old CPU I never had any trouble with that crap, maybe a very small drop in performance but nothing bad; it's still very good even now, but when installing those patches the CPU goes mad. I'll tell you a scoop: Intel will implement Hyper-Threading in all i5s in the next CPU generation, so I don't think they are too stupid for doing that, unlike what you are saying.
    And be logical and honest: which is better, to buy a strong i7 CPU with HT and stay at default, or push an i5 CPU to its max?
     
  19. loracle

    loracle Master Guru

    Messages:
    442
    Likes Received:
    212
    GPU:
    GTX 1660 super
    janos666 said:
    If you don't know what you are doing (how to run some meaningful stability tests to check your OC, how to handle the potential dangers of intermittent instability, etc.), then sure. But that thing about the K series is simply stupid. They charge extra because it allows you to get more performance out of it. You get what you paid for. No conspiracies there. I always ran my CPUs at "insane" voltages and nothing bad ever happened (I had my 2500K for 5+ years, always overclocked with high voltages, regularly checked with LinX/Prime95, and it never degraded under water cooling). Intel themselves specify really high absolute peak operating voltages (if you care to read the spec sheets).

    @janos666

    Please stop hurting your CPUs; it's just stupid to do that. Buy an i7 and then we'll talk. And the K is not a reason to overclock just because Intel says so; use your brain, they would be happy if you killed your CPU quickly. But I do advise buying a K CPU, because it has good margin and is strong enough that even pushed to the max all day it will last for a long time. Last thing: please quit overclocking CPUs, GPUs and RAM; all you are doing is decreasing the lifespan of your rig. And take care with COVID-19.
     
  20. Merlena

    Merlena Active Member

    Messages:
    78
    Likes Received:
    16
    GPU:
    RTX 3070 8G
    I agree with @loracle here. If you are shopping for a CPU, get the high-end K series - good silicon; you can overclock it, but it's not needed if you ask me. I'm not an experienced overclocker, but I've spent quite some time trying to push my 4790K to the crazy speeds people claim are "stable", and that has just led to high temperatures, possibly decreased lifespan, and system instability.

    You should use your valuable time (yes, it is valuable!) doing something productive rather than OCing a CPU that's already great out of the box - but that is just my opinion!

    Stay safe!
     
    loracle likes this.
