NVIDIA GeForce 526.47 WHQL driver download & Discussion

Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by Alberto_It, Oct 27, 2022.

  1. Terepin

    Terepin Guest

    Messages:
    873
    Likes Received:
    129
    GPU:
    ASUS RTX 4070 Ti
    That is factually incorrect, as proved by Blur Busters.
     
    Last edited: Oct 31, 2022
    Cave Waverider likes this.
  2. andy rett

    andy rett Member

    Messages:
    32
    Likes Received:
    21
    GPU:
    3090/16gb
    Relax, it's only a sensor bug.
    My RTX 3080 is at 38 °C even with Task Manager showing the GPU at 100%.
    Maybe Nvidia will fix this in the next driver release.
     
  3. aufkrawall2

    aufkrawall2 Ancient Guru

    Messages:
    4,355
    Likes Received:
    1,815
    GPU:
    7800 XT Hellhound
    Except that nothing but vsync ensures a 100% tear-free result, and it doesn't increase lag at all when used with VRR + an fps limiter (ideally in-game/Reflex):
    [image]

    Video: [embedded video]

    You get more lag with the RTSS limiter without vsync, plus occasional tearing. Hard to get that superstition out of people, it seems...


    The Nvidia limiter draws a flat line in the RTSS graph; I wonder what your "better" is supposed to look like...

    Then you'll get a much steeper increase in input lag vs. NULL once you drop below your fps limit by the tiniest margin. Not sold on it.
     
    andy rett, Cave Waverider and Undying like this.

  4. Undying

    Undying Ancient Guru

    Messages:
    25,334
    Likes Received:
    12,743
    GPU:
    XFX RX6800XT 16GB
    I've been reading through their forums on the topic and there are a lot of opinions on the matter. The general consensus is that vsync (NVCP) + gsync is the way to go, simply because it's 100% tear-free.

    Still, for NULL:

    "Regarding Low Latency Mode "Ultra" vs. "On" when used in conjunction with G-SYNC + V-SYNC + -3 minimum FPS limit, I'd currently recommend "On" for two reasons:

    1. "On" should have the same effect as "Ultra" in compatible games (that don't already have a MPRF queue of "1") in reducing the pre-rendered frames queue and input lag by up to 1 frame whenever your system's framerate drops below your set FPS limit vs. "Off."

    2. Since "Ultra" non-optionally auto-caps the FPS at lower values than you can manually set with an FPS limiter, for the direct purposes of point "1" above, you'd have to set your FPS limiter below that when using "Ultra" to prevent it from being the framerate's limiting factor, and allow the in-game (or RTSS) limiter to take effect. At 144Hz, you would need to cap a couple frames below 138, which isn't a big deal, but at 240Hz, "Ultra" will auto-cap the FPS to 224 FPS, which I find a little excessive, so "On" which doesn't auto-cap, but should still reduce the pre-rendered frames queue by the same amount as "Ultra" in GPU-bound situations (within the G-SYNC range) is more suited to such a setup. "

    https://forums.blurbusters.com/viewtopic.php?f=5&t=5903&start=30#p44825
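
    To put the quoted cap arithmetic into code, here's a minimal Python sketch (the Ultra auto-cap values are simply the ones reported in the quote; Nvidia's exact internal formula isn't published, so treat these as observations, not a spec):

    # Illustrative only: the "refresh - 3" manual cap from the quoted
    # recommendation vs. LLM Ultra's reported auto-caps (138 @ 144 Hz, 224 @ 240 Hz).
    observed_ultra_autocap = {144: 138, 240: 224}  # values from the quoted post

    for hz in (144, 240):
        manual_cap = hz - 3                     # recommended minimum fps cap
        ultra_cap = observed_ultra_autocap[hz]  # what Ultra reportedly enforces
        # Under Ultra you'd need a manual cap *below* the auto-cap for an
        # in-game/RTSS limiter to take effect, per the quote.
        print(f"{hz} Hz: manual cap {manual_cap} fps, Ultra auto-cap ~{ultra_cap} fps")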
     
  5. jorimt

    jorimt Active Member

    Messages:
    73
    Likes Received:
    69
    GPU:
    GIGABYTE RTX 4090
    I have addressed this point in my article's comments section and the Blur Busters forum almost too many times to count now. There's been a Closing FAQ entry (#2) in my article for over three years that targets it specifically:
    https://blurbusters.com/gsync/gsync101-input-lag-tests-and-settings/15/
    The answer is frametime variances.

    “Frametime” denotes how long a single frame takes to render. “Framerate” is the totaled average of each frame’s render time within a one second period.

    At 144Hz, a single frame takes 6.9ms to display (the number of which depends on the max refresh rate of the display, see here), so if the framerate is 144 per second, then the average frametime of 144 FPS is 6.9ms per frame.

    In reality, however, frametime from frame to frame varies, so just because an average framerate of 144 per second has an average frametime of 6.9ms per frame, doesn’t mean all 144 of those frames in each second amount to an exact 6.9ms per; one frame could render in 10ms, the next could render in 6ms, but at the end of each second, enough will hit the 6.9ms render target to average 144 FPS per.

    So what happens when just one of those 144 frames renders in, say, 6.8ms (146 FPS average) instead of 6.9ms (144 FPS average) at 144Hz? The affected frame becomes ready too early, and begins to scan itself into the current “scanout” cycle (the process that physically draws each frame, pixel by pixel, left to right, top to bottom on-screen) before the previous frame has a chance to fully display (a.k.a. tearing).

    G-SYNC + V-SYNC “Off” allows these instances to occur, even within the G-SYNC range, whereas G-SYNC + V-SYNC “On” (what I call “frametime compensation” in this article) allows the module (with average framerates within the G-SYNC range) to time delivery of the affected frames to the start of the next scanout cycle, which lets the previous frame finish in the existing cycle, and thus prevents tearing in all instances.

    And since G-SYNC + V-SYNC “On” only holds onto the affected frames for whatever time it takes the previous frame to complete its display, virtually no input lag is added; the only input lag advantage G-SYNC + V-SYNC “Off” has over G-SYNC + V-SYNC “On” is literally the tearing seen, nothing more.
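
    As a minimal Python sketch of the arithmetic above (the frametimes are hypothetical, and real scanout scheduling is more involved, so treat this purely as an illustration):

    # 144 Hz scanout math from the paragraphs above, with made-up frametimes.
    refresh_hz = 144
    scanout_ms = 1000.0 / refresh_hz            # ~6.94 ms to draw one full frame

    for frametime_ms in (10.0, 6.9, 6.8, 6.0):  # hypothetical per-frame render times
        early_by = scanout_ms - frametime_ms    # >0: frame ready before scanout ends
        if early_by > 0:
            # G-SYNC + V-SYNC "Off": the frame starts scanning mid-cycle (tearing).
            # G-SYNC + V-SYNC "On": the module holds it for `early_by` ms until the
            # next scanout begins; that brief hold is the only added latency.
            print(f"{frametime_ms:.1f} ms frame: ready {early_by:.2f} ms early")
        else:
            print(f"{frametime_ms:.1f} ms frame: arrives after the current cycle ends")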

    Also, I'm not sure why everyone keeps confusing sync latency with pre-rendered frame latency, the latter of which can occur with or without G-SYNC/V-SYNC.

    FYI, a brief non-exhaustive breakdown of G-SYNC + V-SYNC + LLM and/or Reflex behavior below:

    G-SYNC enabled (+ V-SYNC off): prevents tearing only when 1) the average framerate is within the refresh rate and 2) the frametime of a frame is within the scanout time of the currently set physical refresh rate of the display. For any frame that has a frametime well above or below the physical refresh rate's scanout time, G-SYNC will opt to allow tearing instead of syncing the affected frame(s). This is why I refer to G-SYNC on + V-SYNC off as "adaptive" G-SYNC in my article.

    V-SYNC enabled (+ G-SYNC on): When paired with the G-SYNC option, it adheres all frames to the scanout time of the currently set max refresh rate (for average framerates within the refresh rate), whether the frametime of the frame is well above or below the current physical refresh rate's scanout time, preventing tearing in all instances.

    LLM On: limits the max pre-rendered frames to "1" in DX11 and older APIs IF the given game supports external manipulation of the render queue (not all do). If not, LLM "On" = "Off." LLM (be it On or Ultra) does not apply to DX12 or Vulkan games, as they handle the render queue internally.

    LLM Ultra: limits the max pre-rendered frames to "1," and attempts to deliver frames just-in-time (whatever that exactly means; I've never personally observed a difference between it and LLM On in this respect). If used in conjunction with G-SYNC and NVCP V-SYNC, the framerate is automatically limited a few frames below the currently set physical refresh rate to keep the framerate within the refresh rate, ensuring continual G-SYNC engagement. The auto limiting function of LLM "Ultra" when used with G-SYNC does not apply to all games, and doesn't appear to engage in DX12 or Vulkan at all (at least in those I've tested; there may be exceptions, but none I've found as of yet).

    Reflex: uses an engine-level FPS limiter to dynamically limit the framerate just below the currently achievable average framerate as to prevent the GPU usage from maxing, eliminating additional pre-rendered frame generation due to any GPU-limitation. If paired with G-SYNC + VSYNC (typically the NVCP variant), similar to LLM "Ultra," it will automatically (statically) limit a handful of frames below the currently set physical refresh rate separate of the dynamic limiting method used during GPU-limitation. Additionally, when Reflex is enabled, LLM, no matter what it's set to in the NVCP, will be overridden and disabled since Reflex is considered its replacement.

    Reflex + Boost: same behavior as Reflex, but like "Prefer maximum performance," it will prevent the GPU from downclocking below its base max clock.
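
    Condensing that breakdown into a rough Python lookup (this mirrors the descriptions above as written, not any official Nvidia state machine, so it's a sketch only):

    # Which latency mechanism is actually in effect, per the breakdown above.
    def effective_latency_mode(api: str, llm: str, reflex: bool) -> str:
        if reflex:
            # Reflex overrides LLM entirely and dynamically limits fps.
            return "Reflex: engine-level dynamic FPS limiter, LLM overridden"
        if api in ("DX12", "Vulkan"):
            return "Game-managed render queue (LLM On/Ultra does not apply)"
        if llm == "Ultra":
            return "Queue = 1 + just-in-time delivery (+ auto fps cap w/ G-SYNC + V-SYNC)"
        if llm == "On":
            return "Queue = 1, if the game allows external render-queue override"
        return "Driver/game default render queue"

    print(effective_latency_mode("DX11", "Ultra", reflex=False))
    print(effective_latency_mode("Vulkan", "Ultra", reflex=False))  # LLM ignored
    print(effective_latency_mode("DX12", "Off", reflex=True))       # Reflex wins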
     
    Last edited: Oct 31, 2022
    lionhad, Xtreme512, Undying and 4 others like this.
  6. Martigen

    Martigen Master Guru

    Messages:
    534
    Likes Received:
    254
    GPU:
    GTX 1080Ti SLI
    Thanks for the summary; I remember reading the original article years ago, and it's good to get a refresher.

    Two questions that come to mind:

    1) How does this work out with Fast Sync? I could be wrong, but I don't think Fast Sync was around then. The way it currently works, with or without Gsync as far as I'm aware, is that it delivers the tear-free experience of Vsync On with the low latency (or very close to it) of Vsync Off -- basically the best of both worlds, with no need to use anything else. But how does that interact with Gsync + frame capping etc.?

    2) For LLM -- we always talk about the best option when the framerate cap is easily met -- but what about a demanding game on old hardware where you never get close to your framerate cap: here, I imagine, having LLM set to Off would be better? It may introduce an extra frame or two of latency, but it should also in theory smooth out frame pacing where the GPU is overloaded, and thereby provide a better overall experience?

    Thanks for any insights!
     
  7. jorimt

    jorimt Active Member

    Messages:
    73
    Likes Received:
    69
    GPU:
    GIGABYTE RTX 4090
    Fast Sync was indeed around when I wrote the article, and I addressed it in part 8 here.

    If the legacy game in question isn't GPU-bound, and LLM override is supported in said game, it won't do anything to reduce render queue latency further in that scenario.

    That said, I don't want to derail this driver thread by triggering an impromptu AMA, so I'll leave it there. Anyone is free to revisit the article and leave a comment there or in the BB forums, and I'll follow up where I can.
     
    ReaperXL2 and Mapson like this.
  8. CYP3ORG

    CYP3ORG Active Member

    Messages:
    96
    Likes Received:
    47
    GPU:
    ASUS STRIX RTX 2060
    I'm baffled to see that BFV's framerate has gone down considerably since 522.25.

    About a 5% loss in framerate... can't hit a flat 120 anymore :(
     
  9. aufkrawall2

    aufkrawall2 Ancient Guru

    Messages:
    4,355
    Likes Received:
    1,815
    GPU:
    7800 XT Hellhound
    It feels more direct than On, and it affects frame times (not necessarily in a bad way, but well, sometimes) and sometimes even avg fps. An extreme example is GoW, where Ultra causes a hit similar to Reflex:
    On:
    [image]

    Ultra (Reflex really is set to off):
    [image]
     
    jorimt likes this.

  10. jorimt

    jorimt Active Member

    Messages:
    73
    Likes Received:
    69
    GPU:
    GIGABYTE RTX 4090
    Right, I'm actually aware there is a functional difference; I was just being rhetorically dismissive, since I find the "just-in-time" component of Ultra (which Nvidia does break down, albeit briefly, in their original release article) problematic at worst and underwhelming at best.

    It's simply rarely appreciable in real-world scenarios in my experience (probably because support is hit-or-miss per game), and when it is, like you mentioned, it may affect frametime performance (mostly negatively, which is what happens when you try to delay a frame until the last moment at the driver rather than engine level) and/or lower the average framerate versus LLM "Off" or "On."

    Reflex is obviously the better choice for the same purpose (when there is one).

    Also, for a second I completely forgot GoW only runs in DX11, and was wondering why Ultra was applying in your captures xD
     
    aufkrawall2 likes this.
  11. kens30

    kens30 Maha Guru

    Messages:
    1,225
    Likes Received:
    93
    GPU:
    RTX 3070 GAMING OC
    It has been a while since I posted in this section of the forums (I waited two years to finally upgrade to an RTX 3070, although I really wanted an RTX 4080 for the extra VRAM). Anyway, it would be nice and useful if, before posting an issue or bug, people first mentioned which Windows version they are running, 10 or 11.

    That would give everyone a whole picture of the issue being described... It is most likely that Nvidia is mostly pushing fixes for Win 11 rather than 10, so it would be nice to know beforehand.
    So far the only issue that is annoying me is the rainbow texture bug in Forza Horizon 5, but I don't really expect a fix unless Playground Games fixes its texture streaming mechanism.

    And finally, to Astyanax, or should I say NAstyanax: dude, I feel sorry for you, stuck at a PC monitor all day, always trying to prove that you know everything. You don't give a s--t about anyone's opinion and always find a way to prove them all wrong, and I don't like that at all. Cheers!!! Have a nice day...
     
  12. chispy

    chispy Ancient Guru

    Messages:
    9,979
    Likes Received:
    2,693
    GPU:
    RTX 4090
    Any idea when a new hotfix driver will be released?
     
  13. cricket bones

    cricket bones Master Guru

    Messages:
    946
    Likes Received:
    365
    GPU:
    EVGA RTX 3080 FTW
    Probably sometime after now.
     
    Wrinkly, Passus and chispy like this.
  14. Jefry_st

    Jefry_st Member

    Messages:
    25
    Likes Received:
    33
    GPU:
    Vega 8
    What's good in 526.47 is that it has a new Vulkan driver, ver. 1.3.224 (vs. 1.3.205 in 522.25). Every new version (1.3.206, 1.3.207, 1.3.208, etc.) brings a lot of extensions, optimizations, and fixes. Nvidia doesn't update the Vulkan driver in its own drivers very often, so it's a huge jump.
     
    Last edited: Nov 1, 2022

  15. NotReallySure

    NotReallySure Member

    Messages:
    15
    Likes Received:
    4
    GPU:
    RTX 3070 ti
    So, has this driver prevented the GPU fans from spinning up during Vulkan-based games? If so, I think it fried my card, so please beware. Has anyone else had a similar issue?
     
  16. SoppingClam

    SoppingClam Member

    Messages:
    14
    Likes Received:
    8
    GPU:
    RTX 3080 12gb MSI X
    That's different for me: my FPS has only gone up and become more stable. It could be something other than the Nvidia drivers in my case, though, if everyone else is getting a decrease in fps for BF.
     
  17. XantaX

    XantaX Master Guru

    Messages:
    706
    Likes Received:
    316
    GPU:
    RTX 4070 TI 12G
    IMO game developers release games to the public too soon; Gotham Knights, CP2077, and Call of Duty MW2, for example.
     
  18. XantaX

    XantaX Master Guru

    Messages:
    706
    Likes Received:
    316
    GPU:
    RTX 4070 TI 12G
    No, BF 2042's fps has increased and it runs smoother...
     
    OnnA likes this.
  19. XantaX

    XantaX Master Guru

    Messages:
    706
    Likes Received:
    316
    GPU:
    RTX 4070 TI 12G
    This driver clocks down perfectly on my 3060: idle temp 34 °C, fan at 10%.
     
