1. klunka

    klunka Member

    Messages:
    37
    Likes Received:
    6
    GPU:
    1080ti / 11gb
    Hi,
    I stumbled upon the reddit post linked below. I don't understand half of what it says, but as a simple gamer looking for tweaks to optimize input lag and performance, it caught my attention.

    https://www.reddit.com/r/Amd/commen...ll_been_no_more_complete_test_on_how/dg81ds7/
    Can we remove this functionality from the driver, maybe even safely? (A quick Google search showed some people did it for the Linux drivers.)
    Anyway, I thought this would fit here, and from my search it hasn't been discussed on g3d before.
    Have a nice day!
     
    Dragondale13 and enkoo1 like this.
  2. kurtextrem

    kurtextrem Master Guru

    Messages:
    251
    Likes Received:
    40
    GPU:
    NVIDIA GeForce GTX 970
  3. theahae

    theahae Active Member

    Messages:
    67
    Likes Received:
    12
    GPU:
    GTX 1060
    I'm waiting for the NVCP defenders to start shilling
     
  4. mbk1969

    mbk1969 Ancient Guru

    Messages:
    15,540
    Likes Received:
    13,556
    GPU:
    GF RTX 4070
    https://forums.developer.nvidia.com...struction-causes-huge-latency-spikes/31643/11

     

  5. mbk1969

    mbk1969 Ancient Guru

    Messages:
    15,540
    Likes Received:
    13,556
    GPU:
    GF RTX 4070
    https://sites.utexas.edu/jdm4372/tag/accelerated-computing/

     
  6. mbk1969

    mbk1969 Ancient Guru

    Messages:
    15,540
    Likes Received:
    13,556
    GPU:
    GF RTX 4070
    ^ I suspect that if it were that simple to avoid WBINVD, then NV devs would not use it. I suspect that NV devs could implement a very complex, sophisticated algorithm without WBINVD (or avoid it on certain rigs), but decided to use the instruction as the simplest implementation/workaround.

    Developers are human, so they are lazy and they stop improving the source code the moment it starts to work.
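
    For illustration, here is a minimal user-mode sketch of the trade-off being discussed. The targeted CLFLUSH loop is an assumed alternative for flushing a single buffer; nothing here is known to match what NV's driver actually does.

    ```c
    #include <stddef.h>
    #include <stdint.h>
    #include <emmintrin.h>   /* _mm_clflush, _mm_mfence (SSE2) */

    /* The "simple" route: WBINVD writes back and invalidates every cache
     * line in the machine. It is privileged (ring 0 only, i.e. driver
     * code) and can stall all cores for a long time - the latency spike
     * the linked posts complain about. */
    static inline void flush_all_caches(void)
    {
        __asm__ volatile("wbinvd" ::: "memory");  /* #GP fault in user mode */
    }

    /* The targeted alternative: flush only the cache lines covering one
     * buffer. More bookkeeping for the caller, but no machine-wide stall. */
    static void flush_buffer(const void *buf, size_t len)
    {
        const size_t line = 64;  /* common x86 line size; query CPUID in real code */
        uintptr_t p   = (uintptr_t)buf & ~(uintptr_t)(line - 1);
        uintptr_t end = (uintptr_t)buf + len;

        for (; p < end; p += line)
            _mm_clflush((const void *)p);
        _mm_mfence();            /* order the flushes before later accesses */
    }
    ```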
     
  7. Martigen

    Martigen Master Guru

    Messages:
    534
    Likes Received:
    254
    GPU:
    GTX 1080Ti SLI
    Interesting, but it seems odd, @mbk1969, that NV needs to do this at all when other drivers and software presumably don't. And why does AMD not need to do it?

    Referring to the other part of the post, I've always wondered why MSI isn't enabled by default for NV. I've never heard of or seen an issue related to this anywhere, and again, AMD doesn't do this. I manually enable MSI mode on every driver install (the registry tweak sketched below). Never had an issue, and my PC hardware (CPU/mobo etc.) is 6 years old, running 1080Tis in SLI.
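
    For reference, a hedged sketch of how MSI mode is usually forced on: setting the documented MSISupported value under the GPU's interrupt-management registry key, which is what the common MSI-mode utilities do as I understand it. The device instance path below is hypothetical; substitute your GPU's real one from Device Manager (Details -> Device instance path). Needs administrator rights and a reboot.

    ```c
    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        /* HYPOTHETICAL instance path - replace with your GPU's real one. */
        const char *instance =
            "VEN_10DE&DEV_1B06&SUBSYS_120F10DE&REV_A1\\4&12ab34cd&0&0008";

        char key[512];
        snprintf(key, sizeof key,
                 "SYSTEM\\CurrentControlSet\\Enum\\PCI\\%s\\Device Parameters\\"
                 "Interrupt Management\\MessageSignaledInterruptProperties",
                 instance);

        HKEY h;
        DWORD one = 1;
        /* Create (or open) the key and set MSISupported = 1 (REG_DWORD).
         * Link against advapi32 on MinGW (-ladvapi32). */
        if (RegCreateKeyExA(HKEY_LOCAL_MACHINE, key, 0, NULL, 0,
                            KEY_SET_VALUE, NULL, &h, NULL) != ERROR_SUCCESS ||
            RegSetValueExA(h, "MSISupported", 0, REG_DWORD,
                           (const BYTE *)&one, sizeof one) != ERROR_SUCCESS) {
            fprintf(stderr, "failed to set MSISupported (run elevated?)\n");
            return 1;
        }
        RegCloseKey(h);
        puts("MSISupported = 1; reboot for it to take effect.");
        return 0;
    }
    ```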

    @ManuelG -- be interested to hear from you if the NV dev team can provide input.
     
  8. mbk1969

    mbk1969 Ancient Guru

    Messages:
    15,540
    Likes Received:
    13,556
    GPU:
    GF RTX 4070
    @Martigen
    Without seeing AMD's driver source files I would not assume their devs do not use WBINVD or any other such means.
    It can also be that AMD devs solved the cache-coherency problem without WBINVD (or simply never tried to solve it with WBINVD).

    Without seeing the NV source files I would not assume that NV devs used WBINVD in all places/paths (OpenGL, OpenCL, Vulkan, DirectX) and for all CPUs. It could be that the instruction is used only in the CUDA path, for example.

    And overall the subject is so deeply hardware-related that, knowing the number of possible HW-FW-SW combinations, I would not blame the devs for choosing the simplest (but maybe not the most efficient) implementation.

    PS: But if it really is laziness, if the devs just abandoned certain pieces of code and that code became legacy, no longer needed, then of course it would be nice to get rid of it.
     
  9. Noisiv

    Noisiv Ancient Guru

    Messages:
    8,230
    Likes Received:
    1,494
    GPU:
    2070 Super
    Any day now... keep waiting
     
  10. mbk1969

    mbk1969 Ancient Guru

    Messages:
    15,540
    Likes Received:
    13,556
    GPU:
    GF RTX 4070
    It can also be that the devs who wrote that code (with WBINVD) left the company and the new devs do not dare touch it because they do not understand it.
    I saw exactly that situation at one of my previous workplaces.
     
    Last edited: Aug 1, 2020
    kurtextrem and hemla like this.

  11. Astyanax

    Astyanax Ancient Guru

    Messages:
    17,011
    Likes Received:
    7,351
    GPU:
    GTX 1080ti
    Manuel won't comment on the situation; even if he could, an engineer's write-up would be required to give a suitable explanation of why and where.

    That said, anything you have seen regarding the use of this instruction comes from programmatically ignorant nobodies who latch onto something they read and attribute all their woes to it.

    Nvidia's code is obfuscated and shipped without symbols, so nobody except Nvidia knows what is being used and when.

    Nvidia still supports PCIe controllers and BIOSes going back to Gen 1, with only Turing finally dropping support for the chipsets that implement it; many of those did not implement MSIs to spec, be it Nvidia's own chipsets or even AMD's.

    The irony is Nvidia could still use this instruction and still get better CPU-side performance in DX9 and DX11 than the competitor.
     
    Last edited: Aug 1, 2020
    mbk1969 likes this.
  12. hemla

    hemla Master Guru

    Messages:
    239
    Likes Received:
    27
    GPU:
    nvidia
    Did someone say spaghetti code?
     
    mbk1969 likes this.
  13. Smough

    Smough Master Guru

    Messages:
    984
    Likes Received:
    303
    GPU:
    GTX 1660
    All I can say is that turning on MSI mode for my GPU lowers DPC latency dramatically (if I leave it at stock, it's a bit higher, and I can prove it with screenshots). As for why nVidia only enables it on certain products, or why AMD already sets MSI mode in their drivers: no clue, but I don't see any negative effects from having MSI mode on.
     
  14. mbk1969

    mbk1969 Ancient Guru

    Messages:
    15,540
    Likes Received:
    13,556
    GPU:
    GF RTX 4070
    Ain't that a contradiction?
     
  15. tiliarou

    tiliarou Active Member

    Messages:
    61
    Likes Received:
    13
    GPU:
    Nvidia 980M
    It's not.
    MSI mode enabled = lower DPC latency
    Disabled (= legacy mode) = higher DPC latency
    (I'm just restating the sense of his sentence; I didn't test this on my end.)
     
    Smough likes this.

  16. Astyanax

    Astyanax Ancient Guru

    Messages:
    17,011
    Likes Received:
    7,351
    GPU:
    GTX 1080ti
    MSIs don't do anything for DPCs; all they do is remove a step in servicing an interrupt. It's your ISR times that will show an actual difference.

    If you see DPCs drop, it's because your system has way too much crap bogging it down; with interrupt-heavy RGB junk and utilities that tell an AIO block to keep going at X speed every 50 ms, I'm not surprised.
     
    Last edited: Aug 2, 2020
  17. AsiJu

    AsiJu Ancient Guru

    Messages:
    8,806
    Likes Received:
    3,368
    GPU:
    KFA2 4070Ti EXG.v2
    Or the GPU shares an IRQ with another device by default and setting MSI mode removes that conflict.

    I'm fairly certain this is the explanation for most cases where setting MSI mode for the GPU makes a notable difference.

    (FWIW I've tried it a few times and saw no difference one way or the other,
    with both Maxwell and Turing.)
     
  18. mbk1969

    mbk1969 Ancient Guru

    Messages:
    15,540
    Likes Received:
    13,556
    GPU:
    GF RTX 4070
    I meant the wording: first Smough used the word "dramatically" and then he wrote "a bit higher". To me, "dramatically" =/= "a bit".
     
    tiliarou likes this.
  19. klunka

    klunka Member

    Messages:
    37
    Likes Received:
    6
    GPU:
    1080ti / 11gb
    How hard would it be to remove the WBINVD flushing functionality from the driver, and what's the worst-case scenario?
    I think that's what it comes down to.
    The guy who did it on Linux said he ran his system for at least weeks without problems (a rough sketch of the idea is below).
    I'd be willing to test it on my machine if the chances of permanent damage are low. Sadly I can't do much else; I'm not educated in any of this.
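
    Purely to illustrate what the Linux patch reportedly did, here is a hypothetical sketch: scan the driver binary for the two-byte WBINVD opcode (0F 09) and overwrite it with NOPs (90 90). Blindly patching every match is dangerous, since the byte pattern can also occur in data or inside another instruction, so treat this as a sketch of the idea, not a safe tool.

    ```c
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        if (argc != 2) {
            fprintf(stderr, "usage: %s <driver-binary>\n", argv[0]);
            return 1;
        }
        FILE *f = fopen(argv[1], "r+b");
        if (!f) { perror("fopen"); return 1; }

        long off = 0;          /* index of the byte about to be read */
        int prev = EOF, c, patched = 0;
        while ((c = fgetc(f)) != EOF) {
            if (prev == 0x0F && c == 0x09) {   /* WBINVD encodes as 0F 09 */
                fseek(f, off - 1, SEEK_SET);   /* back up to the 0F byte */
                fputc(0x90, f);                /* NOP */
                fputc(0x90, f);                /* NOP */
                fseek(f, off + 1, SEEK_SET);   /* resume reading after 09 */
                patched++;
                prev = EOF;                    /* avoid overlapping matches */
            } else {
                prev = c;
            }
            off++;
        }
        printf("patched %d candidate site(s)\n", patched);
        fclose(f);
        return 0;
    }
    ```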
     
    Smough likes this.
  20. Angantyr

    Angantyr Master Guru

    Messages:
    901
    Likes Received:
    315
    GPU:
    MSI 2070 Super X
    That actually reminded me of an amusing example where Razer and Docker both used the same wrong answer from Stack Overflow, which resulted in the two programs not being able to run simultaneously.
    The whole Twitter thread about it is here, and it's quite amusing: https://twitter.com/Foone/status/1229641258370355200

    Another amazing example is this Stack Overflow answer describing how to make such a mutex, which gives an example with the hardcoded string {8F6F0AC4-B9A1-45fd-A8CF-72F04E6BDE8F}.
    So if you're curious to see whether someone has been lazy and not changed that string, search for it on GitHub and you'll find a fine collection of programs that won't run at the same time because somebody copy-pasted an answer off Stack Overflow :p

    https://github.com/search?q=8F6F0AC4-B9A1-45fd-A8CF-72F04E6BDE8F&type=Code
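
    A minimal C sketch of the single-instance pattern that answer describes (hedged: real copy-pasters may differ in details such as a Global\ prefix on the name):

    ```c
    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        /* The hardcoded GUID from the copy-pasted answer: every program
         * that reuses this exact name shares one system-wide mutex, so
         * unrelated apps mistake each other for a second instance. */
        HANDLE h = CreateMutexA(NULL, FALSE,
                                "{8F6F0AC4-B9A1-45fd-A8CF-72F04E6BDE8F}");
        if (h == NULL) {
            fprintf(stderr, "CreateMutex failed: %lu\n", GetLastError());
            return 1;
        }
        if (GetLastError() == ERROR_ALREADY_EXISTS) {
            puts("\"Already running\" - even if the other app is unrelated.");
            CloseHandle(h);
            return 1;
        }
        puts("Got the mutex; pretending to be the only instance for 30 s.");
        Sleep(30000);
        CloseHandle(h);
        return 0;
    }
    ```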
     
    mbk1969 likes this.
