
Nvidia Inspector introduction and Guide

Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by MrBonk, Nov 5, 2015.

  1. GuruKnight

    GuruKnight Master Guru

    Messages:
    860
    Likes Received:
    12
    GPU:
    2 x 980 Ti AMP! Ex
    Obviously the "0x000000F5" profile is far too simple to work for trouble-free implicit DX12 multi-GPU.
    It is basically just the equivalent of DX11 AFR rendering without any special compatibility bits enabled.

    NVIDIA would have to invent special SLI bits and undefined SLI values for DX12 for it to be useful.
    However, that still might happen as DX12 evolves, just as was the case with DX11 over the years.

    But I do agree that explicit DX12 multi-GPU done by developers is definitely the better and cleaner option compared to potentially "hacky" driver solutions.
     
    Last edited: Oct 11, 2016
  2. dr_rus

    dr_rus Ancient Guru

    Messages:
    2,980
    Likes Received:
    322
    GPU:
    RTX 2080 OC
    I think you misunderstand. In DX11 (and 9, and others for that matter; it's just that SLI/CF-breaking temporal algorithms weren't as widespread back then) the dev had to code the renderer avoiding things which would have certainly made it incompatible with AFR rendering, yes. They did not, however, code the renderer specifically for SLI in any way; there were just some rules which they had to adhere to to make the renderer compatible with the driver-side SLI/CF implementation.

    In DX12 even the implicit mode requires the renderer to specifically enable SLI/CF (multiadapter) while still adhering to the same general rules. So if some DX12 game did adhere to these rules but the devs for some reason didn't make use of implicit multiadapter, there won't be anything on the driver side which would force such a game to see the SLI/CF config.

    Hence why I'm saying that having DX12 SLI bits in the driver is a bit pointless: any DX12 game which uses implicit multiadapter will do so as a conscious result of the devs' wishes, and it will (well, should, according to the DX12 specs) work with or without any driver bits set for it anywhere.

    The only tiny bit of tweaking possible here is if some DX12 game supports implicit mGPU but for some reason works better with a different DX12 SLI bit in the driver. You won't be able to get SLI support this way for a game which doesn't support implicit mGPU in the renderer, like you could with DX11 and earlier APIs.
     
  3. jiminycricket

    jiminycricket Member Guru

    Messages:
    191
    Likes Received:
    1
    GPU:
    GTX 1080
    That sounds like linked explicit mGPU a la ROTTR and DX:MD.
     
  4. dr_rus

    dr_rus Ancient Guru

    Messages:
    2,980
    Likes Received:
    322
    GPU:
    RTX 2080 OC
    Implicit is similar; the difference is that in implicit mode the driver handles resource management between GPUs, while in explicit mode the renderer must do this, AFAIR.

    Basically, all mGPU options in DX12 require game support, meaning they must be programmed into the renderer; the difference is only in the amount of programming required. Implicit is the easiest one for the dev, explicit linked is of moderate difficulty, and explicit unlinked is the most difficult, but it is the only one allowing mixing of GPUs from different vendors, for example.
     

  5. Danny_G13

    Danny_G13 Master Guru

    Messages:
    383
    Likes Received:
    10
    GPU:
    MSI 1080Ti FE @ 2GHz
    Hi guys, I have a question.

    The newest Nvidia drivers have a Mafia III SLI profile, but for me they also ruin Deus Ex.

    I'm fine on the 370.90 drivers for Deus Ex, but obviously they have no Mafia SLI profile.

    How can I import the official Mafia III profile into Nvidia Inspector and get the best of both worlds?
     
  6. GuruKnight

    GuruKnight Master Guru

    Messages:
    860
    Likes Received:
    12
    GPU:
    2 x 980 Ti AMP! Ex
    Never mind.
     
  7. apex84

    apex84 New Member

    Messages:
    3
    Likes Received:
    0
    GPU:
    NVIDIA GeForce GTX 960M
    Is this tool incompatible with GTX 960M? Parts of it seem to be working while others don't.

    This is what I see when launching the app:
    http://i.imgur.com/slZIHrJ.png

    The temperature field is blank but it can be read during the first couple of seconds after the app has been launched.

    The Voltage Offset slider jumps back to its starting position if I hit Apply Clocks & Voltage without actually changing the voltage. The Base and Memory Clock sliders seem to be functioning, as the clock readings in the left fields change when I hit Apply. I'm guessing the voltage part could be locked in the BIOS, but since this is a mobile GPU in a notebook, the BIOS doesn't have any options to unlock it (if it's locked, that is, and it's not an app incompatibility). I'd really like to be able to undervolt this GPU to lower the temperatures, as the notebook can get quite hot when both the CPU and GPU are under load in more demanding games.

    The Profile Inspector part seems to be working as intended.
     
  8. jiminycricket

    jiminycricket Member Guru

    Messages:
    191
    Likes Received:
    1
    GPU:
    GTX 1080
    This is due to Optimus. Keep your dGPU active by running a 3D app at the same time and the temperature reading will stay on.

    You can't increase the voltage in software without an unlocked vBIOS, and undervolting in software is not possible on Maxwell mobile GPUs; you have to edit the vBIOS manually.
     
  9. GanjaStar

    GanjaStar Maha Guru

    Messages:
    1,149
    Likes Received:
    1
    GPU:
    MSI 4G gtx970 1506/8000
    Did I just not notice it before, or is there a new setting exposed in nvinspector now since 375.57 hit?

    "OpenGL version override"
     
  10. khanmein

    khanmein Ancient Guru

    Messages:
    1,649
    Likes Received:
    71
    GPU:
    EVGA GTX 1070 SC

  11. GuruKnight

    GuruKnight Master Guru

    Messages:
    860
    Likes Received:
    12
    GPU:
    2 x 980 Ti AMP! Ex
  12. Enclose

    Enclose Member

    Messages:
    26
    Likes Received:
    3
    GPU:
    GTX 980TI
    Hey guys,

    I got a problem.

    I tried to cap the FPS (v2 limiter) for my games to 60. I have a 120Hz monitor from BenQ.

    I literally have no screen tearing anymore. Even without vsync, I just have to cap it to 60fps with Inspector.
    But now I've noticed a little tearing at the bottom of my monitor. I don't know what this is. The most important part of my screen (90% of it) has no tearing anymore, but the small bottom part still has problems :/
    I never saw anything like this before; every time I move my character in a game, just the bottom of my monitor tears, only a few cm, and everything else works fine :/

    Can anyone explain to me what this is? You can try it out yourselves with no vsync and no G-Sync active; it's just very irritating. I don't know if I'm the only one with this problem.


    Would appreciate some help! :)

    Thanks!
     
  13. Agent-A01

    Agent-A01 Ancient Guru

    Messages:
    11,357
    Likes Received:
    893
    GPU:
    1080Ti H20
    That's normal and the only solution is to enable vsync.
     
  14. RealNC

    RealNC Ancient Guru

    Messages:
    2,992
    Likes Received:
    1,256
    GPU:
    EVGA GTX 980 Ti FTW
    The Nvidia frame limiter seems to automatically enable adaptive vsync sometimes. I haven't figured out why or when it does so, but the result is that sometimes there's tearing and sometimes there isn't.

    I recommend using RTSS: cap to 60, enable 1/2 refresh rate vsync in Inspector if you're on 120Hz, and use CRU to change your refresh rate from 120.00Hz to 120.05Hz. This will give you no extra input lag and still perfect motion without microstutter.
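    To see why the 120.05Hz trick works, here is a rough back-of-the-envelope sketch in Python (the numbers are from this post; the helper name is mine): with 1/2 refresh vsync the display consumes frames slightly faster than the 60fps cap delivers them, so the vsync buffer queue stays drained instead of filling up.

```python
# Why a 60fps cap on a 120.05Hz display (with 1/2 refresh vsync) keeps
# the vsync back buffer queue empty: the game delivers frames slightly
# slower than the display consumes them, so frames never pile up.

def period_ms(rate_hz):
    """Time per frame/scanout in milliseconds."""
    return 1000.0 / rate_hz

refresh_hz = 120.05
half_vsync_hz = refresh_hz / 2          # effective scanout rate with 1/2 vsync
cap_fps = 60.0                          # RTSS frame limit

scanout = period_ms(half_vsync_hz)      # ~16.660 ms per displayed frame
frametime = period_ms(cap_fps)          # ~16.667 ms per rendered frame

# The game is a few microseconds per frame slower than the display,
# so the buffer queue drains instead of filling up:
assert frametime > scanout
drift_us = (frametime - scanout) * 1000
print(f"frame cap is slower by {drift_us:.1f} us per frame")
```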
     
  15. aufkrawall2

    aufkrawall2 Master Guru

    Messages:
    415
    Likes Received:
    3
    GPU:
    MSI RX 580 Armor
    For me, this clearly happens when the fps limit is close to the refresh rate or a divider of it. I'm absolutely sure about it.
     

  16. Agent-A01

    Agent-A01 Ancient Guru

    Messages:
    11,357
    Likes Received:
    893
    GPU:
    1080Ti H20
    I still get the same issue even with a 130fps limit; 14fps away is not close.
     
  17. aufkrawall2

    aufkrawall2 Master Guru

    Messages:
    415
    Likes Received:
    3
    GPU:
    MSI RX 580 Armor
    At which refresh rate? Maybe you are close to a divider of it. The higher the refresh rate, the more likely this is.

    To disable this behavior, you can set the value for "Frame Rate Limiter 2 Control" to 0x00000040.
    Then choose an fps value from the fps limiter v1 values which come with the Inspector.
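    The "close to a divider" idea can be sketched as a small Python helper (the function name and tolerance are my own illustration, not anything documented by Nvidia): a cap is suspect when it lands near refresh, refresh/2, refresh/3, and so on.

```python
# Check whether an fps cap lands near an integer divider of the refresh
# rate (refresh, refresh/2, refresh/3, ...), which is reportedly when the
# driver's v2 frame limiter starts misbehaving.

def near_divider(cap_fps, refresh_hz, tolerance_fps=5.0, max_div=8):
    """Return the closest refresh divider if the cap is within tolerance."""
    for d in range(1, max_div + 1):
        divider = refresh_hz / d
        if abs(cap_fps - divider) <= tolerance_fps:
            return divider
    return None

# A 60fps cap on a 120Hz monitor sits exactly on refresh/2:
print(near_divider(60, 120))    # 60.0
# A 130fps cap on a 144Hz monitor is 14fps away from everything:
print(near_divider(130, 144))   # None
```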
     
  18. Enclose

    Enclose Member

    Messages:
    26
    Likes Received:
    3
    GPU:
    GTX 980TI

    So this way gives me no tearing on the bottom and no input lag at all?
    Is there any other solution for no input lag and no tearing?
     
  19. RealNC

    RealNC Ancient Guru

    Messages:
    2,992
    Likes Received:
    1,256
    GPU:
    EVGA GTX 980 Ti FTW
    You get no tearing, and the input lag is the vsync input lag of your refresh rate, so about 17ms. Normally you get at least double that, usually more, sometimes close to 100ms of lag, if you just use vsync without any tweaks whatsoever (like not setting pre-rendered frames to 1). That's because frames pile up in internal buffers. Capping to 0.05FPS below the refresh rate helps keep the frame buffers empty so frames don't pile up; it blocks the game from rendering too many frames that would fill the buffers. And 0.05 is low enough that there's no noticeable microstutter.

    The other solutions are G-Sync and FreeSync.

    Or fastsync, which works on all monitors (set the vsync value to "fast" in the nvidia panel, disable vsync in-game.) However, for it to work well you need to reach 200FPS or more in games. Otherwise, there's gonna be microstutter.

    Fastsync has virtually zero input lag. But you need very high FPS for it to look as smooth as vsync. Otherwise, it's gonna be somewhat stuttery. But input lag is pretty much the same as vsync off regardless.
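    The lag figures above follow from simple arithmetic: with vsync, the baseline delay is roughly one refresh period, and each frame sitting in the pre-rendered/back buffer queue adds roughly one more period on top. A minimal Python sketch of that model (the function and the queueing simplification are mine):

```python
# Rough model of vsync input lag: about one refresh period as a baseline,
# plus one period for every frame already queued up in the buffers.

def vsync_lag_ms(refresh_hz, queued_frames=0):
    period = 1000.0 / refresh_hz
    return period * (1 + queued_frames)

# 60Hz scanout (a 120Hz panel with 1/2 vsync), empty queue: ~16.7 ms
print(round(vsync_lag_ms(60), 1))                    # 16.7
# With 3 pre-rendered frames piled up, the lag roughly quadruples:
print(round(vsync_lag_ms(60, queued_frames=3), 1))   # 66.7
```

    This is a simplification; real pipelines vary, but it shows why capping the frame rate (keeping the queue empty) brings vsync lag down toward the single-period minimum.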
     
    Last edited: Oct 26, 2016
  20. RealNC

    RealNC Ancient Guru

    Messages:
    2,992
    Likes Received:
    1,256
    GPU:
    EVGA GTX 980 Ti FTW
    That doesn't work. The frame limiter is simply broken. I've set 0x00000040 for the control, set 51FPS as the limit, but nope. I get 60FPS in-game. I've set 27FPS as the limit, nope, I get 30FPS.

    It's just a broken PoS, we have to accept that :)

    Just use RTSS. It works.

    It is amazing though that nvidia is unable to code a simple frame limiter. Like, really? You can't get a frame limiter implementation to work right?

    Seriously?
     
