Nvidia Inspector Introduction and Guide

Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by MrBonk, Nov 5, 2015.

  1. GuruKnight

    GuruKnight Master Guru

    Messages:
    865
    Likes Received:
    15
    GPU:
    2 x 980 Ti AMP! Ex
    I suspect you are referring to "MFAA" :)
    This is simply a temporal filter that enhances in-game MSAA in certain supported DX10+ titles (see the toy sketch below).

    It shouldn't be confused with real hardware transparency AA methods, such as SGSSAA and TrSSAA.
    Combining MFAA and SGSSAA/TrSSAA can't be recommended.
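
    To picture what that temporal filter is doing, here is a toy sketch of my own (the sample positions are made up for illustration, not Nvidia's actual programmed patterns): averaging two alternating 2x patterns over consecutive frames reproduces the edge gradations of a true 4x pattern, provided the edge holds still between frames.

    Code:
    #include <cstdio>

    // Toy model of the MFAA idea: alternate the 2x sample pattern between
    // even and odd frames, then temporally average the two frames.
    // The sample x-positions below are made up for illustration only.
    static double coverage(const double* x, int n, double edge) {
        // Fraction of sample positions left of a vertical edge at 'edge'.
        int inside = 0;
        for (int i = 0; i < n; ++i)
            if (x[i] < edge) ++inside;
        return (double)inside / n;
    }

    int main() {
        const double patternA[2] = {0.125, 0.625}; // even frames
        const double patternB[2] = {0.375, 0.875}; // odd frames
        const double fourX[4]    = {0.125, 0.375, 0.625, 0.875};

        for (double edge = 0.1; edge < 1.0; edge += 0.2) {
            double temporal = 0.5 * (coverage(patternA, 2, edge) +
                                     coverage(patternB, 2, edge));
            std::printf("edge %.1f: 2x %.2f | 2x+temporal %.2f | true 4x %.2f\n",
                        edge, coverage(patternA, 2, edge), temporal,
                        coverage(fourX, 4, edge));
        }
        return 0;
    }

    Once the edge moves between frames, the two half-patterns no longer describe the same geometry, which is where the motion artifacts discussed further down come from.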
     
  2. dr_rus

    dr_rus Ancient Guru

    Messages:
    2,985
    Likes Received:
    333
    GPU:
    RTX 2080 OC
    Why not? MFAA should work with SGSSAA in the same way it works with MSAA.
     
  3. GuruKnight

    GuruKnight Master Guru

    Messages:
    865
    Likes Received:
    15
    GPU:
    2 x 980 Ti AMP! Ex
    Actually not, since SGSSAA depends on MSAA to work correctly and requires driver LOD adjustments (see the sketch below).
    "Enhancing" that with MFAA can cause various issues in my experience.

    However, enhancing in-game TXAA settings with SGSSAA can look nice in some cases (e.g. Watch Dogs).
    This of course requires more hardware power, and should preferably also be combined with some downsampling to prevent blurring.
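
    For reference, the usual community rule of thumb for that LOD adjustment is -0.5 × log2(sample count); this is the commonly recommended formula from the SGSSAA guides, not anything official from Nvidia. A quick sketch of the math:

    Code:
    #include <cmath>
    #include <cstdio>

    // Community rule of thumb for the driver LOD bias under SGSSAA:
    // -0.5 * log2(sample count), to win back the texture sharpness that
    // the supersampled filtering would otherwise blur away.
    int main() {
        for (int samples : {2, 4, 8}) {
            double lodBias = -0.5 * std::log2((double)samples);
            std::printf("%dx SGSSAA -> suggested LOD bias %.2f\n", samples, lodBias);
        }
        return 0;
    }

    That gives the familiar -0.50 for 2x, -1.00 for 4x, and -1.50 for 8x.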
     
  4. heymian

    heymian Master Guru

    Messages:
    622
    Likes Received:
    0
    GPU:
    ASUS Strix GTX 1080 Ti
    Yes sorry, I meant MFAA. Will edit my post. Also, just to clarify a couple of things: there was an update in March of this year that expanded MFAA support to pretty much any title that uses MSAA. I am currently using it with CS:GO, which is a DX9 title from 2012.

    Also, I'm not directly comparing MFAA against SGSSAA etc. Just pointing out that MFAA cleans up transparent textures, not just polygon edges. Refer to this video: http://youtu.be/fbyM5yYi-3c

    Anyway, I don't want to swing this thread way off topic. My hardware is not powerful enough to push the kind of frame rates I like with TrSSAA enabled anyway. I was more or less curious.
     
    Last edited: Nov 10, 2015

  5. MrBonk

    MrBonk Ancient Guru

    Messages:
    3,117
    Likes Received:
    132
    GPU:
    MSI RTX 2080
    Combining MFAA and SGSSAA is a bad idea. (Especially since we are limited to DX10+, and SGSSAA can only be enhanced.)
    Not to mention that the minimum FPS floor for MFAA to work without blurring and smearing in motion, even with just MSAA, is ~40 FPS.

    And the game has to have MSAA support already built in.
     
    Last edited: Nov 10, 2015
  6. scaramonga

    scaramonga Member Guru

    Messages:
    156
    Likes Received:
    13
    GPU:
    1080 Ti FTW3 W/C
    Hmm.

    'Prefer Maximum Performance' on 'global' just ensures your card is running hotter than it needs to be, even while the system is doing nothing. Not a good idea if your cooling ain't up to much. It is nice to have it clock down while doing lighter tasks, such as browsing, etc. ;)

    'Adaptive' on 'global', then 'Prefer Maximum Performance' on individual game profiles, is what I have set :) I'm on water, and the difference at idle is 10 °C between the two.
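
    If you'd rather script that per-game setup than click through Inspector for every title, the NVAPI driver settings (DRS) interface can do it. A rough sketch; the PREFERRED_PSTATE_* names are from NvApiDriverSettings.h as I remember them, so verify them against your copy of the header, and the profile name is just an example:

    Code:
    #include <cstdio>
    #include "nvapi.h"
    #include "NvApiDriverSettings.h"

    // Rough sketch: leave the global (base) profile on Adaptive, but set
    // "Power management mode" to Prefer Maximum Performance on one game's
    // existing driver profile. Error handling trimmed for brevity.
    int main() {
        if (NvAPI_Initialize() != NVAPI_OK) return 1;

        NvDRSSessionHandle session = 0;
        NvAPI_DRS_CreateSession(&session);
        NvAPI_DRS_LoadSettings(session);

        // Profile name exactly as it appears in Inspector's profile list
        // ("Just Cause 2" is only an example title).
        NvAPI_UnicodeString name = {0};
        const wchar_t* game = L"Just Cause 2";
        for (int i = 0; game[i]; ++i) name[i] = (NvU16)game[i];

        NvDRSProfileHandle profile = 0;
        if (NvAPI_DRS_FindProfileByName(session, name, &profile) == NVAPI_OK) {
            NVDRS_SETTING setting = {0};
            setting.version = NVDRS_SETTING_VER;
            setting.settingId = PREFERRED_PSTATE_ID;   // "Power management mode"
            setting.settingType = NVDRS_DWORD_TYPE;
            setting.u32CurrentValue = PREFERRED_PSTATE_PREFER_MAX;
            NvAPI_DRS_SetSetting(session, profile, &setting);
            NvAPI_DRS_SaveSettings(session);
            std::puts("Profile updated.");
        }

        NvAPI_DRS_DestroySession(session);
        NvAPI_Unload();
        return 0;
    }

    Inspector is essentially a front end for these same profile settings, so anything set this way shows up in its UI afterwards.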
     
    Last edited: Nov 11, 2015
  7. MrBonk

    MrBonk Ancient Guru

    Messages:
    3,117
    Likes Received:
    132
    GPU:
    MSI RTX 2080
    The only way to get Adaptive to kick in globally for me is to reset the display drivers.

    I'm on air, and the difference is about 9 °C at idle.

    Honestly, Adaptive can cause problems in a lot of games. And the average person probably doesn't want to, or won't remember to, go in and set 'Prefer Maximum Performance' for every single game they play.

    That's the reason for my recommendation, but perhaps I should add a note about this?

    Which I have now done.
     
    Last edited: Nov 11, 2015
  8. bjoswald

    bjoswald Member Guru

    Messages:
    156
    Likes Received:
    6
    GPU:
    Intel UHD 630
    I just set it to 'Prefer Maximum Performance' and forget about it. I can't be bothered to reset it for every game. Honestly, even when running full-screen videos and such, I haven't noticed any difference.
     
  9. A49ER08

    A49ER08 Member Guru

    Messages:
    186
    Likes Received:
    0
    GPU:
    EVGA GTX 1080 SC2
    Has anyone had any success tinkering with G-Sync using this program?

    For example, is it possible to turn G-Sync on and off?

    For some odd reason, my G-Sync turns off and is no longer visible in the Nvidia Control Panel when the PC goes to sleep (the monitor shuts off after 10 minutes of inactivity).
     
  10. MrBonk

    MrBonk Ancient Guru

    Messages:
    3,117
    Likes Received:
    132
    GPU:
    MSI RTX 2080
    Hopefully someone can help you out here.

    Also, shouldn't there be a setting in Windows that tells your monitor when it's allowed to sleep?

    Pretty sure it's under Power Options or something.
     

  11. acknowledge

    acknowledge New Member

    Messages:
    7
    Likes Received:
    0
    GPU:
    nvidia GTX-660/970
    Is there any conflict when both the in-game and Inspector AF are set to 16x? Or when Inspector AF is set, is the in-game setting automatically disabled?
     
    Last edited: Jan 1, 2016
  12. GuruKnight

    GuruKnight Master Guru

    Messages:
    865
    Likes Received:
    15
    GPU:
    2 x 980 Ti AMP! Ex
    Technically you should always disable in-game AF when possible.
    But issues related to conflicts between in-game AF and driver AF are very rare.
     
  13. MrBonk

    MrBonk Ancient Guru

    Messages:
    3,117
    Likes Received:
    132
    GPU:
    MSI RTX 2080
    9.9 times out of 10 it doesn't matter, as the driver override ignores any setting in the game.

    I can't remember any specific case where there is a conflict.

    I remember Crysis 2 and, I think, 500 series cards had an issue with driver AF.
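
    For the curious, this is roughly what the override boils down to at the driver level. A minimal sketch against the NVAPI DRS interface, applied to the global base profile; the ANISO_* constants are the names from NvApiDriverSettings.h, so check them against your header before trusting this:

    Code:
    #include "nvapi.h"
    #include "NvApiDriverSettings.h"

    // Minimal sketch: force 16x AF globally by switching the AF selector to
    // user-managed and setting the level on the driver's base profile.
    // Once the selector is user-managed, the in-game AF option is ignored.
    static void setDword(NvDRSSessionHandle s, NvDRSProfileHandle p,
                         NvU32 id, NvU32 value) {
        NVDRS_SETTING setting = {0};
        setting.version = NVDRS_SETTING_VER;
        setting.settingId = id;
        setting.settingType = NVDRS_DWORD_TYPE;
        setting.u32CurrentValue = value;
        NvAPI_DRS_SetSetting(s, p, &setting);
    }

    int main() {
        if (NvAPI_Initialize() != NVAPI_OK) return 1;

        NvDRSSessionHandle session = 0;
        NvAPI_DRS_CreateSession(&session);
        NvAPI_DRS_LoadSettings(session);

        NvDRSProfileHandle base = 0;
        NvAPI_DRS_GetBaseProfile(session, &base);

        setDword(session, base, ANISO_MODE_SELECTOR_ID, ANISO_MODE_SELECTOR_USER);
        setDword(session, base, ANISO_MODE_LEVEL_ID, 16); // 16x anisotropic

        NvAPI_DRS_SaveSettings(session);
        NvAPI_DRS_DestroySession(session);
        NvAPI_Unload();
        return 0;
    }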
     
  14. acknowledge

    acknowledge New Member

    Messages:
    7
    Likes Received:
    0
    GPU:
    nvidia GTX-660/970
    Thank you guys for the answers :)
     
  15. MrBonk

    MrBonk Ancient Guru

    Messages:
    3,117
    Likes Received:
    132
    GPU:
    MSI RTX 2080
    OK, everything is complete now except for the SLI section. I do not know what those settings do, and if you are using SLI you probably already know.
     

  16. GuruKnight

    GuruKnight Master Guru

    Messages:
    865
    Likes Received:
    15
    GPU:
    2 x 980 Ti AMP! Ex
    Never mind.
     
    Last edited: Jan 25, 2016
  17. MrBonk

    MrBonk Ancient Guru

    Messages:
    3,117
    Likes Received:
    132
    GPU:
    MSI RTX 2080
    If you have something to add, or some criticism, please feel free to say something! ^^
     
  18. CybrDaath

    CybrDaath New Member

    Messages:
    1
    Likes Received:
    0
    GPU:
    Dual SLI Geforce 950 gtx
    Would love to see more added to this! Specifically, I'm curious as to what is going on in the Other section.



    Thanks,
    CybrDaath
     
    Thanks MrBonk for creating this detailed, updated guide for Nvidia Inspector driver management! I was reading an article about Tom Clancy's The Division and how the game tries to cache as much graphics data as possible in VRAM, evicting data in chunks and replacing the freed chunks of VRAM with new data. The general concept is streaming graphics on the fly, but The Division supposedly makes full use of cached graphics data without slowing down rendering.

    Anyway, with that said, I was curious whether the greyed-out "Memory Allocation Policy" setting in the drivers could further help with streaming in textures: Nvidia Inspector Driver Memory Allocation Policy

    I decided to try one game for now, Just Cause 2, because it has a repeatable benchmark and can still be performance-intensive on max settings.

    Below are several line graphs showing the differences in VRAM usage, framerate, and frametime for all three memory policies (As Needed, Moderate, and Aggressive). While this is an initial test, it looks like there is some gain in minimum framerates.

    VRAM Usage:
    VRAM Usage Chart

    Framerate:
    Framerate Chart

    Frametime:
    Frametime Chart

    Below are the settings used, as well as the average framerate reported at the end of the Just Cause 2 benchmark called "Dark Tower".

    As Needed Memory Policy Benchmark Results:
    As Needed Memory Policy Benchmark

    Moderate Memory Policy Benchmark Results:
    Moderate Memory Policy Benchmark

    Aggressive Memory Policy Benchmark Results:
    Aggressive Memory Policy Benchmark

    As Needed Memory Policy Benchmark Results 2nd run:
    As Needed Memory Benchmark Results 2nd run

    I think the Moderate memory allocation policy offers good results for average framerate and minimum framerates/frametimes. It doesn't seem to offer much of an overall framerate increase, but I think further testing might show higher minimum framerates/frametimes. Not much is known about the Memory Allocation Policy, which may or may not work with DirectX.
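
    In case anyone wants to reduce logs like these to numbers themselves, below is a minimal sketch for summarizing a frametime log. It assumes one frametime in milliseconds per line (a FRAPS-style dump); that format is my assumption, not necessarily what the charts above were built from.

    Code:
    #include <algorithm>
    #include <cstdio>
    #include <fstream>
    #include <vector>

    // Summarize a frametime log: average FPS, worst-frame minimum FPS,
    // and a percentile-style "1% low" figure.
    int main(int argc, char** argv) {
        if (argc < 2) { std::puts("usage: ftstats <frametimes.log>"); return 1; }

        std::ifstream in(argv[1]);
        std::vector<double> ft;
        for (double ms; in >> ms; ) ft.push_back(ms); // one frametime (ms) per line
        if (ft.empty()) { std::puts("no samples"); return 1; }

        double total = 0;
        for (double ms : ft) total += ms;
        double avgFps = 1000.0 * ft.size() / total;   // overall frames per second

        std::vector<double> sorted = ft;
        std::sort(sorted.begin(), sorted.end());      // ascending frametimes

        // "1% low": average FPS over the slowest 1% of frames.
        size_t n = std::max<size_t>(1, sorted.size() / 100);
        double slow = 0;
        for (size_t i = sorted.size() - n; i < sorted.size(); ++i) slow += sorted[i];
        double low1 = 1000.0 * n / slow;

        std::printf("frames %zu | avg %.1f FPS | min %.1f FPS | 1%% low %.1f FPS\n",
                    ft.size(), avgFps, 1000.0 / sorted.back(), low1);
        return 0;
    }

    Minimum framerate in particular is noisy from run to run, which is why percentile-style "1% low" figures are often more repeatable than a single worst frame.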
     
  20. MrBonk

    MrBonk Ancient Guru

    Messages:
    3,117
    Likes Received:
    132
    GPU:
    MSI RTX 2080
    That is very interesting! Thanks for checking it out. I'll add that to the OP.
    I'm curious, how did you make the graphs?
    I tried doing this once, but I could never quite get the graph to turn out right.
     
