Nvidia Inspector introduction and Guide

Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by MrBonk, Nov 5, 2015.

  1. MrBonk

    MrBonk Guest

    Messages:
    3,385
    Likes Received:
    283
    GPU:
    Gigabyte 3080 Ti
    Nvidia Profile Inspector introduction and Guide

    [​IMG]
    ___________________________________________________________________________________________________

Since there is not really a thorough, or even basic, guide about Inspector: how to set it up initially, how to use it, and what some of its options mean, I thought I would make one. I am adapting a portion of this from another guide I made on Steam, and it is a WIP at the moment.

    This guide is mainly focused on "Nvidia Profile Inspector" and editing the driver's profiles, rather than the card stats/OC side of the regular "Nvidia Inspector".

    _____________________________________________________________________________________________________
    |||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||

    Introduction:

    (Taken from the download description for the most part)
    Nvidia Inspector is best known for reading card stats and overclocking. However, it has another function: pulling up and editing the profiles the driver keeps for individual applications.
    This is much like the "3D settings" page in Nvidia Control Panel, but not only more in-depth, it is MUCH more responsive. NVCP is incredibly sluggish to use and makes you rely on Nvidia for just about everything. The problem is that they don't always expose the most useful functionality, and they may make wrong assumptions about any given game.

    This profile-editing functionality has been split off into the open-source "Nvidia Profile Inspector" by Orbmu2k: https://github.com/Orbmu2k/nvidiaProfileInspector
    https://www.guru3d.com/files-details/nvidia-profile-inspector-download.html
    This is the tool this guide focuses on.

    Inspector is what gives us the ability to:
    • Force high-quality anti-aliasing in Direct3D9 games
    • Force HBAO+
    • Add improved or additional SLI functionality for games that don't have it
    Among other things.

    ______________________________________
    ||||||||||||||||||||||||||||||||||||||||||||
    Download Nvidia Profile Inspector:

    * Download the latest build from here https://ci.appveyor.com/project/Orbmu2k/nvidiaprofileinspector/build/artifacts


    *Save it to a location that you can permanently use. Extract it and run NvidiaProfileInspector.exe.
    _____________________________________________________________
    Setup Guide for the Elevated Profile Settings mode, if using a version of Nvidia Inspector older than 2.0:
    Skip this section if you downloaded the standalone Nvidia Profile Inspector from above.
    * Download the latest version here
    In the future, if the application needs to update, it will tell you so and automatically download any updates.

    *Save it to a location that you can permanently use. Extract it.

    *When you open it for the first time you will be presented with
    [​IMG]

    Click the Tool icon next to the "Driver Version" Box to open up the driver profile settings.
    ______________________________________________________________
    *You are now greeted with this screen and the default "Global Home" Profile.
    [​IMG]

    These are the settings that apply globally to every application. It is VERY IMPORTANT that you minimize what you force globally to only basic things, such as texture filtering settings.


    *Here is my Global profile. You don't have to copy it, but I at least recommend setting AF to override and HQ at the very minimum.

    http://images.nvidia.com/geforce-co...interactive-comparison-001-on-vs-off-rev.html
    This is why you force 16xAF globally. Developers seemingly can't be trusted.

    It is worth noting, however, that whether you want to use the "Adaptive" power management mode or "Prefer Maximum Performance" is up to you.
    Adaptive WILL cause problems with many games that aren't the latest AAA fare.
    The card will do a variety of things, such as not maintaining the base clock as a minimum, causing performance issues because the card will essentially start dropping back to idle clocks.
    On the flip side:
    Adaptive will reduce idle temperatures, depending on individual circumstances, and the card will otherwise downclock correctly.


    So, if you want to use Adaptive, just remember to set Prefer Maximum Performance in each profile for each game you play, or follow the steps below:
    • If you intend on using Adaptive:
    • Set the global setting to "Prefer Maximum Performance" each time you are going to play a game, and reset the display driver to make sure it takes effect. (Adaptive won't work for me unless I reset the driver first, same with Prefer Maximum Performance. Your results may vary.)

      To reset the display driver, open "Device Manager" in Windows, right-click your display adapter and hit "Disable", then do it again to re-enable it.
    • Or set every single game you play individually to "Prefer Maximum Performance" on its profile. Personally, this is a pain in the ass, but all of these are viable methods.
    [​IMG]
    ________________________________________________________________________________________________
    ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||

    First steps for tweaking driver settings for any given game:

    *The first thing you will always want to do for any given game is search to see whether Nvidia has already made a profile for it.
    To do this, left-click in the white "Profiles" box and start typing the name of the game. The application will search through the list and narrow it down with each character you type.

    *If your game does not have a profile, you will need to create one.
    You can do so by following this graphic.
    [​IMG]

    *If you attempt to add an executable to a profile and it prompts you with a message that the .exe already belongs to a profile, pay attention to the name of the profile(s) it gives you. For example, if the game you were looking for didn't show up in a search, it's possible you worded it differently than it is set in the driver.
    HOWEVER, if the profiles it mentions aren't related to the game at all in any way, we will need to set up the profile to use an .exe based on its directory.

    When you go to select an .exe, click the drop-down box in the right-hand corner and set it to "Absolute Application Path".
    [​IMG]

    * Don't forget to hit Apply Changes after any changes you make! (For the curious, the sketch below shows roughly what Inspector is doing under the hood when you create a profile, attach an .exe, and apply changes.)
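    This is a minimal sketch using NVIDIA's public NVAPI driver settings (DRS) interface, which Profile Inspector is built on. The profile and .exe names are made up, error checking is stripped, and exact struct fields can vary between NVAPI SDK versions, so treat it as an illustration rather than a drop-in tool.

    Code:
    // Minimal sketch: create a driver profile and attach an .exe via NVAPI DRS.
    // Requires the NVAPI SDK headers/lib. All error checking omitted for brevity.
    #include <nvapi.h>
    #include <string.h>

    int main()
    {
        NvAPI_Initialize();

        NvDRSSessionHandle session = 0;
        NvAPI_DRS_CreateSession(&session);
        NvAPI_DRS_LoadSettings(session);               // read the current settings database

        // "Create profile": a new, empty profile with a display name.
        NVDRS_PROFILE profile = {0};
        profile.version = NVDRS_PROFILE_VER;
        memcpy(profile.profileName, L"My Game", sizeof(L"My Game"));  // hypothetical name

        NvDRSProfileHandle hProfile = 0;
        NvAPI_DRS_CreateProfile(session, &profile, &hProfile);

        // "Add application": attach an executable name to the profile.
        NVDRS_APPLICATION app = {0};
        app.version = NVDRS_APPLICATION_VER;
        memcpy(app.appName, L"mygame.exe", sizeof(L"mygame.exe"));    // hypothetical .exe

        NvAPI_DRS_CreateApplication(session, hProfile, &app);

        // "Apply changes": write the modified database back to the driver.
        NvAPI_DRS_SaveSettings(session);
        NvAPI_DRS_DestroySession(session);
        NvAPI_Unload();
        return 0;
    }

    Every option Inspector exposes ultimately boils down to a (setting ID, value) pair stored in this database, which is why so many of the settings below are raw hexadecimal values.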
    ________________________________________________________________
    ||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||

    List of basic options and explanations:

    1 - Compatibility

    • Ambient Occlusion Compatibility: This is where you enter or paste an 8-digit hexadecimal code (always with the prefix "0x") to get HBAO+ to work with any given game.
      There is an official list of flags built into the driver, configured for specific games (though not necessarily well). Unlike the Anti-Aliasing flags, these are not a combination of functions put together to form a flag; each one simply points at a set of pre-programmed parameters in the driver.
      The first two bytes (four hex digits) of the flag are reserved for flags pertaining to DX10 and above, while the second two bytes are reserved for flags pertaining to DX9 only.
      Code:
      0x[B]0000[/B]0000 < DX10+, 0x0000[B]0000[/B] < DX9 
      Each of the 8 digits can take one of 16 values:
      Code:
      0,1,2,3,4,5,6,7,8,9,A,B,C,D,E,F
      giving a total of 65536 (16^4) potential flags for each set of APIs. (A sketch of how such a flag splits apart follows at the end of this section.)
    • Antialiasing Compatibility: This is where you enter or paste an 8-digit hexadecimal code (always with the prefix "0x") to get various forms of anti-aliasing to work with any given DX9 game.
      Unlike AO compatibility, this hex code is actually a combination of functions that tell the driver what to do: what kind of buffer formats to look for, how to process them, among other things.
    • Antialiasing compatibility (DX1x): This is where you would enter or paste an 8-digit hexadecimal code to force AA in DirectX 10+... IF YOU HAD ONE!!:eyes:
      Nvidia's effort sadly fell off the wagon here. There are few functions available for this, and none of them work in any meaningful way. I don't even think the ones that are set up for early DX10 games work all that well; from what I remember, they mostly only work for MSAA.
    • Antialiasing Fix: This one is somewhat of a mystery. Initially it was apparently made just for an issue relating to Team Fortress 2 (as such, it was originally known as the TF2Fix), but as it turned out, it affects a very large number of games.

      Currently the only description of the function available is
      Code:
      "FERMI_SETREDUCECOLORTHRESHOLDSENABLE" (Fermi > Set Reduce Color Thresholds Enable)
      This would suggest it's a Fermi issue, but it really applies to at least Fermi and everything after it.
      It's also interesting that turning the AA fix On actually disables this function: the default setting of "Off" is a value of 0x00000001, and the On value is 0x00000000 (Team Fortress 2). (It should just say ON in Inspector, but there is currently a bug that shows you the hex code here.) (*As of 05/22/16)

      DO NOT enable this globally, as what it does depends on the game. The Anti-Aliasing flags thread notes whether a game needs it or whether it causes issues.
    • SLI Compatibility bits: This is where you enter or paste an 8-digit hexadecimal code (always with the prefix "0x") to get SLI working in DX9 applications, if the application doesn't already have a flag in the driver, or if the official flag works poorly.

      Like AA compatibility bits, these flags are a combination of functions within the driver.
    • SLI compatibility bits (DX10+DX11): The same as above, but for getting SLI working in DX10+ applications: use it if the application doesn't already have a flag in the driver, or if the official flag works poorly.

      Like AA compatibility bits, these flags are a combination of functions within the driver.
    • SLI compatibility bits (DX12): This is a new one. I assume it works the same as the other two. Currently there are only 2 flags in the driver, but as more DX12 games come out I'm sure there will be more; it should be interesting to see how this plays out.
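    Since all of these compatibility fields are just 32-bit values, here is a small standalone sketch (no driver calls) of how an Ambient Occlusion flag splits into the DX10+ and DX9 halves described above. The flag value itself is made up purely for illustration.

    Code:
    #include <cstdint>
    #include <cstdio>

    int main()
    {
        // Hypothetical AO compatibility flag, for illustration only.
        // High 16 bits (first four hex digits) = DX10+ selector,
        // low 16 bits (last four hex digits)   = DX9 selector.
        uint32_t aoFlag = 0x00010003;

        uint16_t dx10Part = (aoFlag >> 16) & 0xFFFF;
        uint16_t dx9Part  =  aoFlag        & 0xFFFF;

        // Four hex digits, 16 values each: 16^4 = 65536 possible selectors
        // per API family, matching the count given above.
        printf("DX10+: 0x%04X, DX9: 0x%04X\n", dx10Part, dx9Part);
        return 0;
    }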
    2 - Sync and Refresh
    • Frame Rate Limiter: This setting enables the driver's built-in frame rate limiter, with a series of pre-defined, non-whole-number values. It's worth noting that its quality has historically been a bit spotty. The behavior was changed at some point so that whole numbers aren't possible; I think that's simply due to how it works (some sort of prediction, or a more complicated system that has never been exposed to the user). Before the change, the limiter would never really stick to whole values anyway.
      I think it's worthwhile now to do more investigating of the limiter as it currently is. The 60 FPS setting is 59.7 or 60.7, and with Vsync enabled it might work differently too.
      Personally though, in all of my experience, Unwinder's RTSS is generally more useful, as it lets you set the value yourself and is more consistently stable.
    • GSYNC Application Mode: When using GSYNC, it is important to keep in-game Vsync disabled to avoid conflicts.
    • GSYNC Requested State:
    • GSYNC Global Feature:
    • GSYNC Global Mode:
    • GSYNC Indicator Overlay:
    • Maximum pre-rendered frames: Controls how many frames the CPU is allowed to prepare ahead of the GPU.
      Values go from 1-8; the driver default is 3, and I would not recommend going higher than that. A value of 1 or 2 will reduce input latency further, at the cost of slightly higher CPU load.
      When using 1/2 refresh rate Vsync, a value of 1 is essentially required (sometimes 2 will suffice, but 1 generally reduces latency more), as 1/2 sync introduces significantly more input latency.
      In addition, setting "30 fps (Frame Rate Limiter v2)" may also help reduce input latency when using this. You may want to try the v2 30 FPS limit with 60 Hz sync as well; it might have better latency. (A sketch of the application-side counterpart of this setting follows at the end of this section.)
    • Triple buffering: Enables triple buffering for Vsync, but ONLY for the OpenGL API. For a rundown of TB, here is an article. If you wish to enable TB for D3D APIs, you can download and use D3DOverrider.

      It's worth noting that GSYNC makes the concept of double and triple buffering entirely irrelevant; this is only for standard fixed-refresh monitors.
    • Vertical Sync Smooth AFR behavior:
    • Vertical Sync Tear Control: This controls what happens when a frame drop is detected: whether Vsync should be disabled to maintain performance (Adaptive), or whether sync should drop to the next syncable rate (Standard). At 60 Hz, without Adaptive, the frame rate will drop to 30 FPS, because 1/2 refresh is the next syncable rate.
      You can use TB as mentioned above instead of Adaptive; or, as long as you ensure you have enough power to sustain the performance you are aiming for, it shouldn't be an issue.

      Adaptive in my experience can be hit and miss, but so can triple buffering. In some cases TB can increase input latency, leave it the same, or decrease it (despite what anyone may say).
      It's up to you what you prefer to use. I prefer not to use Adaptive. And again, GSYNC makes this irrelevant.
    • Vertical Sync: Controls whether Vsync can be enabled for any given application. Typically it's set to "Application Controlled", which means it's up to the individual application itself to enable/disable or offer an option for Vsync.
      If you have a GSYNC monitor and install a driver, this is forced to On automatically. Disabling it will disable GSYNC globally.
      One recent example is Fallout 4: the game has no Vsync option, but Vsync is forced on no matter what.
      You can disable it by setting this to "Force Off" on the Fallout 4 profile.
      Remember, GSYNC makes this irrelevant (AFAIK). It is also important to keep in-game Vsync disabled to avoid conflicts when GSYNC is enabled, though in one specific case, The Division, it might not work right.
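    As a point of reference for "Maximum pre-rendered frames": applications can request the same render-ahead limit themselves through DXGI, and the profile setting is the driver-side counterpart of that request. A minimal D3D11 sketch of the application-side call, with error handling trimmed and the device assumed to already exist:

    Code:
    // Application-side counterpart of "Maximum pre-rendered frames":
    // IDXGIDevice1::SetMaximumFrameLatency caps how many frames the CPU
    // may queue ahead of the GPU. Assumes 'device' was already created.
    #include <d3d11.h>
    #include <dxgi.h>

    void LimitRenderAhead(ID3D11Device* device)
    {
        IDXGIDevice1* dxgiDevice = nullptr;
        if (SUCCEEDED(device->QueryInterface(__uuidof(IDXGIDevice1),
                                             (void**)&dxgiDevice)))
        {
            // 1 = lowest input latency, at the cost of less CPU/GPU overlap.
            dxgiDevice->SetMaximumFrameLatency(1);
            dxgiDevice->Release();
        }
    }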
    3 - Antialiasing
    • Antialiasing - Behavior Flags: These mostly exist as a method of governing how AA can be applied from Nvidia Control Panel (they are mostly useless today, except for the ones that disable all usage).
      BUT they also affect Inspector, so you will want to make sure you clear out ANY flags in this field on a game profile when forcing AA!
      They WILL interfere and cause forcing not to work if you aren't careful.
    • Antialiasing - Gamma Correction: Gamma correction for MSAA. This is defaulted to On by the driver for all GPUs from the Fermi generation forward. It was new about 10 years ago around Half-Life 2, and there is no reason to disable it on modern hardware.
      http://www.anandtech.com/show/2116/12
    • Antialiasing - Line Gamma: From what I know, this is only for OpenGL, and I'm not sure what it actually does. If you know, please post!
    • Antialiasing - Mode: This has 3 settings
      Code:
      [LIST=1]
      [*]Application Controlled
      [*]Override Application Setting
      [*]Enhance Application Setting
      [/LIST]
      
      When overriding AA in a game, you will want to set "Override Application Setting" and not either of the other two. Enhancing AA is entirely dependent on the implementation of MSAA in the game you are modifying a profile for.
      More often than not, especially in modern DX10+ games, this is a total crapshoot: either it doesn't work, it breaks something, or it looks very bad.

      However, there are a few exceptions. In certain games/engines, like Monolith Productions' mid-2000s LithTech-based games that use some sort of MSAA-based FSAA, the above 3 settings generally do not matter.

      A specific example: if you have the correct AA flag on the profile and leave the mode at Application Controlled, but have 8xQ MSAA and 8xSGSSAA enabled below it, then enabling F.E.A.R. 2's FSAA will enhance the FSAA.
      In this specific case, this actually looks far better than 8xSGSSAA or FSAA by themselves!

      Another example is Red Faction: Guerrilla. You can't force AA in this game, but you can enhance the in-game MSAA with various methods of AA to some decent results. It really shines when you combine the in-game AA + enhanced AA and then downsample (using DSR), while also enabling FXAA in the game profile.
      (FXAA works when enhancing in-game AA. It used to work when overriding as well, but has been broken in every driver after 331.82. It is applied last in the chain, so it doesn't conflict with other AA, though it's not recommended at native resolution over enhanced AA, if that makes sense; the oversampling from downsampling negates any smoothing issues.)

      This is a rather unique exception, as most games don't yield results this good.

      Here are a few comparisons showing it off.
      http://screenshotcomparison.com/comparison.php?id=103126
      This first one shows no AA by default | Vs | The game running at 2x2 Native resolution with 2xMSAA enabled in game with "Enhance Application Setting" enabled and set to 4xS (1x2 OGSSAA + 2xMSAA) together with 2xSGSSAA. Finally with FXAA enabled on the profile.

      http://screenshotcomparison.com/comparison.php?id=103127
      This second one is cropped from the native 3200x1800 buffer with 2xMSAA+4xS+2xSGSSAA |Vs| That with FXAA also enabled showing that there are still some rough edges that FXAA cleans up before it is downsampled back to native 1600x900


      The 3rd comparison shows 2x2 native resolution + 2xMSAA | Vs | 2x2 Native + 2xMSAA+4xS+2xSGSSAA+FXAA cropped and upsampled with point filtering by 2x2 to show how much more aliasing is tackled and resolved.
      http://screenshotcomparison.com/comparison/161297
    • Antialiasing - Setting: This is where you set the primary form of forced anti-aliasing, which can be MSAA, CSAA (G80 through Kepler GPUs ONLY), OGSSAA, or HSAA (12xS, for example).
      If you are going to use SGSSAA, you can use MSAA modes ONLY, and the number of color samples has to match. (See the sample-matching sketch at the end of this section.)
      This image from GuruKnight's thread explains this well.
      http://u.cubeupload.com/MrBonk/revisedaaoverviewtgr.png
    • Antialiasing - Transparency Multisampling: http://http.download.nvidia.com/dev...parency/docs/AntiAliasingWithTransparency.pdf
      The number of games this works with is unknown, but the results can be nice when it does work.
      http://screenshotcomparison.com/comparison/149642

      Nvidia has a demo of this you can download that also includes a demo of 4xTrSSAA
      https://www.nvidia.com/object/transparency_aa.html
    • Antialiasing - Transparency Supersampling:
      The only options here are Transparency Supersampling (TrSSAA) and Sparse Grid Supersampling (SGSSAA).
      In reality they are both sparse-grid supersampling, but they differ in their approach: TrSSAA is formally SGSSAA, while what Inspector calls SGSSAA is actually FSSGSSAA (that's a mouthful), which stands for Full-Scene Sparse Grid Super Sampling Anti-Aliasing.

      Both work by replaying the pixel shading N times, where N is the number of color samples. TrSSAA (SGSSAA), however, is decoupled from the main MSAA pass, in that it only applies the supersampling to alpha-tested surfaces, like the flat textures that come in all forms and varieties.

      SGSSAA (FSSGSSAA), on the other hand, is coupled with MSAA and needs the sample counts to match to work properly, as it uses the MSAA subsamples for the entire scene.

      GuruKnight's image again explains some of the usage of this.
      http://u.cubeupload.com/MrBonk/revisedaaoverviewtgr.png

      Do note that, again, these usually require AA compatibility flags to work!
    • Enable Maxwell sample interleaving (MFAA): This enables Nvidia's Multi-Frame Anti-Aliasing mode. It only works in DXGI (DX10+) and requires either MSAA enabled in the game or MSAA forced (good luck with that; the games where forcing works are few and far between).

      What it does is change the subsample grid pattern every frame; the result is then reconstructed in motion with a "Temporal Synthesis Filter", as Nvidia calls it.
      There are some caveats to using this, though:
      • It is not compatible with SGSSAA, as far as I have been able to test in a limited fashion with DX10+.
      • With TrSSAA, in one case I tested, it could cause some blur on TrSSAA-affected surfaces.
      • It causes visible flickering on geometric edges and other temporal artifacts, depending on the game and its MSAA implementation. Part of this is nullified by downsampling, though, so it's GREAT to use with downsampling to improve AA/performance.
      • In screenshots and videos captured locally, there will be visible sawtooth patterns.
      • It has a framerate requirement of about 40 FPS minimum; otherwise the Temporal Synthesis Filter seems to fall apart in a strange way, depending on the game.
      • It is not compatible with SLI (yet?).

        For example, with Grandia II Anniversary or Far Cry 3: Blood Dragon, when the game is in motion below that framerate you'll notice severe blurring and smearing, making it unplayable. Strangely enough, if you record video while the framerate is under the threshold, it will not be visible in the recording at all. Bizarre.

        In Lost Planet (DX10), at 40 FPS and under, the flickering and temporal artifacts simply increase, making it look irritating.
    • Nvidia Predefined FXAA usage: Simply tells the driver whether FXAA is allowed to be turned on from Nvidia Control Panel (primarily) or Nvidia Inspector.
    • Toggle FXAA indicator on or off: If enabled, this displays a small green icon in the upper-left corner showing whether FXAA is on.
    • Toggle FXAA on or off: Turns FXAA on or off. You can also enable this when you are enhancing in-game AA, as shown above. You used to be able to do so when overriding as well, but that has been broken in every version after 331.82.
      You wouldn't want to use it ALL the time, only in very specific cases involving oversampling. (Battlefield: Bad Company 2 is one game I can think of that would benefit, if you still could do it.)
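    One rule from this section is worth restating compactly: forced SGSSAA only works properly when its sample count matches the COLOR sample count of the mode chosen under "Antialiasing - Setting", and with CSAA modes the color count is lower than the name suggests. A tiny illustrative sketch of that rule; the mode table is abbreviated and hypothetical, not a full list of Inspector's modes:

    Code:
    #include <cstdio>
    #include <cstring>

    // Color-sample counts for a few common "Antialiasing - Setting" modes.
    // Abbreviated, illustrative table only.
    int ColorSamples(const char* mode)
    {
        if (!strcmp(mode, "2xMSAA")) return 2;
        if (!strcmp(mode, "4xMSAA")) return 4;
        if (!strcmp(mode, "8xQ"))    return 8;  // true 8x MSAA (8 color samples)
        if (!strcmp(mode, "8xCSAA")) return 4;  // only 4 color samples!
        return 0;
    }

    // SGSSAA must match the COLOR sample count, not the coverage count.
    bool SgssaaCombinationValid(const char* msaaMode, int sgssaaSamples)
    {
        return ColorSamples(msaaMode) == sgssaaSamples;
    }

    int main()
    {
        printf("4xMSAA + 4xSGSSAA: %s\n",
               SgssaaCombinationValid("4xMSAA", 4) ? "ok" : "mismatch");
        printf("8xCSAA + 8xSGSSAA: %s\n",
               SgssaaCombinationValid("8xCSAA", 8) ? "ok" : "mismatch");
        return 0;
    }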
    4 - Texture Filtering
    • Anisotropic filtering mode: Simply tells the driver whether it controls AF or the application does its own thing.
      I highly recommend you leave this set to "User Defined/Off", because lots of games do not have texture filtering options, and lots of games have mediocre (or intentionally mediocre) texture filtering.

      Most of the time, driver-level AF is also higher quality than in-game AF.
      Recent examples: Just Cause 3, Assassin's Creed Syndicate.
      Only rarely will overriding globally incur a performance hit of any significance.
      AC: Syndicate is one example of this, but it's worth it IMO, and it is a rare duck; most games do not incur the same kind of performance cost.
    • Anisotropic filtering setting: If you have the mode set as above, this determines what level of texture filtering is forced on an application. 16x is the best. There are also options for "Off [point]", which is point filtering (you wouldn't want this 9 times out of 10), and "Off [linear]", which I'm pretty sure is bilinear filtering.
    • Prevent Anisotropic Filtering: Similar to the AA behavior flags: if this is set to On, the driver will ignore AF overrides from Inspector or NVCP. You don't want this. Some games have this set to On by Nvidia, and I'm not sure why; I've never seen an issue arise in those games.
    • Texture Filtering - Anisotropic filter optimization: This and the setting below don't have much information available; the most I could glean was from patents by Nvidia, no less. Essentially, they reduce the number of texture samples taken when using AF, to improve performance (like the trilinear optimization below). Leave these disabled; they might have been necessary 10-12 years ago, but not now.

      Update 07/16 - Taken from NvGames.dll
    • Texture Filtering - Anisotropic sample optimization - See above
    • Texture Filtering - Driver Controlled LOD Bias: When using SGSSAA, enabling this allows the driver to compute its own negative LOD bias for textures, to help improve sharpness for those who prefer it. It's generally less aggressive than the fixed amounts that are commonly recommended.

      When this is enabled, setting a manual bias will not do anything; the automatic LOD bias will always be applied.
    • Texture Filtering - LOD Bias (DX) - The level-of-detail bias setting for textures in DirectX backends. This normally only works under 2 circumstances.
      For both, "Driver Controlled LOD Bias" must be set to "Off".
      1. When overriding or enhancing AA.
      2. The other is an interesting one: if you leave "Antialiasing - Mode" set to "Application Controlled" but set the AA and transparency settings to SGSSAA (e.g. 4xMSAA and 4xSGSSAA; TrSSAA in OpenGL), then you can freely set the LOD bias and the changes will work without forcing AA. This has the side effect that, in some games that have MSAA, it will act as if you were "enhancing" the game's setting.
        Comparison example: http://screenshotcomparison.com/comparison/159382

      An explanation of the LOD bias: http://naturalviolence.webs.com/lodbias.htm
      If you wish to use a negative LOD bias when forcing SGSSAA, the commonly recommended amounts are -0.5 for 2x, -1.0 for 4x, and -1.5 for 8x. (See the sketch at the end of this section.)
      Do not use a negative LOD bias with OGSSAA (YxY modes) or HSAA modes; these already apply their own automatic LOD bias (which can itself cause issues in some games).
      http://naturalviolence.webs.com/sgssaa.htm
    • Texture Filtering - LOD Bias (OGL) - The same as above, except for OpenGL. I don't remember if the trick mentioned above also works in OGL for using an LOD bias without forcing AA; if you want to try, set the transparency setting to "4x Supersampling" instead of SGSSAA.
    • Texture Filtering - Negative LOD bias - This used to control whether negative LOD biases were clamped (not allowed) or allowed. With Fermi GPUs and on, it no longer really functions; by default it clamps. Driver Controlled LOD Bias works either way.
    • Texture Filtering - Quality - Leave this at High Quality. The lower settings are an old optimization for older hardware, improving AF performance at the cost of some quality. If you have older hardware, like G80 (8000 series) or prior, feel free to play around to see if it helps at all.
    • Texture Filtering - Trilinear Optimization - Same as above: an optimization that reduces the number of texture samples taken for texture filtering. The High Quality setting above disables it anyway, and it may only apply when using trilinear filtering.
      Patent - http://www.freepatentsonline.com/7193627.html
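    The commonly recommended negative LOD bias values for SGSSAA follow a simple pattern: half a mip level per doubling of the sample count, i.e. bias = -0.5 * log2(N). A sketch of that arithmetic; treat it as the community rule of thumb reflected in the guides linked above, not as exact driver behavior:

    Code:
    #include <cmath>
    #include <cstdio>

    // Rule-of-thumb negative LOD bias for N-sample SGSSAA:
    // -0.5 * log2(N) = -0.5 for 2x, -1.0 for 4x, -1.5 for 8x.
    // "Driver Controlled LOD Bias" typically picks something slightly
    // less aggressive than these fixed amounts.
    double RecommendedLodBias(int sgssaaSamples)
    {
        return -0.5 * std::log2(static_cast<double>(sgssaaSamples));
    }

    int main()
    {
        const int counts[] = {2, 4, 8};
        for (int n : counts)
            printf("%dxSGSSAA -> LOD bias %.1f\n", n, RecommendedLodBias(n));
        return 0;
    }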
    5 - Common
    • Ambient Occlusion setting - This needs to be set to enable HBAO+; there are 3 levels:
      1. Performance
      2. Quality
      3. High Quality
      Quality and High Quality are pretty similar (though before HBAO+ was introduced there was a bigger difference). Performance noticeably lowers the resolution and precision of the effect in many games, with less accurate and stronger shading. However, in some games it actually fixes buggy behavior seen with the Quality and High Quality settings, without other drawbacks (e.g. Oddworld: New 'n' Tasty, Urban Trial Freestyle). The HBAO+ thread and list usually mention when this is needed.
    • Ambient Occlusion usage - When using HBAO+, just set this to On.
    • Extension limit - I'm not sure exactly what this is (excuse my stupidity), but my Google-fu has turned up posts from people needing this feature for OpenGL games to work correctly, not crash, run at the right speed, etc. (examples: Soldier of Fortune and one of the Riddick games?). So it might be worth leaving it at "On", or setting it for specific OGL games. There are a few other values listed as well, so if you have an OGL game, feel free to play around. (A sketch of the failure mode it likely works around follows at the end of this section.)

      Update 07/16 - I found this information
    • Multi-display/mixed-GPU acceleration -
    • Power management mode: -
    • Shader cache: - This was added in driver 337.88
    • Threaded optimization - We do not know exactly what this does, but it works in DX and OGL, and it can apparently help or hurt depending on the game. It defaults to Auto, so that might be the best way to leave it, aside from known problematic games.
      Known games with problems when it is enabled:
      • Neverwinter Nights
      • Battlefield: Bad Company 2 (in multiplayer)
      • The Chronicles of Riddick: Assault on Dark Athena
      • DayZ/Arma 2 (might not be the case anymore; verification would be nice)
      Known games it helps when enabled:
      • Source Engine games (verification?)
      • Sleeping Dogs
      If you know of any other problem games, do let me know!
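    Regarding "Extension limit": the classic failure it appears to work around is old OpenGL games copying the driver's extension list into a small fixed-size buffer. Modern drivers report a far longer GL_EXTENSIONS string than late-90s code ever expected, so the copy overflows and the game crashes; limiting the reported string length sidesteps the bug. A hedged sketch of the broken pattern (illustrative only, not code from any specific game):

    Code:
    // Illustration of the bug "Extension limit" likely works around:
    // old games copied GL_EXTENSIONS into a small fixed buffer. Modern
    // drivers return several KB of extensions, so the copy overflows.
    #include <GL/gl.h>
    #include <cstring>

    void CheckExtensionsLikeA1999Game()   // assumes a current GL context
    {
        char buffer[256];  // plenty in 1999, far too small today
        const char* ext = (const char*)glGetString(GL_EXTENSIONS);

        strcpy(buffer, ext);  // overflows -> crash or corruption
        // ... the game would then strstr() for "GL_ARB_multitexture", etc.
    }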
    6 - SLI
    • Antialiasing - SLI AA: From GuruKnight
    • Disable bridgeless SLI
    • Number of GPUs to use on SLI rendering mode
    • NVIDIA predefined number of GPUs to use on SLI rendering mode on DX10
    • NVIDIA predefined number of GPUs to use on SLI rendering mode
    • NVIDIA predefined SLI mode on DirectX10
    • NVIDIA predefined SLI mode
    • SLI indicator
    • SLI Rendering mode
    • Memory Allocation Policy - ElectronSpider did some testing with this, with some interesting results that are worth a look for games that are more VRAM-intensive: http://forums.guru3d.com/showpost.php?p=5243365&postcount=39
    ________________________________________________________________

    I also took some time and made some new icons for Nvidia Inspector, plus a variant for the profile .exe (P = Profiles).
    This took WAY more time than I expected lol... each one had to be made in 256x256, 128x128, 64x64, 48x48, and 16x16 versions and combined into one .ico.

    I used this as a base. http://www.geforce.com/sites/default/files-world/attachments/GeForce Claw Logo.jpg

    The reason there is a version with the P on the left is that, at least in Windows 7, the administrator icon covers up the lower-right corner, and thus a P on the right. I personally modified my "imageres.dll" to replace that icon with a transparent one, but that's sketchy; I also had to run SFC /SCANNOW to fix the problems that created. Not everyone may want to do that.

    Icons
    [​IMG][​IMG]
    [​IMG][​IMG]

    [​IMG]


    DL Link https://www.mediafire.com/?3tn279llv4eokk6
     
    Last edited: Sep 24, 2016
    OnnA, mirh, Magicblow and 1 other person like this.
  2. bjoswald

    bjoswald Guest

    Messages:
    156
    Likes Received:
    6
    GPU:
    Intel UHD 630
    Thank you for the guide (especially the Global Profile settings).
     
    Last edited: Nov 5, 2015
  3. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    15,636
    Likes Received:
    9,512
    GPU:
    4090@H2O
    Interesting, I always wondered if there was something like a tutorial for this. Gotta read it carefully later on, thanks!
     
  4. GuruKnight

    GuruKnight Guest

    Messages:
    869
    Likes Received:
    17
    GPU:
    2 x 980 Ti AMP! Ex
    Great initiative, although I always say the best way of learning is by doing.
    This is also the case with NVIDIA Inspector IMO.

    Writing a comprehensive "guide" on every driver function exposed by Inspector is a very ambitious project, which is probably why it hasn't been attempted before :)
     

  5. VAlbomb

    VAlbomb Guest

    Messages:
    152
    Likes Received:
    6
    GPU:
    Nvidia G1 Gaming GTX 970
    Does Transparency Multisampling even work anymore?
     
  6. MrBonk

    MrBonk Guest

    Messages:
    3,385
    Likes Received:
    283
    GPU:
    Gigabyte 3080 Ti
    Oh, I very much agree personally :/.

    But my work on pcgamingwiki.com lately has made me consider new perspectives, based on discussions with people there.

    In particular, I think you would be able to help with the SLI parts, wouldn't you? You have quite a bit more experience with SLI than I do.

    Most of it is just going to be basic explanations of what things do. For more in-depth things, there will be slightly bigger explanations.

    I don't know anything about GSYNC either, so if anyone knows about those functions and what they do, that'd be great.

    I think it does work in the games it always worked in. Oblivion/Fallout 3, for example, I think? Half-Life 2, maybe?

    http://www.tweakguides.com/Fallout3_5.html

    Perhaps it's worth investigating.

    I've tested it on one older game; no results so far.
     
    Last edited: Nov 6, 2015
  7. GuruKnight

    GuruKnight Guest

    Messages:
    869
    Likes Received:
    17
    GPU:
    2 x 980 Ti AMP! Ex
    Sure, I have been using SLI for around 9 years now :)
    And I do have various tips and tricks, some of which involve special undefined Inspector functions and such.
    But it is going to get very technical and lengthy if we are going to cover everything.
     
  8. khanmein

    khanmein Guest

    Messages:
    1,646
    Likes Received:
    72
    GPU:
    EVGA GTX 1070 SC
    It's been quite a long time with no new update. I'm hoping a new version releases soon.
     
  9. VAlbomb

    VAlbomb Guest

    Messages:
    152
    Likes Received:
    6
    GPU:
    Nvidia G1 Gaming GTX 970
    Also it seems Texture Filtering - Negative LOD bias "Clamp" no longer works on cards older than Fermi, so should it be set to "Allow" for Maxwell cards?
     
  10. GuruKnight

    GuruKnight Guest

    Messages:
    869
    Likes Received:
    17
    GPU:
    2 x 980 Ti AMP! Ex
    "Texture Filtering - Negative LOD bias" is basically a legacy setting now, and has no real impact on IQ on any NVIDIA card.
    This was replaced with these variables:

    Texture Filtering - Driver Controlled LOD Bias
    Texture Filtering - LOD Bias (DX)
    Texture Filtering - LOD Bias (OGL)

    By default, the LOD bias is automatically controlled with forced and enhanced SGSSAA in DirectX.
    In OpenGL you have to set "Driver Controlled LOD bias" to "Off" and manually apply a negative LOD bias to get the same effect.
     
    Last edited: Nov 7, 2015

  11. tsunami231

    tsunami231 Ancient Guru

    Messages:
    14,702
    Likes Received:
    1,843
    GPU:
    EVGA 1070Ti Black
    Your download link doesn't work. Or is it just me?
     
  12. MrBonk

    MrBonk Guest

    Messages:
    3,385
    Likes Received:
    283
    GPU:
    Gigabyte 3080 Ti
    The download link should lead to the download at Guru3D.com.

    Also: Regarding clamp, it is correct that it doesn't work.
    You can also only do manual adjustment if SGSSAA, for example, is enabled on the profile (even if the mode is set to Application Controlled).

    I'm sorry I haven't updated since the initial post. I am unable to put in the time on some work days. Sunday night might be the first day I can do anything more.

    Also: I found a white paper about TrMSAA and a game that works with it too.
    http://http.download.nvidia.com/dev...parency/docs/AntiAliasingWithTransparency.pdf
    http://www.nvidia.com/object/transparency_aa.html

    http://screenshotcomparison.com/comparison/149642
     
    Last edited: Nov 7, 2015
  13. MrBonk

    MrBonk Guest

    Messages:
    3,385
    Likes Received:
    283
    GPU:
    Gigabyte 3080 Ti
    there was something borked with the link. Fixed it.
     
  14. heymian

    heymian Guest

    Messages:
    622
    Likes Received:
    0
    GPU:
    ASUS Strix GTX 1080 Ti
    Under 'Antialiasing - Transparency Supersampling', there is an option before 2x that just says 'Supersampling' I believe. I always wondered what this option equates to in terms of IQ and performance. Would be nice if you could shed some light on this particular variable.
     
  15. mr1hm

    mr1hm Active Member

    Messages:
    97
    Likes Received:
    2
    GPU:
    2070 Super XC Ultra
    2 quick questions,

    1. If I export a game profile from the current 358.87 driver and import that profile onto an older driver (358.50), would the profile still behave as it did on 358.87? Meaning, if there were any benefits in the 358.87 game profile, would you also see those benefits if the profile was imported to the 358.50 driver?

    2. Is the "Maximum pre-rendered frames" setting tied to the "Maximum frames allowed" setting? If I set max pre-rendered frames to 2, should I set max frames allowed to 2 as well?

    thanks in advance.
     

  16. MrBonk

    MrBonk Guest

    Messages:
    3,385
    Likes Received:
    283
    GPU:
    Gigabyte 3080 Ti
    I remember testing this a few years ago, but I don't remember the results. I will have to check.
    Here are some pictures:
    8xMSAA http://u.cubeupload.com/MrBonk/8x.png
    8xMSAA TrMSAA http://u.cubeupload.com/MrBonk/8xtrmsaa.png
    8xMSAA TrSSAA 1x (Is what we'll call it for the moment) http://u.cubeupload.com/MrBonk/8xtrssaa1x.png
    8xMSAA TrSSAA 2x http://u.cubeupload.com/MrBonk/8xtrssaa2x.png
    8xMSAA TrSSAA 4x http://u.cubeupload.com/MrBonk/8xtrssaa4x.png
    8xMSAA TrSSAA 8x http://u.cubeupload.com/MrBonk/8xtrssaa8x.png

    Looks identical to 4x TrSSAA?

    Based on the white paper I posted above, games can be programmed to call a specific setting to enable transparency supersampling, and the document refers to this as a 4x mode (4 samples for every pixel). So this basic mode is probably what that was.

    1. Benefits as in, say, driver optimizations? I'm going to have to err on the side of caution and say no, because whatever additional things exist on the profile may reference code or functions that are not set up properly, or do not exist at all, in older driver code.

    Feel free to experiment, though.
    Usually any official additions are at the bottom; many relate to stereo rendering, but others are optimizations, I believe. For all we know, it may not be all optimizations for any given game either.
     
    Last edited: Nov 8, 2015
  17. mr1hm

    mr1hm Active Member

    Messages:
    97
    Likes Received:
    2
    GPU:
    2070 Super XC Ultra
    Ah OK, thanks for the info; and yes, I meant driver optimizations :D

    Any idea about question #2? I've been wanting to mess around with the max pre-rendered frames setting, but wasn't sure if the max frames allowed setting had to be set the same as well.
     
  18. MrBonk

    MrBonk Guest

    Messages:
    3,385
    Likes Received:
    283
    GPU:
    Gigabyte 3080 Ti
    Oh whoops, I missed that. Sorry.

    AFAIK there is no relation, and according to what I have, the "Max Frames" command is an OGL function.

    The pre-rendered frames setting acts on its own and simply controls how many frames the CPU can render ahead of the GPU.
     
  19. mr1hm

    mr1hm Active Member

    Messages:
    97
    Likes Received:
    2
    GPU:
    2070 Super XC Ultra
    np and thank you for the explanation, appreciate it :)
     
  20. heymian

    heymian Guest

    Messages:
    622
    Likes Received:
    0
    GPU:
    ASUS Strix GTX 1080 Ti
    Looks like a mystery solved! Great work!

    Quick question: I know you're supposed to match MSAA samples with SGSSAA samples for optimal image quality (i.e. 4xMSAA + 4xSGSSAA). But does this rule apply to regular TrSSAA as well? If I use 2xMSAA, should I match it with 2xTrSSAA?

    To take it a step further, what if we add MFAA into the mix? If I were to combine MFAA + TrSSAA or SGSSAA, which sample rate am I trying to match? For example, 2xMSAA + MFAA enabled = 4xMSAA (not really, but almost equivalent). So, following the above rule for SGSSAA, should I be using 2xSGSSAA or 4xSGSSAA?

    Also, MFAA has its own alpha-blending techniques, and I'm wondering how that interacts with transparency antialiasing. Does one cancel the other out, or do they both combine to produce an even "cleaner" image?

    Thank you for your contributions.
     
    Last edited: Nov 10, 2015
