Guru3D.com Forums

Go Back   Guru3D.com Forums > Videocards > Videocards - NVIDIA GeForce Drivers Section
Videocards - NVIDIA GeForce Drivers Section In this section you can discuss everything GeForce driver related. GeForce ForceWare and GeForce Experience drivers are for NVIDIA Quadro and all GeForce-based videocards.



  (#76)
pedigrew
Master Guru
 
pedigrew's Avatar
 
Videocard: EVGA GTX770 SC ACX 2GB
Processor: Core i7-4770K @Stock
Mainboard: ASUS Z87-A
Memory: G.Skill Sniper 8GB 1866MHz
Soundcard: Onboard
PSU: Corsair RM850
Default 07-03-2015, 21:51 | posts: 200 | Location: Brazil

Quote:
Originally Posted by GuruKnight View Post
Personally I'm spending this "downtime" to plan a new PC system build.
Overall I'm very impressed by the new GTX 980 Ti cards, and intend on combining a couple of factory overclocked EVGA or ASUS cards with a 5930K CPU.
Could also be interesting to use one of my old GTX 780's as a dedicated PhysX card, to permit optimal SLI scaling in PhysX heavy titles
This in SLI would be awesome.
   
  (#77)
Zoson
Member Guru
 
Videocard: 2x GTX 980 @ 1557/8000
Processor: Core i7 5960x
Mainboard: ASUS Rampage 5 Extreme
Memory: 32GB Corsair LPX 3000MHz
Soundcard: Swans M10 & V-MODA M100
PSU: Corsair RM1000
Default 07-06-2015, 16:43 | posts: 59 | Location: Manhattan

Quote:
Originally Posted by GuruKnight View Post
The official SLI driver profile (0x020000F5) is completely optimal in terms of SLI scaling for Titanfall, and the light flare issues are purely engine related.
The only way of fixing this multi GPU issue is through a game patch made by the developers.

DX11 SLI bit #25 gives the extra "2" in 0x020000F5.
Without this compatibility bit the profile is simply 0x000000F5, which is equivalent to basic AFR 2 mode.
But unfortunately you can't remove bit #25 without breaking scaling, as you discovered yourself.
Hope this helps clear up the confusion
Thanks for the explanation GuruKnight. It does clear up the confusion. I'm now left wondering if there's a list of the various SLI compatibility flags and what they're supposed to do? Is there a list of them somewhere?

I'm also failing to understand how exactly the '2' in 0x020000F5 relates to '25', since it seems to be a hex dword. Or am I looking for something that doesn't exist, because the number of the compatibility flag has no relation other than that defined by NVIDIA within the dword?
   
  (#78)
Guzz
Member Guru
 
Videocard: GTX 970
Processor: 2550K@4.8GHz
Mainboard: ASUS P8Z68-V Pro/Gen3
Memory: 8GB
Soundcard: ASUS Xonar DG
PSU: 850W
Default 07-06-2015, 17:31 | posts: 105

Quote:
Originally Posted by Zoson View Post
I'm now left wondering if there's a list of the various SLI compatibility flags and what they're supposed to do? Is there a list of them somewhere?

Quote:
Originally Posted by Zoson View Post
I'm also failing to understand how exactly the '2' in 0x020000F5 relates to '25', since it seems to be a hex dword. Or am I looking for something that doesn't exist, because the number of the compatibility flag has no relation other than that defined by NVIDIA within the dword?
The '25' refers to the bit number in NVIDIA Inspector's bit value editor.
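For anyone still puzzling over how a bit number turns into that leading "2", here is a minimal sketch. It assumes Inspector's bit numbers simply index into the 32-bit profile value (this matches the examples in the thread, but it is not an official NVIDIA mapping):

```python
# Assumption: Inspector's "bit #N" is just bit N of the 32-bit profile dword.
BASE_AFR2 = 0x000000F5  # basic AFR 2 profile value mentioned above

def set_bit(value: int, bit: int) -> int:
    """Set one compatibility bit in the 32-bit profile value."""
    return value | (1 << bit)

# DX11 SLI bit #25 contributes 1 << 25 = 0x02000000, which is the extra "2":
titanfall = set_bit(BASE_AFR2, 25)
print(hex(titanfall))  # 0x20000f5
```

So the '2' and the '25' are the same thing written two ways: bit position 25 lands in the seventh hex digit.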
   
  (#79)
GuruKnight
Maha Guru
 
GuruKnight's Avatar
 
Videocard: 980 Ti AMP! Extreme SLI
Processor: Intel Core i7 5930K
Mainboard: Asus Rampage V Extreme
Memory: 16GB Corsair DDR4
Soundcard: Onboard+Sennheiser HD 595
PSU: 1500W SilverStone Strider
Default 07-07-2015, 15:12 | posts: 837 | Location: Denmark

Never mind.

Last edited by GuruKnight; 07-07-2015 at 22:32.
   
  (#80)
GuruKnight
Maha Guru
 
GuruKnight's Avatar
 
Videocard: 980 Ti AMP! Extreme SLI
Processor: Intel Core i7 5930K
Mainboard: Asus Rampage V Extreme
Memory: 16GB Corsair DDR4
Soundcard: Onboard+Sennheiser HD 595
PSU: 1500W SilverStone Strider
Default 07-15-2015, 19:17 | posts: 837 | Location: Denmark

Quote:
Originally Posted by pedigrew View Post
This in SLI would be awesome.
Two of these are on their way now
Thank you very much for the suggestion BTW.

I'm also working on a few general improvements to this thread and the spreadsheet, but won't have time to implement anything until my new PC system is up and running.

Last edited by GuruKnight; 07-16-2015 at 09:59.
   
  (#81)
GuruKnight
Maha Guru
 
GuruKnight's Avatar
 
Videocard: 980 Ti AMP! Extreme SLI
Processor: Intel Core i7 5930K
Mainboard: Asus Rampage V Extreme
Memory: 16GB Corsair DDR4
Soundcard: Onboard+Sennheiser HD 595
PSU: 1500W SilverStone Strider
Default 07-18-2015, 13:02 | posts: 837 | Location: Denmark

New general Inspector settings examples added to each category of the spreadsheet.
This should make it easier for novice users to get into the world of NVIDIA Inspector tweaking
   
  (#82)
Scorpio82Co
Master Guru
 
Scorpio82Co's Avatar
 
Videocard: Gigabyte GTX1070 G1 8GB
Processor: Intel core i5 6600k
Mainboard: MSI Z170 SLI PLUS
Memory: G.SKILL TridentZ 16 GB
Soundcard: Logitech G930
PSU: Rosewill Bronze 1000w
Default 09-08-2015, 19:42 | posts: 177 | Location: Cali,COLOMBIA

Finally, I found some SLI improvement in Deus Ex: HR in DX11.

SLI bits 0x000040F5: I can tell it gives about 23% more performance than the usual bits.
   
  (#83)
GuruKnight
Maha Guru
 
GuruKnight's Avatar
 
Videocard: 980 Ti AMP! Extreme SLI
Processor: Intel Core i7 5930K
Mainboard: Asus Rampage V Extreme
Memory: 16GB Corsair DDR4
Soundcard: Onboard+Sennheiser HD 595
PSU: 1500W SilverStone Strider
Default 09-10-2015, 11:06 | posts: 837 | Location: Denmark

Excellent work

Did you encounter any graphical glitches, flickering or other issues with this profile?
It would be great if you could make a couple of screenshot comparisons of the official driver profile vs. "40F5" (with the Precision X or Afterburner OSD running).

Try testing with something like 4xDSR (1920x1080 -> 3840x2160) to ensure full GPU usage.
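A quick sketch of the arithmetic behind that 4xDSR suggestion: the DSR factor scales the total pixel count, so each axis scales by the square root of the factor (the function name here is illustrative, not an NVIDIA API):

```python
import math

# A DSR factor multiplies the pixel count; each axis scales by sqrt(factor).
def dsr_resolution(width: int, height: int, factor: float) -> tuple[int, int]:
    scale = math.sqrt(factor)
    return round(width * scale), round(height * scale)

print(dsr_resolution(1920, 1080, 4.0))  # (3840, 2160)
```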
   
  (#84)
Scorpio82Co
Master Guru
 
Scorpio82Co's Avatar
 
Videocard: Gigabyte GTX1070 G1 8GB
Processor: Intel core i5 6600k
Mainboard: MSI Z170 SLI PLUS
Memory: G.SKILL TridentZ 16 GB
Soundcard: Logitech G930
PSU: Rosewill Bronze 1000w
Default 09-12-2015, 03:29 | posts: 177 | Location: Cali,COLOMBIA

Quote:
Originally Posted by GuruKnight View Post
Excellent work

Did you encounter any graphical glitches, flickering or other issues with this profile?
It would be great if you could make a couple of screenshot comparisons of the official driver profile vs. "40F5" (with the Precision X or Afterburner OSD running).

Try testing with something like 4xDSR (1920x1080 -> 3840x2160) to ensure full GPU usage.
Hehe, I will make time to post results. My tests were at 2880x1620, HBAO flag vs. SSAA improvements, plus ReShade, on my rig. The config went from 55-60 fps to 71 fps, with no issues of that kind so far. At 4K it runs smooth in vanilla at max options (under 60 fps); with my config, roughly 33-40 fps (due to the memory cost and the injector's implementation).

Last edited by Scorpio82Co; 09-12-2015 at 05:47.
   
  (#85)
GuruKnight
Maha Guru
 
GuruKnight's Avatar
 
Videocard: 980 Ti AMP! Extreme SLI
Processor: Intel Core i7 5930K
Mainboard: Asus Rampage V Extreme
Memory: 16GB Corsair DDR4
Soundcard: Onboard+Sennheiser HD 595
PSU: 1500W SilverStone Strider
Default 09-20-2015, 13:17 | posts: 837 | Location: Denmark

Perhaps so, but I must still insist on screenshot comparison evidence (with FPS counter) before adding anything to the spreadsheet.
SLI testing can be very complicated, and I would like to make sure DX11 SLI bit #14 doesn't cause any issues.
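As a sanity check on the naming, a small sketch under the same assumption as before (Inspector's bit numbers index the 32-bit profile value): bit #14 is exactly what turns the basic 0x000000F5 profile into the "40F5" value being tested.

```python
# Assumption: "bit #14" is bit 14 of the 32-bit SLI compatibility value.
BASE_AFR2 = 0x000000F5

value = BASE_AFR2 | (1 << 14)  # 1 << 14 == 0x4000
print(hex(value))              # 0x40f5
print((value >> 14) & 1)       # 1 -> bit #14 is set
```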
   
  (#86)
GuruKnight
Maha Guru
 
GuruKnight's Avatar
 
Videocard: 980 Ti AMP! Extreme SLI
Processor: Intel Core i7 5930K
Mainboard: Asus Rampage V Extreme
Memory: 16GB Corsair DDR4
Soundcard: Onboard+Sennheiser HD 595
PSU: 1500W SilverStone Strider
Default 09-27-2015, 00:54 | posts: 837 | Location: Denmark

Here is a small SLI report for The Vanishing of Ethan Carter Redux
http://www.forum-3dcenter.org/vbulle...postcount=2048
   
  (#87)
SnipeStar
Newbie
 
Videocard: 2x GTX 680 4GB
Processor: Core i7 3820
Mainboard: Asus Sabertooth X79
Memory: 16GB Corsair Vengeance LP
Soundcard: Auzentech HD HT 7.1
PSU: Corsair HX1000W
Default 09-30-2015, 16:48 | posts: 4 | Location: Hampton Roads

Yo, what the hell? I downloaded NVIDIA Inspector last night to get SLI working in Sniper Ghost Warrior 2, and I definitely do not have the "compatibility" category, so I don't have access to the SLI compatibility bits.

I've used Inspector in the past with no issues.
   
  (#88)
GuruKnight
Maha Guru
 
GuruKnight's Avatar
 
Videocard: 980 Ti AMP! Extreme SLI
Processor: Intel Core i7 5930K
Mainboard: Asus Rampage V Extreme
Memory: 16GB Corsair DDR4
Soundcard: Onboard+Sennheiser HD 595
PSU: 1500W SilverStone Strider
Default 10-01-2015, 14:56 | posts: 837 | Location: Denmark

Sounds strange. Try reinstalling your graphics driver with "perform clean installation" selected, and also download a fresh copy of Inspector.
BTW, there is already a customized .nip file ready for import under the "NVIDIA Inspector Profile" section of the Ghost Warrior 2 entry in the spreadsheet.
   
  (#89)
GuruKnight
Maha Guru
 
GuruKnight's Avatar
 
Videocard: 980 Ti AMP! Extreme SLI
Processor: Intel Core i7 5930K
Mainboard: Asus Rampage V Extreme
Memory: 16GB Corsair DDR4
Soundcard: Onboard+Sennheiser HD 595
PSU: 1500W SilverStone Strider
Default 11-10-2015, 14:25 | posts: 837 | Location: Denmark

I decided to include MrBonk's excellent DSR Tool Guide as an additional OGSSAA "example" below the appropriate category in the spreadsheet.
This should introduce users more directly to the concepts, and also make some of the game-specific DSR advice in the table easier to "approach".
   
  (#90)
DAOWAce
Newbie
 
DAOWAce's Avatar
 
Videocard: 2x GTX 780 (ASUS DC2)
Processor: i7-5930K 4.3GHz
Mainboard: Gigabyte GA-X99 Gaming 5P
Memory: G.Skill 32GB 2800MHz CL15
Soundcard: X-Fi Titanium
PSU: Seasonic SS-1050XP3
Default 11-20-2015, 14:01 | posts: 14 | Location: US East.

LOD bias has never worked for me since I got my 780, potentially even my 670, though I can't remember that far.

I've never been able to experience SGSSAA without the horrid blurring. No setting I've tried has ever removed it, yet it's been talked about in every guide about enabling SGSSAA.

Has anyone managed to actually correct the blur on modern hardware? AFAIK, it's not supported at all on Fermi+, as per NVIDIA's notes, so I don't know why it's still listed in guides all these years later.
   
  (#91)
dr_rus
Maha Guru
 
dr_rus's Avatar
 
Videocard: GTX 1080 GRP
Processor: i7-6850K
Mainboard: Sabertooth X99
Memory: 64 GB DDR4
Soundcard: SB X-Fi Ti
PSU: CM V1200 Platinum
Default 11-20-2015, 14:43 | posts: 1,700

Quote:
Originally Posted by DAOWAce View Post
LOD bias has never worked for me since I got my 780, potentially even my 670, though I can't remember that far.

I've never been able to experience SGSSAA without the horrid blurring. No setting I've tried has ever removed it, yet it's been talked about in every guide about enabling SGSSAA.

Has anyone managed to actually correct the blur on modern hardware? AFAIK, it's not supported at all on Fermi+, as per NVIDIA's notes, so I don't know why it's still listed in guides all these years later.
SGSSAA doesn't blur even without LOD tweaks. The LOD tweak lets you get more texture detail that won't shimmer, because SSAA antialiases textures as well as triangles; it does nothing about any blur.

What you're referring to is likely a bug in the game's post-processing shaders when combined with SGSSAA. It can be fixed in some titles with the correct AA bit.
   
  (#92)
GuruKnight
Maha Guru
 
GuruKnight's Avatar
 
Videocard: 980 Ti AMP! Extreme SLI
Processor: Intel Core i7 5930K
Mainboard: Asus Rampage V Extreme
Memory: 16GB Corsair DDR4
Soundcard: Onboard+Sennheiser HD 595
PSU: 1500W SilverStone Strider
Default 11-20-2015, 16:08 | posts: 837 | Location: Denmark

Actually the custom DX9 AA profiles have three overall purposes:

1) Prevent blurring.
2) Provide the best possible AA quality.
3) Prevent graphics glitches in situations with complicated post processing or lighting.

And automatic LOD correction with SGSSAA was already built into the drivers in January 2013 (313.95 BETA).
Only in OpenGL is it necessary to use manual LOD correction to prevent blurring with forced/enhanced TrSSAA (SGSSAA has no effect in OpenGL).

But of course there are situations (e.g. Witcher 2) where the lighting the game uses prevents forced AA without issues such as artifacts or blurring, no matter which settings or custom AA profile you use.
   
  (#93)
GuruKnight
Maha Guru
 
GuruKnight's Avatar
 
Videocard: 980 Ti AMP! Extreme SLI
Processor: Intel Core i7 5930K
Mainboard: Asus Rampage V Extreme
Memory: 16GB Corsair DDR4
Soundcard: Onboard+Sennheiser HD 595
PSU: 1500W SilverStone Strider
Default 12-06-2015, 14:29 | posts: 837 | Location: Denmark

The spreadsheet and OP should now be fully up to date again.
I also added some preliminary tweaks for Windows 10 to the introduction of the OP.
This is however still work in progress, and I might add more content if something changes.

BTW "0x000000F5" should be sufficient for decent SLI scaling in Fallout 4, but I decided not to add this since there is now an official SLI profile in the latest 359.06 driver.
However this driver breaks SLI in Star Wars: Battlefront, so I recommend sticking with 359.00 for the time being and just manually adding "0x000000F5" to your FO4 profile.
   
  (#94)
aufkrawall2
Master Guru
 
Videocard: MSI GTX 1070 Gaming X
Processor: 2500k@4.6Ghz
Mainboard: P67
Memory: 16GB DDR3 1600
Soundcard:
PSU: Corsair 860W Platinum
Default 12-06-2015, 16:54 | posts: 383

Quote:
Originally Posted by dr_rus View Post
SGSSAA doesn't blur even without LOD tweaks.
Wrong: SGSSAA leads to overfiltering, which requires a negative LOD bias to prevent blur in textures.
Since some driver version there has been an automatic LOD adjustment, depending on the configured SGSSAA sample count.
This is totally independent of the fact that you can often use a strong LOD bias with it without visible negative effects.
   
  (#95)
dr_rus
Maha Guru
 
dr_rus's Avatar
 
Videocard: GTX 1080 GRP
Processor: i7-6850K
Mainboard: Sabertooth X99
Memory: 64 GB DDR4
Soundcard: SB X-Fi Ti
PSU: CM V1200 Platinum
Default 12-06-2015, 17:34 | posts: 1,700

Quote:
Originally Posted by aufkrawall2 View Post
Wrong: SGSSAA leads to overfiltering, which requires a negative LOD bias to prevent blur in textures.
Since some driver version there has been an automatic LOD adjustment, depending on the configured SGSSAA sample count.
This is totally independent of the fact that you can often use a strong LOD bias with it without visible negative effects.
What's "overfiltering"?
   
  (#96)
aufkrawall2
Master Guru
 
Videocard: MSI GTX 1070 Gaming X
Processor: 2500k@4.6Ghz
Mainboard: P67
Memory: 16GB DDR3 1600
Soundcard:
PSU: Corsair 860W Platinum
Default 12-06-2015, 19:42 | posts: 383

The same as positive LOD bias.
Try an old game that doesn't blur with SGSSAA due to post processing etc. and compare texture sharpness.
With SGSSAA but without a negative LOD bias, you will effectively have less AF than without SGSSAA.
   
  (#97)
dr_rus
Maha Guru
 
dr_rus's Avatar
 
Videocard: GTX 1080 GRP
Processor: i7-6850K
Mainboard: Sabertooth X99
Memory: 64 GB DDR4
Soundcard: SB X-Fi Ti
PSU: CM V1200 Platinum
Default 12-06-2015, 21:49 | posts: 1,700

Less AF? What? I really don't think that you know what you're talking about.

Here's a quick comparison I just made for you in Morrowind:



Using SGSSAA without any LOD correction does not blur textures. What it does to textures is antialias them - which may result in a loss of excessively bright/dark pixels which someone may see as blurring. But it's not blurring, it's antialiasing.

The ability to lower the texture LOD value with SGSSAA comes from the fact that SGSSAA antialiases textures as well as polygons, and as a result you may push higher-detail MIPs into the scene while SGSSAA takes care of the texture shimmering this would otherwise produce. It has nothing to do with combating some "blur" of SGSSAA; it's just a bonus of AA running on textures. If you look at the left side wall in the comparison above, you'll easily notice that lowering the texture LOD bias does nothing to the sharpness of the base texture already shown there at LOD 0.0.

Last edited by dr_rus; 12-06-2015 at 21:52.
   
  (#98)
aufkrawall2
Master Guru
 
Videocard: MSI GTX 1070 Gaming X
Processor: 2500k@4.6Ghz
Mainboard: P67
Memory: 16GB DDR3 1600
Soundcard:
PSU: Corsair 860W Platinum
Default 12-06-2015, 22:01 | posts: 383

Quote:
Originally Posted by dr_rus View Post
Less AF? What? I really don't think that you know what you're talking about.
Ok, eod. You can't be argued with.
   
  (#99)
GuruKnight
Maha Guru
 
GuruKnight's Avatar
 
Videocard: 980 Ti AMP! Extreme SLI
Processor: Intel Core i7 5930K
Mainboard: Asus Rampage V Extreme
Memory: 16GB Corsair DDR4
Soundcard: Onboard+Sennheiser HD 595
PSU: 1500W SilverStone Strider
Default 12-07-2015, 12:58 | posts: 837 | Location: Denmark

aufkrawall is completely correct about driver forced SGSSAA needing auto LOD or manual LOD correction to prevent image blurring.
This is due to the nature of SGSSAA, and the correct LOD bias is determined by this formula:

Quote:
LOD(n) = -0.5 * log2(n), where n is the number of AA samples and log2 is the base-2 logarithm
http://www.miniwebtool.com/log-base-2-calculator/
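The formula above can be checked with a few lines of plain Python instead of an online calculator (the function name is just illustrative):

```python
import math

# LOD(n) = -0.5 * log2(n), where n is the SGSSAA sample count.
def sgssaa_lod_bias(samples: int) -> float:
    return -0.5 * math.log2(samples)

for n in (2, 4, 8):
    print(f"{n}x SGSSAA -> LOD bias {sgssaa_lod_bias(n):+.2f}")
# 2x SGSSAA -> LOD bias -0.50
# 4x SGSSAA -> LOD bias -1.00
# 8x SGSSAA -> LOD bias -1.50
```

These are the commonly quoted values: -0.5 for 2x, -1.0 for 4x, and -1.5 for 8x.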
   
  (#100)
jiminycricket
Member Guru
 
Videocard: Maxwell
Processor: Haswell-E
Mainboard:
Memory:
Soundcard:
PSU: 1kW
Default 01-02-2016, 14:01 | posts: 148

Quote:
Originally Posted by GuruKnight View Post
Only in OpenGL is it necessary to use manual LOD correction to prevent blurring with forced/enhanced TrSSAA (SGSSAA has no effect in OpenGL).
What about OGSSAA and DSR? No manual LOD bias needed for those either? Because I notice quite a bit of blur with forced OGSSAA in some DX9 games (Source Engine) without a manual negative LOD bias, even with correct AA bits. AFAIK auto LOD bias only applies to SGSSAA and HSAA (XxS/XxSQ).
   
Powered by vBulletin®
Copyright ©2000 - 2017, Jelsoft Enterprises Ltd.
vBulletin Skin developed by: vBStyles.com
Copyright (c) 2017, All Rights Reserved. The Guru of 3D, the Hardware Guru, and 3D Guru are trademarks owned by Hilbert Hagedoorn.