  (#101)
CPC_RedDawn
Ancient Guru
 
Videocard: MSI GTX1080 +110/+500 H20
Processor: Ryzen 1800X 4GHz H20
Mainboard: CROSSHAIR VI HERO
Memory: 16GB TeamGroup 3200MHz
Soundcard: HyperX Cloud II USB
PSU: 850W EVGA Platinum
Default 01-02-2015, 20:30 | posts: 7,360 | Location: Wolverhampton/United Kingdom

Does NVIDIA Inspector need to be left running in the background for the changes to work, or is it just a case of applying the changes, closing Inspector down, and the fixes should still work?
   
  (#102)
MiR4i
Newbie
 
Videocard: GTX 780Ti (x3)
Processor: 4930K
Mainboard: Asus RIVBE
Memory: 64GB
Soundcard:
PSU: AX1200
Default 01-03-2015, 11:58 | posts: 34

Quote:
Originally Posted by CPC_RedDawn View Post
Does NVIDIA Inspector need to be left running in the background for the changes to work, or is it just a case of applying the changes, closing Inspector down, and the fixes should still work?
You open up Inspector, make your changes, hit apply, and then start the game. Any changes made after that while the game is still running will require a restart of the game client.

You do not need to keep Inspector open nor do you need to close it down. I generally have it running whenever I'm logged into my computer... Which is always.
   
  (#103)
CPC_RedDawn
Ancient Guru
 
Videocard: MSI GTX1080 +110/+500 H20
Processor: Ryzen 1800X 4GHz H20
Mainboard: CROSSHAIR VI HERO
Memory: 16GB TeamGroup 3200MHz
Soundcard: HyperX Cloud II USB
PSU: 850W EVGA Platinum
Default 01-03-2015, 21:01 | posts: 7,360 | Location: Wolverhampton/United Kingdom

Quote:
Originally Posted by MiR4i View Post
You open up Inspector, make your changes, hit apply, and then start the game. Any changes made after that while the game is still running will require a restart of the game client.

You do not need to keep Inspector open nor do you need to close it down. I generally have it running whenever I'm logged into my computer... Which is always.
Cheers mate, I hate having programs running in the background that I am not using.

Great to know I can just apply the settings and close it down.

So does Inspector basically hook into the display driver and overwrite its settings for the game's profile, then?
   
  (#104)
GuruKnight
Maha Guru
 
Videocard: 980 Ti AMP! Extreme SLI
Processor: Intel Core i7 5930K
Mainboard: Asus Rampage V Extreme
Memory: 16GB Corsair DDR4
Soundcard: Onboard+Sennheiser HD 595
PSU: 1500W SilverStone Strider
Default 01-08-2015, 12:54 | posts: 838 | Location: Denmark

Quote:
Originally Posted by CPC_RedDawn View Post
Cheers mate, I hate having programs running in the background that I am not using.

Great to know I can just apply the settings and close it down.

So does Inspector basically hook into the display driver and overwrite its settings for the game's profile, then?
All NVIDIA Inspector does is expose existing advanced driver features, which are normally hidden from the "average" user.
It doesn't add anything new.

Personally, I always recommend changing your driver profiles, hitting "Apply" and closing down Inspector before playing your games.
The fewer programs running in the background, the better IMO.
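
For a rough idea of what Inspector-style tools do when you hit "Apply": the sketch below is illustrative only, not Inspector's actual code. It uses NVAPI's driver settings (DRS) interface to write a DWORD value into an application profile and save it. The profile name, the setting ID and the value here are placeholders; real setting IDs come from NvApiDriverSettings.h in the NVAPI SDK or from Inspector's own listing. Once NvAPI_DRS_SaveSettings() has run, the value lives in the driver's profile store, which is why nothing needs to stay running afterwards.

Code:
// Hedged sketch: how a profile value is written into the NVIDIA driver
// settings store via NVAPI DRS. Not Inspector's actual code; link against
// the NVAPI library, and note that kSettingId below is a placeholder.
#include <cstdio>
#include <nvapi.h>

int main()
{
    const NvU32 kSettingId    = 0x0;          // placeholder: real ID from NvApiDriverSettings.h
    const NvU32 kSettingValue = 0x081902F5;   // example DWORD (e.g. an SLI compatibility bits value)

    if (NvAPI_Initialize() != NVAPI_OK) return 1;

    NvDRSSessionHandle session = 0;
    if (NvAPI_DRS_CreateSession(&session) != NVAPI_OK) return 1;
    NvAPI_DRS_LoadSettings(session);          // read the current driver profile database

    // Look up an existing application profile by name ("Tomb Raider" is just an example).
    NvAPI_UnicodeString name = {0};
    const wchar_t *profile = L"Tomb Raider";
    for (int i = 0; profile[i] && i < NVAPI_UNICODE_STRING_MAX - 1; ++i)
        name[i] = (NvU16)profile[i];

    NvDRSProfileHandle hProfile = 0;
    if (NvAPI_DRS_FindProfileByName(session, name, &hProfile) == NVAPI_OK)
    {
        NVDRS_SETTING setting   = {0};
        setting.version         = NVDRS_SETTING_VER;
        setting.settingId       = kSettingId;
        setting.settingType     = NVDRS_DWORD_TYPE;
        setting.u32CurrentValue = kSettingValue;

        NvAPI_DRS_SetSetting(session, hProfile, &setting);   // stage the change
        NvAPI_DRS_SaveSettings(session);                      // persist it in the driver store
        printf("Setting saved to the driver profile.\n");
    }

    NvAPI_DRS_DestroySession(session);
    return 0;
}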
   
  (#105)
GuruKnight
Maha Guru
 
Videocard: 980 Ti AMP! Extreme SLI
Processor: Intel Core i7 5930K
Mainboard: Asus Rampage V Extreme
Memory: 16GB Corsair DDR4
Soundcard: Onboard+Sennheiser HD 595
PSU: 1500W SilverStone Strider
Default 01-23-2015, 17:21 | posts: 838 | Location: Denmark

I added an SLI "water flickering" fix for Assassin's Creed: Unity to my master thread spreadsheet
All major changes to the OP or spreadsheet can always be found in the changelog spoiler at the bottom of the OP.
http://forums.guru3d.com/showpost.ph...43&postcount=1
   
  (#106)
adrock311
Master Guru
 
Videocard: EVGA GTX 1070 SC 8gb
Processor: i5 2500k @ 4.5ghz
Mainboard: GIGABYTE GA-Z68X-UD3H-B3
Memory: 8 gb DDR3 1600mhz
Soundcard: Creative Soundblaster Z
PSU: 850 watt Thermaltake
Default 01-30-2015, 17:31 | posts: 791

Anyone get rid of the Titanfall flickering in SLI? It's been forever and there's no official fix.
   
  (#107)
GuruKnight
Maha Guru
 
Videocard: 980 Ti AMP! Extreme SLI
Processor: Intel Core i7 5930K
Mainboard: Asus Rampage V Extreme
Memory: 16GB Corsair DDR4
Soundcard: Onboard+Sennheiser HD 595
PSU: 1500W SilverStone Strider
Default 01-30-2015, 18:02 | posts: 838 | Location: Denmark

Quote:
Originally Posted by adrock311 View Post
Anyone get rid of the Titanfall flickering in SLI? It's been forever and there's no official fix.
The official driver profile gives perfect SLI scaling in Titanfall, so there is nothing to be done here without breaking scaling.
The flickering you mention is entirely an application issue.
   
  (#108)
adrock311
Master Guru
 
Videocard: EVGA GTX 1070 SC 8gb
Processor: i5 2500k @ 4.5ghz
Mainboard: GIGABYTE GA-Z68X-UD3H-B3
Memory: 8 gb DDR3 1600mhz
Soundcard: Creative Soundblaster Z
PSU: 850 watt Thermaltake
Default 01-30-2015, 19:14 | posts: 791

Oh yeah, the framerate and overall performance gain is massive in Titanfall in SLI. The only problem is that there are light sources, like light bulbs on the walls of buildings, that flicker like a strobe light. So it's up to the Titanfall devs to patch it, I guess.
   
  (#109)
MaLDo
Master Guru
 
Videocard: GTX1080
Processor: 2700K @ 5 Ghz
Mainboard: G1 Sniper 3
Memory: 32 GB DDR3 1600 Mhz
Soundcard:
PSU: Enermax 1250W
Default 02-14-2015, 20:14 | posts: 525 | Location: Barcelona

Quote:
Originally Posted by blesner View Post
SLI profile for NBA 2k15?
https://forums.geforce.com/default/t...-2k15-sli-fix/
   
  (#110)
GuruKnight
Maha Guru
 
Videocard: 980 Ti AMP! Extreme SLI
Processor: Intel Core i7 5930K
Mainboard: Asus Rampage V Extreme
Memory: 16GB Corsair DDR4
Soundcard: Onboard+Sennheiser HD 595
PSU: 1500W SilverStone Strider
Default 02-14-2015, 22:54 | posts: 838 | Location: Denmark

Quote:
Originally Posted by MaLDo View Post
Added to my spreadsheet
http://forums.guru3d.com/showpost.ph...43&postcount=1
   
  (#111)
robb
Member Guru
 
Videocard: 2 x GTX 970 STRIX
Processor: i7-3770K @ 4.5 GHZ
Mainboard: Giga GA-Z77X-UD5H Rev 1.1
Memory: 16GB RipjawsZ 2400MHz
Soundcard: Claro Halo w/ Senn PC 360
PSU: Corsair AX760i
Default 02-19-2015, 03:43 | posts: 75 | Location: Connecticut

Tomb Raider (2013)

The game uses the NVIDIA-given flag 0x084000F5. When TressFX is enabled (it's well known that TressFX performance on NVIDIA is poor), frame rates in many areas plummet to the mid-20s, with GPU usage dropping to 50% or lower. It's rather common too and can make the game unplayable.
  • When SLI is enabled with TressFX, drops occur.
  • When SLI is disabled with TressFX, drops don't occur.
  • When SLI is enabled without TressFX, drops don't occur.

I read and tried: 0x081902F5 (Far Cry 3) and 0x080040F5 (Daylight/Far Cry 4/Evolve).

0x080040F5
TressFX and SLI work fine together. Lower performance than the original flag, though, according to the benchmark. No drops during actual gameplay.
  • Min: 89.1
  • Avg: 126.8
  • Max: 164.0

0x081902F5
TressFX and SLI also work fine together. Performance is even better than the NVIDIA-given flag. No drops during actual gameplay. I read that some users noticed ferns in the jungle flickering, but I have seen nothing of the sort quite a few hours in. I have noticed very rare shadow problems where light seems cut off slightly (really an insignificant problem many wouldn't notice; I'm not even positive it's a problem).
  • Min: 96.0
  • Avg: 129.1
  • Max: 166.0

0x084000F5
This is the Nvidia given flag. TressFX and SLI don't work well together, but the problem won't manifest in the benchmark.
  • Min: 94.0
  • Avg: 128.5
  • Max: 166.0

0x081902F5 is the go-to flag as I see it. I tried AFR2 for all the flags, and Croft's hair pops in and out and does all sorts of things, so AFR is a must.

All settings are maxed out at 1920x1080@120Hz. The only setting not at the highest is anti-aliasing, which is set to 2xSSAA.
   
  (#112)
GuruKnight
Maha Guru
 
Videocard: 980 Ti AMP! Extreme SLI
Processor: Intel Core i7 5930K
Mainboard: Asus Rampage V Extreme
Memory: 16GB Corsair DDR4
Soundcard: Onboard+Sennheiser HD 595
PSU: 1500W SilverStone Strider
Default 02-19-2015, 13:56 | posts: 838 | Location: Denmark

The TressFX SLI issue should be corrected as of 344.11 WHQL.
But maybe something was broken in the 347.xx drivers?

Also I recommend using 4xSSAA with shadows reduced to "Normal".
Those contact hardening shadows look terrible in this game:
http://forums.eidosgames.com/showthr...89#post1883789

Could you please try "0x084040F5" in TR 2013?
   
  (#113)
robb
Member Guru
 
Videocard: 2 x GTX 970 STRIX
Processor: i7-3770K @ 4.5 GHZ
Mainboard: Giga GA-Z77X-UD5H Rev 1.1
Memory: 16GB RipjawsZ 2400MHz
Soundcard: Claro Halo w/ Senn PC 360
PSU: Corsair AX760i
Default 02-19-2015, 18:39 | posts: 75 | Location: Connecticut

Quote:
Originally Posted by GuruKnight View Post
The TressFX SLI issue should be corrected as of 344.11 WHQL.
But maybe something was broken in the 347.xx drivers?

Also I recommend using 4xSSAA with shadows reduced to "Normal".
Those contact hardening shadows look terrible in this game:
http://forums.eidosgames.com/showthr...89#post1883789

Could you please try "0x084040F5" in TR 2013?
0x084040F5
Worst performance of all the flags so far. Any and all hair abnormalities seem to be fixed though, like the official flag. No TressFX slowdowns.
  • Min: 88.0
  • Avg: 121.3
  • Max: 156.0

0x084840F5
Same as above.
  • Min: 88.0
  • Avg: 120.6
  • Max: 160.0

Now I'm irritated though, since I can't get 0x084000F5 (official) to induce those massive slowdowns anymore. I don't think shader caching would really impact it that much, would it?

Anyhow, I still see Far Cry 3's flag performing better, although her hair may cast more shadows than needed; I'm not seeing any other problems there.
   
  (#114)
GuruKnight
Maha Guru
 
Videocard: 980 Ti AMP! Extreme SLI
Processor: Intel Core i7 5930K
Mainboard: Asus Rampage V Extreme
Memory: 16GB Corsair DDR4
Soundcard: Onboard+Sennheiser HD 595
PSU: 1500W SilverStone Strider
Default 02-19-2015, 19:03 | posts: 838 | Location: Denmark

Quote:
Originally Posted by robb View Post
0x084040F5
Worst performance of all the flags so far. Any and all hair abnormalities seem to be fixed though, like the official flag. No TressFX slowdowns.
  • Min: 88.0
  • Avg: 121.3
  • Max: 156.0

0x084840F5
Same as above.
  • Min: 88.0
  • Avg: 120.6
  • Max: 160.0

Now I'm irritated though, since I can't get 0x084000F5 (official) to induce those massive slowdowns anymore. I don't think shader caching would really impact it that much, would it?

Anyhow, I still see Far Cry 3's flag performing better, although her hair may cast more shadows than needed; I'm not seeing any other problems there.
Those fluctuations you see in the benchmark (especially the min FPS) can easily be attributed to a CPU limit.
I would say there is no actual performance difference between the tested profiles.
The official profile seems to scale very well for me, but maybe I will test further to make sure.

Do you have any actual ingame screenshot comparisons showing slowdowns with the official SLI profile?
In which part of the game did you notice slowdowns originally?
   
  (#115)
robb
Member Guru
 
Videocard: 2 x GTX 970 STRIX
Processor: i7-3770K @ 4.5 GHZ
Mainboard: Giga GA-Z77X-UD5H Rev 1.1
Memory: 16GB RipjawsZ 2400MHz
Soundcard: Claro Halo w/ Senn PC 360
PSU: Corsair AX760i
Default 02-19-2015, 22:41 | posts: 75 | Location: Connecticut

Quote:
Originally Posted by GuruKnight View Post
Those fluctuations you see in the benchmark (especially the min FPS) can easily be attributed to a CPU limit.
I would say there is no actual performance difference between the tested profiles.
The official profile seems to scale very well for me, but maybe I will test further to make sure.

Do you have any actual ingame screenshot comparisons showing slowdowns with the official SLI profile?
In which part of the game did you notice slowdowns originally?
Nope. It happened through the entire intro and when I first went outside to look at the coast with all the wreckage, by which point I was fed up enough to look for a different flag. The one place in particular I remember the drops was right when you fall (literally the intro sequence), drop onto a piece of rebar, and start walking through that narrow cave. Disabling TressFX right there fixed it entirely, and the moment I turned it back on the drops returned in that area.

I think I'll just let the topic die in its tracks, since I can't replicate the problems all of a sudden and have nothing to work off of anymore. Scaling/performance does seem identical in the rain, although Far Cry 3's flag did hit a higher max of about +0.2 to +0.3 frames more often than the official one did.
   
  (#116)
MaLDo
Master Guru
 
Videocard: GTX1080
Processor: 2700K @ 5 Ghz
Mainboard: G1 Sniper 3
Memory: 32 GB DDR3 1600 Mhz
Soundcard:
PSU: Enermax 1250W
Default 02-21-2015, 12:11 | posts: 525 | Location: Barcelona

The problem in Tomb Raider is that SLI affects TressFX hair weight and physics. Every added GPU applies a /2 factor to the weight, so with 2-way SLI the hair looks floaty with a bit of flickering in the fine strands, and with 3-way SLI the hair is SUPER floaty. Only with a single GPU is TressFX accurate.

It's not a horrible problem, but it is annoying.
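
Taking the "/2 per added GPU" description above literally, the effective hair weight would scale roughly as

weight_eff = weight / 2^(N_gpu - 1)

so 2-way SLI halves it and 3-way SLI quarters it. This is just one reading of the observation above, not a confirmed formula from the game or the driver.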
   
  (#117)
GuruKnight
Maha Guru
 
Videocard: 980 Ti AMP! Extreme SLI
Processor: Intel Core i7 5930K
Mainboard: Asus Rampage V Extreme
Memory: 16GB Corsair DDR4
Soundcard: Onboard+Sennheiser HD 595
PSU: 1500W SilverStone Strider
Default 02-21-2015, 13:38 | posts: 838 | Location: Denmark

Perhaps, but after the 344.11 WHQL driver the issue is very minor with 2-way SLI
Compared to the "crazy hair" before this driver revision, this is a vast improvement IMO.
   
  (#118)
robb
Member Guru
 
Videocard: 2 x GTX 970 STRIX
Processor: i7-3770K @ 4.5 GHZ
Mainboard: Giga GA-Z77X-UD5H Rev 1.1
Memory: 16GB RipjawsZ 2400MHz
Soundcard: Claro Halo w/ Senn PC 360
PSU: Corsair AX760i
Default 02-21-2015, 23:28 | posts: 75 | Location: Connecticut

Redacted

Last edited by robb; 02-24-2015 at 00:19.
   
  (#119)
signex
Ancient Guru
 
Videocard: ASUS GTX 780Ti 3GB Matrix
Processor: Intel Pentium G4560
Mainboard: MSI H110M Pro-VD
Memory: 8GB Crucial DDR4 2133mhz
Soundcard: Onboard
PSU: EVGA 430W
Default 03-16-2015, 11:15 | posts: 8,149 | Location: Netherlands

Can anyone look in their NVIDIA Inspector and post the AC: Unity SLI flag?

I deleted the Unity profile and now the Unity SLI flag seems to be missing.
   
  (#120)
twin snakes
Newbie
 
Videocard: nvidia 6800GT
Processor: P4 3.0E
Mainboard: Abit IC7
Memory: 512PC3200 Kingston
Soundcard: onboard
PSU:
Default 03-16-2015, 11:19 | posts: 5 | Location: VN

SLI bits (dx1x) 0x0A0010F5 (Assassin's Creed Unity)
   
  (#121)
signex
Ancient Guru
 
Videocard: ASUS GTX 780Ti 3GB Matrix
Processor: Intel Pentium G4560
Mainboard: MSI H110M Pro-VD
Memory: 8GB Crucial DDR4 2133mhz
Soundcard: Onboard
PSU: EVGA 430W
Default 03-16-2015, 11:57 | posts: 8,149 | Location: Netherlands

Quote:
Originally Posted by twin snakes View Post
SLI bits (dx1x) 0x0A0010F5 (Assassin's Creed Unity)
Thank you!
   
  (#122)
(.)(.)
Banned
 
Videocard: GTX 970
Processor: i72600k @ 4.7
Mainboard: Asus M4E
Memory: Corsair Vengence 10gb
Soundcard: onboard
PSU: Enermax Rev 1250w
Default 05-22-2015, 01:10 | posts: 9,098 | Location: NZ

Can someone point me in the right direction? The SLI bits below with AFR2/GPU Count = Four give great performance in The Witcher 3... but there is quite a bit of graphical corruption.

What I'd like to know is how SLI bits work. What's the logic behind the bits, and how do they affect the game exactly? Can the SLI bits below be tweaked to fix the corruption?

The thing I've at least been able to figure out is that the "1" is what gives the majority of the performance, but it also causes the corruption. Am I on the right track, or is it possible that the performance gain is simply coming from the fact that certain elements are not being rendered?

0x080010F5 (Sleeping Dogs, Sleeping Dogs Definetive Edition, Triad Wars)

SLI_PREDEFINED_GPU_COUNT_DX10_FOUR

SLI_PREDEFINED_MODE_DX10_FORCE_AFR

(Lol at inspector spelling mistake)
   
  (#123)
GuruKnight
Maha Guru
 
Videocard: 980 Ti AMP! Extreme SLI
Processor: Intel Core i7 5930K
Mainboard: Asus Rampage V Extreme
Memory: 16GB Corsair DDR4
Soundcard: Onboard+Sennheiser HD 595
PSU: 1500W SilverStone Strider
Default 05-22-2015, 11:39 | posts: 838 | Location: Denmark

The "1" in the "0x080010F5" profile is DX11 SLI bit #12, which can be enabled or disabled using the SLI bit value editor in NVIDIA Inspector.
A "4" would enable DX11 SLI bit #14 (i.e. 0x080040F5), which is also very aggressive and fixes SLI scaling in many DX11 games.
Essentially these two SLI bits skip texture resolve, which may or may not be a problem in some games

Last edited by GuruKnight; 05-22-2015 at 12:10.
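
To make the bit numbering above concrete: each hex digit of the compatibility value covers four bits, so the "1" in 0x080010F5 sits at bit 12 and the "4" in 0x080040F5 sits at bit 14. A small stand-alone C++ check (purely illustrative, nothing from Inspector itself):

Code:
// Illustration of the bit numbering described above: the "1" nibble in
// 0x080010F5 is DX11 SLI bit #12, the "4" nibble in 0x080040F5 is bit #14.
#include <cstdio>

int main()
{
    const unsigned int bits12 = 0x080010F5;   // profile value with SLI bit #12 set
    const unsigned int bits14 = 0x080040F5;   // same value with bit #14 set instead

    printf("bit 12 set in 0x%08X: %s\n", bits12, (bits12 >> 12) & 1 ? "yes" : "no");
    printf("bit 14 set in 0x%08X: %s\n", bits14, (bits14 >> 14) & 1 ? "yes" : "no");

    // Clearing bit #12 and setting bit #14 turns one value into the other.
    unsigned int swapped = (bits12 & ~(1u << 12)) | (1u << 14);
    printf("0x%08X with bit 12 cleared and bit 14 set = 0x%08X\n", bits12, swapped);
    return 0;
}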
   
  (#124)
(.)(.)
Banned
 
Videocard: GTX 970
Processor: i72600k @ 4.7
Mainboard: Asus M4E
Memory: Corsair Vengence 10gb
Soundcard: onboard
PSU: Enermax Rev 1250w
Default 05-23-2015, 09:33 | posts: 9,098 | Location: NZ

Quote:
Originally Posted by GuruKnight View Post
The "1" in the "0x080010F5" profile is DX11 SLI bit #12, which can be enabled or disabled using the SLI bit value editor in NVIDIA Inspector.
A "4" would enable DX11 SLI bit #14 (i.e. 0x080040F5), which is also very aggressive and fixes SLI scaling in many DX11 games.
Essentially these two SLI bits skip texture resolve, which may or may not be a problem in some games
Awesome mate, thank you!
   
  (#125)
frankgom
Newbie
 
Videocard: 2 x NVIIDA GTX 780ti SLI
Processor: i7 2600k oc 4.6 GHz
Mainboard: ASUS Sabertooth P67 b3
Memory: 16 gb 2000 mhz corsair
Soundcard: creative soundblaster
PSU: 1000 w
Default 06-25-2015, 20:18 | posts: 33 | Location: Palma de Mallorca

For MotoGP 15: the DX11 F1 2012 SLI bits work wonderfully; performance almost doubles.
Very, very good.
I don't know why NVIDIA didn't supply an SLI revision of the profile.
Enjoy,
   