Guru3D.com Forums > Videocards > Videocards - NVIDIA GeForce Drivers Section

(#51) MrBonk, Maha Guru
Videocard: ASUS GTX 980 STRIX
Processor: Intel Core i7 950 @4GHz
Mainboard: ASUS Rampage II GENE
Memory: G-SKILL 12GB @1600MHz
Soundcard: Creative Sound Blaster Z
PSU: Corsair 750TX V1
03-14-2016, 09:57 | posts: 2,676 | Location: Oregon

If you export profiles as .nip files, I think it should only bring over user changes, from what I've seen (though what is shown below certainly seems more like it imports everything).

If you export/import through "Export all profiles", which uses a text/XML format, it will override absolutely everything.

Here's a comparison of the Assassin's Creed Syndicate profile.
Unaltered, single export:
Quote:
<?xml version="1.0" encoding="utf-16"?>
<ArrayOfProfile>
<Profile>
<ProfileName>Assassin's Creed Syndicate</ProfileName>
<Executeables>
<string>acs.exe</string>
<string>datapc.exe</string>
</Executeables>
<Settings>
<ProfileSetting>
<SettingID>3224886</SettingID>
<SettingValue>2048</SettingValue>
</ProfileSetting>
<ProfileSetting>
<SettingID>6249010</SettingID>
<SettingValue>3</SettingValue>
</ProfileSetting>
<ProfileSetting>
<SettingID>10512710</SettingID>
<SettingValue>167842037</SettingValue>
</ProfileSetting>
<ProfileSetting>
<SettingID>271830722</SettingID>
<SettingValue>2</SettingValue>
</ProfileSetting>
<ProfileSetting>
<SettingID>271834323</SettingID>
<SettingValue>4</SettingValue>
</ProfileSetting>
<ProfileSetting>
<SettingID>271895433</SettingID>
<SettingValue>0</SettingValue>
</ProfileSetting>
<ProfileSetting>
<SettingID>283958146</SettingID>
<SettingValue>1048577</SettingValue>
</ProfileSetting>
<ProfileSetting>
<SettingID>1879649610</SettingID>
<SettingValue>2979804468</SettingValue>
</ProfileSetting>
<ProfileSetting>
<SettingID>1881060439</SettingID>
<SettingValue>574728993</SettingValue>
</ProfileSetting>
<ProfileSetting>
<SettingID>1888336069</SettingID>
<SettingValue>594403908</SettingValue>
</ProfileSetting>
<ProfileSetting>
<SettingID>1889148383</SettingID>
<SettingValue>1260181864</SettingValue>
</ProfileSetting>
<ProfileSetting>
<SettingID>1889196762</SettingID>
<SettingValue>938836823</SettingValue>
</ProfileSetting>
<ProfileSetting>
<SettingID>1889196763</SettingID>
<SettingValue>3394743669</SettingValue>
</ProfileSetting>
<ProfileSetting>
<SettingID>1894626177</SettingID>
<SettingValue>606112620</SettingValue>
</ProfileSetting>
<ProfileSetting>
<SettingID>1895359496</SettingID>
<SettingValue>2159440372</SettingValue>
</ProfileSetting>
<ProfileSetting>
<SettingID>2157797216</SettingID>
<SettingValue>4026531841</SettingValue>
</ProfileSetting>
</Settings>
</Profile>
</ArrayOfProfile>
AA fix enabled, single export:
Quote:
<?xml version="1.0" encoding="utf-16"?>
<ArrayOfProfile>
<Profile>
<ProfileName>Assassin's Creed Syndicate</ProfileName>
<Executeables>
<string>acs.exe</string>
<string>datapc.exe</string>
</Executeables>
<Settings>
<ProfileSetting>
<SettingID>547063</SettingID>
<SettingValue>0</SettingValue>
</ProfileSetting>
<ProfileSetting>
<SettingID>3224886</SettingID>
<SettingValue>2048</SettingValue>
</ProfileSetting>
<ProfileSetting>
<SettingID>6249010</SettingID>
<SettingValue>3</SettingValue>
</ProfileSetting>
<ProfileSetting>
<SettingID>10512710</SettingID>
<SettingValue>167842037</SettingValue>
</ProfileSetting>
<ProfileSetting>
<SettingID>271830722</SettingID>
<SettingValue>2</SettingValue>
</ProfileSetting>
<ProfileSetting>
<SettingID>271834323</SettingID>
<SettingValue>4</SettingValue>
</ProfileSetting>
<ProfileSetting>
<SettingID>271895433</SettingID>
<SettingValue>0</SettingValue>
</ProfileSetting>
<ProfileSetting>
<SettingID>283958146</SettingID>
<SettingValue>1048577</SettingValue>
</ProfileSetting>
<ProfileSetting>
<SettingID>1879649610</SettingID>
<SettingValue>2979804468</SettingValue>
</ProfileSetting>
<ProfileSetting>
<SettingID>1881060439</SettingID>
<SettingValue>574728993</SettingValue>
</ProfileSetting>
<ProfileSetting>
<SettingID>1888336069</SettingID>
<SettingValue>594403908</SettingValue>
</ProfileSetting>
<ProfileSetting>
<SettingID>1889148383</SettingID>
<SettingValue>1260181864</SettingValue>
</ProfileSetting>
<ProfileSetting>
<SettingID>1889196762</SettingID>
<SettingValue>938836823</SettingValue>
</ProfileSetting>
<ProfileSetting>
<SettingID>1889196763</SettingID>
<SettingValue>3394743669</SettingValue>
</ProfileSetting>
<ProfileSetting>
<SettingID>1894626177</SettingID>
<SettingValue>606112620</SettingValue>
</ProfileSetting>
<ProfileSetting>
<SettingID>1895359496</SettingID>
<SettingValue>2159440372</SettingValue>
</ProfileSetting>
<ProfileSetting>
<SettingID>2157797216</SettingID>
<SettingValue>4026531841</SettingValue>
</ProfileSetting>
</Settings>
</Profile>
</ArrayOfProfile>
AA fix enabled, "Export Customized Profiles only":
Quote:
<?xml version="1.0" encoding="utf-16"?>
<ArrayOfProfile>
<Profile>
<ProfileName>Assassin's Creed Syndicate</ProfileName>
<Executeables>
<string>acs.exe</string>
<string>datapc.exe</string>
</Executeables>
<Settings>
<ProfileSetting>
<SettingID>547063</SettingID>
<SettingValue>0</SettingValue>
</ProfileSetting>
<ProfileSetting>
<SettingID>3224886</SettingID>
<SettingValue>2048</SettingValue>
</ProfileSetting>
<ProfileSetting>
<SettingID>6249010</SettingID>
<SettingValue>3</SettingValue>
</ProfileSetting>
<ProfileSetting>
<SettingID>10512710</SettingID>
<SettingValue>167842037</SettingValue>
</ProfileSetting>
<ProfileSetting>
<SettingID>271830722</SettingID>
<SettingValue>2</SettingValue>
</ProfileSetting>
<ProfileSetting>
<SettingID>271834323</SettingID>
<SettingValue>4</SettingValue>
</ProfileSetting>
<ProfileSetting>
<SettingID>271895433</SettingID>
<SettingValue>0</SettingValue>
</ProfileSetting>
<ProfileSetting>
<SettingID>283958146</SettingID>
<SettingValue>1048577</SettingValue>
</ProfileSetting>
<ProfileSetting>
<SettingID>1879649610</SettingID>
<SettingValue>2979804468</SettingValue>
</ProfileSetting>
<ProfileSetting>
<SettingID>1881060439</SettingID>
<SettingValue>574728993</SettingValue>
</ProfileSetting>
<ProfileSetting>
<SettingID>1888336069</SettingID>
<SettingValue>594403908</SettingValue>
</ProfileSetting>
<ProfileSetting>
<SettingID>1889148383</SettingID>
<SettingValue>1260181864</SettingValue>
</ProfileSetting>
<ProfileSetting>
<SettingID>1889196762</SettingID>
<SettingValue>938836823</SettingValue>
</ProfileSetting>
<ProfileSetting>
<SettingID>1889196763</SettingID>
<SettingValue>3394743669</SettingValue>
</ProfileSetting>
<ProfileSetting>
<SettingID>1894626177</SettingID>
<SettingValue>606112620</SettingValue>
</ProfileSetting>
<ProfileSetting>
<SettingID>1895359496</SettingID>
<SettingValue>2159440372</SettingValue>
</ProfileSetting>
<ProfileSetting>
<SettingID>2157797216</SettingID>
<SettingValue>4026531841</SettingValue>
</ProfileSetting>
</Settings>
</Profile>
</ArrayOfProfile>
Copy/paste them into separate windows and A/B them: the latter two appear to be identical. The export doesn't say whether a setting is user-defined, which is odd, because when you export ALL profiles, the output is formatted differently.
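If you'd rather not eyeball the exports in separate windows, a plain text diff does the A/B for you. A minimal sketch in Python (the filenames are made up; the exported XML is UTF-16, per its declaration):

Code:
# Diff two single-profile exports; only the differing lines are printed.
import difflib

with open("acs_unaltered.xml", encoding="utf-16") as f:
    before = f.read().splitlines()
with open("acs_aa_fix.xml", encoding="utf-16") as f:
    after = f.read().splitlines()

for line in difflib.unified_diff(before, after, "unaltered", "aa_fix", lineterm=""):
    print(line)

On the exports above, the only difference it would show is the extra ProfileSetting block with SettingID 547063.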

Same profile, with the AA fix enabled, in NVText format:

NVText shows values in hexadecimal, while the exported .nip XMLs are just decimal conversions of the same numbers.
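Converting between the two representations is straightforward. A quick sketch, using the last SettingID from the exports above:

Code:
# NVText shows IDs/values in hex; .nip XML exports show the same numbers in decimal.
setting_id = 2157797216              # decimal, as in the XML exports above
print(f"0x{setting_id:08X}")         # -> 0x809D5F60, the NVText form
print(int("0x809D5F60", 16))         # -> 2157797216, back to decimal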


One would have to test, I suppose, by taking a profile, exporting it, then deleting any and all set and customized values on the profile, and seeing whether importing brings them all back... and it does.

Last edited by MrBonk; 03-14-2016 at 10:10.
   
(#52) Anarion, Ancient Guru
Videocard: EVGA GeForce GTX 1070 ACX
Processor: Intel Core i7 3770K
Mainboard: ASUS P8Z77-V
Memory: G.SKILL RipjawsX 16 GB
Soundcard: Sound Blaster Zx + HD 595
PSU: Corsair AX760
03-14-2016, 10:10 | posts: 12,906 | Location: Finland

Quote:
Originally Posted by CK the Greek View Post
I am curious whether an update is needed, especially regarding exporting profiles and importing them after a newer driver installation. Does importing back -older- profiles combine the old stuff with newly added settings in a profile, or not? This is very important, because if replacing a profile (just for the settings, so we don't have to change everything all over again) "removes" all the newly added stuff, then it will NOT be correct, and this should be confirmed.
The old profile replaces everything in the new one (it basically deletes the new profile and replaces it with the old exported profile). That's why I only use it for games that are old enough.
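In other words, the import has replace semantics rather than merge semantics. A toy illustration in Python (purely conceptual, not Inspector's actual code; the setting names are invented):

Code:
# A new driver ships updated defaults; the old export contains the full profile.
new_driver_profile = {"SLI_flag": "updated", "opt_flag": "updated"}
old_export = {"SLI_flag": "old", "opt_flag": "old", "AA_fix": 0}  # full export

# Replace semantics (what the import does): the updated defaults are lost.
profile = dict(old_export)

# Merge semantics would only preserve the new defaults if the export
# contained just the user-set keys, e.g. {"AA_fix": 0}:
profile = {**new_driver_profile, **{"AA_fix": 0}}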

Last edited by Anarion; 03-14-2016 at 10:12.
   
(#53) MrBonk, Maha Guru
Videocard: ASUS GTX 980 STRIX
Processor: Intel Core i7 950 @4GHz
Mainboard: ASUS Rampage II GENE
Memory: G-SKILL 12GB @1600MHz
Soundcard: Creative Sound Blaster Z
PSU: Corsair 750TX V1
03-14-2016, 10:11 | posts: 2,676 | Location: Oregon

Does anyone actually know of any games that this causes problems for?

This does make it seem like Inspector should be changed so that it only imports USER-defined values.

Last edited by MrBonk; 03-14-2016 at 10:14.
   
(#54) Anarion, Ancient Guru
Videocard: EVGA GeForce GTX 1070 ACX
Processor: Intel Core i7 3770K
Mainboard: ASUS P8Z77-V
Memory: G.SKILL RipjawsX 16 GB
Soundcard: Sound Blaster Zx + HD 595
PSU: Corsair AX760
03-14-2016, 10:14 | posts: 12,906 | Location: Finland

Quote:
Originally Posted by MrBonk View Post
Does anyone actually know of any games that this causes problems for?
Well, if the SLI profile gets updated in new drivers, then that would be an issue.

I made a tool that stripped out everything that could potentially cause issues in the exported XMLs, so that importing wouldn't override those changes. That plan didn't work, because NVIDIA Inspector doesn't just replace the values that are in the exported profile (it basically deletes the profile and creates a new one based on the exported settings).

In a perfect world it would export only user-changed settings and replace only those values in the existing profile. Tracking user changes can be a complex thing to implement and make foolproof, though.
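A filter like the one described could be sketched along these lines (illustrative only; the whitelist of user-set SettingIDs would have to be curated by hand, and as noted above it can't help while the import deletes and recreates the whole profile):

Code:
# Keep only whitelisted (user-set) settings in an exported profile XML.
import xml.etree.ElementTree as ET

KEEP = {"547063"}  # e.g. the AA-fix SettingID from the exports earlier

tree = ET.parse("acs_aa_fix.xml")
for settings in tree.iter("Settings"):
    for ps in list(settings.findall("ProfileSetting")):
        if ps.findtext("SettingID") not in KEEP:
            settings.remove(ps)

tree.write("acs_filtered.xml", encoding="utf-16", xml_declaration=True)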

Last edited by Anarion; 03-14-2016 at 10:25.
   
(#55) fantaskarsef, Ancient Guru
Videocard: 1080Ti @h2o
Processor: 5930K @h2o
Mainboard: R5E @h2o
Memory: Ripjaws 32GB DDR4
Soundcard: ALC1150
PSU: AX 1200i
03-14-2016, 10:56 | posts: 6,800 | Location: Austria (no kangaroos here)

Aren't there two options: export everything, or export user-changed profiles only?
I brought my settings over during the last driver installation; it has worked well so far, and I haven't had any issues since.
   
(#56) MrBonk, Maha Guru
Videocard: ASUS GTX 980 STRIX
Processor: Intel Core i7 950 @4GHz
Mainboard: ASUS Rampage II GENE
Memory: G-SKILL 12GB @1600MHz
Soundcard: Creative Sound Blaster Z
PSU: Corsair 750TX V1
03-14-2016, 11:17 | posts: 2,676 | Location: Oregon

Quote:
Originally Posted by fantaskarsef View Post
Aren't there two options: export everything, or export user-changed profiles only?
I brought my settings over during the last driver installation; it has worked well so far, and I haven't had any issues since.
I've been doing the same for a couple of years, actually...

And it does seem to import the complete profile, including optimization flags.

A better test would be to take something like Assassin's Creed Syndicate and export a single profile for it, with one setting changed by you, from an earlier driver (before they changed the SLI flag and added more optimizations), then see whether importing it reverts all of the up-to-date values in the newer driver to the old ones.

Anyone on some older drivers from a few months ago?
I could test this myself with AC:S.
   
(#57) jiminycricket, Member Guru
Videocard: Maxwell
Processor: Haswell-E
PSU: 1kW
03-19-2016, 20:09 | posts: 148

Quote:
Originally Posted by MrBonk View Post
Enable Maxwell sample interleaving (MFAA): This enables NVIDIA's new Multi-Frame Anti-Aliasing mode. It only works in DXGI (DX10+) and requires MSAA to be enabled in the game or forced through the driver (good luck with that; the games where forcing works are few and far between).

What it does is change the sub-sample grid pattern every frame; the result is then reconstructed in motion with what NVIDIA calls a "Temporal Synthesis Filter".
There are some caveats to using this, though:
  • It is not compatible with SGSSAA.
  • With TrSSAA, in one case I tested, it could cause some blur on TrSSAA components.
  • It causes visible flickering on geometric edges with a visible sawtooth-like pattern. This is nullified with downsampling, though, so it's GREAT to use with downsampling to improve AA quality/performance.
  • It has a framerate requirement of about 40FPS minimum; otherwise the Temporal Synthesis Filter seems to fall apart in a strange way. When the game is in motion you'll notice severe blurring and smearing, making it unplayable. Strangely enough, if you record video while the framerate is under this threshold, the artifact will not be visible in the recording at all. Bizarre.
This is due to MFAA's cross-frame sampling pattern and doesn't show up in actual gameplay, only in screenshots and videos captured on the local machine. If you use an external framegrabber, e.g. an FCAT rig, to capture MFAA output, there is no sawtooth pattern.
   
(#58) dr_rus, Maha Guru
Videocard: GTX 1080 GRP
Processor: i7-6850K
Mainboard: Sabertooth X99
Memory: 64 GB DDR4
Soundcard: SB X-Fi Ti
PSU: CM V1200 Platinum
03-19-2016, 22:45 | posts: 1,709

Quote:
Originally Posted by jiminycricket View Post
This is due to MFAA's cross-frame sampling pattern and doesn't show up in actual gameplay, only in screenshots and videos captured on the local machine. If you use an external framegrabber, e.g. an FCAT rig, to capture MFAA output, there is no sawtooth pattern.
Yep. MFAA is perfectly OK in-game; the sawtooth pattern shows up in screenshots only because of how MFAA works, combining AA samples from several frames.
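A toy model of the idea (invented offsets, not NVIDIA's actual pattern): each frame carries only half the sample positions, so a locally grabbed frame shows the sawtooth, while two consecutive frames together cover the full pattern.

Code:
# Two alternating 2-sample patterns approximate one 4-sample pattern over
# two frames. Offsets are made up for illustration.
frame_even = [(0.25, 0.25), (0.75, 0.75)]  # sub-pixel offsets, frame N
frame_odd  = [(0.75, 0.25), (0.25, 0.75)]  # complementary offsets, frame N+1

screenshot = frame_even                 # a single captured frame: 2 samples
reconstructed = frame_even + frame_odd  # what the eye sees over 2 frames
print(len(screenshot), len(reconstructed))  # 2 vs 4 effective samples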
   
(#59) khanmein, Maha Guru
Videocard: EVGA GTX 1070 SC ACX 3.0
Processor: Intel® Core™ i5-4460
Mainboard: ASRock H97 Pro4
Memory: 16GB DDR3 Kingston CL11
Soundcard: Realtek ALC892
PSU: Seasonic X-750 (KM3)
03-20-2016, 03:42 | posts: 1,240 | Location: Batu Pahat, Johor, Malaysia

Setting MFAA globally to on won't affect any games; it will automatically turn off if a game doesn't support it.

http://www.tweakguides.com/NVFORCE_7.html
   
(#60) jiminycricket, Member Guru
Videocard: Maxwell
Processor: Haswell-E
PSU: 1kW
03-20-2016, 05:34 | posts: 148

Quote:
Originally Posted by khanmein View Post
Setting MFAA globally to on won't affect any games; it will automatically turn off if a game doesn't support it.

http://www.tweakguides.com/NVFORCE_7.html
If MFAA is enabled globally, it affects any DXGI game in which you have enabled MSAA.
   
(#61) MrBonk, Maha Guru
Videocard: ASUS GTX 980 STRIX
Processor: Intel Core i7 950 @4GHz
Mainboard: ASUS Rampage II GENE
Memory: G-SKILL 12GB @1600MHz
Soundcard: Creative Sound Blaster Z
PSU: Corsair 750TX V1
03-20-2016, 05:57 | posts: 2,676 | Location: Oregon

Quote:
Originally Posted by jiminycricket View Post
This is due to MFAA's cross-frame sampling pattern and doesn't show up in actual gameplay, only in screenshots and videos captured on the local machine. If you use an external framegrabber, e.g. an FCAT rig, to capture MFAA output, there is no sawtooth pattern.
This is true after re-testing. I will edit the post to reflect it.

I don't really care to use MFAA at all unless it's combined with SSAA. The flickering and other temporal artifacts are enough for me personally not to want to use it at native res. With enough downsampling to mitigate them, however, it's fine.

There is also the framerate problem.
In Lost Planet at sub-40FPS, rather than turning into the smearing mess that FC3:BD and Grandia II Anniversary do, the temporal artifacts and flickering intensify.

I wouldn't enable it globally, nor recommend doing so. Feel free to do as you please, though.

Last edited by MrBonk; 03-20-2016 at 06:03.
   
(#62) khanmein, Maha Guru
Videocard: EVGA GTX 1070 SC ACX 3.0
Processor: Intel® Core™ i5-4460
Mainboard: ASRock H97 Pro4
Memory: 16GB DDR3 Kingston CL11
Soundcard: Realtek ALC892
PSU: Seasonic X-750 (KM3)
03-20-2016, 06:32 | posts: 1,240 | Location: Batu Pahat, Johor, Malaysia

Quote:
Originally Posted by jiminycricket View Post
If MFAA is enabled globally, it affects any DXGI game in which you have enabled MSAA.
Which DXGI games? Seriously, it had no effect on my side.
   
(#63) dr_rus, Maha Guru
Videocard: GTX 1080 GRP
Processor: i7-6850K
Mainboard: Sabertooth X99
Memory: 64 GB DDR4
Soundcard: SB X-Fi Ti
PSU: CM V1200 Platinum
03-20-2016, 13:00 | posts: 1,709

Quote:
Originally Posted by MrBonk View Post
This is true after re-testing. I will edit the post to reflect it.

I don't really care to use MFAA at all unless it's combined with SSAA. The flickering and other temporal artifacts are enough for me personally not to want to use it at native res. With enough downsampling to mitigate them, however, it's fine.

There is also the framerate problem.
In Lost Planet at sub-40FPS, rather than turning into the smearing mess that FC3:BD and Grandia II Anniversary do, the temporal artifacts and flickering intensify.

I wouldn't enable it globally, nor recommend doing so. Feel free to do as you please, though.
Haven't noticed any flickering with MFAA. As for the framerate: when you're using MFAA with 8xMSAA, you're getting something akin to 16xMSAA, and that does cost a bit of performance.

Quote:
Originally Posted by khanmein View Post
Which DXGI games? Seriously, it had no effect on my side.
MFAA works in all DX10 and DX11 games, with two or three exceptions.
   
(#64) GuruKnight, Maha Guru
Videocard: 980 Ti AMP! Extreme SLI
Processor: Intel Core i7 5930K
Mainboard: Asus Rampage V Extreme
Memory: 16GB Corsair DDR4
Soundcard: Onboard+Sennheiser HD 595
PSU: 1500W SilverStone Strider
03-20-2016, 16:06 | posts: 838 | Location: Denmark

Quote:
Originally Posted by dr_rus View Post
MFAA works in all DX10 and DX11 games, with two or three exceptions.
The debate about whether, and how well, MFAA works in specific DX10+ titles is a complicated one, and really belongs in the AA thread.

The quality of the result, and any resulting issues, depends heavily on the quality of the in-game MSAA implementation, which is questionable at best in most newer titles.
There is really no substitute for native 4K resolution combined with FXAA/SMAA or TXAA these days.
   
(#65) VAlbomb, Member Guru
Videocard: Nvidia G1 Gaming GTX 970
Processor: Intel i5 4690K@4.6GHz
Mainboard: Asus Z87M-PLUS
Memory: 16GB DDR3@2133MHz
Soundcard: Realtek ALC887
PSU: Corsair CX600M
03-20-2016, 21:18 | posts: 134

On what games do you get flickering, artifacts and smearing?
   
(#66) GuruKnight, Maha Guru
Videocard: 980 Ti AMP! Extreme SLI
Processor: Intel Core i7 5930K
Mainboard: Asus Rampage V Extreme
Memory: 16GB Corsair DDR4
Soundcard: Onboard+Sennheiser HD 595
PSU: 1500W SilverStone Strider
03-20-2016, 22:05 | posts: 838 | Location: Denmark

I don't really use MFAA, since it doesn't work in SLI mode.
No big loss IMO.
   
(#67) TheRyuu, Member Guru
Videocard: EVGA GTX 1080
Processor: i7-4771
Mainboard: Asus Z87-Pro
Memory: 16GB DDR3-1600
Soundcard: FiiO E10K DAC
PSU: Corsair 750W
03-21-2016, 03:59 | posts: 107

Quote:
Originally Posted by khanmein View Post
Setting MFAA globally to on won't affect any games; it will automatically turn off if a game doesn't support it.

http://www.tweakguides.com/NVFORCE_7.html
I would still caution against editing the global profile. It's not that much extra effort to change the application-specific profile. NVIDIA Inspector makes this easy, and can even save your application-specific settings across (clean) driver updates (which you should be doing).
   
(#68) MrBonk, Maha Guru
Videocard: ASUS GTX 980 STRIX
Processor: Intel Core i7 950 @4GHz
Mainboard: ASUS Rampage II GENE
Memory: G-SKILL 12GB @1600MHz
Soundcard: Creative Sound Blaster Z
PSU: Corsair 750TX V1
03-21-2016, 10:12 | posts: 2,676 | Location: Oregon

Quote:
Originally Posted by dr_rus View Post
Haven't noticed any flickering with MFAA. As for the framerate: when you're using MFAA with 8xMSAA, you're getting something akin to 16xMSAA, and that does cost a bit of performance.

MFAA works in all DX10 and DX11 games, with two or three exceptions.
If you A/B it with MSAA, it most definitely does flicker, at least in some games (like Lost Planet).

The framerate point is just a matter of performance. Most modern games are demanding, and modern MSAA even more so. So if someone wants better-quality MSAA at, say, 4K, but can only get an unstable 45FPS average with 2xMFAA, they could cap it at 30FPS to get stable, stutter-free performance on a 60Hz display. But then MFAA messes that up.

With Lost Planet specifically, I throttled it back to 60, 40 and then 30FPS to gauge the problems. Even at 120FPS there is still some flickering and artifacting that doesn't occur in the same manner with the 4xMSAA built natively into the game.
Quote:
Originally Posted by VAlbomb View Post
On what games do you get flickering, artifacts and smearing?
Here are three I know of off the top of my head:

Far Cry 3: Blood Dragon (I'm sure it extends to FC3)
Grandia II Anniversary Edition (locked to 30FPS outside of battle; in battles it doesn't happen if 60FPS is enabled)
Lost Planet: Extreme Condition/Colonies

As someone else has mentioned, it is entirely dependent on the in-game implementation, which is probably also a factor.

Last edited by MrBonk; 03-21-2016 at 10:29.
   
(#69) GuruKnight, Maha Guru
Videocard: 980 Ti AMP! Extreme SLI
Processor: Intel Core i7 5930K
Mainboard: Asus Rampage V Extreme
Memory: 16GB Corsair DDR4
Soundcard: Onboard+Sennheiser HD 595
PSU: 1500W SilverStone Strider
03-21-2016, 21:05 | posts: 838 | Location: Denmark

Quote:
Originally Posted by MrBonk View Post
  • Gsync Global Feature: A simple On/Off switch. I know no more. Anyone who does, please let me know.
  • GSYNC Requested State: A simple On/Off switch. I know no more.
  • Vertical Sync Tear Control: This controls whether, when a frame drop is detected, Vsync should be disabled to maintain performance, or sync should drop to the next syncable rate. At 60Hz, without Adaptive, the framerate will drop to 30FPS because it's the next syncable rate, 1/2 (see the sketch below).
You can use TB as mentioned above instead of Adaptive, or, as long as you ensure you have enough power to sustain the performance you are aiming for, it shouldn't be an issue.

Adaptive in my experience can be hit and miss, but so can Triple Buffering. In some cases TB can increase input latency, leave it the same, or decrease it (despite what anyone may say).
It's up to you what you prefer to use. I prefer not to use Adaptive. And again, Gsync makes this irrelevant.
  • Vertical Sync: Controls whether Vsync can be enabled for any given application. Typically it's set to "Application Controlled", which means it's up to the individual application itself to enable/disable or offer the option for Vsync.
One recent example is Fallout 4. The game has no Vsync option, but it is forced on no matter what.
You can disable it by setting this to "Force Off" on the Fallout 4 profile.

Remember, Gsync makes this irrelevant. (AFAIK)
Actually, V-Sync is automatically forced in the global profile when you reinstall the graphics drivers after connecting a new G-Sync monitor.
However, it is possible to disable V-Sync, and in the process also G-Sync, on a per-game basis by setting "Vertical Sync" to "Use the 3D application setting" in the specific driver profile.
In-game V-Sync settings should always be disabled when using a G-Sync monitor to prevent conflicts.
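To make the quoted "next syncable rate" arithmetic concrete, a minimal sketch (a hypothetical helper, not a driver API):

Code:
# With plain V-Sync on a fixed-refresh display, a missed frame snaps the
# framerate down to refresh/N, the next syncable rate.
def syncable_rates(refresh_hz, n=4):
    return [refresh_hz / k for k in range(1, n + 1)]

print(syncable_rates(60))  # [60.0, 30.0, 20.0, 15.0]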

Quote:
Originally Posted by MrBonk View Post
  • Threaded optimization: We do not know what this actually does, but it works in DX and OGL and can apparently help or make things worse depending on the game (bad with e.g. Neverwinter Nights; helps Source Engine games). Defaulting to on is the best bet.
If you know of any other problem games, do let me know!
Setting this to "Off" also has a negative effect in, for example, Battlefield: Bad Company 2, and can cause stuttering on some CPU-intensive multiplayer maps like Valparaiso.
At least it did on my old i7-940 + 2-way GTX 780 system.
Quote:
Originally Posted by MrBonk View Post
  • Antialiasing - SLI AA
SLI AA essentially disables normal AFR rendering; in 2-way mode it will use the primary GPU for rendering plus forced AA, while the secondary GPU is used only for AA work.
In SLI8x mode, for example, each GPU does 4xMSAA, after which the final result becomes 4xMSAA + 4xMSAA = 8xMSAA.
This can be useful in games without proper SLI support, so that at least the second GPU is not just idling.

Unfortunately, it only works correctly in OpenGL, and there will be no difference in temporal behavior between, for example, normal forced 4xMSAA+4xSGSSAA and SLI8x+8xSGSSAA in DX9.
http://www.nvidia.com/object/slizone_sliAA_howto1.html
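The sample arithmetic above, as a trivial sketch (illustrative only):

Code:
# In SLI AA, each of the N GPUs renders the same frame with an offset AA
# pattern, so the effective sample counts add up.
def sli_aa_samples(per_gpu_msaa, gpus=2):
    return per_gpu_msaa * gpus

print(sli_aa_samples(4))  # SLI8x: 4xMSAA per GPU on 2 GPUs -> 8x effective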
   
(#70) TheRyuu, Member Guru
Videocard: EVGA GTX 1080
Processor: i7-4771
Mainboard: Asus Z87-Pro
Memory: 16GB DDR3-1600
Soundcard: FiiO E10K DAC
PSU: Corsair 750W
03-22-2016, 05:38 | posts: 107

Quote:
Originally Posted by GuruKnight View Post
Setting this to "Off" also has a negative effect in, for example, Battlefield: Bad Company 2, and can cause stuttering on some CPU-intensive multiplayer maps like Valparaiso.
At least it did on my old i7-940 + 2-way GTX 780 system.
You should leave threaded optimization on application controlled/Auto. I would especially avoid touching it in the global profile. The driver should know what's best the vast majority of the time.
   
(#71) GuruKnight, Maha Guru
Videocard: 980 Ti AMP! Extreme SLI
Processor: Intel Core i7 5930K
Mainboard: Asus Rampage V Extreme
Memory: 16GB Corsair DDR4
Soundcard: Onboard+Sennheiser HD 595
PSU: 1500W SilverStone Strider
03-22-2016, 10:25 | posts: 838 | Location: Denmark

Quote:
Originally Posted by TheRyuu View Post
You should leave threaded optimization on application controlled/Auto. I would especially avoid touching it in the global profile. The driver should know what's best the vast majority of the time.
That's not really the point, now is it?
I was just mentioning an example where disabling threaded optimization has a negative effect.
I never said anything about setting it in the global profile.

There are many other cases where setting threaded optimization to "Off" or "On" is better than just Auto:
Sleeping Dogs: best set to "On" for smoother gameplay and better stability.
The Chronicles of Riddick: Assault on Dark Athena: best set to "Off" to avoid framerate drops and other issues.

etc.
   
(#72) CK the Greek, Maha Guru
Videocard: 2x970G1 SLI,Gsync,3DVsn2
Processor: i5 4670K @4.4GHz H2O H110
Mainboard: GA Z87X-UD5H
Memory: G.skill F3-2400C10D16GTX
Soundcard: Premium 5.1 Snd Sys
PSU: Corsair RM1000x
03-23-2016, 09:59 | posts: 1,183 | Location: Greece

Quote:
Originally Posted by GuruKnight View Post
In-game V-Sync settings should always be disabled when using a G-Sync monitor to prevent conflicts.
Not in every game, though. For example, in The Division it's weird how the engine renders the game in FULLSCREEN: with in-game V-Sync OFF, G-Sync doesn't work (I see tearing, and believe me, I've tried every combination, though I didn't test enabling G-Sync for windowed mode).

As a general rule, yes, V-Sync should be OFF in the in-game settings (except in The Division...).
   
(#73) TheRyuu, Member Guru
Videocard: EVGA GTX 1080
Processor: i7-4771
Mainboard: Asus Z87-Pro
Memory: 16GB DDR3-1600
Soundcard: FiiO E10K DAC
PSU: Corsair 750W
03-23-2016, 12:33 | posts: 107

Quote:
Originally Posted by GuruKnight View Post
That's not really the point, now is it?
I was just mentioning an example where disabling threaded optimization has a negative effect.
I never said anything about setting it in the global profile.

There are many other cases where setting threaded optimization to "Off" or "On" is better than just Auto:
Sleeping Dogs: best set to "On" for smoother gameplay and better stability.
The Chronicles of Riddick: Assault on Dark Athena: best set to "Off" to avoid framerate drops and other issues.

etc.
Indeed, I just wanted to make sure people didn't get the wrong idea.
   
(#74) bjoswald, Member Guru
Videocard: 8GB MSI Gaming-X RX 480
Processor: Intel i5-4430
Mainboard: ASRock H87M Pro4
Memory: 16GB Crucial Ballistix
Soundcard: Realtek ALC892
PSU: Corsair CX600M
03-23-2016, 15:23 | posts: 139 | Location: Florida

Threaded Optimization is an odd beast for me. I mean, when reading the tooltip, common sense dictates something like this should be on constantly. I guess really, really old games or buggy games (e.g. DayZ) may not properly spread the load across all processors, but how is this a problem today?
   
(#75) GuruKnight, Maha Guru
Videocard: 980 Ti AMP! Extreme SLI
Processor: Intel Core i7 5930K
Mainboard: Asus Rampage V Extreme
Memory: 16GB Corsair DDR4
Soundcard: Onboard+Sennheiser HD 595
PSU: 1500W SilverStone Strider
03-24-2016, 17:28 | posts: 838 | Location: Denmark

Quote:
Originally Posted by bjoswald View Post
Threaded Optimization is an odd beast for me. I mean, when reading the tooltip, common sense dictates something like this should be on constantly. I guess really, really old games or buggy games (e.g. DayZ) may not properly spread the load across all processors, but how is this a problem today?
Actually, DayZ runs on a modified version of the same basic engine as ArmA 2, another game which doesn't benefit from NVIDIA's threaded optimization setting.
In fact, in earlier builds of ArmA 2, performance could be gained by setting this to "Off" in the A2 driver profile.
Go figure.

I think the problem might be that in this particular engine, enabling driver threaded optimization takes CPU workload away from the game engine itself, due to the engine's poor multi-threading.
   