Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by Cyris, Apr 5, 2014.
Nice trick, might come in handy - thanks for sharing.
If a game does not use all of your GPU, it is CPU bound; "most" does not matter when comparing usage percentages. Further, if it does not use all of your CPU in the first place, it is unoptimized. :grab:
Cheers Blaire, I didn't know about that.
I think that Assetto Corsa is a game that is "not" affected by the negative LOD bias. The example shown in the article is probably just an extreme case affecting one game. That said, I'm not dismissing the idea that it affects other games too, as I'm definitely no expert on what LOD affects (is it textures, AA, colour, etc.?).
I've tried AC with various settings and objects in the distance look fine. Only when applying a positive LOD bias did everything become blurry. However, I do think the game could be improved by better AA techniques when using SGSSAA.
By "most" I mean 80-90%... I've never seen any game use all of my GPU...
Your CPU is old, that's why.
I see nothing wrong with the game engine. It's fast, sleek and looks good enough even with no SLI.
I can handle the game maxed at 1440p with just one card running at 60fps.
That to me is good enough, although I understand why he wants SLI support.
If it can do it, why not? Especially if it's advertised as such.
Will have to disagree. I've played SWTOR (same engine) and I get at most 30% GPU utilisation at 1400 MHz; the game is not threaded very well across CPU cores. Adding SLI support would not be of any benefit.
See above; the ESO beta gives the same result for me. It is marginally better thanks to a more recent version of the Hero engine, but still not threaded properly.
Shader cache, Ramdisk ?
A question about the new shader cache; not sure if it's been discussed before, I've not been around much lately.
I have my Windows Temp folders set to my RAMdisk. The RAMdisk does not save itself on shutdown (I only use it for a Firefox cache, temp folders and IE temps, to save writes to my SSDs).
Does this mean I will lose the benefit of the shader cache, as it will be emptied on every reboot?
Sadly I'm going to have to uninstall this beta - it crashes some games and applications on my end.
Maybe that's the problem.
Obviously. Did you expect the cache to magically come back from digital nirvana?
I can use TXAA on my non-GTX card with this driver in Arkham Origins (Quadro 337.50)
To clarify I am using Quadro drivers on my Geforce card
Any Kepler GPU should support it by default.
Speaking of TXAA & Batman AO
I could enable it on Fermi too before they patched it; same with COD Ghosts.
Is TXAA still a game-only thing? I gather there's no NVCP forcing of it? Just curious about it, but I hear its blurring is worse than FXAA's.
Yeah, game-specific only. It's like a SweetFX d3d9.dll injector - txaa.win32.dll
Get massive image corruption in 3DMark Firestrike with these, back to 335.23 WHQL...
That's strange; RAMdisk saved the data to an image file on shutdown when I used it.
Go to View, click on Advanced, and you will be able to select the option to save the disk image at shutdown.
No I purposefully set it not to save.
As I said, I use my RAMdisk for all my temp and internet caching to save writes to my SSDs. Saving the RAMdisk at shutdown would defeat that purpose and also slow shutdown times.
I was just really after the ins and outs of how the shader cache works - e.g. does it only benefit the second time and beyond when you load up a game, or is there still a benefit the first time it's generated?
The system temp directory seems a stupid place to keep it; people are always cleaning that out with CCleaner etc.
What I may do is create a junction point inside the temp folder on my RAMdisk for the nVidia cache, pointing to a more permanent location on another SSD.
As I previously stated, I don't want to save my RAMdisk image at shutdown every time.
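Something like the sketch below is what I have in mind (Windows cmd, run elevated). Note the cache folder name and the target drive letter here are assumptions - I'm assuming the driver writes its cache to an NV_Cache folder under %TEMP%; check what it's actually called on your system first.

```shell
:: Sketch only - assumes the shader cache lives in %TEMP%\NV_Cache
:: and that D:\NVCache is a free location on a permanent SSD.

:: Move the existing cache to the permanent location (skip if it doesn't exist yet)
move "%TEMP%\NV_Cache" "D:\NVCache"

:: Create a junction in the temp folder pointing at the permanent copy,
:: so the driver keeps writing to the same path but the data lands on the SSD
mklink /J "%TEMP%\NV_Cache" "D:\NVCache"
```

The catch is that since the RAMdisk is wiped and re-created empty on every boot, the `mklink` line would have to be re-run from a startup script each time.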
Then you can't do anything about it; the RAMdisk will be re-created from scratch each time.