Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by Mott, Apr 3, 2018.
Links broken. Even skynet can't fix that.
Since it looks like you're reworking other people's boards, perhaps you're thinking of the indium-based desoldering alloys used to remove components. Those work by liquid-metal attack, adulterating the composition of the original solder alloy enough that the components come off at lower temperatures, with less risk of thermal damage. No one would ever use them to build a new board, and they should never be used to reattach the components you're repairing.
The mad lad is beyond reason; he truly thinks a GPU can desolder itself, and he lacks any self-awareness.
@MajorMagee we need to get back on topic, which was apparently fixing Farmville. (?)
Try Windows 10 1803, update it manually to the latest build from the Microsoft Update Catalog, then disable automatic updates with gpedit.msc. I can also help you tweak it a bit for a better experience; I'll walk you through all the steps, especially for lowering latency. Also, please get an SSD, at least as the main system drive. Why 1803? The latest build fixes all the standby memory bugs, there's no Spectre/Meltdown embedded into the OS, QueryPerformance runs at 3.something MHz and not the 10 MHz of the newer versions, and it's fast and it works. 1709 is dated and doesn't have proper memory bug fixes; on that version you need ISLC running constantly, which is really bad.
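If your edition doesn't ship gpedit.msc (e.g. Home), the same "disable automatic updates" policy can be set directly in the registry. This is a sketch of the standard registry equivalent of the Group Policy setting, not something specific to 1803:

```powershell
# Registry equivalent of gpedit.msc -> Computer Configuration -> Administrative
# Templates -> Windows Components -> Windows Update ->
# "Configure Automatic Updates": Disabled.
# Run from an elevated prompt; works on editions without gpedit.msc.
reg add "HKLM\SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate\AU" /v NoAutoUpdate /t REG_DWORD /d 1 /f
```

Delete the `NoAutoUpdate` value (or set it to 0) to restore the default behaviour.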
Any Windows version past 1803 has issues due to the pile of "security" patches. I wonder if 2004 will be any different, but I doubt it. I tried both 1809 and 1909 and had a terrible experience. Also, those versions ship forced CPU microcode in the OS itself rather than via Windows Update, so if you delete the one Windows bundles and force your own, you can run into strange issues.
And I got a warning from a moderator due to the way I responded to him. I mean, he is literally asking for it with his behavior. He is basically trolling or he simply lacks the capacity to comprehend basic arguments.
Go run yer GPU at 82C, you know-it-all! How's TOMSHARDWARE? 1080 at 82C, lolz! OH THE FAN IS SO LOUD! Another topic hijacking........
God, reading your posts makes my head hurt; so much absolute nonsense and zero useful info... Also, do you have some trauma from Tom's Hardware or what? Why do you associate me with that site? It's actually awful, imo; they never helped anyone very well. And for the record, my GPU never gets over 65C while gaming. Stop being so dense and obtuse...
OK, last warning for everyone here!
Everyone back on topic, please!
So no more insulting and stabbing. Let's just have a healthy discussion without degrading people (and yes, "noobs" is degrading).
Aight, so I'm just gonna say it: I don't know if anyone else is trying to play Warzone on Maxwell GPUs, but if you are, and you're on Windows 10 LTSB v1607 to avoid the dreaded game stutter / cached RAM failures, you're going to randomly get Dev Error 6068 no matter what you do unless you use NVIDIA driver 425.31 or lower.
Warzone has a lot of dev errors, and I too get the 6068. I tried all the fixes I could find, but to no avail. The only thing that worked for me was changing the power plan settings.
It's a modern game and seems to be very sensitive to power settings. Run your graphics drivers at defaults (no overclocking or tools such as MSI Afterburner running) and run your CPU with no overclocking (best to load the optimised motherboard defaults and turn off any memory overclocking such as XMP too), just to rule them out as the problem. Then go to your power plan settings, click "Change advanced power settings", and look for the following:
PCI Express, Link State Power Management, setting should be Off.
Processor power management, processor performance core parking min cores - setting should be 100%
If you cannot see the second option, open PowerShell or Command Prompt (as administrator, I think) and type the following command:
powercfg -attributes SUB_PROCESSOR CPMINCORES -ATTRIB_HIDE and hit enter.
If you use the High Performance plan, then PCI Express should already be on the right setting, but any other plan, e.g. Balanced, will use a lower one and could be causing the dev error. I have since turned MSI Afterburner back on and Warzone seems to be working, so I think for me it was the power plan changes. I also run an overclock on the CPU, and it all seems good now.
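The two advanced power settings above can also be applied from an elevated prompt instead of clicking through the dialog. A sketch using powercfg's documented setting aliases (check `powercfg /aliases` on your machine to confirm them):

```powershell
# Apply the two power-plan settings described above to the currently active plan.
# SUB_PCIEXPRESS / ASPM   = PCI Express -> Link State Power Management (0 = Off)
# SUB_PROCESSOR / CPMINCORES = processor performance core parking min cores (%)
powercfg -setacvalueindex SCHEME_CURRENT SUB_PCIEXPRESS ASPM 0
powercfg -setacvalueindex SCHEME_CURRENT SUB_PROCESSOR CPMINCORES 100
powercfg -setactive SCHEME_CURRENT   # re-apply the plan so the changes take effect
```

Note these are the AC (plugged-in) values; on a laptop you'd repeat with `-setdcvalueindex` for battery.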
At some point we'll have to use newer versions of Windows; there has to be a way to disable those mitigations to get the performance back. Though this doesn't affect new CPUs much, only older generations... and I'm still rocking a 3770K on 1709.
I am on 1909 with a 3570K, and in the games I play I don't see any issue. I have InSpectre disable the mitigations, and I add the game to the exception rules in Defender with CFG and DEP disabled.
Also bottom-up ASLR disabled.
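For anyone who'd rather script those per-game Defender exploit-protection exceptions than click through the Windows Security UI, Windows 10 1709+ includes the ProcessMitigations PowerShell module. A sketch, where `game.exe` is a placeholder for your actual game executable:

```powershell
# Per-executable exploit-protection override (run in elevated PowerShell,
# Windows 10 1709 or later). Disables DEP, Control Flow Guard, and
# bottom-up ASLR for the named process only, not system-wide.
# "game.exe" is a placeholder; use the game's real executable name.
Set-ProcessMitigation -Name game.exe -Disable DEP, CFG, BottomUp

# Verify the per-app override was recorded.
Get-ProcessMitigation -Name game.exe
```

The override applies by executable name the next time the process starts; system-wide defaults stay untouched.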
Same CPU here, and I didn't even disable the Spectre/Meltdown patches; tbh, I didn't notice any difference.
I decided to give 1809 a try; 1803 kinda sucks, a bit unstable. On 1809 I am getting lower latency and the system in general is snappier. I installed an older microcode and deleted the stock one 1809 ships in System32, which has Spectre & Meltdown forced, so now I have 1809 without any mitigations. There is probably more you can disable via the registry, but I haven't investigated that. I also tweaked the system a bit, as I always do; it feels very good. However, I had to do a clean install of 1809: when I updated from 1803, a lot of stuff and functions were broken.
1709 is also great, all versions that end with a "9" seem to do quite well. The "3" ones often have some issues.
I'll try to fiddle with CFG to see if it brings any benefits.
No idea what games you play, but that CPU does get affected by the mitigations by quite a margin.
I didn't notice any measurable performance degradation in any game or CPU benchmark, but I/O did take a hit.
Do you know if the I/O hit comes from the OS globally or just from the patch? I have it disabled, and I am using an older microcode as well.
I still have the OS microcode installed, but it's partly fixed just by using InSpectre and disabling both options.
It's the random-access I/O rates, if I remember right.
What exactly is forced when you keep the default updated microcode that ships with 1809 while disabling the mitigations via the registry at the same time? Can you back that silly claim up with hard benchmark data? I bet you can't.