Discussion in 'Videocards - NVIDIA GeForce' started by theonlybabyface, Feb 7, 2009.
Tested with new patch and found to be working fine so far, thank you king-dubs.
Can I get the updated version too please? <3
Thanks for the positive feedback!
Oh, I won't give it out at all, just offering in case you need a host in the future... thanks for the patch though, it's working like a champ!!!
great work here
Hmm, I find it funny that legitimate EVGA card owners with the GTX 295 are getting the "Device not recognized" error with the original tool. Meaning, "No EVGA card was detected".
It's all over the EVGA forums. My patch definitely fixes that as a side effect.
Thanks man - works now with my BFG GTX280. Great job on the patcher and the effort!
Now we just wait for permission for a release
Thanks to all that participated in the testing!
Working PERFECTLY on my Galaxy GTX260.
Stock volts: 1.1125v
Overclocked to: 1.1500v
There's nothing illegal about the patch, and we also do not censor.
However if you show a download link I want you to include a very clear warning that usage MIGHT damage the card when wrongfully used, or when the patch is applied improperly.
I can see some issues with cards outside the GTX family that are not officially supported, yet could get a voltage change that is too far out of range.
To the people that download it: If you use it, you need to understand the risks you are taking YOURSELF.
Could I please get the patch for my XFX GTX280? =) Would very much appreciate it.
Thanks mate, will do.
It will never be built into EVGA Precision.
omg all the snow melted overnight and all the robins are back already????
Are GPU voltage and shader core voltage linked?
If I increase GPU voltage with this tool, will the shaders also get higher voltage?
Yes, they are both linked.
Now my GTX260 (216SP) from BFG can run 3DMark Vantage at 756/1237/1513 MHz @ 1.213v.
No worries, just make sure you keep an eye on your temps, and the VRM temps in particular.
So... do 295s normally clock that high? I'm a little scared to go any further...
Yes, for screenshots they clock higher, BUT try to RUN a benchmark or game though...
Anyone have any results, max clocks before and after?