Discussion in 'RivaTuner Advanced Discussion forum' started by Unwinder, Sep 7, 2007.
Glad to see you getting back into the swing of things. Thanks for your time and effort.
Guys, I've got very good news for G80 owners. I've just examined the overclocking interfaces of the newly released 163.67 drivers and I was really pleased to see that NVIDIA finally added the ability to adjust the shader clock independently. As you probably know, with past driver families the ForceWare automatically overclocked the G80 shader domain synchronously with the ROP domain, using the BIOS-defined Shader/ROP clock ratio. Starting from the 163.67 drivers, the internal ForceWare overclocking interfaces no longer scale the shader domain clock when the ROP clock is adjusted, and the driver now provides a completely independent shader clock adjustment interface. This means that starting from ForceWare 163.67, all overclocking tools like RivaTuner, nTune, PowerStrip or ATITool will adjust the ROP clock only.
However, new revisions of these tools supporting the new overclocking interfaces will probably allow you to adjust the shader clock too. I've now played with the new interfaces, and the upcoming v2.04 will contain an experimental feature allowing power users to define a custom Shader/ROP ratio via the registry, so RT will clock the shader domain together with the ROP domain using the user-defined ratio.
And v2.05 will give you a separate slider for adjusting the shader clock completely independently of the core clock.
By default this applies to the Vista-specific overclocking interfaces only; the Windows XP drivers still provide the traditional overclocking interface, which adjusts both shader and ROP clocks. However, the XP drivers also contain optional Vista-styled overclocking interfaces, and you can force RivaTuner to use them by setting the NVAPIUsageBehavior registry entry to 1.
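To make the ratio mechanism described above concrete, here is a minimal sketch of how a fixed Shader/ROP ratio maps clocks under the old (pre-163.67) behavior. The ratio and clock figures are illustrative assumptions based on stock 8800 GTX clocks (575 MHz ROP, 1350 MHz shader), not values read from any real BIOS or driver interface:

```python
# Hypothetical sketch of the pre-163.67 behavior: the driver scales the
# shader domain clock from the ROP clock by a fixed BIOS-defined ratio.
# All numbers here are illustrative assumptions, not hardware reads.

def shader_clock_from_rop(rop_clock_mhz: float, ratio: float) -> float:
    """Scale the shader domain clock from the ROP clock by a fixed ratio."""
    return rop_clock_mhz * ratio

# Stock 8800 GTX: 575 MHz ROP, 1350 MHz shader -> ratio of roughly 2.35.
ratio = 1350 / 575

# Overclocking the ROP domain to 612 MHz drags the shader domain along:
print(round(shader_clock_from_rop(612, ratio)))  # -> 1437
```

With the new interfaces, the two domains are decoupled, which is exactly why tools that only touch the ROP clock now leave the shader clock at stock.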
more sliders = cool
Good job !
Great news !!
I need a few experienced RivaTuner users with G80 for beta-testing this feature. If you wish to test it - send me your email via PM.
Where can I read about this in Russian?
I didn't quite understand everything.
A few "minor changes"? Looks like you've been hard at work!
I just bought a Samsung 204B LCD screen and need vsync more than ever for my SLI setup as I am limited to 60Hz/fps @ 1600x1200 before I get horizontal tearing, so this is great news!
Can't wait :thumbup:
Awesome, can't wait for independent shader overclocking on my 8800GTX!
Does anyone know how high the average 8800GTX shader clock can go while remaining stable? Mine's currently at 1512MHz; I'm just interested in whether I'll be able to get many more extra megahertz out of it. The next jump is 54MHz up, so I may not get it any higher.
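The "next jump is 54MHz" observation above reflects the fact that the shader PLL can only hit discrete steps rather than arbitrary frequencies. A small sketch of that idea, assuming (purely for illustration, based on the post above) a uniform 54 MHz step grid:

```python
# Sketch: snapping a requested shader clock to a discrete step grid.
# The 54 MHz step size is an assumption taken from the post above,
# not a value queried from the hardware or driver.

STEP_MHZ = 54

def snap_to_step(requested_mhz: int, step: int = STEP_MHZ) -> int:
    """Round a requested clock to the nearest step the PLL can actually hit."""
    return round(requested_mhz / step) * step

print(snap_to_step(1512))  # already a multiple of 54 -> 1512
print(snap_to_step(1540))  # rounds to the next step up -> 1566
```

So setting anything between two steps in a tool would effectively land you on the nearest achievable clock, which is why small slider moves sometimes change nothing.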
Anyone know why the shader clock was tied to the core clock in the first place?
I just upgraded to the 163.67 beta drivers. Bad decision.
Should have realised this, but because the shader clock is now independent from the core clock, it's gone back to the stock speed. And I can't adjust it.
Back to 163.44 for me.
We all appreciate the work you do. Many people don't realise the amount of work that goes into this.
Thanks for the many great added features. I just love the G15 support, and to top that we get independent shader clock control. :smoke:
Great news Unwinder, thanks!
Changing the shader clock independently will be great!
Are there any overclocking problems with version 2.03 and the new ForceWare 163.67?
There are no problems in 2.03. The driver just overclocks G80 _differently_ with the new drivers and has a more aggressive throttling system, so many of you should _lower_ unstable clocks you've used in the past. And that doesn't depend on RT and won't change with new RT versions.
BUT....in 2.03, my shader clock will NOT change no matter what I do, is this correct?
If you're under Vista - yes.
What about XP?
I was under the impression that the shader clock was independent with the new drivers, and thus, could not be changed with 2.03....?
What about reading post #22?
Excellent, good job man. Please work your magic and make the XP interface work well, I'm not using Vista yet.
Great! Tanx Unwinder!
Answer is right in the first posting.