Ok, I may be being a dumb arse, but I have looked and looked and I cannot see how to overclock GPUs 1, 2 and 3. I have overclocked GPU 0 quite easily using the driver-level controls, but when I go to the same section for the other three GPUs I need to 'enable driver-level hardware overclocking', at which point the only options are to detect the default clocks or reboot. Neither of these choices brings up the overclocking controls that are available for GPU 0. Anybody got any experience or advice on this? Cheers.
What does the RT hardware monitor show while a 3D application is running? Do you use RT's OSD to see what the clock speeds are doing? You need an application that demands the 3D clock speeds to really show what's going on, with RT's OSD running.
I presume the OSD is the hardware monitor. I have this running and it shows GPU 0 running at overclocked speeds and the other three at standard speeds. I have three GPU clients of Stanford's Folding@home project running. I don't think it's an issue with 2D/3D clock speeds; I just can't see how to change the clocks of GPUs 1 to 3. Thanks for the replies though, I appreciate your thoughts.
Thanks, but still no go. Cheers for the info. Unfortunately that still didn't work. I wonder if it has anything to do with me only having one monitor attached, or with me using XP x64. The GPU with the monitor attached is the one I can overclock. Any more ideas most welcome.
Vista better. I have had a go at this on a friend's machine. He has the same setup but runs Vista x64 with two monitors. Strangely, or not, I can get the option to overclock three of his four GPUs. Not all four, but it's better than my one. Not sure if this sheds any further light on the problem.
Problem solved. Okay, good news: I have found the answer to this problem. You have to extend the desktop in Display Properties over all four 'monitors', one per GPU, even if those monitors aren't attached and are just the 'Default Monitor'. Then RivaTuner will happily give you the overclocking options for all four GPUs. GPU-Z shows all the GPU clocks as I set them, and the GPUs are folding at the expected rate.
Lol. This is probably the same problem I had here: http://forums.guru3d.com/showthread.php?t=279972 Unfortunately that thread was closed by Mr. Sidewinder :biggun:, claiming there are working workarounds, although none had been mentioned anywhere at that point. Great work!!!

Here is another working workaround, just in case:

1. Download NVIDIA nTune and install it.
2. Within nTune, go to the device settings. This shows the Profile menu. Select Profile -> Save and specify a path, e.g. save it to your desktop.
3. Open the file with Notepad.
4. Remove anything that tweaks the mainboard! Here is my underclocking file for example purposes:

...
[BoardConfiguration]
Version=200
CPU=6fb
MCP=1109
SPP=2576
GPU=NVIDIA GeForce 8800 GTX
[ClockSettings]
GPUCOREMHZ0=144
GPUMEMMHZ0=225
GPUSHADERMHZ0=337
GPUCOREMHZ1=144
GPUMEMMHZ1=225
GPUSHADERMHZ1=337
...

5. Create another file with the normal clock speeds, and one with an overclock if you like.
6. Double-clicking any of those files will 100% reliably set the clock speeds to whatever you like. Workz!!!!

You can place a shortcut to such a file in your autostart group. I removed everything except the GPU settings because nTune writes the settings in that file to CMOS, regardless of whatever you have set in the BIOS, and then attempts to apply the new settings at runtime. I had a lot of lockups because it could not reliably switch my CPU and front-side bus clock speeds at runtime. It could possibly work if you do all the settings within the NSU file and never touch the BIOS settings. For now I use it for GPU clock speeds only, in combination with profiles defined in the BIOS for CPU/FSB speed. Maybe I will become an NVIDIA fanboy one day - LOL

Andy
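To illustrate step 5 above, a matching "normal clocks" file would keep the same [ClockSettings] keys but with the card's default values. The numbers below are only placeholders for an 8800 GTX; the scale nTune uses for the memory clock may differ from what GPU-Z reports, so read the correct defaults from a freshly saved profile on your own machine rather than copying these:

```
[ClockSettings]
; placeholder defaults - verify against a profile saved at stock clocks
GPUCOREMHZ0=575
GPUMEMMHZ0=900
GPUSHADERMHZ0=1350
GPUCOREMHZ1=575
GPUMEMMHZ1=900
GPUSHADERMHZ1=1350
```

The [BoardConfiguration] section from your own saved profile should be kept at the top of the file unchanged, since it identifies the hardware the settings apply to.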
Vista. Cheers for the info, Andy. My way seems to be happily working away in XP, but I get the feeling I would need some dummy monitor plugs for it to work in Vista.
That thread was closed due to your posting with "f%ck" language there. This one will be closed too if you continue distorting nicknames on purpose.
Save. Maybe you could delete the post but not the thread. Others may find the thread useful now that I have found a solution.