Discussion in 'Die-hard Overclocking & Case Modifications' started by Rodman, Jun 20, 2008.
Yeah...these cards are definitely not shader clock friendly.
Yep, seems that way. So far I'm very happy with the performance and overclocking on this card; the only complaint so far is noise, a similar kind of noise to what my old X1900XTX used to put out. Hopefully we should see some decent 3rd-party coolers for it soon.
Slam, try linking the shaders with the core and ramp the core to 724 and see if you get better results; you can leave the mem at 2600. I think I'm on to something.
LOL, I know, they sound like a freaking blow dryer at 100%.
I can assure you I am not done pushing this one.
Where are everyone's overclocks? Feels like there are only a few of us with 200-series GPUs.
Anyway, I finally got my Tieten water block on and did some more testing.
Idle on air: 50C
Load on air: 85C with 100% fan duty. FYI, this is a very noisy fan at 100%, not something I would want to put up with all the time.
Idle on WC: 38C
Load on WC: 50C, with no fan noise at all. Very nice imo.
Clocks so far, as I still have not seen any artifacts or lockups etc.
I'm going to game on these settings for the day and go for more after that.
With that said, these are benchmark stable; I will find out if they are gaming stable.
I found my best shader and core clocks; still tinkering with the memory.
I can run 721 core / 2600 mem / 1545 shader linked, or I can unlink and raise the core further to 742.
Vantage: linked at 718/2600/1545 = 11627 GPU; unlinked at 742/2600/1547 = 11706 GPU.
Maybe I'll lower the shader clock and go for more core.
WinXP SP3, 177.39 driver w/ PhysX
Tried my 1st OC on my EVGA GTX 280 FTW. It did 700/1458/2538 and never broke 53C. That is 161.7GB/sec of memory bandwidth.
3D06 = 15949
Crysis looked so incredible I could smell the ocean. Will bench later.
One stick of RAM died, so I'll finish testing with the new one in a couple of days.
Rig pics after WC set-up.
The Medusa demo is a good stability tester. Loving the BenQ.
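As a sanity check on the bandwidth figure quoted above: peak memory bandwidth is just the effective transfer rate times the bus width in bytes. A quick sketch (assuming the GTX 280's published 512-bit memory bus and, for comparison, a stock 8800 GTX at 1800MT/s on its 384-bit bus; the small gap from the quoted 161.7GB/sec is likely down to the exact hardware clock):

```python
def mem_bandwidth_gb_s(effective_mt_s, bus_width_bits):
    """Peak memory bandwidth = transfers/sec * bytes moved per transfer."""
    return effective_mt_s * 1e6 * (bus_width_bits / 8) / 1e9

# GTX 280 overclocked to 2538MT/s effective on its 512-bit bus
print(mem_bandwidth_gb_s(2538, 512))  # -> 162.432 (GB/s)

# Stock 8800 GTX: 1800MT/s on a 384-bit bus
print(mem_bandwidth_gb_s(1800, 384))  # -> 86.4 (GB/s)
```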
What order is this: 670/2500/1458?
Core/memory/shader, or core/shader/memory?
The EVGA utility only lets me go to 1775 max on the memory. What program are you using to overclock?
Received my GTX260 today (after ordering on impulse at 16:58 yesterday, hahaha) and set it straight to AMP edition speeds (650/1400/2100). Haven't tried any higher yet, but the temperature has barely gone up 2ºC over stock clocks: 40ºC idle, mid-60s load.
Vantage score P10,474 - My 8800gtx scored P5,816
Crysis GPU bench @ 1680x1050 with everything set to Very High, no AA = 23.19fps. Almost playable (I feel 25fps is playable in Crysis); at lower res it's playable at Very High. My 8800GTX got 17.9fps average.
Lost Planet DX10, everything maxed out with C16xQ AA: Snow 33.1fps (14.1 with the 8800), Cave 44.9fps (19.9 with the 8800).
Very impressed so far.
My MSI GeForce GTX 280 is stable at 700/1425/2500
The lowest I got with my 8800GTX was 15fps and the max 60, so I don't think SLI is working.
No, the core on the 8800GTX started at 575 and the Ultra at 612; mem at 1800 on the GTX and 2160 on the Ultra :nerd:
The GTX can almost always OC over the Ultra's spec; my GTX does 670/2250...
That's core/mem/shader. The shader can't do 2500.
LOL, my god man, you are actually at 3550 memory speed if I'm reading your question correctly. I think you mean the shader clock, as they stop around 1700, but I'm not sure how stable your card will be at those clocks, especially if you increase the core and memory speeds.
I game tested my 752/1506/1325 overclocks and they are solid, no issues. I for one am impressed, seeing as many thought the G200s would not overclock well. These cards overclock better than any series I have owned. I mean, I managed a stable 150MHz OC on the core, a 450MHz OC on the memory, and the shaders are doing well with a 250MHz OC. This by far makes the GTX 280 the most powerful GPU on the market.
Testing current clocks
Stable 3DMark runs so far; I for one am impressed. These clocks give me 100GB/sec more bandwidth than my previous 8800GTX.
I was playing GRID last night (it didn't crash, woohoo) at 1920x1200, 4xAA/16xAF, all options on Ultra, and never dropped below 60fps on any track. Seeing this game play at those settings on a 2ms 24in widescreen was simply breathtaking :nerd:
Not sure where to post up this info, but if moderators want to move it to a more helpful area, please do so.
Overclocking the GTX 260/280 in Vista (I imagine this would work on almost all GeForce cards, but I haven't tested others).
I find Nvidia's new System Tools utility very UAC-friendly; however, it falls short on core overclocks, hitting a wall at 724, though it handles memory and shader clocks just fine. It adds a Performance tab to the driver panel, where one can select Device Settings and change the card's settings. It offers core/shader/memory clocks as well as an easy, no-frills fan speed adjustment from 0-100%. One VERY interesting option, currently greyed out, is voltage settings. This is an area where ATI has had an advantage over Nvidia for some time, and it now looks like Nvidia will support voltage adjustments with future drivers. I'm not sure if one will need an nForce mobo to do this, but voltage adjustment is coming, and for the first time at the driver level. Nice.
The other thing it adds is system monitoring, a very slick, easy interface that lets the user view GPU clocks/temps along with CPU usage on all cores, with temps taken right from the core(s), plus warning temperature thresholds on both the GPU and CPU. I should also mention it reads memory use, mobo temps, fan speeds and even hard drive temps. Basically everything the average Joe needs. I especially like the logging option: by pressing Ctrl+Alt+L you can start logging any parameters you like, such as GPU speeds/temps as well as CPU usage and temps. I opt for the log files to be created on my desktop, as they are easy to find and monitor from there. To stop logging and view the file after a bench or gaming session, just hit Ctrl+Alt+L again and double-click the log file created when you stopped.
NVIDIA System Tools with ESA Support
Release Date: June 17, 2008
Operating System: Windows XP 32-bit, Windows XP 64-bit,
Windows Vista 32-bit, Windows Vista 64-bit
Language: U.S. English
File Size: 74.3 MB
OK, the other program I like is Precision, which was included on the driver CD with my Superclocked EVGA GTX 280. Precision is a little tricky, meaning that in order to get core clocks to stick, one needs to keep the shaders linked. After you select the core speed, you can apply it, then go back and lower the shaders and apply again. However, the biggest downfall is the fact that Vista UAC will not allow Precision to set clocks or start without your consent every time you start Windows. This is a pain; however, I found a cool workaround.
This is how I do it. Since Nvidia's OC utility will only let you go to 724 on the core, leave its OC settings alone for now; just make sure you agree to the terms so you can use custom overclock settings.

Fire up Precision and set your core clock by leaving the shaders linked. Apply the core speed, then go back, unlink the shaders, lower the shader clock and apply again. Do not check 'apply these settings at Windows startup'. Instead, fire up the Nvidia control panel, and under the Performance tab / Device Settings you will now see the new core overclock past the 724 limit. Cool. Now all one has to do is adjust the shader or memory clocks (I just do shaders and core in Precision, then memory in Nvidia) and hit apply. You still cannot adjust core clocks there; that's why we use Precision.

Now just X out (the top red X box for closing the window) and an Nvidia pop-up box will come up asking if you want these settings applied at Windows startup. Check yes, and you now have 24/7 overclocks on your video card; UAC will never prompt you after a restart or shutdown. It's a set-it-and-forget-it affair and very easy to do imo.
The only other mention is GPU-Z. It has a Sensors tab, so one can see temps not found in the other two, most importantly memory and PCB temps. These are nice to know and can be monitored and recorded in the background while benching or gaming as well.
I hope this little OC guide helps. So post them up, people!
New results on 790i with the 177.41 driver. I was previously on P35 with DDR2-800 RAM.
I am at 11989 for the GPU now in Vantage. My previous results were on P35 with the 177.39 driver; this new one is 790i with the 177.41 driver. My old best GPU score was 11704; now it's 11989 at the same GPU clock. The difference is the driver and platform.
Total score 14605, validate-
Card was @ 742 core / 2600 mem / 1547 shader, unlinked.
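For context, the jump from 11704 to 11989 at the same card clocks is only a couple of percent, but it's free performance from the platform and driver alone. A quick check of the arithmetic (scores taken from this post):

```python
# Vantage GPU scores from the post: P35 + 177.39 vs 790i + 177.41
old_score, new_score = 11704, 11989
gain_pct = (new_score - old_score) / old_score * 100
print(f"GPU score gain: {gain_pct:.1f}%")  # -> GPU score gain: 2.4%
```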
The core overclock seems to make no difference to performance here; shader and memory, however, do. Try unlinking the shader and test with the core at stock and at 700+; the results are exactly the same here.