8800GTS 512mb (G92) overclock results?

Discussion in 'Videocards - NVIDIA GeForce' started by AllShallPerish, Jan 27, 2008.

  1. roguesn1per

    roguesn1per Ancient Guru

    Messages:
    9,511
    Likes Received:
    0
    GPU:
    GTX580
    Barton C++

    Your memory is too high.

    Put it down to 1050 and the shader to 1900 or 1950.
     
  2. Barton C++

    Barton C++ Master Guru

    Messages:
    600
    Likes Received:
    0
    GPU:
    MSI Hawk GTX460 1GB @ SLI
    Yup, installed ATITool to really be able to discover artifacts.

    Right now, the shader clock and memory are what I've had to bring down the most.

    792/1890/1053 are my artifact-free MHz.

    ATITool ('latest stable') isn't even half OK with Vista x64 / 8800GTS 512MB, but artifact scanning works :) 55C with fan at 80%

    EDIT ///

    796/1911/1053

    OK, any ideas about this?
    RivaTuner displays 799.2/1890/1053 in the hardware monitor, and its next higher shader clock step is 1944... The weird thing is I got artifacts at 1912 (1890MHz in hwmon), but at 1911 (yes, still 1890 in the RT hwmon) the artifacts stopped.
    Gonna test it tomorrow again...
     
    Last edited: Mar 3, 2008
  3. private-karel

    private-karel New Member

    Messages:
    3
    Likes Received:
    0
    GPU:
    Asus EN8800GTS/HTDP/512M
    Hi everyone!

    I just decided to overclock my 8800GTS, after reading this thread :p
    At this moment I'm at the following speeds (I'm testing these settings tonight, but looking at the speeds you guys are getting I don't think there will be any problems):
    700 / 1750 / 1944 MHz

    And since this is my first post on this forum (I just registered), I would like to say: what a GREAT COMMUNITY here!

    By the way, good idea to compare overclocks, it gives a good estimate of what to expect from it (I'm a noob in OC'ing :D)...
     
  4. roguesn1per

    roguesn1per Ancient Guru

    Messages:
    9,511
    Likes Received:
    0
    GPU:
    GTX580
    Use RivaTuner and unlink your shader clock.

    Put your numbers straight to this:

    750/1900/2000

    Should be perfectly stable
     

  5. El Bastardo

    El Bastardo Member

    Messages:
    16
    Likes Received:
    0
    GPU:
    -
    "Crysis" approved

    And I thought my card could clock higher...

    With games like "Ghost Recon Advanced Warfighter" and "Unreal Tournament III" more MHz were possible. With those games I could clock it up to 802/1922/1195 (2390 DDR) MHz stable, but under "Crysis" and many hours of constant gameplay I found my realistic maxima:

    799/1890/1152 (2304 DDR) MHz

    It's an EVGA SSC Edition card which is factory overclocked to 700/1835/990 (1980 DDR) MHz, stock cooler (good cooler I have to say) and no voltage mods applied.

    The card has not reached a temperature higher than 65°C yet!!
     
    Last edited: Mar 4, 2008
  6. Barton C++

    Barton C++ Master Guru

    Messages:
    600
    Likes Received:
    0
    GPU:
    MSI Hawk GTX460 1GB @ SLI
    I did my final artifact scanning, final clocks are (set in RivaTuner 2.06) "802/1917/1057" ... effective clocks are 799/1890/1053 (2106)

    Great tool, ATITool, hopefully I never ever get driver stop errors :D

    Last night I tested with the window open, fan @ 80% and GPU core temp at a constant 50C; the shader clock was 'ATITool artifact-scanning error free' at 1944MHz, despite a few yellow dots.

    Happy oc'ing !
     
  7. El Bastardo

    El Bastardo Member

    Messages:
    16
    Likes Received:
    0
    GPU:
    -

    That is a point we should speak about some more because I also noticed this behavior and have this "problem".

    I unticked the "link shader clock" checkbox, started to move the slider independently, and used the StatisticsServer to verify everything, but within the boundaries of 1890 to 1950 MHz, whatever values I set on the slider, the values reported by the StatisticsServer are either 1890 or 1944 MHz.

    I thought the shader clock could be set completely independently. Or does it only mean I can set the shader clock apart from the core clock (so not linked), BUT there are rules (like multiples of the core or something I do not know yet) that prevent me from setting it in 1 Hz steps??

    Also in conjunction with this: could someone please explain (clarify) to me what the NVAPI really does for me?? - I mean, as a Power User, is setting the many different NVAPI... settings to 1 good or bad??


    Last thing: I think it's strange how "differently" many cards respond to overclocking. I saw many people say they can easily overclock their shader to more than 2000, but I see RAM values of "just" 1050, etc.
    And my card freezes with the shader at 1944 (see above for why exactly 1944), but RAM at over 1150 with Crysis, or even 1195 with other games, is possible over hours of playing.
     
    Last edited: Mar 4, 2008
  8. roguesn1per

    roguesn1per Ancient Guru

    Messages:
    9,511
    Likes Received:
    0
    GPU:
    GTX580
    Shader clocks go up in 54 MHz steps.
     
  9. El Bastardo

    El Bastardo Member

    Messages:
    16
    Likes Received:
    0
    GPU:
    -
    So my thoughts were right. Why 54 MHz?? - Hmmm, maybe it's the 27 MHz reference clock times two; that seems plausible.

    So that means there is no way to run at a shader clock between the mentioned 1890 and 1944 MHz (at least for the 8800 GTS 512 MB G92 chip), and everyone who claims (or thinks) to run the shader at a clock like 1920 etc. is wrong: any set clock between 1890 and 1917 results in 1890, and any set clock between 1918 and 1944 results in 1944.
    Am I right so far?? - If so, the first part is bad because of the limited control, but the second part is good because it makes the overclocking easy.
    In my case 1890 works and 1944 is too much for extensive Crysis!!
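
    For anyone curious, here's a minimal Python sketch of that quantization idea (assuming the driver simply rounds the requested shader clock to the nearest multiple of 54 MHz; the exact rounding behaviour at the boundaries isn't confirmed in this thread):

    Code:
    # Sketch of the 54 MHz shader clock quantization discussed above.
    # Assumption: the effective clock is the requested clock rounded to the
    # nearest multiple of 54 MHz (2 x the 27 MHz reference clock).
    STEP_MHZ = 54

    def effective_shader_clock(requested_mhz: float) -> int:
        """Round a requested shader clock to the nearest 54 MHz step."""
        return int(round(requested_mhz / STEP_MHZ)) * STEP_MHZ

    for req in (1890, 1911, 1920, 1944, 1950):
        print(req, "->", effective_shader_clock(req))
    # 1890 -> 1890, 1911 -> 1890, 1920 -> 1944, 1944 -> 1944, 1950 -> 1944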
     
  10. == x86 ==

    == x86 == Member Guru

    Messages:
    161
    Likes Received:
    0
    GPU:
    Geforce 8800GT 512MB
    It's time to get the GTS, am I wrong?
     

  11. Barton C++

    Barton C++ Master Guru

    Messages:
    600
    Likes Received:
    0
    GPU:
    MSI Hawk GTX460 1GB @ SLI
    My MSI card has a shader clock multiplier of ~2.548, so if I used "linked mode", I'd see artifacts already at 756/1944/1053.
    This clever multiplier unlock gives me another 43MHz on the core :D

    EDIT ///

    Just for fun, I checked "link clocks" and ran ATITool's artifact scan with my 'artifact free' core speed of 799MHz. The shader clock rose to 2052MHz and it showed hundreds of artifacts from the first frame! So, if people are stuck near 755MHz, uncheck "link clocks" in RivaTuner for separate clocks.
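
    A rough Python sketch of that linked-mode arithmetic (assuming the ~2.548 multiplier reported above and the 54 MHz shader snapping discussed earlier; the driver's exact rounding rule is an assumption):

    Code:
    # Estimate the effective shader clock linked mode would pick.
    # Assumptions: multiplier ~2.548 (this MSI card) and 54 MHz snapping.
    STEP_MHZ = 54
    MULTIPLIER = 2.548

    def linked_shader_clock(core_mhz: float) -> int:
        """Core clock times the linked multiplier, snapped to the nearest 54 MHz step."""
        return int(round(core_mhz * MULTIPLIER / STEP_MHZ)) * STEP_MHZ

    print(linked_shader_clock(756))  # 1944 -> already at the artifact limit on this card
    print(linked_shader_clock(799))  # 2052 -> hundreds of artifacts, as described above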
     
    Last edited: Mar 4, 2008
  12. roguesn1per

    roguesn1per Ancient Guru

    Messages:
    9,511
    Likes Received:
    0
    GPU:
    GTX580
    Most people say what they put in the driver tab as what they overclocked to.

    Like me... I put 1950, but it's actually running at 1944. Same with the core... I put 800 but it's running at 799.
     
  13. alanm

    alanm Ancient Guru

    Messages:
    9,826
    Likes Received:
    2,001
    GPU:
    Asus 2080 Dual OC
    Seems these cards, when pushed, can even beat an Ultra that's pushed. The world record for 3DMark06 with single cards is a GTS 512 getting about 23,000 points. Of course a 5.6GHz OC'd Q9650 played a big part in that score.
     
  14. roguesn1per

    roguesn1per Ancient Guru

    Messages:
    9,511
    Likes Received:
    0
    GPU:
    GTX580
    They can beat the Ultra in games as well, up to about 1600x1200; beyond that, the Ultra usually wins by about 5-10 fps.
     
  15. RidgeMaster2k

    RidgeMaster2k New Member

    Messages:
    2
    Likes Received:
    0
    GPU:
    EVGA 8800 GTS G92
    Baffled

    Hey all I'm new here and found this thread while looking for help.

    I have EVGA's 8800 GTS (G92). I noticed when I ordered the card that the stock clocks are:

    Core Clock: 740MHz
    Memory Clock: 1940MHz
    Shader Clock: 1625MHz

    Now when I opened RivaTuner it states that my clocks are as follows:

    Core Clock: 740MHz (same as stock)
    Shader Clock: 1674MHz (slightly higher)
    Memory Clock: 972MHz (HUH??)

    What exactly am I missing here? Why is the memory clock almost 1000MHz under the stock description? This is going to be my first experience with OC'ing a graphics card, and any tips and tricks would really help.

    Thanks.
     

  16. alanm

    alanm Ancient Guru

    Messages:
    9,826
    Likes Received:
    2,001
    GPU:
    Asus 2080 Dual OC
    Double data rate (DDR) memory: 972 x 2 ≈ 1940. The spec lists the effective DDR rate, which is double the actual clock that RivaTuner shows.
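
    Or, as a trivial sketch of the same arithmetic:

    Code:
    # RivaTuner shows the actual memory clock; spec sheets list the effective
    # (DDR) rate, which is double the actual clock.
    actual_mhz = 972
    effective_ddr_mhz = actual_mhz * 2
    print(effective_ddr_mhz)  # 1944 -- i.e. the advertised ~1940 MHz figure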
     
  17. El Bastardo

    El Bastardo Member

    Messages:
    16
    Likes Received:
    0
    GPU:
    -
    Hehe, or what they "wish" they could overclock it to -_^


    So it's kind of like showing off in front of yourself... :p just joking.
    But actually the topic is interesting:
    Does it have any kind of consequence or effect when you, e.g., set the clock to, let's say, 1960 and hardware monitoring reports 1944?

    Case 1: The card tries to run at 1960 but the resulting speed is only 1944, and you lose something (more electricity, overhead, whatever) in the process compared to setting the clock to exactly 1944 within RivaTuner (bad thing).
    Case 2: No difference in the result (neutral thing).
    Case 3: RivaTuner reports 1944 MHz but the real speed is indeed 1960, or something higher than 1944 and between the two values (good thing).

    A clarification in this matter would be interesting!!
     
  18. El Bastardo

    El Bastardo Member

    Messages:
    16
    Likes Received:
    0
    GPU:
    -

    Which version of the card did you buy?? - Standard Edition, KO or SSC??

    Whichever version it is, those values are not normal for any of the three. I know this because... look at my profile... I also own an EVGA card (SSC Edition) and know their clocks.

    What does GPU-Z report for you??

    The clocks should be, respectively, 670/1679/970 (Standard), 700/1790/990 (KO Edition) or 740/1835/990 (SSC Edition), with the RAM figure DDR-doubled on the spec sheet.

    If your values are right, that looks to me like a screwed-up card BIOS!!
     
  19. roguesn1per

    roguesn1per Ancient Guru

    Messages:
    9,511
    Likes Received:
    0
    GPU:
    GTX580
    Setting to 1944 or 1950 makes no difference.

    They both give the same result. (1944 might put the clock down though)
     
  20. Barton C++

    Barton C++ Master Guru

    Messages:
    600
    Likes Received:
    0
    GPU:
    MSI Hawk GTX460 1GB @ SLI
    LOL :D

    I've been mostly running 169.38 with the G92 GTS, but on the first night I tested 171.20 briefly. All my previous artifact scans were with 169.38... However, I thought UT3 played choppy, not even close to the smoothness I had with the old G80 GTS 640MB and the same driver, so I installed 171.20 (Vista x64). 3DMark06 dropped maybe 40 points, but I checked whether UT3 played well: yup, it did.

    Now the really interesting part:

    Started artifact scanning with the core slightly higher than my previous artifact-free speed (802 vs the old 799MHz)... 5-10 minutes passed with no errors (except 2-3 dots at clock changes), so I applied more memory speed... still no errors :D ...more and more memory speed until I reached 1080MHz (now maybe 15 min into scanning). Added more core speed... 810MHz... np, a few dots at clock change again, but that's all. I'm amazed; saving clock profiles as they clear error-free minutes... I stopped after 25 mins of error-free scanning at the new OC:
    810/1890/1102 (2203) MHz effective clocks - fan 80%, GPU temp 54C

    Now for the real tests... CoD4, CSS, Vegas, UT3, GoW, Crysis, etc.

    EDIT ///

    CoD4, no artifacts or problems @ 15mins :banana:

    R6 Vegas: 810MHz on the core was too much; every 3 min or so the game (driver) paused for 30 secs, but it never reset to stock MHz as 169.38 did. Had maybe 8 driver stops (no BSOD) before I changed clocks. 802/1890/1102MHz is OK so far.

    Looks like many cards have the same OC limit, with individual games/settings just putting different stress on the card.
     
    Last edited: Mar 5, 2008
