GTX 280/260 overclocks thread

Discussion in 'Die-hard Overclocking & Case Modifications' started by Rodman, Jun 20, 2008.

  1. BlackZero

    BlackZero Guest

    I haven't been able to figure out what causes it either, but it only happens when I overclock and force a higher fan speed; after reverting back to default the clocks stay at full 3D frequencies. A restart fixes this and then it operates as normal, dropping down to 2D clocks etc.
     
  2. OCZ

    OCZ Master Guru

    Messages:
    493
    Likes Received:
    0
    GPU:
    GTX 570
    Yeah, it must be the driver itself ..
    Looks like it won't go back to 2D after 3D,
    a restart does its job tho ..
    Kinda lame cause temp. is like 5-6C higher in idle then ..

    @ skugpezz

    I can control the fan, I just can't set fan profiles (you know, like your own auto mode)
     
  3. Barton C++

    Barton C++ Guest

    Messages:
    603
    Likes Received:
    1
    GPU:
    MSI Hawk GTX460 1GB @ SLI
    EDIT /// 756/1512/1242 (2484)MHz FurMark artifact free for 20min @1680x1050, 4xAA. Any increase on mem or shader equals BSOD, and core can't be raised above half the shader freq.
    59C was max load temp while stress testing (got lower each day, had the card 3 days now) with fan on 80%, that's 21C lower than ATI 4800 series idle temp? ;)

    3DMark06 @ 1680x1050, with 4xAA and 16xAF: 12,691, my friend's max oc'd GTX280 (702/1458/1215 something) got up to 13,000, both Vista x64 and E8400 (mine 4GHz, his 3.6GHz)

    Summary: VERY PLEASED !


    Leadtek GTX260 with stock 576/1242/999MHz running nicely at 756/1512/1215MHz.

    I've noticed ATITool shows artifacts when I go lower than the exact 2.0 ratio (shader/core), say 729/1512 would show arties (but no error reported), but not 756/1512

    Similar with say 771/1566 (these are the actual clocks), arties right away, but with ratio 2.0: 783/1566 no arties (except BSOD at this OC after half minute :D )

    My card was packed Aug 16th (sticker on the box) and it uses a different BIOS than what's found at TechPowerUp's VGA BIOS Collection. This uses 1.15v and the other/older one 1.12v.
     
    Last edited: Aug 30, 2008
  4. boOzy

    boOzy Active Member

    Messages:
    75
    Likes Received:
    0
    GPU:
    Radeon R290
    MSI GTX280 on 177.83 drivers, running these atm, gonna go higher soon cause no problems yet.

    GPU 710
    Memory 1220
    Shader 1420

    Max temp 85c fan auto
     

  5. mikeyakame

    mikeyakame Master Guru

    Messages:
    207
    Likes Received:
    0
    GPU:
    GTX280 729/1566/2700
    It isn't possible to set the shader clock to a value that isn't divisible by 54MHz. All shader clock frequencies are generated from the following formula with an integer multiplier applied. It is impossible to have any other value which can not be generated from this formula. It's by design.

    [ int_min_rop_shader_ratio * pll_base_clock * int_multiplier ]

    int_min_rop_shader_ratio == 2
    this refers to the minimum valid rop to shader clock frequency ratio in integer form. it is always equal to 2 by design. Any ratio less than 2 is not valid for generating shader clocks based on rop clock.

    pll_base_clock == 27
    pll_base_clock is the reference clock frequency generated by the internal PLL clock generator IC that all valid clock frequency steps are generated from. It is always 27MHz.

    int_multiplier == unsigned integer
    integer multiplier ie, 1, 2, 6, 15, etc. necessary to attain requested clock frequency

    there are 2 cases that exist where a new shader clock frequency is requested.

    case 1:
    [ shader_freq % ( pll_base_clock * 2 ) <= pll_base_clock ]
    ie ( 1420 % ( 27 * 2 ) == 16 ) <= 27

    The requested shader clock here is the previous poster's 1420MHz, for example. The remainder of this frequency under the modulus is less than the minimum value of 27, i.e. less than half a step towards the next available multiplier. So the clock change request results in the nearest multiplier that gives a frequency less than requested; in this case the change would end up at a 1404MHz shader clock frequency.

    case 2:
    shader_freq % ( pll_base_clock * 2 ) > pll_base_clock
    ie ( 1442 % ( 27 * 2 ) == 38 ) > 27
    A requested 1442MHz shader frequency has a modulus remainder of 38, which is greater than 27, i.e. in the range of 0.51 -> 0.99 of a multiplier step.

    The multiplier will always be a whole number, ie 10, 11, 12... the necessary multiplier depends on the distance to the closest previous or next multiplier. Since the next multiplier in this case is 27x, the drivers will accept the change request and the shader clock frequency will be raised to a working frequency of 1458MHz.

    Valid shader clocks only exist in these 2 cases, based on the above formulas, for Performance 3D mode. There is one case where this rule doesn't apply, and that is 2D / low power 3D clock mode. In those modes the clocks are based off the PCI-E PLL reference clock (100MHz unless changed in BIOS).
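    The two cases above boil down to rounding the request to the nearest multiple of 54MHz. A minimal Python sketch of that rounding, assuming the 27MHz PLL base and minimum ROP:shader ratio of 2 described in the post (the function name is my own):

```python
# Sketch of the shader clock rounding described above.
# Assumes: 27MHz PLL base clock, minimum ROP:shader ratio of 2,
# so valid 3D shader clocks are multiples of 54MHz.

PLL_BASE = 27                 # MHz, internal PLL reference clock
MIN_ROP_SHADER_RATIO = 2      # minimum valid ROP-to-shader ratio
STEP = MIN_ROP_SHADER_RATIO * PLL_BASE   # 54MHz between valid shader clocks

def snap_shader_clock(requested_mhz: int) -> int:
    """Return the shader clock the driver would actually program."""
    remainder = requested_mhz % STEP
    if remainder <= PLL_BASE:
        # case 1: within half a step, round down to the previous multiple of 54MHz
        return requested_mhz - remainder
    # case 2: more than half a step, round up to the next multiple of 54MHz
    return requested_mhz + (STEP - remainder)

print(snap_shader_clock(1420))  # case 1 -> 1404
print(snap_shader_clock(1442))  # case 2 -> 1458
```

    This reproduces the thread's examples: a 1420MHz request lands on 1404MHz, a 1442MHz request on 1458MHz, and an exact multiple like 1512MHz is left alone.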
     
  6. blunden

    blunden Guest

    Messages:
    929
    Likes Received:
    8
    GPU:
    ASUS RTX 3080 TUF
    What BIOS version is that? My Leadtek card uses the same as on TechPowerUp it seems.
     
  7. Barton C++

    Barton C++ Guest

    Messages:
    603
    Likes Received:
    1
    GPU:
    MSI Hawk GTX460 1GB @ SLI
    extracted from RT2.10

    $1100000000 Title : WinFast GTX 260 (2008072201)
    $1100000002 Version : 62.00.1A.00.05


    I'm having issues at 756/1512/1242; CoD4 will crash sooner or later.
    I discovered that after I put this card in (2x6-pin), my CPU vcore is a notch lower (1.328 vs 1.344V)... I raised it in the BIOS to match my 9hr Prime95 stable setting (1.344V). Yet to see if it was the GPU oc or the vcore messing things up. If I'm lucky, it was only vcore droop!!! Also increased the MCH while I was in the BIOS (9x438)
     
  8. Barton C++

    Barton C++ Guest

    Messages:
    603
    Likes Received:
    1
    GPU:
    MSI Hawk GTX460 1GB @ SLI

    Did anyone read all that? What I know is the shader clock goes by 54MHz, that's definite. Core and mem clock USUALLY go by half that, 27MHz... but according to oc apps, both mem and core can take much much smaller changes (core 702->, mem 1215->), 9MHz if I recall right. That is fine tuning vs 54/27MHz

    But my card works the BEST @ 2.0 ratio shader/core.. arty free @ highest oc, yet with a lower core that breaks the 2.0 law: arty error in atitool in an instant. Haven't tested if it generates real game errors, don't want to, my Vista index is pretty low by now already...

    What you're referring to might be the oc app's "set clock" vs "actual clock"?

    I mean actual clocks, ie read-outs from evga tool 1.3.2 or rt 2.10
     
    Last edited: Sep 1, 2008
  9. blunden

    blunden Guest

    Messages:
    929
    Likes Received:
    8
    GPU:
    ASUS RTX 3080 TUF
    Interesting. Mine just reads the following, probably the same as standard.

    $1100000000 Title : WinFast GTX 260
    $1100000002 Version : 62.00.0E.00

    Would you mind dumping the BIOS with RivaTuner, GPU-Z or similar?
     
    Last edited: Sep 2, 2008
  10. Cubemonkey

    Cubemonkey Member Guru

    Messages:
    179
    Likes Received:
    0
    GPU:
    Crossfire 7970s
    Got my BFG GTX 260 yesterday. After running around on Very High in Crysis, I started playing with the clocks and ran this

    BFG GTX 260 OC @ 722/1448/1184
    3dMark Vantage 11,448
    http://service.futuremark.com/compare?3dmv=353802

    Seems stable. Well below the heights of some others in this thread. More to follow.
     

  11. josephmoore

    josephmoore Member

    Messages:
    24
    Likes Received:
    0
    GPU:
    Zotac GTX 295
    Got my Zotac GTX 280 yesterday and tried some overclocking (at least to get to AMP! edition clocks). It easily reaches 702-1458-2450; I did not try further. With shaders at 1512, I see artifacts in ATITool but it passes FurMark and the NVIDIA test. Which one should I believe? Is ATITool trustworthy for new cards?
     
  12. Cubemonkey

    Cubemonkey Member Guru

    Messages:
    179
    Likes Received:
    0
    GPU:
    Crossfire 7970s
    The artifacts are most likely showing up b/c your core and shader clocks aren't adhering to the 2:1 ratio they like to be at. Either raise the core to 756 (half of 1512) or drop the shaders to 1404 (twice 702). The artifacts should go away.
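    That 2:1 fix is simple arithmetic. A hypothetical little helper (the name and shape are my own, assuming the ratio rule discussed earlier in the thread) that prints both options for any core/shader pair:

```python
# Hypothetical helper: given a core clock and a desired shader clock,
# suggest the two pairs that restore the 2:1 shader:core ratio --
# either keep the core and lower the shaders, or keep the shaders
# and raise the core.

def ratio_fix(core_mhz: int, shader_mhz: int):
    """Return ((core, shader), (core, shader)) options with shader == 2 * core."""
    keep_core = (core_mhz, core_mhz * 2)         # drop the shaders to 2x core
    keep_shader = (shader_mhz // 2, shader_mhz)  # raise the core to shader / 2
    return keep_core, keep_shader

print(ratio_fix(702, 1512))  # ((702, 1404), (756, 1512))
```

    Whether either pair is actually stable on a given card is another matter, of course; this only restores the ratio.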
     
  13. Barton C++

    Barton C++ Guest

    Messages:
    603
    Likes Received:
    1
    GPU:
    MSI Hawk GTX460 1GB @ SLI
    joseph,
    I would say that OC might not be stable; it depends whether you like playing MAX settings or MAX FPS. Some see artifacts in atitool/furmark but not in their games, because they don't put max settings on. I found my max oc wasn't stable when I set Crysis to very very high.

    There is a magical limit there, 1512MHz shaders... easy games pass for most, but those with 1600+ probably run 1280x1024 rez and play medium for 500fps :D
     
  14. FNK

    FNK Guest

    Messages:
    2,762
    Likes Received:
    1
    GPU:
    XFX GTR 480 XXX
    Yes me too. I will play the same games anyway at 1600x1200.
    My GTX260 should be here tomorrow!
    8800GTS (sold) > 6800XT (backup) > 260GTX (NEW!!)
     
  15. josephmoore

    josephmoore Member

    Messages:
    24
    Likes Received:
    0
    GPU:
    Zotac GTX 295
    I prefer MAX settings :)

    Last night I tried a 1512 shader clock; in ATITool it starts artifacting immediately even at 100% fan speed. I think my card cannot reach 1512 shader without water cooling. Has anyone reached 1500+ shader on stock cooling?
     
    Last edited: Sep 5, 2008

  16. Cubemonkey

    Cubemonkey Member Guru

    Messages:
    179
    Likes Received:
    0
    GPU:
    Crossfire 7970s
    What was your core clock?
     
  17. han16

    han16 Member

    Messages:
    21
    Likes Received:
    0
    GPU:
    XFX GTX 260 SLI
    I have 2x XFX GTX 260s; they are very fast and blow the 9800GTX tri-SLI I had out of the water for Crysis gaming with AA turned on.

    Anyway, I have these speeds:

    Core 709
    Memory 1241
    Shaders 1528

    Not really bothered trying to push the cards all that much.
     
  18. han16

    han16 Member

    Messages:
    21
    Likes Received:
    0
    GPU:
    XFX GTX 260 SLI
    Just because selling 3x 9800GTXs translates to 2x GTX 260 in money...
     
  19. blunden

    blunden Guest

    Messages:
    929
    Likes Received:
    8
    GPU:
    ASUS RTX 3080 TUF
    Is ATITool's artifact scanning reliable on these cards? I tried just bumping up my clocks a bit more than I usually do and I got this:

    [IMG]

    How can it still say there were no errors? I thought the yellow parts were artifacts. If they are, what is most likely to cause them?
     
    Last edited: Sep 7, 2008
  20. blunden

    blunden Guest

    Messages:
    929
    Likes Received:
    8
    GPU:
    ASUS RTX 3080 TUF
    No one?
     

Share This Page