Enabling the new Shader overclock slider in Windows XP

Discussion in 'RivaTuner Advanced Discussion forum' started by Hilbert Hagedoorn, Sep 30, 2007.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    38,996
    Likes Received:
    7,685
    GPU:
    AMD | NVIDIA
  2. sunkid

    sunkid Master Guru

    Messages:
    245
    Likes Received:
    1
    GPU:
    STRIX GTX 1070 Ti
    Thx for the heads-up. But one question emerges: is this available only for the 8xxx NVIDIA cards, or for older series as well?

    I've tried this on my 7950GT 512 PCI-E and it was a no-no. :)
     
  3. burebista

    burebista Ancient Guru

    Messages:
    1,734
    Likes Received:
    27
    GPU:
    MSI GTX1060GAMING X
    From release notes:
     
  4. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    38,996
    Likes Received:
    7,685
    GPU:
    AMD | NVIDIA
    Correct. Only the DX10 8x00 cards have a unified shader domain.
     

  5. sunkid

    sunkid Master Guru

    Messages:
    245
    Likes Received:
    1
    GPU:
    STRIX GTX 1070 Ti
    Aye. Forgot about that. Sry! :)
     
  6. Pa3PyX

    Pa3PyX Member

    Messages:
    28
    Likes Received:
    0
    I too was wondering if it is possible to overclock all three domains in GeForce 7 series GPUs separately, rather than just the core and memory clocks. The geometric domain can obviously be overclocked separately by changing the "Geometric Delta" in the BIOS with NiBiTor and flashing the resulting ROM file. So it's probably a matter of register settings, and should be possible by directly messing with the PLL registers using the /WR switch, coupled with the "Force constant performance level" flag. I was just curious whether there is any GUI or power-user setting that would allow separate clock adjustment for each domain; to date I haven't found any, but I probably wasn't looking hard enough and possibly did not read the manual thoroughly.
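    For anyone curious what "messing with the PLL registers" actually involves, here is a minimal sketch of the idea, assuming the refclk * N / (M * 2^P) coefficient arrangement commonly described by open-source driver work for GPUs of that era. The bit layout, the 27 MHz reference and the example coefficient are illustrative assumptions, not real NV4x values or RivaTuner code:

    ```c
    /* Purely illustrative: decode a single-stage PLL coefficient word into a
     * frequency. Field positions and the reference clock are assumptions. */
    #include <stdio.h>
    #include <stdint.h>

    #define REFCLK_KHZ 27000u            /* assumed crystal/reference clock */

    static unsigned pll_to_khz(uint32_t coef)
    {
        unsigned m = (coef >>  0) & 0xFF;   /* hypothetical field positions */
        unsigned n = (coef >>  8) & 0xFF;
        unsigned p = (coef >> 16) & 0x07;

        if (m == 0)
            return 0;                       /* guard against junk coefficients */
        return (REFCLK_KHZ * n) / (m << p);
    }

    int main(void)
    {
        /* Example coefficient chosen only to show the math: M=3, N=69, P=0
         * -> 27 MHz * 69 / 3 = 621 MHz, the core clock discussed in this thread. */
        uint32_t coef = (0u << 16) | (69u << 8) | 3u;
        printf("decoded clock: %u kHz\n", pll_to_khz(coef));
        return 0;
    }
    ```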
     
  7. Unwinder

    Unwinder Moderator Staff Member

    Messages:
    14,880
    Likes Received:
    2,026
    There is no such functionality implemented for NV4x family in the ForceWare.
     
  8. Pa3PyX

    Pa3PyX Member

    Messages:
    28
    Likes Received:
    0
    In other words, it cannot be done entirely at the driver level. I thought so too: one would have to first fix the performance level somehow so that the driver doesn't mess with the affected registers, and then write the registers directly. Even then, the "Test" button would be no good anymore, because (1) the associated driver call probably changes the frequencies briefly, destroying the register values, and (2) the internal test really only exercises the shader domain, from what I understand. It does not test the geometric domain, at least not extensively (as my experiments with the geometric delta seem to confirm), and I'm not sure it tests the rasterizer either. Which is OK for general purposes (overclocking all three at the same frequency), because normally the shading units are what starts flaking out first (someone correct me if I'm wrong).

    In short, a "user-friendly" implementation of such functionality would be a mess, especially since overclocking different domains separately is more trouble than it's worth, with the possible exception of the geometric domain, which can already be done via NiBiTor if one really wants to squeeze every last volt and megahertz out of their card. :D
     
    Last edited: Nov 4, 2007
  9. Unwinder

    Unwinder Moderator Staff Member

    Messages:
    14,880
    Likes Received:
    2,026
    Do you realize that low-level overclocking is not just a single trivial direct GPU register write? Normally it involves a few dozen reads/writes/delays.
     
  10. Pa3PyX

    Pa3PyX Member

    Messages:
    28
    Likes Received:
    0
    I do know that there are usually a few thousand stabilization cycles needed to let various circuits recalibrate themselves, yes -- at least, that's the way it is on motherboards that can change CPU clock frequencies "in software"... How many writes need to be performed and where, I would not know, and probably would not be able to figure out without a data sheet by just comparing register dumps. But probably yes, one would need to do something to shut down all I/O operations currently in progress, etc. before changing clock frequencies, or we would just get a hang. RC will probably reset our adapter, but in doing so will reset the performance level as well.

    My guess is, you figured out how to do low level ATi overclocking by stepping through relevant driver code with a kernel debugger? (Perhaps even more involved, as newer ATi boards seem to handle overclocking partially at BIOS level?)
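    Putting the last few posts together, the kind of sequence being described would look roughly like the sketch below. Everything here, the helper names, register indices, bit meanings and delay values, is invented for the example (a fake in-memory register file stands in for the GPU); it only illustrates why this is "not a single trivial register write":

    ```c
    /* Illustrative only: the general shape of a low-level clock change. */
    #include <stdint.h>
    #include <stdio.h>

    #define REG_PERF_CTRL  0   /* hypothetical register indices */
    #define REG_PLL_COEF   1
    #define REG_PLL_CTRL   2

    static uint32_t regs[3] = { 0, 0, 0x1 };    /* pretend the PLL reports "locked" */

    static uint32_t gpu_rd32(int reg)             { return regs[reg]; }
    static void     gpu_wr32(int reg, uint32_t v) { regs[reg] = v; }
    static void     delay_us(unsigned us)         { (void)us; /* no-op in the sketch */ }

    static int set_domain_clock(uint32_t new_coef)
    {
        /* 1. Pin the performance level so the driver stops reprogramming the
         *    same registers behind our back ("force constant performance level"
         *    in RivaTuner terms). */
        gpu_wr32(REG_PERF_CTRL, gpu_rd32(REG_PERF_CTRL) | 0x1);

        /* 2. Quiesce: wait until the clock block reports idle; changing the PLL
         *    mid-transfer is a good way to hang the card. */
        while (gpu_rd32(REG_PLL_CTRL) & 0x80000000u)
            delay_us(10);

        /* 3. Program the new coefficient, then give the PLL time to relock.
         *    In practice this step alone is a series of reads/writes/delays. */
        gpu_wr32(REG_PLL_COEF, new_coef);
        delay_us(500);

        /* 4. Verify the PLL reports lock before letting the driver resume. */
        return (gpu_rd32(REG_PLL_CTRL) & 0x1) ? 0 : -1;
    }

    int main(void)
    {
        printf("result: %d\n", set_domain_clock(0x00004503u));
        return 0;
    }
    ```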
     
    Last edited: Nov 4, 2007

  11. krusher_r

    krusher_r Member

    Messages:
    22
    Likes Received:
    0
    Cool; I've been running my XFX 8800GTX at XXX speeds but left the shader alone (which means the shader was actually beyond the XXX shader speed, since it follows the core). The XFX XXX is 621/1400/1000 according to Hilbert's article.

    Since I went with some aggressive fan settings, I decided to "go for it" and try the "Ultra" settings of 612/1500/1080. I played the entire COD4 demo maxed out with no artifacts or lockups.

    I was going to go for more, but I think I'll play the demo one more time and see what my temps end up. No sense cookin' it for little payback.
     
  12. Vector

    Vector Ancient Guru

    Messages:
    1,509
    Likes Received:
    0
    GPU:
    GTX 770 FTW 4gb w/bkplate

    And a no-no on my 7950GX2 with the 169.04 driver.
     
  13. krusher_r

    krusher_r Member

    Messages:
    22
    Likes Received:
    0
    After a bit of experimentation with Test Drive Unlimited "TDU" last night (a very demanding game on the gfx card), I've noticed something odd that I can't quite explain.

    Before I enabled the shader control, I had my standard XFX 8800GTX running at 621/1000. This runs the shader faster than a stock XXX card, because I learned from previous postings that the shader is tied to the GPU speed. I ran my card at 621/1000 for months with stock fan speeds; no problems.

    So, I unlocked the shader control and set it to 612/1500/1080 like the Ultra is set. When I went to play TDU, after a few minutes I get the "pink screen of death". (Graphics card lockup.)

    I figure OK, 1500/1080 is too fast for my shader or RAM, so I'll back it off to the real XXX speed of 621/1400/1000. TDU gives me the pink screen again after a short period of time.

    So, I undo the shader unlock in Rivatuner and go back to 621/1000; no more pink screens. Now, I'm a little puzzled. Either my particular card doesn't like the shader to be unlocked from the GPU, or perhaps the frequencies aren't being calculated correctly when in this mode? This isn't something that I suggest Unwinder spend hours pondering; as very few people unlock the shader controls and my card is perfectly happy at 621/1000. I just thought I'd mention it in case someone else has similar issues.
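    (For reference, the numbers in this thread are consistent with the linked behaviour: a stock 8800 GTX runs 575 MHz core / 1350 MHz shader, a ratio of roughly 2.35, so a 621 MHz core drags the linked shader up to about 621 × 2.35 ≈ 1458 MHz, the same 621/1458 reading quoted from the hardware monitor later in the thread, and already above the XXX's rated 1400.)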
     
  14. Unwinder

    Unwinder Moderator Staff Member

    Messages:
    14,880
    Likes Received:
    2,026
    Or perhaps you failed to use search correctly?

    http://forums.guru3d.com/showpost.php?p=2443658&postcount=74

    Slowly reread the last sentence.
     
  15. krusher_r

    krusher_r Member

    Messages:
    22
    Likes Received:
    0
    Thanks for your reply Unwinder.

    I think I found what it was. After my nephew played the Crysis demo for a while I was still getting the pink screen of death, so I installed ATITool (to see if I was still stable at 621/1000).

    The pink screen returned after only ~2 minutes and I started to think, "OK, so either my card is getting wimpy, or it's because I changed something else." Yep, in this thread:

    http://forums.guru3d.com/showthread.php?t=238574

    I started to play with the card's internal fan speed settings, because a "cooler card is better," right? Well, with some custom settings in there it crashed after 2 minutes at only 68C. When I went back to the default fan settings, sure, the card went up to 80C after half an hour, but it just kept going like that bunny with the battery. :)

    I was debating about trying the shader control again, but then again...if it works fine at 621/1458/1000 (according to the hardware monitor), it's probably time to call it a day. As you've said above, I could start having more problems again.

    I'll respond separately in that other thread, but for now, time for more games.
     

  16. Deliverator

    Deliverator Guest

    Do try a shader clock speed of 1674 MHz (with some hardware/driver combinations, the next step of 1728 MHz is also stable).

    Every little bit helps, especially with all these new games that use loads of shader stuff. TimeShift's loading phases are 70% "shader processing"; everything else makes up the rest. Things can only get more complicated from here (the authors said "our shaders are as advanced as they come", and I can second that).

    Luckily, your card won't really need memory overclocking, thanks to the bandwidth you already have. The core speed is superb too, although you could keep pushing it a bit if the card lets you. But those pesky shaders... set them at max. :)
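    (Worth noting: the two stable values quoted here are exactly 54 MHz apart, 1728 − 1674 = 54, which is consistent with the shader clock only landing on discrete steps rather than arbitrary values; the next step up would be 1674 + 2 × 54 = 1782 MHz.)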
     
  17. musketeer7

    musketeer7 New Member

    Messages:
    1
    Likes Received:
    0
    GPU:
    XFX 8600 GT 512
    I did the things shown below, but I still can't see the shader overclocking option!
    I'm using the 162.18 driver. (I know it's a little old; is this the problem?)
    And another question:
    are the default core & memory clocks of this card OK?
    I'd read that the memory clock should be 700 for the 8600 GT.
    my system info:
    CPU: Core 2 Duo 6550
    Main: Asus P5K
    Graphic: XFX 8600 GT
    ram: 2x1 GB Ballistix CL4

    EDIT: here is the screenshot link:
    http://xs.to/xs.php?h=xs323&d=08012&f=rivascreen.jpg
     
    Last edited: Jan 1, 2008
  18. =Oz=

    =Oz= New Member

    Messages:
    3
    Likes Received:
    0
    GPU:
    256mb GEFORCE 7600GT
    Mate, yes, the problem is your driver. It clearly states in the link: "This new function is actually enabled within Vista drivers only in combo with NVIDIA ForceWare 163.67 and newer (163.71 works fine also)."
     
  19. TotalChaos

    TotalChaos New Member

    Messages:
    1
    Likes Received:
    0
    GPU:
    1GB Palit 8800GT
    Quick question regarding the shader clock maximum: is it limited to 1782 MHz max? When it's unlinked from the core clock, if I set it to, say, 1800, the hardware monitor reads 1782 MHz.
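    That 1782 reading lines up with the 54 MHz step spacing noted earlier in the thread (1674, 1728 and 1782 are all 54 MHz apart, and 33 × 54 = 1782). Whether 1782 is a hard cap or simply the nearest achievable step to your request, I can't say, but the behaviour is consistent with the driver snapping the requested value to a step. A tiny sketch of that rounding, where the 54 MHz step size is inferred only from the clocks reported in this thread, not from any specification:

    ```c
    /* Illustrative only: snap a requested shader clock to the nearest step.
     * The 54 MHz step size is inferred from the 1674/1728/1782 values above. */
    #include <stdio.h>

    static unsigned snap_shader_clock(unsigned requested_mhz, unsigned step_mhz)
    {
        /* round to the nearest multiple of the step */
        return ((requested_mhz + step_mhz / 2) / step_mhz) * step_mhz;
    }

    int main(void)
    {
        printf("%u -> %u MHz\n", 1800u, snap_shader_clock(1800u, 54u));  /* prints 1782 */
        return 0;
    }
    ```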
     
  20. boogieman

    boogieman Ancient Guru

    Messages:
    1,943
    Likes Received:
    19
    GPU:
    MSI GTX 1080X
    ...............
     
