GTX 970 Overclock results

Discussion in 'Videocards - NVIDIA GeForce' started by deadpool790, Oct 10, 2014.

  1. Scouty

    Scouty Guest

    Messages:
    81
    Likes Received:
    0
    GPU:
    GTX 970 OC WiNDFORCE 3X
    It's normal... GPU-Z shows the base frequency, MSI Afterburner shows the DDR frequency (remember, DDR stands for Double Data Rate). That's why in CPU-Z a 1333MHz memory kit shows as 667MHz. Open NVIDIA Inspector; I think it uses the same convention as MSI AB. It's just a different way of showing the same clock. :)

    PS: it's not exactly like this, but I wrote it that way so you understand the theory.
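    To put the doubling into numbers, here's a minimal Python sketch of the idea (values are just illustrative; GDDR5 actually transfers data twice more on top of the doubled figure, which is why the 970's memory is usually quoted as ~7000MHz effective):

    ```python
    # Rough sketch of the "same clock, different multiplier" idea above.
    # Numbers are illustrative, not tool-exact.

    def effective_rate(base_mhz, transfers_per_clock):
        """Data rate = command clock x transfers per clock."""
        return base_mhz * transfers_per_clock

    print(effective_rate(667, 2))   # CPU-Z shows ~667MHz -> DDR3-1333
    print(effective_rate(1753, 2))  # GPU-Z shows 1753MHz -> ~3506MHz in Afterburner
    ```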
     
  2. Riffmaster

    Riffmaster Guest

    Messages:
    103
    Likes Received:
    0
    GPU:
    MSI 580 Lightning Extreme

    msi 970 gaming 4g

    stock voltage
    core: +230
    mems: +300
    power: 110%
    fan: default

    Boosts to 1520MHz. Three consecutive Heaven 4.0 runs; temps max out at 72°C.
     
  3. MrH

    MrH Guest

    Messages:
    2,812
    Likes Received:
    14
    GPU:
    RTX 3080 FE
    Can someone explain why the boost clock in game is higher than what GPU-Z shows?
     
  4. xaudiox

    xaudiox Member Guru

    Messages:
    193
    Likes Received:
    2
    GPU:
    Asus 3080 Tuff OC
    The clock shown in GPU-Z is your minimum boost clock; cards vary in their maximum boost clock, mine is 1354MHz.
    NVIDIA Inspector will show your estimated max boost clock.
     
    Last edited: Dec 30, 2014

  5. MrH

    MrH Guest

    Messages:
    2,812
    Likes Received:
    14
    GPU:
    RTX 3080 FE
    OK, thanks. I'm working on my OC now. The results I got yesterday were rock solid in benchmarks but not in games, and that's what was causing my driver crashes. Currently at +170/+300, which seems stable; I'm going up by +10 until it isn't. +195 wasn't, so I'm nearly at my sweet spot for the core.
     
  6. UZ7

    UZ7 Ancient Guru

    Messages:
    5,539
    Likes Received:
    75
    GPU:
    nVidia RTX 4080 FE
    That's Boost 2.0 for you. If you increase the power limit or voltage, it will try to clock higher within safe ranges, and if it starts hitting higher temps it will clock back down.

    Example of this is my default is:
    1140/1753 Boost 1279
    +110%/+160MHz/+400MHz gives me:
    1300/1953 Boost 1439

    Yet in game (Far Cry 4) I get 1501MHz @ 1.2V. If I increase the voltage, I think it bumps me to 1514MHz (without touching the core offset), but if the temps go above ~65°C it drops back to 1500MHz, so that's Boost 2.0 at work.
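    If it helps to see the arithmetic, here's a small sketch of how those offsets appear to map onto the reported clocks. One assumption: the Afterburner memory offset is applied to the doubled clock, so only half of it shows up against the base figure (which is how +400 turns 1753 into 1953):

    ```python
    # Sketch of how the slider offsets line up with the clocks above (MHz).
    # Assumption: the +400 memory offset applies to the doubled clock, so it
    # lands as +200 on the base figure.

    base_core, base_mem, rated_boost = 1140, 1753, 1279  # stock, as reported
    core_offset, mem_offset = 160, 400                   # Afterburner offsets

    print(base_core + core_offset)     # 1300
    print(base_mem + mem_offset // 2)  # 1953
    print(rated_boost + core_offset)   # 1439
    # Boost 2.0 then goes above 1439 on its own (e.g. ~1501MHz in game) depending
    # on temperature and power headroom, which a simple sum can't predict.
    ```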
     
    Last edited: Dec 30, 2014
  7. MrH

    MrH Guest

    Messages:
    2,812
    Likes Received:
    14
    GPU:
    RTX 3080 FE
    After a day of trying all combos, +140/+500 seems to be the sweet spot. It gave me a 61.3fps minimum in the Shadow of Mordor benchmark and keeps the temps and fan speed low. There was a fine line between too high and too low, with both giving lower performance than I wanted.
     
    Last edited: Dec 30, 2014
  8. spidermind79

    spidermind79 Active Member

    Messages:
    52
    Likes Received:
    1
    GPU:
    Pny 4090 XLR8
    Hey, I've tried to OC the core of my 970 G1 without touching the voltage and I reached +150 stable; at +170 I got a driver error in Fire Strike. I noticed in GPU-Z that when I launch some games the core frequency, instead of 1329+150=1479, is at 1529. I think that's Boost 2.0, right? For now I haven't tried to push the memory (it's Hynix), because I read that the major boost comes from the core and memory only gives more bandwidth. But does more FPS come only from the core, or is +300 or +400 on the memory also good for FPS?
     
  9. UZ7

    UZ7 Ancient Guru

    Messages:
    5,539
    Likes Received:
    75
    GPU:
    nVidia RTX 4080 FE
    Yeah, boost will try to give you more within your ranges, but it may go back down if temps start to rise or if the power limit takes priority. For Hynix RAM you'll just have to test; some people aren't able to clock it high while others can. +300 or +400 should be good. Some people set it to +500 and it works fine for them. I personally do +400 for gaming and +625 for benchmarks, so a RAM overclock helps, but it doesn't have as drastic an impact on gaming FPS as the core does.
     
  10. Redemption80

    Redemption80 Guest

    Messages:
    18,491
    Likes Received:
    267
    GPU:
    GALAX 970/ASUS 970
    Haven't even started overclocking yet due to the time of year.

    It's strange compared to what I'm used to, but I'm sure I'll catch up. I've noticed that even at "stock" it's hit 70°C in some benchmarks, so...
     

  11. spidermind79

    spidermind79 Active Member

    Messages:
    52
    Likes Received:
    1
    GPU:
    Pny 4090 XLR8
    With +150 on the core I saw +4 FPS in CoH2 and DAI (both +4 FPS in their benchmarks). If I try to push the RAM to +300 or +400, how much can I gain? 1-2 FPS or more? I'm thinking the real boost comes from the core, and more bandwidth is only for benchmarks, not gaming. Am I wrong?
     
  12. UZ7

    UZ7 Ancient Guru

    Messages:
    5,539
    Likes Received:
    75
    GPU:
    nVidia RTX 4080 FE
    Just like with regular CPU and RAM overclocks, you'll see the biggest performance boost in whatever uses that component the most. A CPU overclock is more noticeable since everything relies on the CPU, while a RAM overclock isn't really visible in everyday use and shows up more in programs that load and unload a lot of data (3D rendering, encoding, photo work, etc.). The same concept applies to the GPU and VRAM: you may see a 5 to 15+ FPS boost from the GPU clock but only around 1-3 FPS from VRAM in gaming, and you may not really "see" the change even though it adds to the overall improvement. Say +7 FPS from the GPU plus +3 FPS from the RAM = +10 FPS overall; which would you rather have, a 7 FPS boost or a 10 FPS boost? :p

    There are times when the performance boost isn't worth the added heat/voltage; for example, regular RAM running at 1.5V (2133MHz) vs 1.60-1.65V (2400MHz) isn't worth it for some people. Or, in my case, I'd rather have a 1500MHz core clock @ 1.2V than 1550MHz @ 1.25V on the GPU.

    So overall, yes, you won't really see much improvement from video RAM overclocking, but it's something extra you can squeeze out, and you most likely won't be able to tell the difference unless you run a benchmark, which is just a way of collecting data and comparing results. If a game has a built-in benchmark, it roughly simulates what to expect in game and runs the same loop every time, so you can test different speeds/settings/tweaks against each other.

    Here's a quick Heaven comparison I just ran to show the difference.
    [Heaven benchmark result screenshots]

    For this I left the voltage at stock (+0mV), set +110% power limit and +160 core, and stepped the RAM from +0 to +500, running the benchmark six times.

    *Note: this is just a quick set of consecutive runs (lots of room for error), but you can see something of a performance difference. For a more accurate result, running multiple tests per setting and averaging them out would be advised.*
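    For anyone who wants to do that averaging, a tiny sketch (the FPS values are made-up placeholders, not the Heaven results above):

    ```python
    # Minimal sketch of averaging repeated benchmark runs per memory setting.
    from statistics import mean, stdev

    runs = {
        "+0 mem":   [61.2, 60.8, 61.5],   # placeholder FPS values
        "+500 mem": [62.9, 63.1, 62.7],
    }

    for setting, fps in runs.items():
        print(f"{setting}: avg {mean(fps):.1f} fps (spread +/- {stdev(fps):.1f})")
    ```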

    TL;DR: it's entirely up to you whether you overclock the RAM. I just do it because I can, lol; some people are just fine with stock.
     
    Last edited: Dec 31, 2014
  13. Scouty

    Scouty Guest

    Messages:
    81
    Likes Received:
    0
    GPU:
    GTX 970 OC WiNDFORCE 3X
    Nice info, bro, thanks for sharing... so the GPU clock is what matters in most cases, same rule as for a PC: a processor OC beats a DDR3 overclock. =)
     
  14. Vipu2

    Vipu2 Guest

    Messages:
    553
    Likes Received:
    8
    GPU:
    1070ti
    Bit OT, but how do I enable that "disable desktop composition" option in Windows 7 so that 3DMark doesn't "crash" after 20 seconds because Windows wants to change the theme?

    I've googled it and applied that option to every 3DMark .exe I have, but it doesn't work.
    Extra info: 3DMark is on Steam.
     
  15. spidermind79

    spidermind79 Active Member

    Messages:
    52
    Likes Received:
    1
    GPU:
    Pny 4090 XLR8
    Thanks a lot for the bench and the explanation, really appreciate it all :)
     

  16. MrH

    MrH Guest

    Messages:
    2,812
    Likes Received:
    14
    GPU:
    RTX 3080 FE
    For me, overclocking is about improving minimum FPS; with +140/+500 I went from a 53fps minimum to a 61.3fps minimum in Shadow of Mordor. A locked 60fps is the most important thing to me; I'm hyper-sensitive to frame rate drops and nothing ruins a game more for me. I had the same results in Grim Dawn: I used to drop to 50fps every time I AoE'd a large pack, whereas now I'm at a locked 60fps and it's just so much more enjoyable.

    I will say overclocking my 970 was a pain because benchmark-stable wasn't game-stable, so it took a lot longer to find my sweet spot; usually if a card can handle FurMark with 8x MSAA it'll be game stable, but that wasn't the case this time. I found Grim Dawn to be very sensitive to an unstable overclock, crashing really fast, so that's what I used, slowly reducing the core by -10 and playing until it was stable. That said, the cooling on the MSI is incredible: I stay under 60°C under load while overclocked in Dragon Age: Inquisition and Shadow of Mordor, and around 45°C in Grim Dawn. Overall I'm so glad I went for the 4G.
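    (As a side note, that "drop the core by 10 and play until it stops crashing" routine is just a simple step-down search; here's a toy sketch where stable_in_game is a hypothetical stand-in for hours of actual play-testing:)

    ```python
    # Toy sketch of the "reduce the core offset until it's game-stable" approach.
    # stable_in_game is a hypothetical placeholder for real play-testing.

    def find_stable_offset(start, step=10, stable_in_game=lambda off: off <= 140):
        offset = start
        while offset > 0 and not stable_in_game(offset):
            offset -= step
        return offset

    print(find_stable_offset(170))  # 140 with the placeholder check above
    ```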
     
  17. Agent-A01

    Agent-A01 Ancient Guru

    Messages:
    11,640
    Likes Received:
    1,143
    GPU:
    4090 FE H20
    Well, there's your problem. Never use FurMark; cards treat it as a power virus.
     
  18. Extraordinary

    Extraordinary Guest

    Messages:
    19,558
    Likes Received:
    1,638
    GPU:
    ROG Strix 1080 OC
    I've read people saying that FurMark can physically damage GPUs. Dunno if that's true, but I stopped recommending it after that.
     
  19. CPC_RedDawn

    CPC_RedDawn Ancient Guru

    Messages:
    10,486
    Likes Received:
    3,178
    GPU:
    PNY RTX4090
    Anyone want to help me with CoD: AW?

    For some reason my 2 cards in SLI don't like to run correctly.

    They boost to 1329MHz core / 3506MHz memory (this is their stock boost, no overclocking).

    In CoD: AW, GPU1 will downclock to its base clock of 1114MHz, but the second GPU will stay at 1329MHz.

    I also get a nice 5fps drop every now and again on multiplayer maps too.

    I found that ALT+TABbing out of the game and back in again SOMETIMES brings the first GPU back up to full boost clocks, but most of the time it will just downclock back to base clocks while the second GPU stays at full boost.

    This is really weird. No other game does this. Sure some other games will downclock to base clocks when at menus and stuff.

    I do have "Prefer maximum performance" set in the NVIDIA Control Panel, the monitor mode set to "Single display performance mode", and SLI set to "NVIDIA recommended", all in the global settings.

    Any ideas?

    Temps are perfectly fine, by the way; the top card has only reached 64°C once and the bottom card runs around 10°C lower.
     
  20. Goldie

    Goldie Guest

    Messages:
    533
    Likes Received:
    0
    GPU:
    evga 760 4gb sli
    Have you checked PC Gaming Wiki's CoD: AW page?
    Just a suggestion, as 'NVIDIA recommended' is pretty much all you can do about SLI scaling in the vast majority of games.
     
