The GTX 1080 thread

Discussion in 'Videocards - NVIDIA GeForce' started by bugsixx, May 7, 2016.

  1. I'm doing +230 with FE on air, so +260 on water is like a must.

    Anyway, EVERY card has coil whine, not just the GTX 1080; AMD or NVIDIA, it doesn't matter.

    The ability to hear it depends on 2 things:
    1. FPS: higher FPS causes much louder coil whine; it gets really loud above 500 FPS.
    2. Noise generated by your PC: most people don't hear it simply because their PC makes too much noise on its own (HDDs, fans, etc.).

    I've been using a fully watercooled system for 8 years now and I've never had a card that stays silent at high FPS. Cards do it, get over it.

    If you don't want to hear any coil whine, here is a solution:
    Play your games at 60 FPS using Adaptive VSync, G-Sync, Fast Sync or whatever sync you want.
     
    Last edited by a moderator: Jul 28, 2016
  2. Corbus

    Corbus Ancient Guru

    Messages:
    2,469
    Likes Received:
    75
    GPU:
    Moist 6900 XT
    I hate it when people say +n on the core, memory, etc. Just say the frequency you're actually running: different SKUs have different base clocks, +100 on the core with +0 voltage can behave differently than +100 with +x volts, and the power limit can influence final clocks as well. Lots of variables.

    OT: my card doesn't have any coil whine, but it is loud at high RPM. Of course at max it's going to be louder than the MSI Gaming, since smaller fans, even in greater number (+1, lol), have to run at higher RPM to keep up. Still, it's an awesome card, and I'm planning to watercool it sooner or later. I'll miss the nice-looking LEDs and the cooler when I do that, though.
     
    There is no final frequency with Boost 3.0. Everything depends on GPU temperature and the power draw of the scene currently being rendered, so +260 on an FE (Founders Edition) is all the information you actually need.

    Also, increasing the voltage offset doesn't do anything other than unlock higher voltage bins. To be exact, the maximum voltage bin of FE cards (and I would say of all 1080s without a modded BIOS) is 1.062V. Raising the voltage offset to maximum lets you reach the 1.075V, 1.081V and 1.093V bins, which are otherwise unreachable no matter the conditions.

    If you want to know the exact frequency at a given temperature, open EVGA Precision or MSI Afterburner, set the desired core offset and open the curve editor. The curve editor will show you all frequencies at their temperatures / voltages.

    From here it's just whatever you want... better cooling, better clocks, and vice versa. Meaning that even with +260MHz on the core your card can be running at 1867MHz once it reaches the temperature limit.

    So what do you want people to say? Frequency + temperature? Highest reachable frequency at low temperatures? Lowest frequency at maximum temperature? There is no such thing as an exact frequency with Boost 3.0. Yes, it sucks; it would be best if somebody modded the BIOS down to just 3 bins and be done with it...
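
    Something like this toy sketch captures the idea (the curve points and cut-off voltages are made-up illustration values, not the real 1080 table):

        # Toy model of a Boost-3-style voltage/frequency curve with a core offset.
        # All numbers are illustrative, not actual GTX 1080 bins.
        BASE_CURVE = {        # voltage (V) -> stock clock (MHz)
            0.800: 1607,
            0.900: 1733,
            1.000: 1886,
            1.062: 1911,
        }

        def offset_curve(curve, offset_mhz):
            """Shift every point of the curve up by the core offset (+X MHz)."""
            return {v: f + offset_mhz for v, f in curve.items()}

        def effective_clock(curve, max_voltage):
            """Highest clock available from the voltage bins the card is
            currently allowed to use (limited by temperature / power)."""
            return max(f for v, f in curve.items() if v <= max_voltage)

        oc = offset_curve(BASE_CURVE, 260)
        print(effective_clock(oc, 1.062))  # cool card, full bin available -> 2171
        print(effective_clock(oc, 0.900))  # forced down to a lower bin -> 1993

    The +260 shifts the whole curve; which point of it you actually sit on at any given moment is decided by temperature and power, which is why there is no single "final" frequency to quote.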
     
    Even if you have the best cooling in the world, you can't beat the power draw limit. As I said, you will never get a perfectly stable OC; your frequency will jump, and the more you push the card, the more it will jump. With 1.093V at 2152MHz and overclocked memory, it drops to 2101MHz just because of the power limit, even though the card stays below 45C the whole time. And I believe that 2101MHz is not even the worst-case scenario...
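
    As a back-of-the-envelope illustration of why that happens, assuming the usual rough scaling of dynamic power with V^2 * f (the reference point and wattages below are made-up numbers, not a real FE power table):

        # Rough dynamic-power scaling: P ~ V^2 * f, anchored to an assumed
        # reference point so the numbers land in a plausible range.
        def approx_power(voltage, clock_mhz, ref=(1.062, 1911, 180.0)):
            ref_v, ref_f, ref_w = ref
            return ref_w * (voltage / ref_v) ** 2 * (clock_mhz / ref_f)

        print(approx_power(1.062, 1911))  # 180 W, the assumed reference / limit
        print(approx_power(1.093, 2152))  # ~215 W, well over -> boost backs off
        print(approx_power(1.050, 2101))  # ~193 W, back near the limit after dropping bins

    The exact numbers don't matter; the point is that the top voltage bins push power up faster than the extra MHz are worth, so the power limiter pulls the clock back down no matter how cold the card runs.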
     

    Show me :) Show me a GTX 1080 that can run Fire Strike stable at 2152MHz with overclocked memory. I've never seen such a card, and GOD, I've tried :)

    And by stable I mean clocks that don't fluctuate. Just to be precise here.
     
  6. Corbus

    Corbus Ancient Guru

    Messages:
    2,469
    Likes Received:
    75
    GPU:
    Moist 6900 XT
    That was my point; no need to explain to me how it works. Saying +260MHz tells you what, exactly? Considering the rated boost of FE cards is 1733MHz, adding 260MHz gives 1993MHz, which is not what you actually run, is it?

    It's easier to just say the frequency you mostly see while under load, like an average. Under load you have a max of 2126MHz and a min of 2088MHz, so I guess 2.1GHz would be a good number to quote. Yes, it can vary in different scenarios, but it's easier to get an idea of what someone's overclock is than from just +x MHz.
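
    Side by side, the two ways of quoting the same overclock (using the numbers from this exchange):

        rated_boost = 1733            # FE rated boost clock, MHz
        offset = 260                  # the "+260" style of reporting
        print(rated_boost + offset)   # 1993 MHz -- not what the card actually runs

        lo, hi = 2088, 2126           # min / max observed under load
        print((lo + hi) / 2)          # 2107.0 -> "about 2.1 GHz", the other way to report it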
     
  7. Netherwind

    Netherwind Ancient Guru

    Messages:
    8,841
    Likes Received:
    2,417
    GPU:
    GB 4090 Gaming OC
    I know all of that, but the problem I have is that I hear coil whine at 60 fps, at 120 fps and at 144 fps... but also at 6500 fps (Heaven Benchmark credits screen). And that's a new thing for me that I haven't experienced before. And I obviously play with G-Sync enabled.
     
    The difference is this:
    +X MHz tells you about the overclockability of the GPU itself, because you also get higher clocks at the lower voltages (0.683V, 0.7V, ...), for example when playing at 60 FPS.

    What you are referring to is the maximum stable frequency of the whole system, including cooling, some average game load, ambient temperature, etc...

    I'm more interested in the card on its own, so I personally prefer +X MHz, or let's say the offset over the base clock, because it tells you what the curve looks like and you can easily extrapolate what you would get with your own cooling solution under your own conditions.
     
    OK, I must admit I can't hear any coil whine at 60 FPS. I can hear it at 150 FPS. I think it would bother me as well if I could hear it at 60 FPS. Here I can only say that either you have better-than-perfect ears or your card is behaving kind of weird :)
     
  10. Solfaur

    Solfaur Ancient Guru

    Messages:
    8,014
    Likes Received:
    1,534
    GPU:
    GB 3080Ti Gaming OC
    Agreed. If I have to post my card's core clock, I simply say 2036-2063MHz, since it never drops under 2036.

    All custom AIB cards have different out-of-the-box clocks, they vary even from one card to another (same model), and none of them stick exactly to the spec they get from the manufacturer...
     

  11. TyrantofJustice

    TyrantofJustice Ancient Guru

    Messages:
    5,011
    Likes Received:
    33
    GPU:
    RTX 4080
    On my 1080 I can't hear any coil whine at all, even when something is running at 1000 fps on screen.
     
  12. sajibjoarder

    sajibjoarder Master Guru

    Messages:
    607
    Likes Received:
    1
    GPU:
    gtx 1080 FE
    To all: I don't think there will be a 1080 Ti coming.
    Here is my reasoning.
    First, the 780 Ti: NVIDIA launched the 780, which had 2304 CUDA cores, and AMD was really struggling at that point to keep up with NVIDIA, so NVIDIA decided to milk users by putting the Titan on the market. Then, unexpectedly, AMD came out with the 290X, which was a little faster than or on par with the Titan. NVIDIA had to put out something that could kill AMD's 290X, but they were too busy with the mobile platform. Kepler was a huge chip and only part of it was served to the gaming market; they wanted to keep the fully unlocked chip for the enterprise market. Since they had already lost the PS4 and Xbox One, they were desperate to keep the PC in their hands, so they launched the 780 Ti, which was more powerful than the Titan, and later the Titan Black.

    Now the 980 Ti: the 780 Ti was not really more powerful than the 290X, but the GameWorks trickery and NVIDIA's continuously good relationship with developers helped a lot to make the 780 Ti look like the better product. Still, many people like me bought the 290X over the 780 Ti because of the 4GB of RAM, and we all know the 290X/390X is still a very competitive GPU, whereas the 780 Ti lost its shine long ago. So NVIDIA had to put out something to fight the mighty 290X, and that was the 980, which was slightly faster than or on par with the 290X and 780 Ti depending on the game. Again AMD was slow to put out any new product, so NVIDIA enjoyed the time and kept the best chip in their hands; the GM200 Titan X was launched just as an incremental GPU, and the 980 Ti came later because AMD was really late. NVIDIA was waiting to see how AMD's new card would perform: they launched the 980 Ti on June 2, 2015, whereas AMD launched the Fury X on June 24, 2015. I think NVIDIA knew the capability of AMD's new card and knew it wouldn't be able to beat the 980 Ti, so maybe they launched it just as a competitor to the new Fury X. The 980 Ti was the superior card performance-wise, and the Fury X's 4GB of RAM was a big drawback, so the 980 Ti took the point.

    A 1080 Ti? That might come if and only if AMD can put out something to beat the 1080, and AMD can't do that right now. NVIDIA did something very smart this time: they chose GDDR5X over HBM2, while AMD chose HBM2 for their card, and that's hard to redesign. Before early 2017 AMD won't be able to bring out any new card because of the low supply of HBM2 memory, and by that time I think Volta will be ready, so NVIDIA will fight AMD's Vega with Volta, not Pascal. NVIDIA can read the business way better than AMD, and Jen-Hsun Huang is a better businessman than the geek guru Raja Koduri :)

    That's my personal opinion :) Thanks.
     
  13. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,107
    Likes Received:
    2,611
    GPU:
    3080TI iChill Black
    NV would have easily released the 780 Ti back then, but they couldn't; they had wafer issues with the GK100 chips. That's why there was no Quadro or Tesla GK110 at first either.


    You will see a 1080 Ti very soon, IMO sooner than expected. End of 2016 or very early 2017 for sure.
     
  14. sajibjoarder

    sajibjoarder Master Guru

    Messages:
    607
    Likes Received:
    1
    GPU:
    gtx 1080 FE
    Early 2017, there's a chance... but I don't think they will bring anything out before AMD poses a threat to them :)
     
  15. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,107
    Likes Received:
    2,611
    GPU:
    3080TI iChill Black


    That's true, big Vega could force NV.

    lol, unless AMD decides to play it "safe" and releases a smaller Vega first in Q4 2016... :nerd:
     

  16. TyrantofJustice

    TyrantofJustice Ancient Guru

    Messages:
    5,011
    Likes Received:
    33
    GPU:
    RTX 4080
    I opened a thread about this, but I'll ask here... I have a new Gigabyte GTX 1080 G1 Gaming, and in Hilbert's review at 1440p he's in the 80s for framerate. I have everything on Ultra with HairWorks on and 4x AA, and I stay in the low 60s, just barely hitting 70 fps. Is there something wrong?
     
  17. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,107
    Likes Received:
    2,611
    GPU:
    3080TI iChill Black
    Last edited: Jul 30, 2016
  18. vase

    vase Guest

    Messages:
    1,652
    Likes Received:
    2
    GPU:
    -
    -Tj-, do you own The Witcher? If you have a minute at some point, can you tell me the FPS difference with HairWorks ON vs. OFF at high/ultra settings, in a quick comparison on the recent driver you use?

    I just tested it; I've never played with HairWorks ON, but with the newest Crimson driver and mostly ultra settings at 1080p, inside a city with around 5 people on screen, it's a 30 FPS difference with HairWorks on, going from 95 down to 65. Just tested it in White Orchard.

    It would be interesting to see if the impact is reduced with the latest ForceWare.
     
  19. TeX_UK

    TeX_UK Guest

    Messages:
    1,951
    Likes Received:
    2
    GPU:
    EVGA 1080 FE
    The test system is quite different from your rig as well, which would account for the extra fps ;).

    http://www.guru3d.com/articles_pages/gigabyte_geforce_gtx_1080_g1_gaming_review,12.html
     
  20. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,107
    Likes Received:
    2,611
    GPU:
    3080TI iChill Black

    No, I don't have it anymore; I only installed it once to see how it runs on this factory-OC 980 Ti.

    From what I remember, HairWorks MSAA took the biggest hit, anything over 2x MSAA. It also looked the ugliest with 2x MSAA, lol... A slightly lower hair tessellation factor helped as well and kept it above 60 fps in any scenario at DSR 1440p.
     
