Review: GeForce RTX 3080 Founders Edition

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Sep 16, 2020.

  1. illrigger

    illrigger Master Guru

    Messages:
    340
    Likes Received:
    120
    GPU:
    Gigabyte RTX 3080
    I dunno man. It uses almost exactly the same amount of power as my 1080Ti, but gives about twice the performance. So no change to my power bill, but a massive change to the enjoyment of my hobby. That's a win in my book.
     
  2. BReal85

    BReal85 Master Guru

    Messages:
    487
    Likes Received:
    180
    GPU:
    Sapph RX 570 4G ITX
Mate, please, check the two graphs I linked. Both the 1080 and the 3080 had a node shrink compared to their predecessors. The 1080 brought a 55% efficiency upgrade, while the 3080 brings 17%. Isn't that a massive difference? Plus, from the video:

    "Here we're seeing that the RTX 3080 does offer the most performance per Watt so you can't deny its efficiency, though having said, you'd probably be hoping for more than an 8% improvement over Turing from a node shrink."

    Hm? :) I'm just saying that the efficiency gain over the previous gen, taking the node shrink into account, is very low. For example, check the RX 480's efficiency gain compared to the 300 series (380/X or 390/390X, whichever you like), and remember how people laughed at its efficiency and power consumption. I can remember. :)
     
    Last edited: Sep 17, 2020
  3. jbscotchman

    jbscotchman Guest

    Messages:
    5,871
    Likes Received:
    4,765
    GPU:
    MSI 1660 Ti Ventus
    After reading/watching many reviews, this is basically a cross-gen GPU, just like we've had so many times in the past. With the PS5 and XSEX releasing soon, the 10 GB of VRAM is gonna be an issue once real next-gen games start releasing. And with Sony pretty much releasing all their exclusives on PC in the future, it's gonna be a concern. It probably won't have an impact until two years from now, but if you're gonna invest $700 (a lot more when the card is actually available), you might wanna hold off. That's just my opinion.
     
    Valken and Lily GFX like this.
  4. southamptonfc

    southamptonfc Ancient Guru

    Messages:
    2,628
    Likes Received:
    660
    GPU:
    Zotac 4090 OC
    I might even underclock mine, at least in most scenarios. A 10% clock reduction would get it close to a 2080 Ti in power consumption while still being a decent performance improvement. If this time round they let us drop the vcore on this series a little, it would be even better.

    I'm running 2560x1080 166hz btw.
     

  5. BReal85

    BReal85 Master Guru

    Messages:
    487
    Likes Received:
    180
    GPU:
    Sapph RX 570 4G ITX
    TBH, I love Digital Foundry's analysis of how different games perform on different platforms, but the BF5 and Doom gains are nowhere near as big as shown in the video. And as I interpret Richard's lines, he speaks as if an 80% upgrade were true for the 3080 compared to the 2080, when it is in fact around 66% on average.
     
  6. jbscotchman

    jbscotchman Guest

    Messages:
    5,871
    Likes Received:
    4,765
    GPU:
    MSI 1660 Ti Ventus
    Underclocking and undervolting? That is bizarre.
     
  7. theoneofgod

    theoneofgod Ancient Guru

    Messages:
    4,677
    Likes Received:
    287
    GPU:
    RX 580 8GB
    Cross gen entirely because of the VRAM? I don't buy it.
     
  8. BReal85

    BReal85 Master Guru

    Messages:
    487
    Likes Received:
    180
    GPU:
    Sapph RX 570 4G ITX
    That's funny to read as we can remember how some reacted to that aspect regarding Vega for example. :)
     
  9. jbscotchman

    jbscotchman Guest

    Messages:
    5,871
    Likes Received:
    4,765
    GPU:
    MSI 1660 Ti Ventus
    Have you been gaming for the last 20 years? The original Xbox, 360, PS3, Xbone, and PS4 destroyed GPUs with half the memory.
     
  10. southamptonfc

    southamptonfc Ancient Guru

    Messages:
    2,628
    Likes Received:
    660
    GPU:
    Zotac 4090 OC
    Most of the games I play are either CPU-bound in a lot of scenarios or can manage nearly (or actually) 166 fps with a 2080 Super. Being able to reduce the vcore especially, and save 50 W+, would be nice in terms of heat and efficiency.

    For more demanding games, I'd overclock and overvolt if the option is available :) Every fps counts..
     

  11. jbscotchman

    jbscotchman Guest

    Messages:
    5,871
    Likes Received:
    4,765
    GPU:
    MSI 1660 Ti Ventus
    Well, I understand heat and efficiency are very important. But running under the manufacturer's rating seems a bit much.
     
  12. theoneofgod

    theoneofgod Ancient Guru

    Messages:
    4,677
    Likes Received:
    287
    GPU:
    RX 580 8GB
    I played my first game today, I still don't buy it.
     
  13. illrigger

    illrigger Master Guru

    Messages:
    340
    Likes Received:
    120
    GPU:
    Gigabyte RTX 3080
    Then you should go and buy a 1080 instead of a 980Ti, I guess? How is that in any way relevant here?

    In the end, there are still big gains in PPW made here, and you are splitting hairs about how things were better 4 years ago (when they just took Maxwell's CUDA cores, shrunk them, and added more)? Did you take into account that Ampere has much more complicated CUDA cores/SMs and basically two other entire processors that Pascal and Maxwell didn't, and that maybe you shouldn't compare them as if the design process was in any way comparable? And they STILL managed double the performance per watt that Pascal has?
     
    alanm likes this.
  14. jbscotchman

    jbscotchman Guest

    Messages:
    5,871
    Likes Received:
    4,765
    GPU:
    MSI 1660 Ti Ventus
    Haha. Now that the new consoles have high-end hardware, it's gonna make things twice as hard for gaming PCs.
     
  15. JoeyR

    JoeyR Member

    Messages:
    18
    Likes Received:
    8
    GPU:
    2x eVGA GTX 980Ti's
    I was very skeptical of Nvidia on this release, but man, that's a freaking fast card, even with early drivers! Personally, I'm very impressed. There was definitely some major CPU bottlenecking going on with some titles, especially at 1080p, maybe even some at 1440. This makes me believe there will likely not be a single CPU on the market that will be able to fully push the 3090 unless running 4K or higher. I am all for the underdog, and god bless Lisa Su, but I really don't think we'll be seeing anything in this performance category from AMD anytime even remotely soon (I do hope I am wrong, however). $700 is still a large pill to swallow, and Nvidia, with all their naming-scheme shenanigans the last few years, has permanently left a bad taste in my mouth. But in this case, the 3080 is indeed blazing fast... faster than the 2080 Ti, which for a "flagship" Ti card shouldn't have been more than $750 to start with. But hey, when AMD is so far behind the curve, the market is yours to do with as you wish. Awesome write-up, Hilbert! Thank you as always!
     

  16. BReal85

    BReal85 Master Guru

    Messages:
    487
    Likes Received:
    180
    GPU:
    Sapph RX 570 4G ITX
    If you play games that you can already run at 166 fps with your 2080S, why would you buy the 3080?

    Because we are speaking of efficiency, and the efficiency gain between the 980 and the 1080 was triple the amount of the 2080-to-3080 gain. Even the 1080-to-2080 change was only 7% less than the 2080-to-3080 change, which came with a node change.
    [image: performance-per-watt comparison chart]

    The 3080 is 37% more efficient than the 1080. You claimed it is 100%.
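    For what it's worth, the percentages being argued over fall out of simple performance-per-watt arithmetic. A quick sketch (180 W and 320 W are the official TDPs of the GTX 1080 and RTX 3080; the ~2.44x performance ratio is an assumed round number for illustration, not a measured value):

    ```python
    def efficiency_gain(old_perf, old_watts, new_perf, new_watts):
        """Percent improvement in performance per watt between two cards."""
        old_ppw = old_perf / old_watts
        new_ppw = new_perf / new_watts
        return (new_ppw / old_ppw - 1) * 100

    # GTX 1080 (180 W TDP) as a baseline of 100; RTX 3080 (320 W TDP) assumed
    # ~2.44x faster. A 144% raw performance jump shrinks to a ~37% perf/W gain
    # once the higher power draw is factored in.
    print(f"{efficiency_gain(100, 180, 244, 320):.0f}%")  # → 37%
    ```

    The point being: "twice the performance at the same power" and "twice the performance per watt" are only the same claim when the wattage actually stays the same.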
     
    Last edited: Sep 17, 2020
  17. Maddness

    Maddness Ancient Guru

    Messages:
    2,440
    Likes Received:
    1,739
    GPU:
    3080 Aorus Xtreme
    Some really nice gains over the 2080 and below. It would absolutely stomp my RX 480. Still, I must wait for the RDNA2 release. This card would fit my needs nicely, though.
     
  18. ChicagoDave

    ChicagoDave Guest

    Messages:
    46
    Likes Received:
    2
    GPU:
    EVGA 1060 / EVGA 970
    Thank you for the in-depth review as always Hilbert. Especially appreciate the FLIR photos.

    Overall this made things harder for me to decide... I need to build a new computer (still on an i7-4770 and SATA SSDs on the main gaming rig) and get a new GPU (current is a 1060 6 GB or 970 4 GB). For the CPU I'm pretty much set on the upcoming Ryzen 3 for both performance (SC and MC) and PCIe 4.0 availability. On the GPU side, though... I've never bought an x80-series card, as the 60s and 70s were way more affordable and still packed a punch. But the sheer power of this card over the 1080, 2080, or even 2080 Ti is astounding. It's almost 4x the FPS/scores compared with my 1060, nearly 3x the scores of 1070s, and it bests the previous $1,200+ card by ~25-50%... crazy. The solid, reliable 60+ fps 4K across the board was a bit unexpected for me; I figured the 3090 and future 3080 Ti would be that card, but not the regular 3080. I run a 3440x1440 monitor on my gaming rig, so if a card can reliably hit 60 fps at 4K it should be perfect for the slightly less demanding WQHD. I'm not sure the 3070 will be able to do the same, but I guess I'll wait a few more weeks. Also hoping AMD has some competitive cards in the 3060-3080 performance bucket... I don't really care about bleeding-edge $1000 cards, but obviously they inform us of what's in store for the future.

    I would likely undervolt the one that may go into my HTPC on my TV stand. It's in a nice-looking horizontal ATX-sized case with a bunch of fans, but at the end of the day I don't want ~400-450 watts of heat exhausting out of that box. If I can dial the power down by 50-60 watts while still retaining 90-95% of the performance, I'm more than happy with that. It will all depend on my specific location/setup: how loud it gets, how hot the GPU and other components get, etc. It may be fine without any modifications, but I'd certainly consider it if the performance drop-off isn't bad. Plenty of people did that with R9 290s back in the day, and sometimes they'd even run FASTER with a lower voltage, since they didn't throttle as much from hitting the temp limit every 3 seconds.
     
  19. ChicagoDave

    ChicagoDave Guest

    Messages:
    46
    Likes Received:
    2
    GPU:
    EVGA 1060 / EVGA 970
    Dude I think you're the only person on this forum that cares about the difference in efficiency gained going from the 980 to the 1080 vs the 2080 to 3080.

    Any discussion of efficiency gains should compare this card to prior models; as you have pointed out yourself, there are node size changes, architectural changes, memory changes, etc. There are like 30 variables, so comparing the change from A to B against the change from C to D isn't really a worthwhile exercise. Plus, the TDP (or whatever figure they're using) is just that, a figure, one made up by the manufacturer. We now deal with Speed Boost/Turbo/whatever, so while the max TDP is one number, the typical draw *may* be much less. One GPU may be rated at 350 watts but only draw 250 W under 90% of conditions (making up numbers here), while another card could be rated at 300 W and regularly pull 290 W. At the end of the day, look at the card's total performance and performance per watt. Total performance tells us how strong the card is, and perf/W gives a rough indication of work efficiency. It's not perfect, but I don't see how generational comparisons really change your opinion of today's top card. Some card in the past may have blown the doors off its predecessor, and another was barely an improvement. Neither of those tidbits should really drive your opinion of the current card, given how different nearly every aspect of the 3080 is from the 980.

    Also, on the topic of node shrinks and power consumption: I think we're at the point where we should no longer expect big (or any) power improvements from a node shrink. We're down to single-digit-nm nodes now; resistance (and heat) is going to start skyrocketing, and it's unrealistic to think things will continue to get more efficient JUST because the node shrunk. We ain't on 65 nm or 45 nm anymore... 8 nm (improved 10 nm) is really running into the silicon wall.
     
    Last edited: Sep 17, 2020
    alanm likes this.
  20. Lebon30

    Lebon30 Member Guru

    Messages:
    178
    Likes Received:
    117
    GPU:
    EVGA RTX 3070Ti 8GB
    SpajdrEX likes this.
