Review: GeForce RTX 3080 Founders Edition

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Sep 16, 2020.

  1. BReal85

    BReal85 Master Guru

    Messages:
    433
    Likes Received:
    141
    GPU:
    Sapph RX 570 4G ITX
    I'm just curious about the opinions of those people who, in the old days, made fun of the 290X or 390X, or even the 480 or Vega cards, over their power consumption. What do they say now about the extra 90W the 3080 needs compared to the 2080, or about its efficiency increase over the 2080 versus the 1080's increase over the 980?

    And BTW, do I remember correctly that Huang said the 3080 = 2x the 2080?
     
  2. labidas

    labidas Master Guru

    Messages:
    273
    Likes Received:
    98
    GPU:
    290


    Fanboys & passion buyers never change.
    I'm in the same boat - the RTX 3080 is underwhelming - overpromised and underdelivered.
    I suspect AMD's answer will be faster and more efficient than an RTX 3080 (based on all the rumors). Only time will tell.
    The wrong thing right now would be to upgrade or buy these new cards before AMD comes out with theirs - faster or slower, it's a fact AMD's new cards will be very competitive, forcing a price reduction in nVidia's lineup.
    And I won't be buying any card. I'm perfectly happy with my old R9 290, having owned various cards from both companies.
     
    BReal85 and Fediuld like this.
  3. AuerX

    AuerX Active Member

    Messages:
    99
    Likes Received:
    38
    GPU:
    PNY RTX2070 OC
    Vegas were made fun of because they didn't perform AND used a ton of juice.
     
  4. illrigger

    illrigger Master Guru

    Messages:
    240
    Likes Received:
    76
    GPU:
    Gigabyte RTX 3080
    MSFS is basically broken at this point. It loads one CPU core to 100% 99% of the time, and that thread limits everything else happening in the game. Until they fix that issue, no amount of GPU or CPU power will make it run better.

    As for Witcher 3, the game is a DX11 title that came out 5.5 years ago. Just because CDPR did such an incredible job making the game that it still looks great in 2020 doesn't change the fact that it's a VERY old game that lacked proper optimization for hardware that was new 5 years ago, let alone the stuff available today.

    In contrast, the current engine that will run the enhanced edition of Witcher 3 and CP2077 is a completely modern, optimized DX12 platform with RT support baked in. You may as well be comparing them to DOOM rather than the OG Witcher 3.
     

  5. Jamethe80sman

    Jamethe80sman Member

    Messages:
    23
    Likes Received:
    2
    GPU:
    2060 Super
    I'll get one next April. Will my RX750m be good enough, or is a new PSU on that list as well?
     
  6. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    7,457
    Likes Received:
    483
    GPU:
    Sapphire 7970 Quadrobake
    This seems like an amazing release from Nvidia. Hopefully AMD has an answer.

    I'm not getting anything until I have benches from both.
     
    Valken, jura11, chispy and 3 others like this.
  7. alanm

    alanm Ancient Guru

    Messages:
    9,826
    Likes Received:
    2,003
    GPU:
    Asus 2080 Dual OC
    There is no equivalence. It's not the power draw in itself that matters, but how much performance you get for that power draw. That is referred to as efficiency, or perf per watt. The 290X was miserable at perf/watt. The 3080 is excellent at perf/watt. It draws a lot, but it also performs a lot.

    And re Huang saying 2x the 2080: he said "UP TO" twice the performance of the 2080 @ 4K, and there he is accurate. In fact, there are a couple of titles where it is more than 2x the 2080. The use of the "up to" phrase is a very common practice in marketing.
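    To make the arithmetic concrete, here is a minimal sketch (the fps and watt figures are made-up round numbers, not benchmark data): a card can draw more power and still be the more efficient one.

    [CODE]
    # Perf/watt sketch: the numbers below are purely illustrative
    # placeholders, not measured results.
    cards = {
        "Card A": {"fps": 60, "watts": 250},
        "Card B": {"fps": 120, "watts": 320},  # draws 70 W more
    }

    for name, c in cards.items():
        print(f"{name}: {c['fps'] / c['watts']:.3f} fps/W")

    # Card A: 0.240 fps/W; Card B: 0.375 fps/W -> ~56% more
    # performance per watt despite the higher absolute draw.
    [/CODE]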
     
    AuerX likes this.
  8. Elder III

    Elder III Ancient Guru

    Messages:
    3,692
    Likes Received:
    292
    GPU:
    Both Red and Green
    It definitely has nice 4K results.
    It also has high power consumption, but I personally don't really care about that as long as the cooling can keep up with it.
    The price going back to "normal" is very welcome too.

    I'm not so sure about the cooling design, since I really don't want to dump all of that heat into my CPU cooler... I'd love to see some results showing how much CPU temps increase with a high-end tall heatpipe CPU cooler and this card versus a similar-TGP video card with a traditional-style cooler.
    The biggest deal breaker for me is the 10GB of VRAM... that is what will drive me to either buy from AMD or wait for a 3080 (Ti?) with 20GB. For my personal use case, 10GB is already not enough, let alone giving me any room for future proofing. :(
     
  9. Francesco

    Francesco Active Member

    Messages:
    89
    Likes Received:
    33
    GPU:
    Sapphire 4870
    Now... most people don't care about power consumption. I care. For me, a performance improvement only counts when there is also an improvement in efficiency, and that is not the case here. If they had lowered the wattage a little compared to the 2080 Ti while raising performance by 30%, it would have been a good card; but for 90+ extra watts, this card should really deliver much more.

    300+ watts for a video card is really too much.
     
  10. illrigger

    illrigger Master Guru

    Messages:
    240
    Likes Received:
    76
    GPU:
    Gigabyte RTX 3080
    Oh, a FACT, is it? Did you get a private press briefing showing you the performance a month ahead of everyone else?

    I don't disagree that buying now is probably foolish, but to paraphrase a great man: "Putting faith in AMD's graphics division? That's a long wait for a train don't come." Or maybe: "Going to an AMD GPU launch event is nothing but a container full of tentacles and disappointment."
     

  11. theoneofgod

    theoneofgod Ancient Guru

    Messages:
    4,434
    Likes Received:
    189
    GPU:
    RX 580 8GB
    Yep, this is the way.
     
  12. BReal85

    BReal85 Master Guru

    Messages:
    433
    Likes Received:
    141
    GPU:
    Sapph RX 570 4G ITX
    IMO the 3080 is not excellent at perf/watt considering the node change.

    The 3080 costs exactly the same as the 2080. The only difference is that the Founders Edition model is $100 less, but the AIB models don't cost $100 less like they did with the 2080.

    In what sense didn't it perform? It was faster than the 1070 but consumed more power.
     
    Last edited: Sep 17, 2020
  13. alanm

    alanm Ancient Guru

    Messages:
    9,826
    Likes Received:
    2,003
    GPU:
    Asus 2080 Dual OC
  14. Lily GFX

    Lily GFX Active Member

    Messages:
    94
    Likes Received:
    83
    GPU:
    AORUS 3090 Master
    Thank you for the awesome review <3

    I hope AMD's next gen gives Nvidia cold feet and makes them release a 3080 Ti with 12GB sooner, at a nice price :>
    If not, I will just keep setting aside money for a 3090 until I can afford it, since I already use a 4K FreeSync 2 monitor at 120Hz.
     
  15. BReal85

    BReal85 Master Guru

    Messages:
    433
    Likes Received:
    141
    GPU:
    Sapph RX 570 4G ITX
    Please watch the video I linked. I set it to the part where Steve speaks about that. Plus, watch these 2 graphs:
    [IMG] (the 4K graph is the same, but 1440p is more appropriate to the power of the 980-1080)

    [IMG]

    And do not forget that the 1080 could be OC'd by about 13% or so, while the 3080 can only be OC'd by about 4%, so there is a 10 percentage point difference.

    Yes, there is about a 6% difference between Intel and AMD at 1440p, and 8-9% at FHD. But as far as I know, we are speaking of a 4K card, aren't we?
    Having an average performance upgrade doesn't mean the minimums are higher by that exact percentage. And Hilbert doesn't display minimums, so he could be getting the same minimum fps as the guys at Hot Hardware.
     
    Last edited: Sep 17, 2020

  16. alanm

    alanm Ancient Guru

    Messages:
    9,826
    Likes Received:
    2,003
    GPU:
    Asus 2080 Dual OC
    @BReal85 I don't think you understand perf/watt. So I'll leave you with whatever conclusions you find yourself content with. ;)
     
  17. RavenMaster

    RavenMaster Maha Guru

    Messages:
    1,196
    Likes Received:
    138
    GPU:
    1x RTX 3080 FE
    When will the AIB partner cards be reviewed?
     
  18. BReal85

    BReal85 Master Guru

    Messages:
    433
    Likes Received:
    141
    GPU:
    Sapph RX 570 4G ITX
    I understand it; you don't need to use emoticons. But you forgot to react to the 2 graphs or the video. The 1080 was 55% more efficient than the 980. The 3080 is 18% more efficient than the 1080. And both generations had a node change compared to their predecessors.
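    Taking those percentages at face value (they are the figures quoted above, not re-measured), the generational gains compound like this:

    [CODE]
    # Compounding the perf/watt gains quoted above; 55% and 18%
    # are this post's figures, taken at face value.
    gain_980_to_1080 = 1.55   # 1080 vs 980
    gain_1080_to_3080 = 1.18  # 3080 vs 1080

    total = gain_980_to_1080 * gain_1080_to_3080
    print(f"980 -> 3080: about x{total:.2f} perf/watt")  # ~x1.83
    [/CODE]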
     
  19. illrigger

    illrigger Master Guru

    Messages:
    240
    Likes Received:
    76
    GPU:
    Gigabyte RTX 3080
    Did YOU watch that video? Because there are big charts showing just how much more efficient it is, in both performance per watt and performance per dollar, than every other card. This can be seen at around 20:40 in. The 3080 gives about 7% more performance per watt than a 2080 Ti, 24% more than a 2080, and a whopping 39% more than a 1080 Ti. The point Steve makes at the end isn't about performance per watt, it's about the power draw in general, which is about the same as a 1080 Ti from 3 years ago in most cases. His lament is that the power use is so high, not that it is poorly used.

    Cherry-picking points in the video that only fit your narrative while leaving out the actual relative data is not cool, bud.
     
    alanm likes this.
  20. XenthorX

    XenthorX Ancient Guru

    Messages:
    3,472
    Likes Received:
    1,397
    GPU:
    3090 Gaming X Trio
    Same price as the 2080 two years ago, and to quote Hilbert's review:
    That's the SUPER version, and if you're looking at 4K benchmarks exclusively, you're leaning toward the biggest difference possible.

    BFV: 71.8%
    Metro Exodus: 71.9%
    RDR2: 77%

    Now if you check this video:


    It explains how Nvidia reached the twice-the-performance number, and actually states 1.9x:
    Long story short: standard raster rendering is faster, ray tracing is faster, DLSS is faster, and DLSS can now be computed concurrently with ray tracing instead of at the end of the frame, as before.
    Doing all that, you get to 1.9 times the performance, allegedly.

    Don't take their word for it? No problem - from Hilbert's review of BF5:
    Back to MLD:
    Clearly, I wouldn't expect a drama/rumors/undocumented YouTube channel to praise anything Nvidia releases.
    He doesn't want to go out of business, after all.

    That especially applies to "Moore's Law Is Dead", which brings absolutely zero extra value to what he's "reporting": there's no rationality, no data analysis, just small talk.

    Take the part about BF5 early in the video you linked: he states that Digital Foundry turned ray tracing off because of video RAM usage that would have put the 3080 at a disadvantage. That's a blanket statement that just doesn't hold up.

    First of all, the reasoning is logically wrong, which doesn't surprise me: turning ray tracing ON would have increased the FPS gap in favor of the 3080 and biased the results by mixing classical raster rendering performance (which the majority of games out there use) with the technological iteration from RT core gen 1 to gen 2.
    Second, he clearly lacks the rendering knowledge to distinguish memory used from memory allocated. You can write software that allocates 200GB of video memory and appears to use all the VRAM available; that has little to do with the memory actually used at a given moment. And apart from the rendering engineers of a given company, you'll be hard pressed to get any tangible information on the inner workings of a given renderer.
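    To see the allocated-vs-used distinction in action, here is a minimal CPU-side sketch (a loose analogy only, using anonymous mmap on a Unix system in place of VRAM): the reservation shows up immediately, but resident memory only grows with the pages you actually touch.

    [CODE]
    import mmap
    import resource  # Unix-only; ru_maxrss is KiB on Linux, bytes on macOS

    def peak_rss():
        return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss

    # "Allocate" 1 GiB of anonymous memory: the reservation exists now...
    buf = mmap.mmap(-1, 1024**3)
    print("peak RSS after allocating 1 GiB:", peak_rss())

    # ...but only the pages actually touched count as memory in use.
    for off in range(0, 64 * 1024**2, mmap.PAGESIZE):
        buf[off] = 0xFF  # touch one byte per page of the first 64 MiB
    print("peak RSS after touching 64 MiB:", peak_rss())
    [/CODE]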

    All ending in the beautiful conclusion that Digital Foundry was immoral and purely cherry-picking because Nvidia pays them.

    "Blanket statement based on logically wrong analysis" + "lack of rendering knowledge and logically wrong metric interpretation" doesn't equal truth.

    Watching this guy is a waste of time, and I already want my 10 minutes of lifetime back.
     
    Last edited: Sep 17, 2020
    pharma, AuerX and Lily GFX like this.
