Rumor: GeForce RTX 4080 Gets 420 W TDP

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jun 7, 2022.

  1. cucaulay malkin

    cucaulay malkin Ancient Guru

    Messages:
    9,236
    Likes Received:
    5,208
    GPU:
    AD102/Navi21

[attached image: RX 6900 X.jpg]


what they don't realize is that it's going to make the same difference in electricity cost or heat production in just 4 hrs a day as 30 W does during 2 hours of gaming.
if you have the PC on for just 6-10 hrs, working and watching YT, the difference will be the same as 3-5 hours of gaming on a card that's 30 W more power hungry in games.
look at the 6900 XT vs the 3090: a 26 W difference in video playback, and 27 W for the 6800 vs the 3070.

for someone using the PC the way I am, with 1-2 hrs of gaming, 2-4 hrs of work (6-8 in lockdown), and then videos and music later on, even 50 W of extra gaming power consumption for 1-2 hrs is very little compared to 15-25 W extra over the 10-12 hrs of using the PC for work and entertainment.
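    The arithmetic here is just watts times hours; a minimal Python sketch using illustrative figures from this thread (a 30 W gaming delta for 2 hrs vs a 25 W desktop delta for 10 hrs; your own deltas and hours will differ):

    ```python
    # Daily energy difference (Wh) = power delta (W) x hours of that activity.
    GAMING_DELTA_W = 30   # extra draw while gaming (thread's example figure)
    DESKTOP_DELTA_W = 25  # extra draw while browsing/watching video (example figure)

    gaming_hours = 2
    desktop_hours = 10

    gaming_wh = GAMING_DELTA_W * gaming_hours     # 60 Wh per day
    desktop_wh = DESKTOP_DELTA_W * desktop_hours  # 250 Wh per day

    print(f"gaming delta:  {gaming_wh} Wh/day")
    print(f"desktop delta: {desktop_wh} Wh/day")
    # With this usage pattern, the low-load delta dwarfs the gaming delta.
    ```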
     
    Last edited: Jun 8, 2022
    Krizby likes this.
  2. PPC

    PPC Master Guru

    Messages:
    361
    Likes Received:
    191
    GPU:
    7800XT
Yeah, a redesign of the ATX motherboard layout is in order tbh, the whole box really. Nothing in this design was meant for 400W+ cards; GPUs can't dump the heat as it is. Has no one noticed that CPU temps are harder and harder to maintain while the GPU is in use? It's simple physics: something that outputs 400W of heat should have its own ventilation funnel in the design, but you can't really do that properly with the current layout, because it was never meant even for 2-slot cards, let alone the monsters we have today.
     
    southamptonfc and Truder like this.
  3. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    15,740
    Likes Received:
    9,632
    GPU:
    4090@H2O
    So guys, one thought crossing my mind: Does it make sense to intentionally buy a PCIe 5.0 mainboard when it comes to this future Nvidia generation, just to make sure power delivery over PCIe is as potent as possible?
     
  4. Maddness

    Maddness Ancient Guru

    Messages:
    2,440
    Likes Received:
    1,738
    GPU:
    3080 Aorus Xtreme
Isn't the PCI-e slot still only 75 watts?
     

  5. Krizby

    Krizby Ancient Guru

    Messages:
    3,092
    Likes Received:
    1,770
    GPU:
    Asus RTX 4090 TUF
Reviewers should test next-gen high-end GPUs with a normalized TDP of 300W, like igorslab did with the 3090 Ti. Sure, it's not the stock config, but everyone buying a high-end GPU should at least know how to adjust the power limit slider.
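    For reference, on Nvidia cards the power limit can also be set from the command line rather than a slider; a sketch using nvidia-smi (requires admin/root, and the driver only accepts values within the board's allowed range):

    ```shell
    # Query the current, default, min and max power limits
    nvidia-smi -q -d POWER

    # Cap the board at 300 W (persists until reboot unless persistence mode is set)
    sudo nvidia-smi -pl 300
    ```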
     
  6. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    15,740
    Likes Received:
    9,632
    GPU:
    4090@H2O
I guess you're right, and it makes no difference from there on.
     
  7. alanm

    alanm Ancient Guru

    Messages:
    12,267
    Likes Received:
    4,466
    GPU:
    RTX 4080
    Last edited: Jun 9, 2022
    Undying and carnivore like this.
  8. Truder

    Truder Ancient Guru

    Messages:
    2,400
    Likes Received:
    1,430
    GPU:
    RX 6700XT Nitro+
That's a bug that has been haunting AMD with memory clocks. I had the same issue with my 6700XT: a dual-monitor setup (one high refresh rate), with idle desktop usage consuming over 30W because the memory clocks stayed at full speed. But it is something that can be solved. The issue boils down to monitors not correctly reporting appropriate vblanking data, and it can be fixed by using CRU or the driver's EDID editor to increase the vblanking rate. After doing this, my memory clocks entered idle states and the card now only consumes up to 5W at idle.

It's a complex issue though. AMD has been trying to address this through driver updates, and some configurations slip through the net, but it's also monitor manufacturers using non-standard timings, which adds to the complexity.
     
  9. fry178

    fry178 Ancient Guru

    Messages:
    2,078
    Likes Received:
    379
    GPU:
    Aorus 2080S WB
    @alanm
    that's just consumption, not shown in relation to output.

    a newer card can use the same wattage (W) as the X1900, yet be able to run something like Siege, which none of the old cards would do (at a given res, for example) ...
     
  10. alanm

    alanm Ancient Guru

    Messages:
    12,267
    Likes Received:
    4,466
    GPU:
    RTX 4080
Yes, of course. It's just a bit odd how some of us (incl. myself) had presumed that greater efficiency would keep power down with succeeding gens. Like Pascal, which consumed about the same as Maxwell yet delivered a massive performance increase.
     

  11. cucaulay malkin

    cucaulay malkin Ancient Guru

    Messages:
    9,236
    Likes Received:
    5,208
    GPU:
    AD102/Navi21
    That is impossible
     
  12. Truder

    Truder Ancient Guru

    Messages:
    2,400
    Likes Received:
    1,430
    GPU:
    RX 6700XT Nitro+
Yes, of course it's impossible. Please show me the 6700XT that you personally own and have experience with.

I recommend you go through the AMD section of this forum; you'll see many, many posts about high idle VRAM clocks. It's not a new problem: it has occurred across many generations of AMD cards and is something many, many AMD owners have had to deal with.

Obviously I cannot check the total power draw of the card, but at complete idle the card is reporting 5W consumption (this is not just chip power, it includes VRAM power usage) with the GPU clock in its 0 state and the VRAM clock at 14MHz. Sure, it may indeed be using slightly more for the VRMs, but we all know that's a negligible amount. The point is, in a dual-display scenario, your source of information is showing the characteristic high VRAM clock bug that many have been aware of, and which is something that can be fixed.

[attached screenshot: RadeonSoftware_YUH6SFgJNv.png]

Curiously enough, I cleared the settings I used to fix my high VRAM clocks in order to show you the idle clocks/power usage, and it appears AMD have indeed fixed the issue for my configuration (I'm on the 22.5.1 drivers; my guess is that my previous driver reports were looked at, so I'd probably have to roll back to an earlier driver to illustrate the high idle power usage).

I don't know why you and Krizby have so much prejudice against owners of AMD cards, but can you please both relax on this? It gets to the point of being toxic and abrasive. The same goes for Agonist, who demonstrates just the same attitude but against Nvidia owners. We shouldn't be butting heads against one another, but respecting each other's input and providing help and understanding (which is what I'm trying to do here by sharing my experience).

AMD and Nvidia have both been leapfrogging each other on who can have cards use the most power (in an effort to extract as much performance out of their silicon and attain performance crowns), so it's not like either side is any better than the other here. But as owners of these graphics cards, we can help each other through knowledge and experience, as mentioned above. My previous graphics card, an R9 Fury, left my system a hotbox, and in summer it was a nightmare, particularly in the last year I owned it, as it was being pushed to the limit, going up to about 300W. I decided then never to get a card in that power range again. The 6700XT is 220W; in my regular use I barely see it breach 200W, and I don't even bother to undervolt it, I just leave it stock.

I suspect this is why Agonist is stating that his card is only using 200W: likely because he's not maxing the card out. It will undoubtedly reach 300W in the future, just like my Fury did later in its life, once it's rendering more and more demanding things that push it to the limit.

Nevertheless, this trend of increasing power usage looks set to continue, and we can all but guarantee that AMD will do the same.... I actually wonder just how much of a perf/W improvement we'll see; I'm very sceptical about this next generation.
     
    carnivore, alanm and cucaulay malkin like this.
  13. Agonist

    Agonist Ancient Guru

    Messages:
    4,287
    Likes Received:
    1,312
    GPU:
    XFX 7900xtx Black
My 6800xt uses 250w avg when gaming at stock settings, not 300w. Only when I push it to 2650/2125 does it pull 300w.
     
    cucaulay malkin likes this.
  14. cucaulay malkin

    cucaulay malkin Ancient Guru

    Messages:
    9,236
    Likes Received:
    5,208
    GPU:
    AD102/Navi21
    nice
    still, I'm quoting TBP, as you are comparing against TBP

    tl;dr
    5 W power on a card is impossible
    0 MHz is impossible
    it needs to be verified with something that measures real draw, even something as simple as a UPS reading with the fix on and off
    all this tells me is that those readings are wrong
    if this is true, I need the same on my 3060 Ti
    send me the Afterburner reading log please, or just the screenshot


    I too share the scepticism.
    on one hand, AMD is trying something that's never been attempted before.
    on the other hand, it's just out of necessity: when there's node parity, or Nvidia has the advantage, there's no other way for them to compete.
    it's good planning ahead on their side, though. they might turn a disadvantage into a win.
     
    Last edited: Jun 9, 2022
  15. Horus-Anhur

    Horus-Anhur Ancient Guru

    Messages:
    8,704
    Likes Received:
    10,787
    GPU:
    RX 6800 XT
[attached chart]
     
    Truder likes this.

  16. cucaulay malkin

    cucaulay malkin Ancient Guru

    Messages:
    9,236
    Likes Received:
    5,208
    GPU:
    AD102/Navi21
    this is single monitor, isn't it?
    not the subject of our conversation

    https://tpucdn.com/review/sapphire-radeon-rx-6950-xt-nitro-pure/images/power-multi-monitor.png

    AMD pulls twice the amount of electricity:
    the 6700 XT twice the 3060 Ti
    the 6800 XT twice the 3080
    the 6950 XT more than twice the 3090 Ti

    now go look at video playback
    ho-ly crap
    the 6800 XT takes twice the amount of power of a 3090 Ti

like I said, this all depends on your usage, but if you use the same computer for gaming, work, and other entertainment, this is majorly overlooked.

    my 3060 Ti sits at 25 W with a dozen open windows on two monitors, YouTube playing, and an office document open.
    same thing every day, the whole year; the same PC I game on is the one I work and browse the web on.
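    Scaled to a year, even a small low-load delta adds up; a rough Python sketch (the 25 W delta, 10 hrs/day, and the electricity price are illustrative, not measured):

    ```python
    # Annual energy and cost for a constant extra power draw.
    delta_w = 25          # extra idle/desktop draw in watts (illustrative)
    hours_per_day = 10
    price_per_kwh = 0.30  # EUR per kWh, illustrative

    kwh_per_year = delta_w * hours_per_day * 365 / 1000
    cost_per_year = kwh_per_year * price_per_kwh
    print(f"{kwh_per_year:.1f} kWh/year, ~{cost_per_year:.2f} EUR/year")
    ```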
     
    Last edited: Jun 9, 2022
  17. tsunami231

    tsunami231 Ancient Guru

    Messages:
    14,748
    Likes Received:
    1,868
    GPU:
    EVGA 1070Ti Black
The 3060 was already pushing 200 watts. If it's going up 50 watts to 220w, that would imply it's a 180w card, which from what I've seen isn't true, unless I'm looking at the wrong charts.
     
  18. alanm

    alanm Ancient Guru

    Messages:
    12,267
    Likes Received:
    4,466
    GPU:
    RTX 4080
    For reference, power efficiency of current cards.
[attached chart]
     
    cucaulay malkin likes this.
  19. Agonist

    Agonist Ancient Guru

    Messages:
    4,287
    Likes Received:
    1,312
    GPU:
    XFX 7900xtx Black
    Ok, I gotcha, makes sense. It just seems that Nvidia cards are way too power hungry these days. I'm not as concerned with power consumption as I am with cooling capacity.
    Imagine a midrange GPU the size of a 6800xt/RTX 3080.
     
  20. Rich_Guy

    Rich_Guy Ancient Guru

    Messages:
    13,145
    Likes Received:
    1,096
    GPU:
    MSI 2070S X-Trio
Rumoured to have 4-slot coolers :D
     

Share This Page