MSI: Radeon RX Vega needs a lot of power

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jun 22, 2017.

  1. Denial

    Denial Ancient Guru

    Messages:
    14,039
    Likes Received:
    3,871
    GPU:
    EVGA RTX 3080
    Yeah, but we also know the reasons why AMD's TFLOPS don't translate into in-game performance the way Nvidia's do, and Vega's changes were supposed to remedy most of those issues (architecture balance/drivers/etc.).

    I still personally think the top-end Vega will trade blows with the Ti in various games, especially at higher resolutions. I just don't know if the year and 3+ months of extra architecture design time that Vega has over Pascal justifies what would essentially be a tie. I'm kind of on the fence about when I think Volta will launch. I assumed it wouldn't be until 2018 because I thought GV100 was going to be delayed... but it's pretty clear that GV100 is going to be shipping ~Q3.

    https://pbs.twimg.com/media/DBJY_f8XoAAbaZA.jpg - they already have them shipping to select partners in test servers.

    So now I'm thinking Nvidia might just move the launch of Volta up to sometime this year. I don't think they'll put out a 600mm2 card, but if they did put out a ~400mm2 GTX 1180 @ 180W with 10-15% more performance than the 1080 Ti (kind of like the 1080 to the 980 Ti) - it would definitely hurt Vega sales.

    AMD would have to wait for a 7nm refresh of Vega (Vega 20). The problem is Nvidia is also moving to 7nm next year, but they are using TSMC's process, which is rumored to be 6 months ahead of Samsung/GF's in terms of full production. So I'm not quite sure how that will play out, or what Nvidia would even launch at 7nm in the ~August 2018 timeframe.

    IDK, things are definitely going to get interesting for AMD though. Especially when Navi rolls around, which I think is going to use a multi-die paradigm similar to Epyc/TR's multi-core design: multiple smaller dies connected by Infinity Fabric. How they will do the scheduling/memory management, idk, HBCC maybe? But it's definitely going to be cool - it just won't be around till 2019.
     
    Last edited: Jun 22, 2017
  2. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,099
    Likes Received:
    946
    GPU:
    Inno3D RTX 3090
    Last time I checked, the RX580 was using a max of 22% more power, and depending on the load it was actually less than that.

    [power consumption charts]

    Less than ideal, but not 40%.
     
  3. fry178

    fry178 Ancient Guru

    Messages:
    1,930
    Likes Received:
    344
    GPU:
    Aorus 2080S WB
    @Only Intruder
    The same for you. Where's your proof that it's like you said?

    Based on the fact that the last couple of AMD GPU releases were slower than Nvidia's while consuming more power, I assume it's gonna be the same.

    Because if the cards are "so fast", why not have the card perform at Nvidia's level and thus reduce power consumption?
    To me this means the card is only able to compete with Pascal when cranked up to level 11, hence the huge amount of power needed.
     
  4. Paulo Narciso

    Paulo Narciso Maha Guru

    Messages:
    1,226
    Likes Received:
    36
    GPU:
    ASUS Strix GTX 1080 Ti
    Well, a 1070 destroys a 580 while consuming less power.

    It's not very hard to conclude that Polaris is not very efficient and Vega will be the same.
    Even if it has the performance of two 580s, that's not enough to beat a 1080 Ti.
     

  5. sammarbella

    sammarbella Ancient Guru

    Messages:
    3,929
    Likes Received:
    178
    GPU:
    290X Lightning CFX (H2O)
    Most.

    The gap in TFLOPS is narrower now (Vega vs 1080s) than it was with the Fury X vs the 980 Ti.

    At least this time the stupid power limitation error AMD made in Polaris will not be a problem.

    Maybe we will even see some kind of OC in VEGA gaming GPUs. LOL

    I agree, AMD seems to be late to the party...since 290X.

    A tie in price and performance is not enough for the underdog, it wasn't for Fury X...or Polaris.

    The 1070's performance and price make it THE dedicated GPU of choice in the mid-range. In the high-end range AMD has been missing since the 290X.

    Vega 56 should be able to battle it, 64 should battle the 1080, and gaming Vega should battle the 1080 Ti.

    I simply don't see it.

    Do you have any doubt Nvidia has Volta ready to release and kill AMD gaming GPU options if Vega can deliver?

    I don't have any doubt.
     
    Last edited: Jun 22, 2017
  6. Denial

    Denial Ancient Guru

    Messages:
    14,039
    Likes Received:
    3,871
    GPU:
    EVGA RTX 3080
    Why do you conclude Vega will be the same? Half the architecture changes that have been outlined are designed to save power.

    You can't just look at the past few generations and say "this is how it was, this is how it will be" - not when we have pretty detailed information on the changes going on under the hood. Will it have the same efficiency as the 1080 Ti? I don't know, maybe in some games where it outperforms it - but I'm fairly confident its perf/W is going to be 15%+ better than Polaris's on average.
     
  7. malitze

    malitze Active Member

    Messages:
    89
    Likes Received:
    0
    GPU:
    Sapphire Fury X
    I took a quick look at Boss' latest 580 review, throughout which the tested 580 averages ~77 fps (100%) vs. ~70 fps (90%) for the 1060. So at least in this case, "the 1060 still performs the same averaged across many titles" is a bit off.

    I think AMD just has to go for the one-size-fits-all approach of reusing the same chips for graphics and compute products, which results in a bit more compute power than it can actually translate into fps.
     
  8. Loophole35

    Loophole35 Ancient Guru

    Messages:
    9,797
    Likes Received:
    1,161
    GPU:
    EVGA 1080ti SC
    Hate to burst your bubble, but that's total system power draw in those tests. The system in that test looks to be drawing about 120-125W on its own, which puts the 1060 FE at about 120W and the 580 at about 200W in the gaming bench - yep, looks like about 40% there (the 1060 drawing ~40% less than the 580). Furmark is not ideal, but you can't just not count it.
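    A minimal sketch of that back-of-the-envelope math, assuming the ~120-125W system baseline and the resulting ~120W / ~200W GPU-only estimates above (the total-system readings used here are illustrative placeholders, not the actual chart values): subtracting the same baseline from both wall readings is what widens the relative gap, which is why percentages computed from total system draw look much smaller than GPU-only ones.

        # Illustrative only: baseline and totals below are assumptions, not the chart's actual readings.
        IDLE_SYSTEM_W = 122    # ~120-125W drawn by the rest of the system
        TOTAL_1060_W = 242     # assumed wall draw while gaming with the 1060 FE
        TOTAL_580_W = 322      # assumed wall draw while gaming with the RX 580

        gpu_1060 = TOTAL_1060_W - IDLE_SYSTEM_W   # ~120W attributed to the 1060
        gpu_580 = TOTAL_580_W - IDLE_SYSTEM_W     # ~200W attributed to the 580

        wall_gap = (TOTAL_580_W - TOTAL_1060_W) / TOTAL_1060_W   # ~33% more at the wall
        gpu_gap = (gpu_580 - gpu_1060) / gpu_1060                # ~67% more, GPU-only
        savings = 1 - gpu_1060 / gpu_580                         # ~40%, i.e. the figure above

        print(f"at the wall: +{wall_gap:.0%}, GPU-only: +{gpu_gap:.0%}, 1060 uses {savings:.0%} less")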
     
  9. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,099
    Likes Received:
    946
    GPU:
    Inno3D RTX 3090
    I honestly didn't notice it was total system power draw. You're both correct, guys.
     
  10. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    7,236
    Likes Received:
    3,696
    GPU:
    HIS R9 290
    Hasn't it already been established by AMD themselves that its performance sits between the 1080 and 1080Ti? Why is everyone expecting and hoping it will beat the Ti?

    As for power consumption, I've noticed AMD GPUs heat up tremendously when you tell them to do relatively simple tasks (like Furmark) at full force, but their wattage becomes adequate when they're doing something more complex (like Unigine Valley). I get the impression AMD's GPU pipelines are much shorter than Nvidia's, but they have more of them. This is why they're good for mining, and why they can build up heat so quickly doing the same task.
     
    Last edited: Jun 22, 2017

  11. Clouseau

    Clouseau Ancient Guru

    Messages:
    2,809
    Likes Received:
    491
    GPU:
    ZOTAC AMP RTX 3070
    Ok, if this really requires a significant increase in the power envelope, would it be fair to say that this is AMD's answer to help get the cards into gamers' hands instead of miners'? Otherwise it's business as usual... AMD/ATI were always known to run hotter than the competition; nothing new there. The only exception was Fermi, which tried to take that crown away. The title is still held by AMD though. What were the old adages:

    DX based games - ATI
    OpenGL based games - Nvidia
    More accurate colors - ATI
    Close enough colors - Nvidia
    Game Performance - Nvidia
    Close enough game performance - AMD
    More Heat - ATI/AMD
    Less Heat - Nvidia

    Now the whole picture is just a mixed bag but the last four have been more consistent.
     
  12. Loophole35

    Loophole35 Ancient Guru

    Messages:
    9,797
    Likes Received:
    1,161
    GPU:
    EVGA 1080ti SC
    Honestly, it's easy to miss in that chart.
     
  13. Truder

    Truder Ancient Guru

    Messages:
    2,159
    Likes Received:
    1,184
    GPU:
    RX 6700XT Nitro+
    What? Didn't you read what I said? I said we can't draw conclusions yet; we need to wait until, you know, we actually have the product and the reviews are out. All we're doing is speculating.
     
  14. Quicks

    Quicks Master Guru

    Messages:
    558
    Likes Received:
    12
    GPU:
    Red Devil RX 470 / 4GB
    Funny how it's always "speculating", but when it comes out and it's true, then it's like "aah well, it's not that bad"?
     
  15. __hollywood|meo

    __hollywood|meo Ancient Guru

    Messages:
    2,991
    Likes Received:
    139
    GPU:
    6700xt @2.7ghz
    i think that's coming on quite strong. objectively it just needs to perform well & be priced competitively.

    as an aside, i personally don't care much about team red's wattage efficiency issues with recent gens' cards. i don't fold or crunch coins.
     

  16. Quicks

    Quicks Master Guru

    Messages:
    558
    Likes Received:
    12
    GPU:
    Red Devil RX 470 / 4GB
    Mostly I agree with you, but the fact it's a year late to the party might be a problem, as Nvidia can just do a quick refresh and kick AMD in the face again. Then they'll take another year to catch up?
     
  17. Noisiv

    Noisiv Ancient Guru

    Messages:
    8,199
    Likes Received:
    1,468
    GPU:
    2070 Super
  18. alanm

    alanm Ancient Guru

    Messages:
    11,523
    Likes Received:
    3,642
    GPU:
    RTX 4080
    Ha ha... where did you get that? From Nvidia's HDMI bug that was long since resolved? Or from NV's FX series 15 years ago?
     
  19. Noisiv

    Noisiv Ancient Guru

    Messages:
    8,199
    Likes Received:
    1,468
    GPU:
    2070 Super
    It's one of those "ATI superior image quality" myths.

    If there were any truth at all in these myths, one would think that AMD would have used it to demo their superiority in color representation.
    But we've never ever seen any such demo, have we?
     
  20. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,099
    Likes Received:
    946
    GPU:
    Inno3D RTX 3090
    I wonder whether the power consumption is that high due to a weird voltage cutoff the chip has, or whether it's a general trend with Vega.

    I remember that most Furies could be undervolted for significant thermal and power gains, but overclocking and overvolting them would send consumption through the roof. Perhaps Vega is similar, and the lower-clocked/lower-voltage cards actually do compete well with NVIDIA in perf/watt (see the Nano vs the GTX 980), but the big boys, which will most likely go over the optimal voltage/clock thresholds, need a lot of work to get there.

    I would say it's that way just from the fact that AMD seems to offer a Vega model with a single fan on it, and there are no huge (initial) differences between it and the watercooled model.
     
