AMD on the Road: takes Radeon RX Vega to the Gamers

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jul 12, 2017.

  1. xrodney

    xrodney Master Guru

    Messages:
    368
    Likes Received:
    68
    GPU:
    Sapphire 7900 XTX
    Efficiency and clocks are more or less limited by the GloFo 14nm process; after all, it was created for a low-power architecture and then cranked up to high power usage. I expect Navi on 7nm, built for high power, will fix that.

    And your statement that anyone can create a TFlops-performance product is clear bull****. You need a lot of resources and experience to be able to create a CPU/GPU, and that's without even mentioning walking through the licence/patent minefield.
     
  2. Noisiv

    Noisiv Ancient Guru

    Messages:
    8,230
    Likes Received:
    1,494
    GPU:
    2070 Super

    No... just no.
    980Ti and Fury X are built on the same 28nm TSMC process.
    GTX 780 Ti and GTX 750/Ti are built on the same process as well.
    And they are all worlds apart when it comes to perf/W.

    The fact that both the process and the architecture (as well as the implementation - for example, 1080 FE perf/W >> custom 1060) are responsible for a product's efficiency should be obvious.

    No ****... I thought that was an obvious hyperbole :3eyes:
     
  3. malitze

    malitze Active Member

    Messages:
    89
    Likes Received:
    0
    GPU:
    Sapphire Fury X
    I think so too. In the end, efficiency depends on the workload and is only comparable if that basis is the same, so the question is how well clock and power gating can keep power down for unused parts of the chip.
     
  4. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,016
    Likes Received:
    4,395
    GPU:
    Asrock 7700XT
    I don't disagree that GCN is becoming a bit inefficient by today's standards. There is some truth to "efficiency=performance", but where you're a bit misled is in thinking better efficiency always opens the door to pushing the hardware harder for more performance - that's often true, but definitely not always. For example, Ryzen has better efficiency per core than a similarly clocked Kaby Lake but can't overclock nearly as high. From another perspective, in many OpenCL tasks AMD GPUs often get better performance-per-watt than Nvidia, but worse performance-per-watt in gaming (regardless of which GPU gets the higher framerate). The architecture, the silicon quality, and the transistors themselves play more of a role in how far you can push something than the efficiency of the design.


    As a side note:
    AMD GPUs tend to heat up a lot more under synthetic benchmarks, but when you have v-sync on in a normal game, they still have worse performance-per-watt than Nvidia, just not to the point where it's worth noting. My GPU, for example, is known to reach 300W under FurMark, but it tends to stay below 250W in a normal gaming session.
     
    Last edited: Jul 12, 2017

  5. Noisiv

    Noisiv Ancient Guru

    Messages:
    8,230
    Likes Received:
    1,494
    GPU:
    2070 Super
    I am sure you'll agree that Ryzen is an oddball when it comes to OC.

    For the sake of argument, imagine a custom Vega RX coming in at ~375W of real in-game consumption, and let's assume it equals a 1080Ti FE @ 250W.

    Which is more likely to be the faster card when OC-ed?
     
  6. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,016
    Likes Received:
    4,395
    GPU:
    Asrock 7700XT
    Well yeah, obviously the 1080Ti will be faster. But that's also a very pessimistic and biased outlook.

    Again, I don't disagree that something like a 1080 or 1080Ti has more potential; I'm just saying your statement is a little too cut-and-dried, and there are a lot more variables involved than "efficiency" and "perf/mm2". Ultimately your point still stands - just be careful not to generalize too much.
     
  7. Noisiv

    Noisiv Ancient Guru

    Messages:
    8,230
    Likes Received:
    1,494
    GPU:
    2070 Super
    From everything we've seen, Vega FE is atm(!) slower in games than the 1080 FE,
    while consuming 280W and downclocking itself to ~1440MHz.
    So it's easily 300W+ at 1600MHz.

    Knowing this, does 375W for a custom Vega RX seem far-fetched?
    And then I added something like ~25% to its per-clock performance so that Vega RX equals the 1080 Ti FE - LOL, I even assumed perfect scaling.

    How is that a "very pessimistic and biased outlook"? :confused:
    Never mind that I even said "for the sake of the argument, imagine if" - this hypothetical scenario does not look out of this world at all.
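
    (As a quick sanity check on that "easily 300W+": treating power as linear in clocks at fixed voltage, 280W x 1600/1440 ≈ 311W - and any extra voltage needed to hold 1600MHz only pushes it higher.)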
     
    Last edited: Jul 12, 2017
  8. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,016
    Likes Received:
    4,395
    GPU:
    Asrock 7700XT
    To my recollection, Vega FE is not unanimously slower than the 1080 FE. Much like the Titan series, to my understanding, Vega FE also spends transistors on things like double-precision floats. An RX Vega is likely to offer better performance-per-watt than Vega FE for gaming purposes (but worse for workstation tasks).
    375W sounds very far-fetched. Unless RX Vega has 3x 8-pin connectors, I don't see how a single GPU could consume that much power, let alone draw that much without being hazardous. Wattage does not scale linearly with clock rate. It might in theory (I haven't actually checked), but in practice the laws of thermodynamics kick in. Intel's i9 series is a good example of this: some chips get better performance-per-watt as clock rates increase, some get worse.

    Again, you're not considering enough variables.
     
    Last edited: Jul 12, 2017
  9. Noisiv

    Noisiv Ancient Guru

    Messages:
    8,230
    Likes Received:
    1,494
    GPU:
    2070 Super
    Wattage scales linearly with clocks UNTIL...
    until more voltage is needed, and then it scales with something like clocks*voltage^2.

    2x 8-pin + PCIe = 375W, which does not mean that 375W is actually the maximum power available. It's the recommended(!) maximum.
    Remember the reference RX 480 and >75W over the PCIe slot?

    Vega FE has **** double precision.
    And I fail to see what sample variance has to do with the discussion at hand :)
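
    To put that rule of thumb in a quick sketch (Python; the 280W/1440MHz baseline is the figure quoted in this thread, while both voltages are pure assumptions for illustration):

    Code:
    # Rule-of-thumb dynamic power: P ~ clocks * voltage^2.
    # Baseline 280 W @ 1440 MHz is from this thread; voltages are assumed.
    def dyn_power(base_w, base_mhz, base_v, mhz, v):
        return base_w * (mhz / base_mhz) * (v / base_v) ** 2

    print(round(dyn_power(280.0, 1440.0, 1.0, 1600.0, 1.0)))  # ~311 W: same voltage, linear in clocks
    print(round(dyn_power(280.0, 1440.0, 1.0, 1600.0, 1.1)))  # ~376 W: +10% voltage on top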
     
  10. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
    Hmm, there are definitely some changes that are going to occur for RX - but I don't think it's going to be too different from FE.

    FE's double precision is already cut to 1/16th of FP32. Mixed math is inherent to the CUs, so that's also likely to remain with RX - AMD stated they want to use the mixed math to accelerate their game libraries like TressFX. HBCC/tile rasterization are both supposedly disabled. I don't know how HBCC would affect power, but it could help minimum performance if/when the card is bandwidth starved. Tile rasterization slightly improves performance, as the card spends less time moving data from memory - it also slightly lowers power. So with those disabled on FE, RX should see some performance increase directly from the features, plus an additional increase from any lower power consumption.

    In the end, AMD can obviously sell the card for cheaper to make it competitive. No matter how bad hardware is, cost is the ultimate factor. But by building expensive/complex hardware and then selling it at a lower price, AMD is only hurting themselves. Analysts already consider AMD's margins really low for the tech industry; Vega under-performing relative to its manufacturing cost only makes things worse.

    I think AMD is banking on 7nm for both Zen and Vega. Zen, architecturally, is in a perfect position. It's limited to 4GHz by the process - keep Zen the same, clock it to 4.5GHz at linear power scaling, and it completely blows Intel's products out of the water. 7nm is going to allow that to happen - and Zen+ is also the second iteration of the design, which historically is where the largest IPC gains of a new architecture come from. Barring AMD screwing anything up, I think Zen+ is going to be really, really good.
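
    (As rough arithmetic under that linear-scaling assumption: a part drawing, say, 95W at 4GHz would land around 95 x 4.5/4 ≈ 107W at 4.5GHz - the 95W baseline is an assumption purely for illustration.)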

    Vega is also going to be good at 7nm, but I think AMD faces different competition from Nvidia than from Intel. Intel is competing through litigation and marketing at this point; Nvidia is actually innovating at an incredible pace. I think the market is big enough, though, that AMD can find room, even if their products are slightly behind Nvidia's in the various metrics people use to gauge which product is better.

    Navi will most likely bring the TR/Epyc/Infinity Fabric design to GPUs - Nvidia is taking the same approach. That's when I think AMD is going to have an advantage, as they've basically been heading in this direction for the past decade with heterogeneous computing.
     

  11. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,016
    Likes Received:
    4,395
    GPU:
    Asrock 7700XT
    Again, theoretical scaling isn't the same as what happens in practice. Unless you provide the equation, there's not much point continuing to discuss this. Regardless, you don't have to change the voltage for the scaling to skew - heat output alone will change the equation.
    EDIT:
    I understand this may sound nit-picky, but when you consider the sheer number of transistors, the already high wattage, and the high clocks, the rate at which wattage increases becomes a lot more chaotic. If we were talking about something like an i3 on water cooling, then yeah, the wattage scaling is going to be pretty linear.

    Exactly - 375W is not the actual maximum power available. It isn't the recommended maximum either (that would be 362W, where you get 288W from the 2x 8-pins and 74W from PCIe) but the industry-standard maximum. It is possible to exceed 375W, though it is frowned upon.

    The point is, it affects wattage.
     
    Last edited: Jul 12, 2017
  12. Noisiv

    Noisiv Ancient Guru

    Messages:
    8,230
    Likes Received:
    1,494
    GPU:
    2070 Super
    I've lost you completely, tbh...

    There is no such thing as a theoretical power equation for an IC.
    power ~ clocks*voltage^2 is an approximate empirical equation that is usable under a very limited scope of circumstances - merely a good starting point.

    You keep repeating "wattage does not scale linearly with clock rate", as if I had claimed that is always the case. As a matter of fact, power DOES scale linearly with clocks - AT BEST.
    In practice it often scales worse, sometimes much worse, especially past the clock/power sweet spot (which AMD lately has no trouble passing), and especially closing in on max OC. How this, or your claim of power not scaling linearly, helps our Vega RX... I have no idea.
    I WISH VEGA POWER CONSUMPTION SCALED LINEARLY WITH CLOCKS PAST 1600MHz! There, I said it.

    And how the **** did you draw me into this discussion when all I said was: let's imagine a 375W custom AIB Vega.
    Which was a simple for-the-sake-of-argument, yet not out-of-this-world, assumption. Now I need to provide a whitepaper on this, or else I am very pessimistic and biased?
    And what about equating it to the 1080 Ti - biased also?

    But let's see you try:
    knowing that a 1440MHz Vega draws 280W, how much would you assume a custom OC-ed 1700MHz Vega might draw?
    Negative zero?



    Sample variance affects the wattage... yes, and??
    You might want to talk about a specific golden chip; I am interested in volume averages.
     
    Last edited: Jul 12, 2017
  13. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    15,754
    Likes Received:
    9,647
    GPU:
    4090@H2O
    No this time I'm innocent! :D
    I actually said this afterwards:

    So I meant, politics aside, there will probably be more customers ready to attend such an event in London than in Budapest. At least that's what I thought.



    Thanks for explaining Denial.

    Although I have to say, I was talking more about electrical efficiency (power). Sure, you're right about that $40 bill, but then again, we're enthusiasts... it's our hobby, so I personally don't tend to think about electricity bills too much. And here it's probably even more than at your place (Austria has rather costly electricity compared to Germany, for example). Just from the point of view that every hobby costs money. But of course you are right - it's subjective thinking in my case.

    As for HBM2, you are probably right about the costs of bringing out different SKUs. I just feel AMD would have done their customers a greater favour by releasing the cards half a year earlier with GDDR5X than later with HBM2, which arguably is not that huge of a performance gain right now. Maybe they should have done their refresh with HBM2 - but that's only a point if HBM2 is delaying Vega at all. If Vega is coming now because they couldn't have had the chips half a year ago (just an example), that's a wholly different story.
     
  14. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,016
    Likes Received:
    4,395
    GPU:
    Asrock 7700XT
    You kind of just admitted yourself that it is inaccurate. Even without considering temperature, that equation doesn't tell the whole picture.

    Take high-school physics, for example. They'll tell you Earth's gravity is 9.8m/s^2, which is true, but the equations you're told to solve don't account for air resistance, terminal velocity, starting velocity, air density, and so on. So when the teacher asks "how fast will the penny be moving by the time it hits the ground?", the textbook free-fall math gives you a very, very wrong answer for a real penny. Processor wattage is no different.
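
    To make that concrete, a minimal sketch (Python; the drop height, penny mass, and drag constant are all rough assumptions for illustration):

    Code:
    # Free fall vs. quadratic drag - the penny example above.
    # Mass and drag constant are rough assumptions, not measured values.
    g = 9.8       # m/s^2
    m = 0.0025    # kg, roughly a penny
    k = 0.0002    # kg/m, assumed lumped drag constant (0.5 * rho * Cd * A)
    h = 30.0      # m, assumed drop height

    # Textbook answer, no air resistance: v = sqrt(2*g*h)
    v_naive = (2 * g * h) ** 0.5

    # Euler integration with drag: dv/dt = g - (k/m) * v^2
    v, y, dt = 0.0, 0.0, 0.001
    while y < h:
        v += (g - (k / m) * v * v) * dt
        y += v * dt

    print(round(v_naive, 1), round(v, 1))  # ~24.2 m/s vs ~11.0 m/s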

    Ok, and that's why I pointed out that you're being a little too general about what you're expecting. The fact of the matter is, when you push hardware as hard as AMD has pushed Vega, the equation gets very complicated. Again, consider the disproportionate wattage of an overclocked i9.

    I never said the scaling would work in favor of Vega. In fact, I wouldn't be surprised if it works against it. But as bad as it could get, I still think you may be over-estimating. We don't know enough about the wattage per transistor. Also, just to clarify: is the 280W you referred to TDP, or the actual measured wattage under full load? Because advertised TDP is a really crappy way to calculate wattage for any product.

    375W is not a good number to have, and it is a number you came up with from a loose equation. You brought it up as a way to express how inefficient you felt the architecture was, then compared it to a 1080Ti - something the product isn't advertised to compete against. That sounds pretty pessimistic to me.

    Well, given the variables I have (so no temperature, no fan speeds, no voltage, etc.), the equation is left as:
    ((1-(S/O))*A)+A=B
    where S is stock frequency, O is overclocked frequency, A is stock wattage, and B is the final overclocked power draw.
    ((1-(1440/1700))*280)+280=B
    ((1-0.85)*280)+280=B
    (0.15*280)+280=B
    42+280=B=322
    So that's a 53W difference from your 375W, without considering other variables that may improve or worsen wattage. That's the difference between nearly exceeding the PSU specifications and "just a very hot GPU".
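
    For anyone who wants to plug in their own numbers, here is the same estimate as a quick script (Python), alongside the clocks*voltage^2 variant from earlier - the two voltages in the second call are assumptions, not measured values:

    Code:
    # Rough OC power estimates. 280 W @ 1440 MHz is the figure quoted
    # in this thread; the voltages below are assumptions.
    def linear_estimate(stock_w, stock_mhz, oc_mhz):
        # B = ((1 - S/O) * A) + A, the formula above
        return stock_w + (1 - stock_mhz / oc_mhz) * stock_w

    def voltage_sq_estimate(stock_w, stock_mhz, oc_mhz, stock_v, oc_v):
        # P ~ clocks * voltage^2, if the OC needs extra voltage
        return stock_w * (oc_mhz / stock_mhz) * (oc_v / stock_v) ** 2

    print(round(linear_estimate(280, 1440, 1700)))                # ~323 W (322 above comes from rounding S/O to 0.85)
    print(round(voltage_sq_estimate(280, 1440, 1700, 1.1, 1.2)))  # ~393 W with the assumed voltage bump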

    Unless I'm not understanding what you mean by "sample variance", it can make as much as a 50W difference.


    Remember - I'm not saying Vega is efficient. Overall I'm not impressed by it, but I just think you're over-estimating how bad it is.
     
    Last edited: Jul 12, 2017
  15. Noisiv

    Noisiv Ancient Guru

    Messages:
    8,230
    Likes Received:
    1,494
    GPU:
    2070 Super
    OK, now you're conflating the unknowns with the approximations within the model. Like... you really, really need to know the initial velocity(!) to have any clue about the final velocity, while not having the GPU temp merely suggests a rough model, an approximation.
    And since you're being pedantic: you forgot the height above sea level, and a dozen other initial conditions :) And even if you had all of them, you still wouldn't be able to solve this "simple" problem analytically, because as far as I know there is no general, exact equation of motion that accounts for air resistance.
    So again you're back to approximations and some kind of experimental model. But OK, so far we agree.


    Would you have been any happier if you had temp, fan speed, and voltage?
    Would this attempt at a power calculation have been any different?

    So after chastising me for being overly simplistic in my pessimistic approximation,
    you yourself went with the most basic, linear approximation (which you yourself said is wrong), and the one that everyone should know is impossible in the real world.
    What happened to common sense - why not add a few %?
    Anyone with a clue should know that power going linear past max boost clock all the way to 1700MHz is VEEEERY optimistic. <-- DON'T YOU AGREE?


    Take a look:
    Vega FE
    1650MHz, 1.2V

    375 watts from the 2x 8-pin alone.

    Overclocking is kinda broken, because once you OC, the GPU goes to 1.2V.
    https://www.youtube.com/watch?v=IfSGboBX1QE
     

  16. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,016
    Likes Received:
    4,395
    GPU:
    Asrock 7700XT
    I didn't list all the variables, just a few examples of many - you are right, though. But it isn't pedantry when your result is well beyond the margin of error. That's my only point, and 375W is beyond the margin of error of my rough calculation.

    Yes, in addition to actual measured wattage (so not TDP). But despite what you may think, I'm not that picky either. After all, you were giving an estimate - just a very high one.

    I'm not chastising you for being over-simplistic; I don't mind an approximation. My gripe is that you intentionally over-estimated, pitted a product against another from a higher performance tier (keep in mind the 1080Ti has more transistors than the 1080), and used that to ridicule Vega's efficiency. To reiterate: I don't think Vega is that efficient either, but it isn't that bad.

    You asked, I obliged. The reason I did that was to show that a calculated rough estimate should have been a lot lower than what you said - which it was.


    Yup, but who says that's a necessary scenario? Again, RX Vega isn't supposed to be pitted against the 1080Ti, so what's the point in making such a comparison when discussing efficiency?
    From another perspective:
    overclock a 1080 to perform like a 1080Ti and you'll find its efficiency isn't so stellar either (compared to the 1080Ti).
     
    Last edited: Jul 12, 2017
  17. Noisiv

    Noisiv Ancient Guru

    Messages:
    8,230
    Likes Received:
    1,494
    GPU:
    2070 Super
    Well, OK then.
    If you think that, perf-wise, Vega RX should be a 1080 competitor, then indeed there is no sense comparing it on the same performance basis with the 1080 Ti.
    But that is more pessimistic than anything I have envisioned for Vega RX.

    Reported to AMD for being a Debbie Downer :D
     
  18. Noisiv

    Noisiv Ancient Guru

    Messages:
    8,230
    Likes Received:
    1,494
    GPU:
    2070 Super
    BTW, why do you think that, perf-wise, Vega is more a 1080 than a 1080Ti competitor?

    Wouldn't Vega's lowish clocks relative to Pascal have something to do with Vega's performance?
    I.e., wouldn't Vega being only at 1080 level have something to do with its power consumption, i.e. being TDP limited?

    So there you go -> efficiency=performance ;)
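
    To spell that out with a toy calculation (Python; both perf/W figures are invented - the point is just the shape of the math when a card is power limited):

    Code:
    # Under a TDP cap, delivered performance is roughly perf-per-watt
    # times the power budget. All numbers below are invented.
    tdp_budget_w = 280.0
    perf_per_watt_fps = {"less efficient": 0.20, "more efficient": 0.29}

    for name, eff in perf_per_watt_fps.items():
        print(name, "->", round(eff * tdp_budget_w), "fps at the cap")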
     
  19. Elder III

    Elder III Guest

    Messages:
    3,737
    Likes Received:
    335
    GPU:
    6900 XT Nitro+ 16GB
    At this point, we just do not know anything for sure one way or the other. Until Hilbert or some other reliable review site releases benchmarks, nothing is certain.

    With that said, my very rough guess is that it will land between a GTX 1080 and a 1080 Ti. That would make AMD unfortunately late to the game, but would also have the much-needed benefit of lowering prices in the high-end gaming GPU market. So far Vega (the Frontier version) doesn't look like much of a game changer for mining, so hopefully the stock will not get swallowed up immediately by mining farms in China that run hundreds of GPUs on $0.01 electricity. :p
     
  20. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,016
    Likes Received:
    4,395
    GPU:
    Asrock 7700XT
    Based on my understanding, Vega is supposed to be a healthy step ahead of the 1080 but distinctly behind the 1080Ti. I never heard AMD claim it was ever meant to compete against the 1080Ti - and considering AMD likes to cherry-pick results, that's saying something.

    I can see why you'd think my view is pessimistic, but I don't think Vega is a bad product. It's a little more power hungry than I'd like to see, I'm slightly disappointed that's the best they could do, and I don't really understand who the target demographic is, but I think it's a solid product. I know many people here expect it to outperform the 1080Ti, and I find that a bit unrealistic.


    Because AMD said so, and because from the few results I've seen of Vega FE, Vega RX isn't bound to be that much different.

    No? They're completely different architectures, to the point that they don't even have the same memory controller. They're so different you can't compare them clock for clock.
    It is within a brand's interest to stay within a certain power envelope (not TDP, because that's not the same thing). This is why you'll rarely see reference GPUs exceed 300W in benchmarks.
    I'm sure there is OC headroom for Vega, and I am fully aware it is relatively inefficient compared to Pascal - again, I'm not denying that. I doubt AMD themselves will push Vega to reach 1080Ti levels, even if that's theoretically possible.
    However, I do think third-party companies like Sapphire, Asus, Gigabyte, and so on will do their own "superclocked" variants.
     
    Last edited: Jul 13, 2017
