AMD RX Vega Shown Against GTX 1080 at Budapest Event

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jul 19, 2017.

  1. airbud7

    airbud7 Ancient Guru

    Messages:
    7,578
    Likes Received:
    4,302
    GPU:
    pny gtx 1060 xlr8
    But that's how I learn stuff; a good debate/... they're cool too.
     
  2. Fox2232

    Fox2232 Ancient Guru

    Messages:
    9,770
    Likes Received:
    2,208
    GPU:
    5700XT+AW@240Hz
    Here you go:
    Yes, Krizby does not understand TDP.

    @Krizby: No, simply no. Heatpipes, metal block and liquid medium are all part of the cooling solution. If a cooler's TDP is 250W, it has to be able to take in, transfer and dissipate 250W over a given time interval, no matter the construction.

    And since you think TDP is a value that belongs only to the radiator's dissipation, you might as well rate the side panel of your PC as a 300W cooler the second you connect it to the GPU with a piece of wire. And that is beyond stupid.

    TDP belongs to the cooling solution as a whole, and only to the cooling solution. And if a cooling solution is more efficient thanks to a liquid medium, then it has a proportionally higher TDP.
    The following statement of yours is simply based on your belief that the TDP values AMD puts on its cooling solutions are real:
    "While watercooling is a more efficient way of removing the heat from the GPU to keep GPU temp lower, the heat output into the environment is still equal to the power the GPU uses. 2 coolers with the same TDP can cool very differently if one has heatpipes and the other has none"
    And you are very wrong. The HD 7970 had 200W TDP cooling at best (hence the overheating issues). The TDP of the Fury X's cooling solution is likely around 400W.

    In reality, if you used that so-called 250W TDP cooling from the HD 7970 on a Fury X, the GPU/VRMs would overheat within 30s under load, even if the GPU were limited to 225W. Not because of bad heat transfer between the GPU and the radiator fins, but because the radiator itself was pretty poor. (The HD 7970 used a vapor chamber, and its radiator fins ran almost as hot as the GPU.)

    Btw, that 2nd statement of yours I bolded is in direct violation of the 1st law of thermodynamics.
    And that underlined part: you are just twisting TDP into something perverted.
    And the part you wrote about TIM... I have a hard time evaluating where it came from without earning a few infraction points :D
    = = = =
    Basically the only thing you wrote there that was not completely dumb was:
    "A gpu that uses 400W when under air or water would still heat the room as much as a 400W heater, is that too much to understand ?"
    => And the funny part about it is that we (Noisiv & me) are not even discussing the 1st law of thermodynamics, because there is nothing to fight over there.
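    To put that steady-state picture in numbers, here is a minimal sketch in Python (the thermal resistances are assumed for illustration, not measured figures):

    # At equilibrium the cooler dumps exactly the GPU's electrical power into
    # the room (1st law). What differs between coolers is the temperature
    # needed to move that heat, i.e. their effective thermal resistance.
    AMBIENT = 25.0  # deg C

    def gpu_temp(power_w, r_cooler_k_per_w):
        """Steady-state GPU temperature for a cooler with the given thermal resistance."""
        return AMBIENT + power_w * r_cooler_k_per_w

    # The same 250W of heat leaves either cooler, but the chip temperature differs:
    print(gpu_temp(250, 0.30))  # air-style cooler, assumed 0.30 K/W -> 100.0 C
    print(gpu_temp(250, 0.15))  # liquid-style cooler, assumed 0.15 K/W -> 62.5 C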

    Considering that you wrote nonsense after nonsense... do not expect me to reply to more of it. If you manage to write something meaningful, you'll get my keyboard time.
     
  3. Noisiv

    Noisiv Ancient Guru

    Messages:
    6,657
    Likes Received:
    499
    GPU:
    2070 Super
    Try to define "thermal solution design" yourself, and you'll see that you're in a world of pain.

    For starters, what exactly does your "thermal solution" consist of?
    Is it just the cooler itself, or is it cooler + CPU, or perhaps it's "cooler + CPU + case"?

    If it's just the cooler, or even cooler + CPU, I suspect you could have a sufficient "thermal solution" according to Intel's TDP requirement, yet fail in practice due to terrible airflow AND/OR hot ambient temperatures.
    I doubt that such a system could ever be entirely divorced from its surroundings and precisely defined on its own to your (Intel's) satisfaction (for example, you don't want to overshoot the TDP requirement, because that's bad press).

    And "thermal solution" sure as hell not does not refer to "cooler + CPU + case" because hell...that system is surely an overkill and able to dissipate much more heat.

    Of course they are capable of defining TDP precisely, for professional users or even consumers (as if the latter care), but sure as hell one or two sentences will not suffice.
    Look at any real-world industrial standard specification, for example.
     
  4. Fox2232

    Fox2232 Ancient Guru

    Messages:
    9,770
    Likes Received:
    2,208
    GPU:
    5700XT+AW@240Hz
    I do not expect it to go over 350W at stock. The actual PCB design can handle up to 500W, and that won't matter in any way other than being safe from burning out.

    AMD simply can't pull more than 60W from the PCIe slot, and even that's not wise, since last time people had some mental issues over it. And more than 150W per 8-pin will lead to people going crazy over the standard, even though a half-decent PSU can deliver 300W per 8-pin.
    (Because there will be people who will not even read the values on their PSU, or check professionally done tests of their PSU, and will blindly connect their cheapo PSU to a power-hungry graphics card.)
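    For reference, the in-spec ceiling adds up roughly like this (a sketch; the connector figures are the nominal PCIe CEM limits, and the slot value is the 75W spec total, above the ~60W mentioned above):

    # Nominal PCIe power-delivery limits (spec figures; real boards and PSUs vary).
    SLOT_W = 75        # PCIe x16 slot spec total; staying nearer 60W leaves headroom
    EIGHT_PIN_W = 150  # per 8-pin PEG connector
    SIX_PIN_W = 75     # per 6-pin PEG connector

    def board_power_budget(eight_pins=0, six_pins=0):
        """In-spec board power ceiling for a given connector layout."""
        return SLOT_W + eight_pins * EIGHT_PIN_W + six_pins * SIX_PIN_W

    print(board_power_budget(eight_pins=2))  # 375W for slot + 2x 8-pin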

    I personally feed my GPU with one set of wires going to an 8-pin, and from that 8-pin the wires jump to the next 8-pin connected to the same GPU. No problems, because I checked that this PSU can deliver more.

    And yes, once OCed via vBIOS or some other way of raising the power limit, it will surely eat a lot. But that's everyone's own choice. They can also under-clock and under-volt it a bit and get it to a reasonable power-efficiency ratio.
     

  5. Exascale

    Exascale Banned

    Messages:
    397
    Likes Received:
    8
    GPU:
    Gigabyte G1 1070
    Did you mean to quote the other guy talking about "cooling solutions"?
     
  6. Noisiv

    Noisiv Ancient Guru

    Messages:
    6,657
    Likes Received:
    499
    GPU:
    2070 Super
    No. You brought up Intel's and AMD's TDP definitions, and Intel is talking about "thermal solution design targets".
    I suspect there might be an addendum to that definition buried somewhere in their internally circulated literature.
     
  7. Exascale

    Exascale Banned

    Messages:
    397
    Likes Received:
    8
    GPU:
    Gigabyte G1 1070
    Oh, that was just an excerpt from the Intel whitepaper called "Measuring Processor Power".

    http://semiengineering.com/controlling-heat/ is another good read.
     
    Last edited: Jul 19, 2017
  8. Fox2232

    Fox2232 Ancient Guru

    Messages:
    9,770
    Likes Received:
    2,208
    GPU:
    5700XT+AW@240Hz
    And on top of that, Intel likes SDP, which I hate.
    SDP is a "property" attached to the chip and not to the cooling solution.
    They call it "Scenario Design Power", and it says the chip draws this much power under some scenario defined by Intel. (Which they call the workload of an average user.)

    It is supposedly information for system builders. But in reality, the value they provide is a mere fraction of the chip's maximum power consumption, and therefore a cooling solution sized for that value is badly insufficient.

    In the case of Atoms, they state 2.5W, yet the chip's peak power draw is 10W, and it stays there as long as cooling allows, which is a rare thing in tablets, so heavy throttling ensues.
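    A toy thermal model shows why a cooler sized for the SDP figure ends up throttling (every constant here is an illustrative guess, not real Atom data):

    # First-order lumped thermal model: a passive solution sized for ~2.5W
    # sustained, fed the chip's 10W burst.
    AMBIENT = 25.0   # deg C
    T_MAX = 90.0     # throttle threshold, deg C
    R_TH = 26.0      # K/W: at 2.5W sustained this settles right around 90 C
    C_TH = 20.0      # J/K thermal capacitance (heatsink + chassis mass, guessed)
    DT = 1.0         # simulation step, seconds

    def simulate(burst_w=10.0, throttled_w=2.5, seconds=600):
        temp, draw = AMBIENT, burst_w
        for _ in range(int(seconds / DT)):
            # Heating from power draw minus leakage to ambient through R_TH.
            temp += DT * (draw - (temp - AMBIENT) / R_TH) / C_TH
            if temp >= T_MAX:
                draw = throttled_w  # firmware clamps power once T_MAX is hit
        return round(temp, 1), draw

    print(simulate())  # ends pinned near 90 C at 2.5W, i.e. heavy throttling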
     
  9. sammarbella

    sammarbella Ancient Guru

    Messages:
    3,931
    Likes Received:
    178
    GPU:
    290X Lightning CFX (H2O)
    350W at stock is not exactly a power-saving or power-efficient GPU compared to the Nvidia GTX 1080's 180W TDP :D

    I agree: AMD can't make their GPU dumb-proof...

    Here I did the same, a bit more complex (one 8+8, another 8+8, and a third for 6+6 shared between GPUs), but the idea is the same: common sense and a PSU exceeding requirements by a good margin in watts and amps per rail.


    Who is going to buy the big watercooled RX Vega to underclock and undervolt it to save power?

    AMD drivers already have their own performance disruptors, I mean power-saving features: Chill and WattMan.
     
  10. RzrTrek

    RzrTrek Ancient Guru

    Messages:
    2,354
    Likes Received:
    632
    GPU:
    RX 580 ❤ MESA 19.2
    I couldn't have said it better myself, but if you dare to mention it, you'll be in deep trouble on this forum.
     

  11. airbud7

    airbud7 Ancient Guru

    Messages:
    7,578
    Likes Received:
    4,302
    GPU:
    pny gtx 1060 xlr8
    Is this gonna be another bitcoin miner's dream card?... $$$

    AMD is doing really well thanks to bitcoin... Just saying.
     
  12. Redemption80

    Redemption80 Ancient Guru

    Messages:
    18,348
    Likes Received:
    192
    GPU:
    GALAX 970/ASUS 970
    No one is objective, and I tend to trust the ones who are obvious with their bias.

    His videos rarely provide anything new when it comes to information, but he has grown on me, and he is a lot more critical of AMD than most people.
     
    Last edited: Jul 20, 2017
  13. sverek

    sverek Ancient Guru

    Messages:
    5,454
    Likes Received:
    2,324
    GPU:
    NOVIDIA -0.5GB
    Before the GPU release?

    After the GPU release, yes. We are all looking for accurate numbers and comparing benchmark results. Some people might still ignore the facts.
    Until then, there are no facts. There is only circlejerk or flame war.

    But yeah, I should have applied the right scope to my statement.
     
  14. Exascale

    Exascale Banned

    Messages:
    397
    Likes Received:
    8
    GPU:
    Gigabyte G1 1070
    I was referring to what Redemption80 said mostly.

    But yeah, since no one has these cards yet, it is basically just a flame war. It's kind of funny tbh.
     
  15. Krizby

    Krizby Master Guru

    Messages:
    586
    Likes Received:
    1
    GPU:
    Nvidia Gtx 1080ti
    Man, maybe school is not for you; perhaps all that smoke from your HD 7970 VRM screwed with your brain. From your limited sample of only the 7970 and Fury X you think the 7970 was overheating @80C; all the R9 290X reference users (myself included) would just be laughing our asses off.

    "HD 7970 had 200W TDP cooling at best (hence overheating issues). TDP of Fury X's cooling solution is likely around 400W."

    Yeah, sure, you can keep running your Fury X at 400W power draw and still have good GPU temps while your pump hastens to its death, probably before your "overheating" 7970 dies.

    "TDP belongs to cooling solution as a whole and only to cooling solution. And if cooling solution is more efficient thanks to liquid medium, then it has proportionally higher TDP."

    If you think cooler performance is tied entirely to its TDP, that is way beyond ignorance, man. I mean way beyond "saving the planet" kind of ignorance. Try explaining how 120mm AIO coolers perform similarly to their bigger counterparts at stock CPU settings while lagging behind when the CPU is overclocked?

    https://www.*******.com/arctic-liquid-freezer-120-aio-cpu-cooler-review/5/

    https://www.*******.com/cooler-master-masterliquid-120-aio-cpu-cooler-review/5/

    Yes, once you exceed the dissipation capacity of a cooler, its performance will deteriorate. The MasterLiquid Maker is a prime example of how slapping watercooling onto a tower cooler does not increase its TDP. Ever heard of TEC cooling? A TEC can keep the surface temperature below ambient only if the CPU/GPU stays below the TDP it can handle; once you exceed that, the TEC is just a useless piece of metal. You could slap an exotic phase changer onto a cooler that is meant to dissipate 65W and still achieve nothing besides breaking the phase changer.

    Haha, yeah, so many people feel butthurt when someone just points out the lunacy of AMD's marketing.
     
    Last edited: Jul 20, 2017

  16. Evildead666

    Evildead666 Maha Guru

    Messages:
    1,252
    Likes Received:
    250
    GPU:
    Vega64/EKWB/Noctua
    For those asking, AMD won't release any card that exceeds the PCIe spec.
    It may have a BIOS switch to change it over to an overclocked spec, but that will be up to the card purchaser, and it will probably void the official warranty (AMD's, not the vendor's).
     
  17. haste

    haste Master Guru

    Messages:
    847
    Likes Received:
    199
    GPU:
    GTX 1080 @ 2.1GHz
    Budapest Event:

    Positives:
    - 2 sexy girls sweating in their short shirts

    Negatives:
    - RX VEGA
     
  18. ruggafella

    ruggafella Member

    Messages:
    23
    Likes Received:
    0
    GPU:
    MSI 1080 8G
    Nope. It looks like there's nothing in it to attract miners. It doesn't have an exceptionally good hashrate (a bit better than an RX 480/580), and it has a massive mark-up in power usage and price (tbc) compared to the RX 480/580.

    Not bitcoin, btw. Modern GPUs are crap for bitcoin - these are used for other cryptocurrencies (Eth primarily for Radeon) that are ASIC-resistant.
     
  19. Evildead666

    Evildead666 Maha Guru

    Messages:
    1,252
    Likes Received:
    250
    GPU:
    Vega64/EKWB/Noctua
    I suspect that when the cards become available, the miners will modify the code to take advantage of the HBM, like they did with the Furys.
    There isn't any reason for it not to be a good miner... unfortunately.
     
  20. sammarbella

    sammarbella Ancient Guru

    Messages:
    3,931
    Likes Received:
    178
    GPU:
    290X Lightning CFX (H2O)
    Some new details about the AMD showcase:

    http://www.pcgamer.com/amd-takes-radeon-rx-vega-on-tour-and-compares-it-against-a-geforce-gtx-1080/
     
