Radeon R9 380X with HBM high-bandwidth memory

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jan 13, 2015.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    48,388
    Likes Received:
    18,558
    GPU:
    AMD | NVIDIA
    More and more rumors are surfacing about the Radeon R9 380X using HBM memory, this round from résumés on LinkedIn. And yeah, we now know the name as well: R9 380X, as listed on the LinkedIn profile of Ilana Shternshain,...

    Radeon R9 380X with HBM high-bandwidth memory
     
  2. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,125
    Likes Received:
    969
    GPU:
    Inno3D RTX 3090
    It's 300W because everybody is stuck on 28nm and AMD apparently won't compromise on compute performance *cough* Maxwell *cough*
     
  3. And the tale goes on and on and on...


    And speaking of lyrics... AMD users could also use Up The Irons slogan to...
     
  4. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    15,693
    Likes Received:
    9,572
    GPU:
    4090@H2O
    Not really impressed with the 300W... could run 2 980s with that kind of power, couldn't I? I'm pretty curious what that HBM can do though.
     

  5. Undying

    Undying Ancient Guru

    Messages:
    25,332
    Likes Received:
    12,743
    GPU:
    XFX RX6800XT 16GB
    Meanwhile at Nvidia headquarters, 128-bit cards are ready to be released. :D

    AMD is serious about 4k gaming.
     
  6. Lane

    Lane Guest

    Messages:
    6,361
    Likes Received:
    3
    GPU:
    2x HD7970 - EK Waterblock
    No you can't.. a stock 980 has a TDP of 181W (the BIOS max power limit), but most OC models have a boost TDP limit of 230W... (the magic number of 163W is just an average calculated by Nvidia).

    In fact, the 980 has basically the same TDP as the GTX 680 (which is already incredible when you see the performance gain on the same process).

    That said, most GPUs are designed to run at 300W (the limit set); it concerns the electrical board power and cooling-capability warranty, not the power actually used...

    Based on this single source, it's a bit hard to know whether the GPU really draws that much power or the figure is based on something else. (Anyway, if it's still 28nm with a full DP setup / scalar units etc., no miracle, I believe)...
     
    Last edited: Jan 13, 2015
  7. Battlefieldprin

    Battlefieldprin Guest

    Messages:
    146
    Likes Received:
    2
    GPU:
    ASUS 780TI DC 2 OC 3GB
    Well, what are companies making 1200W and 1500W PSUs for, then? If the technology and performance are worth it, why not, even if it consumes 300W? There is, however, the issue of heat, which I myself can't tolerate.
     
  8. Goldie

    Goldie Guest

    Messages:
    533
    Likes Received:
    0
    GPU:
    evga 760 4gb sli
    No answer to Maxwell's lightweight power consumption.
    Hope it's out soon and that it sells well, though.
     
  9. shymi

    shymi Guest

    Messages:
    152
    Likes Received:
    2
    GPU:
    EVGA RTX 3070
    Another day - another rumor...
     
  10. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,097
    Likes Received:
    2,603
    GPU:
    3080TI iChill Black
    Who gives a shizle about power consumption? This 3D memory will lower heat output a lot.
    I'm just curious if it will deliver like in that leak..



    The 980 GTX is also "power friendly", but in the end it's what, 15-30W less than a 780 GTX, wow.. :p

     

  11. Denial

    Denial Ancient Guru

    Messages:
    14,206
    Likes Received:
    4,118
    GPU:
    EVGA RTX 3080
    I agree for the most part that as long as it's 300W or lower I don't really care, but 3D memory isn't going to drastically reduce heat/power.

    Also, it's 30W less than a 780 GTX but as fast as a 780 Ti, and that's without a die shrink. It's impressive, especially since that's under load -- it gets even better efficiency when driving miscellaneous tasks, which is an obvious byproduct of their work in mobile computing.

    I think Nvidia should have put a larger bus on the 980, but a 960 isn't going to run 4K even if it were 10-billion-bit.
     
    Last edited: Jan 13, 2015
  12. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,097
    Likes Received:
    2,603
    GPU:
    3080TI iChill Black
    I personally never cared about lower power consumption.. it's just a marketing thing to make people go "wow, it's more efficient" bla bla. Same with the 580 GTX vs the 680 GTX..

    Or the same thing now with this AMD chip; still, it's a new type of memory, and up to 50% lower memory power will reduce heat for sure.


    Just like the 980 GTX runs cooler now -- less power.. but obviously not as extreme as 3D memory. IMO the up-to-50% difference between GDDR5 and 3D memory is something else.

     
    Last edited: Jan 13, 2015
  13. VultureX

    VultureX Banned

    Messages:
    2,577
    Likes Received:
    0
    GPU:
    MSI GTX970 SLI
    Yes, memory bandwidth really is a downside of nVidia products with the 9 series. I'd like to see a 512-bit 8GB version of their cards.
     
  14. Denial

    Denial Ancient Guru

    Messages:
    14,206
    Likes Received:
    4,118
    GPU:
    EVGA RTX 3080
    It lowers the power consumption of memory. Which isn't a significant source of heat/power on a video card in the first place.

    Like if you take a card that hits 88c with GDDR5 and toss some HBM memory in, that card isn't going to magically go down to 60c. Or use like 100w less of power. I mean it's a nice gain but the way you worded your statement is like it's going to magically decrease heat across the entire chip or something.
     
  15. GhostXL

    GhostXL Guest

    Messages:
    6,081
    Likes Received:
    54
    GPU:
    PNY EPIC-X RTX 4090
    All rumors, I agree. I'm not really paying much attention to current cards being released as much as I usually do. Main reason is I've yet to find any game at 1440p maxed out that one 980 OC'd can't handle.

    I know AMD is trying to push 4K.

    I've seen 4K, upscaled 4K etc. It does not look that much better than 1440p. Very little difference in clarity. Sure it's there, but it just does not justify spending the money.
     
    Last edited: Jan 13, 2015

  16. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,097
    Likes Received:
    2,603
    GPU:
    3080TI iChill Black
    Yes it is, why does the 290X run so hot then? The 512-bit bus is taking its toll..

    They even had to downclock the memory to 5GHz or it would heat up & consume too much power, and it was still hot as hell unless you put a custom cooler on it.
     
    Last edited: Jan 13, 2015
  17. GhostXL

    GhostXL Guest

    Messages:
    6,081
    Likes Received:
    54
    GPU:
    PNY EPIC-X RTX 4090
    Yeah and that 512bit bus does not shine as it should due to architecture. It's why I'm very happy with the GTX 980.
     
  18. dean469

    dean469 Member Guru

    Messages:
    127
    Likes Received:
    12
    GPU:
    XFX RX480GTR @1385
    I don't really understand all the complaints about power usage lately. Sure if you're running a server farm or something. But private home users?

    The average price of a kilowatt-hour in the U.S. is about 12 cents. If you game an average of 4 hours a day, 5 days a week, at 100 watts extra, that's 2 kWh extra a week -- about 13 extra dollars a YEAR.

    For home users, I just don't see the big deal.
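    The back-of-the-envelope math above checks out; here's a minimal sketch of it (the 12-cent rate, 4x5 gaming hours, and 100W extra draw are the post's assumed figures, not measurements):

```python
# Annual cost of an extra 100 W of GPU draw, using the post's assumptions.
RATE_PER_KWH = 0.12      # assumed U.S. average, USD per kWh
EXTRA_WATTS = 100        # assumed extra draw of the hungrier card
HOURS_PER_WEEK = 4 * 5   # 4 hours a day, 5 days a week

extra_kwh_per_week = EXTRA_WATTS * HOURS_PER_WEEK / 1000  # 2.0 kWh
annual_cost = extra_kwh_per_week * 52 * RATE_PER_KWH      # ~$12.48

print(f"{extra_kwh_per_week} kWh/week, ${annual_cost:.2f}/year")
```

    Rounding $12.48 up gives the "like 13 extra dollars a YEAR" in the post.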
     
  19. Denial

    Denial Ancient Guru

    Messages:
    14,206
    Likes Received:
    4,118
    GPU:
    EVGA RTX 3080
    Because you design your architecture, then put a bus and memory on it. If the architecture uses 200W, you only have 100W left for everything else to hit your 300W target. (Obviously heavily simplified.)

    300W is 300W -- it doesn't matter what is generating it, the heat will be the same. I'd actually argue that a 300W card with HBM would show higher core temps, since a larger share of that 300W comes off the core rather than the outer extremities where the memory resides. But idk, if AMD improves their cooler, temps will drop. Regardless, if the card is a 300W card, it doesn't matter how much they lower memory power consumption: it's still 300W of heat.

    It's not a big deal, honestly. But it does give you some insight into how performance could scale. Nvidia could easily create a 300W Maxwell variant roughly ~30% faster than current cards without making a single optimization. For the most part I agree though: as long as it's 300W or under, I couldn't care less.
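    The budget argument above can be sketched in a few lines (all numbers are the post's simplified examples, not real card specs):

```python
# With a fixed board power limit, the core's draw caps what's left for
# memory and everything else, and cutting the memory's share just shifts
# more of the same total heat onto the core.
BOARD_LIMIT_W = 300
core_w = 200                     # hypothetical architecture draw
rest_w = BOARD_LIMIT_W - core_w  # left for memory, VRM, fans, ...

gddr5_w, hbm_w = 30, 15          # illustrative memory power figures
core_fraction_gddr5 = (BOARD_LIMIT_W - gddr5_w) / BOARD_LIMIT_W
core_fraction_hbm = (BOARD_LIMIT_W - hbm_w) / BOARD_LIMIT_W

print(rest_w)                                    # 100
print(core_fraction_hbm > core_fraction_gddr5)   # True
```

    Same 300W of total heat either way; with HBM a larger fraction of it comes off the core, which is the post's point about core temps.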
     
    Last edited: Jan 13, 2015
  20. pbvider

    pbvider Guest

    Messages:
    989
    Likes Received:
    0
    GPU:
    GTX
    What comes with a high TDP? Oh, that's right. Heat!
     
