Radeon RX Vega to Compete with GTX 1080 Ti and Titan Xp

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Apr 26, 2017.

  1. Clouseau

    Clouseau Ancient Guru

    Messages:
    2,607
    Likes Received:
    353
    GPU:
    ASUS STRIX GTX 1080
    It's...POOF...
    Where'd it go?

    ...squirrel...
     
  2. Silva

    Silva Maha Guru

    Messages:
    1,291
    Likes Received:
    481
    GPU:
    Asus RX560 4G
    This made me laugh.

Not going to say you're wrong, but what Nvidia did recently with the 1080 Ti and Titan Xp is proof of a very greedy company.

We've seen videos of it having close to the same performance as the 1080 months ago... I don't see why not.

    ^^This^^
     
    Last edited: Apr 26, 2017
  3. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,125
    Likes Received:
    2,960
    GPU:
    5700XT+AW@240Hz
Actually, Intel is not as guilty as they look. Their CPUs are brutally priced above manufacturing cost,
but they pump most of that margin into other divisions/projects.

Some time ago there was a discussion about it and I did the full calculation. Intel simply can't afford to bring CPU prices down much, as they would have to make cuts elsewhere.

As far as nVidia's pricing goes, they do what any company does when they are alone in a certain market segment. (While Intel did exactly the same thing, nVidia reinvests those additional funds in the same segment that earns them.)
The situation would be much different if the RX 480 clocked around the same as the GTX 1070/1080. But first-generation 14nm simply proved to be worse than 16nm, and AMD still did not have power gating on nVidia's level, so their GPUs waste more energy and therefore can't clock as high, since that waste increases the chance that a transistor ends up in an uncertain state.
     
  4. darkcoder

    darkcoder Member

    Messages:
    13
    Likes Received:
    0
    GPU:
    MSI RX 480 Gaming X 8GB
First, if people look back at the older Vega tech demos, it was already beating the GTX 1080. The real question since the 1080 Ti came out is how good Vega will be compared to that, and more importantly, at what price.

But now, looking at this news objectively, I don't see any link to the famous Q&A article. And I guess the Internet rule is: if there's no proof (where's the damn link), then it's lies!
     
    Last edited: Apr 27, 2017

  5. GALTARAUJO

    GALTARAUJO Active Member

    Messages:
    54
    Likes Received:
    0
    GPU:
    2 x GTX980 Strix
    It seems Economics 101 could come in handy.

But per Thomas Aquinas, Saint Augustine and a few other people, EVERYBODY is greedy. Everybody. This is a fact of life, not a problem with capitalism, much less with the GPU/CPU market anno 2017.
    Therefore, all companies will maximize their profits, which means milking their customers as much as they can. If there is serious competition, then there are limits to how much they can milk us, because our pockets - or our tits - have limits as well: we have to share our budget - or our milk - with everybody.

Intel and nVidia are just maximizing their profits, the way they should be. Of course prices would be lower if there were serious competition, but that just hasn't been the case for close to 10 years.

    Next subject, please.
     
  6. wavetrex

    wavetrex Maha Guru

    Messages:
    1,297
    Likes Received:
    891
    GPU:
    Zotac GTX1080 AMP!
    It's funny that so many people today say "this is sooo expensive, that is soo overpriced, companies are milking us"

    Yet we are getting these multi-teraflop supercomputers in a 2x2cm tiny square, with absolutely amazing graphics in photo-realistic games on Ultra-HD resolutions that were unthinkable just a few years back.

I started my "accelerated" 3D gaming with an "S3 Trio 3D" on AGP... which basically displayed a slideshow of flat-shaded triangles and called it a game...
And I paid a $hitload of ca$h for something that looks like this:
[IMG]

    AND I LOVED IT. It was my amazing gaming VGA card, and it could do this amazing 3D thing in a world of Mario-like sidescroller games.

    Please, stop it with the milking... seriously. E'nuff is e'nuff !
     
  7. HeavyHemi

    HeavyHemi Ancient Guru

    Messages:
    6,963
    Likes Received:
    963
    GPU:
    GTX1080Ti

That's great, but your *if* is exactly my point. They didn't release, and thus missed that window. And not just since the release of the 1080 Ti, but since the release of Pascal. You're almost repeating Vase's argument, which is faulty for reasons already explained.
     
  8. XenthorX

    XenthorX Ancient Guru

    Messages:
    3,469
    Likes Received:
    1,396
    GPU:
    3090 Gaming X Trio
As much as I like Guru3D, an article on such an obscure quote... I'm used to better than this. "Could" might mean raw performance, could mean performance per dollar, could mean anything.
     
  9. tensai28

    tensai28 Maha Guru

    Messages:
    1,416
    Likes Received:
    365
    GPU:
    2080ti MSI X TRIO
Yawn. As usual with AMD: I'll believe it when I see it. Hopefully that will be next month when I go to Computex.
     
  10. sverek

    sverek Ancient Guru

    Messages:
    6,097
    Likes Received:
    2,953
    GPU:
    NOVIDIA -0.5GB
If AMD is being hysterical and wants to be picked up by the media, it will be. Go outside and shout; someone might notice you.

It's not really news or anything; it's AMD sending a message. The media picks it up and shows it. Whether it's useful for you or not depends on your critical mind and vision.

I'm not really sure about AMD's communication lately, but whatever...

    #BETTERRED #DEATHTONOVIDIA #LULNOVIDIA #AMD4LIFE ... edit: wait I saw this somewhere...

[IMG]
     
    Last edited: Apr 27, 2017

  11. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    7,456
    Likes Received:
    482
    GPU:
    Sapphire 7970 Quadrobake
It's not delayed. We've known it's supposed to be out in 1H 2017 since at least June 2016 and the AMD investor calls.

This is just wishful thinking from AMD fans until Volta comes out. Truth be told, it seems NVIDIA tackled too many things at the same time with it, and unlike Vega it is quite delayed at this point, but we'll see.

Doesn't the Vega memory controller make this point moot? I mean, they have already experimented with mixed storage in their professional line, and one of the very few concrete things we know about Vega's controller is that it can work with anything, even network storage, as long as it has some fast local cache. I kind of think the reason Vega is taking its time is proper clock modulation and drivers, seeing how much importance they've put on the Linux driver this time, and they're introducing a whole new driver platform with it.

There is no way this will happen with either the Ti or Vega. When people say 4K 60 fps, they don't really mean 60 fps gameplay. 60 fps gameplay means at most 16.67 ms per frame, evenly paced. By that criterion, these cards are really 1440p60 hardware.
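The frame-pacing point above is easy to sanity-check. A minimal sketch of my own (the function names and sample frame times are illustrative, not from any benchmark): an "average 60 fps" run can still blow the 16.67 ms per-frame budget.

```python
def frame_budget_ms(fps: float) -> float:
    """Per-frame time budget in milliseconds for a target framerate."""
    return 1000.0 / fps

def sustains_target(frame_times_ms, target_fps: float) -> bool:
    """True only if EVERY individual frame fits the budget,
    which is what 'real' 60 fps gameplay requires."""
    budget = frame_budget_ms(target_fps)
    return all(t <= budget for t in frame_times_ms)

budget = frame_budget_ms(60)           # ~16.67 ms per frame
stutter = [10, 10, 30, 10, 10, 30]     # averages ~60 fps, but spikes to 30 ms
print(round(budget, 2))                # 16.67
print(sustains_target(stutter, 60))    # False: the average hides the spikes
```

Average fps alone hides those 30 ms frames, which is exactly why "4K60" marketing claims and evenly paced 60 fps gameplay are different things.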

I'm not sure that HBM2 is that much more expensive than GDDR5X, especially for AMD, since they hold most of the HBM patents. Vega looks like a very nice compute GPU, and AMD has had a tradition of great compute performance since GCN 1.0. I believe that along with their new memory controller, they'll do great in the AI market. They are behind in software, but GPUOpen is already paying off and they are focusing a lot of people on AI.

I would argue that the tiled rasterization, the polygon culling stuff, and the new command processor and memory controller aren't really power-saving features. The memory controller and the command processor are the most intriguing parts of Vega to me, just because of the potential. A Fury X with 8GB of VRAM at 1.5GHz would also demolish the 1080, and probably reach 1080 Ti levels, so Vega competing with them isn't out of the question at all. I'd just rather hold a smaller basket, especially for the initial release benchmarks, where AMD will disappoint, as is tradition.

Pretty much this. If it's within ~5-10% of the Ti initially, with a $600 price tag, I'm hitting it just for the price/performance. I honestly see NVIDIA's DX11 lead eroding and mattering less and less, though.
     
  12. angelgraves13

    angelgraves13 Ancient Guru

    Messages:
    2,181
    Likes Received:
    636
    GPU:
    RTX 2080 Ti FE
It'll likely be $499.
     
  13. GeniusPr0

    GeniusPr0 Maha Guru

    Messages:
    1,249
    Likes Received:
    15
    GPU:
    RTX 2080
Most likely only at ultra-high res. The architecture is interesting; they're claiming the HBM cache boosted performance in some areas in DE:MD.
     
  14. Ryu5uzaku

    Ryu5uzaku Ancient Guru

    Messages:
    6,972
    Likes Received:
    204
    GPU:
    980
If this delivers performance pretty comparable to the 1080 Ti, I'm totally buying it. I've been waiting to decide which one to buy. I'd actually rather use AMD's drivers, and GeForce Experience seems horrible with its added OSD.
     
  15. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    11,866
    Likes Received:
    3,885
    GPU:
    2080Ti @h2o

Most sensible post in here so far. QFT.
As for Vega, I'll believe it when I see it. Don't fall for the marketing BS... real numbers are what matter. Also, no cherry-picked benchmarks in AotS, please, AMD. Thank you.
     

  16. Loophole35

    Loophole35 Ancient Guru

    Messages:
    9,759
    Likes Received:
    1,119
    GPU:
    EVGA 1080ti SC
The thing is, if it's at that performance level and $600, the updated 1080s are better $/FPS than Vega will be. The 1080+ is around 15% slower than the reference TX and Ti, and I'm seeing them for about $510.
     
  17. Rich_Guy

    Rich_Guy Ancient Guru

    Messages:
    12,581
    Likes Received:
    601
    GPU:
    MSI 2070S X-Trio
  18. GeniusPr0

    GeniusPr0 Maha Guru

    Messages:
    1,249
    Likes Received:
    15
    GPU:
    RTX 2080
AMD said 50% more performance just from the cache; I don't really understand how this card is going to work.

I think it's going to be all over the map, and since the PS4 and Xbox Scorpio are AMD, the future bodes well.

Not worth speculating, IMO.
     
  19. JonasBeckman

    JonasBeckman Ancient Guru

    Messages:
    17,142
    Likes Received:
    2,628
    GPU:
    AMD S. 5700XT Pulse
That was only in a specific demo of Mankind Divided, though, from what I can find: a "50%" increase in average and a "100%" increase in minimum framerate. I can't find what the actual framerate was, so it might not be all that impressive; the game is pretty demanding on the current Fury GPUs even without going 2560x1440+ on the display resolution. Percentage-wise it's still a pretty good gain, but it was a specific demo for showing off Vega's HBM cache thing, so I doubt every game is going to see such results.
(It was probably with DX12 too, and might have had some game-specific tweaks as well.)
     
  20. rl66

    rl66 Ancient Guru

    Messages:
    2,615
    Likes Received:
    247
    GPU:
    Sapphire RX 580X SE
Because AMD has done some hype in the past:

-GPUs with GDDR5 for the benchmarks... but only DDR3 in shops

-OC-BIOS GPUs disguised as standard GPUs (not the only one to do it, I agree, but more than a habit for AMD)

-the full GPU sent to the press while the cut-down one is the only one in shops (more common in Asia, that trick)

-and of course demos with non-commercial mods that flatter the result.

For all of that, and even though AMD is way better than 5 years ago... I will wait for the review from HH :)
     
