AMD publishes Radeon VII benchmark results from 26 games

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jan 11, 2019.

  1. Lino69

    Lino69 Guest

    Messages:
    2
    Likes Received:
    2
    GPU:
    RX 570
The 16 GB of HBM2 memory is very much needed for future-proofing and upcoming games.

RE2 already bottlenecks with only 12 GB, and memory requirements are only going to increase from here.

12 GB is clearly not enough anymore, and with the new next-gen consoles on the horizon I can only imagine how much more memory newer games will need.
     
  2. Maddness

    Maddness Ancient Guru

    Messages:
    2,440
    Likes Received:
    1,738
    GPU:
    3080 Aorus Xtreme
Going on what AMD have said, it should be right around that figure. Still seems like a bit of overkill for 1080p unless you have a 144 Hz monitor.
     
  3. Killian38

    Killian38 Guest

    Messages:
    312
    Likes Received:
    88
    GPU:
    1060
Mining is what killed it for me. For well over a year I couldn't buy a GPU because I refused to pay what they were asking. And I still refuse. Nvidia still has its head in the clouds. I hope AMD gets this right, and I think AMD understands what's going on here: how the card compares to Nvidia isn't as important as how the price compares. AMD gets it, I think. One way or another I'll buy a new AMD card this year, new or old tech.

As for cards needing a ton of memory to play game X at setting Y: that's not the whole story. Too many folks get caught up in the in-game benchmark telling them they don't have enough memory, when in fact their card is just fine. Within reason, I never cared about how much memory they tack on.
     
  4. Maddness

    Maddness Ancient Guru

    Messages:
    2,440
    Likes Received:
    1,738
    GPU:
    3080 Aorus Xtreme
Normally I would totally agree, but certainly not at the price of this card. If it launched at $500 I would agree.
     

  5. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,097
    Likes Received:
    2,603
    GPU:
    3080TI iChill Black
I'm going to try that now on 6 GB of VRAM :p


Btw RE7 already used all available VRAM, but that's mostly because of its extra streaming buffer caching. I doubt this one actually needs that much either :)
     
  6. Aura89

    Aura89 Ancient Guru

    Messages:
    8,413
    Likes Received:
    1,483
    GPU:
    -
That's not how being CPU-bound works.

The higher the resolution, the less the CPU matters. So the higher the resolution, the more accurate the measured performance differences between GPUs will be, unless something else becomes the bottleneck, such as VRAM.

    Your logic is backwards.
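The resolution argument above can be sketched with a toy frame-time model: a frame takes roughly as long as the slower of the CPU and GPU work, and GPU work scales with pixel count. All numbers here are illustrative, not measurements from any real card.

```python
# Toy model: frame time ~ max(CPU time, GPU time); GPU time scales with
# pixel count relative to 1080p. All numbers below are illustrative.

def frame_time_ms(cpu_ms, gpu_ms_at_1080p, width, height):
    pixels_1080p = 1920 * 1080
    gpu_ms = gpu_ms_at_1080p * (width * height) / pixels_1080p
    return max(cpu_ms, gpu_ms)

# Compare two hypothetical GPUs, one 20% faster, behind the same CPU:
for width, height, label in [(1920, 1080, "1080p"), (2560, 1440, "1440p"), (3840, 2160, "4K")]:
    slower = frame_time_ms(9.0, 10.0, width, height)
    faster = frame_time_ms(9.0, 10.0 / 1.2, width, height)
    print(f"{label}: measured speedup {slower / faster:.2f}x")
# At 1080p the CPU masks part of the gap (1.11x); at 1440p and 4K the
# full 1.20x GPU difference shows through.
```

This is why low-resolution results understate the gap between GPUs: the CPU floor hides it.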
     
  7. Undying

    Undying Ancient Guru

    Messages:
    25,343
    Likes Received:
    12,754
    GPU:
    XFX RX6800XT 16GB
They probably already knew how much VRAM these games use, since the card is bundled with DMC5, D2 and RE2.

16 GB sounds sweet, and for anyone who thinks otherwise there are always the Nvidia cards with half the VRAM for the same price. :p
     
  8. Exodite

    Exodite Guest

    Messages:
    2,087
    Likes Received:
    276
    GPU:
    Sapphire Vega 56
    Perhaps, though that bundle isn't new with Radeon 7 as I got those exact games bundled with my Vega 56 that I bought after Black Friday last year.

    Just saying that I don't think they were cherry-picked on VRAM usage, though the bundling does kinda explain AMD showcasing and testing those games early on. :)
     
    HARDRESET and Undying like this.
  9. Undying

    Undying Ancient Guru

    Messages:
    25,343
    Likes Received:
    12,754
    GPU:
    XFX RX6800XT 16GB
Yeah, I forgot they are also bundled with the Vega cards, but like you said they could have tested the games early and known.
DMC5 runs on the same engine as RE2, so we can probably expect similar VRAM requirements.
     
    HARDRESET likes this.
  10. Maddness

    Maddness Ancient Guru

    Messages:
    2,440
    Likes Received:
    1,738
    GPU:
    3080 Aorus Xtreme
I doubt these games need anywhere near that much VRAM to run smoothly. If they did, it would hurt 99% of the AMD GPUs out there. By the time games require that much to run well, this card will be well and truly outdated. The 16 GB is for content creation, and that is where this card makes more sense.
     

  11. BuildeR2

    BuildeR2 Ancient Guru

    Messages:
    3,208
    Likes Received:
    437
    GPU:
    ASUS 4090 TUF OG OC
    STRANGE BRIGADE VOLCANO!


    Ahem, on topic if I didn't have my 1080Ti FTW3 I would totally buy this GPU and pair it with a nice large Freesync screen. I like the fact that they finally have a good stock cooler. I haven't purchased a blower card since my X850XT AGP back in the day.

    Lastly, the 7nm refreshed 570 could be quite an interesting 1080p low power option before Navi arrives. All in all, good times ahead for the tech industry this year!
     
  12. Undying

    Undying Ancient Guru

    Messages:
    25,343
    Likes Received:
    12,754
    GPU:
    XFX RX6800XT 16GB
Most of us already have 8 GB of VRAM at our disposal. AMD cards have always had more VRAM than their Nvidia counterparts, and the trend continues.
It's not our fault Nvidia is selling 6 GB GPUs for $350.
I can definitely see the higher-end Navi GPUs also having more than 8 GB later this year.
     
  13. HARDRESET

    HARDRESET Master Guru

    Messages:
    890
    Likes Received:
    417
    GPU:
    4090 ZOTAEA /1080Ti
Yup, in the settings you can specify over 12 GB of VRAM, but when actually playing it doesn't use anywhere close to that. Not attractive at all.
     
  14. Undying

    Undying Ancient Guru

    Messages:
    25,343
    Likes Received:
    12,754
    GPU:
    XFX RX6800XT 16GB
  15. HeavyHemi

    HeavyHemi Guest

    Messages:
    6,952
    Likes Received:
    960
    GPU:
    GTX1080Ti
The GPU is no more powerful than a 1080 Ti and will not take advantage of the extra VRAM to increase performance over either the 1080 Ti or the 2080. The end.
     
    Xtreme1979 and Maddness like this.

  16. Maddness

    Maddness Ancient Guru

    Messages:
    2,440
    Likes Received:
    1,738
    GPU:
    3080 Aorus Xtreme
That's what I'm saying. 99% of AMD GPUs have 8 GB or less of VRAM. I should have made myself more clear; I wasn't talking about system RAM. My bad.
     
    Undying likes this.
  17. Maddness

    Maddness Ancient Guru

    Messages:
    2,440
    Likes Received:
    1,738
    GPU:
    3080 Aorus Xtreme
    I agree. I know it looks like that was the minimum they could have used. But it is total overkill for gaming. Now content creation is a different story.
     
  18. moo100times

    moo100times Master Guru

    Messages:
    566
    Likes Received:
    323
    GPU:
    295x2 @ stock
OK, I may be late to this party, but honestly, here in the UK the RTX 2070 is 450-600 GBP, and the lower-priced ones are out of stock. The RTX 2080 is 800 GBP+.
The RTX 2070 is already in the ballpark of 700 USD. Certainly here, I cannot see anyone picking an RTX 2070 over the Radeon 7 tested above: a faster GPU with double the VRAM, on drivers that aren't even finalised (Adrenalin 18.5). I want to see real-life benchmarks on a less limited rig, but with these numbers and relative costs, there does not look to be much debate to me.
They are offering something in the region of top tier for the price of a rival's middle tier, and it will be useful for both gaming and content creation (and correct me if I am wrong, but isn't Nvidia usually fickle in this domain, wanting creators to buy Quadro over GTX/RTX for more $$$?).
As long as the Radeon 7 does not catch fire, it looks like a bargain to me.

As for the people complaining about two-year-old performance: gains are gains. If you are unhappy with this level of performance, feel free to spend double for a card that does 20% better, with features no one has implemented in their games yet. I would also point out a possibility no one has discussed: CrossFire (or whatever new efficient bridge pairs two GPUs). You can get two of these cards for the price of one 2080 Ti. I reckon Nvidia knows this, which is why they removed the option from their own 2070 and other non-top-tier cards, to avoid cannibalising their own sales.
Whilst I appreciate this isn't everyone's cup of tea, if more native dual-GPU support is coming in DX12 and Vulkan, you could have a competent rival with more power and resources available for the same money. Hell, I would love them to make a dual-GPU card, especially if it comes with 2 x 16 GB of VRAM.
     
    carnivore likes this.
  19. Astyanax

    Astyanax Ancient Guru

    Messages:
    17,011
    Likes Received:
    7,353
    GPU:
    GTX 1080ti
Yes

and no.

If your resolution change also incurs an aspect-ratio change, CPU requirements scale up non-linearly as well.
     
  20. chrislondon

    chrislondon Member

    Messages:
    27
    Likes Received:
    9
    GPU:
    RTX 2080 Ti
1. The 2070 is in stock at GBP 459 at multiple large retailers.
2. Multiple large companies have the MSI RTX 2080 Ventus for GBP 649, and one has EVGA's Black for 650.
3. You haven't seen the actual price of the Radeon VII yet. The Vega 64 was supposed to cost GBP 450 at launch, but apart from a few hundred that sold out on pre-order within a couple of hours, it didn't for a very long time. So I would wait and see before making strong statements. (There is no mining craze now, so maybe this time it will be different.)

    I am generally not excited about the card.
1. Two-year-old performance at a similar price. People who buy the card now for $699 could have bought a 1080 Ti two years ago and enjoyed the higher performance all along.
2. AMD charges 50% more for 30% extra performance, which is pretty similar to GTX 1080 Ti vs RTX 2080 Ti. Just sayin'.
3. Plus most of that performance increase comes from a die shrink anyway; nVidia managed the same uplift while staying on (roughly) the same manufacturing process (12 nm is not that much better than 16 nm, as the Ryzen 2xxx series showed). When nVidia moves to 7 nm later this year or in early 2020 they will get a "free" 30% boost too, and AMD's top card will be competing with nVidia's third-best again.

An 8 GB version at $499 could have been great. This, not so much, as it sits on the same price/performance curve as the nVidia cards. I don't even think nVidia will have to respond (though they might, as they certainly have the room to do it).
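The "50% more for 30% extra performance" comparison is just a dollars-per-unit-of-performance ratio. A quick sketch with the rough figures from this thread: the $699 price is from the thread, while the $499 baseline and the 1.30x performance factor are the poster's approximations, not benchmarks.

```python
# Price per unit of relative performance. $699 is the thread's Radeon VII
# price; the $499 baseline card and 1.30x factor are rough approximations.

def price_per_perf(price_usd, relative_perf):
    return price_usd / relative_perf

baseline = price_per_perf(499, 1.00)   # hypothetical $499 card as the 1.00x baseline
radeon7 = price_per_perf(699, 1.30)    # ~50% pricier, ~30% faster
print(f"baseline:   ${baseline:.0f} per performance unit")
print(f"Radeon VII: ${radeon7:.0f} per performance unit")
# 699 / 1.30 ≈ 538, i.e. more dollars per performance unit than the baseline.
```

In other words, under these assumed numbers the new card costs more per frame than the older one, which is the point being made about the price/performance curve.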
     
