Review: AMD Radeon VII (16GB)

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Feb 7, 2019.

  1. Loophole35

    Loophole35 Ancient Guru

    Messages:
    9,788
    Likes Received:
    1,139
    GPU:
    EVGA 1080ti SC
AMD has the card with the most VRAM, so this lie starts up again.
     
    yasamoka and alanm like this.
  2. leszy

    leszy Master Guru

    Messages:
    325
    Likes Received:
    17
    GPU:
    Sapphire V64 LC
Do you think Nvidia's marketing lied when they promoted the 1080 Ti and Titan cards?
     
  3. Denial

    Denial Ancient Guru

    Messages:
    13,235
    Likes Received:
    2,725
    GPU:
    EVGA RTX 3080
The RVII loses to the 2080 in FFXV at 4K in both average fps and the 99th percentile, and even then it's like 40fps on average. I also don't believe it actually uses that much RAM, or needs to. The textures in that game are average at best; it's probably just one of those games that caches everything until VRAM is filled. I also like how you state RT/DLSS will only be used in 10 per 1000 games, but that same logic somehow doesn't apply to the number of games that will exceed 8GB of VRAM while still offering enjoyable framerates.
     
  4. -Tj-

    -Tj- Ancient Guru

    Messages:
    17,133
    Likes Received:
    1,903
    GPU:
    Zotac GTX980Ti OC
FFXV is a bad benchmark, full of flaws and very Nvidia-biased.
     
    moo100times, HandR and fantaskarsef like this.

  5. alanm

    alanm Ancient Guru

    Messages:
    10,014
    Likes Received:
    2,179
    GPU:
    Asus 2080 Dual OC
Not sure why many fail to understand "uses" more VRAM vs. "needs" more VRAM, in whatever scenario or game applies.
True, but the point still stands re VRAM usage vs. need.
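The usage-vs-need distinction comes down to caching behavior: many engines keep old assets resident as long as there is room, evicting only when the budget is hit. A toy sketch (all names invented, not any real engine's API) of why a 16GB card "uses" 16GB while each frame only needs a few hundred MB:

```python
from collections import OrderedDict

class TextureCache:
    """Toy LRU texture cache that opportunistically fills its VRAM budget."""

    def __init__(self, budget_mb):
        self.budget_mb = budget_mb
        self.resident = OrderedDict()   # texture id -> size in MB, LRU order
        self.used_mb = 0

    def touch(self, tex_id, size_mb):
        """Mark a texture as used this frame, loading it if not resident."""
        if tex_id in self.resident:
            self.resident.move_to_end(tex_id)   # refresh LRU position
            return
        # Evict least-recently-used textures only once the budget is hit.
        while self.used_mb + size_mb > self.budget_mb and self.resident:
            _, evicted_mb = self.resident.popitem(last=False)
            self.used_mb -= evicted_mb
        self.resident[tex_id] = size_mb
        self.used_mb += size_mb

cache = TextureCache(budget_mb=16000)       # pretend 16GB card
for frame in range(100):                    # stream through a level
    for tex in range(frame * 8, frame * 8 + 16):
        cache.touch(tex, size_mb=25)        # 16 textures of 25MB per frame

# Per-frame working set is only 16 * 25 = 400MB, yet the reported
# "usage" has climbed to the full 16000MB budget.
print(cache.used_mb)
```

A monitoring tool that reads the allocation counter sees the full budget, not the 400MB working set, which is why "VRAM used" in an overlay says little about what the game actually needs.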
     
    yasamoka, sverek and BlackZero like this.
  6. Denial

    Denial Ancient Guru

    Messages:
    13,235
    Likes Received:
    2,725
    GPU:
    EVGA RTX 3080
    I agree but I'm also not the one using it as an example of 16GB of HBM being necessary.
     
    yasamoka and BlackZero like this.
  7. -Tj-

    -Tj- Ancient Guru

    Messages:
    17,133
    Likes Received:
    1,903
    GPU:
    Zotac GTX980Ti OC
Ah ok, I didn't read that. Well, IMO 16GB won't be a thing for a long time. Even the RE2 remake is fine with just 6GB of VRAM at something like 1620p.

Same with the newer CODs, and idk, I saw a few more that could "use" 10-12GB of VRAM yet all ran fine on my 980 Ti 6GB.
     
    Last edited: Feb 8, 2019
  8. leszy

    leszy Master Guru

    Messages:
    325
    Likes Received:
    17
    GPU:
    Sapphire V64 LC
    With "fluffy bisons" or without them? They gimped my fps from 75 to 28 fps :) I played FFV with hi-res textures with anabled HBCC and card was using 11.5GB VRAM in cities and 10.8 in plain.
    edited:
    From where I know that so few games will use RT? Poor speed of adaptation in previously announced titles means that coding for using RT (so as not to kill fps) is very time consuming = expensive. Probably this will be so until the consoles accept RT as a standard.
     
    Last edited: Feb 8, 2019
  9. Undying

    Undying Ancient Guru

    Messages:
    14,915
    Likes Received:
    4,026
    GPU:
    Aorus RX580 XTR 8GB
Games are increasingly using more VRAM. This time next year, 8GB will be on the lower side. When you buy a $700 card you want it to last more than a year, right?
     
    Last edited: Feb 8, 2019
  10. metagamer

    metagamer Ancient Guru

    Messages:
    1,866
    Likes Received:
    731
    GPU:
    Palit GameRock 2080
8GB of VRAM a year from now will be on the low side? Dude, lay off whatever AMD are feeding you.

Here's a tip: don't live by the numbers you see in MSI Afterburner. For example, the AMD logo video in the startup process of The Division 2 is apparently using 5.1GB of VRAM, according to MSIA. Damn, must upgrade to a 128GB GPU if a silly 2-second logo takes up 5GB, right?
     

  11. HWgeek

    HWgeek Master Guru

    Messages:
    441
    Likes Received:
    315
    GPU:
Gigabyte 6200 Turbo Force @500/600 8x1p
Yep, if game devs had the freedom, they could use even more than 16GB of VRAM for even more beautiful game worlds; they can't do so when 90%+ of the market is limited to 4GB/6GB cards.
Either go all the way with the 2080 Ti, or take a 2060/2070.
NV just nerfed the 2080, so its days will be short and you will upgrade sooner.
8GB is great, but not at RTX 2080 price level.
     
    Undying likes this.
  12. MonstroMart

    MonstroMart Master Guru

    Messages:
    873
    Likes Received:
    349
    GPU:
    GB 5700 XT GOC 8G
The 2070 is not powerful enough for proper RTX at the resolutions someone paying this kind of money for a GPU would likely play at (2K 144Hz or 4K), so it should not be counted. It's like having an F1 engine in a Mazda 6. Or 16GB of HBM2 instead of 8GB on a card that isn't powerful enough... Nobody would pay for an F1 engine inside an unmodified Mazda 6. It doesn't make any sense.

It remains to be seen if DLSS will be useful and not blur the image too much. I have my doubts personally, and I will wait for more games to support it before considering it a feature worth paying for. TAA is awful and I would rather poke my eyes out and immolate myself than use it. DLSS will not only need to perform better (which it no doubt will) but also not blur the image as much. IMO no AA >>> TAA, especially at 4K. It remains to be seen if DLSS will fall into the same category.

This card might (does) look worse than the 2070 and 2080, but it's like choosing between a polished turd and a normal turd. It's still a turd. Right now the only card worth paying for in the high-end segment is the 2080 Ti (unless you've got an oldie like a 5xx/6xx/7xx).
     
  13. metagamer

    metagamer Ancient Guru

    Messages:
    1,866
    Likes Received:
    731
    GPU:
    Palit GameRock 2080
    Omg what is this forum?
     
  14. Undying

    Undying Ancient Guru

    Messages:
    14,915
    Likes Received:
    4,026
    GPU:
    Aorus RX580 XTR 8GB
    Different opinions man. No need to insult.
     
  15. metagamer

    metagamer Ancient Guru

    Messages:
    1,866
    Likes Received:
    731
    GPU:
    Palit GameRock 2080
I'm not insulting anyone. What I do find insulting is people acting like we suddenly need 16GB GPUs because AMD released one. Hilarious.
     
    yasamoka, RzrTrek and Xtreme1979 like this.

  16. Undying

    Undying Ancient Guru

    Messages:
    14,915
    Likes Received:
    4,026
    GPU:
    Aorus RX580 XTR 8GB
You just told me to lay off the stuff AMD is feeding me. That's just silly.

I just said 8GB will not be enough in the distant future, not that 16GB is needed. There are also 11GB cards. ;)
     
  17. HWgeek

    HWgeek Master Guru

    Messages:
    441
    Likes Received:
    315
    GPU:
Gigabyte 6200 Turbo Force @500/600 8x1p
And what I find insulting is NV taking the 2-year-old 1080 Ti, replacing it with a same-performance card at a higher price with the RAM cut down to only 8GB, and seeing people think that's OK and there's no problem.
So the 8GB isn't the problem in itself; it's that the card has 2-year-old GPU performance with its memory cut down, at a higher price.
Save for the 2080 Ti, or take a 2060/2070 and save the money for a real upgrade next year. IMO the 2080 is irrelevant.
     
  18. MonstroMart

    MonstroMart Master Guru

    Messages:
    873
    Likes Received:
    349
    GPU:
    GB 5700 XT GOC 8G
Honestly, 16GB is overkill, but 8GB looks very borderline for proper 4K. Graphics haven't progressed much over the last 5-10 years. When I was young, you would load a 10-year-old game and it looked extremely dated. Now you load a good-looking 10-year-old game and it looks just fine (even compared to ray tracing).

Maybe we don't need more than 8GB because cards don't come with more than 8GB and devs have to cut corners. I mean, I can't really remember the last time a game wowed me graphically. Most of the time the textures look bland and low-res to me on my 2K monitor.
     
    moo100times and HWgeek like this.
  19. metagamer

    metagamer Ancient Guru

    Messages:
    1,866
    Likes Received:
    731
    GPU:
    Palit GameRock 2080
Well, in the distant future we'll be 3-4 generations of cards from now (read: 3-4 years at least), and by then 16GB will be standard, I'm sure. But we'll also have cards that are 200%+ faster than current ones.
     
  20. metagamer

    metagamer Ancient Guru

    Messages:
    1,866
    Likes Received:
    731
    GPU:
    Palit GameRock 2080
For 4K, I'd agree 8GB is a touch too close. Still, 8GB even at 4K will be enough for a year or two. 4K is still a niche market, and it's not because we don't have GPUs with enough VRAM; it's because we're only just getting GPUs that can play 4K60, and those GPUs cost over $1000.

By the time most people can run 4K60 comfortably, we'll surely have GPUs with more VRAM.

I'm on 1440p btw; 8GB is plenty for me for the next year or two, then I'll upgrade anyway.
     
