AMD publishes Radeon VII benchmark results from 26 games

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jan 11, 2019.

  1. moo100times

    moo100times Master Guru

    Messages:
    577
    Likes Received:
    328
    GPU:
    295x2 @ stock
    I used overclockers.co.uk as a point of reference, where the only 2070 cards available are B-grade stock or a Zotac mini; the rest of the available cards are 550 GBP and up. The cheapest 2080 they have is 700 GBP, and most other models are above or well above that. That is comparable to your comment about the gap between MSRP and actual retail price.

    The announced price of the Radeon VII is 700 USD, which after exchange rates is cheaper than both of the cards I mentioned above.
    Vega at 450 GBP, at an exchange rate of roughly 1.3, is about 600 USD, so I cannot see where the "50% more" price comment comes from either.
    I wouldn't consider a card costing as much as a decent rig to be something that really raises the bar for reasonable gaming or card performance. If NVIDIA's GPU tech works well on 7 nm and delivers a significant performance increase, then good for them; we will have real competition on our hands, though that is again a big assumption considering their newest releases. Why would NVIDIA rush to bring the 2xxx series to market when they are dominating with the 1xxx series, especially when they could have done this generation on a smaller manufacturing node and were facing a massive backlog of chips to clear?
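    For what it's worth, here's the rough conversion behind the GBP figures above as a quick Python sketch; the 1.30 GBP-to-USD rate is just my assumption for that period, and the prices are the overclockers.co.uk street prices quoted earlier:

    # Rough GBP -> USD conversion; the 1.30 rate is an assumed approximation.
    GBP_TO_USD = 1.30

    def to_usd(gbp):
        """Convert a GBP street price to USD at the assumed exchange rate."""
        return gbp * GBP_TO_USD

    for label, gbp in [("RTX 2070 (UK street)", 550),
                       ("RTX 2080 (UK street)", 700),
                       ("Vega 64 (UK street)", 450)]:
        print(f"{label}: {gbp} GBP ~= {to_usd(gbp):.0f} USD")

    # Roughly 715, 910 and 585 USD respectively, so the 700 USD announced price
    # sits below both RTX cards at these UK street prices.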
     
    Last edited: Jan 12, 2019
  2. Darren Hodgson

    Darren Hodgson Ancient Guru

    Messages:
    17,221
    Likes Received:
    1,540
    GPU:
    NVIDIA RTX 4080 FE
    16 GB of expensive VRAM is stupid on this card. 12 GB or 8 GB would have made it a much stronger competitor to NVIDIA’s RTX 2080 as it could be sold for much less, which in turn would make the lack of extra features such as ray-tracing and DLSS less of an issue. I wonder if Jensen would have called that card’s performance lousy? ;)
     
    mohiuddin and Maddness like this.
  3. HARDRESET

    HARDRESET Master Guru

    Messages:
    891
    Likes Received:
    417
    GPU:
    4090 ZOTAEA /1080Ti
    In the USA, the EVGA RTX 2080 at $699 is the cheapest.
    Also, I got a great deal on a new Gigabyte 1080 Ti Gaming OC back in September 2018, for $629.00!
     
  4. Astyanax

    Astyanax Ancient Guru

    Messages:
    17,035
    Likes Received:
    7,378
    GPU:
    GTX 1080ti
    There'd be no point to it; they would still be buying the full 16 GB of HBM2, since the 4096-bit interface is made up of four 4 GB stacks that all need to be populated.
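    A quick sketch of that arithmetic, assuming the widely reported Radeon VII configuration of four 4 GB HBM2 stacks, each with a 1024-bit interface:

    # Reported Radeon VII memory layout: four HBM2 stacks on one interposer.
    STACKS = 4
    GB_PER_STACK = 4        # each stack is 4 GB
    BITS_PER_STACK = 1024   # each stack adds a 1024-bit interface

    total_capacity = STACKS * GB_PER_STACK      # 16 GB
    total_bus_width = STACKS * BITS_PER_STACK   # 4096 bits

    print(f"{total_capacity} GB across a {total_bus_width}-bit bus")

    # Cutting to 8 GB would mean dropping two stacks (halving the bus width and
    # the ~1 TB/s bandwidth) or sourcing different stacks, so there's no cheap
    # way to ship this design with less VRAM.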
     
    JonasBeckman likes this.

  5. waltc3

    waltc3 Maha Guru

    Messages:
    1,445
    Likes Received:
    562
    GPU:
    AMD 50th Ann 5700XT
    Looked like it was doing a good job running at 4K to me... ;) I'd bet that 99.9% of people who use a 4K monitor daily couldn't care less whether it supports a 144 Hz refresh rate; I know that wasn't on the bullet list when I bought mine. Sort of pleased by just how much visible difference there is between 2560x1440 @ 27" and 3840x2160 @ 32" (the wife got the QHD); 4K is the only place I want to be. If I can play a game smoothly and without stutter, it's fine. Still waiting for one that I can't play like that, fortunately...!
     
    HARDRESET likes this.
  6. waltc3

    waltc3 Maha Guru

    Messages:
    1,445
    Likes Received:
    562
    GPU:
    AMD 50th Ann 5700XT
    This is a very early look at a new product, and it's an in-house release as well (NVIDIA is no stranger to releasing in-house benchmarks before a product launch that turn out to have been exaggerated in some key fashion). The purpose here was simply to demonstrate that this is a real new product that is close to finalization and shipping; it's not a definitive product review and was not meant to be. But as long as this product rings the bell at a significantly lower price than NVIDIA is trying to get at the moment, it's going to be well received, imo.

    I agree about using Intel CPUs; it does seem counterproductive, especially at 4K, where the CPU used doesn't amount to a hill of beans. At lower resolutions the CPU only matters because game-engine compilers favor Intel architectures, which have been unchanged for many years, whereas Ryzen is still comparatively new and compiler optimization for game engines hasn't matured yet (hasn't really even started to mature), although things are surely a lot better now than they were at the Ryzen launch, and they were never *that* bad even then. I expect that things will pick up this year with Ryzen 3000 and beyond.
     
  7. GREGIX

    GREGIX Master Guru

    Messages:
    856
    Likes Received:
    222
    GPU:
    Inno3d 4090 X3
    How so? I mean, Vega 64 was $699 at launch for the top water-cooled model. Seems to me the price is the same, so where do you find that "50% more" charge?
     
  8. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    Other possible causes exist. Since it has been stated that the Radeon VII draws about the same power as Vega 64, there may be a TDP limit in play in some games.
    Basically, some games may force the card to run so many empty cycles that the amount of work done drops drastically.
    Unlocking the TDP and undervolting will likely stabilize results, unless another limit, such as the CPU, is in place.
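    As a toy illustration of that last point (just a sketch using the common rule of thumb that dynamic power scales roughly with V-squared times clock; the constant, clock and voltages below are made-up numbers, not Radeon VII measurements):

    # Toy model: dynamic power ~ k * V^2 * f.  If the estimate exceeds the board
    # power limit, the card has to throttle (run empty cycles); undervolting
    # pulls it back under the cap at the same clock.  Illustrative values only.
    def dynamic_power(voltage_v, clock_mhz, k=0.15):
        """Very rough dynamic power estimate in watts."""
        return k * voltage_v ** 2 * clock_mhz

    POWER_LIMIT = 300  # watts, hypothetical board power cap
    CLOCK = 1750       # MHz, hypothetical target clock

    for v in (1.10, 1.00, 0.95):
        p = dynamic_power(v, CLOCK)
        state = "throttles" if p > POWER_LIMIT else "holds clock"
        print(f"{v:.2f} V @ {CLOCK} MHz -> ~{p:.0f} W ({state})")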
     
    Geryboy likes this.
  9. Neo Cyrus

    Neo Cyrus Ancient Guru

    Messages:
    10,793
    Likes Received:
    1,396
    GPU:
    黃仁勳 stole my 4090
    It likely also heavily depends on how memory-hungry the games are; that will add a lot to the variance in this case. There's a reason why AMD picked 4K: it's a 16 GB card, and self-published benchmarks are always cherry-picked in some way.
     
    Xtreme1979 likes this.
  10. HARDRESET

    HARDRESET Master Guru

    Messages:
    891
    Likes Received:
    417
    GPU:
    4090 ZOTAEA /1080Ti
    Once you go big screen you don't go back, especially when you get older.
     
    carnivore and BuildeR2 like this.

  11. Elder III

    Elder III Guest

    Messages:
    3,737
    Likes Received:
    335
    GPU:
    6900 XT Nitro+ 16GB
    Yeah, I've had my 4K screen for enough years now that it's easily the last thing I could stand to downgrade on my PC (with an SSD for the OS drive a close second).
     
    HARDRESET likes this.
  12. sneipen

    sneipen Member Guru

    Messages:
    137
    Likes Received:
    16
    GPU:
    Nvidia 1080
    What do you keyboard warriors think about how this card will perform in the future? The reason I ask is the "fine wine" effect some people talk about, which you can see when comparing old and newer benchmarks. Is the architecture similar enough to the old Vega that we can expect some gains over time, or do you think they will be able to get more out of the chip after a while? After some research: the card has some tweaks compared to the older Vega; it's not just a binned 7 nm Vega chip, they have made some changes. So I was wrong when I speculated this was a binned chip.
    And does anyone dare to speculate how pricing will look in 5-6 months, both for this and for RTX? I'm also a 4K gamer with a big screen. I was fixing a PC and used my old monitor, a 27-inch 1080p, and it worked fine while I used it, but after getting used to my newer monitor it looked like crap. Though I must say a big ultrawide screen with a decent resolution doesn't look shabby... :p
     
  13. HWgeek

    HWgeek Guest

    Messages:
    441
    Likes Received:
    315
    GPU:
    Gigabyte 6200 Turbo Force @500/600 8x1p
    Mark my words! The Division 2 will come with DXR support for the Radeon VII!
     
  14. BlackZero

    BlackZero Guest

    I'll believe it when I see it.
     
    fantaskarsef and HARDRESET like this.
  15. warlord

    warlord Guest

    Messages:
    2,760
    Likes Received:
    927
    GPU:
    Null
    No, no pal, with DXR instructions enabled the RTX 2060 is gonna send the Radeon VII flying with an AI and efficiency uppercut. It won't see it coming.
     

  16. Valken

    Valken Ancient Guru

    Messages:
    2,924
    Likes Received:
    901
    GPU:
    Forsa 1060 3GB Temp GPU
    It will be nearly on par with a stock 2080, so check your main go-to game and then decide. I think the VII is going to be a surprise.

    I am aiming for a 4K upgrade, so this or Navi is on my radar. For me it's a big win with all that compute power and RAM, as that is usable with mods and high-resolution textures even at 4K, versus RTRT.

    Don't get me wrong, I am pro-RTRT, but it's not there yet. If asked to trade between RT on at 1080p and plain 4K, I would take 4K any day.
     
    HWgeek likes this.
  17. DeskStar

    DeskStar Guest

    Messages:
    1,307
    Likes Received:
    229
    GPU:
    EVGA 3080Ti/3090FTW
    Seems like AMD has made stock in placeholders.....
     
  18. Then do some research into it and find out?

    Hmmm.... right.... :rolleyes:

    Seems a little bit of an unbelievable claim. Insider info?
     
    Last edited by a moderator: Jan 13, 2019
    BlackZero likes this.
  19. BuildeR2

    BuildeR2 Ancient Guru

    Messages:
    3,219
    Likes Received:
    442
    GPU:
    ASUS 4090 TUF OG OC
    This could be possible, considering they flew the Ubisoft guy all the way out to Vegas for the AMD keynote. Also, why do so many people think AMD cards will suck at RTRT compared to NV? Isn't RTRT stuff compute-heavy? As far as I know, AMD cards have always had strong compute, whereas NV rips that out so you have to buy a Quadro card. I feel like NV only has to add these "special" tensor/AI cores because of how they have been crippling the gaming line of GPUs for some time.
     
    HWgeek likes this.
  20. Maddness

    Maddness Ancient Guru

    Messages:
    2,440
    Likes Received:
    1,739
    GPU:
    3080 Aorus Xtreme
    I actually think AMD will do well at ray tracing once they either enable it or have the hardware for it. Like you, I think all that compute power will definitely help.
     
