AMD announces Radeon VII (7nm)

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jan 9, 2019.

  1. D3M1G0D

    D3M1G0D Guest

    Messages:
    2,068
    Likes Received:
    1,341
    GPU:
    2 x GeForce 1080 Ti
As others have mentioned, the Radeon VII is likely just a repurposed MI50, not something designed from the ground up for gaming. Navi is the next big hope for gaming, not Vega; the moment they mentioned Vega, it should have been obvious what this was: a stopgap until Navi.
     
    Maddness likes this.
  2. Mere

    Mere Guest

    Messages:
    124
    Likes Received:
    4
    GPU:
    amd fury 3840 1100/500
    - 700 USD is TOO MUCH!

    - SELL the 3 damn games, son!

    :D
     
  3. fry178

    fry178 Ancient Guru

    Messages:
    2,078
    Likes Received:
    379
    GPU:
    Aorus 2080S WB
As long as it's on par with 1080 performance and has more than 10 GB of VRAM, I might switch.
     
  4. nizzen

    nizzen Ancient Guru

    Messages:
    2,419
    Likes Received:
    1,157
    GPU:
    3x3090/3060ti/2080t
Two-year-old performance at the same two-year-old price as the 1080 Ti.

The 1080 Ti was the best buy for years!
     
    fantaskarsef, Maddness and Solfaur like this.

  5. chispy

    chispy Ancient Guru

    Messages:
    9,988
    Likes Received:
    2,715
    GPU:
    RTX 4090
I expected a little more out of this :/ . Sadly, it's still a Vega GPU that more or less matches a GTX 1080 Ti and an RTX 2080. It seems Navi is far, far away... I was hoping for a replacement for my GTX 1080 Ti, but this is not it :( . I'm torn between selling my 1080 Ti to buy a 2080 Ti or not :confused: , and I thought I would never spend more than $1,000 US on a video card! Decisions, decisions...
     
    fantaskarsef and Maddness like this.
  6. Dj_ALeX

    Dj_ALeX Master Guru

    Messages:
    223
    Likes Received:
    77
    GPU:
    RTX 3060 Gaming X
Good card, but... AMD missed the train two years ago... and the price... oh GOD! :D
     
  7. GlennB

    GlennB Master Guru

    Messages:
    262
    Likes Received:
    101
    GPU:
    Sapphire Vega 56 EK
The die size increased by 60% from the 1080 Ti (471 mm²) to the 2080 Ti (754 mm²), while bringing around 30% extra CUDA cores. A larger die also means worse yields, which results in higher cost. They added both the tensor cores and the integer cores, so those will add to the size of the die.
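The percentages in that post can be sanity-checked with quick arithmetic. This is just a sketch: the die sizes are the ones quoted above, while the CUDA core counts (3584 for the 1080 Ti, 4352 for the 2080 Ti) are the published figures, which actually work out closer to 21% than 30%:

```python
# Relative growth from GTX 1080 Ti to RTX 2080 Ti.
die_1080ti_mm2 = 471   # die size quoted in the post above
die_2080ti_mm2 = 754
cores_1080ti = 3584    # published CUDA core counts (added for illustration)
cores_2080ti = 4352

die_growth = (die_2080ti_mm2 / die_1080ti_mm2 - 1) * 100
core_growth = (cores_2080ti / cores_1080ti - 1) * 100

print(f"Die size grew by {die_growth:.0f}%")     # -> 60%
print(f"CUDA cores grew by {core_growth:.0f}%")  # -> 21%
```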
     
  8. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
The 1080 Ti is not a good example because it doesn't support 2x FP16 per cycle, which requires fixed hardware. GP100 and Volta/Turing do, which is why I used GP100 for comparison. Tensor cores aren't necessary for RT, as shown by BF5, which doesn't use them. The separated INT32 pipeline improves performance per SM on non-RT workloads:

    https://images.anandtech.com/doci/1...8_Updated090318_1536034900-compressed-011.png

    Also, INT32 capability wasn't added; it was just separated from the FP32 path for concurrency improvements (Turing has twice the number of dispatch units per SM). Transistor cost per SM should be roughly the same, and AFAIK it doesn't have much to do with RT other than improving scheduling (it was split in Volta too, which doesn't have RT).

    So yeah - you can argue that the chip is too big, costs too much, whatever, but the size of the chip has nothing to do with accelerating ray tracing. Most of the size comes from the cache, which got doubled and typically takes up a fair amount of space, along with scheduler changes and tensor cores.
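The concurrency benefit of the split INT32 pipeline can be shown with a toy throughput model. Purely illustrative: the 36-INT-per-100-FP instruction mix is NVIDIA's often-quoted Turing-era shader average, not a measurement:

```python
# Toy model: serial issue (single shared pipeline) vs concurrent FP32+INT32
# issue (Volta/Turing-style). Assumes ~36 INT32 ops per 100 FP32 ops,
# NVIDIA's quoted average shader mix (illustrative only).
fp_ops = 100
int_ops = 36

serial_cycles = fp_ops + int_ops          # everything goes through one path
concurrent_cycles = max(fp_ops, int_ops)  # INT32 ops hide behind the FP32 stream

speedup = serial_cycles / concurrent_cycles
print(f"Speedup from concurrent issue: {speedup:.2f}x")  # -> 1.36x
```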
     
    yasamoka likes this.
Any confirmation on what type of drivers the Radeon VII will receive? Does anyone know if it will match the format of the Vega Frontier Edition, for example, considering its very high price?

    Example: Radeon Pro Software™
     
    Last edited by a moderator: Jan 10, 2019
  10. HARDRESET

    HARDRESET Master Guru

    Messages:
    891
    Likes Received:
    417
    GPU:
    4090 ZOTAEA /1080Ti
My Gigabyte Gaming OC 1080 Ti, new for $629.00, was a steal back in September, and still is!
    AMD would have had me at $599!
    Good till next generation.
     
    Maddness likes this.

  11. Srsbsns

    Srsbsns Member Guru

    Messages:
    192
    Likes Received:
    54
    GPU:
    RX Vega 64 Liquid
If this is an MI50 card, wouldn't that mean you could connect two GPUs via Infinity Fabric? That was pretty much everyone's dream.
     
    Deleted member 213629 likes this.
  12. Halfmead

    Halfmead Guest

    Messages:
    275
    Likes Received:
    50
    GPU:
    GigaByte 2070 Super
@Hilbert Hagedoorn I hope you will include a good section with content creation benchmarks (Blender, Premiere, etc.) when reviews of this card are available... It will also be interesting to see if this card is a good overclocker...

    It seems a really good price when you consider it has 16 GB of HBM and 60 CUs; for a compute card that can also game at 4K 60+ fps, it's a good deal for people who want the best of both worlds, tbh. I don't get the outcry over the price tag...

    As for pure gaming (and pricing), if one wants AMD then maybe wait for the actual Navi products to emerge, or for Nvidia's 11xx series.

    I think most of us have too high expectations of AMD at this point in time, but hopefully they'll now have the time (and money) to properly mature the Navi lineup so that we'll see proper competition for NGreedia.
     
  13. "AMD will be employing some mild product segmentation here to avoid having the Radeon VII cannibalize the MI50 – the Radeon VII does not get PCIe 4.0 support, nor does it get Infinity Link support"

    - Source AnandTech.

    https://www.anandtech.com/show/13832/amd-radeon-vii-high-end-7nm-february-7th-for-699
     
    fantaskarsef likes this.
  14. xrodney

    xrodney Master Guru

    Messages:
    368
    Likes Received:
    68
    GPU:
    Saphire 7900 XTX
I would definitely go for more VRAM over RTX.
    1) I personally think RTX performance is 2-3 generations away from being really usable at decent resolutions and framerates, and in an acceptable number of titles, to be worth it.
    2) Games currently use 6-8 GB of VRAM with ease, but the OS and background programs can easily use an additional 2-3 GB. Plus, from time to time I run two games at once, for multiple reasons.
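xrodney's second point amounts to a quick back-of-the-envelope VRAM budget; a sketch using only the ranges quoted above:

```python
# Rough VRAM budget from the figures above: a demanding game plus background
# load can exceed an 8 GB card, which is the argument for 16 GB.
game_gb = (6, 8)        # typical game usage range quoted above
background_gb = (2, 3)  # OS + background programs

worst_case = game_gb[1] + background_gb[1]
print(f"Worst-case budget: {worst_case} GB")  # -> 11 GB, over an 8 GB card
```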
     
  15. DrKeo

    DrKeo Guest

No idea what you are even trying to say with this or how it's even related to the subject. Nvidia had RT tools for offline rendering years before AMD. They hired RT experts years ago and developed dedicated silicon for RT. Obviously Nvidia is taking RT much more seriously than AMD.

    Running RT on your GPU is easy; all you need to do is enable it, because DXR and Vulkan already do all the work for you. The thing is, a full Vega 64 can't do what the RTX 2060's RT cores do, even if it devotes the whole GPU to RT (and it still needs to do everything else too, like shaders and lighting).
     

  16. nevcairiel

    nevcairiel Master Guru

    Messages:
    875
    Likes Received:
    369
    GPU:
    4090
    Many games will use more VRAM if more is available. Absolute usage figures are hard to compare, and how using more VRAM translates into a performance advantage is hard to judge.
    Ultimately the benchmarks should tell you if it really matters.

    Of course if you have special needs that warrant more VRAM for you, then that may be a good option. But for the average person, the answer "more VRAM is always better" is not quite so clear cut.
     
  17. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,103
    Likes Received:
    2,606
    GPU:
    3080TI iChill Black
I don't find it thaat expensive; $650 would be ideal.


    IMO a great competitor for Turing in general. Yeah, it's not a 2080 Ti rival, but hey, it's also not a frickin' $1,200+ card... get real, people. I'm pretty sure it will be a fine 1440p card, better than expected. I will personally hold off on a new RTX 2080 for now, not if I can get the same or better performance for less money.


    The only deal breaker for me now would be thermals. If it stays around 70C and is a good OC'er, let's say at least 1950-2000 MHz, I don't care about TDP, never did... then I'm sold. Bye bye Nvidia, you've been a love-hate relationship for far too long xD
     
    moo100times, fantaskarsef and BetA like this.
  18. Srsbsns

    Srsbsns Member Guru

    Messages:
    192
    Likes Received:
    54
    GPU:
    RX Vega 64 Liquid


Tom's is contradicting AnandTech somewhat on PCIe 4.0. They are also saying AMD boards will allow PCIe 4.0 upgrades via a BIOS update.

    https://www.tomshardware.com/news/amd-ryzen-pcie-4.0-motherboard,38401.html
     
Crap, OK - good to know, as Tom's is a known-good source (not that AnandTech isn't), but that means this isn't settled yet, at least until I can find something from AMD like a spec sheet, or maybe a Twitter clip on their feed or something... might just ask them myself...

    Thanks @Srsbsns

    That was a really cool article about different OEMs saying their tests concluded that only one slot (the PCIe x16 closest to the CPU) could operate at 4.0 speeds just fine, but the rest would revert to 3.0 - so cool! All via BIOS update! I really hope AMD approves that, because apparently the OEMs were all saying they needed AMD's approval? :/ Ahh, please AMD, approve it, just don't officially "support" it... like back in the X79 / Sandy Bridge-E days.
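For context on what that one Gen 4 slot would actually buy, per-lane PCIe bandwidth roughly doubles each generation. A sketch using the nominal rates (8 GT/s for Gen 3, 16 GT/s for Gen 4, both with 128b/130b encoding):

```python
# Nominal per-lane PCIe throughput: raw transfer rate in GT/s, scaled by the
# 128b/130b line encoding, converted from gigabits to gigabytes per second.
def lane_gbps(gt_per_s: float) -> float:
    return gt_per_s * (128 / 130) / 8  # bits -> bytes

for gen, rate in (("PCIe 3.0", 8.0), ("PCIe 4.0", 16.0)):
    per_lane = lane_gbps(rate)
    print(f"{gen}: {per_lane:.2f} GB/s per lane, {per_lane * 16:.1f} GB/s at x16")
```

Roughly 0.98 GB/s per lane (~15.8 GB/s at x16) for Gen 3 versus 1.97 GB/s per lane (~31.5 GB/s at x16) for Gen 4.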
     
  20. Astyanax

    Astyanax Ancient Guru

    Messages:
    17,038
    Likes Received:
    7,379
    GPU:
    GTX 1080ti

Technically, ATI introduced tessellation with DX8 (TruForm); it didn't take off, and only a handful of games released with, or patched in, support for it.

    The ATI tessellation engine for DX11 has never been their strong point, but there is never a "proper time" to introduce new capabilities.

    RTX is for those who want to dabble, and for the developers who want to get in on bringing DXR to their games now. There is nothing new going on here with graphical options being provided that are intended for future graphics hardware.
     
    Maddness and warlord like this.
