Radeon RX Vega to Compete with GTX 1080 Ti and Titan Xp

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Apr 26, 2017.

  1. pharma

    pharma Ancient Guru

    Messages:
    1,670
    Likes Received:
    484
    GPU:
    Asus Strix GTX 1080
    Last edited: Apr 27, 2017
  2. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,731
    Likes Received:
    3,324
    GPU:
    6900XT+AW@240Hz
    But it still remains that the window of opportunity is just a bit smaller.
    It is exactly the same with Ryzen. Intel has had CPUs with performance equal to any Ryzen processor on the market for a long time.
    Yet people still buy Ryzen because it costs less than an equally powerful Intel CPU.

    Imagine someone with a GTX 980 Ti/Fury. They want an upgrade, but consider only GTX 1080 Ti-level performance a solid upgrade, since they want to move to 1440p @ 144 Hz. But the GTX 1080 Ti is too expensive for them.
    Then Vega pops in with similar performance at that resolution, available at a lower price this person can afford.
     
  3. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    7,651
    Likes Received:
    581
    GPU:
    Inno3D RTX 3090
    The cheapest Ti I can find in the EU is 716 euros without delivery, and that's for the ****ty one-fan model. Total cost is probably close to 740 euros.
    The fact is, I don't really believe the 1080 has the chops for good long-term performance. It's basically a 2560-core card with a 256-bit memory interface. It's nice, but it's essentially a midrange card sold at exorbitant prices because there is no competition at that level.

    They said that in cases where the GPU runs out of memory, performance is about 50% higher than it would have been if a traditional memory controller were used.

    I'm not sure that NVIDIA and AMD face the same delays in HBM2 access. Do we happen to have anything remotely concrete on actual HBM vs. GDDR5 production/implementation cost? Because there are Furies being sold at ~$260, with obvious (if not great) profit involved.

    Ryzen supply would make sense, but if you look at their Linux driver commits over the past year, they are trying something very specific. They want to change their driver model so that it's as cross-platform and open as possible, and at the same time provide at least acceptable performance through day-one driver releases. If you combine that with the HBM2 factor and production capacity being taken over by a different popular product, it makes sense.

    As for design spins, I think that's the reason they emphasize the memory controller so much. If I have understood the diagram correctly, they call it the High Bandwidth Cache controller because it always needs some fast HBM on board to do its thing, and then it can connect to everything that's on the bus. That means mixed models would also have been possible (a small amount of HBM plus GDDR5). Why would they need to split designs if that is the case? It's clear that Pascal's memory controller can't do that and that NVIDIA needs separate silicon for HBM and GDDR5X, so it would make sense to split there.
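    A loose sketch of that high-bandwidth-cache idea as I read it (the class name and the LRU policy here are my own guesses, not anything AMD has documented): a small pool of fast HBM acting as a page cache in front of a much larger, slower address space out on the bus.

    ```python
    from collections import OrderedDict

    class HighBandwidthCache:
        """Toy model: HBM as an LRU page cache over a bigger backing store."""

        def __init__(self, hbm_pages):
            self.hbm_pages = hbm_pages     # capacity of the fast HBM pool, in pages
            self.resident = OrderedDict()  # page -> data, kept in LRU order

        def access(self, page):
            if page in self.resident:              # hit: served at HBM speed
                self.resident.move_to_end(page)
                return self.resident[page]
            if len(self.resident) >= self.hbm_pages:
                self.resident.popitem(last=False)  # evict least-recently-used page
            data = self._fetch_from_backing_store(page)
            self.resident[page] = data             # paged in over the bus (slow path)
            return data

        def _fetch_from_backing_store(self, page):
            return f"contents of page {page}"      # stand-in for a DMA transfer

    cache = HighBandwidthCache(hbm_pages=2)
    for p in [0, 1, 0, 2, 1]:  # page 1 gets evicted by page 2, then paged back in
        cache.access(p)
    ```

    If something like this is what the controller does, mixed HBM + GDDR5 boards fall out naturally: the cache logic wouldn't care what the backing store is.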

    Seeing how Polaris performs now compared to launch, I'm not so certain that, even if the Vega design were finalized, GloFo could produce it with good yields at the required frequencies. "Polaris 20" is basically a GloFo spin, not an actual redesign of the chip. Now things look quite different on that front.

    We'll have to wait and see, because I have this feeling that AMD in general takes less refined approaches to hardware issues like that. I.e., I don't believe they'll enable/disable it per application like NVIDIA does. That's just an uneducated guess on my part.

    AMD's bottleneck has always been graphics performance and their DX11 driver. That's still there, but the changes in Vega might at least improve the graphics performance part a bit. I'm curious to see the effect of the new memory controller on microstuttering, because I don't believe it will only matter in out-of-VRAM situations. The next huge question mark is the command processor and how it can be used by the DX11 driver. AMD's DX12 driver is actually quite nice, despite some recent sh*ttiness (like Forza Horizon 3 having artifacts and immense loading times with the latest versions *cough* *cough*).

    I want to see what that controller does to frametimes in general. I wish they would provide even a hidden switch to turn it on or off, just to see the effect.
     
  4. Loophole35

    Loophole35 Ancient Guru

    Messages:
    9,793
    Likes Received:
    1,148
    GPU:
    EVGA 1080ti SC
    The 1080's memory bandwidth is higher than the 980 Ti's and just a tick behind the Fury X's. Don't fall for the bus-width shenanigans.
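    Peak bandwidth is just bus width times per-pin data rate, so the raw numbers are easy to check; a quick sketch with the published specs (note the raw paper figures put these cards close together, and Pascal's delta compression pushes effective throughput above its paper number, which is presumably the point being made here):

    ```python
    def peak_bandwidth_gbs(bus_bits, gbps_per_pin):
        # bits moved per transfer / 8 -> bytes, times effective data rate per pin
        return bus_bits / 8 * gbps_per_pin

    cards = {
        "GTX 1080 (GDDR5X, 256-bit @ 10 Gbps)": (256, 10.0),   # -> 320 GB/s
        "GTX 980 Ti (GDDR5, 384-bit @ 7 Gbps)": (384, 7.0),    # -> 336 GB/s
        "Fury X (HBM, 4096-bit @ 1 Gbps)":      (4096, 1.0),   # -> 512 GB/s
    }
    for name, (bus, rate) in cards.items():
        print(f"{name}: {peak_bandwidth_gbs(bus, rate):.0f} GB/s")
    ```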
     

  5. Valken

    Valken Ancient Guru

    Messages:
    1,724
    Likes Received:
    179
    GPU:
    Forsa 1060 3GB Temp GPU
  6. RandomDriverDev

    RandomDriverDev Banned

    Messages:
    111
    Likes Received:
    0
    GPU:
    1080 / 8GB and 1060 / 6GB
    Do NOT use the inclusion of an AMD core in a console, which runs native machine code and is built against product-specific SDKs, as any sort of basis for expecting a part to do well.
     
  7. GeniusPr0

    GeniusPr0 Maha Guru

    Messages:
    1,261
    Likes Received:
    17
    GPU:
    RX 6800 XT
    As an indicator of how far graphics can be pushed, it works quite well. This has been proven multiple times with "console killer" review builds. Optimization is a different story.
     
  8. RandomDriverDev

    RandomDriverDev Banned

    Messages:
    111
    Likes Received:
    0
    GPU:
    1080 / 8GB and 1060 / 6GB
    On the PC front, AMD keeps half-arsing the front end of the scheduler and then pulling the 'NVIDIA is a cheat' card when their hardware cannot cope with particular games.

    You won't get that on a console; the game is made for the specifications available.

    On PC, games are made to take advantage of a wide range of hardware.
     
  9. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    7,651
    Likes Received:
    581
    GPU:
    Inno3D RTX 3090
    I don't. But it should have had that memory on a minimum 384-bit bus. Being higher than the 980 Ti and a tick under the Fury X in raw specs simply reinforces that, hardware-wise, it's a mid-range card.
     
  10. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,731
    Likes Received:
    3,324
    GPU:
    6900XT+AW@240Hz
    I would not be so critical here. They use APIs for ease of development.
    And it may surprise you, but the Xbox One uses DirectX 11 & 12, as Microsoft wants good compatibility with the PC.
    And GNMX (the high-level PS4 API) is close to DX11.
     

  11. RandomDriverDev

    RandomDriverDev Banned

    Messages:
    111
    Likes Received:
    0
    GPU:
    1080 / 8GB and 1060 / 6GB
    All of the Xboxes have used a DirectX-like API for graphics, with the first and the latest of them using x86 processors.

    That doesn't mean zot for how they work on an actual PC, though.

    One day AMD might have the finances to put together a driver team that knows how to read old, uncommented code and fix the issues in their OpenGL and Direct3D 11 drivers.
     
  12. TheF34RChannel

    TheF34RChannel Master Guru

    Messages:
    277
    Likes Received:
    3
    GPU:
    Asus GTX 1080 Strix
    This is so funny! 1. The AMD guy works in a completely unrelated department, and 2. if you read his post, he did not say that at all, yet the Internet is going mental over this, rotfl. Guru3D is becoming more of an unchecked rumour mill since last year, and that's a real shame, as I feel it lowers the outstanding quality this site has always had :/
     
  13. pharma

    pharma Ancient Guru

    Messages:
    1,670
    Likes Received:
    484
    GPU:
    Asus Strix GTX 1080
  14. GeniusPr0

    GeniusPr0 Maha Guru

    Messages:
    1,261
    Likes Received:
    17
    GPU:
    RX 6800 XT
  15. Denial

    Denial Ancient Guru

    Messages:
    13,293
    Likes Received:
    2,776
    GPU:
    EVGA RTX 3080
    1200 MHz isn't the final clock speed. That's why the score is so low.
     

  16. GeniusPr0

    GeniusPr0 Maha Guru

    Messages:
    1,261
    Likes Received:
    17
    GPU:
    RX 6800 XT
    Well, 1200 MHz Vega is roughly at GTX 1070 levels, so 25% more clock (roughly 1500 MHz), assuming it scales linearly since memory won't be the bottleneck, would be a 1080 right on the dot, or around a 7K GPU score; see the quick sketch below.
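    A back-of-the-envelope check of that linear-scaling estimate (the 1200 MHz figure is from the leak; the ~1500 MHz final clock and the baseline score are assumptions, the latter picked as a rough GTX 1070-class GPU score consistent with the ~7K result above):

    ```python
    base_clock = 1200    # MHz, the leaked Vega engineering-sample clock
    final_clock = 1500   # MHz, assumed final clock (25% higher)
    base_score = 5600    # assumed ~GTX 1070-class GPU score at 1200 MHz

    # Assumption: score scales 1:1 with core clock when memory isn't the bottleneck.
    scaled = base_score * (final_clock / base_clock)
    print(f"Estimated GPU score at {final_clock} MHz: {scaled:.0f}")  # ~7000
    ```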

    Eh
     
  17. Ryu5uzaku

    Ryu5uzaku Ancient Guru

    Messages:
    7,026
    Likes Received:
    242
    GPU:
    6800 XT
    Last edited: Apr 30, 2017
  18. Denial

    Denial Ancient Guru

    Messages:
    13,293
    Likes Received:
    2,776
    GPU:
    EVGA RTX 3080
    Which I guess is right around the Doom/BF1 performance that was shown a few months ago. So yeah, it might be accurate for the given clock speed.
     
  19. kx11

    kx11 Ancient Guru

    Messages:
    3,381
    Likes Received:
    414
    GPU:
    RTX 3090
    WCCFTech were late to steal the article from Guru3D.
     
  20. Romulus_ut3

    Romulus_ut3 Master Guru

    Messages:
    673
    Likes Received:
    143
    GPU:
    AMD RX 570 4GB
    They've put their own spin on it.
     
