New Upcoming ATI/AMD GPUs Thread: Leaks, Hopes & Aftermarket GPUs

Discussion in 'Videocards - AMD Radeon' started by OnnA, Jul 9, 2016.

  1. Humanoid_1

    Humanoid_1 Master Guru

    Messages:
    959
    Likes Received:
    66
    GPU:
    MSI RTX 2080 X Trio
    The little bits I keep hearing make it sound like a more interesting card, despite many of its part counts sounding all too familiar from Fury.

    Being built from the ground up with a leaning towards higher clock speeds, which can introduce a small amount of latency, but that could well be countered by the very low latency of HBM2.

    Apparently AMD think they have removed bottlenecks in the architecture that could conservatively yield 30% performance improvements even without the aforementioned increased clock speeds.

    I'm really looking forward to this launch and some Good reviews here ^_^
    ...being able to actually afford one at launch for once helps too!
     
  2. OnnA

    OnnA Ancient Guru

    Messages:
    15,814
    Likes Received:
    4,954
    GPU:
    3080Ti VISION OC
    May 16th: AMD Will Be Taking the Covers Off Vega, Navi & Zen+


    AMD will finally be disclosing more information about its next generation CPU & graphics architectures Vega, Navi and Zen+ in 10 days. The company is set to unveil its long-term CPU & graphics roadmaps for 2017 and beyond in a little over a week, sources close to AMD have told us. If you’ve been waiting to hear more about Vega, Navi & Zen+ make sure to tune in to wccftech on Tuesday May 16th.

    I should make it very clear that this is not going to be a product launch. AMD’s CTO Mark Papermaster, RTG Chief Architect Raja Koduri and Computing & Graphics head Jim Anderson will all join CEO Lisa Su on stage at the company’s headquarters in Sunnyvale to discuss AMD’s long-term vision. Vega, Navi and Zen+ cores will take center stage. This will not be a Vega launch.

    AMD’s Building Blocks For Ambitious 14nm & 7nm Roadmaps – Vega, Navi, Pinnacle Ridge/Zen 2 & Beyond
     
  3. OnnA

    OnnA Ancient Guru

    Messages:
    15,814
    Likes Received:
    4,954
    GPU:
    3080Ti VISION OC
  4. Maddness

    Maddness Ancient Guru

    Messages:
    2,198
    Likes Received:
    1,430
    GPU:
    3080 Aorus Xtreme
    It's good news that they are finally releasing some information on Vega.
     

  5. Agonist

    Agonist Ancient Guru

    Messages:
    4,035
    Likes Received:
    1,135
    GPU:
    Dell 6800XT 16GB

    Honestly, the FreeSync/G-Sync situation sucks big time.

    Personally, Nvidia can shove G-Sync up their arsehole.
    I'm not an AMD fanboy, I just hate Nvidia's proprietary, overpriced stuff.

    Even though I had the money, the G-Sync version was $500 more than my FreeSync version.

    And I was able to buy two 290X DD Black Editions to go with it for less than the monitor.

    If Vega is within 10% of the 1080 Ti I will be happy personally.

    It would be nice to be able to play newer games at 60 fps on high settings @ 3840x1600.

    My Fury struggles even in Crysis 3 on ultra to stay above 40 fps @ 1600p.
     
  6. OnnA

    OnnA Ancient Guru

    Messages:
    15,814
    Likes Received:
    4,954
    GPU:
    3080Ti VISION OC
  7. OnnA

    OnnA Ancient Guru

    Messages:
    15,814
    Likes Received:
    4,954
    GPU:
    3080Ti VISION OC
    Last edited: May 8, 2017
  8. Anarion

    Anarion Ancient Guru

    Messages:
    13,600
    Likes Received:
    384
    GPU:
    GeForce RTX 3060 Ti
    Hmh... So they'd need to clock that Vega closer to 1.4 GHz to match a decently clocked GTX 1070. 1.5 GHz wouldn't be enough to match a GTX 1080 (especially after OC). 1080 Ti? Not even close.
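    For rough context, peak FP32 throughput can be estimated from shader count and clock. A minimal sketch, assuming the rumored Fiji-like 4096-shader Vega configuration and a typical ~1.8 GHz GTX 1070 boost clock (both are assumptions, not confirmed specs):

```python
# Rough peak-FP32 estimate; shader counts and clocks are rumor-based assumptions.
def tflops(shaders, clock_ghz):
    # 2 FLOPs per shader per clock (fused multiply-add)
    return shaders * 2 * clock_ghz / 1000

vega_est = tflops(4096, 1.4)  # rumored Fiji-like shader count at 1.4 GHz
gtx1070  = tflops(1920, 1.8)  # GTX 1070 at a typical boost clock

print(f"Vega @ 1.4 GHz: ~{vega_est:.1f} TFLOPS, GTX 1070: ~{gtx1070:.1f} TFLOPS")
```

    By raw numbers the rumored part would be far ahead, which is exactly why clock-for-clock comparisons like the one above are about real game performance, not paper FLOPS.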
     
  9. OnnA

    OnnA Ancient Guru

    Messages:
    15,814
    Likes Received:
    4,954
    GPU:
    3080Ti VISION OC
    Yup

    We need more Valid Leaks :)
    One thing is sure - at least 3 variants....
     
  10. Turanis

    Turanis Ancient Guru

    Messages:
    1,779
    Likes Received:
    489
    GPU:
    Gigabyte RX500
    Don't forget the ROPs. With 64 ROPs, Vega will be too weak. :)
     

  11. Ryu5uzaku

    Ryu5uzaku Ancient Guru

    Messages:
    7,412
    Likes Received:
    512
    GPU:
    6800 XT
    Not too weak, but weaker than Nvidia in that department, while being strong in others.
     
  12. Anarion

    Anarion Ancient Guru

    Messages:
    13,600
    Likes Received:
    384
    GPU:
    GeForce RTX 3060 Ti
    This is a bit troubling, mostly because of HBM. It's expensive, and NVIDIA can definitely lower GTX 1070 prices and still make €€€. Does it make sense to have three variants with HBM? Doesn't sound cost-effective to me... For the sake of competition I truly hope that AMD is not in trouble with Vega. They sure are taking their time with it.
     
  13. Embra

    Embra Maha Guru

    Messages:
    1,406
    Likes Received:
    585
    GPU:
    Red Devil 6950 XT
    It is cost effective, less design & production expense.
     
  14. Anarion

    Anarion Ancient Guru

    Messages:
    13,600
    Likes Received:
    384
    GPU:
    GeForce RTX 3060 Ti
    HBM is really expensive. It would have been better for them to use GDDR5X/GDDR5 for the second-best GPU and reserve HBM for their top product. It's likely that HBM is still in short supply, while GDDR5X/GDDR5 is probably getting really cheap.
     
  15. Ryu5uzaku

    Ryu5uzaku Ancient Guru

    Messages:
    7,412
    Likes Received:
    512
    GPU:
    6800 XT
    As I think some have said, they're using HBM because they don't have the money to separate their server/professional cards from their consumer cards. Also, the price difference is not that huge.
     

  16. Loophole35

    Loophole35 Ancient Guru

    Messages:
    9,797
    Likes Received:
    1,161
    GPU:
    EVGA 1080ti SC
    The yield difference is, however.
     
  17. Denial

    Denial Ancient Guru

    Messages:
    14,013
    Likes Received:
    3,805
    GPU:
    EVGA RTX 3080
    http://electroiq.com/insights-from-leading-edge/wp-content/uploads/sites/4/2016/03/Sys-plus-1.jpg

    HBM1 with 4 stacks was ~$40 more expensive - but you also have to remember that by the time that cost hits consumers it's roughly 2.5x more expensive due to the markup down the supply chain. So it adds about $100 to the card. HBM2 on Vega is only 2 stacks - so it should be a little cheaper for that reason, but now there are more rumors coming out that HBM2 is still in short supply so I have no idea how much it costs AMD to do it.

    I personally don't think it's worth it on gamer cards but I think AMD is in a position where it doesn't have a choice anymore. In order to compete with Nvidia in servers they need it and they can't afford to spin half a billion different designs like Nvidia does.
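    The markup math above in plain numbers (the ~2.5x multiplier is the post's estimate, and the 2-stack HBM2 figure is a hypothetical half of the 4-stack delta, not a confirmed cost):

```python
# Supply-chain markup sketch: a BOM cost delta grows ~2.5x by the time it
# reaches the consumer as retail price.
def retail_delta(bom_delta, markup=2.5):
    return bom_delta * markup

print(retail_delta(40))  # HBM1, 4 stacks: ~$40 BOM delta -> ~$100 at retail
print(retail_delta(20))  # hypothetical 2-stack HBM2 at half the BOM delta
```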
     
  18. Tuga

    Tuga Member

    Messages:
    48
    Likes Received:
    0
    GPU:
    nvidia 670 gtx
    Imo good cards are needed for gamers too. Vega / 1080 Ti seems a much more future-proof choice for "future games". This is all speculation (and off-topic too), but who knows if Microsoft decides to launch next gen with Scorpio eventually? I doubt they're going to limit it to playing current-gen games only (it's a matter of time imo).

    If valve/htc and pimax decide to launch more demanding VR headsets next year, those will need better gpu too (wider FOV / higher res).

    As for monitors... most IPS monitors are still 1000:1. Even some "HDR" IPS monitors from BenQ (SW320) seem to have 1000:1 static contrast. It seems the HDR label is applied almost too "loosely", and maybe the only requirement is support for different color spaces, and not so much higher static contrast. Kinda wish we'd get more VA monitors, those have 3000:1 static contrast. I haven't been too excited about monitors yet :(
     
    Last edited: May 9, 2017
  19. OnnA

    OnnA Ancient Guru

    Messages:
    15,814
    Likes Received:
    4,954
    GPU:
    3080Ti VISION OC
    New WinX Update for CU is On-Line.
     
  20. Valken

    Valken Ancient Guru

    Messages:
    2,510
    Likes Received:
    677
    GPU:
    Forsa 1060 3GB Temp GPU
    Are ANY GPUs currently DP 1.4 certified? Because DP 1.3 won't be able to do 4K RGB + HDR. Only 1.4 and 1.5 can (4K @ 144 Hz, 24-bit)!!

    I hope Vega comes at least 1.4 certified!
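    A back-of-the-envelope check of that bandwidth claim (my own arithmetic, counting active pixels only; real display timings add blanking overhead, so the true requirement is a bit higher):

```python
# Uncompressed video bandwidth vs. DisplayPort 1.3/1.4 effective bandwidth.
def gbps_needed(width, height, hz, bpp):
    return width * height * hz * bpp / 1e9

need = gbps_needed(3840, 2160, 144, 24)  # 4K, 144 Hz, 24-bit RGB
dp13 = 32.4 * 0.8  # HBR3: 32.4 Gbps raw, 8b/10b encoding -> 25.92 Gbps usable

print(f"needed ~{need:.1f} Gbps vs ~{dp13:.2f} Gbps usable on DP 1.3/1.4")
```

    DP 1.4 keeps the same raw bandwidth as 1.3 but adds DSC compression, which is what would make 4K 144 Hz with HDR fit over the link.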
     
