New Upcoming ATI/AMD GPU's Thread: Leaks, Hopes & Aftermarket GPU's

Discussion in 'Videocards - AMD Radeon' started by OnnA, Jul 9, 2016.

  1. Tuga

    Tuga Member

    Messages:
    48
    Likes Received:
    0
    GPU:
    nvidia 670 gtx
    I was sort of expecting to see more optimized Polaris cards, but I expected them to be called something different than RX 570 or RX 580, like maybe RX 490 or RX 495. Kinda disappointing not to see any Vega cards yet, but I guess I'll keep waiting.
     
  2. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    7,895
    Likes Received:
    767
    GPU:
    Inno3D RTX 3090
    Well, you still don't know. :banana:

    :infinity:
     
  3. Valken

    Valken Ancient Guru

    Messages:
    1,928
    Likes Received:
    274
    GPU:
    Forsa 1060 3GB Temp GPU
    Thanks for the suggestion and I just had a quick look at local shop prices:

    470 4GB 150 USD GIGABYTE G1 GAMING
    480 4 GB 173 USD MSI ARMOR
    570 4 GB 173 USD MSI ARMOR OC
    1060 3GB 193 USD MSI
    580 4 GB 225 USD MSI ARMOR OC
    580 8 GB 256 USD MSI ARMOR OC

    If a 470 is double CFX 6950 performance, that is a steal at 150 USD @ 1080p. I just need to tough it out while squeezing in a game of ARMA 3 now and then, but ARMA is known to be Nvidia DX11 optimized.

    I was originally targeting 980 Ti or 1070 performance but wanted to support AMD, and since there are no more Fury/Fury X cards to be found, the Vega wait is just killing us.
     
  4. mtrai

    mtrai Maha Guru

    Messages:
    1,175
    Likes Received:
    369
    GPU:
    PowerColor RD Vega
    I am taking a walk on the dark side while my RX 480 is out for RMA with an unknown return date.

    Figuring it would be another couple of weeks before it even ships back to me, I noticed my local Best Buy had an EVGA 1050 Ti SSC on sale for 132 USD and thought I would take a gander at the dark side.

    While I know a 1050 Ti will not compete with the RX 480, I just wanted to see how the other side was faring on the cheap.
     

  5. Loophole35

    Loophole35 Ancient Guru

    Messages:
    9,793
    Likes Received:
    1,148
    GPU:
    EVGA 1080ti SC
    http://www.anandtech.com/bench/product/1769?vs=1869
    This compares a 670 (which was equal to unlocked 6950 CF in cases where CF scaled properly, or better in cases where CF was pants) vs a 570 (which a 470 can easily be clocked to match).

    Though that 480 4GB for $173 would be a tremendous stopgap for you till Vega.
     
  6. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,809
    Likes Received:
    3,366
    GPU:
    6900XT+AW@240Hz
    I would have an nVidia GPU too, if nVidia allowed PhysX HW acceleration in the presence of an AMD GPU.

    But I have a hunch about this: within a year's time Intel will be using AMD's GPU as its iGPU...
    Where will nVidia's PhysX work properly then?
     
  7. Loophole35

    Loophole35 Ancient Guru

    Messages:
    9,793
    Likes Received:
    1,148
    GPU:
    EVGA 1080ti SC
    What is this PhysX you speak of? Nvidia seems to have moved on from it a while ago. GameWorks uses DirectCompute instead of x87 and can therefore be used without PhysX drivers. Correct me if I'm wrong.
     
  8. Fox2232

    Fox2232 Ancient Guru

    Messages:
    11,809
    Likes Received:
    3,366
    GPU:
    6900XT+AW@240Hz
    OK, looks like I forgot that PhysX is not creeping in the shadows of GameWorks...
    https://developer.nvidia.com/what-is-gameworks

    I probably forgot that PhysX nowadays no longer uses nVidia's GPUs and runs exclusively on the CPU for both AMD and nVidia.

    And I kind of forgot that even though, over 2 years ago, nVidia promised a fully DirectCompute PhysX which would run on multi-core CPUs and on AMD's GPUs, they never actually delivered.
    And as a result, games like PlanetSide 2, which made promises to their communities, never delivered either.

    I forgot that HairWorks in Witcher 3 looks like poop on wolves as long as you are not running an nVidia GPU.
    Honestly, while I could easily run it at max, I opted to turn it off, because I would rather look at wolves with standard textures than at those weird patches of missing hair. Those patches made all the wolves look rather sick and about to die, which was apparently not the intention, since in nVidia's YT videos they look healthy and great.
    (And this is not even PhysX based.)
     
  9. Denial

    Denial Ancient Guru

    Messages:
    13,563
    Likes Received:
    3,117
    GPU:
    EVGA RTX 3080
    I agree that PhysX is crap, but I don't remember Nvidia ever promising a DirectCompute port.

    WaveWorks/Flex/Flow all recently got updated to use DirectCompute - they're no longer CUDA based, which allows them to run on AMD - but I've only seen one or two people over at /r/amd actually get the demos running. I haven't seen any benchmarks or anything.
     
  10. Loophole35

    Loophole35 Ancient Guru

    Messages:
    9,793
    Likes Received:
    1,148
    GPU:
    EVGA 1080ti SC
    That's why I said correct me if I'm wrong. I honestly have not played any games with GameWorks recently and thought Nvidia had moved away from PhysX completely.
     

  11. Oxezz

    Oxezz Member

    Messages:
    28
    Likes Received:
    0
    GPU:
    R9 290 VaporX @1150
    Damn, I love this guy.

     
  12. Truder

    Truder Ancient Guru

    Messages:
    1,628
    Likes Received:
    586
    GPU:
    RX 6700XT Nitro+
    I hate to say it, but I think he has become AMD's lackey now... Back when he gained notoriety, it was for questioning the business ethics of nVidia and AMD, such as the pervasive "mindshare", the use of black-box software like GameWorks, and the nature of rebranding hardware; but now all we see is him disseminating information about AMD's products in a positive light.

    The review itself isn't really worth its salt though: just plain averages, no frametimes or critical analysis to check for stuttering or hitching, and it only shows best-case scenarios (such as the best API for each card).
     
  13. Oxezz

    Oxezz Member

    Messages:
    28
    Likes Received:
    0
    GPU:
    R9 290 VaporX @1150
    I quite understand his sympathy towards AMD, since Nvidia and Intel are brute-forcing their way into the market without bringing anything new to the table. Yes, this is a rebrand, I know, but there's a video which explains how many times Nvidia played the rebranding game while Radeon had a new architecture every year, plus new technologies. Even when new DX APIs were introduced, people were still buying rebranded Nvidia cards based on a very old architecture, and those still outsold Radeon cards because of the "oh well, this is a new API, it will take time to adapt" attitude. That good/bad trend has existed for quite some time now: "Nvidia is the best, Radeon is meh." People keep buying whatever performs best in today's games, but the problem is tomorrow's games and future games; buying something that performs well now doesn't justify the price, at least for me.
     
  14. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    7,895
    Likes Received:
    767
    GPU:
    Inno3D RTX 3090
    Up to now I don't believe he has any real sympathies. I believe his audience believes that, but he has said multiple times that he thinks Vega's specs are basically AMD giving up on the high end. A lot of his predictions failed too, but I forgive everything just because of that RotTR analysis he did, which was later proven correct.
     
  15. Oxezz

    Oxezz Member

    Messages:
    28
    Likes Received:
    0
    GPU:
    R9 290 VaporX @1150
    Yes, that was a good video; he triggered other YouTubers to do the same tests, heh...
     

  16. OnnA

    OnnA Ancient Guru

    Messages:
    13,896
    Likes Received:
    3,647
    GPU:
    3080Ti VISION OC
    :)

    ~120-145W with undervolting to 1040mV :nerd:
    Good old ATI: You have The Power (sic ;) ) .... to underrrrr Volt....

    Same goes for my Fury: at 1.174V it gets a good 3DMark score at 1050/570, lolz
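
    The kind of saving an undervolt like that can give is easy to sketch with back-of-envelope arithmetic (this is an estimate, not a measurement): dynamic GPU power scales roughly with f·V², so at an unchanged clock an undervolt saves about 1 - (V_new/V_old)². Only the 1040mV figure comes from the post above; the 1150mV stock voltage is an assumed baseline for illustration.

```python
# Rough dynamic-power estimate for a GPU undervolt at constant clock.
# Assumes P_dynamic ~ f * V^2; leakage and clock changes are ignored.

def dynamic_power_ratio(v_new_mv: float, v_old_mv: float) -> float:
    """Ratio of dynamic power after/before an undervolt at the same clock."""
    return (v_new_mv / v_old_mv) ** 2

# 1040 mV from the post; 1150 mV stock is a hypothetical baseline.
savings = 1 - dynamic_power_ratio(1040, 1150)
print(f"~{savings:.0%} dynamic-power saving")  # ~18%
```

    Leakage power falls even faster with voltage, so real-world savings can be a bit higher than this simple model suggests.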

     
    Last edited: Apr 20, 2017
  17. OnnA

    OnnA Ancient Guru

    Messages:
    13,896
    Likes Received:
    3,647
    GPU:
    3080Ti VISION OC
    XFX RX580 8GB GTS Review

    Overall and Final Verdict

    With my testing done and having taken a look at the XFX RX580 GTS, let's run through the main details, the good, and the bad. Like the Sapphire Nitro+ that I tested yesterday, the XFX 580 GTS should fall near the top of the RX 580s, with a relatively high overclock and with XFX's other cards coming in a little slower. That said, the 1411MHz overclock of the standard Sapphire Nitro+ ends up a touch higher than the 1408MHz of the GTS, so in most results the Sapphire comes in just barely ahead. XFX mentioned to me that the GTS went through qualification testing to run at 1425MHz, and I really think that would have been just enough to push it to the top of the RX 580s for performance, with the exception of the more expensive LE Nitro+. That said, even with the lower clock speed, it performed extremely well in all of our tests. 1080p and 1440p are extremely playable for anyone looking at the RX 580, as is VR, making this a good sweet-spot card for most gamers.

    The GTS did especially well in our cooling testing, where it had a great fan profile that kept it running cool without having to dig in and change the settings manually. In addition, at 100% fan speed it matched the cooling performance of the Sapphire Nitro+, and the fans stayed quiet even when cranked up. As far as performance goes, really only the power usage was a concern, but I think that will be an issue with all 580s; the GTS actually pulled even less than the Sapphire.

    My big issue with the card though was with the overall aesthetics. The fake carbon fiber finish was a little too much for me and the fan shroud design as a whole wasn’t my favorite. I loved that they included a backplate, but the weird part at the end of the card didn’t make much sense to me. The card could have been much shorter and they didn’t even use the space inside the end for anything. Now if you are packing the card in your PC and all you will see is the top edge, it's not going to look too bad. I just wouldn’t pick the GTS first if I had a case like our Crush project build with a side window facing the fans.

    So is the XFX RX580 GTS the RX 580 to pick up? In a lot of ways it is a great card, and the performance is most certainly there. It really just depends on your case configuration. If you don't mind the styling, or if you won't see it in your build, the GTS is going to make you a very happy person when gaming at 1080p and 1440p. If aesthetics are a big deal to you, you might want to check out the XFX GTR models with the older cooler design, or the other options on the market.

    -> https://lanoc.org/review/video-cards/7502-xfx-rx580-8gb-gts?showall=
     
  18. OnnA

    OnnA Ancient Guru

    Messages:
    13,896
    Likes Received:
    3,647
    GPU:
    3080Ti VISION OC
  19. Turanis

    Turanis Ancient Guru

    Messages:
    1,779
    Likes Received:
    475
    GPU:
    Gigabyte RX500
    PhysX was good back in its day, but nVidia destroyed it.

    What I love about the Radeon Crimson drivers is that you can change the tessellation level from 64x down to 8-16x.
    And voilà, you'll have more fps in games.

    If you are on the nVidia side you cannot do that, and with every sponsored DX11 game they push the tessellation level moar & moar to the limit.
    That's why every new NV card "shines" vs the old cards; if you could change that tessellation level, old cards would run better in new games. But nVidia will never add an option to change the tessellation level.
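
    Why capping that driver override helps so much can be sketched with simple arithmetic: for uniform tessellation of a patch, the generated triangle count grows roughly with the square of the tessellation factor (a simplification; the exact D3D11 tessellator rules and constant factors are ignored here), so dropping a 64x override to 16x cuts geometry work by roughly 16x.

```python
# Rough sketch: triangle count per patch grows ~quadratically with the
# tessellation factor (exact D3D11 tessellator rules and constant
# factors ignored), which is why capping a 64x override helps fps.

def relative_triangles(factor: int) -> int:
    """Approximate relative triangle count per patch at a given factor."""
    return factor * factor

reduction = relative_triangles(64) / relative_triangles(16)
print(f"64x -> 16x cap: ~{reduction:.0f}x fewer triangles per patch")  # ~16x
```

    Since many of those extra triangles end up smaller than a pixel, the cap often costs little visible quality for a large drop in vertex and rasterizer work.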
     
    Last edited: Apr 20, 2017
  20. Noisiv

    Noisiv Ancient Guru

    Messages:
    8,103
    Likes Received:
    1,389
    GPU:
    2070 Super

    NV screwed up with PhysX, not completely unlike what they are doing with G-Sync now.

    Proprietary is good as long as it offers a clear advantage and is able to stand on its own, like CUDA for example.

    But once competing techniques catch up, you ought to release it into the wild. Say, like what AMD did with Mantle: instead of having a dead duck, they have their code in the new Vulkan standard.
    Not ****ing rocket science :D
     

Share This Page