AMD Radeon R9 Fury X - Official Benchmarks

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jun 19, 2015.

  1. Clouseau

    Clouseau Ancient Guru

    Messages:
    2,844
    Likes Received:
    514
    GPU:
    ZOTAC AMP RTX 3070
    Let's get to the heart of the matter: how important are FreeSync and G-Sync? Really, how important are those technologies when it comes to gaming? If they are just frosting with no substance, then fine, let them rot and die from the scene. Otherwise, why have we not all gone out and purchased such a monitor already? Because all that really matters is that the card is able to function within those boundaries 100% of the time. That is it. So all this bellyaching about DVI just means that those sync technologies amount to a pile of BS. Or is it that screen manufacturers have not put out a decent sync monitor? Then once again the complaining is misplaced and should be focused on the monitor manufacturers.
     
  2. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
    There really is no bellyaching. It's like 4-5 people complaining about DVI on this forum -- in the grand scheme of things, the number of people who own Korean monitors makes up less than .01% of the people who buy GPUs. I get that it sucks if you own one and were looking forward to buying this. But I bet people with floppy disks, HD DVDs, 8-track tapes, etc. were also upset at some point.

    G-Sync/Freesync make a huge difference, but no one is going to go out and spend $500 for a feature that they can't see on a review site. People barely upgrade their monitors as is, let alone for one feature. It also doesn't help that the best monitor (Swift) was a pile of garbage in terms of reliability/QA and the Freesync stuff just started launching recently (some with their own problems). For 4K I think both those technologies are even more of a necessity and as 4K catches on I think people will just buy monitors with FSync/GSync and continue to purchase them in the future.
     
  3. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    ASUS has a great G-Sync monitor: IPS, 1440p @ 144Hz. It may not be true IPS, but whatever. It is the priciest of those monitors.
    Then ASUS makes a real IPS 1440p @ 144Hz with FreeSync at a reasonable price. Too bad they allow FreeSync to run only between 35 and 90Hz.

    So you are spot on. Someone is failing, and it is not VESA, nor AMD or nVidia. The failure is in manufacturers producing half-assed products and asking a lot more than they are worth, so they can sell a fully working refresh in two years.
    Or they give you a proper product, but at an extreme price.
     
  4. MaxBlade

    MaxBlade Master Guru

    Messages:
    907
    Likes Received:
    13
    GPU:
    980Ti/970/390/+ more
    This is just getting silly. Everyone's popping up with "we have the REAL benchmarks". And duh, they're all honest.

    This is why YOU buy the card and YOU test it. You get a good idea when you read a review, but for some odd reason you never seem to get the same results. lol
     

  5. Humanoid_1

    Humanoid_1 Guest

    Messages:
    959
    Likes Received:
    66
    GPU:
    MSI RTX 2080 X Trio
    One thing I will say about the Fury X is they have made a smart choice of fan to bolt onto the water cooler.

    It is made by Nidec Servo Corporation, but was sold by Scythe as the Gentle Typhoon.

    The duct ring incorporated around the blades allows this 25mm-deep fan to have the characteristics of a 38mm-deep unit while also helping reduce blade noise - so it has good static pressure, which is perfect for radiator use.

    Assuming it is the "low end" model, it has a maximum speed of 3,000 rpm while producing only 36.5 dBA.

    This thing at max speed can really move some air: 83.0 CFM = 141 m³/h
    (that is a lot more air than most 120mm fans move)
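
    For what it's worth, here is a quick sanity check of that unit conversion (nothing exotic, just CFM to m³/h; the 83.0 CFM figure is the one quoted above):

        # Convert airflow from cubic feet per minute (CFM) to cubic metres per hour.
        # 1 ft = 0.3048 m, so 1 ft^3 = 0.3048**3 m^3; 60 minutes per hour.
        M3_PER_FT3 = 0.3048 ** 3

        def cfm_to_m3h(cfm: float) -> float:
            return cfm * M3_PER_FT3 * 60

        print(round(cfm_to_m3h(83.0), 1))  # -> 141.0, matching the figure quoted above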

    Potentially, if it is a higher-end model, it is capable of shifting a lot more air when cranked up!


    We now know the Fury X only pulls up to 275W under load, but we've also heard AMD bragging about the hugely over-engineered power supply circuitry they claim will make it an amazing overclocker. Well, with this fan it seems to have the ability to dissipate a lot more heat than it will produce at stock, so it could be a bit of a monster when pushed :thumbup:
     
    Last edited: Jun 20, 2015
  6. rl66

    rl66 Ancient Guru

    Messages:
    3,931
    Likes Received:
    840
    GPU:
    Sapphire RX 6700 XT
    Just a few things:

    Is 2 GB of DDR3 better, or 1 GB of GDDR5?
    In most cases 1 GB of GDDR5 does better... it will be the same with the new memory; right now the Fury's exclusive memory is great news...

    Why always compare to the Titan X? It's an entry-level semi-pro card and not expensive for what it is... for gaming, the 980 Ti does better for a lot less money. The Fury is not in competition with the Titan X, as there is no equivalent in AMD's range.
    (Off topic: I think the Titan range is too expensive for gaming and not enough "with balls" for pro use... a Quadro is better for that... but if some people buy it, then why not sell it :) ).

    Just for the fanboys (whatever your preferred color): wait for the review... "Who is more powerful, the hippo or the elephant?" is more like masochistic, endless wondering... both will be outdated in a few months lol.

    And finally, about Hilbert: he is most of the time very precise and will tell you if NVIDIA or AMD suck... he never has a "color" (even if sometimes he writes what you don't want to read lol).

    So yes, I'll wait for the review with a mojito on the beach, and I don't think this news needs so much fever...

    *edit* And about the $ to € conversion: not every country pays the same price.
    1 US$ = 0.880545234 €
    In my country you add 21% tax + a 0.10% recycling tax :banana:
    150 km away (crossing the border) the tax is only about 15%, and that's all :)
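
    To make that concrete, here is a back-of-the-envelope sketch using the rate and tax figures quoted above (the $649 list price is only an assumed, illustrative figure, not something from the benchmarks):

        # Rough local price estimate from a USD list price, using the numbers above.
        # Assumes both taxes are applied to the converted pre-tax price.
        USD_TO_EUR = 0.880545234   # exchange rate quoted in the post
        VAT = 0.21                 # 21% sales tax
        RECYCLE_TAX = 0.001        # 0.10% recycling tax

        def local_price_eur(usd_price: float) -> float:
            base = usd_price * USD_TO_EUR
            return base * (1 + VAT + RECYCLE_TAX)

        print(round(local_price_eur(649.0), 2))  # ~692.05 EUR for a hypothetical $649 card

    In the neighboring country's 15% bracket, the same hypothetical card would land around 657 EUR, which is the point: the USD-to-EUR sticker comparison shifts a lot with local tax.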
     
    Last edited: Jun 19, 2015
  7. Bansaku

    Bansaku Guest

    Messages:
    159
    Likes Received:
    7
    GPU:
    Gigabyte RX Vega 64
    Nice benchmarks! This just confirms that my CFX HD7950s are still relevant as they pull in similar frame rates at the same settings. And that's before W10 and DX12. Looks like I will be keeping my money for yet another generation.
     
  8. waltc3

    waltc3 Maha Guru

    Messages:
    1,445
    Likes Received:
    562
    GPU:
    AMD 50th Ann 5700XT
    Lots of this stuff is marketing--after all, what people perceive to be true is often more compelling for them than what is actually true...;) It's human nature.

    Of course, tech like HBM is not marketing but very real & substantial...but Freesync & Gsync...

    When Gsync "demos" first started making the rounds, every example of "Here's what the game looks like with no gsync support" was noticeably worse--sometimes much worse--than what I was seeing with my middle-of-the-road GPU. It was actually very funny to me and I thought, "Man, if this is how bad nVidia tech has gotten over the years then I genuinely feel sorry for those folks because they really do need something like gsync!"

    But really, nVidia tech is not *that bad*--no way, and the "non-gsync" examples featured on so many web sites were of course deliberately crummied up to create the illusion of this huge contrast between gsync on and gsync off. My non-gsync, non-freesync gaming has *never* looked as bad as those examples designed to puff up the importance of a gimmick feature--not even close. It's pure marketing, imo.
     
  9. ---TK---

    ---TK--- Guest

    Messages:
    22,104
    Likes Received:
    3
    GPU:
    2x 980Ti Gaming 1430/7296
    Lmao. Why keep bringing up the people who were complaining about no DVI? We Korean monitor guys said our piece already. Or do some people just like complaining about other people complaining?
     
  10. theoneofgod

    theoneofgod Ancient Guru

    Messages:
    4,677
    Likes Received:
    287
    GPU:
    RX 580 8GB

    Reviews give us an idea. Something to compare with. Without them you probably would make a bad decision based on things like "It's a nice colour" or "The lights are pretty" instead of actual gaming performance. That's why we're here :)
    I can't afford to spend a grand on a GPU to see how it performs :D
     
    Last edited: Jun 20, 2015

  11. Rage

    Rage Member

    Messages:
    18
    Likes Received:
    8
    GPU:
    EVGA GTX680 2GB
    Another AMD fan with super low expectations? If AMD releases a "cutting edge" card that merely competes, at the same price, with an already available nVidia card on the year-old Maxwell architecture? To me that's a fail. All nVidia would have to do is drop the price once the Fury/Fury X hit the shelves and they'd have the better bang for the buck. Not to mention all the intangibles that come with nVidia GPUs (better driver support, better software). AMD needs to TKO the 980Ti if they want to gain any kind of ground between now and Pascal's launch. We'll see once we get real benchmarks from G3D later.

    Company X: "Our products are the best!"
    Consumer: "YES! Congratulations Company X! Your products are officially the best, because you said so!"

    What am I reading here? The raw performance advantage is always on AMD's side? What alternate reality are you coming from? NV has been walking all over AMD GPUs since the HD5000 series cards. So desperate for a win over NV that you're just taking AMD's own in-house results as gospel. It's unreal.

    Read your own post. "If Fury X beats GTX980ti that is well and good, if Titan X beats Fury X then that is also good news since it(Titan X) at least needs to justify its $350 give or take, higher price tag."

    The 980TI beats the TX.
    The 980TI beats the TX.
    The 980TI beats the TX.
    I said that three times so it would sink in. If the Fury X beats the 980TI and the 980TI beats the TX, what does that tell you about the Fury X versus the TX? Mind blown?

    Also, the 980TI and Fury X are both the same price. So you could say the TX doesn't justify its $350 higher price tag (over the 980TI) and you would be partially correct, but you shouldn't. The Titan cards aren't designed strictly with gamers in mind. They do (non-game-related) things better than the 980TI/Fury X, but the 980TI/Fury X will probably both be better for gaming.

    What in the world are you talking about? I believe your first mistake is assuming that techpowerup rebenchmarks the older cards every time a new card comes out. They aren't. They take the results from previous reference benchmarks and use those results for comparison against the new hardware's numbers. Just like the guys here at Guru aren't breaking out the HD6950 to benchmark against the 980TI for this article.

    And performance decreases from successive driver updates? What? Firstly, not every driver update is intended to increase performance. Secondly, driver updates have been steadily bringing performance increases since the 600 series (when I jumped off the sinking AMD ship). Rather than going off the loud minority's whining, did you ever look at how the drivers were actually performing in benchmarks specifically set up to measure that?

    "So far, we would recommend upgrading to the latest GeForce 350.12 driver because there are generally incremental advantages, and no large performance-impacting negatives that we encountered"
    -- http://www.babeltechreviews.com/geforce-350-12-whql-mini-performance-analysis/view-all/
    As should be expected.

    "The results demonstrate that, for 1080p gaming at least, a year's worth of driver updates roughly equates to a five per cent increase in graphics performance. In our particular suite of tests both Nvidia cards yielded higher average increases than their AMD counterparts, although, margin of error makes it too close to call a 'winner,' for want of a better term."
    -- http://hexus.net/tech/reviews/graphics/79245-amd-nvidias-2014-driver-progress/
    So the 2014 Most Improved GPU Performance Via Driver Updates award goes to... Nvidia. History repeating itself.
     
    Last edited: Jun 20, 2015
  12. yasamoka

    yasamoka Ancient Guru

    Messages:
    4,875
    Likes Received:
    259
    GPU:
    Zotac RTX 3090
    Sometimes I don't know why I even bother, but here goes.

    Yes, year old... the Titan X hit the market around April 2015. Maxwell may be a year-old architecture (it's actually more if you count the 750Ti, but I'll humor you), but that's taken out of context; GM200 has only been available for 2-3 months now, and the affordable variant of such has only been available for a month, in the form of the 980Ti.

    I smell a fanboy, and the stench is pretty awful. I wonder how you would call Nvidia's GPUs "walking" over AMD's since the 5000 series since:

    -HD5000 released December 2009 and held the performance crown for FOUR months, stomping out the GTX260, GTX280, GTX295 at the time.

    -The HD5970 was pretty much the fastest card for a very long time.

    -The GTX470 and GTX480 were power suckers and heat monsters, even with the GTX480 slightly ahead of the HD5870 after four whole months of waiting for release.

    -AMD's Eyefinity had just released and was a big thing at the time. There was *no* Nvidia Surround yet.

    -AMD issues a slight upgrade to its range with the 6000 series, and Nvidia does the same with the 500 series. GTX580 holds the performance crown, but AMD's prices are killer.

    -HD7000 series vs. GTX670 / 680. I won't comment much about this, but guess which cards are still relevant nowadays and which ones are pretty much obsolete? HD7000 had more VRAM, faster memory, beefier GPU, much better compute, scaled better in CrossFire, AND were cheaper. Yeah, good luck convincing anyone how Nvidia were walking all over AMD during 2012-2013.

    I guess those three times got you pretty convinced yourself as well. The 980Ti does not, and cannot possibly beat the Titan X at the same clockspeed. Both cards overclock pretty well, and if you're comparing an overclocked, non-reference 980Ti with a stock Titan X, I'm afraid I'll have to tell you the obvious.

    At similar clock speeds, the Titan X beats the 980Ti.
    At similar clock speeds, the Titan X beats the 980Ti.
    At similar clock speeds, the Titan X beats the 980Ti.

    However, the marginal difference is not worth the Titan X's price which is why the 980Ti is the better card, plain and simple. Just don't mislead people into the crazy, unbelievable idea that the 980Ti beats the Titan X because it does not.

    Should have read up on Maxwell's lack of double-precision compute power, then. Maxwell in general is crippled in double precision, unlike GK110 (Big Kepler). That includes GM204, GM206, GM200, and whatever other variants of Maxwell are out there. So the GM200 in the 980Ti and Titan X is just a bigger version of the GM204 in the 970 / 980. No additional double precision, nothing. Just more VRAM and a faster memory subsystem (due to the increase in bus width from 256-bit to 384-bit).

    The Titan X is no workstation card, unless you're doing graphical rendering that does NOT require Double Precision. In the case where it does, the Titan X is not even on the list of cards to be considered, and Quadros, Teslas, and FirePros take over from there.

    Peace :)
     
    Last edited: Jun 20, 2015
  13. Hughesy

    Hughesy Guest

    Messages:
    357
    Likes Received:
    1
    GPU:
    MSI Twin Frozr 980
    Yea that was a very smart move. I still have three Gentle Typhoons that I used when I had a custom loop, and they are by far the best fans IMO.

    If I didn't have a 980 I probably would have bought the Fury X, as my 1440p Dell monitor can do 1440p over DP. It's only over HDMI that it can't. I'll probably get the next card from AMD or Nvidia. I like that the Fury X uses watercooling, as due to my disability I couldn't maintain my custom loop, so this is the next best thing.
     
  14. mR Yellow

    mR Yellow Ancient Guru

    Messages:
    1,935
    Likes Received:
    0
    GPU:
    Sapphire R9 Fury
    Hilbert, when can we expect Guru's review?
     
  15. snip3r_3

    snip3r_3 Guest

    Messages:
    2,981
    Likes Received:
    0
    GPU:
    1070

    I think he has a few points right though. Maxwell, as in the architecture, is already poised to be replaced soon. While technically AMD is also using an older architecture (Tonga), it is already leveraging a next-generation memory system via HBM. All the hype surrounding it has propped Fiji up to a position where it simply cannot meet the expectations set for it. I think people want to see bigger differences between the two vendors; +/- 5-10% is marginal at best and wouldn't actually let you play games on one top-end card that you couldn't play on the other.

    As for the previous generations... AMD had better performance but that was about when they started to bleed market share. The winners of those generations via sales were the GTX460, 560Ti, and with the Kepler/GCN generation, the mining boom strangled and cut off supply of basically ALL GCN cards (well, the bigger 78xx and 79xx) for a good half a year or more. The prices were insane (I got 2 7950s before the boom, luckily) and likely drove sales towards Nvidia even though Tahiti was generally a bit quicker than Kepler (though at the time it lacked Mantle, VSR, and VCE enabled recording).

    Someone posted the market share chart in another thread, and AMD really needs to step up their game, because just matching Nvidia isn't good enough. They need a solid upper-midrange, C/P card like the 670, the 780s, and now the 970. They also need to step up their mainstream cards, as that bracket is often the one with the most sales via the OEM/HTPC market; TDP and efficiency are important down there. By focusing only on the enthusiast tier, even if AMD wiped the floor this generation, it still wouldn't generate many more sales for them, simply because there aren't that many consumers who are tech savvy enough or have pockets deep enough for $600 GPUs. With the rest of the midrange and mainstream tiers being complete rebrands, their appeal will be much lower. Nvidia also has more room than AMD (with Hawaii/Grenada) to cut GM204's price, due to it being less complex (-1B transistors, a 256-bit vs. 512-bit memory bus, theoretically less cost needed for VRM/cooling). Regardless of whether Fiji is the top dog this generation and hence becomes the "halo" product, AMD doesn't have a compelling win in the rest of the segments (seriously, just buy a vastly cheaper but "old" 200 series card).

    As for TitanX, I think the purpose this time was really for the rendering crowd, which is still sizable, and more "consumer/prosumer" vs HPC's generally scientific or industrial origins (=they have more money to spend). By making the past Titans good at everything, they might have cannibalized their Tesla and Quadros (maybe not so much due to locked features), which are multiple times more expensive.
     

  16. zcess81

    zcess81 Guest

    Messages:
    8
    Likes Received:
    0
    GPU:
    EVGA GTX 980 Ti SC
    I already bought EVGA GTX 980ti SC ACX and I do not regret the decision at all. Amazing card. Whisper quiet, cool and destroys any game I throw at it at 1440p...and it does all this at 250W. Personally, I wouldn't go for AMD because I've had driver issues in the past, and AMD are notorious for long driver update waiting times.

    Having said that, I hope their new card does well, because the last thing I want is an NVIDIA monopoly in the graphics card market. They're already leading by quite a margin, but without AMD, prices would be even higher than they already are, and we would see little to no innovation/performance increase from generation to generation.

    I hope Fury X does well or even beats 980ti -- makes no difference to me. I'm happy with my 980ti and that's all that matters.
     
  17. Sukovsky

    Sukovsky Guest

    Messages:
    967
    Likes Received:
    76
    GPU:
    GTX 1080
    I have never had any issues with drivers from AMD. Maybe it's because I know how to build a pc.

    I am getting a Fury X.
     
  18. MasterfulSaber

    MasterfulSaber Guest

    Messages:
    19
    Likes Received:
    0
    GPU:
    STRIX GTX 970|RL2455HM
    +1 for @Yasamoka on this:
    I guess those three times got you pretty convinced yourself as well. The 980Ti does not, and cannot possibly beat the Titan X at the same clockspeed. Both cards overclock pretty well, and if you're comparing an overclocked, non-reference 980Ti with a stock Titan X, I'm afraid I'll have to tell you the obvious.

    At similar clock speeds, the Titan X beats the 980Ti.
    At similar clock speeds, the Titan X beats the 980Ti.
    At similar clock speeds, the Titan X beats the 980Ti.

    However, the marginal difference is not worth the Titan X's price which is why the 980Ti is the better card, plain and simple. Just don't mislead people into the crazy, unbelievable idea that the 980Ti beats the Titan X because it does not.
    +1 again


    As for my part:
    The topic at hand is a leaked official benchmark from AMD which obviously does not involve an AIB GTX 980Ti card, meaning there was no OC and no alterations: a reference card, in fact, vs. the R9 Fury X. Everything has its context. :book:
    So yeah, at stock speeds the Titan X beats the 980Ti, just as @yasamoka said.

    And as for the thing with the Titan X's price, I was being a bit sarcastic...get it now? ;)
     
    Last edited: Jun 20, 2015
  19. 0blivious

    0blivious Ancient Guru

    Messages:
    3,301
    Likes Received:
    824
    GPU:
    7800 XT / 5700 XT
    No need to bring up CrossFire woes or any of that these days, as it's becoming absurd. Nvidia has released 8 new WHQL drivers since AMD released their current driver (Dec 2014). Some folks find the AMD way just dandy. Some find it lacking.

    Pretty much this entire forum knows how to build a PC so I'm not sure how that statement has anything to do with driver issues? Since when has building a PC improperly been the root of a driver issue anyways?
     
  20. heffeque

    heffeque Ancient Guru

    Messages:
    4,429
    Likes Received:
    208
    GPU:
    nVidia MX150
    Seems like nVidia has driver problems if they actually need to put out so many WHQL drivers in such a short period of time.
     
