Review: AMD Radeon R9 Fury X

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jun 24, 2015.

  1. eclap

    eclap Banned

    Messages:
    31,497
    Likes Received:
    3
    GPU:
    Palit GR 1080 2000/11000
    The 8800gtx wasn't out when Oblivion launched; the top nVidia card back then was the 7800gtx. I was running a 6600gt prior to the Oblivion launch, then upgraded to the second best single GPU on the market, the 7800gt. The game still ran like a pile of steaming sh1t. I wouldn't draw conclusions based on Oblivion. In fact, my x1800xt and x1900xtx were a massive upgrade for Oblivion, those cards were beasts.

    EDIT: Actually, thinking back, the 7900gtx was the top nVidia single gpu card back then. My bad.
     
    Last edited: Jun 24, 2015
  2. ---TK---

    ---TK--- Ancient Guru

    Messages:
    22,112
    Likes Received:
    2
    GPU:
    2x 980Ti Gaming 1430/7296
    The Fury will be more likely to change in price than the Ti.
     
  3. bigosik

    bigosik Banned

    Messages:
    161
    Likes Received:
    0
    GPU:
    3072
    New review :)

    I think we have some CPU-demanding scenes here.
    Link: http://pclab.pl/art64026-23.html

     
  4. Undying

    Undying Ancient Guru

    Messages:
    12,825
    Likes Received:
    2,141
    GPU:
    Aorus RX580 XTR 8GB
    Ok, where is the Fury (non-X)? Maybe we can expect similar performance at a cheaper price, just like the 290 did vs the 290X.

    The Nano will be coming as well; it could get interesting very soon.

    bigosik, stop posting those links from that Polish site, it's one sided and no one gives a crap.
     
    Last edited: Jun 24, 2015

  5. theoneofgod

    theoneofgod Ancient Guru

    Messages:
    4,175
    Likes Received:
    86
    GPU:
    RX 580 8GB
    Next month I think.
     
  6. Mineria

    Mineria Ancient Guru

    Messages:
    3,931
    Likes Received:
    40
    GPU:
    Asus RTX 2080 Super
    What a shame, I was hoping that AMD would pull an ace out of their sleeve rather than come with a card that doesn't suffice for 4K gaming.
    Not saying that the card is bad, but compared to its competitor it doesn't pack anything extra, rather the opposite.
    I highly doubt that driver updates/tweaks will be able to fix anything for the card.
    No reason for me to switch now, but I already had the suspicion that the Fury wouldn't be as super as some fake leaks and people's HBM performance assumptions claimed.
    It just didn't make sense with the rest of the card's stats; you just can't push more than what the rest of a card is up to, simple as that.

    Hopefully AMD manages to come up with something that clearly beats NVidia with the next generation of cards.
    They won't keep surviving with on-par performance and fewer features, which could also lead to even higher prices.
     
  7. A M D BugBear

    A M D BugBear Ancient Guru

    Messages:
    3,168
    Likes Received:
    311
    GPU:
    4 GTX 970-Quad Sli
    I meant I was testing the card with Oblivion.
     
  8. theoneofgod

    theoneofgod Ancient Guru

    Messages:
    4,175
    Likes Received:
    86
    GPU:
    RX 580 8GB
    Driver optimizations can make all the difference; it's not a question of whether they can, but whether it will be done. AMD already has a huge driver overhead compared to NVIDIA (lack of multi-threaded DX11); imagine if AMD did support it.
    It's all drivers. AMD has some great hardware and it's always let down by drivers. They've just made their current branch support the new cards, and the previous models have been separated from it for some reason. There's nothing new there.
     
  9. Isbre

    Isbre Member Guru

    Messages:
    196
    Likes Received:
    0
    GPU:
    290X Matrix H2O
    Seems to shine at 2160p. A shame it does not offer downsampling at 120Hz at that resolution, only 60Hz, so that forces me to buy from the green team. (Hope I'm wrong.)
     
  10. Agent-A01

    Agent-A01 Ancient Guru

    Messages:
    11,363
    Likes Received:
    899
    GPU:
    1080Ti H20
    No, we didn't expect it to be, but we all hoped it would be faster.

    Anyway guys, I caved and bought a reference 980 Ti and an EK block...
     

  11. Loophole35

    Loophole35 Ancient Guru

    Messages:
    9,587
    Likes Received:
    1,001
    GPU:
    EVGA 1080ti SC
    Where are you getting EK blocks from in the States now? FrozenCPU haven't updated since the 780Ti.
     
  12. bigosik

    bigosik Banned

    Messages:
    161
    Likes Received:
    0
    GPU:
    3072

    It's the same on different sites.

    CPU-Benchmarks & Frametimes


    Link:
    http://www.computerbase.de/2015-04/gta-v-grafikkarten-und-prozessor-benchmarks/2/


    The frametimes on NV cards are way better, smoother gameplay.
    Very high spikes on the 290X.

    10 FPS more (CPU test)
     
    Last edited: Jun 24, 2015
  13. wiak

    wiak Member

    Messages:
    29
    Likes Received:
    0
    GPU:
    RX 480 Gaming X
    Well, both Intel and AMD have stated they will remove/phase out DVI/VGA/LVDS,
    5 freaking years ago...

    http://newsroom.intel.com/community...digital-display-technology-phasing-out-analog

    and here is one from AMD 3 years ago:
    http://www.pcworld.com/article/248421/vga_dvi_display_interfaces_to_bow_out_in_five_years.html

    So please stop whining and start using DP to DVI-DL adapters and DP1.2 to HDMI 2.0 adapters (coming this summer);
    you won't find DVI/VGA on Intel products in the future either.

    I also believe some AIB partners will create cards with DVI/HDMI 2.0 on them by using DP1.2 to HDMI 2.0 chips or different layouts for DVI-DL (no HDMI, anyone?).
    Sapphire did this on the "SAPPHIRE R7 250 1GB GDDR5 EYEFINITY EDITION".
     
  14. eclap

    eclap Banned

    Messages:
    31,497
    Likes Received:
    3
    GPU:
    Palit GR 1080 2000/11000
    Why so angry? Not a bad 2nd post after being away for 5 years. Look at you go!
     
  15. Chillin

    Chillin Ancient Guru

    Messages:
    6,814
    Likes Received:
    1
    GPU:
    -
    Where's your source for that? Back that up, since historically they don't. And just to repeat my evidence for that:

    Why do people think drivers are going to magically speed up the card by magnitudes? Here's the breakdown for the 290x from release driver (13.11 Beta 6) to driver at time of review (Catalyst 14.7 RC1, about 9 months after launch):

    -In 3DMark the R9 290X gained 2.5% in FireStrike and there was no change in FireStrike Extreme.
    -In Battlefield 4 the R9 290X gained 1.3%.
    -In Bioshock Infinite the R9 290X saw no change.
    -In Metro Last Light the R9 290X gained 1.8%.
    -In Sleeping Dogs the R9 290X saw no change.
    -In Tomb Raider the R9 290X saw no change.

    http://www.ete-knix.com/examining-amds-driver-progress-since-launch-drivers-r9-290x-hd-7970/

    And Hexus's test came out the same for the course of a year's driver updates (Jan-Dec 2014):
    Average AMD 290: +4.1%
    Average Nvidia 780 Ti: +6.2%

    http://hexus.net/tech/reviews/graphics/79245-amd-nvidias-2014-driver-progress/?page=6

    So where does your claim, and that of several other people here, that drivers will massively improve performance come from?
     

  16. TestDrivers

    TestDrivers Member Guru

    Messages:
    109
    Likes Received:
    3
    GPU:
    MSI 980Ti Gaming 6G
    When the 780Ti came out, it beat the 290X in almost every game. Now we see it lags behind in almost every game. So perhaps in a year's time the Fury X will beat the 980Ti...
     
  17. theoneofgod

    theoneofgod Ancient Guru

    Messages:
    4,175
    Likes Received:
    86
    GPU:
    RX 580 8GB
    I said that they can (if they worked on it) but they won't...
     
  18. bigosik

    bigosik Banned

    Messages:
    161
    Likes Received:
    0
    GPU:
    3072
  19. Undying

    Undying Ancient Guru

    Messages:
    12,825
    Likes Received:
    2,141
    GPU:
    Aorus RX580 XTR 8GB
  20. eclap

    eclap Banned

    Messages:
    31,497
    Likes Received:
    3
    GPU:
    Palit GR 1080 2000/11000
    I don't know whether to laugh or just hold my head in my hands. I laughed first though, I'll give you that.

    clutching for them straws, I see.
     
    Last edited: Jun 24, 2015
