Radeon Fury X Beats GeForce GTX Titan X and Fury to GTX 980 Ti: 3DMark Bench

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jun 17, 2015.

  1. TyrantofJustice

    TyrantofJustice Ancient Guru

    Messages:
    5,021
    Likes Received:
    26
    GPU:
    RTX 2080 super
    eclap.... link me to a 1440p adapter
     
  2. xIcarus

    xIcarus Master Guru

    Messages:
    945
    Likes Received:
    90
    GPU:
    1080 Ti AORUS
    Hm, you are right, didn't think about that.
    I assumed it had 4x 8-pin connectors, but I never actually pay much attention to dual-GPU cards.
     
  3. Aura89

    Aura89 Ancient Guru

    Messages:
    7,875
    Likes Received:
    1,043
    GPU:
    -
    Fixed that for you

    In terms of how you just tried to compare the two, you can't. I compared them because the idea that "one person says they don't have a problem, therefore the problem doesn't exist" is useless.

    Your comparison is hardware breakage vs. software that hopefully gets fixed, but which can still mostly play your games alright even if it doesn't.

    One prevents all possible usage while the other doesn't, so of course Microsoft is going to do something about it, and of course it would get widely publicized.

    In other words, there's no need for AMD to "extend their warranty" or take "millions/billions of dollars in write-offs" due to their drivers being horrible.
     
  4. Agonist

    Agonist Ancient Guru

    Messages:
    2,823
    Likes Received:
    205
    GPU:
    RX Vega 56 8GB
    I used DisplayPort on my R9 290, and do now with my 670 SLI.
    I have the LG 25UM55-P.
    It has 2x HDMI and DisplayPort.
     

  5. eclap

    eclap Banned

    Messages:
    31,497
    Likes Received:
    3
    GPU:
    Palit GR 1080 2000/11000
    According to ToastyX, this adapter is supposed to go up to a 400 MHz pixel clock. The advertised 330 MHz pixel clock limit would equate to 1440p at 82 Hz; 400 MHz would be good for roughly 1440p at 100 Hz. Problem is, it's $95. It's utter BS that AMD are dropping DVI-D support.
     
    Last edited: Jun 17, 2015
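    For anyone wondering where those refresh numbers come from, here's a quick sketch of the pixel-clock arithmetic. The blanking intervals are an assumption on my part (CVT reduced-blanking style values for 2560x1440), not something from the adapter's spec sheet:

    ```python
    # Rough check of the pixel-clock figures above. Assumes CVT-RB
    # (reduced blanking) timings for 2560x1440: 160 px of horizontal
    # and 41 lines of vertical blanking (approximate, assumed values).
    H_ACTIVE, V_ACTIVE = 2560, 1440
    H_BLANK, V_BLANK = 160, 41

    def max_refresh_hz(pixel_clock_hz: float) -> float:
        """Highest refresh rate a given pixel clock can drive."""
        total_pixels = (H_ACTIVE + H_BLANK) * (V_ACTIVE + V_BLANK)
        return pixel_clock_hz / total_pixels

    print(round(max_refresh_hz(330e6)))  # -> 82 (the advertised 330 MHz limit)
    print(round(max_refresh_hz(400e6)))  # -> 99 (roughly 100 Hz at 400 MHz)
    ```

    With standard (non-reduced) blanking the totals are larger, so the achievable refresh rates would come out lower.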
  6. mR Yellow

    mR Yellow Ancient Guru

    Messages:
    1,935
    Likes Received:
    0
    GPU:
    Sapphire R9 Fury
    I'm sure they will ship the cards with an adapter.
    It would be stupid not to. DVI is so old school :p
     
  7. Deathchild

    Deathchild Ancient Guru

    Messages:
    3,970
    Likes Received:
    2
    GPU:
    -
    ^ Yeah, but what if that adapter doesn't support more than 60 Hz and... 1080p etc... lol. o,O

    I'm telling you man, ****ing ridiculous... they could've at least added one DVI-D port.. geez.... all these QNIX users and everything lol.

    If Fury X doesn't come out with a DVI-D port I'm not getting it.. I'm getting a 980 Ti instead lol... so be it, but it's ****ing expensive.. -.-'

    Not splashing $80 on a ****ing adapter... wtf. I can add that $80 and get a 980 Ti instead lol. :D
     
  8. mR Yellow

    mR Yellow Ancient Guru

    Messages:
    1,935
    Likes Received:
    0
    GPU:
    Sapphire R9 Fury
    lol, i'm sure you'll regret it once DX12 titles appear :)

    Maybe prices will drop due to supply and demand.
     
  9. Deathchild

    Deathchild Ancient Guru

    Messages:
    3,970
    Likes Received:
    2
    GPU:
    -
    Regret what, getting a 980 Ti ? :D Or.. what.
     
  10. eclap

    eclap Banned

    Messages:
    31,497
    Likes Received:
    3
    GPU:
    Palit GR 1080 2000/11000
    They won't ship the cards with adaptors capable of high pixel clocks.
     

  11. mR Yellow

    mR Yellow Ancient Guru

    Messages:
    1,935
    Likes Received:
    0
    GPU:
    Sapphire R9 Fury
    980 Ti... methinks DX12 is where AMD will dominate.

    @ Eclap, most people won't care. They just want their frikken DVI port :p
     
  12. eclap

    eclap Banned

    Messages:
    31,497
    Likes Received:
    3
    GPU:
    Palit GR 1080 2000/11000

    You either have some pre-release DX12 benches or you're full of fanboy juice.
     
  13. Deathchild

    Deathchild Ancient Guru

    Messages:
    3,970
    Likes Received:
    2
    GPU:
    -
    Why DX12? Lots of unit count... what, draw calls? :D Don't think it makes any difference.

    It will be good for both cards. Gaming overall, for that matter. :D

    And why is everyone saying adaptor, isn't the correct word ADAPTER? ffs.. :D

    rofl, fanboy juice hahaha rofl... yellow. :D

    Anyways... yellow, of course ppl will care, I don't want to run fking 60 Hz, it sucks bro. The difference between 114 Hz and 60 Hz is like black and white.
     
  14. mR Yellow

    mR Yellow Ancient Guru

    Messages:
    1,935
    Likes Received:
    0
    GPU:
    Sapphire R9 Fury
    I'm going on a hunch ;)
     
  15. eclap

    eclap Banned

    Messages:
    31,497
    Likes Received:
    3
    GPU:
    Palit GR 1080 2000/11000
    erm.. do you even read? That's exactly what I've been talking about! I want a DVI-D port on new AMD cards. They either put one on there or it's going to cost me $95 for an adapter. Guess what I'll do? I'll get a 980 Ti.
     

  16. Deathchild

    Deathchild Ancient Guru

    Messages:
    3,970
    Likes Received:
    2
    GPU:
    -
    yellow was referring to the fact that they just want "DVI ports"... but it will matter for ppl running above 60 Hz, bro.
     
  17. Hirantha

    Hirantha Master Guru

    Messages:
    254
    Likes Received:
    1
    GPU:
    EVGA 980ti Hybrid (BEAST)
    If the actual numbers match what we see here, I'm definitely jumping on the AMD bandwagon. Enough of Nvidia trying to milk every cent out of poor gamers like us by giving us just a 5% increase in their cards.
     
  18. eclap

    eclap Banned

    Messages:
    31,497
    Likes Received:
    3
    GPU:
    Palit GR 1080 2000/11000
    Not to be negative, but even if these numbers are accurate, you're paying 980 Ti prices for a Fury X that performs pretty much the same, only with less VRAM. I don't see how Nvidia are the bad guys here.

    I feel like I'm coming across as bashing AMD, which isn't true at all, but if two cards perform the same and cost the same, and only the one with 2 GB more VRAM gets labeled as made by a money-grabbing company, it kinda rubs me up the wrong way.
     
  19. afaque

    afaque Member Guru

    Messages:
    130
    Likes Received:
    0
    GPU:
    x1300/g210/hd5750/r9280x
    Btw, you're trying to support Nvidia on this, but I've already seen a link in these forums to a thorough analysis of HBM which said that 4 GB of HBM is pretty much equal in performance to 6 GB of GDDR5. So there's no real difference in VRAM either. And if the benchmarks show the R9 Fury X beating the Titan X in a lot of them, I'd say it's good to go with the winning GPU at the same price, and with new tech too. :)
     
  20. sounar

    sounar Master Guru

    Messages:
    696
    Likes Received:
    0
    GPU:
    EVGA 980TI SC
    Well, that's not entirely true. AMD's Fury memory, HBM, promises to deliver 4.5 times the bandwidth of GDDR5 and is said to be up to 9 times faster than GDDR5, which is what the current 980 Ti carries. In theory this means video memory can be brought in and released multiple times quicker than with GDDR5, resulting in a lot less VRAM usage. There are articles going around stating that Fury's 4 GB of HBM is equal to 6 GB of GDDR5.
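    Worth noting: on the publicly listed specs, the total-bandwidth gap is smaller than those headline multipliers. A back-of-the-envelope sketch, assuming the commonly quoted figures (Fury X: 4096-bit HBM at 1 Gbps per pin; 980 Ti: 384-bit GDDR5 at 7 Gbps):

    ```python
    # Peak theoretical memory bandwidth from bus width and per-pin data rate.
    # The specs plugged in below are assumptions from public spec sheets,
    # not measurements.
    def bandwidth_gbs(bus_width_bits: int, gbps_per_pin: float) -> float:
        """Peak bandwidth in GB/s (divide by 8 to convert bits to bytes)."""
        return bus_width_bits * gbps_per_pin / 8

    hbm = bandwidth_gbs(4096, 1.0)    # Fury X -> 512.0 GB/s
    gddr5 = bandwidth_gbs(384, 7.0)   # 980 Ti -> 336.0 GB/s

    print(hbm, gddr5)
    print(round(hbm / gddr5, 2))      # -> 1.52
    ```

    So total bandwidth comes out around 1.5x, which suggests the "4.5x" and "9x" figures refer to per-watt or per-chip comparisons rather than the whole card.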
     
