AMD Teases PCs with Radeon Fury X2 Dual FIJI GPUs

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jan 25, 2016.

  1. jura11

    jura11 Guest

    Messages:
    2,640
    Likes Received:
    707
    GPU:
    RTX 3090 NvLink
Depends on what the price will be and what the OpenCL performance will be; then I might go with this card. But Polaris is around the corner (end of the year, I hope), so I don't know. It all depends on performance, in my view.



Thanks, Jura
     
  2. sammarbella

    sammarbella Guest

    Messages:
    3,929
    Likes Received:
    178
    GPU:
    290X Lightning CFX (H2O)
    This card will be another AMD PR stunt in the multi-card GPU niche.

Theoretical on-paper power and real-world USABLE power are two different things.

CrossFire support since the Crimson drivers (especially 16.1) is &%$·&, and AMD's previous dual-GPU card, the 295X2, has ZERO AMD support.

Its official page:

    www.amd.com/en-us/products/graphics/desktop/r9/295x2 still gives this message:

[IMG]
     
  3. GhostXL

    GhostXL Guest

    Messages:
    6,081
    Likes Received:
    54
    GPU:
    PNY EPIC-X RTX 4090
Yeah, it won't be obsolete, but as you said before, it's late. They would be better off focusing on their new GPUs.
     
  4. theoneofgod

    theoneofgod Ancient Guru

    Messages:
    4,677
    Likes Received:
    287
    GPU:
    RX 580 8GB
With the uncertain future of multi-GPU in general, I'm not sure about this. CrossFire support isn't improving in DX11 games. And since AFR mirrors every resource on both GPUs, you get only 4GB of effective VRAM, plus a price tag likely to follow the 295X2's.
     

  5. DeadlyBGShadow

    DeadlyBGShadow Guest

    Messages:
    15
    Likes Received:
    0
    GPU:
    RX VEGA64-O8G
The untimely release of this product, which still isn't in stores, means a short lifespan in terms of relevance, which makes it unattractive to anyone spending this much money on high-end video cards with all the new gimmicks.

Other than the ability to run current games at reasonably good FPS, you are locked into hardware that is no longer at the peak of present-day technology.
It's the best product of the old generation, a generation that is one breath away from becoming unattractive to people with money who want the new HDMI, the new GCN, the new "full DX12 support" and so on...

If you want to use VR in the months to come and have money to spend to enjoy it for, say, a year and a half before going for the next best thing, then yeah, this is for you.
But if you want to invest in a GPU for "future-proofing", this is the worst card to get.

PS: Polaris is a generation of GPUs that will get a die shrink, a VRAM upgrade, a GCN upgrade, HDMI 2.0a, up-to-date DX12 support and who knows what else. It may not be the fastest GPU, but it brings more to the table than three generations of AMD GPUs together, so "obsolete" is not off the table.
     
  6. icedman

    icedman Maha Guru

    Messages:
    1,300
    Likes Received:
    269
    GPU:
    MSI MECH RX 6750XT
People who keep saying DX12 will solve the memory sharing need to stop. First of all, DX12 hasn't even taken off yet, and second, we don't even know if game devs will use that feature; it could be a complete flop.
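
For reference, here's a minimal sketch (my own, assuming the Windows 10 SDK headers, not code from any shipping engine) of what DX12's explicit multi-adapter model looks like from the developer's side. Each physical GPU is enumerated and driven as a separate device, so pooling two cards' VRAM is opt-in engine work, not something the runtime does for you:

[CODE]
// Link with d3d12.lib and dxgi.lib.
#include <dxgi1_4.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    // Create a DXGI factory and walk every adapter in the system.
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0;
         factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND;
         ++i) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue; // skip the WARP software rasterizer

        // Each GPU becomes its own independent D3D12 device. Nothing
        // is mirrored or pooled automatically: if the engine wants the
        // second GPU's memory, it has to create resources there and
        // schedule cross-adapter copies itself.
        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(),
                                        D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device)))) {
            wprintf(L"GPU %u: %s, %llu MB dedicated VRAM\n",
                    i, desc.Description,
                    (unsigned long long)(desc.DedicatedVideoMemory >> 20));
        }
    }
    return 0;
}
[/CODE]

That per-device model is exactly why it's up to each dev team whether the feature gets used at all.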
     
  7. Fender178

    Fender178 Ancient Guru

    Messages:
    4,194
    Likes Received:
    213
    GPU:
    GTX 1070 | GTX 1060
True, it can be used for the VR headsets coming out. Heck, even one Fury X is enough for the VR headsets. Also, to me this card can be used for 4K.
     
  8. (.)(.)

    (.)(.) Banned

    Messages:
    9,089
    Likes Received:
    0
    GPU:
    GTX 970
Yeah, I agree. I learned my lesson with the GeForce 9800 GX2. No more GPU sandwiches for me.
     
  9. xIcarus

    xIcarus Guest

    Messages:
    990
    Likes Received:
    142
    GPU:
    RTX 4080 Gamerock
    ^ Haha GPU sandwiches. I'll remember that.

    You're saying it like HBM makes a difference. HBM or GDDR5, VRAM capacity is VRAM capacity.
    It's all up to the devs to actually do their jobs and optimize. I mean look at the beautiful VRAM usage of The Witcher 3. Shadow of Mordor looks like a bloody joke in comparison.

    Yeah I agree. It's way too late and the way AMD seems to be moving in slow-motion annoys the living crap out of me.
    This card could have been released half a year ago. Let's be honest, you have the GPU. It's not such a hard job to cram two of them on one PCB.

    Yet another missed opportunity. This card isn't even released yet and it already faces a price cut very soon once Pascal starts rolling. And I have a feeling Pascal will hit the road first. I sure hope I'm wrong. AMD would lose a ton of sales, just like the 390 barely sells compared to how the 970 sold.
     
  10. geogan

    geogan Maha Guru

    Messages:
    1,271
    Likes Received:
    472
    GPU:
    4080 Gaming OC
I had the GeForce 9800 GX2 myself - still have it actually, as I never managed to sell it. Then I swapped over to AMD with the dual-GPU 5970 (sold) and am currently still on the dual-GPU 7990. I was going to get their latest dual-GPU card, the 295X2, last year but decided not to, as it was still too expensive.

To be honest, there is a lot of trouble actually using the two GPUs in some games. I mostly used to play the Battlefield series, which sort of supported them and got faster frame rates, but now I am mostly playing Star Wars Battlefront and that has absolutely atrocious support for two GPUs. It actually makes the game unusable. I had to roll back the terrible Crimson drivers, which would not allow disabling CrossFire at all, in order to get Star Wars working. In fact, Crimson, once installed, completely broke this game, caused two completely out-of-the-ordinary instant PC shutdowns on a machine that has been stable and unchanged for years, and also completely destroyed the HDMI audio output on a media server machine I have.

Really don't know what card to get next. Probably never dual GPU again, and maybe Nvidia again for the first time in years.
     

  11. (.)(.)

    (.)(.) Banned

    Messages:
    9,089
    Likes Received:
    0
    GPU:
    GTX 970
Yeah, it's frustrating, because when dual GPU works, it works well.

I just find it ridiculous that AMD/Nvidia tout 4K, PhysX, GameWorks, PureHair, etc. that require more GPU power than the top GPUs can deliver.

Then there's the issue that if you buy two mid- to lower-end cards to save money with SLI/CrossFire and a new game doesn't support dual GPU, you're simply stuck with a single low- to mid-range card struggling to reach a stable FPS.
     
  12. xIcarus

    xIcarus Guest

    Messages:
    990
    Likes Received:
    142
    GPU:
    RTX 4080 Gamerock
This is exactly why I didn't pull the trigger on another 970 to SLI with my existing one. It's ridiculous how greedy the gaming industry has gotten: frame rate caps and no multi-GPU support. That's a huge issue if you ask me. Although to be fair, the FPS cap trend is more or less over because the respective devs have been shamed out of proportion.

I've never had SLI/CFX. If this trend continues, I may never.
Or perhaps Vulkan or DX12 will make this task easier, encouraging the industry to support it well.
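
On the Vulkan side this did eventually materialize as "device groups". Here's a minimal sketch (my own, assuming a Vulkan 1.1 SDK) of how an app would check whether the loader exposes a linked pair such as a dual-GPU board; even then, distributing work and memory across the group is still entirely the app's job:

[CODE]
#include <vulkan/vulkan.h>
#include <cstdio>
#include <vector>

int main() {
    // Create a minimal Vulkan 1.1 instance.
    VkApplicationInfo app{};
    app.sType = VK_STRUCTURE_TYPE_APPLICATION_INFO;
    app.apiVersion = VK_API_VERSION_1_1;

    VkInstanceCreateInfo ci{};
    ci.sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;
    ci.pApplicationInfo = &app;

    VkInstance instance;
    if (vkCreateInstance(&ci, nullptr, &instance) != VK_SUCCESS)
        return 1;

    // Linked GPUs (e.g. a CrossFire pair or a dual-GPU card) show up
    // as one group with physicalDeviceCount > 1. The app must still
    // split rendering work and allocations across them explicitly.
    uint32_t count = 0;
    vkEnumeratePhysicalDeviceGroups(instance, &count, nullptr);
    std::vector<VkPhysicalDeviceGroupProperties> groups(count);
    for (auto& g : groups)
        g.sType = VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_GROUP_PROPERTIES;
    vkEnumeratePhysicalDeviceGroups(instance, &count, groups.data());

    for (uint32_t i = 0; i < count; ++i)
        std::printf("group %u: %u GPU(s)\n",
                    i, groups[i].physicalDeviceCount);

    vkDestroyInstance(instance, nullptr);
    return 0;
}
[/CODE]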
     
  13. geogan

    geogan Maha Guru

    Messages:
    1,271
    Likes Received:
    472
    GPU:
    4080 Gaming OC
It appears, since the new consoles are now using the same AMD GPUs, that the big development houses using the big game engines are actively developing only for single GPU, since that is all any console has. AMD, Nvidia and the motherboard manufacturers will never admit it though, since they make lots of money selling the whole dual-GPU, SLI, CrossFire idea to PC gamers.
The thing I can't understand is why big games like Star Wars, which use the well-known Frostbite engine that worked perfectly well with Battlefield, still can't work right with more than one GPU. I have also heard that lots of the big game algorithms and modern rendering techniques (temporal effects, for example, need the previous frame, which under AFR lives on the other GPU) are just not compatible with multi-GPU, which is very sad.
It appears to me that now that consoles are number one, multi-GPU is dead as a dodo.
     
  14. chispy

    chispy Ancient Guru

    Messages:
    9,988
    Likes Received:
    2,715
    GPU:
    RTX 4090
^ This +1. I think more and more of us enthusiasts are moving away from CrossFire and SLI for the reasons you mentioned. My last SLI setup was an EVGA GTX 690 and my last CrossFire setup was 4x 290X Quadfire; when they worked they delivered some amazing performance, but for a couple of years now I have noticed a trend in both camps, Nvidia and AMD, where SLI/CrossFire support has been lacking in a lot of titles. I see it more and more often now.

The Gemini dual Fury X will no doubt be a beast of a card, but I wonder how long a mere 4GB of HBM will last in the years to come, with new AAA games sometimes exceeding 4GB of video RAM at 4K. I love AMD cards, but Gemini is too late to the party in my opinion :/
     
