Crossfire is no longer a significant focus, says AMD CEO Lisa Su

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Aug 22, 2019.

  1. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,016
    Likes Received:
    4,395
    GPU:
    Asrock 7700XT
    Well, keep in mind that cost goes up much faster than linearly with die size, since yields drop as dies get bigger. That's why Intel's monolithic approach is no longer deemed cost effective. As @Embra pointed out, if AMD can pull off a chiplet design with their GPUs, that will be the future of making absurdly powerful GPUs at an affordable price.
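    Just to put rough numbers on that: below is a back-of-the-envelope sketch using the common Poisson yield approximation (yield ≈ exp(-defect density × die area)). The defect density and die sizes are made-up illustrative values, not real foundry data, but they show why one huge die costs more per good chip than the same silicon split into chiplets.

        #include <cmath>
        #include <cstdio>

        // Back-of-the-envelope: why cost per *good* die grows faster than linearly
        // with die area. Uses the simple Poisson yield model yield = exp(-D0 * A).
        // The defect density and die areas below are made-up illustrative numbers.
        int main() {
            const double defect_density = 0.1; // defects per cm^2 (assumed, not real foundry data)

            auto cost_per_good_die = [&](double area_cm2) {
                double yield = std::exp(-defect_density * area_cm2);
                // Wafer cost is roughly proportional to area consumed, so the cost
                // of one working die scales with area / yield.
                return area_cm2 / yield;
            };

            // One big 600 mm^2 monolithic die vs. the same silicon as four 150 mm^2 chiplets.
            std::printf("monolithic 600 mm^2  : %.2f (relative cost)\n", cost_per_good_die(6.0));
            std::printf("4 x 150 mm^2 chiplets: %.2f (relative cost)\n", 4.0 * cost_per_good_die(1.5));
            return 0;
        }

    (This ignores packaging and interposer costs, so it overstates the chiplet advantage a bit; it's only meant to illustrate the yield side of the argument.)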
     
  2. Astyanax

    Astyanax Ancient Guru

    Messages:
    17,038
    Likes Received:
    7,379
    GPU:
    GTX 1080ti
    Chiplets have some interesting hurdles to overcome before they can perform well in dedicated graphics cards.
     
  3. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    Yes, but the price... Cost efficiency is a strong motivation. Look at the current Zen 2 chiplets and the range of products the same dies are used in.

    Imagine AMD tapes out a chiplet with 10 WGPs and can scale it as far as the interposer, I/O die and power efficiency allow.
    There would be no more reason to keep an old product stack around for a few generations; they could replace everything at once. And the cost of making top-end GPUs (cards) would no longer carry an extra tape-out, which is currently the main blocker, since lower sales volumes make it harder to recoup the investment.

    Everything would be quite economical. That would mean lower prices for us, and faster iteration on new generations, because designing each chip and running a separate tape-out and validation cycle is time consuming too.

    (And I doubt I've listed even half of the benefits.)
     
    Embra, airbud7 and schmidtbag like this.
  4. fellix

    fellix Master Guru

    Messages:
    252
    Likes Received:
    87
    GPU:
    MSI RTX 4080
    Both SLI and CF use AFR to balance the workload, and that technique has become increasingly at odds with how game engines have evolved. In the past it was much more straightforward: the game would render everything into a single memory target (the frame buffer) and start fresh with the next frame.
    Now games scatter the different stages of image composition across multiple off-screen buffers, and on top of that the growing use of data sampled from previous frames throws more wrenches into the delicate driver-side management of the multiple video memory pools. Obviously, Nvidia and AMD are not willing to spend more resources doing the legwork for the game devs, since both the DX12 and Vulkan APIs already expose direct multi-GPU controls at the application level.
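    For anyone curious what those application-level controls look like on the Vulkan side, here's a minimal sketch (assuming a Vulkan 1.1 instance has already been created; the function below is just an illustrative helper, not part of any engine):

        #include <vulkan/vulkan.h>
        #include <cstdio>
        #include <vector>

        // Sketch: discover linked GPUs as a Vulkan 1.1 "device group". A CrossFire/SLI
        // style pair shows up as one group with physicalDeviceCount > 1, and it is the
        // application, not the driver, that decides how work is split across the GPUs.
        void list_device_groups(VkInstance instance) {
            uint32_t count = 0;
            vkEnumeratePhysicalDeviceGroups(instance, &count, nullptr);

            std::vector<VkPhysicalDeviceGroupProperties> groups(count);
            for (auto& g : groups)
                g.sType = VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_GROUP_PROPERTIES;
            vkEnumeratePhysicalDeviceGroups(instance, &count, groups.data());

            for (uint32_t i = 0; i < count; ++i) {
                std::printf("device group %u: %u physical GPU(s)\n",
                            i, groups[i].physicalDeviceCount);
                // To actually use a group, the engine chains VkDeviceGroupDeviceCreateInfo
                // into vkCreateDevice and then routes work per GPU with vkCmdSetDeviceMask.
            }
        }

    In other words, the plumbing exists, but every engine has to manage the per-GPU memory and frame pacing itself, which is exactly the work the old driver-side AFR used to hide.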
     
    geogan likes this.

  5. Astyanax

    Astyanax Ancient Guru

    Messages:
    17,038
    Likes Received:
    7,379
    GPU:
    GTX 1080ti
    Such an interposer would have to run faster than Nvidia's NVLink fabric.
     
  6. BlueRay

    BlueRay Guest

    Messages:
    278
    Likes Received:
    77
    GPU:
    EVGA GTX 1070 FTW
    Good riddance.
    Too many issues with it.
     
  7. Alessio1989

    Alessio1989 Ancient Guru

    Messages:
    2,952
    Likes Received:
    1,244
    GPU:
    .
    Not a big deal; in any case, D3D12 and Vulkan can manage both multi-node (linked) adapters and multiple independent adapters. With PCIe 3.0 and 4.0, such "optimized" shared-bandwidth links matter less than ever (AMD removed the Crossfire bridge years ago and runs it over plain PCIe instead).
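    For illustration, here's a minimal sketch (assuming a Windows box with the D3D12/DXGI headers and libs; nothing here is AMD- or Nvidia-specific) of how an application can tell a linked multi-node adapter apart from several independent adapters:

        #include <d3d12.h>
        #include <dxgi1_4.h>
        #include <wrl/client.h>
        #include <cstdio>

        using Microsoft::WRL::ComPtr;

        // Sketch: enumerate adapters and report how many nodes (physical GPUs) each
        // D3D12 device exposes. A linked CrossFire/SLI pair appears as one adapter
        // with GetNodeCount() > 1; independent GPUs appear as separate adapters.
        int main() {
            ComPtr<IDXGIFactory4> factory;
            if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

            ComPtr<IDXGIAdapter1> adapter;
            for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
                DXGI_ADAPTER_DESC1 desc;
                adapter->GetDesc1(&desc);

                ComPtr<ID3D12Device> device;
                if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                                IID_PPV_ARGS(&device)))) {
                    std::printf("adapter %u: %ls, nodes: %u\n",
                                i, desc.Description, device->GetNodeCount());
                }
            }
            return 0;
        }

    Whether anything useful happens across those nodes is then entirely up to the application, which is exactly the legwork most studios never bothered with.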
     
  8. Dimitrios1983

    Dimitrios1983 Master Guru

    Messages:
    348
    Likes Received:
    114
    GPU:
    RX580
    To be honest, I was never a fan of running two GPUs. I could understand it if new games were as demanding as another Crysis, but unfortunately game developers penny-pinch and take baby steps just to be safe.
     
  9. Mpampis

    Mpampis Master Guru

    Messages:
    249
    Likes Received:
    231
    GPU:
    RX 5700 XT 8GB
    Remember the "chiplet design GPUs" discussion?
    The engineers said that the major problem with a chiplet GPU is making the system recognise it as a single GPU.
    We should have guessed back then that multi-GPU setups have no future.
     
  10. EspHack

    EspHack Ancient Guru

    Messages:
    2,799
    Likes Received:
    188
    GPU:
    ATI/HD5770/1GB
    It died thanks to worthless software makers like EA, Ubisoft, etc. Multi-GPU on the low-to-mid end made sense because people could almost double their performance later on by adding a second, now-cheaper card, and it made sense at the high end since the only way to go faster than the top card is to double it.

    Back in the HD 5000 days I was very happy with it, and a year later when one card died I could still play, so hey, multi-GPU is even good for redundancy.

    At least Nvidia seems to realize this and wants that sweet cash, or whatever. What if you want 4K ultrawide? A single $1200 card will barely do 40 fps, but two might get you 70 fps or so.

    You could argue it doesn't make sense at the low end, since you're wasting a lot of power and dealing with extra complexity versus just getting a newer card, but that's the case today, not back when multi-GPU was in its heyday.
     

  11. user1

    user1 Ancient Guru

    Messages:
    2,782
    Likes Received:
    1,305
    GPU:
    Mi25/IGP
    Makes sense; there is no clear solution to the problems caused by multi-card setups, other than specific optimization by the developer or vendor on a per-engine/per-game basis.

    Any future multi-GPU implementation will likely be on a single card anyway, and will need to be invisible to users and developers to be worth adopting.
     
  12. Ssateneth

    Ssateneth Member Guru

    Messages:
    117
    Likes Received:
    27
    GPU:
    EVGA GeForce GTX 980
    Multi-GPU used to be a thing where you'd buy two cheaper video cards to get the performance of a higher-end card at a slight discount, plus an upgrade path of sorts if you didn't want to sell off your old hardware (buying a $120 GPU and another $120 GPU later is easier on the wallet than buying a $120 GPU, then a $250 GPU later, and selling the first one at a discount because it's now used and probably older tech).

    It has somehow turned into an abomination where only the highest-end SKUs support it, and its only purpose is chasing benchmark scores and world records; there's no practical use beyond a money sink that only benefits the parent company.

    Explicit multi-GPU rendering is still a thing, and you don't need to rely on proprietary technology (CrossFire/SLI/NVLink), but the game or app needs to be programmed to support it... well, explicitly.
     
  13. kakiharaFRS

    kakiharaFRS Master Guru

    Messages:
    987
    Likes Received:
    370
    GPU:
    KFA2 RTX 3090
    So true. Skyrim, among other games, ran much better on Win7 for me; alas, we are now forced by hardware to move to Win10 :(

    I'm glad they're more or less calling it game over, because I still read about plenty of people wanting to do SLI on Win10. Please don't.
    I ran SLI with 580s, 680s, 780s, 980s and 1080s; the lack of support and the micro-stuttering destroyed SLI. Forget it exists, or one day you'll do what I did: disable SLI for some reason and realize your games look smoother with 40% fewer fps. Frametimes and the like weren't talked about much back then, but I'm pretty sure mine were really bad; "140 fps" according to Fraps in BF4 looked like a 5 fps slideshow.

    The size and heat of current video cards don't help either; most of Nvidia's are monsters.
     
  14. mikeysg

    mikeysg Ancient Guru

    Messages:
    3,300
    Likes Received:
    753
    GPU:
    MERC310 RX 7900 XTX
    One of my rigs is still running 2x Vega 64s, and I'm pretty happy with it; when CF works, the result is pretty damn good. So, regardless, I'm keeping this setup for my other gaming desktop...

    As for my recently built 3900X rig, due to the placement of the PCIe x16 slots, when I tried CF the primary (top) card sat just millimeters above the second card. Running them in games had the primary card hitting its thermal threshold, and I had to stop playing. While I'm content with just a PowerColor Vega 64 Red Devil in it for now, I'm waiting anxiously for Navi 21 and 23... a single powerful GPU is the ticket for me.
     
    Keitosha likes this.
  15. sykozis

    sykozis Ancient Guru

    Messages:
    22,492
    Likes Received:
    1,537
    GPU:
    Asus RX6700XT
    I never tried Crossfire.... Tried SLI twice. Once with a pair of 8600GT's and again later with GTX660's..... Don't recall any real issues either time.
     

  16. ht_addict

    ht_addict Active Member

    Messages:
    76
    Likes Received:
    23
    GPU:
    Asus Vega64(CF)
    With CF I can pull 100+ fps in Wolfenstein II: The New Colossus with my dual Vega 64s running at 4K Ultra. It just shows that when it's implemented properly, it works.
     
  17. LesserHellspawn

    LesserHellspawn Master Guru

    Messages:
    690
    Likes Received:
    32
    GPU:
    RTX 3080ti Eagle
    This is why I'm holding off on upgrading indefinitely right now. I've been using SLI since 2008 and went tri-SLI in 2015, and I've never had any issues with it. My three 980 Tis are so fast that even two 1080s would only have been on par; only with two RTX cards would I see an actual improvement, but that's way too much money and only worth considering when/if my 980s actually break.

    Companies are putting out ever bigger monitors with ever higher refresh rates. At the same time, Nvidia and AMD are actively crippling our ability to drive these things by canning SLI and Crossfire. No goddamn single card can drive such a monster monitor, and none will be able to for another one or two generations of cards.
     
  18. Hyderz

    Hyderz Member Guru

    Messages:
    171
    Likes Received:
    43
    GPU:
    RTX 3090
    I once had an 8800GT and ran Battlefield 3 at 1280x1024 (the old 5:4 ratio back then); at low-medium settings it hovered around 40-60 fps.
     
  19. Keitosha

    Keitosha Ancient Guru

    Messages:
    4,943
    Likes Received:
    192
    GPU:
    RX6800 / Vega56
    I applaud your bravery for running Gigabyte Vegas in CF. :D
     
  20. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    15,754
    Likes Received:
    9,647
    GPU:
    4090@H2O
    If only they'd offer a single GPU that performs well enough for high-res, high-refresh, high-settings gameplay.
     
    LesserHellspawn and screwtech02 like this.
