Crossfire is no longer a significant focus, says AMD CEO Lisa Su

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Aug 22, 2019.

  1. JamesSneed

    JamesSneed Master Guru

    Messages:
    620
    Likes Received:
    212
    GPU:
    GTX 1070
    I like this approach. I hope they do make a very large GPU chip to compensate. Like something a little larger than 2x the current Navi, which is easily doable since that would still be smaller than Nvidia's 2080. Better yet, they figure out how to MCM/chiplet GPUs together so they present to the OS as one GPU.
     
  2. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    4,589
    Likes Received:
    1,437
    GPU:
    HIS R9 290
    Well, keep in mind that price goes up exponentially with chip size. That's why Intel's monolithic design is no longer deemed cost-effective. As @Embra pointed out, if AMD can pull off a chiplet design with their GPUs, that will be the future of making absurdly powerful GPUs at an affordable price.
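    (A back-of-the-envelope illustration of that scaling, using the standard first-order Poisson yield model; the defect density and die areas below are purely hypothetical, not figures from AMD or any foundry:)

        Y = e^{-D \cdot A}
        D = 0.5\,\mathrm{defects/cm^2}:\quad
        A = 2.5\,\mathrm{cm^2} \Rightarrow Y \approx 29\%,\qquad
        A = 5.0\,\mathrm{cm^2} \Rightarrow Y \approx 8\%

    Doubling the die area here cuts yield to under a third, so each good die costs roughly 7x as much (twice the silicon at about a third of the yield), not merely 2x.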
     
  3. Astyanax

    Astyanax Ancient Guru

    Messages:
    3,966
    Likes Received:
    1,119
    GPU:
    GTX 1080ti
    Chiplets have some interesting hurdles to overcome before they can be highly performant in dedicated graphics cards.
     
  4. Fox2232

    Fox2232 Ancient Guru

    Messages:
    9,770
    Likes Received:
    2,208
    GPU:
    5700XT+AW@240Hz
    Yes, but the price... Cost efficiency is a strong motivation. Look at the current Zen 2 chiplets and the range of products the same dies are used in.

    Imagine AMD tapes out a chiplet with 10 WGPs and can scale it as far as the interposer, I/O die and power efficiency allow.
    There would be no more reason to keep an old product stack around for a few generations; they would be able to replace everything. And the cost of making top GPUs (cards) would not carry the extra tape-out cost that is currently the main blocker, since lower sales make it harder to get a return on the investment.

    Everything would be quite economical. That would mean lower prices for us, and faster iteration of new generations, because designing each chip and putting it through a separate tape-out and validation is time-consuming too.

    (And I doubt I've listed even half of the benefits.)
     
    Embra, airbud7 and schmidtbag like this.

  5. fellix

    fellix Member Guru

    Messages:
    172
    Likes Received:
    12
    GPU:
    KFA² GTX 1080 Ti
    Both SLI and CF use AFR to balance the workload, and that technique has become more and more at odds with the way game engines have advanced. In the past it was much more straightforward: the game would render everything into a single memory target (the frame buffer) and start fresh with the next frame.
    Now games scatter the different stages of image composition across multiple off-screen buffers, and on top of that the growing use of data sampled from previous frames throws more wrenches into the delicate driver-based management of the multiple video memory pools. Obviously, Nvidia and AMD are not willing to spend more resources doing the legwork for the game devs, since both the DX12 and Vulkan APIs already expose direct multi-GPU controls at the application level.
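    As a rough illustration of what "direct multi-GPU controls at the application level" looks like on the D3D12 side, here is a minimal sketch (assuming a Windows 10 SDK toolchain, linking d3d12.lib and dxgi.lib) that enumerates the hardware adapters and creates an independent device on each one; how the frame is then split between them is entirely up to the application:

        // Sketch only: enumerate hardware adapters and create a D3D12 device on each,
        // which is the starting point of the explicit multi-adapter path.
        #include <d3d12.h>
        #include <dxgi1_4.h>
        #include <wrl/client.h>
        #include <vector>
        #include <cstdio>

        using Microsoft::WRL::ComPtr;

        int main()
        {
            ComPtr<IDXGIFactory4> factory;
            if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
                return 1;

            std::vector<ComPtr<ID3D12Device>> devices;
            ComPtr<IDXGIAdapter1> adapter;
            for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
            {
                DXGI_ADAPTER_DESC1 desc = {};
                adapter->GetDesc1(&desc);
                if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
                    continue;                                   // skip WARP / software adapters

                ComPtr<ID3D12Device> device;
                if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                                IID_PPV_ARGS(&device))))
                {
                    wprintf(L"GPU %u: %s\n", i, desc.Description);
                    devices.push_back(device);                  // one independent device per GPU
                }
            }
            // From here the app owns the split: e.g. scene on devices[0], post-processing
            // on devices[1], copying through cross-adapter heaps -- exactly the legwork
            // the driver used to do for AFR.
            return 0;
        }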
     
    geogan likes this.
  6. Astyanax

    Astyanax Ancient Guru

    Messages:
    3,966
    Likes Received:
    1,119
    GPU:
    GTX 1080ti
    Such an interposer would have to run faster than Nvidia's NVLink fabric.
     
  7. BlueRay

    BlueRay Master Guru

    Messages:
    269
    Likes Received:
    62
    GPU:
    EVGA GTX 1070 FTW
    Good riddance.
    Too many issues with it.
     
  8. Alessio1989

    Alessio1989 Maha Guru

    Messages:
    1,428
    Likes Received:
    239
    GPU:
    .
    Not a big deal; in any case, D3D12 and Vulkan are able to manage both multi-node (linked) adapters and multiple independent adapters. With PCI-E 3.0 and 4.0, such "optimized" shared-bandwidth channels become less important than ever (AMD removed the CrossFire bridge many years ago).
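    For what it's worth, here is roughly what the multi-node adapter side looks like in Vulkan (a sketch assuming a Vulkan 1.1 loader and headers, linking vulkan-1): GPUs the driver has linked together are reported as one device group and can be wrapped in a single logical VkDevice, while unlinked cards show up as groups of one.

        // Sketch only: list Vulkan device groups; a group with more than one
        // physical device is a linked (multi-node) adapter.
        #include <vulkan/vulkan.h>
        #include <vector>
        #include <cstdio>

        int main()
        {
            VkApplicationInfo app = { VK_STRUCTURE_TYPE_APPLICATION_INFO };
            app.apiVersion = VK_API_VERSION_1_1;        // device groups need Vulkan 1.1+

            VkInstanceCreateInfo ici = { VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO };
            ici.pApplicationInfo = &app;

            VkInstance instance = VK_NULL_HANDLE;
            if (vkCreateInstance(&ici, nullptr, &instance) != VK_SUCCESS)
                return 1;

            uint32_t groupCount = 0;
            vkEnumeratePhysicalDeviceGroups(instance, &groupCount, nullptr);
            std::vector<VkPhysicalDeviceGroupProperties> groups(
                groupCount, { VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_GROUP_PROPERTIES });
            vkEnumeratePhysicalDeviceGroups(instance, &groupCount, groups.data());

            for (uint32_t i = 0; i < groupCount; ++i)
                printf("group %u: %u GPU(s)%s\n", i, groups[i].physicalDeviceCount,
                       groups[i].physicalDeviceCount > 1 ? " (linked)" : "");

            // To use a linked group as one logical device, chain a
            // VkDeviceGroupDeviceCreateInfo into vkCreateDevice with the group's
            // physicalDevices; splitting the actual work is still the app's job.
            vkDestroyInstance(instance, nullptr);
            return 0;
        }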
     
  9. Dimitrios1983

    Dimitrios1983 Master Guru

    Messages:
    237
    Likes Received:
    58
    GPU:
    AMD RX560 4GB
    To be honest, I was never a fan of running two GPU cards. I could understand it if new games were as demanding as another Crysis, but unfortunately game developers penny-pinch and take baby steps just to be safe.
     
  10. Mpampis

    Mpampis Active Member

    Messages:
    68
    Likes Received:
    37
    GPU:
    RX 5700 XT 8GB
    Remember the "chiplet design GPUs" discussion?
    The engineers said that the major problem with a chiplet GPU is making the system recognise it as a single GPU.
    We should have guessed back then that multi-GPU setups have no future.
     

  11. EspHack

    EspHack Ancient Guru

    Messages:
    2,449
    Likes Received:
    38
    GPU:
    ATI/HD5770/1GB
    It died thanks to worthless software makers like EA, Ubisoft etc. Multi-GPU on the low-to-mid end made sense because people could almost double their performance later on by adding a second, by then cheaper, card, and it made sense for the high end since the only way to go higher than the top card is to double it.

    Back in the HD 5000 days I was very happy with it, and a year later when one card died I could still play, so hey, multi-GPU is even good for redundancy.

    At least it seems Nvidia realizes this and wants that sweet cash or whatever. What if you want 4K ultrawide? A single $1200 card will barely do 40 fps, but two might get you 70 fps or so.

    You could argue it doesn't make sense for the low end, since you are wasting a lot of power and dealing with extra complexity versus a newer card, but that's the case today, not when multi-GPU was in its heyday.
     
  12. user1

    user1 Maha Guru

    Messages:
    1,443
    Likes Received:
    472
    GPU:
    hd 6870
    Makes sense; there is no clear solution to the problems caused by multi-card setups, other than specific optimization by the developer or vendor on a per-engine/per-game basis.

    Any future multi-GPU implementation will likely be on a single card anyway, and will need to be invisible to the user/developers to be worth adopting.
     
  13. Ssateneth

    Ssateneth Active Member

    Messages:
    56
    Likes Received:
    4
    GPU:
    EVGA GeForce GTX 980
    Multi-GPU used to be a thing where you bought two cheaper video cards to get the performance of a higher-end card at a slight discount, and as an upgrade path of sorts if you didn't want to sell off your old hardware (buying a $120 GPU and another $120 GPU later is easier on the wallet than buying a $120 GPU, then a $250 GPU later, and selling the first GPU at a discount because it's now used and probably older tech).

    It has somehow turned into an abomination where only the highest-end SKUs support it, its only purpose is benchmark scores/world records, and there's no practical use other than as a money sink that only benefits the parent company.

    Explicit-mode multi-GPU rendering is still a thing, and you don't need to rely on proprietary technology (CrossFire/SLI/NVLink), but the game or app needs to be programmed to support it... well, explicitly.
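    A small sketch of what "programmed to support it explicitly" tends to mean in D3D12 (assuming a linked adapter that the driver already exposes as multiple nodes of one device; the helper name here is just illustrative): the application stamps a NodeMask on its queues, heaps and pipeline state, so the app rather than the driver decides which physical GPU runs what.

        // Sketch only: create one direct command queue per node of a linked adapter.
        // The caller then records and submits work to whichever GPU it wants each frame.
        #include <d3d12.h>
        #include <wrl/client.h>
        #include <vector>

        using Microsoft::WRL::ComPtr;

        std::vector<ComPtr<ID3D12CommandQueue>> CreatePerNodeQueues(ID3D12Device* device)
        {
            std::vector<ComPtr<ID3D12CommandQueue>> queues;
            const UINT nodeCount = device->GetNodeCount();      // 1 on a single-GPU system
            for (UINT node = 0; node < nodeCount; ++node)
            {
                D3D12_COMMAND_QUEUE_DESC desc = {};
                desc.Type     = D3D12_COMMAND_LIST_TYPE_DIRECT;
                desc.NodeMask = 1u << node;                     // bind this queue to one GPU
                ComPtr<ID3D12CommandQueue> queue;
                if (SUCCEEDED(device->CreateCommandQueue(&desc, IID_PPV_ARGS(&queue))))
                    queues.push_back(queue);
            }
            return queues;
        }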
     
  14. kakiharaFRS

    kakiharaFRS Master Guru

    Messages:
    222
    Likes Received:
    40
    GPU:
    MSI Gaming X 1080ti
    So true. Skyrim, among other games, ran much better on Win7 for me; alas, we are now forced by hardware to move to Win10 :(

    I'm glad they're pretty much calling it game over, because I still read about many people wanting to do SLI on Win10. Please don't.
    I had SLI 580s, 680s, 780s, 980s and 1080s; the lack of support and the micro-stuttering have destroyed SLI. Forget it exists, or one day you will, like me, disable SLI for some reason and then realize your games look smoother with 40% fewer fps >< Frame times etc. weren't talked about as much back then, but I'm pretty sure mine were really bad: "140 fps" according to Fraps, yet BF4 looked like a 5 fps slideshow.

    The size and heat generated by current video cards don't help either; most of Nvidia's are monsters.
     
  15. mikeysg

    mikeysg Ancient Guru

    Messages:
    2,454
    Likes Received:
    167
    GPU:
    PC VEGA64 Red Devil
    One of my rigs is still running 2x Vega 64s, and I'm pretty happy with it; when CF works, the result is pretty damn good. So, regardless, I'm keeping this setup for my other gaming desktop...

    As for my recently built 3900X rig, due to the placement of the PCIe x16 slots, when I tried CF the primary (top) card sat just millimeters above the second card. Running them in games resulted in the primary card hitting its thermal threshold, and I had to stop the game. While I'm content with just a PC Vega 64 Red Devil in it for now, I'm waiting anxiously for Navi 21 and 23... a single powerful GPU is the ticket for me.
     
    Keitosha likes this.

  16. sykozis

    sykozis Ancient Guru

    Messages:
    21,101
    Likes Received:
    692
    GPU:
    MSI RX5700
    I never tried Crossfire... Tried SLI twice: once with a pair of 8600 GTs and again later with GTX 660s... Don't recall any real issues either time.
     
  17. ht_addict

    ht_addict Active Member

    Messages:
    65
    Likes Received:
    9
    GPU:
    Asus Vega64(CF)
    With CF I can pull 100+ fps in Wolfenstein II: The New Colossus with my dual Vega 64s running at 4K Ultra. It just shows that when it's implemented properly, it works.
     
  18. LesserHellspawn

    LesserHellspawn Master Guru

    Messages:
    638
    Likes Received:
    6
    GPU:
    2x GTX980ti
    This is why I'm holding off upgrading indefinitely right now. I've been using SLI since 2008 and went tri-SLI in 2015. I have had zero issues with it, ever. My tri 980 Tis are so fast that even two 1080s would have been just on par. Only with two RTX cards would I see an actual improvement, but that is way too much money and only worth considering when/if my 980s actually break.

    Companies are putting out ever bigger monitors with ever higher refresh rates. At the same time, Nvidia and AMD are actively crippling our ability to actually drive these things by canning SLI and Crossfire. No goddamn single card can drive such a monitor beast, and none will be able to for another 1-2 generations of cards down the line.
     
  19. Hyderz

    Hyderz Member Guru

    Messages:
    118
    Likes Received:
    23
    GPU:
    GTX 1070ti
    I once had an 8800 GT; it ran Battlefield 3 at 1280x1024, the old 4:3-style ratio back then, at low-medium settings, hovering around 40-60 fps.
     
  20. Keitosha

    Keitosha Ancient Guru

    Messages:
    4,860
    Likes Received:
    103
    GPU:
    Vega56 8GB \ GT1030
    I applaud your bravery for running Gigabyte Vegas in CF. :D
     
