Crossfire is no longer a significant focus, says AMD CEO Lisa Su

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Aug 22, 2019.

  1. geogan

    geogan Maha Guru

    Messages:
    1,267
    Likes Received:
    468
    GPU:
    4080 Gaming OC
    Remember the days when there were actually two GPUs on the one card? I had two of those AMD cards, the Radeon 5970 and 7990... in those days the game engines were simpler (as explained by Fellix above) and could do AFR and other techniques; now the game engines are not compatible with splitting work across multiple GPUs.

    Also, as others said, the increase in framerate from using two cards in those days was completely invalidated by stuttering, which made it feel like a lower framerate than just using a single card. And then there were the games where MSI Afterburner would show one GPU at 100% usage while the other was sitting at 0%. I'm sure using "chiplets" to scale GPUs will cause similar game engine problems as SLI did. But maybe the massive bandwidth increase of PCI-E 4.0 will help (or an interconnect between GPUs similar to the 3rd Gen Ryzen chiplets).

    A lot of it is economics too... Nvidia would much prefer people had to buy its more expensive GPUs than have any option of doubling up with a much cheaper or second-hand card later on to get the same speed.
     
    Undying likes this.
  2. EngEd

    EngEd Member Guru

    Messages:
    138
    Likes Received:
    40
    GPU:
    Gigabyte RTX 3080
    Never bothered with SLI/Crossfire, but I did like multi-GPU on one card. If they could make a dual-GPU card that doesn't need Crossfire or SLI style rendering, costs less than two individual cards, and always works no matter the game, now that would be awesome.

    Like a dual 2x RTX 2080 Ti, but instead of costing $1,200 per card ($2,400 in total like it is today), the dual card would be $1,500 in total...
     
  3. Denial

    Denial Ancient Guru

    Messages:
    14,206
    Likes Received:
    4,118
    GPU:
    EVGA RTX 3080
    Without massive scheduling changes to accommodate a chiplet design, Nvidia estimates you'd need on average 3 TB/s of bandwidth between the chiplets, so PCI-E 4.0 is nowhere near enough. Nvidia proposed a number of optimizations and architecture changes that would make it achievable at ~768 GB/s of bandwidth, which NVLink can currently do, but it would be an extremely wide bus. Recently Nvidia has shown a 32-chiplet MCM design for inferencing. AMD has been talking about going chiplet as well, although they've published a lot less material on it.

    Either way it's coming, it's only a matter of time, but there definitely seem to be a ton of roadblocks and hurdles to overcome.
     
    Last edited: Aug 23, 2019
    HandR, yasamoka and Embra like this.
  4. screwtech02

    screwtech02 Master Guru

    Messages:
    208
    Likes Received:
    36
    GPU:
    6900 XTU
    If this is the case, they need to make an affordable single GPU that will run 3 or more monitors @ high resolutions and produce over 60fps... I have used multi-GPU since the inception of SLI, then moved to Crossfire when Nvidia didn't allow SLI on Intel chipset boards. I have never had any issues with gaming or multi-monitor use since 2010.
     

  5. vbetts

    vbetts Don Vincenzo Staff Member

    Messages:
    15,140
    Likes Received:
    1,743
    GPU:
    GTX 1080 Ti
    There are. The 2080 Ti is an example of a card that can drive multiple monitors (or one big high-res monitor). Let's also not forget that DX12 explicit multi-GPU is a thing, which works regardless of what GPUs you have as long as they are both DX12 capable. If you really want to drive multiple cards in a system and have them utilized, look at the game devs and urge them to support this feature.
     
  6. DeskStar

    DeskStar Guest

    Messages:
    1,307
    Likes Received:
    229
    GPU:
    EVGA 3080Ti/3090FTW
    Developers that still support it... DICE being one of them, and they're still killing it with multi-GPU performance...

    I'll have to post some pics soon because I've been gaming at 6880x2880 DSR with four Titans. If memory is an issue I drop it down a few notches in resolution and all's good.

    Messing with SLI profiles I don't mind. Especially when one gets near-perfect scaling across all four cards.

    I know exactly why these companies are doing away with MGPU setups: because people like me can still play games today on hardware that's five-plus years old. They don't want that kind of turnaround for their hardware... it just means people are going to buy less shtuff less often because their hardware is now more relevant due to performance.

    Take one of my cards and yeah I'd have upgraded a long time ago......

    Always a selling point as to why things happen. Can't make money off of someone who doesn't upgrade but every six or seven years. Because this is the longest in my life I've ever gone without an upgrade!!
     
    Fox2232 likes this.
  7. johhnyangel

    johhnyangel Guest

    Messages:
    6
    Likes Received:
    0
    GPU:
    zotac 1070
    Had two 6870s in Crossfire and they rocked! Nowadays my wallet hurts every time I buy one new GPU...
     
  8. screwtech02

    screwtech02 Master Guru

    Messages:
    208
    Likes Received:
    36
    GPU:
    6900 XTU
    But DICE does not support any type of AMD multi-GPU in BFV; the beta did, yet the release is quite lacking...
     
  9. mikeysg

    mikeysg Ancient Guru

    Messages:
    3,289
    Likes Received:
    742
    GPU:
    MERC310 RX 7900 XTX
    Eh? Am I missing something here? Thus far, the dual Gigabyte cards in CF have been running games that support CF with awesome framerates and smoothness. Temps are relatively good, with the highest I've seen at about 75°C... what's wrong with GB?
     
  10. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    Yeah, but you still paid more for your GPUs than you would have if you'd bought just one top gaming card from each generation. (They are definitely not losing money on you.)

    Look at me, on exactly the opposite side of the spectrum: one GPU, a 1080p screen @ 240Hz. I care little for all those situations where that single GPU ends up underutilized. It's performance that I can use next year. Or the year after that, if there is no GPU that interests me.

    If someone buys a GPU like the 5700 XT for a mere 60Hz 1080p screen, they will be able to keep it till they want to use DXR. And maybe even then, some medium settings will run at 60fps via shaders in those two games from the DXR pool that the given person is interested in playing.
     

  11. Astyanax

    Astyanax Ancient Guru

    Messages:
    17,016
    Likes Received:
    7,355
    GPU:
    GTX 1080ti
    Keitosha likes this.
  12. BReal85

    BReal85 Master Guru

    Messages:
    487
    Likes Received:
    180
    GPU:
    Sapph RX 570 4G ITX
    Well, you got ~52% more performance for 100% more money. You literally threw half the money spent on the second GPU out of the window. If they spent only 10% of the effort put into Crossfire (or SLI, as NV is also leaving the SLI train) on anything else, everyone would be happier.
     
  13. mikeysg

    mikeysg Ancient Guru

    Messages:
    3,289
    Likes Received:
    742
    GPU:
    MERC310 RX 7900 XTX
    Keitosha likes this.
  14. Redemption80

    Redemption80 Guest

    Messages:
    18,491
    Likes Received:
    267
    GPU:
    GALAX 970/ASUS 970
    Since AMD are more about DX12/Vulkan, it's obvious Crossfire will not be a priority.

    Moving to DX12 and Vulkan has made multi-GPU support poorer, though.

    I wonder how AMD/Nvidia feel about it all? It's less work for them if MGPU support is now the responsibility of the developer, but there is obviously the loss of sales from people who buy multiple cards.
     
  15. Keitosha

    Keitosha Ancient Guru

    Messages:
    4,942
    Likes Received:
    192
    GPU:
    RX6800 / Vega56
    I have a Gigabyte Vega 56 OC and, lucky for me, no problems (yet). I flashed its BIOS to the latest right away and have good airflow in my case. Good thing it came with 3 games: DMC5, TD2 and the RE2 remake.
     

  16. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    It made it poorer. But ultimately, if those APIs make MGPU more accessible over time, it would be a "win". Either way, I see the future in chiplets. It's far away, but the cost reduction they bring on the GPU maker's side will get us much better performance per $ on one card than two smaller GPUs in SLI/CFX.
     
    Redemption80 likes this.