back on the red team

Discussion in 'Videocards - AMD Radeon' started by xvcardzx, Jul 4, 2015.

  1. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,132
    Likes Received:
    974
    GPU:
    Inno3D RTX 3090
    What it will solve is that (as you said) people who want to support multi-GPU in their games can do it properly on their own, with no obfuscated, monstrous drivers and engine workarounds.

    On the other hand, that means they have to do everything by themselves. I'm still surprised at people who believe that everybody and their mothers will use DX12 (especially in the first couple of years). The amount of porting that needs to be done, and the amount of low-level work compared to DX11, really boggles the mind.

    Low level = More work.
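
    Just to make the "more work" concrete, here's a rough sketch (illustrative only, not from any shipping engine) of step one under DX12: the game itself has to enumerate every adapter and create a device per GPU, before any of the actual multi-GPU plumbing even starts.

        // Sketch: under DX12 the application, not the driver,
        // discovers the GPUs. Error handling trimmed for brevity.
        #include <windows.h>
        #include <d3d12.h>
        #include <dxgi1_4.h>
        #include <wrl/client.h>
        #include <vector>

        using Microsoft::WRL::ComPtr;

        std::vector<ComPtr<ID3D12Device>> CreateDevicePerAdapter()
        {
            ComPtr<IDXGIFactory4> factory;
            CreateDXGIFactory1(IID_PPV_ARGS(&factory));

            std::vector<ComPtr<ID3D12Device>> devices;
            ComPtr<IDXGIAdapter1> adapter;
            for (UINT i = 0;
                 factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND;
                 ++i)
            {
                DXGI_ADAPTER_DESC1 desc;
                adapter->GetDesc1(&desc);
                if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
                    continue; // skip WARP / software adapters

                ComPtr<ID3D12Device> device;
                if (SUCCEEDED(D3D12CreateDevice(adapter.Get(),
                                                D3D_FEATURE_LEVEL_11_0,
                                                IID_PPV_ARGS(&device))))
                    devices.push_back(device);
                // From here on, queues, memory, synchronization and
                // cross-adapter copies are entirely the game's problem.
            }
            return devices;
        }

    And that's the easy part; everything the driver used to do behind your back now has to be written per game.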

     
  2. sammarbella

    sammarbella Guest

    Messages:
    3,929
    Likes Received:
    178
    GPU:
    290X Lightning CFX (H2O)
    If more work must be done in DX12, and AMD and Nvidia won't be able to do it because all the load is placed on game devs, we will have WORSE multi-GPU support in DX12, not better.

    Game devs don't give a s*** about multi-GPU (1% more game sales at what expense in coding time???). This is a problem (maybe) for GPU manufacturers and their PR and driver teams.
     
  3. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,132
    Likes Received:
    974
    GPU:
    Inno3D RTX 3090
    The only games that make me happy about multi-GPU with DX12 are BF4 and (even more) Civ Beyond Earth. But I have a feeling that in the end it was AMD people who did the implementations for both (I might be wrong, I don't want to be unfair).
     
  4. theoneofgod

    theoneofgod Ancient Guru

    Messages:
    4,677
    Likes Received:
    287
    GPU:
    RX 580 8GB
    GTA GTA :D
     

  5. sammarbella

    sammarbella Guest

    Messages:
    3,929
    Likes Received:
    178
    GPU:
    290X Lightning CFX (H2O)
    If that is the case and DX12 doesn't allow GPU manufacturers to implement CrossFire and SLI by themselves, that means multi-GPU in DX12 is de facto dead.
     
  6. ---TK---

    ---TK--- Guest

    Messages:
    22,104
    Likes Received:
    3
    GPU:
    2x 980Ti Gaming 1430/7296
    SFR was removed from the NVCP years ago; the performance hit vs AFR was very big. Not worth it imo.
     
  7. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,132
    Likes Received:
    974
    GPU:
    Inno3D RTX 3090
    The whole point is that it is done by the developers themselves. GPU companies have so little control over it that even the great NVIDIA nightmare is possible: cross-vendor multi-GPU! :D

    The guys at Firaxis and AMD seemed very, very happy about it. It doesn't give you perfect scaling, but it keeps latencies at single-GPU levels, with a performance boost on the lowest FPS and a very nice boost on average FPS. Sounds like a win-win to me.

    It still needs to be programmed by hand on a per-title basis.
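
    In case anyone wonders how cross-vendor is even possible: DX12 lets the application itself share memory between two different devices. A rough sketch (based on the cross-adapter heap mechanism; names and error handling are simplified) of wiring up, say, an NVIDIA and an AMD card:

        // Sketch: share one heap between two different D3D12 devices.
        #include <windows.h>
        #include <d3d12.h>
        #include <wrl/client.h>

        using Microsoft::WRL::ComPtr;

        ComPtr<ID3D12Heap> ShareHeapAcrossAdapters(
            ID3D12Device* deviceA,  // e.g. the NVIDIA card
            ID3D12Device* deviceB,  // e.g. the AMD card
            UINT64 sizeInBytes)
        {
            // 1. A heap on device A that is allowed to cross adapters.
            D3D12_HEAP_DESC heapDesc = {};
            heapDesc.SizeInBytes = sizeInBytes;
            heapDesc.Properties.Type = D3D12_HEAP_TYPE_DEFAULT;
            heapDesc.Flags = D3D12_HEAP_FLAG_SHARED |
                             D3D12_HEAP_FLAG_SHARED_CROSS_ADAPTER;
            ComPtr<ID3D12Heap> heapA;
            deviceA->CreateHeap(&heapDesc, IID_PPV_ARGS(&heapA));

            // 2. Export it as an OS handle...
            HANDLE shared = nullptr;
            deviceA->CreateSharedHandle(heapA.Get(), nullptr,
                                        GENERIC_ALL, nullptr, &shared);

            // 3. ...and open the very same memory on device B.
            //    Buffers placed here (with the ALLOW_CROSS_ADAPTER
            //    resource flag) are visible to both GPUs, whoever
            //    made them.
            ComPtr<ID3D12Heap> heapB;
            deviceB->OpenSharedHandle(shared, IID_PPV_ARGS(&heapB));
            CloseHandle(shared);
            return heapB;
        }

    No vendor driver gets a say in any of that, which is exactly why it's out of NVIDIA's hands.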

    Dammit.
     
  8. ---TK---

    ---TK--- Guest

    Messages:
    22,104
    Likes Received:
    3
    GPU:
    2x 980Ti Gaming 1430/7296
    Nah, if you'd ever used it you would know. Nvidia probably removed it because of the performance hit. Though I think you can still enable it with Nvidia Inspector.
     
  9. MatrixNetrunner

    MatrixNetrunner Guest

    Messages:
    125
    Likes Received:
    0
    GPU:
    Powercolor PCS+ R9 270X
    Yes, but by game developers, not by GPU driver programmers. There will probably be a way to automate it a bit (at least for commonly used engines).
     
  10. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,132
    Likes Received:
    974
    GPU:
    Inno3D RTX 3090
    Nvidia removed it in an era when nobody was checking frame times or latency in gaming. We had to wait years, with multiple websites and people pushing for actual smoothness (beyond average framerates), until it was acknowledged by both manufacturers.
    Read the article at Anandtech, it's very nice. The frame time results are even nicer.

    [frame time charts from the Anandtech article]

    Holy frame times, Batman! This technique literally makes the multi-GPU setup behave like a single bigger card. It can never reach the framerates of AFR, but if it negates all the micro/macro/mega-stutter and the latency associated with multi-GPU, and still gives you extra performance, I see it as a huge win-win.

    Not so sure about that; there are probably people working on game engines trying to figure it out. My bet is on performance profilers with extra options, but I don't have very high hopes for a couple of years.
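
    For anyone curious what SFR actually amounts to in code, here's a minimal sketch (my own illustration, not Firaxis' implementation): both GPUs render the same frame, each scissored to its own half of the screen, so there is no frame queue to pace and therefore no AFR-style microstutter.

        // Sketch: each GPU records the same draws, but the scissor
        // rect limits it to its half of a width x height frame.
        #include <windows.h>
        #include <d3d12.h>

        void RecordHalfFrame(ID3D12GraphicsCommandList* cl,
                             UINT width, UINT height, bool topHalf)
        {
            D3D12_VIEWPORT vp = {};
            vp.Width    = static_cast<FLOAT>(width);
            vp.Height   = static_cast<FLOAT>(height);
            vp.MaxDepth = 1.0f;
            cl->RSSetViewports(1, &vp);   // full-frame viewport on both GPUs

            D3D12_RECT sc = {};
            sc.right  = static_cast<LONG>(width);
            sc.top    = topHalf ? 0 : static_cast<LONG>(height / 2);
            sc.bottom = topHalf ? static_cast<LONG>(height / 2)
                                : static_cast<LONG>(height);
            cl->RSSetScissorRects(1, &sc); // ...but each fills only its half

            // ... identical draw calls follow on both GPUs; the second
            // GPU's half is then copied to the primary card and the two
            // halves are presented together as one frame.
        }

    Both halves belong to the same frame, which is why the latency looks like a single card's.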
     

  11. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    You are actually correct; at least someone here understands simple things.
    Unfortunately, having two different sets of textures may only work in games where the player has no control over the viewport. If the player can look left/right, it is wiser to have all textures present in both cards' VRAM than to get hitching as the player looks around.

    For me, textures are kind of a thing of the past, especially very high resolution textures. Games should use microtexturing and proper shader code to create materials. Then there would be no need for even 2GB of VRAM at 4K.
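
    The idea, roughly (a toy illustration of procedural materials, not code from any shipping game; real engines would do this in HLSL on the GPU), is that surface detail gets computed from coordinates at runtime instead of being fetched from multi-gigabyte texture sets:

        // Toy value-noise "material": detail from math, not from VRAM.
        #include <cmath>
        #include <cstdint>

        // Cheap lattice hash, the usual noise building block.
        static float Hash(int x, int y)
        {
            uint32_t h = static_cast<uint32_t>(x) * 374761393u
                       + static_cast<uint32_t>(y) * 668265263u;
            h = (h ^ (h >> 13)) * 1274126177u;
            return static_cast<float>(h & 0xffff) / 65535.0f;
        }

        // Bilinearly blended value noise over the unit lattice.
        static float ValueNoise(float u, float v)
        {
            int   xi = static_cast<int>(std::floor(u));
            int   yi = static_cast<int>(std::floor(v));
            float fx = u - xi, fy = v - yi;
            float a = Hash(xi, yi),     b = Hash(xi + 1, yi);
            float c = Hash(xi, yi + 1), d = Hash(xi + 1, yi + 1);
            float top = a + (b - a) * fx;
            float bot = c + (d - c) * fx;
            return top + (bot - top) * fy;
        }

        // "Rough stone" brightness from two octaves: zero texture
        // memory, arbitrary detail at any resolution, even 4K.
        float StoneBrightness(float u, float v)
        {
            return 0.6f * ValueNoise(u * 8.0f,  v * 8.0f)
                 + 0.4f * ValueNoise(u * 32.0f, v * 32.0f);
        }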
     
  12. ---TK---

    ---TK--- Guest

    Messages:
    22,104
    Likes Received:
    3
    GPU:
    2x 980Ti Gaming 1430/7296
    That really doesn't interest me; my settings and resolution require AFR.
     
  13. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,132
    Likes Received:
    974
    GPU:
    Inno3D RTX 3090
    Well, part of this conversation is that the game developers will now be the ones deciding what is more important. If they program for AFR (which is hard), you will get AFR. If they don't, you won't.

    May I ask why SFR isn't something you would use? Doesn't microstutter bother you with multi-GPU? It almost drove me mad.
     
  14. oGow89

    oGow89 Guest

    Messages:
    1,213
    Likes Received:
    0
    GPU:
    Gigabyte RX 5700xt
    Alright, I thought I should post this for the fanboys out there to see:

    https://www.youtube.com/watch?v=ESeFXwGfBsI

    And one thing to add: anyone who recommends buying a GTX 960 for whatever reason should just go visit a doctor; he might figure out what is wrong with him.
     
  15. oGow89

    oGow89 Guest

    Messages:
    1,213
    Likes Received:
    0
    GPU:
    Gigabyte RX 5700xt
    Here again, finally a nice review of the R9 390, showing the card at stock and OC'd versus a GTX 970 at stock and OC'd:
    https://www.youtube.com/watch?v=k9cKZiJw6Pk

    Anyways, I guess there is no point anymore in recommending the GTX 970.

    Just one question for AMD: why the f**k didn't you release this card 6 months ago? :bang:

    It is a rebrand anyways; how hard is it to put a good cooler on the damn thing and tweak the clocks?
     
    Last edited: Jul 7, 2015

  16. The Mac

    The Mac Guest

    Messages:
    4,404
    Likes Received:
    0
    GPU:
    Sapphire R9-290 Vapor-X
    Um... because it's more than that?

    Updated microcode, updated process, etc., etc.
     
  17. theoneofgod

    theoneofgod Ancient Guru

    Messages:
    4,677
    Likes Received:
    287
    GPU:
    RX 580 8GB
    Let's knock that on the head for now. There's no proof either way, just speculation.
     
  18. The Mac

    The Mac Guest

    Messages:
    4,404
    Likes Received:
    0
    GPU:
    Sapphire R9-290 Vapor-X
    All of the review sites pretty much say the same thing.

    It's an issue of semantics really: it's updated silicon, just the same design.

    If you want to call that a rebrand, knock yourself out; I'm tired of arguing about it.

    But it isn't, technically.
     
    Last edited: Jul 8, 2015
  19. theoneofgod

    theoneofgod Ancient Guru

    Messages:
    4,677
    Likes Received:
    287
    GPU:
    RX 580 8GB
    From an end user's point of view, you're paying more for a "new" process but the same performance (if you discount the driver improvements for the 300 cards).
    A 390 costs more than a 290X 8GB did (discontinued now) here. It's just not worth it.
     
    Last edited: Jul 8, 2015
  20. The Mac

    The Mac Guest

    Messages:
    4,404
    Likes Received:
    0
    GPU:
    Sapphire R9-290 Vapor-X
    I certainly don't disagree; performance improvement was never at issue.

    Considering what they purport to have improved, there just aren't any real-world gains.
     
