Quantum Break coming to PC - DirectX 12 only - Screenshots - Specs

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Feb 11, 2016.

  1. sammarbella

    sammarbella Ancient Guru

    Messages:
    3,929
    Likes Received:
    178
    GPU:
    290X Lightning CFX (H2O)
    Microsoft is finally starting the DX12 gaming era on PC by pushing its Xbox One "exclusive" AAA titles Fable Legends and Quantum Break to PC with DX12.

    AotS is the only DX12 game so far, but it's not an AAA title and a lot of people consider it simply an expensive AMD showcase.

    The Fable Legends benchmark showed very different DX12 results for AMD in an upcoming DX12 AAA title: there, the DX12 results are clearly better for Nvidia.

    Some people want to believe DX12 will magically shift the GPU lead to AMD and erase its DX11 driver performance problems (?), but I think Nvidia's leverage with game publishers/devs and its better gaming support will remain.
     
  2. Rich_Guy

    Rich_Guy Ancient Guru

    Messages:
    12,676
    Likes Received:
    663
    GPU:
    MSI 2070S X-Trio
    Recommended card from AMD should be the 390X, as the Fury X hasn't got enough VRAM.
     
  3. sammarbella

    sammarbella Ancient Guru

    Messages:
    3,929
    Likes Received:
    178
    GPU:
    290X Lightning CFX (H2O)
    Yep, AMD's worst error with the Fury X is the use of HBM1, which is limited to 4 GB of VRAM. 1440p/4K gaming will need more than that.

    ROTTR already needs 6 GB of VRAM for 1440p at Very High... and for 1080p/60 a 390X is enough.

    VRAM requirements in AAA titles will not decrease.
     
  4. Dygaza

    Dygaza Master Guru

    Messages:
    536
    Likes Received:
    0
    GPU:
    Vega 64 Liquid
    The Fury series was mostly an HBM test. They don't even benefit that much from the bandwidth, as the memory controllers can't get everything out of it anyway (it should be 512 GB/s, but in practice it reaches ~370 GB/s because of the memory controllers).
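    As a rough back-of-the-envelope check of those numbers (my own sketch, assuming Fiji's commonly quoted HBM1 configuration of 4 stacks, a 1024-bit interface per stack, and 1 Gbit/s per pin):

        // Hypothetical sketch, not from the post: theoretical HBM1 peak bandwidth
        // for Fiji/Fury X, and how the quoted ~370 GB/s real-world figure compares.
        #include <cstdio>

        int main() {
            const double stacks         = 4.0;     // assumed HBM1 stack count on Fiji
            const double bits_per_stack = 1024.0;  // assumed interface width per stack
            const double gbit_per_pin   = 1.0;     // 500 MHz double data rate = 1 Gbit/s per pin
            const double peak_gb_s = stacks * bits_per_stack * gbit_per_pin / 8.0;  // 512 GB/s
            std::printf("Theoretical peak: %.0f GB/s\n", peak_gb_s);
            std::printf("~370 GB/s measured is about %.0f%% of that\n",
                        370.0 / peak_gb_s * 100.0);                                 // ~72%
            return 0;
        }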
     

  5. siriq

    siriq Master Guru

    Messages:
    790
    Likes Received:
    14
    GPU:
    Evga GTX 570 Classified
    Yup! I wanna test this card in DX12. Also, one more DX12 game is coming soon in March.
     
  6. DeskStar

    DeskStar Maha Guru

    Messages:
    1,189
    Likes Received:
    190
    GPU:
    EVGA 2080Ti/3090FTW
    AHAHAHAHAHA. YES!!! These comments on here made my day...

    People crying about requirements for a game are hilarious. You do know there are ways to upgrade a PC? 16GB of RAM is just fine. Any way to keep overhead/bottlenecks (HDD/SSD) out of the way, I welcome it. Having half or more of your game loaded into RAM at all times would be epic.

    Memory is so darn cheap these days people should want to rock at least 32GB!!

    I factor things like this into a build... hence I wanted 64GB in my rig, so I would not have this issue. Also, knowing when and/or if you want to do an upgrade/build makes all the difference in the life of said build... why I'll always break the bank with an EPIC motherboard that supports the universe 🌌...
     
  7. sammarbella

    sammarbella Ancient Guru

    Messages:
    3,929
    Likes Received:
    178
    GPU:
    290X Lightning CFX (H2O)
    Did you seriously just call the Fury series "an HBM test"?

    AMD's GPU department (RTG) finances have no margin for a "test" or even "errors" in GPU series releases.

    They can (and need to) develop prototypes and test them in their labs, but the Fury series was not a test, it was a fail.
    :3eyes:
     
  8. Tuukka

    Tuukka Active Member

    Messages:
    83
    Likes Received:
    4
    GPU:
    Asus GTX660 OC 2 GB
    I'll probably have the money for those specs in about a year, so no thanks. 50€ for a game? Thanks, but no thanks. Or am I going to get CPUs and GPUs that cost me about 1000€? I don't think so. Pass.
     
  9. Dygaza

    Dygaza Master Guru

    Messages:
    536
    Likes Received:
    0
    GPU:
    Vega 64 Liquid
    You know exactly what I mean by "test". AMD knew 4GB would be problematic in edge cases, yet they had to do it, as they were developing new tech and investing in it heavily. If it had been possible to get 8GB onto the Furys, they would have done it. Sometimes you have to roll new tech out, and sometimes you hit limits that aren't optimal.
     
  10. TBPH

    TBPH Active Member

    Messages:
    78
    Likes Received:
    0
    GPU:
    MSI GTX 970 3.5+.5GB
    You clearly haven't tried playing Rise of the Tomb Raider on that rig. You need a 960 and a Haswell i5 to match the Xbox One in that game in everything other than textures.

    I'm willing to bet anything that those recommended specs are for 1080p60 and the minimum is for sub-1080p at lowest settings.
     
    Last edited: Feb 11, 2016

  11. edilsonj

    edilsonj Active Member

    Messages:
    53
    Likes Received:
    1
    GPU:
    Gigabyte RTX 2060 S
  12. sammarbella

    sammarbella Ancient Guru

    Messages:
    3,929
    Likes Received:
    178
    GPU:
    290X Lightning CFX (H2O)
  13. Dygaza

    Dygaza Master Guru

    Messages:
    536
    Likes Received:
    0
    GPU:
    Vega 64 Liquid
    I personally wouldn't have a problem if it uses GameWorks. That would only mean Nvidia has made DX12 versions of it, thus encouraging the use of DX12. It's gonna happen one day anyway.
     
  14. sammarbella

    sammarbella Ancient Guru

    Messages:
    3,929
    Likes Received:
    178
    GPU:
    290X Lightning CFX (H2O)
    Yep, GameWorks 2.0 (the DX12 one...) is around the corner. Let's see what new features and performance "improvements" we receive from it, especially AMD GPU owners.
     
  15. Weekend

    Weekend Member

    Messages:
    49
    Likes Received:
    3
    GPU:
    1660 S
    I remember someone said the Fury uses less VRAM compared to GDDR5 cards. I don't know if that is true, but anyway, if anyone is going for Fury CrossFire then there is no problem under DX12. 8GB of HBM... I really want to see some benchmarks of that compared to CrossFire/SLI 12GB GDDR5 cards.
     

  16. sammarbella

    sammarbella Ancient Guru

    Messages:
    3,929
    Likes Received:
    178
    GPU:
    290X Lightning CFX (H2O)
    I don't believe in DX12's on-paper promises, I need to see them in real life.

    A unified VRAM memory pool as the solution for all multi-card GPU setups is very nice on paper.

    The problem is that it means a lot of work for game devs...
     
  17. Weekend

    Weekend Member

    Messages:
    49
    Likes Received:
    3
    GPU:
    1660 S
    I see your point, but both AMD and Nvidia could benefit a lot from it if the coding is right. It's best for them to start DX12 AAA titles with a good unified VRAM memory pool solution.
    I myself wouldn't mind getting a 2nd 270X (or planning my next upgrade around CrossFire/SLI, or maybe mixed) if they code it right for DX12.
     
  18. sammarbella

    sammarbella Ancient Guru

    Messages:
    3,929
    Likes Received:
    178
    GPU:
    290X Lightning CFX (H2O)
    The main problem with that is that in DX12, multi-card support (including the memory pool) is ALL and ONLY in the game devs' hands.

    GPU drivers can't add or fix anything related to CFX/SLI in DX12 games like they did until now in DX9-11 games.

    Nvidia GameWorks' DX12 access to game code will be able to add some features for Nvidia GPUs (and even tax AMD performance), but they are not going to add multi-card support to games from scratch by themselves.
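    To illustrate what "in the game devs' hands" means in practice, here is a minimal sketch of my own (not from this thread or any shipping engine) of how a DX12 title has to enumerate every GPU and create a device per adapter itself; there is no driver-side CFX/SLI profile doing it behind the app's back:

        // Hypothetical DX12 explicit multi-adapter sketch.
        // Link against d3d12.lib and dxgi.lib on Windows.
        #include <d3d12.h>
        #include <dxgi1_4.h>
        #include <wrl/client.h>
        #include <vector>
        #include <cstdio>

        using Microsoft::WRL::ComPtr;

        int main() {
            ComPtr<IDXGIFactory4> factory;
            if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
                return 1;

            std::vector<ComPtr<ID3D12Device>> devices;
            ComPtr<IDXGIAdapter1> adapter;
            for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
                DXGI_ADAPTER_DESC1 desc{};
                adapter->GetDesc1(&desc);
                if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
                    continue;  // skip WARP / software adapters

                ComPtr<ID3D12Device> device;
                if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                                IID_PPV_ARGS(&device)))) {
                    std::printf("GPU %u: %ls, %zu MB dedicated VRAM\n",
                                i, desc.Description,
                                (size_t)(desc.DedicatedVideoMemory >> 20));
                    // From here on, splitting work and resources across these devices
                    // (AFR, split-frame, shared heaps, ...) is entirely the engine's job.
                    devices.push_back(device);
                }
            }
            std::printf("Usable DX12 adapters: %zu\n", devices.size());
            return 0;
        }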
     
  19. TBPH

    TBPH Active Member

    Messages:
    78
    Likes Received:
    0
    GPU:
    MSI GTX 970 3.5+.5GB
    You are not going to get the full unified pool no matter what. At least some data will be replicated, and it'll be years before we see any unified pool at all. I can guarantee you that this game will not take advantage of that at all.
     
  20. Cave Waverider

    Cave Waverider Maha Guru

    Messages:
    1,122
    Likes Received:
    132
    GPU:
    RTX 3090 ROG Strix
    Hey, I'm actually happy DirectX 12 games are finally coming! ;)
     
