Quantum Break coming to PC - DirectX 12 only - Screenshots - Specs

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Feb 11, 2016.

  1. CalculuS

    CalculuS Ancient Guru

    Messages:
    3,048
    Likes Received:
    267
    GPU:
    GTX 1660Ti
My skepticism levels are off the charts, my pessimism as well.
     
  2. Denial

    Denial Ancient Guru

    Messages:
    13,294
    Likes Received:
    2,779
    GPU:
    EVGA RTX 3080
    But that's basically my point. The 280x has a gig more of ram on it.

    I went from a 690 to a 980, in the vast majority of games it was a sidegrade. Skyrim unmodded, same exact FPS. Throw a bunch of textures mods into it, and the 980 nearly doubled my 690.

    It shouldn't surprise people that the 770/680 is falling behind. There were threads on this very forum about the 7970 being the better long-term buy due to the lack of VRAM on the Nvidia equivalents. Why is that Nvidia screwing customers? Why isn't it customers screwing themselves by not doing basic research?

I personally don't care, I'll go with whoever has the fastest card at the time I feel like upgrading. My last AMD card was a 4870X2, and it was actually a semi-decent experience. I'll gladly go back to them.

I just can't stand the overall negativity in this thread and on Guru3D as of late. Nearly every single post is about company X screwing customer Y. Bunch of Debbie Downers if you ask me.
     
  3. ScottishPickle

    ScottishPickle Member

    Messages:
    42
    Likes Received:
    8
    GPU:
    8gb DDR3 1666
It's worth it imo. IPS/G-Sync/165Hz/1440p is what you need. Gaming is now a pleasure. I kill more often, and games like Alan Wake (which I hadn't played before) look amazing.

    I have the Acer XB270HU!
     
  4. TBPH

    TBPH Active Member

    Messages:
    78
    Likes Received:
    0
    GPU:
    MSI GTX 970 3.5+.5GB
Like hell I'd go 1440p with a 970! No way.
     

  5. BroDragon

    BroDragon Member

    Messages:
    11
    Likes Received:
    0
    GPU:
    gtx970sli
If this game is indeed fully DX12, that implies it will support multiple different GPUs, including the iGPU on the CPU, yet no mention is made of this whatsoever. I see the requirements call for at least a Haswell CPU and a 700-series Nvidia graphics card, so I guess we'll have to wait and see whether this game is truly DX12 or whether they are just cherry-picking a few features.
     
  6. sammarbella

    sammarbella Ancient Guru

    Messages:
    3,929
    Likes Received:
    178
    GPU:
    290X Lightning CFX (H2O)
This game can be DX12 and still not add multi-card support at all.

One thing is all the options available in the DX12 API; a VERY different thing is a game dev using them ALL in a game.

Let's wait for a serious game review. Until then I will remain skeptical.
     
  7. (.)(.)

    (.)(.) Banned

    Messages:
    9,094
    Likes Received:
    0
    GPU:
    GTX 970
A 980 Ti for Recommended!

Well, if you say so.

I mean, if DX12 is all that, I'd have thought we'd see a slowdown in inflated requirements for ports.

Will be interesting to see if modders find a DX11 workaround.
     
  8. BroDragon

    BroDragon Member

    Messages:
    11
    Likes Received:
    0
    GPU:
    gtx970sli
If the 'recommended' graphics card is a 980 Ti, then the developers are placing hardware demands beyond the means of most consumers, and therefore shooting themselves in the foot if they want to sell a lot of copies. It would make sense, and be relatively simple, to code the game to use the iGPU for post-processing, lightening the load on the discrete GPU and letting a more mainstream graphics card handle high settings. Also, I'm assuming they mean driving a 1080p monitor, and a 980 Ti is way overkill for that resolution even at 144 Hz.
     
  9. TBPH

    TBPH Active Member

    Messages:
    78
    Likes Received:
    0
    GPU:
    MSI GTX 970 3.5+.5GB
    That's pushing it a bit unless you're talking about older games.
     
  10. BroDragon

    BroDragon Member

    Messages:
    11
    Likes Received:
    0
    GPU:
    gtx970sli
6 GB of memory for 1080p? No way!
     

  11. blaugznis

    blaugznis Member Guru

    Messages:
    124
    Likes Received:
    0
    GPU:
    Gainward GTX 1060 6GB
Didn't Dune 2000 already try live-action actors in cutscenes back in 1998? :D

Or was it Emperor: Battle for Dune? :)
     
    Last edited: Feb 12, 2016
  12. quickkill2021

    quickkill2021 Member Guru

    Messages:
    131
    Likes Received:
    3
    GPU:
    1080ti sli Poseidon
This game better look twice as good as The Witcher 3 / Assassin's Creed Unity / Rise of the Tomb Raider.

This game better look so good that it is literally a representation not of this generation but of the next generation. I better be able to see individual hair strands and sweat on each NPC.

When I play this game, I better see my 4930K at 100 percent CPU usage and my two 780 Tis consuming 1,100 watts at 100 percent GPU usage each, and it better show 30 fps at 1080p.

Only then, and only then, will this game be worthy of those system requirements.

But! If this game just looks as good as any other game that has come out, like The Division, with GameWorks pre-baked in, with godly amounts of tessellation, tessellating every single crevice of the game and nose hair, with TXAA 4x all over the place and no way to turn it off, then obviously this game is a crap port.
     
  13. kegastaMmer

    kegastaMmer Master Guru

    Messages:
    323
    Likes Received:
    39
    GPU:
    strix 1070 GTX
Saving now for something later next holiday season then. Gosh, the textures and AO :V
     
  14. (.)(.)

    (.)(.) Banned

    Messages:
    9,094
    Likes Received:
    0
    GPU:
    GTX 970
This. Exact same problem I had with the recently released Tomb Raider.

I'd like to know why nearly every game now demands the top-end cards, when back in the days of Crysis 1, 2, and 3 (though I felt 3 was pushing it), those games were like nothing else in the industry and looked like they deserved the system requirements they asked for.

So unless the PC version is, like you say, the next gen of the next gen, then from what little I've seen of the game (granted, it was most likely X1 footage), I fail to see how it requires a 980 Ti.
     
  15. stilli1988

    stilli1988 Member

    Messages:
    13
    Likes Received:
    0
    GPU:
    MSI GTX 1070 Gaming X
    don't think it's that simple mate
     

  16. vazup

    vazup Master Guru

    Messages:
    307
    Likes Received:
    12
    GPU:
    r9 280X
Well, sorry that not all of us were born in wealthy, high-paying countries... a-hole.
     
  17. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    12,158
    Likes Received:
    4,317
    GPU:
    2080Ti @h2o
Is this now a new-hardware vs. old-hardware, high-income vs. low-income debate? :wanker:

I personally also see that not everybody can afford a 10-core CPU or SLI of Fury X / 980 Ti. Yet those who can should be able to make use of it too. I would love to see games scale from older hardware up to the newest stuff and CFX/SLI! And maybe this game actually helps there, since it supports DX12, probably getting a tad more performance out of older systems. (Up-to-date/enthusiast hardware won't see gains from DX12 as big as older hardware will.)
     
  18. Ryu5uzaku

    Ryu5uzaku Ancient Guru

    Messages:
    7,026
    Likes Received:
    242
    GPU:
    6800 XT
I am not, since it is published by MS.
     
  19. Glottiz

    Glottiz Master Guru

    Messages:
    569
    Likes Received:
    176
    GPU:
    TUF 3080 OC
    Calm down guys. Requirements were updated.

     
  20. edilsonj

    edilsonj Active Member

    Messages:
    53
    Likes Received:
    1
    GPU:
    Gigabyte RTX 2060 S
Specs updated.

[URL="http://remedygames.com/quantum-break-pre-orders-and-previews-launch-windows-10-version-announced/"]http://remedygames.com/quantum-break-pre-orders-and-previews-launch-windows-10-version-announced/[/URL]

AMD's async shaders (Hitman) versus the GameWorks DX12 edition (Quantum Break).

AMD needs to be best in Hitman's GFX performance.

    :war:
     