Quantum Break coming to PC - DirectX 12 only - Screenshots - Specs

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Feb 11, 2016.

  1. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
    But that's basically my point. The 280x has a gig more of ram on it.

    I went from a 690 to a 980, and in the vast majority of games it was a sidegrade. Skyrim unmodded: same exact FPS. Throw a bunch of texture mods into it, and the 980 nearly doubled my 690.

    It shouldn't surprise people that the 770/680 is falling behind. There were threads on this very forum about the 7970 being the better long-term buy due to the lack of VRAM on the Nvidia equivalents. How is that Nvidia screwing customers? Why isn't it customers screwing themselves by not doing basic research?

    I personally don't care, I'll go with whoever has the fastest card at the time I feel like upgrading. My last AMD card was a 4870X2, and it was actually a semi-decent experience. I'll gladly go back to them.

    I just can't stand the overall negativity in this thread and on Guru3D as of late. Nearly every single post is about x company screwing y customer. Bunch of Debbie Downers if you ask me.
     
  2. ScottishPickle

    ScottishPickle Member

    Messages:
    45
    Likes Received:
    9
    GPU:
    8gb DDR3 1666
    It's worth it imo. IPS/G-Sync/165Hz/1440p is what you need. Gaming is now a pleasure. I kill more often, and games like Alan Wake (which I hadn't played before) look amazing.

    I have the Acer XB270HU!
     
  3. TBPH

    TBPH Guest

    Messages:
    78
    Likes Received:
    0
    GPU:
    MSI GTX 970 3.5+.5GB
    Like hell I'd go 1440p with a 970! No way.
     
  4. BroDragon

    BroDragon Guest

    Messages:
    11
    Likes Received:
    0
    GPU:
    gtx970sli
    If this game is indeed fully DX12, then that implies the game will support multiple different GPUs, including the iGPU on the CPU, yet no mention is made of this whatsoever. I see that the requirements call for at least a Haswell CPU and a 700-series Nvidia graphics card, so I guess we'll have to wait and see if this game is truly DX12 or if they are just cherry-picking a few features.
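    For what it's worth, here's a rough, purely illustrative C++ sketch (nothing to do with Remedy's actual code) of what "DX12 can see every GPU" means in practice: plain DXGI enumerates each adapter, the iGPU included, and you can test whether a D3D12 device comes up on it. Whether a game then schedules any work across those adapters is a separate decision entirely.

    // Illustrative only: list every adapter DXGI exposes and check D3D12 support.
    // Link against dxgi.lib and d3d12.lib.
    #include <dxgi1_4.h>
    #include <d3d12.h>
    #include <wrl/client.h>
    #include <cstdio>

    using Microsoft::WRL::ComPtr;

    int main()
    {
        ComPtr<IDXGIFactory4> factory;
        if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
            return 1;

        ComPtr<IDXGIAdapter1> adapter;
        for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
        {
            DXGI_ADAPTER_DESC1 desc{};
            adapter->GetDesc1(&desc);

            // Skip the software rasterizer; everything else (Intel iGPU, discrete
            // cards) shows up here and could in principle be given work under DX12.
            if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
                continue;

            // Passing nullptr for the device just asks "would creation succeed?".
            bool dx12Capable = SUCCEEDED(D3D12CreateDevice(
                adapter.Get(), D3D_FEATURE_LEVEL_11_0, __uuidof(ID3D12Device), nullptr));

            wprintf(L"Adapter %u: %s (%zu MB VRAM, DX12: %s)\n",
                    i, desc.Description, desc.DedicatedVideoMemory >> 20,
                    dx12Capable ? L"yes" : L"no");
        }
        return 0;
    }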
     

  5. sammarbella

    sammarbella Guest

    Messages:
    3,929
    Likes Received:
    178
    GPU:
    290X Lightning CFX (H2O)
    This game can be DX12 and not add multi-card support at all.

    One thing is all the options available in the DX12 API, and a VERY different thing is a game dev using them ALL in a game.

    Let's wait for a serious game review; until then I will remain skeptical.
     
  6. (.)(.)

    (.)(.) Banned

    Messages:
    9,089
    Likes Received:
    0
    GPU:
    GTX 970
    A 980Ti for Recommended!

    Well, if you say so.

    I mean, if DX12 is all that, I'd have thought we'd see a slowdown in inflated requirements for ports.

    Will be interesting to see if modders find a DX11 workaround.
     
  7. BroDragon

    BroDragon Guest

    Messages:
    11
    Likes Received:
    0
    GPU:
    gtx970sli
    If 'recommended' graphics is a 980 Ti, then the developers are placing hardware demands beyond the means of most consumers, and therefore shooting themselves in the foot if they want to sell a lot of games. It makes sense, and would be relatively simple, to code the game to utilize the iGPU for the post-processing and lighten the load on the GPU, allowing a more mainstream graphics card to handle high settings. Also, I'm assuming they are talking about driving a 1080p monitor, and a 980 Ti is way overkill for that resolution, even at 144 Hz.
     
  8. TBPH

    TBPH Guest

    Messages:
    78
    Likes Received:
    0
    GPU:
    MSI GTX 970 3.5+.5GB
    That's pushing it a bit unless you're talking about older games.
     
  9. BroDragon

    BroDragon Guest

    Messages:
    11
    Likes Received:
    0
    GPU:
    gtx970sli
    6gb of memory for 1080p? No way!
     
  10. blaugznis

    blaugznis Member Guru

    Messages:
    125
    Likes Received:
    0
    GPU:
    GTX 1060 6GB
    Didn't Dune 2000 already try live-action actors in cutscenes back in 1998? :D

    Or was it Emperor: Battle for Dune? :)
     
    Last edited: Feb 12, 2016

  11. quickkill2021

    quickkill2021 Guest

    Messages:
    131
    Likes Received:
    3
    GPU:
    1080ti sli Poseidon
    This game had better look twice as good as The Witcher 3 / Assassin's Creed Unity / Rise of the Tomb Raider.

    This game had better look so good that it is literally a representation not of this generation but of the next generation. I had better be able to see individual hair strands and sweat on each NPC.

    When I play this game I had better see my 4930K at 100 percent CPU usage and my two 780 Tis consuming 1,100 watts of power at 100 percent GPU usage each, and it had better show 30 FPS at 1080p.

    Only then, and only then, will this game be worthy of those system requirements.

    But! If this game only looks as good as any other game that has come out, like The Division with GameWorks baked in, with godly amounts of tessellation covering every single crevice and nose hair, and TXAA 4x all over the place with no way to turn it off, then obviously this game is a crap port.
     
  12. kegastaMmer

    kegastaMmer Guest

    Messages:
    326
    Likes Received:
    40
    GPU:
    strix 1070 GTX
    Saving now for something next holiday season then. Gosh, the textures and AO :V
     
  13. (.)(.)

    (.)(.) Banned

    Messages:
    9,089
    Likes Received:
    0
    GPU:
    GTX 970
    This. Exact same problem I had with the recently released Tomb Raider.

    I'd like to know why nearly every game now demands the top-end cards, when back in the days of Crysis 1, 2 and 3 (though I felt 3 was pushing it) those games were like nothing else in the industry and looked like they deserved the system requirements they asked for.

    So unless the PC version is, like you say, the next gen of the next gen, I fail to see, from what little I've seen of the game (granted, it was most likely X1 footage), how it requires a 980 Ti.
     
  14. stilli1988

    stilli1988 Guest

    Messages:
    13
    Likes Received:
    0
    GPU:
    MSI GTX 1070 Gaming X
    Don't think it's that simple, mate.
     
  15. vazup

    vazup Guest

    Messages:
    333
    Likes Received:
    26
    GPU:
    r9 280X
    Well, sorry that not all of us were born in wealthy, high-paying countries... a-hole.
     

  16. fantaskarsef

    fantaskarsef Ancient Guru

    Messages:
    15,754
    Likes Received:
    9,647
    GPU:
    4090@H2O
    Is this now a new hardware vs. old hardware, high income vs. low income debate? :wanker:

    I personally also see that not everybody can afford a 10-core CPU or SLI Fury X / 980 Ti. Yet those who can should be able to make use of it too. I'd love to see games scale from older hardware up to the newest stuff and CFX / SLI! And maybe this game actually does help, since it supports DX12, probably getting a tad more performance out of older systems. (Up-to-date / enthusiast hardware won't see as big gains from DX12 as older hardware.)
     
  17. Ryu5uzaku

    Ryu5uzaku Ancient Guru

    Messages:
    7,551
    Likes Received:
    608
    GPU:
    6800 XT
    I am not, since it is published by MS.
     
  18. Glottiz

    Glottiz Ancient Guru

    Messages:
    1,949
    Likes Received:
    1,171
    GPU:
    TUF 3080 OC
    Calm down, guys. Requirements were updated.

     
  19. edilsonj

    edilsonj Active Member

    Messages:
    55
    Likes Received:
    1
    GPU:
    Palit RTX 3060 Ti
    Specs updated.

    http://remedygames.com/quantum-break-pre-orders-and-previews-launch-windows-10-version-announced/

    AMD's async shaders (Hitman) versus GameWorks DX12 edition (Quantum Break).

    AMD needs to come out on top in Hitman's GFX performance.

    :war:
     
  20. isidore

    isidore Guest

    Messages:
    6,276
    Likes Received:
    58
    GPU:
    RTX 2080TI GamingOC
    Why do people keep commenting on the fact that the Fury X has 4 GB of memory!? I've played games with maximum texture settings at 1440p and had up to 2 GB less VRAM used compared to the GDDR5 competition. Stop worrying about Fury X memory. HBM is not GDDR5.
    Rise of the Tomb Raider is a pile of crap when it comes to memory utilization. I really doubt it uses HBM as it should; it's the first game that uses 3.9 GB of VRAM on my Fury.
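    For anyone curious how a figure like that 3.9 GB even gets measured: on Windows 10 the per-process VRAM counters are exposed through DXGI. A minimal sketch (illustrative only, not what any particular game or overlay tool is known to do):

    // Illustrative only: read this process's local (on-card) memory usage and budget
    // through the DXGI 1.4 interfaces available on Windows 10. Link against dxgi.lib.
    #include <dxgi1_4.h>
    #include <wrl/client.h>
    #include <cstdio>

    using Microsoft::WRL::ComPtr;

    int main()
    {
        ComPtr<IDXGIFactory4> factory;
        if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
            return 1;

        ComPtr<IDXGIAdapter1> adapter;
        if (factory->EnumAdapters1(0, &adapter) == DXGI_ERROR_NOT_FOUND)
            return 1;

        // QueryVideoMemoryInfo lives on the Windows 10 / DXGI 1.4 adapter interface.
        ComPtr<IDXGIAdapter3> adapter3;
        if (FAILED(adapter.As(&adapter3)))
            return 1;

        DXGI_QUERY_VIDEO_MEMORY_INFO info{};
        adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info);

        // CurrentUsage: what this process currently has resident in local VRAM.
        // Budget: how much the OS will let it use before residency gets demoted.
        wprintf(L"VRAM in use: %llu MB of a %llu MB budget\n",
                info.CurrentUsage >> 20, info.Budget >> 20);
        return 0;
    }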

    On topic: stop comparing console hardware and console game development with PCs. The reality is that you need about two times the power to run the game at console settings, and the better the game looks, the higher the PC specs.
    Also, without DX12 features this game will probably be unplayable on PCs. (But this is just speculation; we shall see.)
     
