Quantum Break coming to PC - DirectX 12 only - Screenshots - Specs

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Feb 11, 2016.

  1. TBPH

    TBPH Guest

    Messages:
    78
    Likes Received:
    0
    GPU:
    MSI GTX 970 3.5+.5GB
    So a GTX 970 should underperform an underclocked HD 7790 because one uses async compute? OK.
     
  2. Denial

    Denial Ancient Guru

    Messages:
    14,206
    Likes Received:
    4,118
    GPU:
    EVGA RTX 3080
    What are you even talking about?

    Rise of the Tomb Raider is fine. The only cards that had problems were Fiji-based, and that was fixed with a patch/driver update. GameWorks had literally nothing to do with the performance issues there. It's an excellent port and an excellent game.

    Further, the Xbox One's GPU has a theoretical output of about 1.32 TFLOPS, while the GTX 760 is around 2.2 TFLOPS. That's not even double, so I have no idea where you are getting 2-3x from. This isn't even to mention that the game can't possibly support hardware as old as GTA V does (a 9800 GT), because there aren't any DX12 drivers yet for anything lower than Kepler and GCN. I should also point out that the developer needs to manually write driver-level support for nearly every single configuration under DX12, so the range of supported specs is bound to be smaller than in previous DX titles.
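    For what it's worth, the paper math behind those numbers is simple: theoretical FP32 throughput is just shader count x clock x 2 (an FMA counts as two ops). A minimal sketch in Python, assuming the commonly quoted shader counts and clocks for both chips:

    # Theoretical FP32 throughput = shaders * clock (GHz) * 2 ops per FMA.
    # The shader counts and clocks below are the commonly quoted figures,
    # so treat them as assumptions rather than gospel.
    def fp32_tflops(shaders: int, clock_ghz: float) -> float:
        return shaders * clock_ghz * 2 / 1000.0  # GFLOPS -> TFLOPS

    print(f"Xbox One GPU: {fp32_tflops(768, 0.853):.2f} TFLOPS")   # ~1.31
    print(f"GTX 760:      {fp32_tflops(1152, 0.980):.2f} TFLOPS")  # ~2.26

    Of course, those are only peak ALU numbers; they say nothing about how efficiently each architecture actually uses them.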

    Alan Wake got a great port, with great graphics and added functionality. I have no reason to believe this one won't as well.

    Rofl what?

    http://www.guru3d.com/articles_page..._graphics_performance_benchmark_review,7.html

    https://www.techpowerup.com/reviews/Performance_Analysis/Rise_of_the_Tomb_Raider/4.html

    Edit: Wrote Mad Max instead of Alan Wake. Probably because all the terrible posts in this thread are driving me mad.
     
    Last edited: Feb 11, 2016
  3. AsiJu

    AsiJu Ancient Guru

    Messages:
    8,806
    Likes Received:
    3,368
    GPU:
    KFA2 4070Ti EXG.v2
    Tomb Raider turned out just fine and will surely get better still with future patches.
    Granted, the port had a certain rushed-out feeling to it initially, but that seems to be the standard today.

    Plus there's one major difference here: Remedy will (surely) do the port themselves.
    With ROTTR the developer and PC porter were different companies (CD / Nixxes).

    If I'm wrong and this one turns out to be unoptimized, then so be it, but making assumptions about the quality of an unreleased game based on system specs is a pretty big leap, don't you think?
     
    Last edited: Feb 11, 2016
  4. TBPH

    TBPH Guest

    Messages:
    78
    Likes Received:
    0
    GPU:
    MSI GTX 970 3.5+.5GB
    You can't compare FLOPs unless two GPUs have the exact same architecture.

    And okay, so I get worse textures than the Xbox One and only a slightly better framerate at the same settings, thanks to excessive tessellation and no option for the selective tessellation the Xbox One version uses. Cool.

    By the way, there's nothing wrong with Fiji; the problem is that you need more than 8GB of VRAM to use ultra textures, even though the Xbox One manages it with 6GB of total memory for the game.
     

  5. Seketh

    Seketh Ancient Guru

    Messages:
    1,899
    Likes Received:
    6
    GPU:
    RX 580 8GB
    Maxwell is still missing a couple of DX12 features, like Asynchronous Shaders, which Nvidia is trying to work around with drivers. AMD wins hands down at the feature level, but that doesn't necessarily translate into better performance.

    No, where you guys will be screwed is with GameWorks, because Nvidia's slogan seems to be "Buy a GPU every year"™.

    I don't care what team you root for, you gotta give credit to AMD for GCN. A 7970 with 6GB still holds up quite well today. Meanwhile, the 680...
     
  6. Denial

    Denial Ancient Guru

    Messages:
    14,206
    Likes Received:
    4,118
    GPU:
    EVGA RTX 3080
    You get double the framerate, better textures and better shadows.

    http://www.pcper.com/reviews/Graphi...Performance-Results/Feb-5th-Patch-and-Multi-G

    There was something wrong with Fiji.

    The textures are clearly better on PC than they are in the Xbox version of the game. There are obviously diminishing returns in detail as texture resolution (and size) increases. In fact, this goes for just about everything, which is why you can't expect hardware that's twice as fast to produce games that look twice as good.
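    To put some rough numbers on those diminishing returns: texture memory grows with the square of the resolution, while the texture still covers the same pixels on screen. A quick sketch, assuming uncompressed RGBA8 textures with a full mip chain (real games use compressed formats, so the absolute numbers are lower, but the scaling is the same):

    # Memory cost of a square RGBA8 texture; a full mip chain adds roughly 1/3.
    def texture_mib(side_px: int, bytes_per_texel: int = 4, with_mips: bool = True) -> float:
        base_bytes = side_px * side_px * bytes_per_texel
        return base_bytes * (4 / 3 if with_mips else 1) / 2**20

    for side in (1024, 2048, 4096, 8192):
        print(f"{side}x{side}: {texture_mib(side):7.1f} MiB")
    # Each step quadruples the memory cost, but the visible gain on screen
    # shrinks every time, which is the diminishing-returns point above.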
     
  7. zzzaac

    zzzaac Guest

    Messages:
    39
    Likes Received:
    0
    GPU:
    -
    Yup, though it's not good for their bottom line imo
     
  8. zer0_c0ol

    zer0_c0ol Ancient Guru

    Messages:
    2,976
    Likes Received:
    0
    GPU:
    FuryX cf
    DX12 and CPU overhead? Yeah, no.
     
  9. theoneofgod

    theoneofgod Ancient Guru

    Messages:
    4,677
    Likes Received:
    287
    GPU:
    RX 580 8GB
    Huh?
     
  10. RzrTrek

    RzrTrek Guest

    Messages:
    2,548
    Likes Received:
    741
    GPU:
    -
    Time to get to work with that DirectX 12 driver for Fermi.
     

  11. Denial

    Denial Ancient Guru

    Messages:
    14,206
    Likes Received:
    4,118
    GPU:
    EVGA RTX 3080
    How does GameWorks force me to buy a new GPU every year?

    https://tpucdn.com/reviews/Sapphire/R9_390_Nitro/images/perfrel_1920_1080.png

    The 780 Ti still seems to be neck and neck with a 290X, and it came out in 2013, three years ago.

    And yeah, older AMD cards are doing better than older Nvidia cards, but that's mostly because of what you said: the memory. They all had more VRAM, which is becoming a huge factor in newer games. Look at AnandTech's 2015 benchmarks between the 680 and the 7970. They look almost identical to the release benchmarks from 2012, with the exception of higher-resolution tests and Shadow of Mordor, due to the VRAM limitation.

    I don't see how Nvidia screwed anyone. When I bought my 690 I knew it had 2GB of VRAM. I knew it was going to be an issue as games with higher-res textures came out. I essentially sidegraded to my 980 and saw huge gains in games with VRAM limitations.
     
  12. kinggavin

    kinggavin Guest

    Messages:
    297
    Likes Received:
    0
    GPU:
    gtx 980TI super jetstream
    It's interesting that they say 6GB VRAM and a Fury X; last time I looked, the Fury X has 4GB of VRAM.
     
  13. TBPH

    TBPH Guest

    Messages:
    78
    Likes Received:
    0
    GPU:
    MSI GTX 970 3.5+.5GB
    I don't get anywhere close to double the frame rate (it's often in the 40s), and I get worse textures because the Xbox One uses ultra textures and I can only manage high without extreme stuttering and a huge drop in performance.

    It would be okay if I had a G-Sync monitor, but I'm not spending $500 to permanently lock myself into Nvidia.
     
    Last edited: Feb 11, 2016
  14. bigfutus

    bigfutus Master Guru

    Messages:
    535
    Likes Received:
    59
    GPU:
    MSI 3080 VENTUS 10G
    I bet they will whine about low sales and blame piracy, and not the Win10 exclusivity.
     
  15. kinggavin

    kinggavin Guest

    Messages:
    297
    Likes Received:
    0
    GPU:
    gtx 980TI super jetstream
    I felt a bit screwed: I spent £350 on a GTX 970 4GB and it had 3.5GB, with the rest being slower RAM. Overclockers UK did offer me a refund, to be fair, but it's still a bit shady of Nvidia to only put in 3.5GB of fast RAM and not the full 4GB.
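    For anyone wondering why that last 512MB matters in practice, here's a back-of-the-envelope sketch using the widely reported segment bandwidths (roughly 196 GB/s for the 3.5GB partition and 28 GB/s for the 0.5GB one; treat those and the per-frame figure as assumptions):

    # Time to touch a chunk of data in each of the GTX 970's memory segments,
    # using the widely reported peak bandwidths (assumptions, not measurements).
    def ms_to_touch(megabytes: float, bandwidth_gb_s: float) -> float:
        return megabytes / 1024 / bandwidth_gb_s * 1000.0  # milliseconds

    per_frame_mb = 512  # hypothetical amount of texture data streamed in one frame
    print(f"fast 3.5GB segment: {ms_to_touch(per_frame_mb, 196):.1f} ms")  # ~2.6 ms
    print(f"slow 0.5GB segment: {ms_to_touch(per_frame_mb, 28):.1f} ms")   # ~17.9 ms
    # Once allocations spill into the slow segment, those extra milliseconds per
    # frame show up as the stutter described earlier in the thread.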
     

  16. Seketh

    Seketh Ancient Guru

    Messages:
    1,899
    Likes Received:
    6
    GPU:
    RX 580 8GB

    Well, they screwed Kepler users with The Witcher 3, for example. Kepler hadn't received specific optimizations since Far Cry 4, which was proven after all the community outrage and the eventual release of drivers containing optimizations that personally gave me up to ~10 fps of improvement in The Witcher 3. Both are GameWorks titles; what a coincidence, huh?

    And they even had the nerve not to include those optimizations in the first wave of Windows 10 Nvidia drivers, so I had to play The Witcher 3 with crippled performance.

    Even if you think that isn't screwing anyone, I still count the 970's 3584MB + 512MB memory setup as screwing every 970 owner. The boxes still say 4GB GDDR5.

    And thank you for the benchmarks you posted. Remember how the 280x was "equivalent" to the 770? And it was cheaper! Now look at it go. Just look how much value the 280x had and how ****ty the 770 is. Look at the benchmarks you posted!

    But I've already made up my mind. I'm going for the Polaris equivalent of a 380X or 390 when they launch, along with a FreeSync monitor. I've switched between AMD and Nvidia throughout the years, and the only real advantage Nvidia has is specific features in certain games, which are never worth the extra premium you pay. I honestly don't care if AMD performs 5 fps worse than a Pascal equivalent; I'm not falling for the bullsh*t again.
     
    Last edited: Feb 11, 2016
  17. Hughesy

    Hughesy Guest

    Messages:
    357
    Likes Received:
    1
    GPU:
    MSI Twin Frozr 980
    Bloody hell, are people still going on about that...

    PS: the recommended spec probably isn't for 1080p, more likely 4K. There's a huge gap between minimum and recommended; if it were a bad port, the minimum would be higher. Here's a secret some PC gamers miss: you can in fact turn settings down to get better performance; nobody forces you to max them out. And another thing: when a game doesn't push the hardware, people moan, and when a game does push the hardware, people moan. Can't win...
     
  18. Dygaza

    Dygaza Guest

    Messages:
    536
    Likes Received:
    0
    GPU:
    Vega 64 Liquid
    Indeed, a huge gap between minimum and recommended specs. I'm sure everyone will find settings that are fine for their hardware. And since it's DX12 with low CPU overhead, it's safe to assume from the CPU power required that the PC version is going to have some neat physics and other extras the console versions don't have. Interesting to see what will happen.
     
  19. Agonist

    Agonist Ancient Guru

    Messages:
    4,284
    Likes Received:
    1,312
    GPU:
    XFX 7900xtx Black
    If you think it will take that much to match the Xbox One, you are seriously mistaken.

    An overclocked 1st-gen i7 still outdoes the PS4 and Xbox One.

    Even the i5 4460 in my game stream server with a GTX 670 4GB does way better. It even runs Rise of the Tomb Raider at all-high settings @ 1080p above 40 fps.

    GTA5 really looks decent, but not that great. Car reflections are on point for the most part, but Project CARS and Assetto Corsa beat it easily in that regard.
    GTA5 on a Phenom X4 9950 @ 3.1 GHz, 8GB DDR2-800 and an HD 5770 1GB at 1440x900 on medium settings got 35 fps average, which was impressive.

    But I want to see games push our hardware to the limits. I'm not gonna be mad that my 2.5-year-old GPUs can't max the game out.
     
  20. Seketh

    Seketh Ancient Guru

    Messages:
    1,899
    Likes Received:
    6
    GPU:
    RX 580 8GB
    Things aren't black and white, unless you somehow consider XCOM 2 better looking than Rise of the Tomb Raider and The Witcher 3, because it really does perform worse than those two, and it's running on Unreal Engine 3.

    Or when you have crap performance in some game at minimum settings and yet your GPU+CPU usage is at 50%.

    What people want is balance and a game that scales well, and that's becoming increasingly rare. Rise of the Tomb Raider is the perfect counter to your argument. It's a demanding game that really just doesn't scale that well. Between medium and high I get a difference of about 5 fps for much worse graphical quality (tessellation disabled in both cases, because Maxwell, of course). Turn the graphics down to low and the game still won't reach 60 fps, yet it looks worse than the previous Tomb Raider. So what's the point of lowering visuals when the game doesn't scale properly?
     
    Last edited: Feb 11, 2016
