Death Stranding: PC graphics benchmark perf analysis review

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jul 14, 2020.

  1. Astyanax

    Astyanax Ancient Guru

    Messages:
    15,301
    Likes Received:
    6,266
    GPU:
    GTX 1080ti
    The 7950 is only a DX 11.3 device.
     
  2. Bobdole776

    Bobdole776 Member

    Messages:
    25
    Likes Received:
    4
    GPU:
    zotac amp extreme 1080ti
    My 1080ti is paired with a 3900x running the EDC=1 bug, and I get 125-144 fps on average at ultra settings, 1440p. I have yet to see it dip below 116 fps, but there are times when I'm not getting full GPU utilization, and that's because I'm on an AMD chip instead of Intel.

    Wonder how my old 5820k @ 4.6GHz would be doing right now? I need to buy a case so I can build another system with it and use it as a comparison PC to the 3900x build I have now.

    There is one thing I have to note though, and that's CPU scaling for this game. Lots of people are reporting it scales really well with high-thread-count CPUs, and I even see as many as 12 threads being used on my chip. Those with even more threads see just as good scaling for their chips as well, even Threadrippers!
     
    XenthorX likes this.
  3. Stormyandcold

    Stormyandcold Ancient Guru

    Messages:
    5,838
    Likes Received:
    441
    GPU:
    RTX3080ti Founders
    Until the next-gen consoles release, I think the GTX 970 is a very relevant card; it was often featured in console vs PC comparisons. Considering Death Stranding was originally a console release, I think these gfx card requests and results are welcome.

    However, I do think that by the end of 2021 many gfx cards will start to fall off the charts, replaced by the next 1 to 1.5 generations of new cards.
     
  4. XenthorX

    XenthorX Ancient Guru

    Messages:
    4,557
    Likes Received:
    2,730
    GPU:
    4090 Suprim X
    A good overclocked 5820k is a gem, I'm not biased at all <.<
     

  5. JoeyR

    JoeyR Member

    Messages:
    18
    Likes Received:
    8
    GPU:
    2x eVGA GTX 980Ti's
    Awesome write-up Hilbert!
    I still have a few 970s boxed up/retired, but it's really pretty impressive that these cards can still pump out playable framerates even at 2560x1440! That 4GB of VRAM is pretty much on its last legs though.
     
  6. Undying

    Undying Ancient Guru

    Messages:
    21,158
    Likes Received:
    9,456
    GPU:
    Devil RX6700XT 12GB
    Some DX12 features are missing on all GCN 1.0 cards, so they cannot run the game. Idk about Nvidia Kepler cards.
     
  7. Astyanax

    Astyanax Ancient Guru

    Messages:
    15,301
    Likes Received:
    6,266
    GPU:
    GTX 1080ti
    I think it's more the fact that the legacy drivers broke some DX12 titles than the fact that it's 11.3 only.
     
  8. Supertribble

    Supertribble Master Guru

    Messages:
    945
    Likes Received:
    162
    GPU:
    Noctua 3070/3080 FE
    The game is surprisingly heavy on performance at 4K imo. Given the vast gulf in perf from the PS4 Pro to a 1080ti-class card, it should be running comfortably at 4K/60. I watched the Digital Foundry video about the PC version, and it took the 2080ti to just about stay above 60 at 4K; that seems quite ridiculous to be honest! DLSS 2.0 to the rescue, then, for those with RTX cards, but yeah, something isn't right, and if I didn't have the 2070S to fall back on I'd be cheesed off. The game doesn't allow custom screen resolutions, so it's either being stuck at 1440p or having poor performance at 4K on Pascal and AMD. Really not good enough.
     
  9. JonasBeckman

    JonasBeckman Ancient Guru

    Messages:
    17,563
    Likes Received:
    2,960
    GPU:
    XFX 7900XTX M'310
    Sometimes the console settings are below the lowest available PC settings, plus scaling isn't always linear. A few games also scale other effects with resolution, like running shaders at 1/2 or 1/4 of the output resolution, so effects like ambient occlusion or depth of field have a significantly different performance profile above 1920x1080, for example. :)
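    To put quick numbers on that fractional-resolution point, here's a sketch (the 1/2 and 1/4 scales are just illustrative examples, not confirmed values for this game):

    Code:
    # Pixel cost of a post effect rendered at a fraction of the output resolution.
    # Illustrative only; the scale factors are examples, not Death Stranding's.
    RESOLUTIONS = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}

    def effect_pixels(w, h, scale):
        """Pixels shaded per frame for an effect at `scale` of the output res per axis."""
        return int(w * scale) * int(h * scale)

    for name, (w, h) in RESOLUTIONS.items():
        half = effect_pixels(w, h, 0.5)      # e.g. half-res ambient occlusion
        quarter = effect_pixels(w, h, 0.25)  # e.g. quarter-res depth of field
        print(f"{name}: full={w * h:,}  half={half:,}  quarter={quarter:,}")

    A quarter-res effect at 3840x2160 still shades as many pixels as a half-res one at 1920x1080, so those "cheap" effects get a lot less cheap above 1080p.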

    Then there's overhead from the OS, other programs and the drivers, plus bugs such as the Steam API and input issues for this particular game. The game has patched some of that by now, it seems, and Valve also has a beta client update where they seemingly optimized the CPU calls in their input code. For whatever reason it took this game to hit those problems before Valve optimized things on their end, but eh, it got improved at least. :D
    (There are a lot of fun little ways to really break input handling and cause all sorts of performance issues, it seems; a mouse poll rate above default can affect a number of titles, for another example, and then there are gamepads/controllers and the APIs for handling these.)


    While it's probably optimized, the game's use of vegetation and alpha transparency, and perhaps some sorting on top of that, will also really affect performance as the display resolution increases. Horizon Zero Dawn on console uses some pretty heavy culling for anything out of the camera, and probably a good system for masking or minimizing pop-in and LOD transitions too.

    Someone would have to break it down more and do a deeper analysis. What some of the newer game engines are actually doing can get pretty clever and also pretty complex: how they handle all this data of various sorts, shaders, geometry, lighting and the various passes, what gets drawn and when, and all the ways this can completely break or need to be adjusted to minimize overhead or maximize the GPU workload. There are recommended practices too, including many from NVIDIA and also AMD, on how to best utilize their GPU strengths and APIs such as D3D10, 11 and 12, and now Vulkan. :)

    But yeah, the settings probably push the console values pretty far, and that's already going to tax GPU and CPU performance by extending draw distances and shadow resolution. That's without going over 1920x1080, and without considering how the game handles even that, such as upscaling, checkerboarding or just natively supporting it, and then higher resolutions via downsampling or other methods of pushing into 2560, 3200 or 3840 for the PS4 Pro and Xbox One X models.
    (3840x2160 being 4x 1920x1080, a lot more is rendered per frame just from the pixel count alone, with the GPU having to work that much harder.)



    EDIT: Although again, someone would have to capture and break down the entire process for a frame the game renders, each step that goes on, whether that's optimized or not, what issues there might be, and then the additions over the console version and all that.

    Some articles go into this kind of depth, but not too many. Digital Foundry sometimes gets some good stuff too, especially if the developers are around to answer some of the tech questions directly, but that's not always possible.
     
  10. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,099
    Likes Received:
    946
    GPU:
    Inno3D RTX 3090
    The game I saw seemed to be running just fine. As @JonasBeckman said, there are things that are done/can be done on a console that are not possible on a more generic architecture like the PC; that's usually memory management techniques.
    Also, the PS4 Pro runs this game checkerboarded, which means it really renders 1920x2160, so ~4 million pixels instead of ~8 million for real 4K, and at 30 fps at that. Suddenly, for 4K60 you need 4x the raw power, and that's ~17 TFLOPs, which the 2080Ti doesn't have.
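    To put rough numbers on that (the ~4.2 TFLOPs PS4 Pro baseline is my assumption, and perfectly linear scaling is the big if):

    Code:
    # Back-of-envelope sketch of the checkerboard argument above.
    # Assumption: PS4 Pro GPU at ~4.2 TFLOPs FP32 as the baseline.
    PS4_PRO_TFLOPS = 4.2

    cb_pixels = 1920 * 2160        # checkerboarded frame: ~4.1M pixels
    native_4k = 3840 * 2160        # native 4K frame: ~8.3M pixels

    pixel_factor = native_4k / cb_pixels  # 2x the pixels per frame
    fps_factor = 60 / 30                  # 2x the frame rate

    needed = PS4_PRO_TFLOPS * pixel_factor * fps_factor
    print(f"Naive requirement for native 4K60: ~{needed:.1f} TFLOPs")  # ~16.8

    The 2080Ti is ~13.4 TFLOPs FP32, hence the claim that it's short on raw shading power, if you grant that linear-scaling assumption.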
     

  11. Supertribble

    Supertribble Master Guru

    Messages:
    945
    Likes Received:
    162
    GPU:
    Noctua 3070/3080 FE
    Hmm. The PC version doesn't really offer many extra settings over the console; I think there is one setting for LODs? I can't think of a single game where the 2080ti would be borderline at 4K using console-equivalent settings. Even the most egregious examples I can think of, like Assassin's Creed Odyssey, would be doable. Most console games run at 30fps, and it is standard to expect the PC versions to run at 4K/60 using cutting-edge hardware. Where that becomes a challenge is when the PC version offers much improved image quality settings over the console, so 4K then becomes much more demanding. Death Stranding doesn't offer that, so at default settings it is more apples to apples. Sure, the PS4 Pro has some custom hardware to help with checkerboarding, but considering the final upscaled resolution in relation to the internal resolution, it is scaling linearly or thereabouts with GPU power.
     
  12. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,099
    Likes Received:
    946
    GPU:
    Inno3D RTX 3090
    That would work if the console games weren't optimized to the brim already. If we suppose that Death Stranding brings the PS4 Pro to its knees (and judging by the fan, it does quite often), then you really have the disparity of having to shade 4x the pixels per second if you want 4K60. In that case the 2080Ti doesn't have enough shading power to do it. It doesn't seem so weird to me, to be honest. If the PC was also checkerboarded, then sure.
     
  13. Supertribble

    Supertribble Master Guru

    Messages:
    945
    Likes Received:
    162
    GPU:
    Noctua 3070/3080 FE
    My brain is fried, but I'm still unconvinced by the metrics required, plus I think the console super-secret sauce is overstated. I never felt like I needed an abundance of additional power to run console games beyond linear GPU power scaling; performance has typically been where I expected it to be, factoring in the power of the GPU being used. There is also an assumption that any one game is maxing out the console all the time. With Death Stranding it does become GPU limited in some scenarios, specifically cutscenes, but in gameplay there might be more performance available if the frame rate were uncapped.
     
  14. psolord

    psolord Active Member

    Messages:
    62
    Likes Received:
    11
    GPU:
    3060ti/1070/970/++
    Hello everybody!

    Following my 2500k+970 and 8600k+1070 test videos above, I did two new ones, if anyone cares. This time with the 8600k+970 and the 2500k+1070.

    Things I noticed: the 970 now performs quite a bit better with the 8600k, reaching ~75fps with fewer drops. Also, the 2500k performs much better with the 1070 than it did with the 970, meaning the system was not severely CPU limited before. My guess is that, due to the combination of the 970's 3.5GB of VRAM and the 2500k's PCIe 2.0 bus, the system has many delays sending data back and forth over the PCIe 2.0 bus. On the 8600k, however, due to its PCIe 3.0 and overall faster performance of course, this hindrance does not appear very often.

    On the other hand, the 2500k, when paired with the 1070, gives 85-90fps and it's not even maxed out, since I have it running at only 4.3GHz, the same clock as my 8600k with multi-core enhancement. So my guess here is that, since the 1070 has 8GB, the system has far less data to keep sending to the GPU over the PCIe 2.0 bus, so that bottleneck mostly disappears. If you have any input on this, I'd like to hear it.
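    For reference, the theoretical peak link bandwidths behind that guess (real-world transfer rates are lower, and the 970's slow 0.5GB segment is its own story):

    Code:
    # Approximate one-direction PCIe bandwidth: GT/s * lanes * encoding efficiency / 8.
    def pcie_gb_s(gt_per_s, lanes, encoding):
        return gt_per_s * lanes * encoding / 8

    gen2_x16 = pcie_gb_s(5.0, 16, 8 / 10)     # PCIe 2.0 uses 8b/10b encoding
    gen3_x16 = pcie_gb_s(8.0, 16, 128 / 130)  # PCIe 3.0 uses 128b/130b encoding
    print(f"PCIe 2.0 x16 ~{gen2_x16:.1f} GB/s vs PCIe 3.0 x16 ~{gen3_x16:.1f} GB/s")
    # -> ~8.0 GB/s vs ~15.8 GB/s

    So if the 970's VRAM is overflowing and assets have to stream across the bus every frame, the 2500k's link has roughly half the bandwidth to do it with, which would fit the drops I saw.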

    Whatever the case, a CPU from January 2011 running at 85-90fps, in a game that is 30fps on consoles, is great. Can't wait to see what it can do in Horizon.

    Here are the new vids. Again, no clickbaiting, non-monetized/sponsored etc., just for fun, as it has been since 2006.

    8600k+970 (video embed)

    2500k+1070 (video embed)

     
    Undying and PrMinisterGR like this.
  15. DannyD

    DannyD Ancient Guru

    Messages:
    5,242
    Likes Received:
    3,140
    GPU:
    evga 2060
    I'm a delivery driver in the real world; there isn't a game I'd less like to play right now lol.
    Great work by the boss though.
     

  16. Supertribble

    Supertribble Master Guru

    Messages:
    945
    Likes Received:
    162
    GPU:
    Noctua 3070/3080 FE
    There seems to be a repeatable bug involving either DLSS or the depth of field. I noticed the image becoming softer after a while when switching from native screen resolution to DLSS. At first I thought it was DLSS becoming inconsistent over time, but now I think that when switching, the depth of field gets turned off and then turns back on when the scene switches (a cutscene or some other scene change). Weird and quite annoying tbh.
     
