Review: Strange Brigade: PC graphics card benchmark analysis review

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Aug 31, 2018.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    48,388
    Likes Received:
    18,558
    GPU:
    AMD | NVIDIA
    warlord, Thunk_It and -Tj- like this.
  2. gerardfraser

    gerardfraser Guest

    Messages:
    3,343
    Likes Received:
    764
    GPU:
    R9 290 Crossfire
I have not heard of the game either, but it looks like something I would like to play. Thanks for the review.
     
  3. cryohellinc

    cryohellinc Ancient Guru

    Messages:
    3,535
    Likes Received:
    2,974
    GPU:
    RX 6750XT/ MAC M1
    Great optimization there! How's the game btw?
     
  4. fatboyslimerr

    fatboyslimerr Member

    Messages:
    15
    Likes Received:
    3
    GPU:
    NVIDIA RTX 2060
Maybe it's just me, but I thought DX12 and, to an extent, Vulkan were both able to leverage more cores and weren't just about raw GHz. Less CPU overhead, or at least more evenly distributed CPU overhead, something like that.
It would be more interesting to see these APIs tested on, say, a 1060, 1080, 570 and Vega 64 paired with an FX 8300, a Ryzen 5 and something like an i7 2600, versus your standard test bench.
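The "more cores" point comes down to command recording being parallelizable in DX12/Vulkan: each thread fills its own command list with no shared driver lock, and submission happens once, in a fixed order. A minimal sketch of that threading pattern, using a hypothetical `CommandList` stand-in (not a real API object, just strings standing in for recorded draw calls):

```cpp
#include <string>
#include <thread>
#include <vector>

// Hypothetical stand-in for an API command list (think ID3D12GraphicsCommandList
// or VkCommandBuffer); here it just accumulates recorded "commands" as strings.
struct CommandList {
    std::vector<std::string> commands;
    void record(const std::string& cmd) { commands.push_back(cmd); }
};

// Each worker records into its OWN command list, so no locking is needed
// while recording -- that is the core of the D3D12/Vulkan threading model.
std::vector<CommandList> record_in_parallel(int num_threads, int draws_per_thread) {
    std::vector<CommandList> lists(num_threads);
    std::vector<std::thread> workers;
    for (int t = 0; t < num_threads; ++t) {
        workers.emplace_back([&lists, t, draws_per_thread] {
            for (int d = 0; d < draws_per_thread; ++d)
                lists[t].record("draw " + std::to_string(t) + ":" + std::to_string(d));
        });
    }
    for (auto& w : workers) w.join();
    // The caller submits these to the GPU queue afterwards, in order, from one
    // thread -- so frame output is deterministic despite concurrent recording.
    return lists;
}
```

Under D3D11 most of this work funnels through one immediate context instead, which is why the older API leans so heavily on single-thread clock speed.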
     

  5. Brisse

    Brisse Active Member

    Messages:
    99
    Likes Received:
    2
    GPU:
    na
IMO you are not praising Vulkan highly enough here. It is an open and portable standard, unlike D3D12, which is proprietary, locked-down crap. The fact that the developers implemented it alongside D3D12 and that it performs on par is awesome for those of us who don't use Windows. While this game has no Linux port as far as I know, it would still run great on Linux using compatibility layers like Proton or Wine. With Vulkan there would be pretty much zero extra overhead, since the graphics API doesn't have to be translated by the compatibility layer. A shame it has Denuvo though, as its rootkit-like functionality often causes trouble for compatibility layers. Here's to hoping they remove that crap ASAP.

    Doom 2016 is a great example of how awesome compatibility layers can be when using Vulkan:
     
  6. Raider0001

    Raider0001 Master Guru

    Messages:
    521
    Likes Received:
    45
    GPU:
    RX 7900 XTX Nitro+
You do not really need that much CPU power for games; these are not that complicated tasks (unlike, say, calculating protein folding). You just need enough to feed the graphics card's driver at the performance it is capable of, and then some for the AI, sound and engine.
     
  7. Agonist

    Agonist Ancient Guru

    Messages:
    4,284
    Likes Received:
    1,312
    GPU:
    XFX 7900xtx Black
HOLY S***, look at the Fury X go at 4K. Even the 1080p and 1440p results are really good.
     
  8. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
The frametime graphs read microseconds...
     
  9. Brisse

    Brisse Active Member

    Messages:
    99
    Likes Received:
    2
    GPU:
    na
Good proof that 4 GiB is enough, despite seeing higher memory usage on cards with more than 4 GiB.
     
  10. JonasBeckman

    JonasBeckman Ancient Guru

    Messages:
    17,564
    Likes Received:
    2,961
    GPU:
    XFX 7900XTX M'310
    Deferred render contexts for one thing.

    https://docs.nvidia.com/gameworks/c...s/d3d_samples/d3d11deferredcontextssample.htm

For AMD I'm not too sure how that works at the moment; it doesn't seem to be immediately doable, but then again they do support multi-threaded rendering as well, so I have a lot more to learn on the subject. :)
    https://github.com/GPUOpen-LibrariesAndSDKs/AGS_SDK/issues/20
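For what it's worth, the deferred-context idea from that NVIDIA sample boils down to this call flow: worker threads queue state and draw calls on deferred contexts, then the single immediate context replays the finished command lists. A sketch with hypothetical stand-in types that only mirror the shape of `FinishCommandList`/`ExecuteCommandList` (none of this is real D3D11):

```cpp
#include <string>
#include <vector>

// An immutable, recorded command list (stand-in for ID3D11CommandList).
struct CommandListSnapshot { std::vector<std::string> calls; };

// Stand-in for a deferred ID3D11DeviceContext: draw calls are queued,
// not executed, until FinishCommandList() snapshots them.
struct DeferredContext {
    std::vector<std::string> pending;
    void draw(const std::string& what) { pending.push_back(what); }
    CommandListSnapshot FinishCommandList() {
        CommandListSnapshot cl{pending};
        pending.clear();
        return cl;
    }
};

// Stand-in for the single immediate context: only it "talks to the GPU".
struct ImmediateContext {
    std::vector<std::string> executed;
    void ExecuteCommandList(const CommandListSnapshot& cl) {
        for (const auto& c : cl.calls) executed.push_back(c);
    }
};

// Each pass could be recorded on its own thread; the main thread then
// executes the finished lists on the immediate context in a fixed order.
ImmediateContext render_frame() {
    DeferredContext shadows, geometry;
    shadows.draw("shadow pass");
    geometry.draw("opaque pass");
    geometry.draw("transparent pass");
    CommandListSnapshot a = shadows.FinishCommandList();
    CommandListSnapshot b = geometry.FinishCommandList();
    ImmediateContext imm;
    imm.ExecuteCommandList(a);
    imm.ExecuteCommandList(b);
    return imm;
}
```

The catch with real D3D11 is that the driver often serializes this internally anyway, which is why D3D12/Vulkan command lists scale so much better.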

I do know a lot of games create render threads though: Final Fantasy XV ideally wants a six-core processor, Assassin's Creed Origins creates a whopping 8 of them, and Monster Hunter World has a bug that makes it create 32 unless restrained, though as I understand it those aren't strictly just render threads.

And it's improved, or even built into, DirectX 12 and Vulkan already without extras or add-on bits. Used well it should improve overall CPU utilization by spreading the work across more cores, but games like AC:O don't scale down very well, so it's 8 threads or nothing: quad cores see a bit of stuttering, while hexa cores do alright in most situations even though CPU usage is pretty high. (The game also spends a lot of resources on the WMI thread for some undetermined reason.)


Though I suppose technically you would still be correct: it's less about raw power (clock speed) and more about fully using the additional cores, logical or physical, on modern processors. It doesn't hurt to have both though, especially with modern PC ports being a bit wonky, and that's being generous for how some of them turned out, ha ha.
(Although a few are also genuinely demanding and push hardware to its limits, adding various enhancements in the process.)

I suppose efficiency doesn't hurt either; depending on extensions and the type of instruction, modern processors can be quite a bit faster. Overall a GPU limit is still the most common, but CPU and RAM certainly help too, sometimes surprisingly so. :D
     
    fatboyslimerr likes this.

  11. -Tj-

    -Tj- Ancient Guru

    Messages:
    18,097
    Likes Received:
    2,603
    GPU:
    3080TI iChill Black
    Good to see it released.

Runs great too, but the price is a bit steep atm; I'll wait a bit unless I get a good deal.
And I see they jumped on that BS DLC season-pass bandwagon too :(


About threading, their engine was never really optimized for multicore, although things have changed since AvP 2010; Nazi Zombie Army was a lot better.


And from what I saw it's the same vibe now, nothing bad, but hiring the same voice actor for the story is a bit meh... it always reminds me of the NZA series.
     
  12. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    48,388
    Likes Received:
    18,558
    GPU:
    AMD | NVIDIA
Guys, the frame time and pacing plots have been updated. I made a mistake and used the wrong data set (totally stupid), and that propagated into the plots. This has been fixed now!
     
  13. Venix

    Venix Ancient Guru

    Messages:
    3,440
    Likes Received:
    1,944
    GPU:
    Rtx 4070 super
I have a feeling this is going to be the new Ashes of Benchularity! Great performance review, HH!
     
  14. AlRayes_BRN

    AlRayes_BRN Guest

    Messages:
    12
    Likes Received:
    3
    GPU:
    EVGA RTX 2080 Ti
This game is awesome, I'm having a lot of fun playing it!
     
  15. Yakk

    Yakk Guest

    Messages:
    144
    Likes Received:
    21
    GPU:
    Graphics card
    Well optimized game, and the Fury X works really, really well with Async.

Older cards like the R9 290X, and to a lesser degree even the R9 280X, should also see surprisingly good results and punch way above what would be expected, leaving their old competition behind.
     

  16. Paulo Narciso

    Paulo Narciso Guest

    Messages:
    1,226
    Likes Received:
    36
    GPU:
    ASUS Strix GTX 1080 Ti
  17. BReal85

    BReal85 Master Guru

    Messages:
    487
    Likes Received:
    180
    GPU:
    Sapph RX 570 4G ITX
I know it is an AMD-supported title, but look at how the AMD cards destroy their NV competitors: RX 580 vs GTX 1060, Vega 56 vs GTX 1070. The Vega 56 even tops the 1080.
     
  18. Yakk

    Yakk Guest

    Messages:
    144
    Likes Received:
    21
    GPU:
    Graphics card
Yeah, I believe AMD worked with Rebellion on their Asura graphics engine, and it seems to run really smoothly.
     
  19. Kaarme

    Kaarme Ancient Guru

    Messages:
    3,513
    Likes Received:
    2,355
    GPU:
    Nvidia 4070 FE
    It's pretty funny how much it matters. But naturally Nvidia still claims the absolute crown with their top cards. I reckon the RTX cards would be quite tough as well.
     
  20. Keesberenburg

    Keesberenburg Master Guru

    Messages:
    886
    Likes Received:
    45
    GPU:
    EVGA GTX 980 TI sc
Does Nvidia have async compute?
     
