Crytek's "Hunt Showdown" seems to have a massive improvement in 1% Lows with 5950X versus 5800X

Discussion in 'General Hardware' started by BlindBison, Jul 18, 2022.

  1. BlindBison

    BlindBison Maha Guru

    Test #1 -- 5800X + 3070 Max Settings

    Test #2 -- 5950X + 6900XT

    I think I might be misunderstanding this data. If I had to guess, the massive difference in 1% lows this game sees is down to the CPU, since the 5950X has twice as many cores and threads and double the cache (albeit split between two CCXs). Hunt is known to scale very well with cores/threads (better than most games, from what I gather), and if I remember correctly the devs said at one point that the game can make use of 16 cores (though not equally, I imagine -- it's still DX11, and a disproportionate amount of the load sits on the main render thread if I've understood correctly).

    Now, I could not find this game tested by the major outlets (Gamers Nexus, Hardware Unboxed, Digital Foundry outside of their initial video years ago). I had to look for smaller channels that ran their own single benchmarks, and what I found really surprised me, since you often hear outlets like HU claim that core and thread count don't matter much for games beyond a certain point (6 cores/12 threads being the current "sweet spot," it seems).

    But on the other hand, you hear outlets like "Tech Deals" make videos like this one () where they claim that more cores and threads can dramatically improve 1% lows (they also claim games do better when they can spread across physical cores before having to fall back on hyperthreading, if I've understood correctly). I've also heard HU claim it's more about the cache difference in higher-end CPUs than the difference in cores/threads, so I'm not really sure what to believe anymore.

    The other thing that struck me is that maybe it's not the CPU causing this difference so much as the difference in GPUs. Some games, especially ones without RT, seem to favor AMD's most recent hardware. The 6900XT also has significantly more VRAM (16 gigs of slower memory, but with a special on-die cache, versus the 3070's 8 gigs of faster VRAM without that cache system, iirc). Maybe that's the real difference -- either Hunt favoring AMD hardware, or the huge gap in total VRAM, perhaps. Again, I'm not sure. The game is DX11, though, and it's commonly said that Nvidia has the superior DX11 driver overall.

    I had hoped to find a video comparing the two CPUs with the same GPU and rest of system, but I couldn't. The whole reason I was searching in the first place is that I've noticed Hunt Showdown generally seems to run well, but has some very nasty 1% lows at times. With the framerate uncapped on my G-Sync 1440p 144 Hz panel (5800X + 3070, 16 gigs of 3600 MHz DDR4, NVMe drive), the game often doesn't really look "smooth" and occasionally has very nasty downward 1% low stutters in my experience. This also happened on my 3900X + 2080 Super system, so capping the framerate, or bringing the overall framerate down via DLDSR, has been the way forward for me.

    Do you think it's the difference in CPU causing this? Or the GPU? Or both to a degree? I've often heard it said that a better CPU can really smooth out frametimes in games and particularly help with the occasional 1% lows, so I had figured it was that. But if that's so, why do most outlets claim the 5600X versus 5950X difference is only minimal? (The major outlets don't include this game in their CPU tests, so perhaps most games just don't see the same kind of uplift.) Have I misunderstood something? Thanks,
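For anyone comparing numbers across channels, it's worth keeping in mind that "1% low" isn't computed identically everywhere: some tools report the FPS at the 99th-percentile frametime, others average the slowest 1% of frames. Here's a minimal Python sketch of the percentile form, using a made-up frametime list (function name and sample values are my own, not from any specific tool):

```python
def one_percent_low_fps(frametimes_ms):
    """Return (average FPS, 1% low FPS) from a list of frametimes in ms.

    Uses the 99th-percentile-frametime definition; tools like CapFrameX
    or PresentMon-based overlays may compute the metric slightly differently.
    """
    if not frametimes_ms:
        raise ValueError("need at least one frametime sample")
    times = sorted(frametimes_ms)
    # Average FPS: total frames over total elapsed time.
    avg_fps = 1000.0 * len(times) / sum(times)
    # Pick the frametime slower than 99% of all frames.
    idx = min(len(times) - 1, int(0.99 * len(times)))
    low_fps = 1000.0 / times[idx]
    return avg_fps, low_fps

# Hypothetical capture: 99 smooth frames at ~6.9 ms (~144 FPS)
# plus a single 50 ms hitch.
sample = [6.9] * 99 + [50.0]
avg, low = one_percent_low_fps(sample)
```

The point the example makes: a single hitch barely moves the average (still well above 130 FPS here) but drags the 1% low down to 20 FPS, which is why the stutter described above shows up in the lows long before it shows in average framerate.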
