Review: Assassin's Creed: Valhalla graphics performance benchmarks and analysis

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Nov 12, 2020.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    48,388
    Likes Received:
    18,558
    GPU:
    AMD | NVIDIA
    Ryeo, tunejunky, Maddness and 4 others like this.
  2. Kool64

    Kool64 Ancient Guru

    Messages:
    1,656
    Likes Received:
    783
    GPU:
    Gigabyte 4070
    Great review, HH. It's interesting to see what "financially backing" a game engine can do for graphics performance.
     
  3. SpajdrEX

    SpajdrEX Ancient Guru

    Messages:
    3,399
    Likes Received:
    1,653
    GPU:
    Gainward RTX 4070
    I got confused by the page title >
    Assassins Creed: Valhalla graphics perf benchmark review - RTX - DLSS 2.0 Perf - Quality
    :D Does the game support DLSS? :p
     
  4. Undying

    Undying Ancient Guru

    Messages:
    25,330
    Likes Received:
    12,743
    GPU:
    XFX RX6800XT 16GB
    lol, that 5700XT is faster than it has any right to be. :D
     

  5. JonasBeckman

    JonasBeckman Ancient Guru

    Messages:
    17,564
    Likes Received:
    2,961
    GPU:
    XFX 7900XTX M'310
    Curious too, as this shouldn't be utilizing the full range of RDNA features the way Horizon did, but the implemented AMD extensions could still cover more than FreeSync 2 HDR support.
     
  6. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    48,388
    Likes Received:
    18,558
    GPU:
    AMD | NVIDIA
    Ah template residual, fixed. It could certainly use DLSS support though :)
     
  7. JonasBeckman

    JonasBeckman Ancient Guru

    Messages:
    17,564
    Likes Received:
    2,961
    GPU:
    XFX 7900XTX M'310
    A solid upscaling option wouldn't hurt. I think the game has two modes, but I'm not 100% sure it keeps the same setup the two prior Assassin's Creed games on this engine used.
    Adaptive resolution scaling is now its own setting, and below "High" the anti-aliasing quality renders sub-native, so this setting looks like it has a higher-than-average performance impact when it's really two separate components: the scaling and then the TAA itself.

    Because that has changed, I need to see if there's actual confirmation of how this game implements it.

    EDIT: Of course, since it's TAA, just checking whether the image looks soft doesn't really work; TAA softens the image and can't really be disabled. :D
    (Well, it should still stand out if it's really soft from upscaling the final image, I'd imagine.)

    EDIT: Actually, with scaling as a separate option, and both above and below 100% render resolution being available, I would think TAA on Low or Medium is now entirely separated from modifying the back-buffer resolution or how it is scaled back.
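    Roughly what I mean, as a minimal sketch of my own (assuming the scale percentage simply sets the internal back-buffer size before the TAA resolve; this is not the game's actual code and the function name is made up):

    ```python
    # Treat adaptive resolution scaling and TAA as two separate stages:
    # the render scale only changes the internal back-buffer size,
    # and the TAA pass then resolves that back to the native output size.
    def internal_resolution(native_w, native_h, render_scale_pct):
        """render_scale_pct is the in-game percentage, e.g. 85 for 85%."""
        s = render_scale_pct / 100.0
        return round(native_w * s), round(native_h * s)

    # Example: 4K output at an 85% render scale draws internally at ~3264x1836,
    # and the TAA pass then upscales/resolves that to 3840x2160.
    print(internal_resolution(3840, 2160, 85))   # -> (3264, 1836)
    ```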
     
  8. lukas_1987_dion

    lukas_1987_dion Master Guru

    Messages:
    701
    Likes Received:
    167
    GPU:
    RTX 4090 Phantom GS
    2560x1440, all ultra settings, 100% res scale: I get 75 fps average on a 2080 Ti, with drops to 60 fps in cities.
    Surprising to see how well the 5700 XT is doing here; I wonder how well the 6800 XT will perform then...
     
  9. Dragam1337

    Dragam1337 Ancient Guru

    Messages:
    5,535
    Likes Received:
    3,581
    GPU:
    RTX 4090 Gaming OC
    No, and good riddance.
     
  10. Loophole35

    Loophole35 Guest

    Messages:
    9,797
    Likes Received:
    1,161
    GPU:
    EVGA 1080ti SC
    1080ti still chugging along.

    AMD-sponsored game. I'm sure there will be a little performance found, at least for Ampere, in the next few drivers. However, my Pascal performance is probably what I'm gonna get.
     
    HARDRESET likes this.

  11. kanenas

    kanenas Master Guru

    Messages:
    512
    Likes Received:
    385
    GPU:
    6900xt,7800xt.
    We should state that Assassin's Creed: Valhalla is an AMD sponsored title
    Overall we say that AMD benefits from the game the most, as it should as AMD financially backs the game.

    With this title, we'll also move towards a new test platform, based on AMD Ryzen 9 5950X, currently the fastest gaming processor your money can get you. ;)
    It starts :)
    gg GURU3D
     
  12. Dragam1337

    Dragam1337 Ancient Guru

    Messages:
    5,535
    Likes Received:
    3,581
    GPU:
    RTX 4090 Gaming OC
    The 6800 XT will perform relatively worse the higher the resolution, due to the 256-bit bus and "only" having regular GDDR6. So it will knock it out of the park at 1080p, and likely be somewhat slower than the 3090, and possibly the 3080, at 4K.
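    For scale, some quick napkin math on the raw numbers (my own arithmetic from the public specs; note it ignores AMD's Infinity Cache, which is meant to offset the narrower bus):

    ```python
    # Peak memory bandwidth in GB/s = bus width in bits / 8 * data rate in Gbps.
    def bandwidth_gbs(bus_width_bits, data_rate_gbps):
        return bus_width_bits / 8 * data_rate_gbps

    print(bandwidth_gbs(256, 16))    # RX 6800 XT: 256-bit GDDR6  @ 16 Gbps   -> 512 GB/s
    print(bandwidth_gbs(320, 19))    # RTX 3080:   320-bit GDDR6X @ 19 Gbps   -> 760 GB/s
    print(bandwidth_gbs(384, 19.5))  # RTX 3090:   384-bit GDDR6X @ 19.5 Gbps -> 936 GB/s
    ```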
     
    Lily GFX likes this.
  13. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    48,388
    Likes Received:
    18,558
    GPU:
    AMD | NVIDIA
    :) Not saying a thing here, but sometimes I am so proud of the Guru3D community.
     
    gmavignier, Titan29, Lily GFX and 4 others like this.
  14. JonasBeckman

    JonasBeckman Ancient Guru

    Messages:
    17,564
    Likes Received:
    2,961
    GPU:
    XFX 7900XTX M'310
    Wonder what the Infinity Fabric combined with a 300-bit-or-wider bus could do. Sure, there's full 512-bit, but then it all gets complicated with pricing and whatnot; combining all of these would have been interesting to see, though 384-bit, or what's it called, 448-bit, might have been the top if AMD was going that far for the enthusiast model, as I don't believe AMD has gone 512-bit since the ring-bus attempt with the 290-series GPUs.

    Add an HBM2E-type memory while we're thinking about things that won't happen; best case, AMD maybe uses that for the professional GPU lineup, which I think is the rumored CDNA architecture replacing GCN, eventually.
    A bit under a week until reviews of these, and some good, more demanding titles to test them with too. :D

    EDIT: I suppose GDDR6 and even GDDR6X have a range of speeds available, plus improvements since their initial use on GPUs, but I don't think AMD is using the top-end chips, same as NVIDIA likely isn't using the fastest GDDR6X modules, at least for now.

    It just comes back to what the benchmark results will be, and then the overclocking results, if there's any headroom for long-term stability when pushing past stock speeds.
    It was a bit iffy for the 5000 series due to the VRAM chips, with a mix between Samsung and Hynix I think it was, plus the memory controller and how the GPU handled all of that.
    (Not well at all until 19.8.1, from what I recall, and various issues since, given how sensitive these are.)
     
  15. H83

    H83 Ancient Guru

    Messages:
    5,465
    Likes Received:
    3,002
    GPU:
    XFX Black 6950XT
    This almost seems like a hint for the upcoming review of the 6000 series... :eek:
     
    tunejunky likes this.

  16. Supertribble

    Supertribble Master Guru

    Messages:
    978
    Likes Received:
    173
    GPU:
    Noctua 3070/3080 FE
    AMD have sponsored Ubisoft titles in the past with no obvious indication that their cards performed abnormally better than Nvidia's, but the sponsorship seems to have, belatedly, paid off. Nvidia cards are struggling here, and the performance the 5700 XT is putting out bodes well for the 6000 series.
     
    lukas_1987_dion likes this.
  17. Dragam1337

    Dragam1337 Ancient Guru

    Messages:
    5,535
    Likes Received:
    3,581
    GPU:
    RTX 4090 Gaming OC
    Well, the only option if they want to maintain the 16 GB VRAM configuration is 512-bit. It would be more expensive, sure... but not THAT much more expensive. So I would personally have expected at least the 6900 XT to get a 512-bit bus, given its 1k USD price point and its use of much cheaper GDDR6 memory. The 6900 XT with a 512-bit bus would likely have destroyed the 3090 at 4K, rather than possibly being a bit behind the 3090 at 4K.
    AMD haven't done a GPU with a 512-bit bus since the 390, but I don't see any reason why they couldn't.

    HBM2 usually fares worse than GDDR6 in games due to higher latency, and latency is king in games, which is also seen with Intel vs AMD, where Intel has traditionally had substantially lower memory latency.
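    On paper, the same napkin math (my arithmetic, purely a what-if, not a real SKU) puts such a 512-bit GDDR6 card ahead of the 3090 in raw bandwidth:

    ```python
    # Peak memory bandwidth in GB/s = bus width in bits / 8 * data rate in Gbps.
    def bandwidth_gbs(bus_width_bits, data_rate_gbps):
        return bus_width_bits / 8 * data_rate_gbps

    print(bandwidth_gbs(512, 16))    # hypothetical 512-bit GDDR6  @ 16 Gbps   -> 1024 GB/s
    print(bandwidth_gbs(384, 19.5))  # RTX 3090,    384-bit GDDR6X @ 19.5 Gbps ->  936 GB/s
    ```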
     
    JonasBeckman likes this.
  18. Sylwester Zarębski

    Sylwester Zarębski Member

    Messages:
    39
    Likes Received:
    31
    GPU:
    AMD
    Hilbert, could you check if turning down the Volumetric Clouds setting a notch or two gives a massive performance improvement? In the previous AC games (Odyssey and Origins) it can work miracles: going from 35 to 45 fps (one step down) or 60 fps (two or three steps down) on my old 290X at FHD. It is most visible on weaker cards.
     
  19. Netherwind

    Netherwind Ancient Guru

    Messages:
    8,821
    Likes Received:
    2,401
    GPU:
    GB 4090 Gaming OC
    When I bought the 2080 Ti I thought it would be 4K capable, which it really wasn't, but I had high hopes for the 3080 being a true 4K card. Apparently it's not, at least not with Ubi games. W_D Legion would run at 4K, but with reduced settings, and I could never get it locked at 60 fps.

    I was sure that Valhalla would run like Odyssey, which after a few patches ran beautifully at 4K/60 with close to max settings on a 2080 Ti.

    I checked out another review where they said that clouds have very little impact in this iteration. There are one or two settings that are much heavier on the GPU.
     
  20. willgart

    willgart Member

    Messages:
    10
    Likes Received:
    7
    GPU:
    nvidia
    I love the guys claiming that GDDR6X or huge bandwidth is required.
    The drop in performance for the 3090, 3080 and 3070 is about the same between 1440p and 4K:
    specifically, the drop for the 3080 is 31% and for the 3070 it's 33%.
    Same quantity of RAM, not the same number of CUs, GDDR6X vs GDDR6... a 2% difference... so clearly the speed of the RAM has no real impact here, which is expected.

    AMD made the right move going with GDDR6 rather than GDDR6X,

    so we can expect to see the 6900 XT at the 3090's performance level, as expected.
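    For reference, this is how those drop figures are derived (the fps values below are placeholders to show the formula, not the review's actual numbers; plug in the charted averages to reproduce the 31%/33%):

    ```python
    # Relative performance lost going from 1440p to 4K, in percent.
    def res_drop_pct(fps_1440p, fps_4k):
        return (1 - fps_4k / fps_1440p) * 100

    print(round(res_drop_pct(100, 69)))  # e.g. 100 fps -> 69 fps is a 31% drop
    print(round(res_drop_pct(100, 67)))  # e.g. 100 fps -> 67 fps is a 33% drop
    ```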
     
