Review: Assassin's Creed: Valhalla graphics performance benchmarks and analysis

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Nov 12, 2020.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    40,017
    Likes Received:
    8,690
    GPU:
    AMD | NVIDIA
    Ryeo, tunejunky, Maddness and 4 others like this.
  2. Kool64

    Kool64 Master Guru

    Messages:
    973
    Likes Received:
    367
    GPU:
    Gigabyte RTX2070S
Great review, HH. It's interesting to see what "financially backing" a game engine can do for graphics performance.
     
  3. SpajdrEX

    SpajdrEX AMD Vanguard

    Messages:
    2,669
    Likes Received:
    957
    GPU:
    Sapphire RX 6800XT
I got confused by the page title >
Assassins Creed: Valhalla graphics perf benchmark review - RTX - DLSS 2.0 Perf - Quality
:D does the game support DLSS? :p
     
  4. Undying

    Undying Ancient Guru

    Messages:
    14,580
    Likes Received:
    3,752
    GPU:
    Aorus RX580 XTR 8GB
lol, that 5700XT is faster than it has any right to be. :D
     

  5. JonasBeckman

    JonasBeckman Ancient Guru

    Messages:
    17,449
    Likes Received:
    2,790
    GPU:
    MSI 6800 "Vanilla"
Curious too, as this shouldn't be utilizing the full range of RDNA features the way Horizon did, but the implemented AMD extensions could still cover more than FreeSync 2 HDR support.
     
  6. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    40,017
    Likes Received:
    8,690
    GPU:
    AMD | NVIDIA
    Ah template residual, fixed. It could certainly use DLSS support though :)
     
  7. JonasBeckman

    JonasBeckman Ancient Guru

    Messages:
    17,449
    Likes Received:
    2,790
    GPU:
    MSI 6800 "Vanilla"
A solid upscaling option wouldn't hurt. I think the game has two modes, but I'm not 100% sure they kept the same setup as the two prior Assassin's Creed games on this engine: adaptive resolution scaling as its own setting, and then anti-aliasing below "High" quality rendering at sub-native resolution. That made the AA setting look like it has a higher-than-average performance impact, when it's really two separate components: the scaling and then the TAA itself.

Since that setup may have changed, I need to find actual confirmation of how this game implements it.

EDIT: Though of course, since it's TAA, just checking whether the image looks soft doesn't really work; TAA softens the image anyway and can't really be disabled. :D
(Well, really heavy softening from upscaling the final image should still stand out, I'd imagine.)

EDIT: Actually, with scaling as a separate option that goes both above and below 100% render resolution, I would think TAA on Low or Medium is now entirely separate from modifying the back-buffer resolution or how it's scaled back up.
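To make the distinction concrete, here's a minimal sketch of how a standalone render-scale setting typically maps to the back-buffer size, independent of TAA quality. This is a hypothetical illustration of the general technique, not Ubisoft's actual implementation; the function name and values are assumptions.

```python
# Hypothetical sketch: a resolution-scale setting decoupled from TAA quality.
# The back buffer is sized by the scale factor alone; TAA then reconstructs
# to the output resolution regardless of the AA quality level chosen.
def render_resolution(output_w: int, output_h: int, scale_percent: float) -> tuple:
    """Back-buffer size for a given output resolution and render scale.

    scale_percent may go above or below 100, as discussed above.
    """
    factor = scale_percent / 100.0
    return round(output_w * factor), round(output_h * factor)

# Sub-native, native, and supersampled renders at a 2560x1440 output:
print(render_resolution(2560, 1440, 85))
print(render_resolution(2560, 1440, 100))
print(render_resolution(2560, 1440, 120))
```

With this separation, changing TAA Low/Medium/High would only alter the temporal filter quality, not the render resolution, which matches the behavior speculated about above.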
     
  8. lukas_1987_dion

    lukas_1987_dion Master Guru

    Messages:
    492
    Likes Received:
    54
    GPU:
    RTX 2080 Ti AMP! OC
At 2560x1440, all ultra settings and 100% res scale, I get 75 fps average on a 2080 Ti, with drops to 60 fps in cities.
Surprising to see how well the 5700 XT is doing here; I wonder how well the 6800 XT will perform then..
     
  9. Dragam1337

    Dragam1337 Ancient Guru

    Messages:
    2,329
    Likes Received:
    1,259
    GPU:
    Rtx 3090 Strix OC
    No, and good riddance.
     
  10. Loophole35

    Loophole35 Ancient Guru

    Messages:
    9,784
    Likes Received:
    1,135
    GPU:
    EVGA 1080ti SC
1080ti still chugging along.

AMD-sponsored game. I'm sure a little more performance will be found, at least on Ampere, in the next few drivers. My Pascal performance, however, is probably what I'm gonna get.
     
    HARDRESET likes this.

  11. kanenas

    kanenas Member Guru

    Messages:
    155
    Likes Received:
    95
    GPU:
    rtx 2070 aurus
We should state that Assassin's Creed: Valhalla is an AMD-sponsored title
Overall we say that AMD benefits from the game the most, as it should, as AMD financially backs the game.

With this title, we'll also move towards a new test platform, based on the AMD Ryzen 9 5950X, currently the fastest gaming processor your money can get you. ;)
It starts :)
gg GURU3D
     
  12. Dragam1337

    Dragam1337 Ancient Guru

    Messages:
    2,329
    Likes Received:
    1,259
    GPU:
    Rtx 3090 Strix OC
The 6800 XT will perform worse the higher the resolution, due to the 256-bit bus and "only" having regular GDDR6. So it will knock it out of the park at 1080p, and likely be somewhat slower than the 3090, and possibly the 3080, at 4K.
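The raw numbers behind the bus-width argument are easy to check: peak bandwidth is bus width divided by 8, times the effective data rate. A quick sketch (the figures below are the commonly cited launch specs; note this ignores AMD's Infinity Cache, which offsets the narrower bus in practice):

```python
# Rough peak-bandwidth arithmetic behind the bus-width argument.
# bandwidth (GB/s) = bus width (bits) / 8 * effective data rate (Gbps)
def peak_bandwidth_gbs(bus_bits: int, data_rate_gbps: float) -> float:
    return bus_bits / 8 * data_rate_gbps

cards = {
    "RX 6800 XT (256-bit, 16 Gbps GDDR6)":    (256, 16),
    "RTX 3080 (320-bit, 19 Gbps GDDR6X)":     (320, 19),
    "RTX 3090 (384-bit, 19.5 Gbps GDDR6X)":   (384, 19.5),
    "Hypothetical 512-bit, 16 Gbps GDDR6":    (512, 16),
}
for name, (bus, rate) in cards.items():
    print(f"{name}: {peak_bandwidth_gbs(bus, rate):.0f} GB/s")
```

By this measure the 3090 has roughly 80% more raw bandwidth than the 6800 XT, which is why the gap is expected to widen at 4K, where bandwidth matters most.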
     
    Lily GFX likes this.
  13. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    40,017
    Likes Received:
    8,690
    GPU:
    AMD | NVIDIA
    :) Not saying a thing here, but sometimes I am so proud of the Guru3D community.
     
    gmavignier, Titan29, Lily GFX and 4 others like this.
  14. JonasBeckman

    JonasBeckman Ancient Guru

    Messages:
    17,449
    Likes Received:
    2,790
    GPU:
    MSI 6800 "Vanilla"
Wonder what the Infinity Fabric and a 300-bit or wider bus could do. Sure, there's full 512-bit, but that gets complicated due to pricing and whatnot; combining all of it would have been interesting to see, although a 384-bit or 448-bit bus might have been the ceiling if AMD was going that far for an enthusiast model. I don't believe AMD has gone 512-bit since the ring-bus attempts and the 290 GPU.

Add HBM2E-type memory while I'm wishing for things that won't happen; best case, AMD uses that for the professional GPU lineup, which I think is the rumored CDNA architecture replacing GCN, eventually.
A bit under a week until reviews for these, though, and some good, more demanding titles to test them with too. :D

EDIT: I suppose GDDR6 and even GDDR6X have a range of speeds available, plus improvements since their initial use on GPUs, but I don't think AMD would be using the top-end chips, same as NVIDIA likely isn't using the fastest GDDR6X modules, at least for now.

It just comes back to what the benchmark results will be, and then the overclocking results: whether there's any headroom for long-term stability when pushing past stock speeds.
That was a bit iffy for the 5000 series due to the VRAM chips (a mix of Samsung and Hynix, I think), plus the memory controller and how the GPU handled it all.
(Not well at all until 19.8.1, from what I recall, and various issues since, given how sensitive these are.)
     
  15. H83

    H83 Ancient Guru

    Messages:
    3,341
    Likes Received:
    740
    GPU:
    MSI Duke GTX1080Ti
This almost seems like a hint for the upcoming review of the 6000 series... :eek:
     
    tunejunky likes this.

  16. Supertribble

    Supertribble Master Guru

    Messages:
    868
    Likes Received:
    115
    GPU:
    1080Ti/RadVII/2070S
AMD has sponsored Ubisoft titles in the past with no obvious indication that its cards performed abnormally better than Nvidia's, but the sponsorship seems to have, belatedly, paid off. Nvidia cards are struggling here, and the performance the 5700 XT is putting out bodes well for the 6000 series.
     
    lukas_1987_dion likes this.
  17. Dragam1337

    Dragam1337 Ancient Guru

    Messages:
    2,329
    Likes Received:
    1,259
    GPU:
    Rtx 3090 Strix OC
Well, the only wider-bus option if they want to maintain the 16 GB VRAM configuration is 512-bit. It would be more expensive, sure... but not THAT much more expensive. So I would personally have expected at least the 6900 XT to get a 512-bit bus, given its $1,000 price point and use of much cheaper GDDR6 memory. A 6900 XT with a 512-bit bus would likely have destroyed the 3090 at 4K, rather than possibly being a bit behind it.
AMD hasn't done a GPU with a 512-bit bus since the 390, but I don't see any reason why they couldn't.

HBM2 usually fares worse than GDDR6 in games due to higher latency; latency is king in games, which is also seen with Intel vs AMD, where Intel has traditionally had substantially lower memory latency.
     
    JonasBeckman likes this.
  18. BoobZ

    BoobZ Active Member

    Messages:
    67
    Likes Received:
    12
    GPU:
    XFX r9 290 DD
    Holy, my 5600XT just gained some extra value :D
     
  19. Sylwester Zarębski

    Sylwester Zarębski New Member

    Messages:
    9
    Likes Received:
    6
    GPU:
    AMD
Hilbert, could you check if turning down the Volumetric Clouds setting a notch or two gives a massive performance improvement? In previous AC games (Odyssey and Origins) it could work miracles: going from 35 to 45 fps (one step down) to 60 fps (two or three steps down) on my old 290X at FHD. It's most visible on weaker cards.
     
  20. Netherwind

    Netherwind Ancient Guru

    Messages:
    7,643
    Likes Received:
    1,538
    GPU:
    MSI 3080 Gaming X
When I bought the 2080 Ti I thought it would be 4K capable, which it really wasn't, but I had high hopes for the 3080 being a true 4K card. Apparently it's not, at least not with Ubi games. WD Legion would run at 4K but with reduced settings, and I could never get it locked at 60 fps.

I was sure that Valhalla would run like Odyssey, which after a few patches ran beautifully at 4K/60 with close to max settings on a 2080 Ti.

I checked out another review where they said that clouds have very little impact in this iteration. There are one or two settings that are much heavier on the GPU.
     
