Assassin's Creed Odyssey

Discussion in 'Games, Gaming & Game-demos' started by Netherwind, Jun 11, 2018.

  1. Calmmo

    Calmmo Guest

    Messages:
    2,424
    Likes Received:
    225
    GPU:
    RTX 4090 Zotac AMP
    I bought this (gold ed) for ~25eu during the sales. Anyone know what the big DLC timetable's meant to be? I'm not really in any rush to play this since I've probably 2 years worth of games on my owned games backlog to go through but it'd be nice to know
     
  2. Netherwind

    Netherwind Ancient Guru

    Messages:
    8,839
    Likes Received:
    2,413
    GPU:
    GB 4090 Gaming OC
    Bought it on the sale for 23€ with extra discount.

    Tried to get 4K@60fps in the benchmark which was easier said than done. Had to turn down a lot of things and Shadows to Medium is what did the trick but they sure look really bad now :( Anyone with a 1080Ti care to share your settings?

    Also this game absolutely hammers my CPU (80-100% during bench) which could explain why the GPU isn't maxed out? I get 96% load at most.
     
  3. TaskMaster

    TaskMaster Ancient Guru

    Messages:
    1,705
    Likes Received:
    690
    GPU:
    ZOTAC Trinity 4080
    I have a 1080Ti, but I run the game at 2K. Everything is maxed out and on average I get 80 fps; the highest I saw was just over 100. It is quite demanding.
     
  4. Netherwind

    Netherwind Ancient Guru

    Messages:
    8,839
    Likes Received:
    2,413
    GPU:
    GB 4090 Gaming OC
    I could play the game at 3440x1440, that's true. I wouldn't want to play at 2560x1440 though on my screen due to black bars.
     

  5. sertopico

    sertopico Maha Guru

    Messages:
    1,444
    Likes Received:
    374
    GPU:
    Palit Gamerock 4090
    Even if it still looks gorgeous most of the time, I think Anvil is starting to show its age; a new engine would probably help with the crazy CPU use... I've read around that it can dip below 60 fps even with the latest Intel 9 series. As a starting point, the best you can do is turn clouds down to Very High or High (Ultra eats 25% of performance). I play at 2K with 140% scaling (almost 4K) with everything maxed out except clouds on Very High, and I get 45-60 fps. The torch is the real fps killer though: when real-time shadows kick in I can lose as much as 20 fps, ending up at 30ish. In that case you have to turn shadows down as well.
     
  6. Netherwind

    Netherwind Ancient Guru

    Messages:
    8,839
    Likes Received:
    2,413
    GPU:
    GB 4090 Gaming OC
    Below 60 with the latest 9 series is crazy. I've already turned down the clouds but will check it out.

    Two things that seem to be the same as in Origins:
    1. Anti-Aliasing Low has better performance than Anti-Aliasing Off.
    2. Torch and shadows kill performance except in completely dark areas.
     
  7. Party Poison

    Party Poison Ancient Guru

    Messages:
    2,270
    Likes Received:
    824
    GPU:
    Rtx 4090
    I've really got back into this after a rebuild. The game runs so much better overall; could be hardware, could be patches. At first the game didn't grab me like Origins did.

    The last few nights, though, I've been laying back and really, really enjoying it.
     
  8. tensai28

    tensai28 Ancient Guru

    Messages:
    1,556
    Likes Received:
    417
    GPU:
    rtx 4080 super
    Tried this game out and managed to get it running at a constant 60 fps at 4K. One thing is for sure: it's terribly optimized. I had to overclock my CPU (9600K) to 4.9 GHz to keep it from having annoying dips in the city. Stock would hover in the 50s, especially in the opening battle; 4.6 GHz would play at 60 but with occasional drops to 59/58, and only at 4.9 GHz was it smooth and perfect. I also had to drop AA to Low and volumetric clouds to Medium, and I'm using a 2080 Ti. There just doesn't seem to be much going on in this game to justify these ridiculous requirements.
     
  9. sertopico

    sertopico Maha Guru

    Messages:
    1,444
    Likes Received:
    374
    GPU:
    Palit Gamerock 4090
    1. AA Low runs better than Off because they are using a temporal reconstruction technique, so the only time you see the image at native res is when you stand still. You will notice that with AA on Medium or Low there is that typical ghosting/shimmering effect, especially on alpha textures.
    2. Yes, in that regard there is nothing we can do. In this new episode even light sources other than your torch cast real-time shadows at night or inside caves; that's why performance is sometimes so miserable.
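
    The ghosting described above falls out of the exponential history blend that temporal reconstruction relies on. A toy 1-D sketch (not Ubisoft's actual implementation; the blend factor is an assumption) shows how a moving bright pixel leaves a fading trail:

```python
# Toy 1-D temporal accumulation: each frame is blended with the history,
# so a moving bright pixel leaves residual energy behind it -- the
# ghosting/shimmering effect described above.

def temporal_accumulate(history, current, alpha=0.1):
    """Blend this frame into the running history (alpha = new-frame weight)."""
    return [h * (1.0 - alpha) + c * alpha for h, c in zip(history, current)]

width = 8
history = [0.0] * width
# A single bright pixel moves one step right each frame.
for pos in range(4):
    current = [1.0 if x == pos else 0.0 for x in range(width)]
    history = temporal_accumulate(history, current)

# Pixels the bright spot already passed still hold a fading trail: ghosting.
print([round(v, 3) for v in history[:5]])  # [0.073, 0.081, 0.09, 0.1, 0.0]
```

    The trail decays by (1 - alpha) each frame, which is why fast motion and low blend weights make the artifact more visible.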
     
    Netherwind likes this.
  10. MarkyG

    MarkyG Maha Guru

    Messages:
    1,032
    Likes Received:
    195
    GPU:
    Gigabyte RTX 4070Ti
    Just have some filthy Cultists to hunt down and murder now. Think there are about 8 or 9 left. Loving this game to death; it's the most time I've ever invested in an AC game. Can't wait for the next instalment. :)
     

  11. Party Poison

    Party Poison Ancient Guru

    Messages:
    2,270
    Likes Received:
    824
    GPU:
    Rtx 4090

    The game runs miles better for me with the 9700K than with my 7700K.

    CPU usage in cities is crazy, but it's OC'd to 5.1 GHz and that's a huge help.
     
    XenthorX likes this.
  12. tensai28

    tensai28 Ancient Guru

    Messages:
    1,556
    Likes Received:
    417
    GPU:
    rtx 4080 super
    Yeah, I could definitely see that. Well, at least I got it running at a constant 60 fps, which is all I need since I play at 4K 60 Hz, vsync locked. Performance is absolutely fine now at 4.9 GHz; it just feels ridiculous how much CPU this game requires.
     
  13. Netherwind

    Netherwind Ancient Guru

    Messages:
    8,839
    Likes Received:
    2,413
    GPU:
    GB 4090 Gaming OC
    1. I had no idea! So one should really use AA High at all times to get that native image.

    I was going to ask what settings you use but then I saw your GPU and now I won't ask :)
     
    tensai28 likes this.
  14. Calmmo

    Calmmo Guest

    Messages:
    2,424
    Likes Received:
    225
    GPU:
    RTX 4090 Zotac AMP
    yeeeeaah.....


    Holding off till I get a 3700X or something like that.
     
  15. sertopico

    sertopico Maha Guru

    Messages:
    1,444
    Likes Received:
    374
    GPU:
    Palit Gamerock 4090
    I must say it is pretty well done, but if you are playing on a monitor from a close distance it is very noticeable in my opinion. Playing it on a TV from your couch would probably make you forget about it.
     
    cerebus23 likes this.

  16. Netherwind

    Netherwind Ancient Guru

    Messages:
    8,839
    Likes Received:
    2,413
    GPU:
    GB 4090 Gaming OC
    To be quite honest, I don't understand how people can play this game with mainstream CPUs. My CPU gets completely destroyed by this game even at 4500 MHz. CPU usage is at 90-100% most of the time, and stuttering is definitely an issue when playing at a lower resolution like 3440x1440, which is why I pretty much have to play at 4K to increase the GPU load. The funny thing is that the GPU is never maxed: it's mostly at 95-96% and I have no idea what causes this. I remember having the same problem in AC: Unity with my old 3570K, which I later replaced with the 6700K. The difference was night and day, and I gained a lot of FPS just from switching CPU.

    Anyone know if there is a CPU hungry setting besides clouds?

    You're right that it's a bit blurry when playing on a monitor (at 3440x1440) but looks crispy at 4K on a TV. I feel that the FPS gain is good enough to make that trade-off.
     
  17. JonasBeckman

    JonasBeckman Ancient Guru

    Messages:
    17,564
    Likes Received:
    2,961
    GPU:
    XFX 7900XTX M'310
    The game creates a lot of threads and uses NVIDIA's deferred render contexts, which increases CPU load even further. AMD GPUs are still hit pretty hard, but on an 8-core processor it's not going to be 90-100%, more like up to 80% utilization depending on the area. Cities such as Athens and Sparta are the worst. The game could have really benefited from a low-level API; there are traces of D3D12 in the game engine, but it's not something that can be enabled, same as in Origins. :)

    In rough terms, via the deferred render contexts the game will spawn 8 additional worker threads and try to assign these to available CPU cores. Even without this, the game spawns enough helper threads to fully load a hexa core with hyper-threading (SMT on AMD), so there are 12-16 threads in action at all times, plus threads for streaming, audio, the framerate limiter (which I believe runs on its own thread too) and other tasks. The minimum spec is a quad core and the recommended is a hexa core, but it will still stutter in situations such as cities; ideally you want an octa core, but those are fairly high-end enthusiast processors on PC with a cost to match, at least until more recently with AMD. Efficiency and clock speed also make a difference, as do RAM and of course GPU load itself, which is still going to be a major bottleneck.

    Add in an SSD and the game can now move data to and from memory much faster, and with a 16.7 ms frame time instead of 33.3 ms (60 FPS vs 30 FPS) the game can pretty much push PC hardware to its limits.
    GPU-wise, it's the volumetric settings causing up to a 40% performance penalty on mid- to high-end cards. Even the 1080 Ti and Vega models (NVIDIA and AMD respectively) will see a nice uplift from reducing this, mainly from Ultra to High, but going from High to Medium still gives a sizeable GPU performance improvement.
    (Curiously this also happens under clear weather conditions, when there are little to no actual clouds, so it's still running some compute task even if you can't really see the difference.)
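
    The frame-time arithmetic behind those 33.3 ms and 16 ms figures is simple to verify (milliseconds per frame = 1000 / target FPS):

```python
# Frame-time budget: all the per-frame work must fit inside 1000 / fps ms,
# so moving from a 30 FPS console target to 60 FPS on PC halves the budget
# from ~33.3 ms to ~16.7 ms.

def frame_budget_ms(fps):
    return 1000.0 / fps

console_budget = frame_budget_ms(30)   # ~33.33 ms
pc_budget = frame_budget_ms(60)        # ~16.67 ms
print(round(console_budget, 1), round(pc_budget, 1))  # 33.3 16.7
```

    Halving the budget means every per-frame task (streaming, AI, draw-call submission) has half the time to finish before it shows up as a hitch.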


    I still don't have the best understanding of AnvilNext 2.0. Initially I saw it as a good early example of next-gen game engines and graphical showcases, but there are various problems with performance, both CPU- and GPU-wise. The draw call count is really high from memory (although I can always measure it next time I wrap up that daily activity mission and go into the game), and the engine doesn't scale its CPU demands down very well, so a quad core will stutter more noticeably than a hexa or octa core, mostly in the bigger cities, which probably goes back to the draw calls plus other calculations like the AI and whatnot. :)
    I'm pretty sure DX12 support is planned, but we'll see when it gets implemented. If AC Italy, or whatever the next game will be called, is going to be even larger, it's probably going to push the engine even harder if it keeps a similar design. It might also be 2020 and target the next generation of consoles, which will shift things a bit, though the first year or two should still see plenty of cross-platform titles before the next console generation takes over entirely. :)

    The short of it is that it's a very demanding game on all the components in the PC: GPU and CPU, but also RAM, which can show a surprising performance increase when upgraded, and I/O, with an HDD or SSD changing things up a bit.
     
    Xtreme512, XenthorX, G*addict and 2 others like this.
  18. Netherwind

    Netherwind Ancient Guru

    Messages:
    8,839
    Likes Received:
    2,413
    GPU:
    GB 4090 Gaming OC
  19. lucidus

    lucidus Ancient Guru

    Messages:
    11,808
    Likes Received:
    1,384
    GPU:
    .
    Found a "Cave of Kratos" ... no Leviathan axe or blades of Athena :(
     
  20. JonasBeckman

    JonasBeckman Ancient Guru

    Messages:
    17,564
    Likes Received:
    2,961
    GPU:
    XFX 7900XTX M'310
    https://docs.nvidia.com/gameworks/c...s/d3d_samples/d3d11deferredcontextssample.htm
    https://developer.nvidia.com/sites/...dev/docs/GDC_2013_DUDASH_DeferredContexts.pdf

    There's some info on how it works. There's more out there, but it's a good implementation and it does help overall performance. This particular game creates several threads (up to 8 from what I've been reading), which can hit lower-core-count CPUs harder: the technique reduces overhead and can increase performance, but if all the cores are already busy with other tasks, the scheduler has to swap threads around as cores become available, and overall CPU utilization and performance can suffer slightly.
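
    The pattern those links describe — record command lists on worker threads in parallel, then submit them in order on the immediate context — can be sketched in a language-neutral way (Python threads standing in for D3D11 deferred contexts; all names here are invented for illustration):

```python
import threading

# Sketch of the deferred-context pattern: N workers record their own
# command lists in parallel; the "immediate context" (the main thread)
# then submits the finished lists serially, in a fixed order.

NUM_WORKERS = 8  # the up-to-8 worker threads mentioned above

def record_commands(worker_id, out_lists):
    # Each worker fills only its own slot, so no locking is needed
    # while recording.
    out_lists[worker_id] = [f"draw(worker={worker_id}, call={i})" for i in range(3)]

command_lists = [None] * NUM_WORKERS
threads = [
    threading.Thread(target=record_commands, args=(i, command_lists))
    for i in range(NUM_WORKERS)
]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Serial submission preserves draw order even though recording was parallel.
submitted = [cmd for lst in command_lists for cmd in lst]
print(len(submitted))  # 24
```

    The win comes from parallelizing the expensive recording step; the cost, as noted above, is that those 8 workers compete with the game's other threads when cores are scarce.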

    I don't know if there's any good way to properly test this since it can't really be disabled, but that's one reason why NVIDIA GPUs in benchmarks also see CPU0 hit more frequently, often sitting around 90-100% usage. On its own that isn't bad, but with the number of threads created and the game's own reliance on cores, spawning multiple threads for tasks of all types, it can be a bit of a bottleneck.

    For AMD, as far as I know they don't (or can't) utilize deferred render contexts in the same way: https://github.com/GPUOpen-LibrariesAndSDKs/AGS_SDK/issues/20. It might have been resolved, although I don't think anything has changed. The situation is more or less the same anyway, since the game itself relies a lot on these CPU threads; outside of cities CPU0 will measure a bit lower than on NVIDIA systems, up to 80% or so, but inside cities there's a ton of draw call commands and other tasks and it's likely to hit 100% even on the higher-end Intel 8000 series (Coffee Lake) and AMD 2000-series Zen+ CPU models.

    It does similar things on consoles but uses the lower-level PS4 and Xbox APIs, from my very limited understanding, and it also benefits from the hardware architecture being fixed and from a 30 FPS framerate (with dips on the base models, but even so), which gives a 33.3 ms frame time. On PC at 60 FPS you get 16.7 ms, so it has to execute everything faster; that takes more of the hardware, or you get stuttering, hitches or stalling. Final Fantasy XV seems to really have a thing for that last bit, though it might also have a slight resource or memory leak; I need to check up on a lot of things, really. The faster things run, the more data is being moved, and the hardware has to keep up or there will be consequences.

    Doesn't have to be terrible but depending on the program itself and other system specific bottlenecks it can manifest in various ways from framerate drops to stalls to just loading for a longer period of time. :)


    I don't have a perfect understanding of this sort of thing either; I'm trying to improve and learn, but it's an incredibly complex subject. Seeing something like RAM causing a 20+ percent framerate increase was really surprising when I compared 1333 MHz DDR3 memory modules against 2133 MHz DDR4 and higher, especially for open-world titles. Fallout 4 was an early title showing some really curious gains scaling up to 2400 MHz and probably higher, but kits weren't quite up to that speed yet. Then Watch_Dogs 2, Wildlands and many other games, including Assassin's Creed, also showed impressive gains, so that was quite an eye-opener.

    Then there are SSDs, which really reduce the I/O bottleneck; modern M.2 drives can move data at gigabytes per second. At that point software can actually see an increase in hardware demand, since data is now arriving in memory and the other components even faster, whereas an HDD at some 100 MB/s under perfect circumstances (with dips to 20-40 MB/s) is effectively throttling the rest of the system. How much this matters depends entirely on the game or software, and overall an SSD will improve many other things, from the OS itself to initial start and load times. :)
    (Final Fantasy XV often comes up as an example of this: an SSD really reduces load times but also makes the stuttering and stalls show up even more often, though there's also that possible resource or memory leak where usage grows over time and isn't freed properly, so it's a bit of a worst-case example. With the amount of objects and other data handled by Origins and Odyssey, they also show some impact from this, but they won't stutter constantly if the rest of the system is falling behind somewhat... until it's city time, at least. Heh, cities are just not fun places for the hardware.)
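
    On the bit/byte question raised above: storage throughput is usually quoted in bytes per second, link speeds in bits per second, so divide by 8 to convert. A quick sanity check with nominal, ballpark numbers (the drive figures are assumptions, and the NVMe value is a raw PCIe 3.0 x4 link rate ignoring encoding overhead):

```python
# Rough storage-throughput comparison in MB/s (bytes, not bits).
# A gigabit link is 1000 Mbit/s = 125 MB/s; NVMe M.2 drives are far
# beyond that, while HDDs sit near 100 MB/s sequential at best.

def gbit_to_mbyte_per_s(gbit):
    return gbit * 1000 / 8  # 8 bits per byte

hdd_seq = 100                     # typical HDD sequential, MB/s (assumed)
sata_ssd = 550                    # SATA SSD ceiling, MB/s (assumed)
nvme = gbit_to_mbyte_per_s(32)    # PCIe 3.0 x4 raw link: 32 Gbit/s -> 4000 MB/s
print(hdd_seq, sata_ssd, int(nvme))  # 100 550 4000
```

    So an M.2 NVMe drive can feed the game roughly 40x faster than an HDD's best case, which is why the streaming bottleneck shifts elsewhere once an SSD is installed.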

    And then there's the lovely little mess of whatever the display driver "compatibility" profiles and numerous more or less outright hacks might be doing, plus OS issues and other software saying hello (the Steam and Uplay overlays, for example). It's quite a complex mess under the hood. You can use various tools and overlays to monitor things, but usually you'll just feel how performance or frame time fluctuates a bit as the game struggles with some areas, because there's so much going on in them. :)

    There have been comprehensive performance tests for the game, but I don't know if they go much further than comparing framerate and frame time on average; digging into multi-threading and CPU behaviour is a bit complex, and then there are vendor differences between AMD and Intel, and GPU-wise between AMD and NVIDIA, each handling certain things slightly differently. (Must be lovely for low-level API usage with D3D12 and VLK, even with extensions on Vulkan's side helping draw the most from the hardware.)

    Lengthy post, but yeah, it's a demanding title alright. It's probably still a decent indicator of next-gen game and engine demands, though there might be areas where the engine just isn't doing a great job, or isn't as scalable on older hardware, especially CPU-wise: even at minimum settings it will still put a high workload on the processor and wants a good number of available cores. The GPU still shows immense gains from reducing things like the volumetric clouds and shadows, but while that raises the framerate, at worst you might now see a wider range as it fluctuates or dips whenever it runs into some problem or bottleneck; dropping from 60 to 30 or lower periodically is not very fun, and checking the frame time and seeing it spike way above normal can also be a thing. Even without measuring, the user would still notice hitching and stalls, even sub-second ones, especially if they repeat frequently as the game loads more data.


    It should happen on consoles too, including Xbox One X and PS4 Pro, but with the 30 FPS cap those dips aren't as severe. I don't know if there are tools, software or even hardware to properly measure this in more detail on console; framerate itself might be possible, and maybe a frame-time graph estimated via recording hardware or similar.

    Origins and Odyssey are certainly pushing AnvilNext and the systems it runs on, that's pretty clear. Good visuals and all, but there might be some areas the engine isn't handling entirely optimally, which comes into play especially on PC given the earlier dominance of quad cores as the affordable option, until Intel started offering hexa-core i5s and AMD's Zen brought more competition. Not everyone is going to upgrade immediately either, and hardware pricing isn't the best, to name just one factor.

    Looking forward to seeing what Italy will bring, but if they're targeting 2020 and possibly the next gen of console hardware, it might be quite a step up even from these two games in terms of hardware demand.
    (Possibly; we shall see. There might be other additions to the Anvil engine too, or improvements to existing tech.)



    EDIT: Yeah, that's a lot of text. Wasn't quite planning on an A4 page or several when writing this reply; it just kinda happened as I kept adding and trying to cover more and more.
     
    cerebus23, sertopico and Netherwind like this.
