We need to talk about UE4 Shader compilation issues

Discussion in 'Videocards - NVIDIA GeForce Drivers Section' started by snight01, Mar 12, 2022.

  1. BlindBison

    BlindBison Ancient Guru

    Messages:
    2,419
    Likes Received:
    1,146
    GPU:
    RTX 3070
    It might also be the result of publishers setting unrealistic deadlines or not providing the resources required to correctly retool a title during porting, I imagine. But yeah, there's definitely huge variance in dev skill level out there. Most devs are a far cry from your John Carmack types and I imagine the best ones aren't cheap. What I'm personally hoping for is that we get more engine-level or abstracted solutions (if that's even possible) so that it's harder for teams to screw this up. I have no insight into the development process, but there are clearly some mind-blowing gaps in knowledge depending on the team. For example, all the studios that repeatedly ship games with bad frame pacing.
     
    Last edited: Jan 17, 2023
    Smough and Kamil950 like this.
  2. janos666

    janos666 Ancient Guru

    Messages:
    1,653
    Likes Received:
    407
    GPU:
    MSI RTX3080 10Gb
    I recently had some "nostalgia/retro vibes", but instead of firing up a bunch of old games I opted to read wikipedia/fandom(-wiki) pages and "looking back at the history of gaming" type articles, etc. Even though the oldest game I ever actually played was probably DOOM(1) (or whatever early id game that ran under MS-DOS and featured Hitler at the end... could be Wolfenstein or something else from those years, I don't know...), I was way too young to really appreciate and/or still remember those early games. Half-Life 1 was the first that really made my jaw drop at the graphics ("this looks like I am really there") and the one I can really recall (that's when I was coming to the age where I could appreciate and remember details...).
    I took a little "side-quest": I went a bit out of my way and ended up reading a lot about John Carmack beyond game design/programming related topics (public personal life topics and all that crap, even if I just quickly scrolled through those articles), because it was carved into my mind how he was the biggest genius and biggest pioneer of games (mainly through his programming skills). And right now (from what I gathered and how I interpreted all that with my current psyche...) I actually have really mixed feelings about John Carmack (both as a game developer/programmer and as a generic "genius/pioneer")! Sure, sure, he was a true pioneer for SOME time and he probably has roughly double the IQ and triple the sheer lexical knowledge in general (and roughly infinitely more in game/programming specific topics) compared to me. BUT! I think that only really applies to the very early days, like id's first few games. I think he stopped "shining" sometime during the development of DOOM3. Every idTech engine starting with DOOM3's (the ones John Carmack really worked on as a lead developer) might have looked like a great achievement from some perspective, but none was without obvious limitations and questionable decisions. And since he stopped working on game engines, his life seems to be dedicated to something he can't possibly solve in his lifetime (so, essentially, he sort of retired and now dedicates his time and money to a hobby -> the A.I. stuff).
    Just take a look at DOOM3, Half-Life 2 and FarCry(1) (games you can still run on Win11 and current hardware today without emulators, unofficial patches and/or retro hardware). DOOM3 wasn't bad, but I think both of the other two were a lot more "revolutionary" at the time. And then try to recall how RAGE(1) ran, even if you had a performant SSD (which was "high-end" stuff at the time; even the "budget" SSDs were fairly small capacity and "not that great" [compared to later ones with refined controllers, like the first Intel and SandForce offerings]), but especially if you didn't have an SSD (or one big enough to house a game on top of the OS).
    So..., is he really a high-IQ genius and a talented programmer? Sure! Hell yes. But has he really been a revolutionary genius ever since the 1990s and the single biggest innovator in gaming? I don't think so. He was just sort of lucky to grab a lot of "low hanging fruit" in the span of less than a decade or so... And I personally think he will lose a lot of money on that A.I. research thing he has going on nowadays (money he could have spent on hookers and/or cocaine with better returns than the "cracking true A.I. in our lifetime" stuff, or he could have left it for his children to inherit and use for whatever, if he wanted to retire and only chase "hobbies"...). And I won't even try to go into that space rocket research thingy (which was his first money-pit hobby before A.I.).
    So my summary is that he is probably a high-IQ, high-knowledge man, but there were only a really short few years when he could truly tap into his potential and actually shine. Ever since then his life has been more about money-burning hobbies... And yes, yes, yes, that's still a lot more than I can say about myself (both in relative and absolute terms), but I don't expect the world ever to remember me for anything at all. (And I bought a lot of Bitcoins on Mt. Gox for a few dollars a pop before it was cool and I did manage to lose all that wealth in ~5 years or so... Just to give you an example... But I still have enough passive income to live without having a day job, so I am miserable [popping anti-depressant drugs like Tic Tacs and crap, just to keep myself from suicide-level miserable...] but not a complete idiot...)

    EDIT: Oh, sorry. I got carried away a lot. I am not sure why I felt like I had to talk so much about myself just because I tried to unravel a personal opinion about an "icon". (Probably because 99% of the replies are expected to attack me with "and who the f|ck do you think you are to judge someone like that?!?"). I went too far, but I am too lazy to rewrite it (because I am drunk and high).
     
    Last edited: Jan 19, 2023
  3. BlindBison

    BlindBison Ancient Guru

    Messages:
    2,419
    Likes Received:
    1,146
    GPU:
    RTX 3070
    DOOM 3 blew me away back in the day, as did RAGE to be honest -- I always felt RAGE was underrated considering that level of visual fidelity at a flawless 60 fps on consoles of that era was no small feat. Subjectively speaking I'm not too bothered by asset fade-in provided it's buying something worth the trade-off (e.g. a totally stutter-free experience without asset-streaming hitches, or higher visual fidelity and performance, that sorta thing).

    The PC version, however, was sadly abysmal. But I hear ya, and you may very well be right -- I haven't read enough to say for sure I suppose :) But Source and CryEngine (was it even called CryEngine back in the Far Cry 1 days, I wonder?) were impressive technical showcases indeed! The physics engine in HL2 I thought was amazing at the time -- subjectively I sort of prefer its look to games like DOOM 3 and Riddick: Escape from Butcher Bay (titles that launched at more or less the same time) but that's just my own taste I suppose :p Thanks for your write-up, I enjoyed reading it.

    Sorry to hear about the Bitcoin, man -- it sucks to experience a loss like that, and it sounds like you've been pretty down -- that's rough. My go-to when I feel low is to take whatever time I can off my job, then settle down with some of my favorite old-time vidya games like Sonic 2 or Escape from Butcher Bay and some coffee :) I've been reading a bit about financial planning lately and I'm glad I didn't touch crypto. I'm thinking some combination of the Bogleheads route (passive total-market index funds) and maybe some TreasuryDirect Series I bonds, that kinda thing. But I haven't really started yet.
     
    Last edited: Jan 19, 2023
    montanchez likes this.
  4. KoKlusz

    KoKlusz Member Guru

    Messages:
    151
    Likes Received:
    40
    GPU:
    RTX 3080 12 GB
    So I have been messing around with several UE games, trying various combinations of .ini settings, dx11, dx12, dxvk, rt and so on, trying to find a solution for streaming stutter, and I remembered this post. I don't think the difference comes down to UE dx11 vs dx12, but rather to driver overhead, and I say this because both dx12 AND dxvk reduce the frequency and intensity of streaming stutter when combined with settings tweaks vs dx11 with the same tweaks. Dxvk is essentially a hack, so it doesn't work as well as native dx12, but it is better than dx11.

    Hellblade: dx12 is almost entirely smooth once shaders are compiled; dx11 has streaming stutter; however, dx12 + rt has even worse streaming stutter (couldn't get dxvk 2.0 to work with this one).

    Fallen Order: dx11 is hopeless, however dxvk reduces stutters A LOT. Forcing dx12 via command line has crazy shader stutter at first, but once shaders are compiled it's almost perfect. Unfortunately, dx12 crashes the game after the first level, so it's unusable in this case.

    FF7R: forced dx11 stutters, dxvk stutters less but still isn't perfect, dx12 is almost perfect once shaders are compiled.

    All of this was tested on a 3080 12GB; VRAM matters since the .ini tweaks can consume more of it, especially in FF7R -- without capping r.Streaming.PoolSize to a specific value, it went up to 11GB of VRAM at 1080p, lol.
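
    For reference, forcing the renderer is just a launch argument -- as far as I know these are the usual UE4 flags, though support for the alternate RHI varies from title to title:
    Code:
    -dx11     force the D3D11 RHI
    -dx12     force the D3D12 RHI (same as -d3d12)
    -vulkan   force the Vulkan RHI (only if the game actually shipped it)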
     
    BlindBison and montanchez like this.

  5. montanchez

    montanchez Member

    Messages:
    45
    Likes Received:
    32
    GPU:
    RTX 3080
    My experience was similar to yours, searching for a solution to the poor performance of Unreal Engine 4 games. But I never used any special tweaks with DXVK. What settings tweaks do you use with dxvk?
     
    BlindBison likes this.
  6. KoKlusz

    KoKlusz Member Guru

    Messages:
    151
    Likes Received:
    40
    GPU:
    RTX 3080 12 GB
    Basically these; just put them in the right config file (usually Engine.ini):
    Code:
    [SystemSettings]
    ; texture streaming pool size (MB) and per-frame streaming behaviour
    r.Streaming.PoolSize=6500
    r.Streaming.MaxTempMemoryAllowed=100000
    r.Streaming.AmortizeCPUToGPUCopy=1
    r.Streaming.MaxNumTexturesToStreamPerFrame=3
    r.Streaming.NumStaticComponentsProcessedPerFrame=3
    r.Streaming.FramesForFullUpdate=1
    r.Streaming.MinMipForSplitRequest=0
    r.Streaming.HiddenPrimitiveScale=1
    ; async loading / level streaming: dedicated loading thread and per-frame time limits
    s.AsyncLoadingThreadEnabled=1
    s.AsyncLoadingTimeLimit=0.1
    s.LevelStreamingActorsUpdateTimeLimit=0.1
    s.UnregisterComponentsTimeLimit=0.1
    s.AsyncLoadingUseFullTimeLimit=0
    s.IoDispatcherCacheSizeMB=256
    s.LevelStreamingComponentsRegistrationGranularity=1
    s.LevelStreamingComponentsUnregistrationGranularity=1
    s.MaxIncomingRequestsToStall=1
    s.MaxReadyRequestsToStallMB=0
    s.MinBulkDataSizeForAsyncLoading=0
    s.PriorityAsyncLoadingExtraTime=0
    ; LOD bias overrides
    r.MipMapLODBias=0
    r.SkeletalMeshLODBias=-15
    r.LandscapeLODBias=-15
    r.ParticleLODBias=-15
    
    Just remember that it's not a 100% perfect silver bullet, and it will increase loading times (by a lot in the case of JFO). But it should help. If you are hitting the VRAM ceiling, reduce the r.Streaming.PoolSize value; it might cause texture pop-in though.

    For Fallen Order you also need this mod to generate a new .pak file.

    You might also want to add r.CreateShadersOnLoad=1 for dx11 or dxvk, but depending on the game you will still get shader stutter once when loading a new level / area for the first time.
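
    If it's not obvious where that line goes, it's just another entry under the same [SystemSettings] section of Engine.ini (a minimal sketch):
    Code:
    [SystemSettings]
    ; ...the streaming tweaks from above...
    r.CreateShadersOnLoad=1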

    Oh, and those settings are apparently tuned for 60 fps.
     
    Last edited: Jan 20, 2023
    Shadowdane, BlindBison and montanchez like this.
  7. Kamil950

    Kamil950 Member

    Messages:
    42
    Likes Received:
    17
    GPU:
    Radeon RX 580 4GB
    Do you use DXVK with the async patch enabled?
     
    BlindBison likes this.
  8. KoKlusz

    KoKlusz Member Guru

    Messages:
    151
    Likes Received:
    40
    GPU:
    RTX 3080 12 GB
    Tested both, but the latest builds of regular dxvk plus r.CreateShadersOnLoad=1 seem to give better results. Maybe once the async patch gets updated it will work just as well.
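
    For anyone wanting to compare the two themselves: the async toggle lives in a dxvk.conf placed next to the game exe. A rough sketch, with the caveat that dxvk.enableAsync only exists in the async fork and the hud line is just there to confirm DXVK is loading at all:
    Code:
    # dxvk.conf, same folder as the game's exe
    # (or point the DXVK_CONFIG_FILE environment variable at it)
    
    # async fork only -- regular dxvk builds ignore this
    dxvk.enableAsync = True
    
    # overlay to confirm dxvk is actually active
    dxvk.hud = devinfo,fps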
     
    BlindBison likes this.
  9. BlindBison

    BlindBison Ancient Guru

    Messages:
    2,419
    Likes Received:
    1,146
    GPU:
    RTX 3070
    Just an aside going back to DXVK: I recently started replaying the Witcher 2 Enhanced Edition, which is a 32-bit DX9 game -- and wow did DXVK Async reduce stuttering and improve GPU utilization in that title. Same for GTA4. On maximum settings (including the downscaling option) with a 1440p output and G-Sync, the game had very poor frametimes and wasn't utilizing the GPU (3070) to 90+% much of the time. With the latest DXVK Async the GPU usage is now consistently at 99% and perceptible stutters have decreased (not disappeared of course, but decreased noticeably). Ultra Low Latency mode with the default DX9 path (it doesn't work with Vulkan) also seemed to improve frametimes somewhat in this particular case.

    I'd love to test something like Hunt Showdown with the latest DXVK Async but I wouldn't want to risk any kind of ban for an online game. It's very interesting how some games benefit noticeably from DXVK whereas others regress somewhat or change very little.

    Side note: if anyone ends up trying this, you have to move the DXVK d3d9 dll file into the bin folder where the game exe is, not the one where the launcher / other exe happens to be. I verified DXVK was actually working by setting an environment variable (DXVK_HUD = devinfo, fps). Also, does anyone know if you still need the DXVK_ASYNC = 1 environment variable? Or is that no longer needed with dxvk async? Thanks,
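
    If anyone wants to reproduce the check, one way to set those variables is a small launcher .bat next to the game exe (just a sketch; the exe name is a placeholder, and the DXVK_ASYNC line is exactly the part I'm unsure about):
    Code:
    @echo off
    rem run from the bin folder that contains the game exe and the DXVK d3d9.dll
    set DXVK_HUD=devinfo,fps
    rem only relevant for the async builds -- this is the variable I'm asking about above
    set DXVK_ASYNC=1
    start "" YourGame.exe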
     
  10. janos666

    janos666 Ancient Guru

    Messages:
    1,653
    Likes Received:
    407
    GPU:
    MSI RTX3080 10Gb
    I am not sure why, but DOOM3 sort of looked a bit "blurry and plastic-like" (the NPCs especially) compared to Half-Life 2 or FarCry for me. And those "black or white" kind of shadows were also a bit jarring for my taste.
    Since I didn't know any better at the time (I was a kid and didn't really get into the deeper technical details), I just sort of noted that "OpenGL games are blurry plastic (in general)". Not that there were many of them. But the second Riddick game (also OpenGL, but with higher ShaderModel specs at high quality settings than DOOM3) looked great indeed (though it also carried a little bit of that "plastic" look for my taste...). I never saw the first Riddick game though (I first played EfBB from inside AoDA) and it's very broken now on Win11 and 16:9 4K displays.
    I never saw RAGE on a console, only on PC (installed on an SSD), and that "texture pop-in" always bothered me (yes, even with an SSD). Not just in this game (RAGE wasn't the only game that worked somewhat similarly at the time, but I can't recall the other titles [I mean, ones not based on idTech]).
    However, DOOM 2016 did blow my mind with how great it ran on PC at that level of image quality... More than that, it was a great game too. (I didn't like DOOM3 that much, but neither the latest DOOM for that matter.) And Carmack is said to have wanted to go in a very different direction with the next idTech engine (after RAGE) than his successors did (I recall he already wanted to do ray-tracing long before nV or AMD considered releasing hardware with RT-acceleration capabilities... and he even talked about experimenting with leaving graphics APIs like DirectX/OpenGL behind and writing a game engine in pure GPGPU code, CUDA being the particular example at the time, before the vendor-independent OpenCL or the later DirectCompute were released...). Those things seem, how to put it, way too far-fetched and impractical even today, let alone some 6-8 years ago... so, I don't know... I think he (just like me) always loved to theorize and conceptualize things at abstract levels a lot more than doing actual practical work -> fast forward to A.I. research... I think he only put in the actual footwork until he had enough money to pursue his dreams (which is totally fine by me, just don't regard him as a foot soldier if he is more of a philosopher/dreamer...).

    A little historical easter egg: look up the "Half-Life 2 HDR techdemo" on youtube. It was published long before the actual release of either HL2 or FC1, and way before the Source engine got a rather different HDR implementation than that old Valve techdemo. In my opinion, that old demo looked orders of magnitude better (well, at least a lot more "photo realistic", that is...) than how HL2-EP1 did its HDR (EP2 got better, but I still prefer the unreleased old implementation over EP2). And that old techdemo ran on an ATI Radeon 9xxx card. The EP1 implementation required graphics cards one nV or two ATI generations newer (SM3.0 instead of SM2.0x). FC1 did HDR a lot better than HL2-EP1 (but also with SM3.0, not SM2.0 like that old Valve techdemo), and a few years later most games (early-DX10 era) started to "get it right" as well (better than the later HL2-EP2 in my opinion). It's so strange to me how the first public HDR techdemo had better HDR than some Valve (or other) games released years later with higher hardware requirements...
    I tried to get that old HL2 HDR look running with the leaked "beta" codebase on much later hardware, but I never got it working (I mean, it runs, but looks completely off). I am a bit surprised nobody else (with some actual programming skills) has tried to fix the dx9_hdr render path in the fairly big and over-arching "HL2 beta community" (the community collaboration on that leak is massive, all things considered).


    But back to the stutters: it's strange that they rarely bothered me as much as they do nowadays. Probably because I played with regular V-Sync at 60Hz with "buffer bloat" or with FastSync (which has "jitters" anyway), and whenever I saw some "hitching" I just assumed the 60 fps simply wasn't there and V-Sync had dropped down to 30 or 45 fps for a moment. VRR (G-Sync) is supposed to mask some of the smaller hiccups, yet the generally higher refresh rate and framerate I have gotten used to in the last few years make those "small" stutters more visible, even with VRR. And I can't blame them on how regular V-Sync works anymore.

    Although I did read an article (not in English) recently that explained why DX12 and Vulkan have more shader-compilation-related stutters than DX11: the shipped "mid-level" code is a bit more "raw" and needs more computing power to get "finalized" than the more pre-optimized pre-DX12 code. Yet OpenGL should theoretically be the worst, because it ships high-level code (not "half-baked" but actual raw source), and I don't recall any issues with this in old games. Then again, there were very few OpenGL games and most of them used much simpler shader code (but then again, CPUs back then had one slower core, not 8+ cores with much higher IPC and clocks).
     
    Last edited: Jan 20, 2023
    BlindBison likes this.

  11. janos666

    janos666 Ancient Guru

    Messages:
    1,653
    Likes Received:
    407
    GPU:
    MSI RTX3080 10Gb
    I just had an idea: Would it be theoretically feasible to rewrite the driver's shader compiler to run most (if not all) of the work on a GPU and be significantly faster at it than on a regular CPU?
    If so, we could keep an older and/or lower-tier card in a second slot and use it for on-demand shader compilation (similarly to a dedicated "PhysX accelerator" card, which was barely used for a handful of games).
     
    enkoo1 and BlindBison like this.
  12. BmB23

    BmB23 Active Member

    Messages:
    93
    Likes Received:
    34
    GPU:
    GTX 1660 6GB
    That HL2 demo was not for HDR but simply for showing off shaders.
    Shader compilation was not really an issue back then because DirectX and OpenGL drivers handled the compilation much better, asynchronously by default, whereas DX12 and VK are synchronous by default.
     
  13. janos666

    janos666 Ancient Guru

    Messages:
    1,653
    Likes Received:
    407
    GPU:
    MSI RTX3080 10Gb
    That may be so, but I am sure it was rendered in HDR, even if only by circumstance. And HDR rendering also needed HDR-specific shaders for tone mapping and such. You can find the stshader_hdr_dx9 file in the pre-release leak (it is the biggest DLL file of the DX 6-9 family), so they must have at least experimented with it (even if they didn't really care whether it could be ready for the already-delayed release date or not).
     
  14. dr_rus

    dr_rus Ancient Guru

    Messages:
    3,930
    Likes Received:
    1,044
    GPU:
    RTX 4090
    No.

    It wouldn't make any sense to do it this way even if it were possible to compile shaders on GPU h/w.
     
  15. Memorian

    Memorian Ancient Guru

    Messages:
    4,021
    Likes Received:
    890
    GPU:
    RTX 4090
    MW2 runs so smooth because it pre-compiles shaders before playing. I'm also playing Uncharted: Legacy of Thieves Collection and it's silky smooth because it pre-compiles shaders in the menu. CP 2077 also runs silky smooth; does it pre-compile before playing? There is no notice at all. Let's hope that future UE5 games will have shader pre-compilation in the menu, otherwise PC ports will be in serious trouble.

    So far my list of ''silky smooth'' titles (I may be forgetting 1-2 games):

    RE 2/3 remakes, RE 7 and RE 8(After the DRM fix patch)
    Forza Horizon 3, 4, 5
    TitanFall 2
    All Call of Duty games
    Uncharted: Legacy of Thieves Collection
    CP 2077
    DMC 5
    NFS: Unbound
    God of War
    DOOM/Eternal
    Rainbow Six: Siege/Extraction
    GTA V
    Dying Light 2
    Gears of War 4/5
    Tomb Raider games

    Most of these games use their own ''in-house'' engines. UE games are the problem, with a few exceptions.
     
    Smough likes this.

  16. lors

    lors Member

    Messages:
    17
    Likes Received:
    6
    GPU:
    Asus RTX4090 TUF
    Dying Light 2 and silky smooth? I'm experiencing a ton of stuttering in DL2, especially while flying. Feels like loading stutter. Playing on a 5800X3D/RTX4090 and NVMe PCIe 4.0 SSD.
     
  17. Horus-Anhur

    Horus-Anhur Ancient Guru

    Messages:
    8,731
    Likes Received:
    10,818
    GPU:
    RX 6800 XT
    I started playing Dying Light 2 yesterday, and the game runs very well. Rarely do I see any stutter.

    But the game does have a few bugs. And somewhat clunky controls.
     
    BlindBison likes this.
  18. lors

    lors Member

    Messages:
    17
    Likes Received:
    6
    GPU:
    Asus RTX4090 TUF
    Last edited: Feb 4, 2023
    BlindBison likes this.
  19. snight01

    snight01 Master Guru

    Messages:
    454
    Likes Received:
    87
    GPU:
    GB RTX 4090gamingOC
    ....
     
    Last edited: Feb 13, 2023
  20. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,129
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    Has anyone managed to find a real fix for Hogwarts Legacy? The game is a stutterfest pretty much everywhere. I can't make it run at all on Windows with either VKD3D by itself, or DXVK+VKD3D.
     
