Wrappers, fix Low FPS

Discussion in 'Videocards - AMD Radeon Drivers Section' started by neil78, Feb 19, 2022.

  1. Eddamoo

    Eddamoo New Member

    Messages:
    3
    Likes Received:
    2
    GPU:
    AMD RX5700
I know this is a fairly old post, but I just wanted to feed back an idea in case you haven't fixed it yet.

If you're using a controller, try disabling it or disabling rumble/vibration. I've noticed many indie games seem to have implemented XInput badly somehow, and I get massive stutter with any event that causes a rumble. Broforce was the worst due to the sheer amount of explosions.
     
  2. Reclusive781

    Reclusive781 Ancient Guru

    Messages:
    2,227
    Likes Received:
    736
    GPU:
    RX 6700(non-xt)
Turned the settings down to high, except for textures, but I still had to turn the resolution down a notch (3200x1800) just to get it to peg 60 fps in the benchmark.
     
  3. deton24

    deton24 Member

    Messages:
    38
    Likes Received:
    8
    GPU:
    a bit outdated
Did you actually have Vulkan listed in any OSD while using this wrapper?
Because I have OGL in the OSD all the time.

On Polaris and an old 4-core CPU (not a high-end RX 6xxx like yours, but most probably with the same improved OGL driver branch that 22.7.1 introduced for both architectures), I get a performance drop even in the main menu, down to 40-42 FPS, and roughly a 10 FPS drop in-game at most. There's also stuttering, and pressing Esc during the game froze it.

Since 22.7.1 I've been getting 60 FPS on the AMD driver in Old Blood, but sadly New Order still runs terribly.

The cmd looked like this. Maybe it can be tweaked further, but only if you don't get 3 FPS in the main menu - if you do, it probably isn't using zink. I used mesa3d-23.1.0-release-msvc; the mingw build doesn't work (or only works without zink, at 3 FPS). The GL version override is there because the game uses OpenGL 3.2, and the GLSL override sets the last GLSL version available for that GL version. Both aren't always necessary, sometimes neither is, because the wrapper can detect the proper configuration automatically (apart from zink, which apparently isn't used by default).
I tried different settings (e.g. deleting some or all of the overrides), but couldn't get any better performance.

    Code:
rem use Mesa's GL implementation instead of the vendor driver
set __GLX_VENDOR_LIBRARY_NAME=mesa
rem the game uses OpenGL 3.2; force that version and its matching GLSL (150)
set MESA_GL_VERSION_OVERRIDE=3.2
set MESA_GLSL_VERSION_OVERRIDE=150
rem route OpenGL through zink (GL on top of Vulkan)
set MESA_LOADER_DRIVER_OVERRIDE=zink
set GALLIUM_DRIVER=zink
rem launch the game with the megatexture cache redirected
WolfNewOrder_x64.exe +fs_cachepath "d:\cache"
    
The cache can safely be placed on a SoftPerfect RAM Disk 3.4.8 volume (the last free version; findable on CDRinfo) with 6 GB of RAM.
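A minimal sketch of that (assuming the RAM disk is mounted as R: - adjust the drive letter and folder name to your own setup):

Code:
rem R: is assumed to be the RAM disk drive letter
rem create the cache folder if it's missing, then launch with the cache redirected there
if not exist "R:\cache" mkdir "R:\cache"
WolfNewOrder_x64.exe +fs_cachepath "R:\cache"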

    _______________________
I spent a few hours thoroughly testing various commands and parameters without Mesa at all.

I hadn't done this since the new, optimized OGL driver for AMD came out, so here was the chance -
previously I had only tested Old Blood.

    (Q9650/X5450 (4/4)/DDR2/~RX 570 4GB)

Write this in the id5tweaker config.

    All "page" values from that config will be applied to wolfconfig.cfg on startup, and on every load of game save.
    Write command in console (CTRL+~) to check current value if necessary.
    Every game load will revert changes you've made in the console to id5tweaker values.

    Code:
    jobs_numthreads 1
    vt_maxPPF 144
    vt_uncompressedPhysicalImages 0
    image_useCompression 1
    r_useHardwareTextures 1
    vt_pageimagesizevmtr 8192
    vt_pageimagesizeUnique 8192
    vt_pageImageSizeUniqueDiffuseonly 512
    vt_pageimagesizeUniqueDiffuseonly2 16384
FHD, all remaining details set to the lowest possible (the options below the first three in the game menu; all the others will be overwritten by the tweaker on every game load)

    London Nautica (after entering "shortcut" room at the top)
    FPS increase in closed narrow room - from 27-32 to 47-52 FPS
    FPS increase during combat there - from 22-27 to 35-40
    99% and mostly 100% CPU usage all the time (previously it was ~70%).
Drops to 22 FPS when entering rooms can still occur, but they're less frequent.

After all that, I checked Mesa with zink again, but there was still a major FPS loss.
    __________________

    jobs_numthreads 1

Anything higher was worse here.
With more cores/threads, set it higher.
Default is 2.
-1 sets it to 0 (heavy stutter; re-enter jobs_numthreads in the console to check whether it behaves differently on your CPU)

    vt_maxPPF

Normally only a maximum of 64 can be set; contrary to what id5tweaker says, values above 32 were somehow still beneficial here, and 128 was the first value that really made a difference.

    image_useCompression

    is "vt compress" in the game options; turned out better to set 1, contrary to what id5tweaker says

    vt_pageimagesizeUniqueDiffuseonly2

Can be set to 256 for cards with less than 4 GB, but VRAM normally doesn't exceed 1600 MB with the settings above, as long as you don't test many settings in the same game session - especially since vt_restart should be performed after every page-setting change, and then usage can reach 3600-3800 MB.

All four page values must be powers of two (512, 1024, 2048, 4096, 8192, 16384), otherwise the game crashes on startup. Don't exceed 16K in any of the fields either, otherwise the game will break and wolfConfig.cfg will have to be edited by hand.

fs_cachepath

The cache path cannot be set to work from id5tweaker or in-game; pass it on the command line as in the launch script above.

    Adding additional commands like

    r_useGlobalShadows 0
    r_useDynamicLightingJobs 0
    r_useDynamicEnvironment 0
(found using listcvars) doesn't work.

    vt_pageimagesizevmtr
    low is 2048
    medium and high is 4096

    vt_pageimagesizeUnique
    low is 4096
    medium and high is 8192

vt_pageImageSizeUniqueDiffuseonly
    default 4096

vt_pageimagesizeUniqueDiffuseonly2
    default 4096
     
    Last edited: May 22, 2023
  4. janos666

    janos666 Ancient Guru

    Messages:
    1,522
    Likes Received:
    363
    GPU:
    MSI RTX3080 10Gb
I just tried it in an old DX9 game (Tides of Numenera) built with Unity Engine [unknown version] (strange that it uses DX9, because the official recommendation is a "DX10 compatible" card and there is no in-game switch for this).
My RTX3080 with driver 535.98 runs at about half the GPU clock speed (~1000 instead of ~2000 MHz) and uses roughly half the power (~110 W instead of ~220 W for the GPU) when using DXVK v2.2 (DX9->Vulkan), while producing the same capped 117 fps with no visible change in image quality. The dedicated per-app VRAM consumption, though, is roughly 3x (~3 GB instead of ~1 GB, for no apparent reason).
    At this point, nVidia should bundle DXVK and offer an option in NVCP to enable it per app. :D
    Or should I reinstall Windows 11 from scratch? Because this doesn't look normal to me (several games showing up to 100% differences in GPU performance where the CPU performance doesn't seem to matter). (I asked others to try and replicate some of these findings of mine but I didn't receive many hard data points and/or screenshots, just some generic chat.)
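In case anyone wants to try this themselves, the usual DXVK setup is just dropping the DLL next to the game: copy d3d9.dll from the x64 folder of the DXVK release into the game's folder, then optionally enable the DXVK HUD to confirm Vulkan is actually in use (the exe name below is only an example, not necessarily the game's actual one):

Code:
rem d3d9.dll from the DXVK release must already sit next to the game exe
rem enable the DXVK HUD (fps + API info) for this launch only, then start the game
set DXVK_HUD=fps,api
start "" "TidesOfNumenera.exe"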
     
    Last edited: Jun 1, 2023

  5. Krteq

    Krteq Maha Guru

    Messages:
    1,095
    Likes Received:
    715
    GPU:
    MSI RTX 3080 WC
    Last edited: Jun 2, 2023
    Cryio likes this.
  6. Cryio

    Cryio Master Guru

    Messages:
    648
    Likes Received:
    270
    GPU:
    AMD RX 5700 XT
What Vulkan beta driver?
     
  7. aufkrawall2

    aufkrawall2 Ancient Guru

    Messages:
    3,581
    Likes Received:
    1,425
    GPU:
    7800 XT Hellhound
He means the Nvidia driver.

Btw, you can turn MSAA into SGSSAA with d3d9/d3d11.forceSampleRateShading = True. It's the only way to play Mirror's Edge with SGSSAA on AMD, as the AMD Windows driver's SGSSAA feature simply doesn't work on RDNA2. It also needs RADV for proper performance:
    Screenshot_20230531_131110.jpg
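For reference, that option goes into a dxvk.conf file placed next to the game's exe (only the d3d9 line matters for a DX9 title like Mirror's Edge):

Code:
# dxvk.conf in the game's folder
d3d9.forceSampleRateShading = True
d3d11.forceSampleRateShading = True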
     
    Krteq likes this.
  8. janos666

    janos666 Ancient Guru

    Messages:
    1,522
    Likes Received:
    363
    GPU:
    MSI RTX3080 10Gb
Who needs SSAA (or MSAA, FXAA, SMAA, etc.) anymore? 4k120 has been readily available for ~3 years. Native 4k is better than 2x2 supersampling of 2k. It's the same level of aliasing (i.e. effectively none) but with more native detail in textures and geometry. I personally never felt the need for any kind of anti-aliasing on native 4k displays (neither spatial nor temporal).
     
    The Creator likes this.
  9. Cryio

    Cryio Master Guru

    Messages:
    648
    Likes Received:
    270
    GPU:
    AMD RX 5700 XT
    "4K120 has been stadily available for 3 years".

    Yeah, for X360/PS3 games for most people, lmao.
     
  10. aufkrawall2

    aufkrawall2 Ancient Guru

    Messages:
    3,581
    Likes Received:
    1,425
    GPU:
    7800 XT Hellhound
4k without AA looks like crap. Nice if you can live with it, but that doesn't necessarily apply to others.
Really not sure how you can cheer for a picture that's broken by definition. Even 8xSGSSAA is not enough to completely eliminate shimmering in the vegetation of Gothic 2, even in 4k. You need the Nvidia driver's 32xSSAA mode for that...
     
    Last edited: Jun 3, 2023
    Kamil950 likes this.

  11. Cryio

    Cryio Master Guru

    Messages:
    648
    Likes Received:
    270
    GPU:
    AMD RX 5700 XT
    4k with 32xSSAA? For Gothic 2?

    What's this nonsense.
     
  12. aufkrawall2

    aufkrawall2 Ancient Guru

    Messages:
    3,581
    Likes Received:
    1,425
    GPU:
    7800 XT Hellhound
It's not nonsense, it's visibly better. If you think otherwise, you simply have no clue.
And yes, I have access to a 32" 4k 144Hz display. Resolution never replaces AA; it's a pointless discussion.
     
  13. janos666

    janos666 Ancient Guru

    Messages:
    1,522
    Likes Received:
    363
    GPU:
    MSI RTX3080 10Gb
    I checked this game out. It is indeed a mess (I didn't even need to leave the first room to see the aliasing). But I suspect the internal render resolution is nowhere near the resolution that is set in the game's menu. At least not for all objects. Pretty much nothing seemed to happen when I changed the resolution setting. (There is a DX11 mod for this game but it doesn't work with the GOG version. Although I am not sure if that would change anything in this regard.) Some very old games are indeed like this. I guess looking at very low-polygon models with long line segments makes the aliasing more noticeable.

    But I didn't have such old games in mind!
    I rarely notice regular aliasing in "modern" games (less than ~10 years old). And those cases usually aren't regular aliasing but more like "noise" (for example, white pixels appearing on dimly lit, dark, and somewhat reflective surfaces, which I guess are the result of deferred rendering or similar "tricky optimization" techniques).
    I prefer native 4k over 2x2 supersampled 2k because detailed objects appear with more actual detail rather than everything getting either blurred or sharpened (depending on the scaler(s) used for supersampling and possibly on the visual style of the game [like "foggy" or "crispy" to start with]).
    Of course, supersampling can also be applied to 4k output but 1: that's not viable for most games (only for very old or very "cheap" ones), and 2: the minimal amount of aliasing is getting replaced with some amount of blurring or sharpening (you exchange one kind of artifacting for another).
     
