Mass Effect 1 - Insane Glitching / Artifacting on AMD hardware

Discussion in 'Videocards - AMD Radeon Drivers Section' started by cryohellinc, Jul 12, 2020.

  1. aufkrawall2

    aufkrawall2 Ancient Guru

    Messages:
    4,500
    Likes Received:
    1,875
    GPU:
    7800 XT Hellhound
    This was a "general" you, not a "personal" one.
    But even with Nvidia you still have the Windows 10 CPU performance penalty in Borderlands 2; you can find reports of it from as far back as 2015.
     
  2. mbk1969

    mbk1969 Ancient Guru

    Messages:
    15,604
    Likes Received:
    13,612
    GPU:
    GF RTX 4070
    And Borderlands 2 was running just fine on my previous rig (several months ago).
     
  3. aufkrawall2

    aufkrawall2 Ancient Guru

    Messages:
    4,500
    Likes Received:
    1,875
    GPU:
    7800 XT Hellhound
    You can keep telling yourself that, but it's pointless without doing CPU performance comparisons vs. e.g. Win 8.1 or DXVK.
    I'm getting ~70fps with native D3D9 and a 4.2GHz Skylake at certain locations; no CPU will be fast enough for a constant ~144fps.
     
    Kutep0v likes this.
  4. JonasBeckman

    JonasBeckman Ancient Guru

    Messages:
    17,564
    Likes Received:
    2,961
    GPU:
    XFX 7900XTX M'310
    Borderlands 2, unless this is something Gearbox changed (which I think was only done for Borderlands 1), is just inherently API limited running under D3D9. It can run really well, but it has drawbacks and shortcomings like so many other DirectX 9 games, although running on a newer build of UE3 than the first game should help at least. (Again, I'm not sure exactly what the definitive edition does for Borderlands 2 and the Pre-Sequel, as I don't believe it was as extensive as the first game's code overhaul.)

    NVIDIA, or AMD too when it's not the Navi 10 cards, should be less constrained by CPU bottlenecks, although it is still a thing.
    Persona 4 does the same thing: a recent port that somehow ended up on D3D9 hits 60 - 70 FPS well enough with dips (because some areas hit draw call limits with a ridiculously large amount of them), but throw in DXVK and it hits 200+ instead. :D
    (Neat, especially for 120Hz+ displays, where this is of particular usefulness.)


    EDIT: But yeah, a CPU-bound, single-threaded API like D3D9, plus the early belief in clock speed instead of multiple cores, is going to be a hindrance, although current hardware is getting good enough to mostly brute force it into 60 FPS anyway.
    (Just in time for the re-release of Crysis to put that little statement from back then right back into focus just as it was finally coming to an end. Ha ha!)



    EDIT: Ah, explanations and descriptions and more that I'm not really all that great at!

    The games on D3D9 do run fine, or well enough, but they could do more, and as the saying goes, bigger ... framerate numbers ... are better. :p


    Well, not entirely, but eh, small details. Or not so small when the max is unstable and the varying dips leave things feeling un-smooth and stuttery compared to a lower but far more stable average with higher minimums. :)


    EDIT: Well, many of these games do run fine now with current hardware at least; brute force all the optimization issues!

    Or rather it's more like inherent API limits and bottleneck situations: the earlier idea was something like 5, no 6, no 10 gigahertz! or something, but then it took forever to scale to dual core and quad core, and now it's moving into hexa and octa core, with all sorts of small and large issues for developers, the OS and the drivers as a result.
    (Though it is nice to see those 6+ core CPUs getting utilized even in gaming now instead of targeting 4 and maybe a bit extra wherever.)
     
    Last edited: Aug 4, 2020
    OnnA and mbk1969 like this.

  5. JonasBeckman

    JonasBeckman Ancient Guru

    Messages:
    17,564
    Likes Received:
    2,961
    GPU:
    XFX 7900XTX M'310
    For Mass Effect 1 though, that's one of the really early UE3 builds; I'll have to read up more on that AMD CPU issue and on these particular glitches and whatnot.
    Plus when, how and by which studio, if the rumors about EA now OK'ing remasters are accurate and some studio got saddled with figuring out how to force it into the Frostbite engine again.
    Or maybe actually use UE4 instead and make it a tough but smoother transition. (For that matter, throw in Andromeda too; couldn't hurt. ~ More than the game itself already did the first time around, at least.)


    EDIT: And what will the gameplay even be? Well, that's for whenever this recurring little rumor actually gets confirmed or not, I suppose.
     
    Last edited: Aug 4, 2020
  6. mbk1969

    mbk1969 Ancient Guru

    Messages:
    15,604
    Likes Received:
    13,612
    GPU:
    GF RTX 4070
    I will...

    I am perfectly fine with 70 fps - both on the old rig with a plain 60Hz monitor, and on the new rig with a 144Hz G-SYNC compatible one.

    PS But I get it - you can boost games with DXVK.
     
    Last edited: Aug 4, 2020
  7. mirh

    mirh Member Guru

    Messages:
    103
    Likes Received:
    5
    GPU:
    MSI HD 7750 1GB
    Nobody had told Epic or Demiurge how PhysX worked, so rather than shipping the actual PhysX system software installer, every time the game inits it just writes a global PhysX variable forcing its games to use the built-in DLLs and never check anywhere else.
    This in turn breaks newer games such as Mirror's Edge, which just cannot accelerate PhysX on the latest hardware with 2009 leftovers.
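
    Roughly the shape of the hack, as a sketch (the key and value names below are made-up placeholders for illustration, not the ones the real PhysX 2.x loader reads):

    Code:
    // What the UE3 init path effectively does: write a machine-wide flag that tells
    // the legacy PhysX loader to always pick the DLLs bundled with the game.
    #include <windows.h>

    void force_local_physx_dlls() {
        HKEY key;
        // Placeholder path and value name - NOT the real ones, just to show the shape of it.
        if (RegCreateKeyExA(HKEY_LOCAL_MACHINE, "SOFTWARE\\ExampleVendor\\PhysX",
                            0, nullptr, 0, KEY_SET_VALUE, nullptr, &key, nullptr)
                == ERROR_SUCCESS) {
            DWORD use_local = 1; // set on every launch, never cleaned up
            RegSetValueExA(key, "UseLocalDlls", 0, REG_DWORD,
                           reinterpret_cast<const BYTE*>(&use_local), sizeof(use_local));
            RegCloseKey(key);
        }
    }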

    And Microsoft somehow decided to ship a fix whose inner workings they didn't even understand (as if running games as administrator wasn't already bad enough).
    Isn't DXVK masking every Nvidia card as AMD (since NVAPI doesn't work on Linux)?
    That may have an impact on PhysX, which in turn should be the big offender for performance AFAIK.
    ?? Who mentioned the AMD CPU issue here?
     
  8. Astyanax

    Astyanax Ancient Guru

    Messages:
    17,036
    Likes Received:
    7,378
    GPU:
    GTX 1080ti
    Oh, I remember this bug!

    You can just replace the out-of-date PhysX cooking and PhysX loader files and that does the trick, right?

    It works for Mirror's Edge at least (the original, that is).
     
  9. aufkrawall2

    aufkrawall2 Ancient Guru

    Messages:
    4,500
    Likes Received:
    1,875
    GPU:
    7800 XT Hellhound
    Despite it spoofing itself as an AMD card, GPU PhysX actually works fine with DXVK. It seems the CUDA interop is ~completely independent from vendor-specific 3D API stuff, and it just works if it finds a CUDA device.
    The performance gain from DXVK when CPU bound is independent of PhysX; it was set to low when comparing performance.
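
    Something like this is all the GPU PhysX path would need to see (just a sketch, assuming it enumerates devices through the CUDA driver API; the vendor ID that DXVK spoofs on the D3D9 side never enters the picture):

    Code:
    // Enumerate CUDA devices directly - independent of whatever vendor ID the
    // D3D9/DXVK device reports to the game.
    #include <cuda.h>
    #include <cstdio>

    int main() {
        if (cuInit(0) != CUDA_SUCCESS) {
            std::printf("No CUDA driver found - GPU PhysX would fall back to CPU.\n");
            return 1;
        }
        int count = 0;
        cuDeviceGetCount(&count);
        for (int i = 0; i < count; ++i) {
            CUdevice dev;
            char name[128];
            cuDeviceGet(&dev, i);
            cuDeviceGetName(name, sizeof(name), dev);
            std::printf("CUDA device %d: %s\n", i, name);
        }
        return 0;
    }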
     
  10. mirh

    mirh Member Guru

    Messages:
    103
    Likes Received:
    5
    GPU:
    MSI HD 7750 1GB
    Yes, you can. But that's silly when PhysX is supposed to do that on its own.
    I don't doubt there is some performance gain; W10 1607 destroyed D3D9 SOFTWARE_VERTEXPROCESSING after all.
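
    For reference, that flag is something old D3D9 titles pass at device creation - an illustrative snippet, not taken from any particular game:

    Code:
    // Titles that create their device like this do vertex processing on the CPU
    // inside the D3D9 runtime - the path that regressed in W10 1607.
    #include <d3d9.h>

    IDirect3DDevice9* create_device_sw_vp(IDirect3D9* d3d, HWND hwnd) {
        D3DPRESENT_PARAMETERS pp = {};
        pp.Windowed = TRUE;
        pp.SwapEffect = D3DSWAPEFFECT_DISCARD;
        pp.hDeviceWindow = hwnd;

        IDirect3DDevice9* dev = nullptr;
        d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hwnd,
                          D3DCREATE_SOFTWARE_VERTEXPROCESSING, &pp, &dev);
        return dev;
    }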

    I'm not really sure PhysX on low means the GPU isn't used at all, though.
     

  11. mbk1969

    mbk1969 Ancient Guru

    Messages:
    15,604
    Likes Received:
    13,612
    GPU:
    GF RTX 4070
    PhysX is not part of DirectX/OpenGL/Vulkan/OpenCL, so when you create a hook DLL for DX/OGL/VK/OCL you can forget about PhysX. PhysX calls should go directly to the PhysX libraries, regardless of whether PhysX is implemented on the CPU or the GPU.
    Just my two cents.
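
    A minimal sketch of why (a hypothetical wrapper, not DXVK's actual code): a proxy d3d9.dll only exports the D3D9 entry points, so PhysX calls never route through it.

    Code:
    // Hypothetical d3d9.dll wrapper: the only thing a game can reach through this
    // module is the D3D9 entry point. PhysX loads PhysXLoader.dll/PhysXCore.dll on
    // its own, so a D3D hook never sees those calls.
    #include <windows.h>

    struct IDirect3D9; // opaque - we only forward the pointer

    extern "C" __declspec(dllexport)
    IDirect3D9* WINAPI Direct3DCreate9(UINT sdk_version) {
        // A real translation layer would return its own IDirect3D9 implementation here.
        HMODULE real = LoadLibraryA("C:\\Windows\\System32\\d3d9.dll");
        if (!real) return nullptr;
        auto create = reinterpret_cast<IDirect3D9* (WINAPI*)(UINT)>(
            GetProcAddress(real, "Direct3DCreate9"));
        return create ? create(sdk_version) : nullptr;
    }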
     
  12. mirh

    mirh Member Guru

    Messages:
    103
    Likes Received:
    5
    GPU:
    MSI HD 7750 1GB
    That may even be the case... but you never know?
    For example, I have heard Youngblood would let you enable ray tracing (which is just a DirectX thing) only if it has first detected DLSS (which is purely an NVAPI affair AFAIK).

    Then, assuming most games are coded with such idiot balls isn't really an educated take, but UE3 has its reputation for being poorly planned around PhysX (after a decade I still have to figure out what the hell the "EpicLocalDLLHack" variable does).
     
