Gears of War Ultimate Edition is a DX12 disaster for AMD (?!)

Discussion in 'Videocards - AMD Radeon Drivers Section' started by fr33jack, Mar 4, 2016.


Do you believe that AMD will fix its performance issues? Do you still trust them?

  1. Yes. Completely.

    52.2%
  2. No. And I'm planning to buy an NVIDIA GPU later this year.

    15.2%
  3. No. But I'm staying with my current AMD card for now...I still have some hope left.

    32.6%
  1. DesGaizu

    DesGaizu Ancient Guru

    Messages:
    3,701
    Likes Received:
    65
    GPU:
    JetStream 980Ti
    Like TJ said, Unreal Engine 3 is **** tbh, and from what I've seen of 4, that's not much better. For what you get graphically from them, performance is always dog ****.
     
  2. Barry J

    Barry J Ancient Guru

    Messages:
    2,780
    Likes Received:
    128
    GPU:
    RTX2080 TRIO Super
    AMD really needs DX12. I bet they're working hard to fix it, or to get it fixed by the devs.
     
  3. Ryu5uzaku

    Ryu5uzaku Ancient Guru

    Messages:
    7,007
    Likes Received:
    230
    GPU:
    6800 XT
    Oh, I had the impression that UE3 offered decent performance with decent visuals when the devs actually tried to do something.
     
  4. Barry J

    Barry J Ancient Guru

    Messages:
    2,780
    Likes Received:
    128
    GPU:
    RTX2080 TRIO Super
    This is why I think DX12 will initially be a nightmare: the devs have to do the work, and recent game releases show how well they do that.
     

  5. theoneofgod

    theoneofgod Ancient Guru

    Messages:
    4,558
    Likes Received:
    222
    GPU:
    RX 580 8GB
    The blockiness for AMD hardware with AO enabled is due to HBAO+ (GameWorks) which the game uses.

    :bang:
     
  6. OnnA

    OnnA Ancient Guru

    Messages:
    12,553
    Likes Received:
    3,135
    GPU:
    Vega XTX LiQuiD
    Yup :)

    Crappy Games don't count :flip:

    BTW, we have to wait for them to fix this game (the game itself is OK, IMO), but they rushed the DX12 title and failed.

    Now MS is working on a better Win Store (to be released with the next big update). That will address these issues:

    - Multi-GPU
    - Free access to files
    - Exclusive fullscreen (that borderless window is from the X1 ;-) lol, you know = you can play and watch a movie/TV)
    - V-sync implementation
    - Games will work NP with Fraps/ReShade etc.
    - and some minor stuff

    IMO all the problems with MS Win Store games take root in the X1 and its console-like behaviour. Now they need to implement new features for PC-like behaviour.
     
    Last edited: Mar 4, 2016
  7. fr33jack

    fr33jack Maha Guru

    Messages:
    1,153
    Likes Received:
    3
    GPU:
    1050Ti @1.9/ 9.0GHz
    For some reason I don't think Microsoft will implement normal support for third-party software and tools like ReShade... or even allow customization of game files (modding) and game settings via .cfg file edits. I think all the games in the MS Store will be strictly closed to that...

    Found something interesting...

    Windows Store Games won't have VSync, SLI/CrossFire, Fullscreen or Modding

     
  8. OnnA

    OnnA Ancient Guru

    Messages:
    12,553
    Likes Received:
    3,135
    GPU:
    Vega XTX LiQuiD
    Here:

    "Shots Fired – Epic’s Tim Sweeney Accuses Microsoft Of Trying To Monopolise PC Games Development"

    "As Tim Sweeney wrote in The Guardian, with this new Universal Windows Platform initiative and its new exclusive features, Microsoft is ‘effectively telling developers you can use these Windows features only if you submit to the control of our locked-down UWP ecosystem.’

    Sweeney went into more details in order to express what is actually wrong with Microsoft’s stance regarding UWP.

    “The specific problem here is that Microsoft’s shiny new “Universal Windows Platform” is locked down, and by default it’s impossible to download UWP apps from the websites of publishers and developers, to install them, update them, and conduct commerce in them outside of the Windows Store.

    It’s true that if you dig far enough into Microsoft’s settings-burying UI, you can find a way to install these apps by enabling “side-loading”. But in turning this off by default, Microsoft is unfairly disadvantaging the competition. Bigger-picture, this is a feature Microsoft can revoke at any time using Windows 10’s forced-update process.”

    Sweeney also added that one of the big dangers here is that Microsoft may continually improve UWP while at the same time neglecting and even degrading Win32 over time, thus forcing developers and publishers to actually comply with its new UWP commerce monopoly.

    The biggest problem here, as you may have guessed, is that Microsoft is trying to turn today’s open PC ecosystem into a closed, Microsoft-controlled distribution and commerce monopoly.

    Sweeney concluded:

    “In my view, if Microsoft does not commit to opening PC UWP up in the manner described here, then PC UWP can, should, must and will, die as a result of industry backlash. Gamers, developers, publishers simply cannot trust the PC UWP “platform” so long as Microsoft gives evasive, ambiguous and sneaky answers to questions about UWP’s future, as if it’s a PR issue. This isn’t a PR issue, it’s an existential issue for Microsoft, a first-class determinant of Microsoft’s future role in the world.”

    UPDATE:

    Kevin Gallo, corporate vice president of Windows at Microsoft, responded to The Guardian’s article with the following statement:

    “The Universal Windows Platform is a fully open ecosystem, available to every developer, that can be supported by any store. We continue to make improvements for developers; for example, in the Windows 10 November Update, we enabled people to easily side-load apps by default, with no UX required.

    We want to make Windows the best development platform regardless of technologies used, and offer tools to help developers with existing code bases of HTML/JavaScript, .NET and Win32, C++ and Objective-C bring their code to Windows, and integrate UWP capabilities. With Xamarin, UWP developers can not only reach all Windows 10 devices, but they can now use a large percentage of their C# code to deliver fully native mobile app experiences for iOS and Android. We also posted a blog on our development tools recently.”
     
  9. fr33jack

    fr33jack Maha Guru

    Messages:
    1,153
    Likes Received:
    3
    GPU:
    1050Ti @1.9/ 9.0GHz
    Well... I am a huge pessimist. I eat a lot of bullsh*t to earn it :D I'd say... we'll see.
     
  10. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    7,523
    Likes Received:
    522
    GPU:
    Inno3D RTX 3090
    I can't get past the accent somehow; it buzzes in my head after a point. What is the point he's trying to make? That we will have DX12 effects through GameWorks that will gimp AMD, now that NVIDIA can't do it in the drivers?

    Unreal Engine has ALWAYS been fine. Don't confuse bad usage of a tool, with the actual value of the tool. Look at the Mass Effect titles and their performance, for example. They all are UE3 titles. Unreal Tournament 3, The Borderlands games, Bulletstorm, Dishonored, the initial Gears of War PC port...

    The engine is fine, this port was rushed and it's bad.

    Universal Windows Apps don't have any .exe files, and you can't touch the files they have. That excludes ALL modding, unless directly provided by the game creator. That also excludes any kind of fixes, enhancements, and the survival of awesome titles through community effort.
    Do you know what the community has done with titles like Skyrim, System Shock 2, Doom 3, Vampire:Bloodlines, KoTOR etc? You might as well forget all of that stuff. Part of Tim Sweeney's rant is exactly that.
     

  11. -Tj-

    -Tj- Ancient Guru

    Messages:
    17,124
    Likes Received:
    1,902
    GPU:
    Zotac GTX980Ti OC
    Bulletstorm stuttered, Mass Effect 1 stuttered, all the Batman games stuttered (especially the first one), Alice: Madness Returns stuttered, GoW stuttered, BioShock Infinite stuttered, Borderlands too with HW PhysX... and a few more.

    In all of them you had to manually play with streaming values and pool size, and even then it wasn't a guaranteed fix. So yeah, UE3 is a disaster when it comes to that 90% of the time. It got a little better lately, but far from something you'd call fine. It's even worse on AMD, since its streaming is more optimized for NV.

    If you didn't notice yet, Unreal + NV is like a love affair lol; every time NV makes new hardware there's Unreal Engine bs'ing around... look at some ancient 2005 video conference of NV showing off UE3 capabilities on NV hardware.

    I could go on... I like Unreal Engine, but sometimes it's just **** no matter what. UE4 fixed most of the streaming problems, but it's not ideal yet. Not like Frostbite 3 or CryEngine 3...
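The "streaming values and pool size" tweaks mentioned above typically lived in a UE3 game's *Engine.ini files. A rough sketch of the commonly edited entries follows — the section name and both keys are from stock UE3 config layout, but exact names and defaults vary per game, so treat the values as illustrative, not as a guaranteed fix:

```ini
[TextureStreaming]
; Size of the texture streaming pool in MB. Stock UE3 ships a small
; default; raising it was the usual anti-hitching tweak.
PoolSize=512
; Extra headroom (in MB) the streamer tries to keep free before it
; starts dropping texture mips.
MemoryMargin=10
```

Note that in many UE3 titles the shipped config is regenerated or overridden on launch, which is exactly why these edits were never a guaranteed fix.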
     
  12. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    7,523
    Likes Received:
    522
    GPU:
    Inno3D RTX 3090
    I have never really had problems with UE3 games. Especially the ones I mentioned. Probably because most of them (except Dishonored) were patched when I finally played them. I never saw UE3 as a "disaster". It was very popular and it was bound to happen.
     
  13. Ryu5uzaku

    Ryu5uzaku Ancient Guru

    Messages:
    7,007
    Likes Received:
    230
    GPU:
    6800 XT
    PhysX stuff was horrid, but otherwise I really didn't have issues with UE3 games. I didn't play the Batman games, though. I know UE3 had streaming issues on consoles, sure.
     
  14. oGow89

    oGow89 Maha Guru

    Messages:
    1,213
    Likes Received:
    0
    GPU:
    Gigabyte RX 5700xt
    Just played the entire Mass Effect series; aside from occasional crashes due to DLC and dynamic shadows, no stutter whatsoever. I also played those games when they came out, and the only issue I had was some low fps, because I only had a Pentium 4 back then. As for Dishonored, the only issue I had was the damn mice shadows that just won't stay on. Gears of War (2007) ran like a dream at a perfect 60fps on my HD 4670 at 1280x1024. Bulletstorm also ran better than COD. Keep in mind, most of the games I played were either on an old P4 PC or on a laptop with an HD 5650M. BioShock also ran like a wonder, and that goes for all the Batman titles up to Arkham Knight, which I haven't actually played. The only games that ran like **** were the AC games, starting from 3.

    In the end it depends on who makes the game and whether or not nVidia is involved in the making. At times it bites them back: Just Cause 3, for example, runs considerably better on AMD across all GPU segments. All in all, while UE3 is widely spread, CryEngine is pretty, and Frostbite 3 is special, I miss the MT Framework engine — the most efficient one I've seen. Lost Planet 1, RE5 and DMC4 all ran at 100+ fps even on my crappy rig back then.
     
  15. fr33jack

    fr33jack Maha Guru

    Messages:
    1,153
    Likes Received:
    3
    GPU:
    1050Ti @1.9/ 9.0GHz
    Ok. I get it now. We HAVE to worry about future dx12 titles in regard to their performance and such...for those which will be released as Windows Store exclusives :D
     

  16. Osjur

    Osjur Member

    Messages:
    34
    Likes Received:
    0
    GPU:
    Radeon Fury X
    Aand, we have a patch which fixes at least the AO problems, and it didn't come from AMD :smoke:
     
  17. CrunchyBiscuit

    CrunchyBiscuit Master Guru

    Messages:
    284
    Likes Received:
    63
    GPU:
    AMD Radeon HD6950 2GB
    I agree, there are games on UE3 that just stubbornly refuse to run without any stutters. Same here with Bulletstorm, despite sky high frame rates. Batman series as well. At first I thought it was alright, but when playing on I started to notice it more and more in Arkham Asylum. Part two is way worse than part one, the open city is just too much. All Batman titles are affected. Alice - Madness Returns, same problem here. Slight hitches in some spots, keeps happening but frame rate is good, like it repeatedly has to load certain assets when they've been out of view for a couple of seconds. Also have to add Remember Me to the UE3 stutter list, same problem as in Alice. It's annoying. After having played a couple of these titles on consoles and my brother's i7 and 980Ti as well, it's safe to say all platforms are affected by this problem to various degrees, although AMD had it way worse than nVidia before the 13.12 Catalysts.

    I agree, fortunately there are also games on UE3 that run well (Mass Effect 2 & 3, Dishonored, Homefront, Singularity). These games should demonstrate the engine is not at fault - it's entirely achievable to produce a game on UE3 that doesn't have those stutter issues. Developers that actually seem to care have proven this. AMD drivers have also improved majorly since the end of 2013 when it comes to UE3 games.
     
    Last edited: Mar 5, 2016
  18. Santiago P

    Santiago P Member

    Messages:
    20
    Likes Received:
    0
    GPU:
    R7 260X 1200 MHZ 1800MHZ
  19. OnnA

    OnnA Ancient Guru

    Messages:
    12,553
    Likes Received:
    3,135
    GPU:
    Vega XTX LiQuiD
    Nvidia’s HBAO+ GameWorks Feature Is Behind The Visual Corruption Issue In Gears Of War Ultimate Edition

    As it turns out, Nvidia’s HBAO+ ambient occlusion feature was the culprit behind the horrible artifacting / visual corruption issue that plagued some of AMD’s Radeon graphics cards. What’s more eyebrow-raising is the fact that there isn’t even any mention of HBAO+ inside Gears of War Ultimate Edition’s graphics settings menu. There’s only an option to turn “Ambient Occlusion” on or off, with no mention of what type of ambient occlusion it is.

    Typically, when a vendor-specific visual effect is implemented in a game, it ends up listed under its vendor-specific designation. For example, HairWorks would be listed as HairWorks and HBAO+ as HBAO+, rather than being given a generic “hair physics” or “ambient occlusion” designation. When no such specificity is present, it’s assumed that the visual effect in question is a developer-specific, hardware-agnostic implementation.

    In the case of Gears of War Ultimate Edition, it was Nvidia’s proprietary HBAO+ GameWorks feature listed under a generic name. In fact, we only learned that the ambient occlusion implementation in Gears of War: Ultimate Edition was HBAO+ when Nvidia’s Andrew Burnes announced the availability of Game Ready drivers on the GeForce.com blog and named it as one of the game’s features.

    Source: GeForce.com – Game Ready Driver Announcement

    I can only speculate about why the decision was even made by Microsoft to give HBAO+ a generic label. It’s quite unusual and very much the opposite of what you’d want to do as a developer to ensure transparency. For contrast let’s look at what the Far Cry 4 developers decided to label the exact same ambient occlusion implementation in their game’s menu.

    Far Cry 4 – visual quality settings menu

    Not only is it plainly labeled as a GameWorks feature, but it’s also optional — as in, it’s not part of any of the game’s graphics presets. In the case of Gears of War Ultimate Edition, on the other hand, HBAO+ is enabled by default if the Ultra preset is selected.

    Disabling HBAO+ Will Solve The Visual Corruption Issue But That’s Only The Tip Of The Iceberg

    As it turns out, HBAO+ wasn’t the only thing hidden behind a veil. Whilst digging into the game’s files, a friend of WCCF – youtuber Blindrun – spotted a very interesting hidden game file. The file is BaseEngine.ini and it can be found in the following folder path on Windows 10:
    C:\Program Files\WindowsApps\Microsoft.DeltaPC_1.6.0.0_x64__8wekyb3d8bbwe\Engine\Config

    The WindowsApps folder is Windows-protected, which means that you’ll have to jump through some hoops to actually access it. But here’s how to do it.

    BINGO

    In BaseEngine.ini we spotted a very peculiar entry: “bDisablePhysXHardwareSupport=False”. More peculiar is the fact that this file cannot be edited in any way; if it is, the game will simply overwrite any changes once it’s booted up and connected to Microsoft’s servers.

    The entry means that hardware-accelerated PhysX is enabled by default in the game, and because any changes to the file are automatically overwritten on game start-up, it can’t be disabled. For those of you unaware, PhysX is Nvidia’s proprietary game physics engine and can only be hardware-accelerated by Nvidia’s own graphics cards. If a competing GPU is detected by the PhysX driver, the work will instead be automatically offloaded onto the CPU. This in turn means that the feature will run disproportionately slower on non-Nvidia-equipped systems.
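The double negative in that flag is easy to misread. A minimal, hypothetical Python sketch of how you'd interpret it — the section name [Engine.Engine] is an assumption based on stock UE3 config layout, and the real BaseEngine.ini has many more sections plus non-standard quirks that configparser won't handle:

```python
import configparser

# Tiny sample mirroring the entry WCCF found; section name is assumed.
sample_ini = """\
[Engine.Engine]
bDisablePhysXHardwareSupport=False
"""

config = configparser.ConfigParser()
config.read_string(sample_ini)

# UE3 stores booleans as the strings "True"/"False". Note the double
# negative: Disable...=False means hardware PhysX support is ENABLED.
raw = config["Engine.Engine"]["bDisablePhysXHardwareSupport"]
hardware_physx_enabled = raw.strip().lower() == "false"
print(hardware_physx_enabled)  # prints True: hardware PhysX on by default
```

Of course, as a later post in this thread points out, the flag being set in the ini doesn't by itself prove the game ever reads or acts on it.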

    In the past, all games that included hardware accelerated PhysX features such as debris, water physics and so on included the option to turn it off. This was critical because these features directly influenced how the game performed. The option was also necessary when evaluating AMD and Nvidia graphics cards to ensure that the testing was done on an even playing field.

    Additionally, because hardware-accelerated PhysX features are only visual and aren’t part of the game’s core mechanics, disabling them would not affect the game’s behavior in any way. Sadly, because we are unable to turn off hardware acceleration in Gears of War Ultimate Edition, we don’t know what it’s actually doing, and whether it could account for some of the performance disparity we’re seeing between Nvidia and AMD graphics cards. Right now we simply don’t know what effect it has, because we can’t test it.

    Suffice to say there seem to be some “shenanigans” going on here so we’ll keep a close watch and update everyone accordingly. In the meantime we’d advise PC gamers to wait until the mist clears on the issues surrounding this troubled PC release before pulling the trigger.

     
    Last edited: Mar 5, 2016
  20. Noisiv

    Noisiv Ancient Guru

    Messages:
    7,586
    Likes Received:
    1,021
    GPU:
    2070 Super
    In BaseEngine.ini we spotted a very peculiar entry. “bDisablePhysXHardwareSupport=False”.
    More peculiar is the fact that this file cannot be edited in any way.
    If it is, the game will simply overwrite any changes once it’s booted up and connected to Microsoft’s servers.

    The entry means that hardware accelerated PhysX is enabled by default in the game [...]
    If a competing GPU is detected by the PhysX driver, the work will instead be automatically offloaded onto the CPU.
    This in turn means that the feature will run disproportionately slower on non Nvidia equipped systems.



    I am willing to bet that the wcfftchfch joker is once again wrong on so many levels.
    And I base this on my total lack of knowledge about UE and just a pinch of common sense :D

    Bear with me please:

    1. What does not exist in-game (GPU PhysX) does not run, let alone run "disproportionately slower on non-Nvidia-equipped systems".
    2. Just because it's in the *.ini (the bDisablePhysXHardwareSupport flag) does not mean it's read/used by the game.
    3. Even if GPU PhysX were used in-game, setting “bDisablePhysXHardwareSupport=False” does not magically make your Phenom/Sandy able to run all the effects that GPU PhysX is capable of.

    Someone who knows UE/PhysX, please correct me :)
     
