Older DX11 Radeons and graphical problems in newer games

Discussion in 'Videocards - AMD Radeon Drivers Section' started by Spectrobozo, Mar 21, 2017.

  1. Spectrobozo

    Spectrobozo Guest

    Messages:
    48
    Likes Received:
    0
    GPU:
    8600GT
    I've been noticing over the past few years that more games are showing instability (for example, BF1 being forced into windowed mode just to run) and graphical problems with my HD 5850. At first I suspected it could be the hardware going bad (my card failing), but it's clear now that this might not be the case, since I can find other people with different cards of the same generation having the same problems; figured I would show some examples.
    I don't really expect AMD to fix these issues now that these cards have been "legacy" products since 2015, but it kind of disappoints me that some of these games achieve playable performance on these cards yet have graphical glitches. Some of them have simple solutions, like disabling an effect, but not all.

    the latest Frostbite games all seem to have a similar problem. I played Dragon Age: Inquisition (a November 2014 game) and failed to notice any visual problems with it, but in newer games, especially Battlefront and later, there are some "squares" present. They are not visible all the time or on every map, but they seem to be tied to some specific effect and can be reproduced easily. I'm going to use some examples captured from YouTube, but they happened in the same manner with my card:

    Battlefront with a 6850
    http://imgur.com/a/vaB7j
    https://youtu.be/XEAz-qy4Dfs?t=2m30s

    BF1 with a 6570
    http://imgur.com/a/OLyik
    https://youtu.be/Qizkm1eYkTg?t=19s

    Mass Effect Andromeda with a 5770
    http://imgur.com/a/TncZz
    https://youtu.be/7pfYcqYMCtk?t=4m32s

    from what I've seen this problem is happening on "TeraScale 2" cards; I couldn't find the same on the "TeraScale 3" ones. But I would be interested in knowing if anyone has had the same problem outside of the TeraScale 2 cards, or actually never had it on one, and whether any driver version or OS option helped;
    considering the cards are not officially supported and no new drivers are being released, I suppose it's understandable that these issues are not going to be fixed, or even noticed.

    outside of Frostbite games I've seen different issues. For example, Fallout 4 with godrays enabled gives us this on any non-GCN Radeon:
    http://imgur.com/a/nTUZt
    including the TeraScale 3 cards, but in this case disabling the godrays effect, or tweaking settings to disable some shadows, can give you a clean, playable game.

    I also noticed that Rise of the Tomb Raider has severe glitches with motion blur turned on. Deus Ex: Mankind Divided shows a somewhat similar but less visible glitch compared to FO4, in the benchmark test around one security guard, but only at certain angles (in how he reacts to the light behind him); I fail to notice the problem during normal gameplay, though. I haven't played enough newer games with my 5850 to know how common visual glitches are in other titles.

    it would be interesting to know if anyone else is noticing these problems, or others not mentioned here, and whether any kind of fix exists, especially for the Frostbite games; it's a very distracting bug in BF1/Battlefront, I think.


    I was uncertain whether this or the other subforum would be more appropriate for this thread, but this is, I think, more likely to be a driver or software problem...
     
  2. mirh

    mirh Member Guru

    Messages:
    103
    Likes Received:
    5
    GPU:
    MSI HD 7750 1GB
    Did you try user1's drivers?
     
  3. Romulus_ut3

    Romulus_ut3 Master Guru

    Messages:
    780
    Likes Received:
    252
    GPU:
    NITRO+ RX5700 XT 8G
    #inb4udontdeservesupport
     
  4. Spectrobozo

    Spectrobozo Guest

    Messages:
    48
    Likes Received:
    0
    GPU:
    8600GT
    I haven't, but my understanding from reading the thread is that they don't change anything for D3D games compared to the last official beta release, so I don't have high hopes for that one



    I suppose that's OK given how old some of those cards are;

    but at the same time, those were DX11 cards, and DX11 is still being used by almost every single game; and some of the TeraScale 2/3 cards are not as old as the 5000 series and were sold as budget options not so long ago.
    I think the only thing you can compare against is Nvidia cards from the same period, and I don't see the same complete lack of support for Fermi,
     

  5. Romulus_ut3

    Romulus_ut3 Master Guru

    Messages:
    780
    Likes Received:
    252
    GPU:
    NITRO+ RX5700 XT 8G
    I was being sarcastic, mate. There's been a trend of late around this section of the forum that deems people unworthy of asking for driver support for older hardware, which I don't agree with at all.
     
  6. user1

    user1 Ancient Guru

    Messages:
    2,780
    Likes Received:
    1,303
    GPU:
    Mi25/IGP
    The main problem is software support in general; this is not just a driver issue. Plenty of games are issue-free; I remember playing the Battlefront beta with no artifacts. What you are seeing is the result of EA not testing newer versions of Frostbite on older hardware/software and not fixing or working around the bugs.

    A game like Overwatch doesn't have these kinds of issues, since they officially set their minimum spec at an HD 4850, which I would assume they tested against and worked out any bugs on, even though the card itself hasn't been supported by AMD since 2013.

    In a perfect world every developer would write proper code with best practices in mind, and AMD would fix only bugs caused by the driver not being 100% compliant, but time is money and devs cut corners.

    The prevalence of "game ready drivers" has more to do with fixing bugs and performance issues that the developers failed to fix than with actual problems in the driver.

    Of course, AMD or Nvidia fixing every game before release is expensive, and Nvidia does it a lot better, or at least used to. That is a big reason why AMD likes DX12, since it shifts a lot of the burden back to the developer.

    The main takeaway is that EA and other publishers could fix those bugs, but for them it is not worth the time, since the number of users on those cards is too small to justify the cost.
     
  7. Scure

    Scure Guest

    Messages:
    9
    Likes Received:
    0
    GPU:
    Sapphire HD6870 1GB
    Yup. I have an HD 6870. I tried the Battlefield 1 beta and sadly had those graphical artifacts. Too bad, because I was able to play at pretty good quality. No problem AMD, I won't buy new games then; I'm sure EA is happy too. :)

    I played Rise of the Tomb Raider too and had the same problem with motion blur, but everything worked perfectly when I disabled it. I didn't try the other games from your list. But did you try Warface? The whole game is bugged for me; it's unplayable.
     
  8. chris89

    chris89 Master Guru

    Messages:
    252
    Likes Received:
    9
    GPU:
    RADEON R5 M430 2GB
    Yeah, I own a reference HD 5850. The "squares" are likely the memory overheating, because the stock fan profile lets temperatures run in excess of 100°C. You may have noticed how silent the card is? The fan is actually very high speed and can throw out a gust you can feel from nearly 15 feet away at full blast.

    The 16.2.1 legacy driver on Win7 x64 works perfectly for me with a modded BIOS.

    So as far as new games messing up goes, the card may just need a BIOS mod to correct the thermal issues. While we're at it, we can also get an 844MHz core out of it, like I have done, no problem.

    New games require more than 1GB of video RAM, especially Tomb Raider and GTA V among others, and if there is not enough RAM available, data will be missing on screen, meaning absent objects/glitches. So it's ideal to monitor dedicated memory usage and set the game settings to stay just below the 1024MB limit. That way the card can efficiently render everything it's asked to.
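    For the monitoring part, GPU-Z or Afterburner will show dedicated memory usage directly; as a rough alternative, newer Windows 10 builds expose the same number as a performance counter. A minimal sketch, assuming the "GPU Adapter Memory" counter set is available (it only exists on recent Windows 10 builds, and the counter path is my assumption, not something from this thread):

    Code:
    import subprocess

    # Poll dedicated VRAM usage once via Windows performance counters.
    # Assumes the "GPU Adapter Memory" counter set, which only exists on
    # newer Windows 10 builds; on Win7-era systems use GPU-Z or MSI
    # Afterburner instead.
    COUNTER = r"\GPU Adapter Memory(*)\Dedicated Usage"

    proc = subprocess.run(
        ["typeperf", COUNTER, "-sc", "1"],  # take one sample, then exit
        capture_output=True, text=True,
    )
    print(proc.stdout)  # CSV output: bytes of dedicated VRAM per adapter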

    If you want, please upload your BIOS .rom from a GPU-Z dump and I can correct some common issues, not to mention set the 844MHz core, which is 27 GPixel/s and 60 GTexel/s. Stock is 725MHz, which is 23.2 GPixel/s and 52.2 GTexel/s, so we gain about 16% in fill rate, which helps a lot on this card. The memory can handle a big overclock as well: I had mine up to 160GB/s from the stock 128GB/s, and you can run it at about 140-150GB/s totally fine. That's an extra 10-25% of memory bandwidth, which can add a good few fps in bandwidth-limited games.
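    A quick sanity check of that fill-rate math (ROP/TMU counts are from the public HD 5850 spec; the script itself is just an illustration, not something from this thread):

    Code:
    # Fill rates scale linearly with core clock: rate = units * clock.
    ROPS, TMUS = 32, 72  # HD 5850: 32 ROPs, 72 TMUs

    def rates(clock_mhz):
        ghz = clock_mhz / 1000.0
        return ROPS * ghz, TMUS * ghz  # GPixel/s, GTexel/s

    stock_px, stock_tx = rates(725)  # reference core clock
    oc_px, oc_tx = rates(844)        # the overclock mentioned above

    print(f"stock: {stock_px:.1f} GPixel/s, {stock_tx:.1f} GTexel/s")
    print(f"oc:    {oc_px:.1f} GPixel/s, {oc_tx:.1f} GTexel/s")
    print(f"gain:  {844 / 725 - 1:.1%}")  # ~16.4%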
     
    Last edited: Mar 30, 2017
  9. Spectrobozo

    Spectrobozo Guest

    Messages:
    48
    Likes Received:
    0
    GPU:
    8600GT
    I understand,
    but let's say it was more of an answer to that line of thought than to your post specifically!


    good point, some games do support old hardware/drivers pretty well; Overwatch, as you mentioned, is an example of it. But at the end of the day I think a lot of the lack of support in games comes from the lack of new drivers and support from AMD. If you look at the green side, these games tend to work perfectly, and Nvidia still includes Fermi in the main driver package; I would think that encourages game devs to make sure things work on Fermi a lot more than what AMD is doing. I've seen games whose minimum requirements list a Fermi card for Nvidia but a GCN card for AMD...


    yes, I managed to play TR fairly well with motion blur off; like Fallout 4, it's one of the games where disabling the glitched effect lets you play quite well. Unfortunately, not all games give you that option, BF1 for example. I haven't tried Warface yet; the last CryEngine game I tried was "Evolve" when it became F2P, but I played it very little and didn't notice any bugs.




    the pictures I posted are not from my 5850 but from random YouTube videos with varying cards. Also, my 5850 runs pretty cool with a custom fan profile set up through MSI Afterburner. I have tested underclocking and it has no effect on the glitches, and I can run heavy games with no glitches at all; Witcher 3 was fairly good, I think (hard in terms of performance, but without any apparent bugs).

    the limited amount of RAM (1GB) can cause low performance and stuttering, but not really this kind of glitch, I don't think; this kind of thing wouldn't happen on a 1GB 7850, for example.

    regarding OC, my 5850 is good in terms of GPU overclocking (the memory is not very good), but again, I've tested the things I mentioned at stock clocks and even underclocked, to make sure it was not simply an unstable OC/card

    one thing you mention that I didn't really look into is that you are using Windows 7; I'm on Windows 10 (and was on 8/8.1 previously),
     
    Last edited: Mar 31, 2017
  10. user1

    user1 Ancient Guru

    Messages:
    2,780
    Likes Received:
    1,303
    GPU:
    Mi25/IGP
    First of all, Nvidia has far more resources to patch games. Fermi was well ahead of its time; it's very modern compared to the pre-GCN cards and very similar to Kepler, not to mention Nvidia was selling new Fermi-based products as late as 2015 (there are a few 28nm Fermi chips). https://www.techpowerup.com/gpudb/2675/geforce-920m
    It would be surprising if they didn't support Fermi at least partially.

    Though AMD did end support abruptly, given the cost of maintaining a separate aging code path alongside GCN, it probably wasn't worth it, especially with how poor APU sales had been for Llano and Trinity/Richland (the last "new" products to use VLIW). I can't really blame them for killing support for 5-6 year old products that have their roots in the HD 2000 series.

    Call me an AMD apologist, but I think it's fair for them to throw in the towel on this issue.

    I would still put the majority of the blame on the developers, since they are ultimately the ones likely not sticking to spec (I don't know if it's just me, but I have a hard time believing a driver that has had 6-7 years of development is having a DX11 compliance problem). I personally believe that drivers shouldn't need updating unless they are adding new features, improving performance, or fixing compliance issues; this "use the driver to patch the game" thing is quite asinine, imo.

    (Nothing against Nvidia or AMD helping devs fix bugs, just against using the driver to do it. These 400MB driver downloads we get these days are ridiculous.)
     

  11. Spectrobozo

    Spectrobozo Guest

    Messages:
    48
    Likes Received:
    0
    GPU:
    8600GT
    I understand that TeraScale 2/3 is more "outdated" and more different from its successor than Fermi is, but personally, for the comparison, I think that is outweighed by the fact that these cards can actually run those glitched DX11 games at playable framerates, and that Fermi and TeraScale 2/3 shared most of their lifetimes on the market.

    I have to wonder how expensive it would really be for a large company to keep a basic level of support going for a few more years, fixing just the major bugs; the number of people running TeraScale 2/3 GPUs, while a minority, is not near zero yet

    I don't see a big problem with driver sizes; you don't really need to update every month, and with my current Internet the download finishes faster than the 20-30MB packages did back in the day
     
  12. user1

    user1 Ancient Guru

    Messages:
    2,780
    Likes Received:
    1,303
    GPU:
    Mi25/IGP
    If you had three people dedicated to just pre-GCN bug fixes, it could easily cost AMD $200k+ per year. While that might not seem like much, when you consider that AMD's entire R&D budget was around $240M in Q3 2015 versus Nvidia's $330M, and AMD was still hemorrhaging money, it's not all that surprising they cut the low-hanging fruit.

    As for lifespans, I disagree: Fermi was used by Nvidia for longer than AMD used TeraScale 2/3, and that's why you see a difference in support life. AMD's last new TeraScale-based product was in 2012, while Fermi had new silicon up until 2014-15. And I can tell you I do not expect Nvidia to support Fermi past 2017, since they have already dropped Fermi support from their debugger. If Fermi isn't having as many issues, it's probably directly related to its similarity to Kepler.



    Patching bugs in the driver might seem OK on the face of it, but it leaves problems down the line, since the software is never actually fixed.

    For example, Doom 2016 has a reflection shader bug; it was present on both GCN and pre-GCN cards and was "fixed" in the 16.200 branch of AMD's driver.
    Unfortunately for you and me, since the bug wasn't patched in the game itself, it remains present on all non-GCN cards, because 15.301 is the last driver branch that supports pre-GCN. While I can't say for sure whether it's a bug in AMD's driver or a Doom bug (OpenGL 4.3-4.5 was still pretty new to the driver), the fact that it was present on two very different architectures tells me it is more likely Doom.

    If a game had an audio problem, you wouldn't automatically assume it's the audio drivers and then expect Realtek to patch games every time it happened; you would probably submit a bug report to the developer of the game.

    it should also be noted that even with driver support, bugs can linger on older cards. An example that comes to mind is running Crysis 2 (2011) on a GeForce 7800 GTX, which was "fully" supported until 2012. It should technically run fine, since it's a DX9 game and I have run it on an 8500 GT, which is inferior, but instead it crashes and runs like crap. The game runs fine on an X1900 XT, despite ATI having dropped support for it in 2010.

    I do think it's worth pestering DICE about the Frostbite bug, since it seems to be a simple problem with the HUD graphics; if enough people complain about it, they will probably consider patching it, despite what customer support says.

    You are right, though, that driver sizes probably don't matter that much anymore. It still bugs me, since the entire Linux Mesa driver stack is only a few hundred MB (and that includes all the Nvidia, AMD, Intel, and other video drivers).
     
  13. Yxskaft

    Yxskaft Maha Guru

    Messages:
    1,495
    Likes Received:
    124
    GPU:
    GTX Titan Sli
    It baffles me that AMD supports WDDM 1.3 on all its discrete DX11 GPUs (TeraScale 2 and 3) but not on its TeraScale 2 APUs

    If you go back to the DX10 generation, Battlefield Hardline can run on a GTX 260, but a driver version check makes the game refuse to run on the HD 4870. All of Nvidia's DX10 cards also got support for WDDM 1.2, whereas AMD left theirs at WDDM 1.1, having moved them to legacy already before Windows 8's release.


    Fermi also never got finalized DX12 support; we can only suppose Nvidia decided it wasn't worth the resources. It did get WDDM 2.0 and 2.1 support, though.
    Once Volta releases, I wouldn't be surprised if Nvidia moves Fermi to legacy support.
     
    Last edited: Mar 31, 2017
  14. Chastity

    Chastity Ancient Guru

    Messages:
    3,744
    Likes Received:
    1,668
    GPU:
    Nitro 5700XT/6800M
    Development, especially on games that push the envelope of what was done before, can expose bugs in driver code, or the developers may come up with a more optimized methodology that gets incorporated into the drivers. This is why drivers get updated for games.
     
  15. user1

    user1 Ancient Guru

    Messages:
    2,780
    Likes Received:
    1,303
    GPU:
    Mi25/IGP
    I think lack of resources is probably the biggest reason AMD has had inferior long-term support; 2010-2015 was pretty rough for AMD.

    I'm no expert, but from what I have read, it's true that edge cases do pop up from time to time, yet they are the exception, not the rule, when it comes to older graphics cards. New cards are more likely to hit this type of bug, since they don't have nearly the same amount of testing behind them.


    https://www.gamedev.net/topic/666419-what-are-your-opinions-on-dx12vulkanmantle/#entry5215019
    I recommend reading this; it's not the best source and it's a bit old, but it is enlightening.

    here is a snippet
     

  16. Spectrobozo

    Spectrobozo Guest

    Messages:
    48
    Likes Received:
    0
    GPU:
    8600GT
    it's true that Fermi lived longer in the low end, alongside Kepler, with even a 28nm model (a GPU similar to the GT 430, but I think with only a 64-bit memory bus). But if you look at the main cards, like the GTX 560, they were replaced within a few months of the 6800s/6900s being replaced,
    and for a while AMD kept pushing some 6670s, not to mention the "6450" (it even gained an R5 230 rebrand, lol; you can still find those R5 230 legacy cards new in most stores...)

    I'm not aware of how the 7800 handles later DX9 games, but even the HD 3850 had some trouble with Skyrim after a few patches; I remember someone complaining about glitched shadows, I think...




    regarding the driver version check, you can edit a value in the registry to report a newer version. Hardline worked without bugs on the 5850 when I tried it, I think; the bad bugs only started with Battlefront,
    though in BF4 I noticed a problem (a missing layer of mist, I think), but it was not too bad,
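    For reference, that workaround is usually done on the display adapter's entry under the device class key. A hedged sketch, assuming the game compares against the DriverVersion string there (which registry value Hardline actually reads isn't confirmed in this thread, and the spoofed version is an example only); needs admin rights, and back up the key first:

    Code:
    import winreg

    # Display adapters device class GUID; each adapter gets a numbered
    # subkey (0000, 0001, ...). Which registry value the game actually
    # checks is an assumption here, not confirmed in this thread.
    CLASS_KEY = (r"SYSTEM\CurrentControlSet\Control\Class"
                 r"\{4d36e968-e325-11ce-bfc1-08002be10318}\0000")

    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, CLASS_KEY, 0,
                        winreg.KEY_READ | winreg.KEY_SET_VALUE) as key:
        current, _ = winreg.QueryValueEx(key, "DriverVersion")
        print("current:", current)
        # Report a newer driver version (example value only):
        winreg.SetValueEx(key, "DriverVersion", 0, winreg.REG_SZ,
                          "15.301.1901.0")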

    Andromeda in the starting scene is pretty terrible with the black squares; different driver versions (13.12, 15.7.1, 16.2) and lots of variations of settings make no difference. Those games are not playable on the old cards unless you are really tolerant of this kind of thing; I find it too distracting. It's a shame, because on low settings the frame rate is not really a huge problem (from what I've tested; I haven't played those games for long because of the graphical problems)
     
  17. mirh

    mirh Member Guru

    Messages:
    103
    Likes Received:
    5
    GPU:
    MSI HD 7750 1GB
    Mhh no, seriously, it's AMD that sucks SO HARD at OpenGL.
    By the way, it *really* would be helpful if you got the 16.200 branch (or at least even just atioglxx) to work with pre-GCN.

    Well, Mesa has no Raptr/Catalyst/Experience ****, for all that matters then.
    If you just check the driver/display folder, the sizes aren't all that off-scale, imo.

    Me too.
     
    Last edited: Sep 12, 2017
  18. user1

    user1 Ancient Guru

    Messages:
    2,780
    Likes Received:
    1,303
    GPU:
    Mi25/IGP
    AMD's OpenGL support isn't that bad on Windows (it's pretty terrible on Linux); the main thing is that the performance is terrible, not the actual extensions being broken (for the most part).
    The only reason I think it's a problem with Doom is that the Doom beta did not have the performance issues and broken settings.

    as for the WDDM 1.3 issue, it's not that big of a deal; for all you know there was an erratum that broke it.

    That being said, I found a DWORD for you to try:
    KMD_ForceWDDMv1.3, which should be added to HKLM\SYSTEM\ControlSet001\Control\Video\{xxxxxxxx}\0000
    and HKLM\SYSTEM\ControlSet001\Services\amdkmdag\

    you can see if that changes anything, though I imagine it's meant for testing GCN cards.
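    A minimal sketch of trying that, assuming the DWORD just needs to exist with a value of 1 (the value is my guess; it isn't stated above). The {xxxxxxxx} GUID differs per system, so the script lists the Video subkeys and you pick your adapter's entry manually; needs admin rights:

    Code:
    import winreg

    # Experimental: add the KMD_ForceWDDMv1.3 DWORD mentioned above.
    # Assumes a value of 1 (not stated in the thread); no guarantee it
    # does anything on pre-GCN cards. Run as admin, back up first.
    VIDEO_KEY = r"SYSTEM\ControlSet001\Control\Video"
    SERVICE_KEY = r"SYSTEM\ControlSet001\Services\amdkmdag"

    # List the {GUID} subkeys so you can spot your adapter's entry.
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, VIDEO_KEY) as video:
        i = 0
        while True:
            try:
                print(winreg.EnumKey(video, i))
                i += 1
            except OSError:
                break

    def set_dword(path):
        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, path, 0,
                            winreg.KEY_SET_VALUE) as key:
            winreg.SetValueEx(key, "KMD_ForceWDDMv1.3", 0,
                              winreg.REG_DWORD, 1)

    set_dword(SERVICE_KEY)
    # and, with your adapter's GUID filled in:
    # set_dword(VIDEO_KEY + r"\{your-adapter-guid}\0000")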
     
    Last edited: Apr 11, 2017
  19. Spectrobozo

    Spectrobozo Guest

    Messages:
    48
    Likes Received:
    0
    GPU:
    8600GT
    complementing my previous post with more images: Mass Effect Andromeda seems affected pretty badly in some parts; the initial scene is quite terrible,
    https://www.youtube.com/watch?v=8HowoXx2lA8

    after that there is a scene with fewer problems
    https://www.youtube.com/watch?v=EwjeHBBrXqk

    I didn't manage to find TeraScale 3 examples on YouTube; I would be curious to see if there are any similar bugs in the starting scene, which is the easiest part to test.
     
  20. mirh

    mirh Member Guru

    Messages:
    103
    Likes Received:
    5
    GPU:
    MSI HD 7750 1GB
    No, really.
    On Linux they also used to have kernel-side problems on top of the usual plain API ones, but with the AMDGPU driver (and the open userspace very much functional), imo they are doing even better than Nvidia now.

    And then, as for Doom, that's really on the bright side of the fence.
    Be it the merit of its experienced devs, or just it being an AAA game in the spotlight, perf is not so bad: Nvidia cards aren't crazily faster, they are still quite comparable.
    In my testing elsewhere, by contrast, a GT 430 was like 2-3 times faster than an RX 480.

    It's not like there aren't TeraScale cards officially rated for it, either.

    Anyway, too bad my humble laptop is on Manjaro/W7 now, with really no intention of ever testing 10 again.
    But hopefully some fella passing through here will be able to check it.

    EDIT: also worth checking in there: the value KMD_ForceToUseWddmVersion
     
    Last edited: Mar 11, 2020