Radeon Adrenalin Edition 19.11.1 drivers - Download & Discussion

Discussion in 'Videocards - AMD Radeon Drivers Section' started by Hilbert Hagedoorn, Nov 4, 2019.

  1. grumpynator

    grumpynator Member

    Messages:
    33
    Likes Received:
    8
    GPU:
    tuf 5700rx 8gb
    All fine here with the 5700: Modern Warfare runs well, and it's flawless for me on Destiny 2.
     
    Jackalito likes this.
  2. JonasBeckman

    JonasBeckman Ancient Guru

    Messages:
    17,564
    Likes Received:
    2,961
    GPU:
    XFX 7900XTX M'310
    Good to hear. I think Destiny 2 had some issues, so it's nice to see the game performing well. Modern Warfare was, I think, another game that could push the card, but IW has also been active with patches and support, which I'm fairly certain included crash fixes and general stability improvements, though the drivers could still be a deciding factor. :D

    Will be interesting to see what's next. From Phoronix it looks like several major fixes and improvements have landed for Linux and Navi, so that's a good development too. For Windows I suppose we'll just see what 19.11.2 or whatever comes next will bring; Fallen Order lands on the 15th, and that's Frostbite I think(?), even if Respawn are proficient with a custom build of the Source engine (Titanfall 2, and I assume the same goes for Apex too). So maybe something in time for next week, which should also see another cumulative update as part of the regular monthly update cycle for 1903, and maybe the release of the 1909 features and 19H2 after a few delays. (Not so much 1909 anymore at least.)

    And whatever other things could happen before the yearly update driver for December.
     
  3. Jackalito

    Jackalito Master Guru

    Messages:
    584
    Likes Received:
    101
    GPU:
    Radeon RX 6800 XT
    Yeah, I also keep hardware acceleration off, just in case.

    My current undervolt is a 2050 MHz target in Wattman at 1.100 V, and it's been working great across a myriad of games for weeks now. My 5700 XT from Sapphire isn't the top model but the cheaper Pulse, which I managed to grab when it was first released back in August for just 389€ in Spain :)
     
    LocoDiceGR likes this.
  4. JonasBeckman

    JonasBeckman Ancient Guru

    Messages:
    17,564
    Likes Received:
    2,961
    GPU:
    XFX 7900XTX M'310
    The Pulse isn't bad, though it has slightly worse thermals than the Nitro due to the smaller heatsink and one less fan. They all pretty much clock the same, so it mainly comes down to overall silicon quality and where the heat and noise levels land. AMD once again has a pretty broad spread of chip quality: a few overclock or undervolt extremely well, others barely move at all. Add to that the difference between Micron and Samsung GDDR6 modules, what look to be some differences in PCBs and overall wattage, and some other minor stuff.

    The one thing with the Pulse is that the fans run at two separate RPM values, which creates a mild oscillation or vibration that can be a bit irritating depending on noise sensitivity; the Nitro seems to avoid that, though its smaller middle fan behaves a bit differently from the two outer fans. The BIOS and fan controller also use low default speeds and then ramp up suddenly once the GPU goes past 60 degrees (junction temperature, I think), boosting up and then spinning back down toward the target temperature. Thanks to Wattman's annoyances and misreadings, that can lead to the fan spinning up and down as the GPU sensors hover near the target temps or the BIOS default thresholds. :)

    A manual fan curve and careful monitoring of the actual edge and junction temperatures for both GPU core and memory avoids most of that. Undervolting and trimming the core clocks down to what the card actually averages, instead of the 2100-something MHz default at 1.2-something volts, can also shave nearly 30 degrees off the junction temperature and a good 15-20 °C off the edge/surface temperature, while reducing GPU core power draw by 50 or even 100 W. It depends on the specific card though, and some require a bit of tuning to find values that stay stable across benchmarks, games and specific APIs depending on overall GPU load.
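
    To put rough numbers on why that helps, here's a minimal back-of-the-envelope sketch using the usual dynamic-power approximation (power scales roughly with frequency times voltage squared). The wattages, clocks and voltages in it are illustrative assumptions, not measurements from any specific card:

    ```python
    # Rough estimate of GPU core power savings from undervolting/downclocking.
    # Assumes dynamic power ~ f * V^2; all numbers are illustrative assumptions,
    # not measured values from an RX 5700 XT.

    def scaled_power(base_power_w, base_mhz, base_volt, new_mhz, new_volt):
        """Scale a baseline core power figure to new clock/voltage settings."""
        return base_power_w * (new_mhz / base_mhz) * (new_volt / base_volt) ** 2

    stock_power = 180.0                    # assumed stock core power draw (W)
    stock_clock, stock_volt = 2100, 1.20   # roughly the default boost target
    uv_clock, uv_volt = 2000, 1.05         # an example undervolt/downclock

    uv_power = scaled_power(stock_power, stock_clock, stock_volt, uv_clock, uv_volt)
    print(f"Estimated core power: {stock_power:.0f} W -> {uv_power:.0f} W "
          f"({stock_power - uv_power:.0f} W saved)")
    ```

    Real savings also depend on leakage and how the board power limit behaves, so treat it as a ballpark only.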


    That's compounded by the various driver issues and quirks, both in earlier drivers (I'm on 19.7.5 for stability) and in sudden spikes or boosts where core clocks and voltage can briefly exceed the target specs.
    (Plus bugs like the minimum clock state sometimes being ignored entirely, sensor misreadings and more. Wattman is an extensive set of tools, but it has its issues.)

    The defaults in older drivers also appear to be more aggressive than the newer ones, which use more conservative values, including the targets for auto overclock and undervolt, so Wattman behaviour has changed at least somewhat since 19.7 or 19.8 somewhere; I'm not sure of the exact driver.


    It's a solid card. Wattman and AMD's OverDrive API (version 8 now?) have their limits, and changes to the fan controller and BIOS away from the defaults don't always play well with it, but from my own testing over a few weeks and a couple of drivers the card performs really well. As long as settings aren't pushed too high or low it's stable, and it's possible to reduce fan noise, overall thermals and power draw by a nice amount without sacrificing much performance.
    (Somewhat CPU limited here though; the card could probably hit 2000 MHz with more power and fewer hardware restrictions, but memory is what's holding it back.)

    Memory can be finicky too, both because there are two different chip vendors with whatever characteristics they have (it's not shown anywhere), and because the voltage appears to be automatic, starting around 1.35 V I think, with overclocking increasing it further; it's not possible to underclock the memory below the default, at least via Wattman itself. Additional voltage via the PowerPlay table and registry can push a bit past 950, but crashes and error correction can limit that, and there are also ranges where the memory is unstable yet pushing it higher actually becomes stable again.


    There's some additional performance to be had by hitting 920+ here, but stability is finicky, and at first it might even decrease performance if error correction kicks in without fully destabilizing the software used for testing, crashing the driver or producing display artifacts. So I've been keeping it at default; I tested around 880 to 890 with varying and mostly not very good results, so it's back to default for now. (Some titles benefit, some see a decrease, and others just crash.)
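
    Since error correction tends to show up as a silent performance drop rather than a crash, the practical test is to step the clock and compare scores rather than just watching for artifacts. A rough sketch of that loop is below; the clock steps are just example values, and applying the clock and reading a score are left as manual placeholders rather than any real AMD tooling:

    ```python
    # Sketch of a memory-OC sweep: step the clock, run a benchmark, and flag
    # steps where the score drops (a hint that error correction is kicking in).
    # apply_mem_clock() and run_benchmark() are hypothetical placeholders; in
    # practice you would set the clock in Wattman and run a benchmark of choice.
    import statistics

    def apply_mem_clock(mhz: int) -> None:
        print(f"[manual step] set memory clock to {mhz} MHz in Wattman, then press Enter")
        input()

    def run_benchmark() -> float:
        return float(input("benchmark score: "))

    def sweep(start=875, stop=950, step=15, runs=3):
        baseline = None
        for mhz in range(start, stop + 1, step):
            apply_mem_clock(mhz)
            score = statistics.mean(run_benchmark() for _ in range(runs))
            if baseline is None:
                baseline = score
            verdict = "OK" if score >= baseline else "regression (ECC?)"
            print(f"{mhz} MHz: {score:.1f} ({verdict})")

    if __name__ == "__main__":
        sweep()
    ```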

    There's still a lot I don't understand and much more to learn, or at least to try to get some understanding of. The crashes seem power related but driver affected, which makes me wonder if it's a reaction to something changing in the clock or voltage defaults, even for settings that can't be changed in Wattman, and since it takes 1-2 days to show up there's a possibility it also affects timings, but that's just another theory.
    (Less accurate with more uptime, so the GPU or driver software overshoots and the voltage, which could be core, memory, SOC or something else, might go unstable and crash, but it could be just about anything.)


    As for the overall top model, I'm pretty sure that's the 50th Anniversary Edition, especially with a custom water block to get around the blower's limitations (unless undervolted and slightly downclocked), then the stock card, and then the customs, where build balancing and variances make it a bit of a mix between Gigabyte, Sapphire and the others; I don't completely remember who is behind what. :D
    (Although it's about the same in the end, and limits on the core and memory will hamper things before the hardware does, such as power draw and various bits on the PCB, hah.)

    And if it wasn't for the driver uncertainty, then yeah, at around 400 euro (if popularity, demand and markup allow for that pricing) it'd be an easy recommendation, seeing how the card even without tweaking can almost reach Radeon VII levels of performance despite having fewer cores and being marketed more as a mid-to-high-end card rather than AMD's flagship or enthusiast part, though it sort of took that place anyway because it's less expensive and performs pretty well.

    Cost- and performance-wise it's pretty much that and Polaris, now that those are coming down in price, pushing aside Vega and the VII except for productivity or more compute-oriented workloads. Not that they're bad for gaming, but the 5700 is faster and less costly, although stores might set up deals to sell existing stock, so there are probably good deals on a Vega 56 or Vega 64, maybe even the Radeon VII, before that one is completely gone from retail.



    We just need a high-end variant of Navi to see how it scales up in terms of cores and such, and where the resulting clock speeds, power requirements and of course actual performance land. I don't expect a linear increase, but it can probably scale up a fair bit, though AMD looks to be aiming for a lower-end model with Navi 14 or OEM parts first, and then it might be Navi 20 that goes for high-end performance around mid next year, maybe; we'll see.
    Ryzen 4000 should also be out around that point, probably as a refinement over the existing Ryzen 3000, so maybe Zen 2+ or something, but that's just a guess until more info is actually available from official or at least trusted sources. :)



    EDIT: Well, that's a bit of text. Good GPU with some potential software flaws, but if users don't run into any of those it's an extremely good performer for what AMD is positioning as a mid-range product at a lower cost than the generally higher-end pricing segment of GPUs. :)
    It's just a matter of when for the remaining actually confirmed software/driver bugs.
    (Still important though; buying a GPU now only to get full use of it in a month, two or more isn't a great prospect should anything act up.)


    EDIT: Well, all that said, most of what I'm picking up is based on reading plus some testing and experience, but there's a lot more I don't really know well at all, and RDNA and Navi differ from prior GPUs, so it's testing and error, and some more error, before figuring things out bit by bit, at least to some extent.
    (Well, I might be learning a bit; that's a plus at least, compared to what I assumed initially, which then turned out to be quite a different thing entirely as to how this works.)


    Then again, underclocking, undervolting, fan curves, or going for additional overclocking or power limit increases: it's all based on making small changes, testing and maintaining stability, so nothing new in that regard. It's just that the GPU, being more free-form in no longer having actual P-states, allows for a more dynamic clock speed, which takes a bit of time to tune and set.

    And if Wattman has issues there's also Afterburner, though it's inherently limited by OverDrive and its potential issues or restrictions, which I doubt are easily bypassed unless AMD fixes or changes things themselves as part of newer driver releases.
    (Which can happen, but that can also lead to compatibility issues or other problems as well.)
     
    Last edited: Nov 7, 2019
    Paul L, Jackalito and LocoDiceGR like this.

  5. JonasBeckman

    JonasBeckman Ancient Guru

    Messages:
    17,564
    Likes Received:
    2,961
    GPU:
    XFX 7900XTX M'310
    Well I mentioned the fan behavior but it's easier to just link to a better and more in-depth view of how it behaves. :)

    Here are those reviews:
    https://www.igorslab.media/en/sapphire-radeon-rx-5700-xt-pulse-review-navi-with-certain-vibrations/
    https://www.igorslab.media/en/sapph...-pulse-review-navi-with-certain-vibrations/8/

    https://www.igorslab.media/en/sapph...th-less-weight-and-the-best-navi-card-so-far/
    https://www.igorslab.media/en/sapph...-less-weight-and-the-best-navi-card-so-far/8/

    These have the measurements for how the fans behave on the Pulse compared to the Nitro. On the Pulse the fan behaviour has this little hum or whirring; it's just a thing, it doesn't have to be an issue or even very audible, but it's there from how this works. With other noise around or headphones on it probably won't even register for the user, so it's not a loud audible whine or anything either. :)

    There's also the possibility of coil whine but that's just a thing that happens regardless of GPU model for a variety of reasons.

    EDIT: Just mentioning it as something that's part of the GPU; depending on sensitivity it might be noticeable, just some barely-there background noise, or not heard at all, at least until the fan ramps up a bit more.
     
    Last edited: Nov 7, 2019
    Jackalito likes this.
  6. Jackalito

    Jackalito Master Guru

    Messages:
    584
    Likes Received:
    101
    GPU:
    Radeon RX 6800 XT
    Yeah, I'm aware of that, which is why I created a custom fan profile in order to balance temperatures and noise.
    And memory overclock is not a huge problem with my card, as I believe it's Micron.
     
  7. JonasBeckman

    JonasBeckman Ancient Guru

    Messages:
    17,564
    Likes Received:
    2,961
    GPU:
    XFX 7900XTX M'310
    I think AMD's default fan curve caps out around 40%. I don't know exactly how the settings apply to the Pulse or the Nitro, but from the looks of it even the Pulse's cooler is effective enough that most games manage at a lower fan RPM/speed percentage without the noise becoming overbearing, which happens around 50-60%, even though the GPU's settings really try to ramp the fan up over a certain threshold. But yeah, a few tweaks and the fan can be kept near silent. :)

    BIOS-wise it looks like the Pulse has a limit of 107 degrees junction and the Pulse XT 110 degrees, although the fan controller and other settings ramp up well before that limit is hit.
    (That should be the throttle temperature, not the outright GPU-driver-calls-it-quits temperature, though even as a junction temperature it still corresponds to roughly 90-95 °C surface or edge temperature, I believe, so that's pushing it a bit close, and the memory would be pretty hot too, along with the VRMs and such.)

    Well, just thinking out loud really, and trying to remember what I read about how that's all put together. :D

    It would be nice to have more details on the memory too. It sounds like Micron can be pushed a bit higher than the Samsung chips, but there's no detail or sensor info on anything other than temperature, so nothing on chips, timings or the like unless someone decodes it from the BIOS, if it's even in there. Voltage readings most of all, if it's correct that the value is dynamic so it can scale up when pushed higher; the value does seem to be adjustable via soft mod and PowerPlay editing, but if it then scales from the new (higher, I suppose) value, that must be really difficult to get right. From what I'm reading, pushing above 950 MHz seems to be a bit of a challenge, so the default limit, while not that high, might already be above what the majority of the VRAM chips can take without a voltage increase and the stability risk that comes with it. (At best; you probably want to be pretty careful increasing this beyond default.)

    That, and AMD is overvolting and overclocking from the start anyway to hit higher performance, even for the VRAM, though scaling-wise it looks like this is once again where the larger gains come from if you can get past 900 MHz reliably.


    Also an interesting possibility, depending on whether Navi 20 goes with GDDR6 and some refinement of the 7nm process (which I assume the GDDR is also on), or whether AMD takes a chance on HBM again, and if so a refinement of the HBM2 chips. Cost is going to be a factor for the configuration, total memory and the other hardware; the benefits are certainly there, it's just costly even against the newer GDDR types.
    (More bandwidth, lower latency, less voltage and more, though if nearly half the GPU cost is from the memory chips then yeah, that's a thing alright.)

    Well, we shall see once AMD actually unveils "Big" Navi sometime next year, no doubt whenever it's ready, whether at one of the bigger trade shows or something else, whenever the product is in a state where it's presentable as something besides "NVIDIA killer" rumours, leaked device names and bits of info.
    (The marketing department might need a bit of an overhaul too, but then I guess there's been worse.)


    EDIT: Well, just some thoughts and random ideas I suppose. It's going to be fun to see what AMD does with RDNA and the upcoming RDNA2/Navi 20 architecture advancement, once there are details on what it entails and how the GPU will be scaled up, whether it'll be a bigger chip, or at least whether one variant will go for a bigger die and push for a high-end or enthusiast-type product.

    And how the result then stacks up against NVIDIA Turing and NVIDIA's reveal for next year whatever that will be.


    EDIT: So yeah, while it might not be triple-fan, even the Pulse can be pretty much inaudible once configured, even if the percentage and RPM values might differ a bit from what Wattman reports. :D

    Not that it's too problematic: set a higher initial fan curve, figure out where it becomes too loud and use that as roughly the maximum value, set the minimum near the idle temperature at your preferred speed, and build a nice little curve between the two with the remaining points Wattman gives you.
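
    As a small illustration of that "build a curve between two points" idea, here's a sketch that spreads the handful of Wattman-style fan points evenly between a quiet idle setting and the loudest speed you can tolerate. All the temperatures and percentages are example values, not recommended settings:

    ```python
    # Build a simple fan curve between a quiet idle point and a maximum
    # tolerable noise point, using the handful of points Wattman exposes.
    # All temperatures/percentages are example values, not recommendations.

    def fan_curve(idle_temp=45, idle_speed=20, max_temp=95, max_speed=55, points=5):
        """Return (junction_temp_C, fan_percent) pairs, linearly interpolated."""
        curve = []
        for i in range(points):
            t = i / (points - 1)
            temp = idle_temp + t * (max_temp - idle_temp)
            speed = idle_speed + t * (max_speed - idle_speed)
            curve.append((round(temp), round(speed)))
        return curve

    for temp, speed in fan_curve():
        print(f"{temp:3d} °C junction -> {speed:2d}% fan")
    ```

    In practice you'd probably weight more of the points toward the 60 °C+ junction range rather than spacing them evenly, as mentioned further down.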


    Something like that. Hopefully zero-fan actually works this time, for those who use it to ensure the lowest possible fan noise until the GPU hits whatever temperature the fan kicks in at. I'm not entirely sure it does, but a lower initial curve speed could work too.
    (Around 40 °C I think is where it's supposed to kick in, if it works.)


    EDIT: And while it might be a bit finicky in reacting from one step to the next, it should be more of a curve and less of the steep stair-stepping that happened with Vega.
    Although from testing it still likes to stick to the upper values between two temperature thresholds of the fan curve, so it doesn't seem to follow the curvature exactly, even if that's better than immediately shifting between two states.

    More states wouldn't hurt either, but I guess it's a hard limit. Being able to set a point every 10 or even every 5 degrees between 20 and 90 degrees junction would allow finer-grained scaling, though a rougher curve at lower temperatures and a finer one at higher temperatures can work too: put the points you get around the 60-degree mark and above, with one or maybe two markers at the preferred lower speed for some control over the ramp-up.
    (Mainly to keep the GPU core and, importantly, the memory junction temperature nice and controlled.)


    EDIT: Well, if I'm already mid-complaint here (even if it's only minor convenience tweaks), to end this little post: also merge the previous temperature target back in and let us use both, fine control plus a temperature limit where the fan can ramp up instead of the GPU throttling. Wattman is already more of a power tool than a casual flip-this-and-forget-it overclock, so exposing more, adding additional user-controllable settings and bringing back previously working features wouldn't hurt, provided the API didn't completely axe these things. :D
     
    Last edited: Nov 7, 2019
    Paul L, Jackalito and LocoDiceGR like this.
  8. mustdiex

    mustdiex Active Member

    Messages:
    59
    Likes Received:
    21
    GPU:
    580
    Hello sir,
    I have an RX 580 and use a 144 Hz monitor. I get screen tearing below 144 fps; VSync and Enhanced Sync do not work. How can I fix this?

    best regards.
     
  9. GREGIX

    GREGIX Master Guru

    Messages:
    856
    Likes Received:
    222
    GPU:
    Inno3d 4090 X3
    Try windowed mode in games, borderless window or whatever it's called. Do not use any sync with that.
     
  10. Digilator

    Digilator Master Guru

    Messages:
    663
    Likes Received:
    217
    GPU:
    Sapphire 5700XT
    Have you tried Enhanced Sync + FreeSync (no V-Sync in game)? Also, try limiting FPS to 142 globally (and remove any FPS limit in the game settings).

    Which games are affected for you? That kind of info helps.
     

  11. JonasBeckman

    JonasBeckman Ancient Guru

    Messages:
    17,564
    Likes Received:
    2,961
    GPU:
    XFX 7900XTX M'310
    Pretty sure FreeSync is required to eliminate tearing entirely, through supported display hardware, if you're relying on methods like adaptive sync. Enhanced Sync can, however, mitigate the penalties of VSync even if it might retain some image tearing, and assuming the drivers and Windows 10 don't act up it's also possible to use the flip model to decrease latency, though results will vary a bit compared to D3D12 or Vulkan, where as far as I'm aware this is part of the API standard. :)

    Enhanced Sync with VSync on in-game should be similar to the flip model, although it might differ from flip model with discard in terms of how much it reduces latency. With Windows 10 and fullscreen optimizations, where they don't act up or cause a performance penalty, this can also be worked into borderless or windowed modes, though not every title is entirely compatible, which is why per-exe compatibility toggles for disabling the feature exist.

    This is complicated by the varying quality of FreeSync (including FreeSync 2) compatible displays, and by driver issues where it doesn't work or has problems within specific framerate target ranges, though I believe that can be worked around by capping the framerate a bit below the display's maximum refresh rate. I'm not entirely certain how things stand with the current drivers; what I'm reading is pretty mixed but also highlights various panel-specific shortcomings.
    (Should be better with Polaris than Navi at least. I think AMD has improved Enhanced Sync so it's at least stable now, but yeah, the driver situation isn't the best in current drivers, and not just for Navi GPUs.)

    I don't have the hardware myself, though it will probably be part of the next system build, and hopefully AMD can work out the current driver inconsistencies by then. (Not much to be done about varying panel quality and display variances, short of looking up details beforehand where available.)


    EDIT: Maybe I'm mixing up some different features here.
    So VSync would synchronize and avoid tearing, but then it has to wait for the frame to finish, and triple buffering adds latency.

    FreeSync / Adaptive Sync would allow for the hardware to handle this.

    Enhanced Sync attempts a software solution; it's not the same as FreeSync, though it does improve latency.


    But by turning off syncing there's still a chance of visible screen tearing.

    I might need to read up on this again, I think I'm messing it up a bit.
     
    Last edited: Nov 8, 2019
  12. Jayson

    Jayson Active Member

    Messages:
    67
    Likes Received:
    13
    GPU:
    Radeon RX 580 8GB
    For my setup, FreeSync replaces VSync when I'm below max refresh, so all you need to do is enable FreeSync AND VSync and cap your framerate to 140 fps. That has been the most reliable setup for me. I never get any tearing or stutter, and no noticeable increase in input lag of any kind. Very responsive.

    I think a lot of people assume VSync is bad even with FreeSync on. Fact is, if your game is running at a frame rate lower than max refresh, VSync is not active even if it's selected in the settings.
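
    The cap below max refresh is the important bit, since it keeps the frame rate inside the FreeSync window so VSync never actually engages. Just to illustrate what a ~140 fps limiter is doing (a driver-level or in-game FPS cap does this properly; this is only a toy sketch of the pacing idea, with placeholder render work):

    ```python
    # Toy frame limiter: sleep just enough each iteration to hold ~140 fps so
    # the frame rate stays inside a 144 Hz FreeSync window. Illustrative only;
    # in practice you'd use a driver-level or in-game FPS limit instead.
    import time

    TARGET_FPS = 140
    FRAME_TIME = 1.0 / TARGET_FPS

    def render_frame():
        pass  # placeholder for the actual game/render work

    def run(frames=500):
        next_deadline = time.perf_counter()
        for _ in range(frames):
            render_frame()
            next_deadline += FRAME_TIME
            sleep_for = next_deadline - time.perf_counter()
            if sleep_for > 0:
                time.sleep(sleep_for)
            else:
                next_deadline = time.perf_counter()  # fell behind; resync

    if __name__ == "__main__":
        run()
    ```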
     
    JonasBeckman and Krteq like this.
  13. JonasBeckman

    JonasBeckman Ancient Guru

    Messages:
    17,564
    Likes Received:
    2,961
    GPU:
    XFX 7900XTX M'310
    Good, so that's how it is. I had it wrong then, so that clears up what FreeSync is doing when it's in effect. :)

    I think I've also sorted out the black screen issues, at least the variant I was getting, after some thinking: over-polling, and it triggering the surge protection on the motherboard. It takes a while before it happens, but going from 19.7.5 to the newer drivers, and particularly 19.9.3 and newer, it looks like (this is a guess) the GPU is being polled, measured or updated more frequently, and some motherboards might take issue with the frequency at which that occurs. I'll know in a day or two if it truly worked. It would have been easier if that were mentioned in the release notes at all, but I might at least have resolved that issue now; there's just the other stuff, ha ha. :D


    EDIT: Yep, pretty much confirmed stable.
     
    Last edited: Nov 10, 2019
  14. mustdiex

    mustdiex Active Member

    Messages:
    59
    Likes Received:
    21
    GPU:
    580
    Thank you for your reply, but I don't have a FreeSync monitor. Previous drivers didn't have this problem either. I think my relationship with AMD will end like this; I had no such problem when using NVIDIA. I understand that there is no solution. Screen tearing on a 144 Hz monitor is annoying.
     
  15. Grimbarian

    Grimbarian Ancient Guru

    Messages:
    2,109
    Likes Received:
    621
    GPU:
    RTX 3070 Ti
    Is anyone else finding that RDR2 completely ignores any settings in the driver gaming profile? So annoying :(
     

  16. JonasBeckman

    JonasBeckman Ancient Guru

    Messages:
    17,564
    Likes Received:
    2,961
    GPU:
    XFX 7900XTX M'310
    It might be picking up the launcher, so maybe add a manual profile with the same settings for the Rockstar Launcher and the game exe and see if that forces it. It should be easy to tell by setting AF to application-controlled or 2x and comparing the terrain texture detail, though custom profiles are generally more hit and miss.

    Overall the driver suite just needs work, but if needed, turn parallax off (I don't think it has that limitation, but just mentioning it for checking whether AF works or not) and compare; it should be immediately apparent whether the terrain is a blurred mess a short distance from the player or retains its sharpness.
    (If the game itself has an AF setting, that should be off as well when comparing.)

    EDIT: It's RDR2.exe, at least in the profile file; hopefully the driver gets it right too.
    And if it's automatically detected (though I don't think AMD covers the Rockstar client yet), then that profile might be targeting the wrong exe, so deleting/hiding it and creating a new one might work better.
     
  17. Grimbarian

    Grimbarian Ancient Guru

    Messages:
    2,109
    Likes Received:
    621
    GPU:
    RTX 3070 Ti
    Yeah, I added rdr2.exe but it's definitely not picking it up. I'll add the launcher as well, but worst case I'll make my changes through the global settings, which should work fine; I just want to see what it looks like with Radeon's AA/AF rather than the game's :)

    Thanks for the reply :)
     
  18. Eastcoasthandle

    Eastcoasthandle Guest

    Messages:
    3,365
    Likes Received:
    727
    GPU:
    Nitro 5700 XT
    That's not new, unfortunately. I was told a long time ago that AA/AF overrides only worked with DX9 titles. Not sure how true that was.
     
  19. JonasBeckman

    JonasBeckman Ancient Guru

    Messages:
    17,564
    Likes Received:
    2,961
    GPU:
    XFX 7900XTX M'310
    AA overrides and AF do work with D3D11, though for AA it's mostly enhancement rather than forcing a specific level of MSAA. Anisotropic filtering is easy enough to verify by how texture detail completely falls apart a short distance from the camera when the game's own setting is disabled or non-existent. AA is a bit tougher to verify, short of MLAA, but in cases where MSAA is applied, enhancements like adaptive AA do have an effect, though compatibility varies a lot.

    Assassin's Creed Origins and Odyssey, for example, will show pixelated edges if set to enhance; adaptive works for the other AA setting, but I think SSAA only applies to D3D9, and the title itself needs to be using some form of MSAA.
    (Which Red Dead Redemption 2 does support, odd as it is to see it alongside resolution scaling, temporal AA and a shader-based AA, all of which the game also offers.)

    There's just a really large performance hit, and with all the shaders and other effects in modern games its effectiveness isn't that good for how costly it is, although specific use cases exist, such as how The Witcher 3 used it for HairWorks, though even then it was quite costly (initially forcing 8x only could have something to do with that, though :p ). For RDR2, reflection MSAA might be usable without too large a hit, improving how SSR looks for polygonal objects near the camera that are fully captured and reflected in water and such; the distant low-detail landscape reflections are blurry anyway and wouldn't benefit much, since there's not much to anti-alias.
    (Mirrors perhaps, if the game has working ones, but those tend to use all sorts of tricks, as seen in games like Mafia 3 for a nice example of maybe optimizing it a wee bit too much. :D )


    EDIT: I guess there are also non-ReShade sharpening effects now via the display settings and upscaling, with the possibility of D3D11 support in a later driver update, though D3D12 and Vulkan should be supported.
    (I assume the upscaling possibility is why it lives there and not in the main 3D overrides and options, though I wouldn't mind seeing the two split into separate toggles.)
     
  20. bernek

    bernek Ancient Guru

    Messages:
    1,633
    Likes Received:
    93
    GPU:
    Sapphire 7900XT
    How did you find it? Where was it hiding, and most importantly, was it only active during full 3D load on the system so you wouldn't notice?
     
