AMD Radeon Adrenalin Edition 18.2.3 Drivers Download & Discussion

Discussion in 'Videocards - AMD Radeon Drivers Section' started by Hilbert Hagedoorn, Feb 22, 2018.

  1. OnnA

    OnnA Ancient Guru

    Messages:
    17,961
    Likes Received:
    6,819
    GPU:
    TiTan RTX Ampere UV
    Dirt 4 working perfectly w/ FreeSync & modded Fury HBM :D
    Finished 2 runs today for fun.

    BF1 = OK
    SW BF2 = OK
    HoMM VII = OK
    FH3 & FM7 = OK
     
  2. Alessio1989

    Alessio1989 Ancient Guru

    Messages:
    2,952
    Likes Received:
    1,244
    GPU:
    .
    You know that, talking about standard deviation, a couple of milliseconds (literally 1-2 ms) means no stutter at all when playing a video on a screen whose refresh rate is not a multiple of the video's frame rate? Everything is solved by buffering, interpolation and synchronization. On Windows, the DXGI APIs are all we need.
    Talking about video games, if you have stuttering in pre-recorded cut-scenes it's only due to improper use of the API, like using render-to-texture techniques without changing the presentation settings.
    If you have other issues it's just your system, not the lack of the right tool for the job. And from what you wrote, it's obvious you've never written a single line of real code, especially low-level graphics code.
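    For anyone wondering what "the DXGI APIs" means in practice, here is a minimal hedged sketch of the standard D3D11/DXGI presentation path (my own illustration, not anything from a driver or player): Present(1, 0) asks DXGI to wait for the next vblank, which is the buffering/synchronization referred to above. Error handling is omitted to keep it short.

```cpp
// Minimal D3D11 + DXGI presentation sketch (illustration only).
#include <windows.h>
#include <d3d11.h>
#pragma comment(lib, "d3d11.lib")
#pragma comment(lib, "user32.lib")

static LRESULT CALLBACK WndProc(HWND h, UINT m, WPARAM w, LPARAM l) {
    if (m == WM_DESTROY) { PostQuitMessage(0); return 0; }
    return DefWindowProcW(h, m, w, l);
}

int main() {
    // Bare window to present into.
    WNDCLASSW wc = {};
    wc.lpfnWndProc   = WndProc;
    wc.hInstance     = GetModuleHandleW(nullptr);
    wc.lpszClassName = L"present_demo";
    RegisterClassW(&wc);
    HWND hwnd = CreateWindowW(wc.lpszClassName, L"present demo", WS_OVERLAPPEDWINDOW,
                              CW_USEDEFAULT, CW_USEDEFAULT, 800, 600,
                              nullptr, nullptr, wc.hInstance, nullptr);
    ShowWindow(hwnd, SW_SHOW);

    // Device + swap chain: the swap chain is the DXGI object that owns presentation.
    DXGI_SWAP_CHAIN_DESC sd = {};
    sd.BufferCount       = 2;
    sd.BufferDesc.Format = DXGI_FORMAT_R8G8B8A8_UNORM;
    sd.BufferUsage       = DXGI_USAGE_RENDER_TARGET_OUTPUT;
    sd.OutputWindow      = hwnd;
    sd.SampleDesc.Count  = 1;
    sd.Windowed          = TRUE;
    sd.SwapEffect        = DXGI_SWAP_EFFECT_DISCARD;

    ID3D11Device*        dev  = nullptr;
    ID3D11DeviceContext* ctx  = nullptr;
    IDXGISwapChain*      swap = nullptr;
    D3D_FEATURE_LEVEL    fl   = {};
    D3D11CreateDeviceAndSwapChain(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                  nullptr, 0, D3D11_SDK_VERSION,
                                  &sd, &swap, &dev, &fl, &ctx);

    // Present a couple of seconds of frames. SyncInterval = 1 means "wait for the
    // next vblank": DXGI handles the queueing and pacing, which is why a millisecond
    // or two of render-time jitter never reaches the screen.
    MSG msg;
    for (int i = 0; i < 120; ++i) {
        while (PeekMessageW(&msg, nullptr, 0, 0, PM_REMOVE)) DispatchMessageW(&msg);
        swap->Present(1, 0);
    }

    swap->Release(); ctx->Release(); dev->Release();
    return 0;
}
```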
     
    Last edited: Feb 25, 2018
    Jackalito likes this.
  3. iakoboss7

    iakoboss7 Member Guru

    Messages:
    153
    Likes Received:
    25
    GPU:
    Sapphire RX 480 8GB
    Then I guess you are super smart, and madshi (and everyone else) is super dumb for not being able to enable FreeSync?
    Yeah...
     
  4. Shellar

    Shellar Guest

    Messages:
    102
    Likes Received:
    11
    GPU:
    RX 5700 XT
    PetBB, can your display show the current refresh rate via its OSD? If so, instead of arguing, show us Adaptive-Sync / FreeSync in action using your movie player of choice.
     
    Jouven likes this.

  5. Eastcoasthandle

    Eastcoasthandle Guest

    Messages:
    3,365
    Likes Received:
    727
    GPU:
    Nitro 5700 XT
    I experienced a similar problem and posted about it in this forum, not in this thread but another one.
    Can you post a screenshot of Memory from Task Manager's Performance tab?
     
  6. Agonist

    Agonist Ancient Guru

    Messages:
    4,287
    Likes Received:
    1,316
    GPU:
    XFX 7900xtx Black
    Wish my monitor did. But I have no FreeSync issues, even with 21:9 4K movies.
     
  7. PetBB

    PetBB Member

    Messages:
    13
    Likes Received:
    5
    GPU:
    Radeon VII, R9 Fury
    Sure. Here's the link: Freesync in VLC
    I used a 50 fps movie. As the display I used an Asus XG258Q; it has a FreeSync range of 48-240 Hz. I chose this model for the demonstration because it has the most stable fps counter implemented in its OSD. I have plenty of other FreeSync models at work and I must say their OSD fps counters aren't very precise. Especially when LFC kicks in they usually show strange numbers, but the motion is always fine.
    I used the 64-bit version of VLC. For FreeSync to work it may be necessary to switch the video output module to Direct3D or OpenGL. Also, since the 18.2.3 driver, it's mandatory to create a Radeon Settings profile for VLC with FreeSync forced on.

    I recommend this demo for testing: Freesync demonstration. It shows that a stable framerate with FreeSync isn't anything strange.

    FreeSync is also amazing for emulators. Some games were designed for odd framerates, like R-Type at 55 Hz or some Williams and Midway arcade boards that run at 53 Hz. After many years, thanks to FreeSync, I can finally play them with smooth animation.
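    As an aside on the "strange numbers" OSDs show when LFC works: below is a rough sketch of the general idea only (not AMD's actual algorithm, which is more sophisticated). When the content rate falls below the bottom of the VRR range, each frame is repeated an integer number of times, so the panel can legitimately report a refresh rate that differs from the content fps.

```cpp
// Rough sketch of the low framerate compensation (LFC) idea, NOT AMD's actual
// algorithm: when the content frame rate falls below the display's minimum VRR
// rate, each frame is shown an integer number of times so the effective
// refresh rate lands back inside the supported range.
#include <cstdio>

struct VrrRange { double minHz, maxHz; };

// Returns how many times each frame must be shown, or 0 if the content rate
// cannot be matched without dropping frames.
int lfcMultiplier(double contentFps, VrrRange r) {
    if (contentFps > r.maxHz) return 0;
    int m = 1;
    while (contentFps * m < r.minHz) ++m;          // repeat frames until in range
    return (contentFps * m <= r.maxHz) ? m : 0;
}

int main() {
    VrrRange xg258q{48.0, 240.0};                  // range quoted in the post
    const double samples[] = {23.976, 29.97, 50.0, 53.0, 55.0};
    for (double fps : samples) {
        int m = lfcMultiplier(fps, xg258q);
        std::printf("%7.3f fps content -> show each frame %dx -> panel runs at %7.3f Hz\n",
                    fps, m, fps * m);
    }
    return 0;
}
```

    For 23.976 fps content on the 48-240 Hz panel above this gives 3x repetition, i.e. roughly 71.9 Hz on the panel, which is exactly the kind of odd figure an OSD counter ends up showing.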

    Of course, but I don't understand the connection between memory usage and FreeSync. In what situation should I take this screenshot?
     
    Last edited: Feb 27, 2018
    Shellar likes this.
  8. Wolf2079

    Wolf2079 Member

    Messages:
    14
    Likes Received:
    0
    GPU:
    NITRO RX 580 8 GB
    Good day, I need help. From one day to the next my Sapphire Nitro RX 580 8 GB started showing its memory clock at 2000 MHz at idle, when the normal value is 300 MHz. I've installed drivers 17.12.1, 18.2.1 and now 18.2.3 and the problem persists. Any suggestions would be appreciated, greetings.
     
  9. PetBB

    PetBB Member

    Messages:
    13
    Likes Received:
    5
    GPU:
    Radeon VII, R9 Fury
    Maybe this is it: https://bugzilla.mozilla.org/show_bug.cgi?id=1417241
     
  10. Eastcoasthandle

    Eastcoasthandle Guest

    Messages:
    3,365
    Likes Received:
    727
    GPU:
    Nitro 5700 XT
    The bottom portion. The small bar graph with Memory composition, down to the Non-paged pool figure, will do.
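    (If a screenshot is inconvenient, the same pool figures from that part of the Performance tab can also be read programmatically; here's a small hedged sketch using the Win32 GetPerformanceInfo call. Note it exposes the paged/non-paged kernel pool sizes and overall physical memory, not the standby/modified list split itself.)

```cpp
// Hedged sketch: read the pool counters Task Manager's Performance tab shows,
// via GetPerformanceInfo (psapi). Values are reported in pages, so convert.
#include <windows.h>
#include <psapi.h>
#include <cstdio>
#pragma comment(lib, "psapi.lib")

int main() {
    PERFORMANCE_INFORMATION pi = {};
    pi.cb = sizeof(pi);
    if (!GetPerformanceInfo(&pi, sizeof(pi))) {
        std::fprintf(stderr, "GetPerformanceInfo failed: %lu\n", GetLastError());
        return 1;
    }
    const double pageMiB = pi.PageSize / (1024.0 * 1024.0);   // MiB per page
    std::printf("Physical total    : %.0f MiB\n", pi.PhysicalTotal * pageMiB);
    std::printf("Physical available: %.0f MiB\n", pi.PhysicalAvailable * pageMiB);
    std::printf("Kernel paged pool : %.0f MiB\n", pi.KernelPaged * pageMiB);
    std::printf("Kernel non-paged  : %.0f MiB\n", pi.KernelNonpaged * pageMiB);
    return 0;
}
```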
     

  11. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    Technically speaking, it is the player's fault. There has been a solution since the day the first encoder with motion vectors arrived. Videos are encoded as a sequence of images plus the changes applied to them, and those changes are represented by motion vectors (they have a direction and length, and the time at which each change happens is defined by the encoded Hz). You can halve each motion vector and double the number of frames generated from the video, or you can calculate time-correct images from a 24 Hz video for the 60 fps stream going to the monitor.
    In all honesty, I would consider that 18 Hz video unwatchable anyway. But the best solution is to get a 120 Hz+ screen; then the frame time variance drops to at least half of what it is at 60 Hz for all content.
    Another way is to use video post-processing (if available via the driver or a browser plugin). Lastly, the video can be downloaded and played in a player capable of fps upscaling.
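    A toy sketch of the timestamp math behind this (my own illustration, not tied to any particular player or codec): for each output frame of a 60 fps stream, find the two surrounding 24 Hz source frames and the factor by which their motion vectors would be scaled.

```cpp
// Toy sketch of motion-vector-based frame rate conversion, not a real
// interpolator: map each 60 Hz output timestamp onto the 24 Hz source timeline
// and compute the blend factor, i.e. how far to scale the motion vectors
// between the two neighbouring source frames.
#include <cmath>
#include <cstdio>

int main() {
    const double srcFps = 24.0, dstFps = 60.0;

    for (int i = 0; i < 10; ++i) {                       // first 10 output frames
        double t      = i / dstFps;                      // output timestamp (s)
        double srcPos = t * srcFps;                      // position in source timeline
        int    prev   = static_cast<int>(std::floor(srcPos));
        int    next   = prev + 1;
        double alpha  = srcPos - prev;                   // 0..1: scale the motion
                                                         // vectors from 'prev' by this
        std::printf("out %2d @ %6.2f ms: blend src %d -> %d, vector scale %.3f\n",
                    i, t * 1000.0, prev, next, alpha);
    }
    return 0;
}
```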
     
  12. PetBB

    PetBB Member

    Messages:
    13
    Likes Received:
    5
    GPU:
    Radeon VII, R9 Fury
    Thanks for the advice. Of course I know I can use post-processing methods to render the movie at a different fps; for example, I like the SVP project very much. But I also like to watch movies in their original form, without the distortions caused by post-processing filters. When using portals like YouTube or FB to watch a few shorts, I don't want to spend time downloading them and playing them in a proper player, or re-encoding them all. With FreeSync it just works. There is a reason why, when we watch BD discs from a standalone Blu-ray player on a real TV set, it chooses a 23.976 Hz refresh rate and not just 60 Hz.

    My original post wasn't about looking for help to work around the problem. Like I said, a simple Radeon Settings profile solves the problem for video players. It was just to communicate what changed in the latest drivers, to spare some time for other videophiles like me :) I'm still interested if someone knows how to restore FreeSync in Opera or Firefox with this driver.
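    On the Blu-ray player point, the reason refresh-rate matching (or FreeSync) matters is the uneven frame cadence you otherwise get. A small sketch of the arithmetic, assuming a plain fixed 60 Hz output with no interpolation:

```cpp
// Sketch of why players switch the display to 23.976 Hz: at a fixed 60 Hz the
// frames of 23.976 fps film content can only change on vblanks, so they end up
// held for an uneven mix of 3 and 2 refresh periods (the classic 3:2-style
// cadence), which is visible as judder in slow pans.
#include <cstdio>

int main() {
    const double contentFps = 24000.0 / 1001.0;   // 23.976...
    const double panelHz    = 60.0;
    const double vblank     = 1000.0 / panelHz;   // ms per refresh

    double nextFrameTime = 0.0;                   // when the next film frame is due (ms)
    int shownOnVblank = 0, lastShown = -1, held = 0;

    for (int v = 0; v < 20; ++v) {                // simulate 20 vblanks
        double now = v * vblank;
        if (now >= nextFrameTime) {               // a new film frame is ready
            ++shownOnVblank;
            nextFrameTime += 1000.0 / contentFps;
        }
        if (shownOnVblank != lastShown) {
            if (lastShown >= 0)
                std::printf("film frame %d held for %d refreshes (%.1f ms)\n",
                            lastShown, held, held * vblank);
            lastShown = shownOnVblank;
            held = 0;
        }
        ++held;
    }
    // With a matched 23.976 Hz mode (or FreeSync), every frame is held equally:
    std::printf("matched refresh: every frame held %.2f ms\n", 1000.0 / contentFps);
    return 0;
}
```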
     
  13. Romulus_ut3

    Romulus_ut3 Master Guru

    Messages:
    780
    Likes Received:
    252
    GPU:
    NITRO+ RX5700 XT 8G
    Could you elaborate on the bold part, please? GCN 1.0 cards have feature levels 11_0 and 11_1 too. I am sure that too is to be blamed on NVIDIA?
     
  14. JonasBeckman

    JonasBeckman Ancient Guru

    Messages:
    17,564
    Likes Received:
    2,961
    GPU:
    XFX 7900XTX M'310
    Feature levels have mandatory and optional support components. :)
    (To put it simply.)

    https://en.wikipedia.org/wiki/Feature_levels_in_Direct3D

    You can see the red and yellow cells in the charts a bit further down, although I can't really say how important these are overall, or how well the feature levels and backwards compatibility are actually utilized in the games themselves. Not too well, I would imagine, unless the game was built from the ground up to support DX12, or on the other side Vulkan over OpenGL, with DOOM and Wolfenstein standing as showcases for that API.
    (And, little by little, CroEngine and Serious Sam Fusion have features such as swapping from one API to another in-game, which is unusual, and then there's Ashes of the Singularity showing mixed multi-GPU support, which is also pretty rare.)

    EDIT: For Windows 10 they've also updated WDDM, the device driver model, a few times; I think it's at 2.3 now. Then there's DXGI as well, up to 1.6 now I think (some smaller features in RS3, was it?), in addition to D3D itself, which is currently at 11.4 and 12.1 for D3D11 and D3D12 respectively, as both are actively updated.
    (MSDN and TechNet have more in-depth articles including the changes, but it's fairly technical and nothing really important for regular users, more for developers.)
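    To make the "mandatory vs optional" point concrete, here is a small hedged sketch using the public D3D11 API (nothing driver-specific, and it needs a Windows 10 SDK for the OPTIONS2 query): it requests the highest feature level from a short list and then reads one of the optional capability structures, which is exactly where hardware at the same feature level can still differ (the red/yellow cells in that Wikipedia chart).

```cpp
// Sketch: request a feature level, then query optional capabilities that the
// feature level alone does not guarantee.
#include <windows.h>
#include <d3d11.h>
#include <cstdio>
#pragma comment(lib, "d3d11.lib")

int main() {
    const D3D_FEATURE_LEVEL wanted[] = { D3D_FEATURE_LEVEL_11_1, D3D_FEATURE_LEVEL_11_0 };
    ID3D11Device*        dev = nullptr;
    ID3D11DeviceContext* ctx = nullptr;
    D3D_FEATURE_LEVEL    got = {};

    HRESULT hr = D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                   wanted, ARRAYSIZE(wanted), D3D11_SDK_VERSION,
                                   &dev, &got, &ctx);
    if (FAILED(hr)) {
        std::printf("D3D11CreateDevice failed: 0x%08lX\n", static_cast<unsigned long>(hr));
        return 1;
    }
    std::printf("Granted feature level: %d_%d\n", got >> 12, (got >> 8) & 0xF);

    // Optional features: present on some hardware, absent on other hardware at
    // the very same feature level.
    D3D11_FEATURE_DATA_D3D11_OPTIONS2 opt2 = {};
    if (SUCCEEDED(dev->CheckFeatureSupport(D3D11_FEATURE_D3D11_OPTIONS2,
                                           &opt2, sizeof(opt2)))) {
        std::printf("Conservative rasterization tier: %d\n", opt2.ConservativeRasterizationTier);
        std::printf("Tiled resources tier           : %d\n", opt2.TiledResourcesTier);
        std::printf("Rasterizer ordered views       : %s\n", opt2.ROVsSupported ? "yes" : "no");
    }

    ctx->Release(); dev->Release();
    return 0;
}
```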


    I'm pretty much an amateur myself, if even that, so I have a lot more to learn too, but I'm trying to stay somewhat up to date. It's a pretty broad subject with years of development behind it, and that's just D3D itself as one part of DirectX, which in turn is part of Windows, which has also changed since XP, Vista, 7, 8 and now 10 :D

    (The little numbered notes on that Wikipedia article should link to more in-depth info for each respective component or part of 12_0 and 12_1 support, and feature levels in general.)

    Might help a little bit, unless I misunderstood the question entirely; if so, I'm sorry about that. :)


    It doesn't really help when they (EDIT: AMD, NVIDIA) divide games up between themselves too, and you get various levels of support both pre- and post-release, to name just one thing.
     
    Last edited: Feb 27, 2018
  15. S3r1ous

    S3r1ous Member Guru

    Messages:
    152
    Likes Received:
    25
    GPU:
    Sapphire RX 6700
    I believe pretty much all problems center around the implementation.
    You can get nearly everything to work even with older APIs / shader models / feature levels, as long as it's not hardware dependent.
    Then again, sometimes it's deemed not worth the effort to redesign from scratch or do a huge overhaul, and developers simply go with whatever is easier at first. They may take up the effort later for larger patches or when they are creating something new, though.
    If the people working on it are really passionate and care, they will probably address all or most of the issues over time.
     

  16. Turanis

    Turanis Guest

    Messages:
    1,779
    Likes Received:
    489
    GPU:
    Gigabyte RX500
    A note for Radeon owners:

    This driver is a disaster in the Futuremark API Overhead test. In games it's so-so, not a plus or a minus in fps.

    API Overhead:
    minus 172.200 vs 18.1.1 (DX11, single-thread & multi)

    minus 188.900 vs 18.1.1 (Vulkan).
     
  17. Eastcoasthandle

    Eastcoasthandle Guest

    Messages:
    3,365
    Likes Received:
    727
    GPU:
    Nitro 5700 XT
    Tip: when using the API Overhead test...

    1. Use RAMMap to empty both the Standby list and the Modified page list.
    2. Search for AMD's DxCache and GLCache folders and delete their contents. They should be the first two entries in your Explorer search; an entry or two might not get deleted (a sketch automating this step follows after the list).
    3. Start 3DMark and run the API Overhead test.
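    For anyone who would rather script step 2 than click through Explorer, here is a hedged C++17 sketch. The folder locations are an assumption on my part (the usual %LOCALAPPDATA%\AMD\DxCache and %LOCALAPPDATA%\AMD\GLCache), so check where they actually live on your install first; files the driver currently holds open will simply be skipped, which matches the "an entry or two might not get deleted" note.

```cpp
// Hedged sketch: empty AMD's shader cache folders (step 2 above).
// The paths are assumptions; verify them on your own system before running.
#include <cstdlib>
#include <filesystem>
#include <iostream>
#include <vector>

namespace fs = std::filesystem;

static void clearCache(const fs::path& dir) {
    std::error_code ec;
    if (!fs::exists(dir, ec)) { std::cout << dir << " not found\n"; return; }

    std::vector<fs::path> entries;                       // collect first, delete after
    for (const auto& e : fs::directory_iterator(dir, ec))
        entries.push_back(e.path());

    for (const auto& p : entries) {
        fs::remove_all(p, ec);                           // deletes files and subfolders
        if (ec) std::cout << "skipped " << p << " (" << ec.message() << ")\n";
    }
}

int main() {
    const char* local = std::getenv("LOCALAPPDATA");
    if (!local) { std::cerr << "LOCALAPPDATA is not set\n"; return 1; }

    const fs::path base = fs::path(local) / "AMD";       // assumed cache location
    clearCache(base / "DxCache");                        // DirectX shader cache
    clearCache(base / "GLCache");                        // OpenGL shader cache
    return 0;
}
```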
     
  18. JonasBeckman

    JonasBeckman Ancient Guru

    Messages:
    17,564
    Likes Received:
    2,961
    GPU:
    XFX 7900XTX M'310
    Shouldn't the benchmark be run twice, since the first run would be caching the shaders again? Assuming the user has DirectX shader caching enabled, of course, or has overridden it per profile or at the global level.

    I had to disable it, since after a few driver releases it started caching Windows processes (Start menu, Explorer and a few others), which wasn't entirely stable. The drivers will still cache OpenGL data though, and that can't be overridden.
    (Steam, for example, not that I'm a fan of it, has run two OpenGL test executables on startup since a few versions back of that client.)

    Per-profile overrides might work, though not all overrides do, so it could be a bit buggy; shader caching itself should still work, last I tried. It helps smooth things out a little for older games that don't have a cache of their own, though it's mostly about loading times from what I've tested, although very shader-heavy games might see a reduction in stutter too, depending on how the shaders are compiled.

    Call of Duty and its extremely lengthy initial load times, ha ha. It does store the cache data, but patches and driver updates will invalidate it, requiring another lengthy wait for everything to compile again.

    Dishonored 2 is an interesting one as well: it ships with a pre-compiled shader cache, but if AMD's caching is enabled it'll take minutes to reach the main menu, as the driver is now caching the shaders too. It's a one-time thing, though, and might be faster on a modern system with decent RAM and an SSD instead of an HDD to speed up reading and writing the data.

    (And more, I guess; I could see benchmarks having a decent set of shaders that need to be compiled or cached.)
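    A trivial hedged sketch of the "run it twice" point, timing two back-to-back launches of whatever workload is being tested (the executable name is a placeholder, not a real 3DMark command line): the first run pays the shader compile/cache cost, the second shows the warmed-up number.

```cpp
// Hedged sketch: time two consecutive runs of a workload so the warm-up run
// (shader compilation / cache population) can be identified and discarded.
#include <chrono>
#include <cstdio>
#include <cstdlib>

int main() {
    const char* cmd = "your_benchmark.exe";   // hypothetical placeholder command
    for (int run = 1; run <= 2; ++run) {
        auto t0 = std::chrono::steady_clock::now();
        std::system(cmd);                     // blocks until the process exits
        auto t1 = std::chrono::steady_clock::now();
        double sec = std::chrono::duration<double>(t1 - t0).count();
        std::printf("run %d (%s): %.2f s\n",
                    run, run == 1 ? "cold, compiles/caches shaders" : "warm cache", sec);
    }
    return 0;
}
```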
     
  19. JonasBeckman

    JonasBeckman Ancient Guru

    Messages:
    17,564
    Likes Received:
    2,961
    GPU:
    XFX 7900XTX M'310
    Also, this driver and some of the previous ones have been even more unstable with regard to fan speed settings than older ones. It's a minor issue overall, but it's another cold week here with the doors open: the GPU is idling at 22 degrees and barely pushing into the low 60s in very demanding games, yet the fan still ramps up to 100%, ha ha. It's been a known Wattman issue for a long time, with some GPU models just being very sensitive and prone to incorrect values and bursts of running at full speed even at low temperatures or workloads. :D
    (Just trying to lock in a speed in Wattman shows the reported value being off by a few hundred rpm, and features like zero-speed fan are mostly non-functional, although I prefer having the fan spin at a constant minimum value myself.)
     
  20. Turanis

    Turanis Guest

    Messages:
    1,779
    Likes Received:
    489
    GPU:
    Gigabyte RX500
    I always do benchmarks with a fresh driver install, and then I let Windows sit for ~30 minutes to finish its internal tasks. :)

    This 18.2.3 for Win7/8.1 is not good in terms of DX11 or Vulkan performance vs 18.1.1.
     
