Radeon Software Crimson ReLive Edition 17.2.1 Download & Discussion

Discussion in 'Videocards - AMD Radeon Drivers Section' started by Warkratos, Feb 13, 2017.

  1. THEAST

    THEAST Guest

    Messages:
    221
    Likes Received:
    26
    GPU:
    GTX 3080
    No, it isn't. I finally updated from 16.11.5 because you said the problem with the EDID override had been fixed, but it hasn't; even the WHQL driver still completely ignores all custom resolutions after a reboot.

    The UVD bug is also still there (as expected).

    DX12 Linked-node Multi-Adapter also still does not work for non-symmetric setups like mine (I shouldn't expect AMD to fix this, though).

    Finally, just like with every other driver released in the past year, not only did my 3DMark score not increase at all (unlike certain people who keep reporting score increases for every single driver), it is still around 150 points lower than my best score with v16.3.2, which came out nearly a year ago.

    Afterburner seems to be working correctly though; probably the overclocking API issue plaguing most older GCN cards does not apply to GCN v1.0.
     
    Last edited: Feb 24, 2017
  2. Alessio1989

    Alessio1989 Ancient Guru

    Messages:
    2,952
    Likes Received:
    1,244
    GPU:
    .
    Are all three GPUs set up as a single CrossFire unit in the GPU control panel? I do not have 3 GPUs to verify whether 3-way CFX is supported in D3D12, but they need to be seen as a single adapter with 3 nodes to enable Linked-node mode. Otherwise they can only be used in multi-adapter mode (one 7950 and one 7970 as two separate nodes).
    In both cases, the application needs to support that hardware configuration.
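    If you want to check what the driver is actually exposing, a quick D3D12 probe like the sketch below will tell you (rough code I wrote for this post, not taken from any official sample): with CFX enabled you should see a single adapter reporting more than one node; with CFX disabled, each card shows up as its own adapter with one node.

        // Quick probe: list every DXGI adapter and how many D3D12 nodes it exposes.
        // Linked-node (CFX) mode is only possible when a single adapter reports > 1 node.
        #include <dxgi1_4.h>
        #include <d3d12.h>
        #include <wrl/client.h>
        #include <cstdio>
        #pragma comment(lib, "dxgi.lib")
        #pragma comment(lib, "d3d12.lib")
        using Microsoft::WRL::ComPtr;

        int main() {
            ComPtr<IDXGIFactory4> factory;
            CreateDXGIFactory1(IID_PPV_ARGS(&factory));

            ComPtr<IDXGIAdapter1> adapter;
            for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
                DXGI_ADAPTER_DESC1 desc;
                adapter->GetDesc1(&desc);

                ComPtr<ID3D12Device> device;
                if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                                IID_PPV_ARGS(&device)))) {
                    // GetNodeCount() is the "number of GPUs" an application sees
                    // inside this adapter.
                    wprintf(L"%s: %u node(s)\n", desc.Description, device->GetNodeCount());
                }
            }
            return 0;
        }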
     
    Last edited: Feb 24, 2017
  3. THEAST

    THEAST Guest

    Messages:
    221
    Likes Received:
    26
    GPU:
    GTX 3080
    Actually, I just have two cards (maybe the way I wrote my setup info is misleading): one 7950 and one 7970 running in CF. They work just fine together in DX11 but don't work at all in DX12 Linked-node Multi-Adapter. More info can be found in this thread.
     
    Last edited: Feb 24, 2017
  4. AlleyViper

    AlleyViper Master Guru

    Messages:
    562
    Likes Received:
    116
    GPU:
    Strix 2070S|6700XT
    TBH, nothing I play requires more than 8GB. So the next 2x8GB or 2x16GB kit I ever buy will be DDR4, for another platform to replace this resilient AM3 fossil.
     

  5. Alessio1989

    Alessio1989 Ancient Guru

    Messages:
    2,952
    Likes Received:
    1,244
    GPU:
    .
    You can test DirectX 12 linked adapter mode with this sample: https://1drv.ms/u/s!AgiWqxyHqa0mgcIHl6IXfTdcD_9wHA

    Both D3D12LinkedGpus.exe and D3D12LinkedGpusAffinity.exe should run in AFR mode if you enable CFX in the GPU control panel (both samples should provide more or less the same performance). D3D12SingleGpu.exe is provided just for reference and will use only one GPU.
    You may need to install the latest Visual C++ redistributable (x64) to run those samples: https://www.microsoft.com/en-us/download/details.aspx?id=53840
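    For what it's worth, the per-node setup behind those samples boils down to something like the sketch below (my own rough code, not taken from the samples): with CFX on, the driver exposes a single adapter, each physical GPU becomes a node you address through a node mask, and the AFR path then records and presents frame N on node N modulo the node count.

        // Rough sketch of the per-node setup used by linked-node AFR:
        // one direct command queue per node of the (single) linked adapter.
        #include <dxgi1_4.h>
        #include <d3d12.h>
        #include <wrl/client.h>
        #include <vector>
        #include <cstdio>
        #pragma comment(lib, "dxgi.lib")
        #pragma comment(lib, "d3d12.lib")
        using Microsoft::WRL::ComPtr;

        int main() {
            ComPtr<IDXGIFactory4> factory;
            CreateDXGIFactory1(IID_PPV_ARGS(&factory));

            ComPtr<IDXGIAdapter1> adapter;
            factory->EnumAdapters1(0, &adapter);            // first (linked) adapter

            ComPtr<ID3D12Device> device;
            if (FAILED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                         IID_PPV_ARGS(&device))))
                return 1;

            const UINT nodeCount = device->GetNodeCount(); // should be 2 for a 7950 + 7970 pair
            std::vector<ComPtr<ID3D12CommandQueue>> queues(nodeCount);

            for (UINT node = 0; node < nodeCount; ++node) {
                D3D12_COMMAND_QUEUE_DESC qd = {};
                qd.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
                qd.NodeMask = 1u << node;                   // bind this queue to one physical GPU
                device->CreateCommandQueue(&qd, IID_PPV_ARGS(&queues[node]));
            }

            // An AFR renderer then records/submits frame N on queues[N % nodeCount].
            std::printf("Linked adapter exposes %u node(s).\n", nodeCount);
            return 0;
        }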
     
  6. OnnA

    OnnA Ancient Guru

    Messages:
    17,963
    Likes Received:
    6,824
    GPU:
    TiTan RTX Ampere UV
    Hmm, yup, it's not working.
    I forgot that I had changed my custom res. in ReLive (so I have my 1920x1440, lol).
    But it's missing my 2048x1536 @ 66Hz and the others from CRU (but FS is working).
    I will ask Toasty about it ;)
     
  7. THEAST

    THEAST Guest

    Messages:
    221
    Likes Received:
    26
    GPU:
    GTX 3080
    All samples run on one GPU. I have already tried Time Spy and RotTR with the same results. The only thing that works on my setup is AotS, but obviously that is because AotS works in (unlinked) multi-adapter mode and needs CF to be disabled in the driver.
    Based on the reply I got from 3DMark's support, in Linked-node mode the application queries the number of available GPUs from the driver and will use however many GPUs the driver provides; in my case, for whatever reason, AMD's driver reports only one usable GPU to the application. I guess AMD's engineers couldn't be bothered to support such rare setups.

    This is a bug in the driver; what do you expect him to do about it?
     
  8. Luraziel

    Luraziel Member

    Messages:
    21
    Likes Received:
    0
    GPU:
    XFX R9 270X Xfire
    Oh wow! I just loaded Battlefront 2 last night and there was no discoloration! These drivers rock! (Even if the UI is sluggish and laggy...)
     
  9. Virs

    Virs Guest

    Messages:
    164
    Likes Received:
    0
    GPU:
    R9 390
    The WHQL version didn't fix the grass issue in ESO.
    I also hadn't tried The Sims 4 in a long, long time; I tried it again, and it suffers a menu bug similar to multiple other games, where screens with a lot of 2D elements (typically menus) have a lot of flicker going on. The Sims 4 has plenty of that flicker during character creation.
     
  10. OnnA

    OnnA Ancient Guru

    Messages:
    17,963
    Likes Received:
    6,824
    GPU:
    TiTan RTX Ampere UV
    lol, so we need to wait for an updated driver then :)
     

  11. bjoswald

    bjoswald Guest

    Messages:
    156
    Likes Received:
    6
    GPU:
    Intel UHD 630
    Hope there's a new driver soon. Flickering textures are annoying and the DLL errors shouldn't have happened. I'm sticking with the non-WHQL until then.
     
  12. MaCk0y

    MaCk0y Maha Guru

    Messages:
    1,283
    Likes Received:
    703
    GPU:
    4090 ICHILL BLACK
    With this driver, I get stuttering when FreeSync is on. 16.1.2 is OK.
     
  13. Rambo

    Rambo Master Guru

    Messages:
    374
    Likes Received:
    331
    GPU:
    RX 7800 XT
    Is Mantle working with FRTC? I tried to play DA:I, but the fps counter goes mad :banana: I just want ~50 fps (locked to 51) but cannot force Mantle to keep the fps lock.
    Is there any way to force Mantle to respect FRTC?
     
    Last edited: Feb 25, 2017
  14. BugMeister

    BugMeister Guest

    Messages:
    166
    Likes Received:
    1
    GPU:
    2x RX480 NITRO+OC(8G)
    - This driver is running beautifully with the recent 15042 Insider build.
     
  15. LocoDiceGR

    LocoDiceGR Ancient Guru

    Messages:
    2,861
    Likes Received:
    1,075
    GPU:
    Gigabyte 3060 Ti
    Grim Dawn is crashing with 17.2.1; no problem with the previous driver.
     

  16. kurtferro

    kurtferro Guest

    Messages:
    115
    Likes Received:
    1
    GPU:
    SAPPHIRE NiTRO R9 390 WB
    No, I use the last stable build; most have the same problems.
     
  17. vestibule

    vestibule Ancient Guru

    Messages:
    2,192
    Likes Received:
    1,409
    GPU:
    Radeon RX6600XT
    Is it just me, or do RX 400 series cards and drivers hate id Tech 5 engined games?
    Rage: OK-ish.
    Dishonored 2: OK-ish.
    Wolfenstein: The New Order: feel the pain.
     
  18. Chastity

    Chastity Ancient Guru

    Messages:
    3,745
    Likes Received:
    1,668
    GPU:
    Nitro 5700XT/6800M
    It's the OpenGL driver. It needs the same attention that the DirectX ones got over the last year. Not optimized at all.
     
  19. vestibule

    vestibule Ancient Guru

    Messages:
    2,192
    Likes Received:
    1,409
    GPU:
    Radeon RX6600XT
    Ty.
    Guess I will chip in and send in a report.
    Who knows, they may fix it, but then again, pigs might fly.
    id Tech 6 rules. :)
     
    Last edited: Feb 26, 2017
  20. loekf

    loekf Guest

    Messages:
    1
    Likes Received:
    0
    GPU:
    RX460 / 4 GB
    Hi,

    I finished a new HTPC build over the weekend. I installed Crimson 17.2.1, but the additional settings I'm seeing are very limited. My main issue is the lack of overscan/underscan settings for my TV.

    Do you know if the workaround could still solve this? Does it also apply to 17.2.1? The post refers to 17.1.1.

    So, according to a post I found, I should remove the driver first, then install an older release like 16.11.5 (OK for RX 4xx support AFAIK), then install 17.2.1 in custom mode (without installing the settings app/CCC).

    I don't see GPU scaling as a good solution. As an alternative, maybe creating a custom resolution with special timings might do the trick as well, but I haven't looked into this yet.

    For my old APU system, an underscan/overscan correction of 10% usually did the trick.

    Just wondering, what's AMD's big plan here? The current driver has very limited controls. Why would they remove the old settings before implementing new ones, or do they want to get rid of them completely?

    (For clarity: parts of the desktop are now outside the screen, so not visible.)
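    For reference, the 10% correction I mentioned is nothing more than shrinking the active area on both axes, so on a 1080p TV it works out as below (back-of-the-envelope arithmetic only, assuming a 1920x1080 panel):

        // Back-of-the-envelope: visible area after a 10% underscan correction,
        // assuming a 1920x1080 TV (roughly what the old CCC slider did).
        #include <cstdio>

        int main() {
            const int width = 1920, height = 1080;
            const int visibleW = width  - width  / 10;  // minus 10% -> 1728
            const int visibleH = height - height / 10;  // minus 10% -> 972
            std::printf("Visible area: %dx%d\n", visibleW, visibleH);
            return 0;
        }

    So a custom resolution would presumably have to target something around 1728x972 to get the same effect.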
     
    Last edited: Feb 27, 2017
