AMD Radeon Software Crimson 16.3.2 driver

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Mar 29, 2016.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    48,541
    Likes Received:
    18,843
    GPU:
    AMD | NVIDIA
    You can now download the new AMD Radeon Software Crimson Edition drivers release 16.3.2 driver Version March 29 2016. This driver is compatible with Windows 7, 8.1 and 10....

    AMD Radeon Software Crimson 16.3.2 driver
     
  2. KissSh0t

    KissSh0t Ancient Guru

    Messages:
    13,948
    Likes Received:
    7,765
    GPU:
    ASUS 3060 OC 12GB
    *edit*

    Stupid me, wrong section.
     
    Last edited: Mar 29, 2016
  3. ApolloKT133

    ApolloKT133 Guest

    Messages:
    18
    Likes Received:
    0
    GPU:
    Sapphire Fury
    As always, thank you for posting these.
    For me it's too late: I'm shipping the Fury back after reanimating it countless times, until it finally clocked itself to death on Saturday.
    No, I'm not buying an Nvidia thingy; I'm going back to my R9 290 and waiting until they really finish this beautiful piece of technology.
     
  4. JonasBeckman

    JonasBeckman Ancient Guru

    Messages:
    17,564
    Likes Received:
    2,961
    GPU:
    XFX 7900XTX M'310
    Polaris might be something to look out for as an upgrade then, but it seems more focused on power efficiency than pure speed, so it may not be that much faster than the Fury when it's released.
    (Also, they won't use HBM1, so they're not limited to only 4 GB of VRAM. Unsure whether they'll use GDDR5 or the faster GDDR5X, but it won't be HBM2, as that's apparently planned for the next GPU, Navi, was it?)

    Nvidia hasn't really announced much about its upcoming GPUs beyond some unofficial rumors, so I have no idea how that GPU will turn out, although if it's priced right it might be a good alternative too.
    (Depending on what you're comfortable paying, of course. For comparison, when I bought the Fury I'm using now, the 980 Ti models were around 60% more expensive, although it's not as if Nvidia will price all of its upcoming GPU models in the $900 to $1000 range either.)


    EDIT: As for 16.3.2, I had some minor display corruption during the boot sequence, but as long as it's fine otherwise I don't care too much about that. The other fixes aren't exactly important for me, but they're good to have, of course.
    (No idea about Quantum Break support for April 5th or Dark Souls 3 support for April 12th. The latter game isn't exactly demanding from what I've seen so far, with even a 750 GPU performing really well for the most part, so I'm not too worried about that particular title, although Crossfire support would probably be good to have for those using multi-GPU setups.)

    EDIT: According to the profile dump, these have a new profile for Killer Instinct compared to 16.3.1, and no real changes otherwise. The profile file generally doesn't contain the newest profiles, though; as I understand it, those live in the actual .dlls.
     
    Last edited: Mar 29, 2016

  5. Cave Waverider

    Cave Waverider Ancient Guru

    Messages:
    1,883
    Likes Received:
    667
    GPU:
    ASUS RTX 4090 TUF
    Does this driver's Radeon Software or Pseudo-CCC (Radeon Additional Settings) finally come with the "Enable AMD Crossfire for DirectX 9/10/11 and OpenGL applications that have no associated application profile" checkbox again or do I have to keep using the old Radeon Beta Software?
     
  6. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
    HBM1 isn't limited to 4GB, you can have more than four stacks.
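
    For rough intuition only (these are back-of-envelope figures, not numbers from the thread: a first-generation HBM stack is commonly described as four 2 Gb DRAM dies, i.e. 1 GB per stack, and Fiji/Fury uses four stacks), the scaling could be sketched as:

    ```python
    # Back-of-envelope HBM1 capacity math (assumed figures: 4 dies per
    # stack, 2 Gb per die, so 1 GB per stack).
    DIES_PER_STACK = 4
    GBITS_PER_DIE = 2

    def hbm1_capacity_gb(stacks: int) -> int:
        """Total VRAM in GB for a given number of HBM1 stacks."""
        return stacks * DIES_PER_STACK * GBITS_PER_DIE // 8

    # Four stacks (as on Fiji) give 4 GB; adding stacks raises the ceiling.
    print(hbm1_capacity_gb(4))  # 4
    print(hbm1_capacity_gb(8))  # 8
    ```

    So the 4 GB figure reflects a four-stack design choice, not a hard limit of the memory type itself.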
     
  7. JonasBeckman

    JonasBeckman Ancient Guru

    Messages:
    17,564
    Likes Received:
    2,961
    GPU:
    XFX 7900XTX M'310
    Oh, I see. So it's more a question of how complicated it would be to wire all the HBM stacks to the GPU, what with the traces, connections, memory controllers and all that, plus of course the costs involved, as HBM is probably pretty expensive.
    (I guess cooling is another important factor, since the HBM stacks sit on the GPU package, so heat can be a concern, although it's not as if they're stacked immediately on top of the actual GPU core either.)
     
  8. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    It is pretty counterintuitive, but in HBM terminology a "stack" is a layer/slice in the tower.
    (At least that's how it was in the HBM1 material, which sadly on many pages mixed HBM1 information with HBM2 plans in a way that made it hard to tell whether HBM1 was supposed to reach a certain capability or not.)
     