AMD Catalyst 15.6 Beta Driver for Windows OS out now

Discussion in 'Videocards - AMD Radeon Drivers Section' started by LtMatt81, Jun 22, 2015.

  1. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,129
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    This guy on Overclock.net is giving a very nice explanation:
     
  2. Dburgo

    Dburgo Guest

    Hi all, I seldom write in this forum but I browse here quite often. I was getting great performance with these drivers, but Windows 10 keeps installing its mediocre drivers over mine. Anyone know how to block it?
     
    Last edited by a moderator: Jun 29, 2015
  3. Hootmon

    Hootmon Guest

    Messages:
    1,231
    Likes Received:
    6
    GPU:
    XFX THICC III Ultra
    I think I just got smarter after reading that.
     
  4. Hootmon

    Hootmon Guest

    Messages:
    1,231
    Likes Received:
    6
    GPU:
    XFX THICC III Ultra
    I have heard that uninstalling the drivers with DDU will prevent that, but I have no personal experience with Win 10 yet.
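    A registry tweak also gets mentioned a lot for this: the "do not include drivers with Windows Updates" policy. Purely a sketch, assuming your Windows 10 edition/build actually honours the ExcludeWUDriversInQualityUpdate policy value; I haven't tested it myself, so treat it as a starting point:

        # Untested sketch: set the group policy value that tells Windows Update
        # not to deliver driver packages, so it stops replacing the AMD driver.
        # Run from an elevated (administrator) prompt; run "gpupdate /force"
        # or reboot afterwards so the policy is picked up.
        import winreg

        key_path = r"SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate"
        key = winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, key_path, 0,
                                 winreg.KEY_WRITE)
        # 1 = exclude driver packages from Windows Update quality updates
        winreg.SetValueEx(key, "ExcludeWUDriversInQualityUpdate", 0,
                          winreg.REG_DWORD, 1)
        winreg.CloseKey(key)
        print("Windows Update driver delivery disabled (policy set).")

    DDU between driver installs plus something like the above is the combination people usually suggest.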
     

  5. MatrixNetrunner

    MatrixNetrunner Guest

    Messages:
    125
    Likes Received:
    0
    GPU:
    Powercolor PCS+ R9 270X
    No need for them to intentionally gimp their old GPUs. The environment is set up so the older GPUs are naturally going to fall through the cracks. NVIDIA GPUs are hyper-optimized for the current software landscape, while (currently) AMD is using a more long-term strategy with GCN and a bit of brute force to cover up the weak spots.

    As I understand the NVIDIA Kepler problem, the issue is that, unlike AMD, NVIDIA chose not to disclose architectural implementation details for Kepler, and perhaps not even for Maxwell. This means that game developers are much more dependent on NVIDIA to optimize performance, rather than doing the optimization themselves. I got this from a blog post about the GCN architecture, so it could be wrong, but it fits the observed performance issues.

    The Kepler SMX unit has 192 shaders, but its scheduler can supposedly only feed 128 of them at a time. To utilize the SMX completely, some driver magic is required. A Maxwell SM (SMM) is 128 shaders in size and can keep all of them busy. Going by NVIDIA's own advertising, a Maxwell SMM performs at roughly 90% of a Kepler SMX unit.
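    As a quick back-of-the-envelope check on those numbers (using only the figures above and the marketing claim, nothing from a spec sheet):

        # Rough sanity check of the advertised per-shader gain (assumed figures).
        kepler_shaders = 192   # shaders per Kepler SMX
        maxwell_shaders = 128  # shaders per Maxwell SM (SMM)
        smm_vs_smx = 0.90      # advertised: one SMM ~ 90% of one SMX
        per_shader_gain = smm_vs_smx * kepler_shaders / maxwell_shaders
        print(f"Per-shader throughput: ~{per_shader_gain:.2f}x")  # ~1.35x

    That ~1.35x per-shader figure is, if memory serves, the per-core number NVIDIA themselves quoted for Maxwell, and it only works out because the narrower SMM stays fully fed, while the wider Kepler SMX needs that extra scheduling help from the driver.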

    By making game developers dependent on them for libraries (GameWorks) and performance optimizations, NVIDIA are forcing a role for themselves in the PC games market (since they lost out on the consoles by pissing off MS and Sony in previous generations). Their driver teams, however good, are falling behind on the workload the company chose to undertake in order to capture a larger market share. Hence the last several months of driver problems.

    The reason they are so aggressive is that they are being choked out of the PC market by Intel and AMD. The discrete GPU market is shrinking, and once faster memory types arrive (HBM, anyone?) it will be possible to create a high-performance APU (a 2017 Zen HBM APU?) that will be sufficient for most users, leaving the add-in market to ultra-high-end enthusiasts and even more expensive GPUs than today. NVIDIA does not hold an x86 license at this time, so they will not be able to compete in the PC-compatible market.

    Unless they launch a console? They would have to do it themselves, because they don't have a good track record of working well with others (Tegra phones and tablets, the original Xbox lawsuit, the laptop-chip solder debacle, ...).

    These are just some educated guesses, but they make sense to me.
     
  6. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,129
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    This is exactly what I believe is going on. NVIDIA is taking the more "now" approach, while AMD is trying to literally lodge themselves into the whole ecosystem, and judging by Vulkan and DX12, as well as the omnipresence of GCN, they have made it. GCN was the best bet AMD has taken since the Athlon, I believe.

    I didn't know any of this. Do you have any links for all that? Developer documentation etc?

    As far as I see it, they are trying to pull an Apple/App Store thing. Since nobody will optimize for their hardware anymore (all console games are optimized at the assembly level for GCN), they will slowly become a middleware company that sells GPUs. They will want to be the developer's "portal" for PC ports. They will tweak their middleware to run well only on their hardware, thus negating the performance deficiencies of said hardware. That means, though, that older versions of that hardware will start to suffer as newer generations appear, since they won't be able to maintain proper performance for them from a resource and QA standpoint. That is starting to be apparent with Kepler and the mess that was made with the drivers. The NVIDIA guys on this thread pleaded for performance and bug reports, and I believe them. I believe they no longer have the internal resources to do that on a large scale.

    The only other player that has it is VIA, which could be a purchase target, although I have a feeling that the licensing to VIA includes clauses about a company sale (I might be wrong). The integrated performance at 14nm with a 135W envelope looks scary for NVIDIA right now, that is true. But that means AMD has to hit both the node and the processor targets for 2016. Looking at the people working on the problem, I have a feeling they might actually pull it off this time.

    The feeling I gather about them is that nobody wants to work with them. Not Microsoft, not Sony, and not even some of the gaming companies they had dealings with. The blog post by Valve's Rich Geldreich was very enlightening about what they try to do with game developers:
    Where vendor A = NVIDIA.
     
  7. djsebfr

    djsebfr Guest

    Messages:
    132
    Likes Received:
    0
    GPU:
    Nvidia GeForce GTX 950
    Hi,

    I'm playing Batman AK with the 15.5 beta, details maxed without GameWorks, and it's running very smoothly, no stutters or bugs, with a single HD 7950 OC 1000/1500... (2560x1080).

    Maybe it was patched yesterday? :3eyes:

    :banana:
     
  8. MatrixNetrunner

    MatrixNetrunner Guest

    Messages:
    125
    Likes Received:
    0
    GPU:
    Powercolor PCS+ R9 270X
    They did release a patch that enables ambient occlusion and rain effects on Batman and the Batmobile. Perhaps they fixed something else as well.
     
  9. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,129
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    They are not even selling it again yet. It's like trying to drive a car that the manufacturer still has under recall, ffs. :p
     
  10. MatrixNetrunner

    MatrixNetrunner Guest

    Messages:
    125
    Likes Received:
    0
    GPU:
    Powercolor PCS+ R9 270X
    They said that the plan is to release patches as they get them ready, while the main issues are being worked on. Then it will be back on sale, probably with a bit of "We are sorry" free DLC.

    http://www.polygon.com/2015/6/28/8858541/batman-arkham-knight-pc-fix
     

  11. gx-x

    gx-x Ancient Guru

    Messages:
    1,530
    Likes Received:
    158
    GPU:
    1070Ti Phoenix
    All this talk about how GCN is great, yet HBM + 5000+ compute units lose to 2800 CUDA cores and GDDR5.

    Facts are facts.
     
  12. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,129
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    It's 4096 compute units, and it's actually better the higher the resolution gets. GCN is the only thing keeping them competitive, judging by the state of their drivers.
     
  13. gx-x

    gx-x Ancient Guru

    Messages:
    1,530
    Likes Received:
    158
    GPU:
    1070Ti Phoenix
    Yes, 4096, my bad. Yes, it's better than the rest of the cards before it because it has more compute units; that's about it. And yes, it gets better at higher resolutions, but it's still behind the 980 Ti. And it doesn't even matter, since no one is going to play games at 4K @ 30-40 fps on a $650 card and a $2000 monitor. The same goes for the 980 Ti and the Titan X.

    Anyway, I am glad AMD did make something that comes close to NVIDIA, so don't get me wrong. I just fail to see how this hype about GCN is still a thing. It's old tech.

    PS. I think the low prices are the thing keeping them competitive. Since the Fury X doesn't have a low price, I doubt it will be competitive at all.

    Next year: Pascal from NVIDIA, and more rebrands and GCN from AMD. :3eyes: I hope I am wrong, but...
     
    Last edited: Jul 3, 2015
  14. theoneofgod

    theoneofgod Ancient Guru

    Messages:
    4,677
    Likes Received:
    287
    GPU:
    RX 580 8GB
    So is Windows NT.
     
  15. gx-x

    gx-x Ancient Guru

    Messages:
    1,530
    Likes Received:
    158
    GPU:
    1070Ti Phoenix
    What about Windows NT?
     

  16. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,129
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    What he's telling you is that a good design stays around for years if it's properly tweaked. GCN is fine, as is Windows NT. If you have a solid design you don't have to redo it from scratch every time; you just refine it.

    Windows 10 is really a refinement of Windows NT.
     
  17. gx-x

    gx-x Ancient Guru

    Messages:
    1,530
    Likes Received:
    158
    GPU:
    1070Ti Phoenix
    Windows holds its ground thanks to DirectX and gaming, not because of "good design". Linux is a better design by far.
    But OK, I get the analogy.

    Anyway, from what I can tell, NVIDIA does more with less, so apparently they have the better design. They use less power, fewer CUDA cores, etc.
    AMD will need to do some very good refinement to get more market share. These are all same-ol'-same-ol' designs in the eyes of the end user. Add more cores, done. They lowered the power usage by switching to HBM and getting rid of GDDR5 - BRAVO! (Really, I am not being cynical. Though, to run 4096 shaders at a GHz they needed water cooling. I wonder if a dual Fury X will be able to stay under 80°C even with that.)

    Anyway, thanks for the clarification of that analogy. Though, the Windows NT kernel and the Windows Vista kernel (and up) are totally different kernels.
     
    Last edited: Jul 3, 2015
  18. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,129
    Likes Received:
    971
    GPU:
    Inno3D RTX 3090
    That's a bit subjective, actually. The kernel itself is very, very well designed, but until recently the internal structure at Microsoft was what was hampering it, innovation-wise. Stability-wise, if you run a Linux rolling release, you know that Windows doesn't really compare to Arch, but more to distros like CentOS and Debian Stable. It is very stable and faster than the Linux kernel for some things (such as desktop usage and grace under heavy I/O), but not as flexible (although this is changing with Windows 10 Embedded and Server Nano). This is a very good read if you are in the mood for a geeky read on a rainy morning.

    This is almost 99% their driver. I have run this example into the ground at this point, but look at GCN 1.0 and Fermi/Kepler. GCN 1.0 began its life as a competitor to the GTX 580, and it is now faster than or on par with cards like the 780 and the Titan in games like Ryse and Far Cry 4. The reason this happened is drivers. It took AMD two and a half years, and at least three different driver forks (the most recent being the still unreleased "W10" fork, with 1040.0 being the best driver), to take the 7970 to the point where it should have been originally.
    NVIDIA, on the other hand, looks more and more to me like a company that designs their hardware around their software. I won't say which approach is better, since both have their advantages and disadvantages. Since AMD is really dealing with one design (GCN), once they solve their CPU bottleneck problem (as it looks like they are doing), the gap will probably close, or even reverse completely.

    You have a point from a marketing perspective, but I believe it is a plus for anybody who buys a GPU to know that they have the same architecture as the one in the current consoles. NVIDIA is the one who needs to adapt their code for newer games, not AMD. This sentence might seem ridiculous to you on a first reading, but once the CPU overhead is removed (or greatly reduced, with drivers like the 1040.0), you'll start noticing that AMD doesn't really seem to need per-game driver optimization. The performance is already there, because the developers of the games have done that optimization at the lower level. If GCN doesn't like square roots, you won't find serious game engines using them. If GCN likes compute, you will find serious game engines using it. If you exclude the GameWorks effects (and not even all of them; HBAO+ runs better on the 280 than on the 960, for example), the games themselves are designed around GCN's limitations and performance targets. The Witcher 3 was running well on release day (with the "Windows 10" driver). Graphically heavy titles like Assetto Corsa, Ryse, AC Unity, Far Cry 4, Dying Light, even the crappy Batman port, run very well on GCN 1.0, just with a driver that "feeds" the GPU a bit better, with no special optimizations.

    Not that much different, actually. Vista pushed more things to user mode, making the whole system less prone to security breaches, but making sacrifices in other areas. They are all the NT kernel. Even Windows 10 is released as a Windows NT kernel. The architecture hasn't really changed at all (and neither should it, I believe).
     
  19. gx-x

    gx-x Ancient Guru

    Messages:
    1,530
    Likes Received:
    158
    GPU:
    1070Ti Phoenix
    Well, basically what you are saying is that we are waiting on DX12 to see if any of this will hold water. I don't know what a 1040.0 driver is; I don't have it (I'm using the latest asder00-compiled driver), so I don't know what it does. Also, comparing a 280X to a 960 is a bit... strange (neither can run The Witcher 3 well with HBAO+ and high details). If you were to compare it with a GTX 770, then OK, but a 960? Yes, it had good performance on release, but I think NVIDIA just didn't optimize the drivers for Keplers. The GTX 680 still works pretty well, imho. The 580 is ancient in NVIDIA terms. I bet it makes less noise than 90% of 7970s and 280Xs at full load :) They learned their lesson with the 480...

    And by the way, most game coders do a ****ty job. There was a post a while back where a person who was a coder and worked with coders explained the process of making ****ty professional code actually work by adding proper code to the drivers to fix the mistakes that game coders make all the time. My memory is not so good, so I cannot remember the link to that post (another forum), but it was quite lengthy and detailed.

    As for Vista and later kernels, I am pretty sure I remember that they brought some things closer to "the metal" and moved some things up into the kernel ring, recreated large parts of the kernel and so on, and that's a lot more than just adding more shaders or whatever NVIDIA and AMD do. Graphics drivers are closer to the metal, there is WASAPI for sound, etc. Sure, they are NT kernels, since they started from there, but don't they even have different code names?

    Anyway, I tried Mantle (and most of us did) and it doesn't do much. Tech demos on stands where they claim things are marketing BS; I even counted the number of objects, and where they claim there are tens of thousands I counted (I was bored) fewer than 1,000, so... forget that. I am playing Dragon Age now and I see no difference between using Mantle and DX11. Maybe a frame or two when the fps dips into the 40s, and that's it, if it isn't just a margin of error due to scene fluctuation. Most progress was made on weak (old 4-core) AMD CPUs with weak AMD GPUs. And even that isn't much, and certainly not enough to "turn the tide".
    Like I said, we have to wait for another over-hyped thing called DX12 to see how AMD will behave. I don't expect anything. Besides, both companies will benefit from it.

    PS. There is a rumor going around that NVIDIA cripples Kepler to make people buy Maxwell. That just might be true to some degree. We would need to go back and re-test all the old games and cards to confirm or deny it.
    But isn't that similar to what AMD is doing now? Making drivers that give an edge to the 300 series although the hardware is the same as in the 200 series? That's actually even worse, since Kepler and Maxwell are plenty different.

    PPS. Just for fun: Windows 10 is not released :) Don't be offended please, I just couldn't resist :0 :)
     
    Last edited: Jul 3, 2015
  20. theoneofgod

    theoneofgod Ancient Guru

    Messages:
    4,677
    Likes Received:
    287
    GPU:
    RX 580 8GB
    I like to think of Mantle as a proof of concept: bringing a console-like, to-the-metal API to the PC. The games that supported Mantle used D3D engines adapted/modified to work with Mantle, like Frostbite; they weren't built with Mantle in mind.
     