AMD Catalyst 15.6 Beta Driver for Windows OS out now

Discussion in 'Videocards - AMD Radeon Drivers Section' started by AMDMatt, Jun 22, 2015.

  1. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    7,005
    Likes Received:
    139
    GPU:
    Sapphire 7970 Quadrobake
    This is the driver I mean, we probably mean the same.

    I was actually comparing it with the 280, not the 280X. They sit at the same price point as the 280. As for the Witcher, I run it with all @ max except shadows and grass distance @ high, at 1440p30 locked. These benchmarks use the 15.4 driver (and not 1040, which is much better), and still you can see that the 285 (which is slower than the 280X) is destroying the GTX 770 @ 1080p Ultra, and is 3fps ahead of the 780.
    http://media.Disney.net/images/media/2015/game-bench/witcher-bench-1080-u.jpg

    That's from a review in which the guys who did it admitted that they had problems with the AMD drivers and couldn't even run the benchmark properly themselves. This is not even the "Witcher" driver from the old "fork", and 1040 contains the game optimizations and comes from the "new" fork.

    You are referring to this awesome post. What I'm talking about, though, is not API calls and the rest, but the algorithms the game engine implements. Needing to bypass square root calculations on GCN, for example, or reducing shaders to sizes that fit GCN's cache, is a fundamental design choice you make for the engine, and you make it regardless of the API you'll use. You use the API to communicate what you are doing to the hardware, but what you are doing actually depends on said hardware. The extremely ****ty Batman port is a good case study for exactly that. It is obviously PS4 code, almost directly ported. And oh, the surprise: the 280X is faster than the 780 at it. In a game that NVIDIA has a special driver for, and in which the AMD cards are not even using the "good" driver (1040, or the latest R300 driver).
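    To make the "bypass square root calculations" point concrete, here is a minimal illustrative sketch (not from any actual engine): when you only need to rank distances, comparing squared distances gives the same answer without ever calling sqrt, which is exactly the kind of algorithm-level choice made with the target hardware in mind.

    ```python
    def nearest_squared(points, target):
        """Find the nearest point by comparing squared distances,
        avoiding a sqrt per candidate. Ranking by d^2 is equivalent
        to ranking by d, since sqrt is monotonic."""
        best, best_d2 = None, float("inf")
        tx, ty = target
        for (x, y) in points:
            d2 = (x - tx) ** 2 + (y - ty) ** 2  # no sqrt needed to rank
            if d2 < best_d2:
                best, best_d2 = (x, y), d2
        return best

    pts = [(0, 3), (4, 0), (1, 1)]
    print(nearest_squared(pts, (0, 0)))  # -> (1, 1)
    ```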

    Windows releases have different code names (Whistler, Chicago, etc.), but kernels only have version numbers. Vista is a full-blown Windows release, and the kernel is just a part of it. The most important change Vista brought to the kernel was the Windows Driver Foundation, and even that was backported to Windows XP and 2000. Most people tend to confuse kernels with the OS on top of them.
    Graphics drivers are extremely complex pieces of software that resemble operating systems and on-the-fly compilers.

    Mantle did quite a lot, actually. Speaking from personal experience, the difference in BF4 multiplayer is very real: less stutter, much higher minimum framerates. In Thief the difference in pure rendering performance was even more pronounced, and with Mantle I had no stutters at all. Same for Civ: Beyond Earth. Up to this point, my experience with Mantle has been extremely positive. These are some good examples of the differences. Mantle was just a proof of concept in the end, and it brought us the base for the lower-level APIs that are sorely needed.

    AMD will benefit more, because the driver won't really be a factor any longer. Their hardware has proved time and time again that it is usually better than NVIDIA's.

    There was a performance regression apparently, but it was fixed, and NVIDIA engineers were asking people to report if they had any more problems. That lends my "creating hardware around software" theory some credibility, since it seems they now need to maintain drivers for four different architectures (Fermi, Kepler, Maxwell v1, Maxwell v2), and they are running out of testing resources.

    What AMD is doing is disgusting and unethical. Windows 10 is not released, but drivers leak :D

    (I'm not easily offended actually :p )
     
  2. gx-x

    gx-x Maha Guru

    Messages:
    1,297
    Likes Received:
    106
    GPU:
    1070 G1

    yea, we do, brain fart here, didn't even read the whole thread name so yea...shame is all over me atm :p

    And sorry about misreading your comparison 280/280x. It's 7 a.m. here, I need to go to sleep lol

    Anyway, I read your post with care. You're the guy here who soaks up all the info from the internet and knows what he's talking about, so I can't really argue with your statements. I sometimes just call things as I see them (or as they're marketed/hyped). Your input is valuable and your time probably wasted on me, but I do enjoy our exchanges; I always learn something new or get put in my place if I'm wrong (and I can be stubborn).

    I read your whole thread about DX11 a while ago (and the one that disappeared o_O ), and that gave me a clue (since I am on an Intel i5) as to why I haven't noticed almost any difference between using Mantle in Civ 5 over DX11. Textures did stream a bit faster once you start the map, but that was it.

    Now in DA:I I also don't see much difference (maybe because 1040 is an awesome driver?!), but all in all, yes, it's a great shame on AMD. I've been making jokes about their drivers for years, and with the VSR thing available only on the 290 series due to "hardware limitations" I just went bonkers. And now we all have VSR, but not thanks to AMD; I guess they still claim it cannot be done. Grrr...

    I do have to comment on this, though: aren't we confusing cheaper with better here? Yes, they gave us cards that perform similarly to some NVIDIA cards but cost less, sometimes way less. I will never forget snagging a 5850 for $220 at launch, before they raised the price due to demand (bitcoin miners <khm>); there was just nothing NVIDIA could do about the 5830/50 at that price range. I was a very happy little camper.
    Snagged this Sapphire 280X with a year of warranty for $150. There is nothing NVIDIA can do about that (miners selling cards, lots of them). So are they better, or just plain too cheap to pass up?! I tend to think AMD just offers similar performance for less money, for those who don't want to pay a premium. That's my opinion anyway. There is also plenty of software that benefits from CUDA more than OpenCL or DCU - not games, but applications, like 3D software etc.

    PS. I don't play BF4, but I saw benchmarks and it doesn't seem to do anything on an Intel i5/i7 with a Radeon 280X/290. Of course, you can't see "smoothness" in charts, but you can look at frametimes and spikes, or the lack thereof.
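    The "spikes in a frametime chart" idea is easy to quantify. A minimal sketch, with made-up sample numbers (any frame taking longer than the target budget, e.g. ~33 ms for a 30 FPS pace, reads as a stutter):

    ```python
    def spike_count(frame_times_ms, threshold_ms=33.3):
        """Count frames whose frame time exceeds a stutter threshold.
        A smooth capture has few or no frames over budget."""
        return sum(1 for t in frame_times_ms if t > threshold_ms)

    capture = [16.6, 16.8, 50.2, 16.5, 41.0, 16.7]  # hypothetical sample
    print(spike_count(capture))  # -> 2
    ```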

    Thief - sorry, I hate stealth games. I want to kill/shoot, not sneak. Too old for that $hit. I'm the kind of guy who eagerly awaits the new Doom and the new Need for Speed reboot.

    cheers! :)
     
    Last edited: Jul 3, 2015
  3. Astralify

    Astralify Member

    Messages:
    10
    Likes Received:
    0
    GPU:
    AMD Radeon R9 290 Tri-X
    Hey guys. So does this driver (1014) have the optimizations from the 1040 driver? I mean, did anyone make some comparisons between the two (mainly in games like Witcher 3 and GTA)? Or should I just go for the 1040 driver? I have a 290. Thanks. :)
     
  4. rafaelluik

    rafaelluik Active Member

    Messages:
    93
    Likes Received:
    0
    GPU:
    AMD Radeon R9 280

  5. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    7,005
    Likes Received:
    139
    GPU:
    Sapphire 7970 Quadrobake
    @gx-x I believe we are more or less on the same page.

    As for the frame limiter, I just saw the guy's tweet. It is only for the 300 series. Right. So Matauri and Unwinder can make it work for all hardware on the planet, but a whole company can only make it work for the "new" series comprised of "old" cards. This is ****ing pathetic.
     
  6. leszy

    leszy Master Guru

    Messages:
    316
    Likes Received:
    13
    GPU:
    Sapphire V64 LC
    We can't know if it is the same chip. We only know it's the same PCB on the 290 and 390. For now, only some NV users "know" it's the same chip ;) Looking at the performance, I don't think it is. Anyway, this chip is named Grenada, not Hawaii. Of course NV users "know" that the new name is only a marketing trick - but I am not an NV user :)
     
    Last edited: Jul 3, 2015
  7. leszy

    leszy Master Guru

    Messages:
    316
    Likes Received:
    13
    GPU:
    Sapphire V64 LC
    28" 4K FreeSync monitors are priced now around $500.
    https://www.overclockers.co.uk/productlist.php?groupid=17&catid=948
     
  8. gx-x

    gx-x Maha Guru

    Messages:
    1,297
    Likes Received:
    106
    GPU:
    1070 G1
    A TN panel? Thanks, but no thanks. It's the 21st century. Besides, 30fps is 30fps; it won't stutter with FreeSync, but it's still low fps.

    PS. I personally am not in the market for high-price, high-end gaming. 1080p is just fine for me.
     
    Last edited: Jul 3, 2015
  9. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    7,005
    Likes Received:
    139
    GPU:
    Sapphire 7970 Quadrobake
    Dude, have mercy, please. It's a frame limiter. A Brazilian guy can make it work for all the GPUs on the planet, and AMD can't make it work for their own hardware? :infinity:
     
  10. gx-x

    gx-x Maha Guru

    Messages:
    1,297
    Likes Received:
    106
    GPU:
    1070 G1
    and we know it's the same chip...
     

  11. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    7,005
    Likes Received:
    139
    GPU:
    Sapphire 7970 Quadrobake
    It doesn't even matter. Unwinder with RTSS and Matauri with RadeonPro made frame limiting, vsync and triple buffering work with every single card, from NVIDIA monsters to Intel integrated graphics. There is no excuse, except that some suit decided it would be part of the "kewl" series.
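    The core idea behind a simple software frame limiter of the kind being discussed is indeed hardware-agnostic; a minimal sketch of the cap-by-sleeping approach (illustrative only, not how RTSS or RadeonPro actually hook the driver):

    ```python
    import time

    def run_capped(render_frame, fps_cap=60, frames=3):
        """Cap the frame rate by sleeping away the unused part of
        each frame's time budget. Works regardless of which GPU
        renders the frame, which is the point being made above."""
        budget = 1.0 / fps_cap
        for _ in range(frames):
            start = time.perf_counter()
            render_frame()  # the actual draw call(s) would go here
            elapsed = time.perf_counter() - start
            if elapsed < budget:
                time.sleep(budget - elapsed)

    start = time.perf_counter()
    run_capped(lambda: None, fps_cap=60, frames=3)
    total = time.perf_counter() - start
    print(total >= 0.045)  # three ~16.7 ms frame budgets elapsed
    ```

    Real limiters work at the driver or API-hook level rather than sleeping in application code, but the budget-and-wait logic is the same.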
     
  12. sammarbella

    sammarbella Ancient Guru

    Messages:
    3,931
    Likes Received:
    178
    GPU:
    290X Lightning CFX (H2O)
    And what is it good for?

    4K gaming still needs CFX to reach a STABLE 4K/60FPS, even with a Fury X.

    Where is AMD's CrossFire FreeSync support without ghosting?

    Who wants to play 4K games at 30-35 FPS on a single GPU?
     
  13. leszy

    leszy Master Guru

    Messages:
    316
    Likes Received:
    13
    GPU:
    Sapphire V64 LC
    I have used a Gigabyte R9 290 for eighteen months - an OC version, a selected good chip with selected memory. I know how the 290 scales during OC. I know how far an R9 290 can go without raising the voltage, and how much further with a higher limit. The efficiency that is standard for the 390 is attainable for only a few 290s. Never mind, I'm not going to convince anyone. Anyway, there are people here who compared the masks of the 290 and 390. They know their blueprints and just know better. Or so they believe. Faith works miracles. Is it possible to change the views of someone who is a true believer? The guy at HardOCP never saw the masks and speculated a bit in his review, and certainly believed in the truth of his own supposition. We don't believe AMD's statement, but we believe the speculation of the guy from HardOCP. We have every right to do so. In the end, everything around us is only a matter of faith. After all, each of us could be just a brain connected to a computer :)
     
  14. leszy

    leszy Master Guru

    Messages:
    316
    Likes Received:
    13
    GPU:
    Sapphire V64 LC
    Yes. A $1,150 setup will not give you comfortable 4K gaming, but for $1,800 you can already have an excellent, quiet setup. It's much better than a year ago. FreeSync for CF will probably also come eventually.
     
  15. oGow89

    oGow89 Maha Guru

    Messages:
    1,213
    Likes Received:
    0
    GPU:
    Gainward Phoenix 1070 GS
    What?
     

  16. theoneofgod

    theoneofgod Ancient Guru

    Messages:
    4,144
    Likes Received:
    74
    GPU:
    RX 580 8GB
    Efficiency - you mean power usage, or what? I don't get it. The 390 is at 1050MHz, not a big deal. Install the modded Windows 10 drivers, overclock by 50MHz, and call it a day.
     
  17. The Mac

    The Mac Ancient Guru

    Messages:
    4,408
    Likes Received:
    0
    GPU:
    Sapphire R9-290 Vapor-X
    If you believe AMD, it operates by controlling PowerPlay, not by strictly limiting frames.

    The reason it only works on the 390 is that they redesigned the power control on the chip.

    This was one of the upgrades from the 290.
     
  18. theoneofgod

    theoneofgod Ancient Guru

    Messages:
    4,144
    Likes Received:
    74
    GPU:
    RX 580 8GB
    200 series is supposedly supported.
     
  19. The Mac

    The Mac Ancient Guru

    Messages:
    4,408
    Likes Received:
    0
    GPU:
    Sapphire R9-290 Vapor-X
    Not according to Hallock.

    3xx only.
     
  20. theoneofgod

    theoneofgod Ancient Guru

    Messages:
    4,144
    Likes Received:
    74
    GPU:
    RX 580 8GB
    Source: https://twitter.com/Thracks/status/616627726162767872
     

Share This Page