AMD Catalyst 15.6 Beta Driver for Windows OS out now

Discussion in 'Videocards - AMD Radeon Drivers Section' started by LtMatt81, Jun 22, 2015.

  1. headapohl

    headapohl Member

    Messages:
    41
    Likes Received:
    0
    GPU:
    1080Ti SC2 Hybrid
Anyone have an idea, or heard a rumor, about when AMD is going to release a unified driver for their cards?
     
  2. AlleyViper

    AlleyViper Master Guru

    Messages:
    505
    Likes Received:
    77
    GPU:
    Strix 2070S A|480 X
Thanks for your comment. I'd completely forgotten until now that the TDRs and desktop corruption started with 15.3b (not 15.4b). My 7870 seems to have a similar vbios branch, 015.017.000.001.000838, equipped with Hynix GDDR5 (perfectly stable up to 1350). Also, GPU-Z doesn't seem to catch those VDDC drops, showing steady stock readings of 1.188-1.189V.

This Asus is a custom PCB model, but I also have a friend with the exact same MPC-HC TDR running a reference PCB Sapphire HD 7870 under a similarly updated 8.1 x64. His system is completely different (P55 based).
So this seems to affect entirely different Pitcairn cards.

I'll also wait for the next major driver release before considering an RMA, as the Omegas give me no issues. Meanwhile, I'll report the issue to AMD.
     
    Last edited: Jun 28, 2015
  3. jaju123

    jaju123 Master Guru

    Messages:
    355
    Likes Received:
    3
    GPU:
    2x AMD R9 290 Crossfire
Probably when the Windows 10 release date hits us. End of July.
     
  4. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,074
    Likes Received:
    908
    GPU:
    Inno3D RTX 3090
I read the whole post. It seems that, like AMD, NVIDIA is migrating to Windows 10, and Kepler's performance hasn't yet been carried over completely. The difference in the article was 3-4% max, which is really negligible. It also seems that they are working on it.

If you mean Witcher 3, I bet that what they did was lower the tessellation multiplier at the driver level. There is nothing else you can do about low tessellation performance, and Kepler has less of it than even Tonga does. I still don't believe they gimp them on purpose. That would be like what AMD is doing with their drivers for the 300 series, compared to the rest of the hardware they have released :D
     

  5. theoneofgod

    theoneofgod Ancient Guru

    Messages:
    4,652
    Likes Received:
    271
    GPU:
    RX 580 8GB
    They didn't mention specific game optimizations for Kepler, just "Optimizations for Kepler".
     
  6. Undying

    Undying Ancient Guru

    Messages:
    19,901
    Likes Received:
    8,160
    GPU:
    RTX 2080S AMP
Just talk; Kepler is left behind. Even the new Batman AK shows the 780 and 280X performing around the same, isn't that funny? The poor 770 isn't even competitive anymore.

Remember when the 680 and the 7970, aka 280X, were the same performance? Makes me wonder if the same thing will happen to Maxwell when Pascal comes.
     
    Last edited: Jun 28, 2015
  7. sammarbella

    sammarbella Ancient Guru

    Messages:
    3,929
    Likes Received:
    178
    GPU:
    290X Lightning CFX (H2O)
Let me ask you something, as you have first-hand experience; I'm just curious. :)

Did you switch from 290X CFX to a single 980 Ti directly?

I understand the upgrade if you play a lot of games with broken CFX support, but...
...how is your new 980 Ti performing vs. 290X CFX in games where CrossFire works well?

I guess your 290Xs are the 4 GB ones, so in the case of GTA V or other games that max out 4 GB, like Mordor with HD textures (this is obvious...):
Does the 6 GB of the 980 Ti make a difference when settings are maxed and more than 4 GB is used?

I think the 980 Ti is a better upgrade option than the Fury X because it has 6 GB of VRAM, performs better at 1080p, and the price is similar if not lower.

There are many games that can't be maxed at 1080p even with SLI or CFX, and this doesn't seem to be solved by moving the focus to 4K with a single GPU!

I will wait some more months (or a year) to upgrade to two GPUs again, with at least 8 GB each this time, because I want to properly play games at 4K with a 4K or ultrawide monitor, so a multi-GPU and adaptive sync solution will be needed.

I want to buy AMD GPUs, but the lack of a real raw power jump, the lack of an 8 GB VRAM flagship GPU, the unsolved FreeSync CrossFire support (ghosting problems...), the bad and slow game support, and AMD's weak position in relation to game devs make this move really difficult.

DX12 is the next battle in the war between Nvidia and AMD.

Maybe the last one for AMD, if my fears about GameWorks 2.0 and Nvidia's strong position over game devs, even in a DX12 "ecosystem", are real. :(
     
    Last edited: Jun 28, 2015
  8. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,074
    Likes Received:
    908
    GPU:
    Inno3D RTX 3090
Do you have any link for that? According to that guy's testing, the difference is almost within the statistical error between drivers, and it is to be expected since they are switching to WDDM 2.0.

Just a note on that. It was apparent from the start that GCN was a much more forward-thinking hardware architecture than Kepler. All the latest games that use Physically Based Rendering will run better on GCN, just because of the compute requirements. Nothing is being left behind; it is games and engines themselves that are getting better. Furthermore, after almost three years, AMD finally has some good-performing drivers. The gap would be wider if they tested with drivers like the 1040. Does that mean that NVIDIA has left anything behind? I don't think so. Judging by the performance in general, it seems to me that Kepler had almost 100% of its potential from the start, while GCN was hampered by slower drivers. Add to that newer games that simply love compute with all the PBR effects, and the fact that all console games are optimized for GCN at the low-level code level (if GCN doesn't like something, or is good at something, the game code will reflect that), and then yes, the 7970/280X should be equal to or faster than the 780/Titan.

The raw power jump is actually quite real with the Fury X. And you can downsample everything at 1080p, so the 1080p performance shouldn't really matter that much. I would wait at least until the Windows 10 reveal/drivers to check the performance on that. Most people here will tell you that you can't compare architectures directly (and they are correct), but depending on how long you want to keep those cards, the 4096 shaders on the Fury X and the ridiculous memory bandwidth look really nice. The only "problem" I see is the 4GB, which should be mitigated in multi-GPU configurations under DX12.
     
  9. theoneofgod

    theoneofgod Ancient Guru

    Messages:
    4,652
    Likes Received:
    271
    GPU:
    RX 580 8GB
They first confirmed the Kepler performance issue in general on their forum. After that came the driver.

    http://forums.guru3d.com/showthread.php?t=399569

     
  10. sammarbella

    sammarbella Ancient Guru

    Messages:
    3,929
    Likes Received:
    178
    GPU:
    290X Lightning CFX (H2O)
Of course there is a real raw power jump from the 290X to the Fury X; I didn't express it well enough. :)

I mean there is no SUBSTANTIAL raw power jump to justify the upgrade, especially in his case (and mine): 290X CFX to a single Fury X.
     

  11. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,074
    Likes Received:
    908
    GPU:
    Inno3D RTX 3090
Thanks for that. Since the drivers were more or less close to release, do you believe they had it in the pipeline as a fix anyway?
     
  12. theoneofgod

    theoneofgod Ancient Guru

    Messages:
    4,652
    Likes Received:
    271
    GPU:
    RX 580 8GB
  13. mcfart

    mcfart Master Guru

    Messages:
    311
    Likes Received:
    2
    GPU:
    HD 5970
Probably. Nvidia will pull out all the stops to make people upgrade.

If AMD were actually competitive, then Nvidia wouldn't be burying their own legacy GPUs.
     
  14. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,074
    Likes Received:
    908
    GPU:
    Inno3D RTX 3090
  15. theoneofgod

    theoneofgod Ancient Guru

    Messages:
    4,652
    Likes Received:
    271
    GPU:
    RX 580 8GB
I'm not siding with either theory, just showing you sources for things said by NVIDIA, etc.
I don't know enough about NVIDIA's drivers to care, really :D Yet!
     

  16. kcuestag

    kcuestag Master Guru

    Messages:
    904
    Likes Received:
    0
    GPU:
    Gigabyte GTX980Ti G1

    From what I've seen, an OC'd GTX980Ti can easily perform the same as two R9 290X.

I am not looking to gain performance. I know that in some games I will GAIN performance, and in some maybe not gain at all, but the card will run cooler and quieter, and my computer will draw a LOT LESS power, not to mention the smoothness of running a single card and not having to deal with things like stuttering due to CPU overhead or frame pacing.

I always tend to do that when upgrading to a new generation: I move from two older cards to one of the new generation, and a few months later I try to get the second card for CFX or SLI. :)

I can't really tell how it compares to my 2x 290X yet because I am not receiving the 980 Ti until Tuesday or Wednesday, but if you want I can PM you and let you know how I feel about this change.

I have to agree, it does not justify the change. Considering I sold both my 290Xs for about 560€ and got the 980 Ti G1 for 770€, I lost 210€, and performance-wise I will be pretty much where I was before. But then again, like I said above, I will gain in the following:

- Overall computer temperatures
- Power draw (meaning my room will not be as hot..)
- Smoothness of a single GPU
- Newer technology


To conclude, I did not "upgrade" from 2x 290X to a 980 Ti for performance; I did it because I like trying every new generation. If it were for performance, I'd still have the 2x 290X, which are more than enough for 1440p.
     
    Last edited: Jun 29, 2015
  17. sammarbella

    sammarbella Ancient Guru

    Messages:
    3,929
    Likes Received:
    178
    GPU:
    290X Lightning CFX (H2O)
Even a 980 Ti OCed like the G1 could have a very hard time beating my 290X CFX with its stable, silent, and cool 1150/1600 OC (<60°C under load, watercooled with a 1080mm rad).

    :D

There you point out the two main advantages: the power draw will be SUBSTANTIALLY less than an OCed 290X CFX, and a single GPU doesn't have the CFX/SLI headaches.

I usually do the same. I will wait for the next GPU launch with 8 GB of VRAM and see what the new DX12 scenario on W10 looks like; the next months will be very interesting.

    :)
     
  18. kn00tcn

    kn00tcn Ancient Guru

    Messages:
    1,604
    Likes Received:
    2
    GPU:
    570m / MSI 660 Gaming OC
did i read this properly? are you saying you should lower your fps to 4k levels to get a better delta against another gpu or brand!? :puke2:

    some people want 1080p60 or worse... 1080p120, how is downsampled or native 35fps 4k going to help?

    let me make a very hypothetical example:

    1080p hawaii - 35fps
    1080p fiji - 45fps
    1080p maxwell - 55fps
    4k hawaii - 20fps
    4k fiji - 35fps
    4k maxwell - 30fps

    in this example, fiji is better for 4k users, great choice yes, but it's useless for 1080p users, there is nothing that can be done other than cpu improvements to the driver (IF that is the problem)
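
to make that concrete, here's a rough python sketch (purely illustrative, reusing the made-up numbers above, not real benchmarks) of why fiji's 1080p number would smell like a cpu/driver bottleneck:

```python
# rough sketch: how much of its 1080p fps each gpu keeps at 4k,
# using the hypothetical numbers above (not real benchmarks)
fps = {
    "hawaii":  {"1080p": 35, "4k": 20},
    "fiji":    {"1080p": 45, "4k": 35},
    "maxwell": {"1080p": 55, "4k": 30},
}

for gpu, f in fps.items():
    # closer to 1.0 = the gpu barely slows down at 4k, i.e. something
    # else (cpu/driver) was the limit at 1080p
    retention = f["4k"] / f["1080p"]
    print(f"{gpu}: keeps {retention:.0%} of its 1080p fps at 4k")

# fiji keeping ~78% (vs ~57% hawaii, ~55% maxwell) would hint that something
# other than the gpu is holding it back at 1080p, i.e. the "IF that is the
# problem" driver/cpu caveat above
```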

downsampling is crap, you don't get the sharpness of native pixels & you lose all that performance (i could try some more on my 660 if needed, but i'd better do it on some light game so i can focus only on image quality at a fixed 60fps)
     
  19. PrMinisterGR

    PrMinisterGR Ancient Guru

    Messages:
    8,074
    Likes Received:
    908
    GPU:
    Inno3D RTX 3090
This is twisting it a bit. What I said was that at the higher resolutions that both these cards should be used at, the differences are not that great. I still don't trust AMD to be on time with building a driver to feed this thing, though.

You are right about this. If I were going for ultra-high fps on DX11, I would never consider an AMD card, just because the driver is so slow.

    A lot of people confuse antialiasing with higher resolution gaming. You still need antialiasing even with a 4k display, although less than what you would need for a 1080p display. The key difference is the distance the viewer has from the display, and the display's PPI.
I can tell you that 4x supersampling at 1080p looks fantastic, and it is a legit use of a card like that. It will make every single game you play look tons better just by doing that. Everything is in the eye of the beholder, though; I prefer AA to a lot of other stuff, and I can understand if it doesn't interest you.
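
Just to put numbers on that, here is a quick sketch of the pixel math only (nothing driver-specific, just arithmetic):

```python
# what "4x supersampling at 1080p" means in pixel terms:
# render at 2x width and 2x height, then filter down to native.
native_w, native_h = 1920, 1080
ss_factor = 4                    # 4x the samples = 2x per axis

scale = ss_factor ** 0.5         # per-axis multiplier (2.0)
render_w, render_h = int(native_w * scale), int(native_h * scale)

print(f"render target: {render_w}x{render_h}")  # 3840x2160, a full 4K frame
print(f"shaded pixels: {render_w * render_h:,} vs native {native_w * native_h:,}")
# 8,294,400 vs 2,073,600 - the card shades a 4K frame either way, which is
# why a 4K-class card is exactly what makes 1080p + 4x SSAA viable.
```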
     
  20. kn00tcn

    kn00tcn Ancient Guru

    Messages:
    1,604
    Likes Received:
    2
    GPU:
    570m / MSI 660 Gaming OC
i love AA... but also love performance... so for example, i drop MSAA & switch to FX/ML/SMAA (SMAA is the best of these 3)

    anything not native pixels is going to be blurry or awkward compared to native

    i could try some testing i suppose... but so many games are going to drop below 60 so it's not worth it

alternate statement: if i were willing to buy a fury x, i would be willing to buy a 4k monitor; i wouldn't waste clarity with downsampling

btw i sit quite close to my monitor, only a forearm's length away; i like immersion & being able to see every single pixel, so i'm right there with you saying you always need AA even at high resolution/ppi (in fact i have to deal with blurry scaled 720p ps3 games with no AA, now that's a tough one)

i should do AA experiments with high ppi mobile devices since that's all i have available; i could try to calculate the distance i need to mimic double the density of my current monitor
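
the back-of-the-envelope version of that distance calculation looks like this (the 92 ppi monitor and 18-inch forearm distance are my own assumptions, purely illustrative):

```python
import math

# perceived density is angular: pixels per degree of vision scales
# linearly with viewing distance, so doubling the distance from the
# same monitor doubles the apparent ppi.
def pixels_per_degree(ppi, distance_inches):
    # pixels covered by one degree of visual angle at this distance
    return ppi * distance_inches * math.tan(math.radians(1))

monitor_ppi = 92     # assumed ~24" 1080p monitor (illustrative)
forearm = 18         # inches, roughly "a forearm's length"

print(f"{pixels_per_degree(monitor_ppi, forearm):.0f} ppd at {forearm} in")
print(f"{pixels_per_degree(monitor_ppi, forearm * 2):.0f} ppd at {forearm * 2} in")
# ~29 ppd vs ~58 ppd: sitting twice as far approximates a 2x-ppi device
# viewed at the original distance (ignoring subpixel layout, scaling, etc.)
```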

but still... in motion... AA is hardly noticeable, that's why it's the first to go if i'm running too slow; i'd much rather keep AO than AA

(EDIT: i realize the irony of my statements while using a mobile fermi & a cheap kepler, so i'll mention that in 2008 i was on the bleeding edge with a 4870x2 until it died in 2013; now it's just... let's see skylake & fiji, make some money, play the less demanding backlog, etc)
     
    Last edited: Jun 29, 2015
