Guru3D.com Forums

Guru3D.com Forums > Videocards > Videocards - AMD Radeon Catalyst Drivers Section
In this section you can discuss everything AMD Catalyst related. Catalyst drivers are for all AMD-based graphics cards and APUs.



(#101)
siriq
Master Guru
 
siriq's Avatar
 
Videocard: Evga GTX 570 Classified
Processor: FX 8350@4.8&PII x6 1090T
Mainboard: GA 890FX-UD5
Memory: 16GB DDR3 1600MHz
Soundcard: Asus Xonar D2X PCIE
PSU: Seasonic
Default 05-09-2017, 21:26 | posts: 749 | Location: Earth

Quote:
Originally Posted by Romulus_ut3 View Post
Just for fun, I feel like informing you that the UVD bug is only present for GCN 1.0 mostly and 1.1 in some cases. RX 480 is unaffected.
Nice to know in the fun way.
   
(#102)
Spartan
Master Guru
 
Spartan's Avatar
 
Videocard: R9 290 PCS+
Processor: i5 4690K @ 4.2Ghz / H80i
Mainboard: ASUS Z97-PRO GAMER
Memory: 2x8GB HyperXFury 2133Mhz
Soundcard:
PSU: Corsair CS750M
Default 05-09-2017, 21:31 | posts: 675 | Location: United Kingdom

Quote:
Originally Posted by blppt View Post
Aren't those all DirectX 9 games?
Arkham Origins is DX11.
   
(#103)
PrMinisterGR
Ancient Guru
 
PrMinisterGR's Avatar
 
Videocard: Sapphire 7970 Quadrobake
Processor: Core i7 2600k@4.5GHz
Mainboard: Sapphire Pure Black P67
Memory: Corsair Vengeance 16GB
Soundcard: ASUS Xonar D2X
PSU: EVGA SuperNova 750 G2
Default 05-09-2017, 21:48 | posts: 6,766

Quote:
Originally Posted by blppt View Post
Disagree on the "no issue" thing. For one thing, forcing it on would prevent certain games from even loading (Guild Wars 2 comes to mind for me).
Those were most likely hooking issues with third-party overlays (Steam, RTSS, etc.) interfering, something a driver-level implementation wouldn't face.

Quote:
Originally Posted by blppt View Post
Secondly, for a game that didn't already have triple buffering + Vsync, I was never actually able to force triple buffering on. My guess as to why AMD and Nvidia never looked into what the RadeonPro guy (or the D3DOverrider guy for 32-bit/DX9 games) was doing with his forced TB method is that for the number of games/systems it actually worked on, it wasn't worth it.
Can you give me one title as an example? I don't remember TB not working with anything that RadeonPro actually worked on.

Quote:
Originally Posted by blppt View Post
If you read various threads on the nvidia reddit, they confirm that the Triple Buffering setting in the NVCP only affects OGL games. It does absolutely nothing for D3D.

Example: https://www.reddit.com/r/nvidia/comm...ple_buffering/

They also suggest that our "Fast Sync" setting is about the closest you can get to forcing triple buffering on in games that aren't coded for it, but I've never tried that.
This is just a shame. Fast Sync is nothing like triple buffering, by the way. To highlight that it's possible to force it everywhere: it's actually on the list of features that AMD is taking feedback on implementing.

http://radeon.com/radeonsoftware/fee...5.1&lang=en_US
   
(#104)
blppt
Member Guru
 
Videocard: 290X CF, Titan Black SLI
Processor: Core i7 / FX9590
Mainboard:
Memory:
Soundcard:
PSU: EVGA 1300G2
Default 05-09-2017, 21:51 | posts: 72

Quote:
Originally Posted by Spartan View Post
Arkham Origins is dx11.
Interesting. In any case, I think that especially for DX11 games, the success rate with force enabling/disabling triple buffering might have been too low for Nvidia or AMD to feel the need to add such a feature to their drivers.

With DX9 I'd expect a higher success rate, since you were able to force on other things that you couldn't really do with DX10+, or at least not effectively, like MSAA. Probably due to the extensive rendering-path rewrite between DX9 and DX10/11.
   
(#105)
blppt
Member Guru
 
Videocard: 290X CF, Titan Black SLI
Processor: Core i7 / FX9590
Mainboard:
Memory:
Soundcard:
PSU: EVGA 1300G2
Default 05-09-2017, 21:58 | posts: 72

"To highlight that it's possible to force it everywhere"

I'm not sure whether that's actually possible, or whether people just submitted it and it got a lot of votes.

According to just about everything I've read (and tried to understand) on the subject, it's not possible to force-enable triple buffering in Direct3D if the game engine does not natively support it. A lot of people, I think, mistake triple buffering for "rendering ahead", for which Nvidia already has an NVCP setting (Maximum Pre-Rendered Frames) that AMD doesn't; that increases input lag, whereas true triple buffering shouldn't affect lag that much.

I could be completely wrong, but that's what I've seen.
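
To illustrate the distinction being drawn here, the following is a minimal D3D11/DXGI sketch (standard Microsoft APIs; device/window setup and error handling are omitted, and the function itself is purely illustrative): the number of swap-chain buffers and the render-ahead limit are two separate knobs an application sets.

Code:
// Minimal sketch: back-buffer count vs. render-ahead queue in D3D11/DXGI.
// "Triple buffering" is about how many buffers the swap chain owns; "Maximum
// Pre-Rendered Frames" is how many frames the CPU may queue ahead of the GPU.
#include <d3d11.h>
#include <dxgi1_2.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

void ConfigurePresentation(ID3D11Device* device, IDXGIFactory2* factory, HWND hwnd,
                           ComPtr<IDXGISwapChain1>& swapChain)
{
    DXGI_SWAP_CHAIN_DESC1 desc = {};
    desc.Format           = DXGI_FORMAT_R8G8B8A8_UNORM;   // width/height 0 = use window size
    desc.SampleDesc.Count = 1;
    desc.BufferUsage      = DXGI_USAGE_RENDER_TARGET_OUTPUT;
    desc.BufferCount      = 3;                             // three buffers: "triple buffering"
    desc.SwapEffect       = DXGI_SWAP_EFFECT_FLIP_DISCARD;
    factory->CreateSwapChainForHwnd(device, hwnd, &desc, nullptr, nullptr, &swapChain);

    // The "render ahead" limit is a different, unrelated setting:
    ComPtr<IDXGIDevice1> dxgiDevice;
    device->QueryInterface(IID_PPV_ARGS(&dxgiDevice));
    dxgiDevice->SetMaximumFrameLatency(1);                 // ~ "Maximum Pre-Rendered Frames"
}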
   
(#106)
PrMinisterGR
Ancient Guru
 
PrMinisterGR's Avatar
 
Videocard: Sapphire 7970 Quadrobake
Processor: Core i7 2600k@4.5GHz
Mainboard: Sapphire Pure Black P67
Memory: Corsair Vengeance 16GB
Soundcard: ASUS Xonar D2X
PSU: EVGA SuperNova 750 G2
Default 05-09-2017, 22:46 | posts: 6,766

Quote:
Originally Posted by blppt View Post
"To highlight that it's possible to force it everywhere"

I'm not sure whether that's actually possible, or whether people just submitted it and it got a lot of votes.

According to just about everything I've read (and tried to understand) on the subject, it's not possible to force-enable triple buffering in Direct3D if the game engine does not natively support it. A lot of people, I think, mistake triple buffering for "rendering ahead", for which Nvidia already has an NVCP setting (Maximum Pre-Rendered Frames) that AMD doesn't; that increases input lag, whereas true triple buffering shouldn't affect lag that much.

I could be completely wrong, but that's what I've seen.
That list is created by AMD, not by users; you can only vote on it, not add or remove proposed features. It is completely possible to enforce triple buffering, especially if you handle it at the driver level. Pre-rendered frames and flip queue size are, again, a different thing. There are flip queue controls that currently work in the driver, and they are again accessible through RadeonPro and RadeonMod.
   
(#107)
Romulus_ut3
Master Guru
 
Videocard: HD 7950 Vapor-X 3GB Boost
Processor: Core i5 3470 @ 3.8 GHz
Mainboard: Gigabyte GA-Z77-D3H Rev 1
Memory: Vengeance 2x4GB 1600 CL9
Soundcard: Creative 24 bit Live! 5.1
PSU: Thermaltake 650W Smart
Default 05-09-2017, 23:29 | posts: 502 | Location: Bangladesh

Quote:
Originally Posted by blppt View Post
It has never done anything for me in either enabled or disabled position, going back to my Titan Blacks, or even my SLI'd 680s in D3D games. Now, RadeonPro and D3DOverrider had been able to force Triple Buffering on/off in some games with varying success.

If the game engine supported Triple Buffering with VSync, Triple Buffering was on. I couldn't disable it or force-enable it using the NVCP. The only way I could force TB was borderless windowed mode, and the reason is that all that stuff is taken over by Windows' DWM and is no longer at the mercy of the game engine or the NVCP.
Triple buffering in the NVCP has worked for me since the days of the FX 5200. I've gotten it to smooth out numerous FIFA/sports games where enabling Vsync would drop my framerate to half of my refresh rate. Today's fancy DWM or borderless windowed mode didn't even exist back then. In games like Crysis on Windows Vista SP2, running without that Triple Buffering option checked resulted in the framerate dipping to 30 when Vsync was enabled, so don't believe everything you read on Reddit. Things may have changed since Windows 8/10, I'd imagine, but I remember clearly that Triple Buffering set Nvidia and AMD apart when it came to playing older titles like Assassin's Creed II when it originally launched.

Also, Triple Buffering is highly discouraged for SLI/CFX configurations, and IIRC it just wouldn't work/play nice with multi-GPU setups.

Last edited by Romulus_ut3; 05-09-2017 at 23:50.
   
(#108)
blppt
Member Guru
 
Videocard: 290X CF, Titan Black SLI
Processor: Core i7 / FX9590
Mainboard:
Memory:
Soundcard:
PSU: EVGA 1300G2
Default 05-10-2017, 05:53 | posts: 72

Quote:
Originally Posted by PrMinisterGR View Post
It is completely possible to enforce triple buffering, especially if you handle it on the driver level.
From what I'm reading, what we are actually able to force on for D3D games and borderless windowed mode is not true triple buffering. It has a queue of 3 buffers, yes, but it's not actually triple buffering, which is why people are always complaining about it introducing latency. Nvidia's Fast Sync (which I have yet to try on my 1080s) is supposedly as close as you can get to forcing true triple buffering in D3D games. Apparently what Nvidia, Microsoft and AMD refer to as "triple buffering" isn't actually correct; it's referring to a swap queue/render-ahead queue in D3D.

"There are flip queue controls that currently work in the driver and they are again accessible through RadeonPro and RadeonMod."

Right, but not in the official control panel. Perhaps that is what they are asking to enable?
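
On the Fast Sync comparison above, a rough sketch of the roughly comparable behaviour an application can get on its own with DXGI flip-model presentation in a borderless window (this is not Nvidia's implementation, and the helper function is illustrative only):

Code:
// Rough sketch of "fast sync"-like behaviour with a DXGI flip-model swap chain
// (windowed/borderless, composed by DWM). The render loop runs uncapped and
// presents with SyncInterval = 0; excess frames are dropped rather than queued,
// so the newest completed frame is shown without tearing and without a deep queue.
#include <dxgi1_2.h>

void PresentNewestFrame(IDXGISwapChain1* swapChain)
{
    // ... render into the current back buffer ...
    swapChain->Present(0, 0);   // 0 = don't wait for vblank; a stale pending frame gets replaced
}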
   
(#109)
blppt
Member Guru
 
Videocard: 290X CF, Titan Black SLI
Processor: Core i7 / FX9590
Mainboard:
Memory:
Soundcard:
PSU: EVGA 1300G2
Default 05-10-2017, 05:58 | posts: 72

Quote:
Originally Posted by Romulus_ut3 View Post
Also, Triple Buffer is highly discouraged for SLI/CFX configurations and IIRC it just wouldn't work/play nice with multi GPUs.
But it's not just Reddit; there are official Nvidia forum threads claiming that it will not affect anything but OGL games. Maybe it used to do something in older D3D/Windows versions and doesn't anymore?

Also, from what I understand, enabling CFX/SLI automatically enables triple buffering (or what we call TB), as it is required for AFR to work.
   
(#110)
Romulus_ut3
Master Guru
 
Videocard: HD 7950 Vapor-X 3GB Boost
Processor: Core i5 3470 @ 3.8 GHz
Mainboard: Gigabyte GA-Z77-D3H Rev 1
Memory: Vengeance 2x4GB 1600 CL9
Soundcard: Creative 24 bit Live! 5.1
PSU: Thermaltake 650W Smart
Default 05-10-2017, 08:33 | posts: 502 | Location: Bangladesh

I have already stated that the option made a difference prior to Windows 8/10, so yes, it made a difference in older OSes. Nowadays most games ship with triple buffering, as I haven't had to deal with Vsync causing my framerate to tank to half of my refresh rate.

Last edited by Romulus_ut3; 05-10-2017 at 08:36.
   
(#111)
PrMinisterGR
Ancient Guru
 
PrMinisterGR's Avatar
 
Videocard: Sapphire 7970 Quadrobake
Processor: Core i7 2600k@4.5GHz
Mainboard: Sapphire Pure Black P67
Memory: Corsair Vengeance 16GB
Soundcard: ASUS Xonar D2X
PSU: EVGA SuperNova 750 G2
Default 05-10-2017, 13:18 | posts: 6,766

Quote:
Originally Posted by blppt View Post
From what I'm reading, what we are actually able to force on for D3D games and borderless windowed mode is not true triple buffering. It has a queue of 3 buffers, yes, but it's not actually triple buffering, which is why people are always complaining about it introducing latency. Nvidia's Fast Sync (which I have yet to try on my 1080s) is supposedly as close as you can get to forcing true triple buffering in D3D games. Apparently what Nvidia, Microsoft and AMD refer to as "triple buffering" isn't actually correct; it's referring to a swap queue/render-ahead queue in D3D.

"There are flip queue controls that currently work in the driver and they are again accessible through RadeonPro and RadeonMod."

Right, but not in the official control panel. Perhaps that is what they are asking to enable?
No, the flip queue (as you mention it) is a different thing from triple buffering. Triple buffering is actually a back-buffer queue of 2, plus the front buffer being displayed.
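
To make that distinction concrete, here is a small self-contained toy simulation (plain C++, not any vendor's code) comparing how old the displayed frame is under true triple buffering (two back buffers, newest completed frame wins at each vblank) versus a 3-deep FIFO present queue:

Code:
// Toy model: age of the displayed frame at each vblank.
// - Triple buffering: the GPU keeps rendering and the newest completed frame is flipped.
// - 3-deep FIFO queue: frames are shown strictly in order; the queue depth becomes latency.
#include <algorithm>
#include <cstdio>
#include <deque>

int main()
{
    const double vblankMs = 16.7;   // ~60 Hz display
    const double renderMs = 10.0;   // GPU renders faster than the refresh rate

    double tbClock = 0.0, tbNewestDone = 0.0;        // triple-buffering state
    double fifoClock = 0.0; std::deque<double> fifo; // FIFO queue of frame completion times

    for (int v = 1; v <= 8; ++v) {
        double now = v * vblankMs;
        while (tbClock + renderMs <= now) {          // TB: render freely, overwrite the stale buffer
            tbClock += renderMs;
            tbNewestDone = tbClock;
        }
        while (fifo.size() < 3 && fifoClock + renderMs <= now) {  // FIFO: stall when full
            fifoClock += renderMs;
            fifo.push_back(fifoClock);
        }
        bool   wasFull  = (fifo.size() == 3);
        double fifoDone = fifo.empty() ? now : fifo.front();
        if (!fifo.empty()) fifo.pop_front();
        if (wasFull) fifoClock = std::max(fifoClock, now);  // renderer was blocked on Present
        std::printf("vblank %d: triple-buffered frame is %4.1f ms old, queued frame is %4.1f ms old\n",
                    v, now - tbNewestDone, now - fifoDone);
    }
    return 0;
}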
   
(#112)
blppt
Member Guru
 
Videocard: 290X CF, Titan Black SLI
Processor: Core i7 / FX9590
Mainboard:
Memory:
Soundcard:
PSU: EVGA 1300G2
Default 05-10-2017, 14:32 | posts: 72

Quote:
Originally Posted by Romulus_ut3 View Post
I have already stated that the option made a difference prior to Windows 8/10, so yes, it made a difference in older OSes. Now most games ship with Triple Buffer as I haven't dealt with Vsync causing my framerate to tank to half of what my refresh rate is.
Which, if I'm reading this correctly, seemingly supports the theories out there that to have true triple buffering in games, you need the game engine to implement it. I've seen the option recently, for example in Thief (which I gather uses UE3 or some custom version of it), where you can actually choose between triple and double buffering, and in DX:MD.
   
(#113)
blppt
Member Guru
 
Videocard: 290X CF, Titan Black SLI
Processor: Core i7 / FX9590
Mainboard:
Memory:
Soundcard:
PSU: EVGA 1300G2
Default 05-10-2017, 14:42 | posts: 72

Quote:
Originally Posted by PrMinisterGR View Post
No, Flip queue (as you mention) is a different thing to Triple Buffering. Triple buffering is actually a queue buffer of 2, plus the front buffer being displayed.
From what I gather, the reason the "forced" triple buffering in fullscreen exclusive D3D isn't real triple buffering (and thus can introduce noticeable latency) is that D3D has no ability baked in to discard unneeded buffers/frames if they aren't wanted. Does that make sense to you, or am I misreading this stuff?

In any case, I can't make the "Triple Buffering" option in the NVCP do anything on my system no matter what I do for D3D games in W10, even disabling SLI for troubleshooting (apparently for AFR/SLI to work, a pseudo-TB has to be in effect anyway).
   
(#114)
Chastity
Master Guru
 
Chastity's Avatar
 
Videocard: Sapphire Nitro 390 BP
Processor: Intel i7 950@4GHz
Mainboard: Rampage III Black X58
Memory: 6GB Dominators 1600
Soundcard: X-Fi USB HD
PSU: CM Silent Gold 1200
Default 05-10-2017, 18:23 | posts: 522

The Triple Buffering option in Radeon Settings only applies to OpenGL games. It has no effect on DirectX games, which is why we needed to use a third-party application to enable it for DX. Same for NVIDIA.
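
For context on why this can work for OpenGL but not for D3D: an OpenGL application only ever asks for a double-buffered pixel format and a swap interval, so the driver is free to slip a third buffer in behind SwapBuffers(). A minimal Win32/WGL sketch (standard WGL calls; window creation and error handling omitted):

Code:
// Minimal WGL sketch: the app requests "double buffered" + vsync, nothing more.
// OpenGL has no API to request a back-buffer count, so the third buffer is purely
// a driver decision, which is the knob the control-panel option toggles.
#include <windows.h>
#include <GL/gl.h>

typedef BOOL (WINAPI* PFNWGLSWAPINTERVALEXTPROC)(int interval);

void SetupGLPresentation(HDC hdc)
{
    PIXELFORMATDESCRIPTOR pfd = {};
    pfd.nSize      = sizeof(pfd);
    pfd.nVersion   = 1;
    pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
    pfd.iPixelType = PFD_TYPE_RGBA;
    pfd.cColorBits = 32;
    pfd.cDepthBits = 24;
    SetPixelFormat(hdc, ChoosePixelFormat(hdc, &pfd), &pfd);

    HGLRC ctx = wglCreateContext(hdc);
    wglMakeCurrent(hdc, ctx);

    // Vsync on via WGL_EXT_swap_control; buffer count is left entirely to the driver.
    auto wglSwapIntervalEXT =
        (PFNWGLSWAPINTERVALEXTPROC)wglGetProcAddress("wglSwapIntervalEXT");
    if (wglSwapIntervalEXT) wglSwapIntervalEXT(1);
}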
   
(#115)
blppt
Member Guru
 
Videocard: 290X CF, Titan Black SLI
Processor: Core i7 / FX9590
Mainboard:
Memory:
Soundcard:
PSU: EVGA 1300G2
Default 05-10-2017, 20:03 | posts: 72

Quote:
Originally Posted by Chastity View Post
The Triple Buffering option in Radeon Settings only applies to OpenGL games. It has no effect on DirectX games, which is why we needed to use a third-party application to enable it for DX. Same for NVIDIA.
Yeah, the original contention, I think, was that AMD labels the setting in its current control panel "OpenGL Triple Buffering", whereas Nvidia just says "Triple Buffering" with no mention of which API it actually works with.

Then I quickly took it to a dark place with breaking down the very definition of what triple-buffering actually is, and well....
   
(#116)
Chastity
Master Guru
 
Chastity's Avatar
 
Videocard: Sapphire Nitro 390 BP
Processor: Intel i7 950@4GHz
Mainboard: Rampage III Black X58
Memory: 6GB Dominators 1600
Soundcard: X-Fi USB HD
PSU: CM Silent Gold 1200
Default 05-10-2017, 22:05 | posts: 522

Quote:
Originally Posted by blppt View Post
Yeah, the contention was originally I think, that AMD labels its current control panel as "OpenGL Triple Buffering" whereas Nvidia just says "Triple Buffering" with no mention of what API it actually works with.

Then I quickly took it to a dark place with breaking down the very definition of what triple-buffering actually is, and well....
On the NV CP, if you read the description at the bottom when you highlight "Triple Buffering", you will see they mention it is for OpenGL.
   
(#117)
AlleyViper
Master Guru
 
Videocard: MSI RX 480 Gaming X 8G
Processor: PHII 965BE @4.0GHz
Mainboard: DFI LP DK 790FXB-M3H5
Memory: 2x4GB Crucial 1866C9
Soundcard: SB Z
PSU: RM750
Default 05-11-2017, 00:04 | posts: 276

x2, I remember that OpenGL detail in the description since 6800GT AGP days. Edit: no, even before that; I was using it in IL-2's OpenGL mode under an FX 5900.
   
(#118)
The Mac
Ancient Guru
 
Videocard: Sapphire R9-290 Vapor-X
Processor: i7-4970K
Mainboard: MSI Z97 Gaming 5
Memory: Corsair Vengence Pro 16GB
Soundcard:
PSU: Corasir AX750
Default 05-11-2017, 02:11 | posts: 4,407 | Location: USA

AMD has stated in the past that they won't add TB for DX, as it's outside the scope of Microsoft's recommendations.

MS recommends that the game engine implement it.

None of the third-party implementations of TB are true TB, per JapAMD, who wrote RadeonPro.
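
For reference, "the game engine implements it" is a one-field change in classic D3D9; this is also the presentation parameter that hook tools such as D3DOverrider reportedly rewrote at device creation. A minimal sketch (standard D3D9 structures and calls; the hard-coded resolution and omitted error handling are illustrative only):

Code:
// D3D9 sketch: engine-side triple buffering is BackBufferCount = 2
// (two back buffers + the front buffer), plus vsync on present.
#include <d3d9.h>

IDirect3DDevice9* CreateTripleBufferedDevice(IDirect3D9* d3d, HWND hwnd)
{
    D3DPRESENT_PARAMETERS pp = {};
    pp.Windowed             = FALSE;
    pp.BackBufferWidth      = 1920;
    pp.BackBufferHeight     = 1080;
    pp.BackBufferFormat     = D3DFMT_X8R8G8B8;
    pp.BackBufferCount      = 2;                        // 2 back buffers -> triple buffering
    pp.SwapEffect           = D3DSWAPEFFECT_DISCARD;
    pp.hDeviceWindow        = hwnd;
    pp.PresentationInterval = D3DPRESENT_INTERVAL_ONE;  // vsync

    IDirect3DDevice9* device = nullptr;
    d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hwnd,
                      D3DCREATE_HARDWARE_VERTEXPROCESSING, &pp, &device);
    return device;
}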
   
(#119)
N0bodyOfTheGoat
Master Guru
 
Videocard: Sapphire Nitro R9 390 8GB
Processor: i7 5820k
Mainboard: Gigabyte X99 Gaming 5P
Memory: 4x4GB
Soundcard: Creative onboard
PSU: Corsair RM850W
Default 05-11-2017, 06:43 | posts: 180 | Location: Southampton, UK

Quote:
Originally Posted by N0bodyOfTheGoat View Post
Promising start: for the first time in a while, my PC booted to the desktop on the first try, with no Wattman error message by the clock.
Sadly, this was a one-off event.

Most times I start my pc, it turns itself off just after the Windows 10 logo appears, or it simply hangs on the logo screen. After turning the pc back on again, or doing a hard reset from the hang, it normally boots.

And then after logging in, there is a message about Wattman experiencing an error and being reset.

All I have adjusted in Global Wattman is turning "Chill" on, and that only in the last few days; normally it has been completely untouched. I use Sapphire Trixx 6.30 to set a custom fan profile.
   
(#120)
Romulus_ut3
Master Guru
 
Videocard: HD 7950 Vapor-X 3GB Boost
Processor: Core i5 3470 @ 3.8 GHz
Mainboard: Gigabyte GA-Z77-D3H Rev 1
Memory: Vengeance 2x4GB 1600 CL9
Soundcard: Creative 24 bit Live! 5.1
PSU: Thermaltake 650W Smart
Talking 05-11-2017, 09:07 | posts: 502 | Location: Bangladesh

I don't see anything in the description that mentions Triple Buffering being limited to D3D or OpenGL.

[spoiler: screenshot of the NVCP "Triple Buffering" description]

However, the most common mistake people are making is this.

When you hover your mouse over the Vsync Option, this is what it says:

[spoiler: screenshot of the NVCP "Vertical sync" description]

So the Vsync setting's description of its usage scenarios is being passed off as the Triple Buffering implementation, eh?

What's next? "If you can't enable Vsync for a game from the NVCP, then Triple Buffering is useless, therefore Triple Buffering is only supported in OpenGL titles"? That's the logic that's going to get thrown at me, I assume?

Last edited by Romulus_ut3; 05-11-2017 at 09:17.
   
(#121)
Spartan
Master Guru
 
Spartan's Avatar
 
Videocard: R9 290 PCS+
Processor: i5 4690K @ 4.2Ghz / H80i
Mainboard: ASUS Z97-PRO GAMER
Memory: 2x8GB HyperXFury 2133Mhz
Soundcard:
PSU: Corsair CS750M
Default 05-11-2017, 11:11 | posts: 675 | Location: United Kingdom

Quote:
Originally Posted by The Mac View Post
ALL 3rd party implementations of TB are not true TB per JapAMD who wrote radeopro.
I choose RP's fake TB over garbage in-game vsync anytime.
   
(#122)
blppt
Member Guru
 
Videocard: 290X CF, Titan Black SLI
Processor: Core i7 / FX9590
Mainboard:
Memory:
Soundcard:
PSU: EVGA 1300G2
Default 05-11-2017, 20:27 | posts: 72

Quote:
Originally Posted by Chastity View Post
On the NV CP, if you read the description on the bottom when you highlight "Triple Buffering" you will see they mention this is for OpenGL.
Damn, I can't believe I missed that. I've only scrolled past that option, oh, about a million times.
   
(#123)
blppt
Member Guru
 
Videocard: 290X CF, Titan Black SLI
Processor: Core i7 / FX9590
Mainboard:
Memory:
Soundcard:
PSU: EVGA 1300G2
Default 05-11-2017, 20:33 | posts: 72

Quote:
Originally Posted by Romulus_ut3 View Post
What's next? If you can't enable Vsync for a game from the NVCP, the Triple Buffering is useless, therefore Triple Buffer is only supported in OpenGL titles, that's going to be the logic that's going to get thrown at me I assume?
I'm not sure you can say that definitively, since you *can* have a game that allows you to enable vsync but is limited to double buffering. I think L.A. Noire was one such case, at least in the early patches. So theoretically you could have a title that has working vsync but where you'd want to add triple buffering as well.

But unfortunately, based on my own experiences and what other people are saying, it appears that toggle in the NVCP does nothing for D3D games (anymore).
   
(#124)
z8373767
Member Guru
 
z8373767's Avatar
 
Videocard: R9 Fury Nitro OC+
Processor: i5-5675C @ 4.1GHz 1.27V
Mainboard: Gigabyte Z97X-Gaming 3
Memory: DDR3 2x8GB 2400 CL11
Soundcard: Sound Blaster X-fi 3
PSU: XFX TS 750W Gold
Default 05-11-2017, 21:48 | posts: 51 | Location: Poland

Quote:
Originally Posted by Spartan View Post
I choose RP's fake TB over garbage in-game vsync anytime.
Same here. Vertical sync in the Assassin's Creed games is garbage (mouse stuttering). Vsync + TB through RadeonPro is very smooth and solves the stuttering problem. GTA V works well too (SMAA).
Almost every modern game I've played works with RadeonPro.
The Witcher 3 is the only one I can't run with RP.

Last edited by z8373767; 05-11-2017 at 21:53.
   
(#125)
Romulus_ut3
Master Guru
 
Videocard: HD 7950 Vapor-X 3GB Boost
Processor: Core i5 3470 @ 3.8 GHz
Mainboard: Gigabyte GA-Z77-D3H Rev 1
Memory: Vengeance 2x4GB 1600 CL9
Soundcard: Creative 24 bit Live! 5.1
PSU: Thermaltake 650W Smart
Default 05-11-2017, 22:05 | posts: 502 | Location: Bangladesh

I proved the description bit to be wrong.

The easiest way to determine whether the Nvidia control panel's Triple Buffering setting makes a difference is to run Crysis with Triple Buffering and Vsync on, then run it again with Triple Buffering disabled and Vsync left on. If your framerate drops to half of your monitor's refresh rate (double buffering), then you'll have your answer. The recommended OS to try this on is Windows 7.
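
The halving follows from how double-buffered vsync quantizes the framerate: a frame that takes even slightly longer than one refresh interval has to wait for the next vblank. A tiny calculation sketch (plain C++, illustrative numbers only):

Code:
// Double-buffered vsync locks the framerate to refresh/1, refresh/2, refresh/3, ...
// because a frame that misses a vblank waits for the next one.
#include <cmath>
#include <cstdio>

int main()
{
    const double refreshHz = 60.0;
    const double frameTimesMs[] = {12.0, 17.0, 30.0, 36.0};  // example GPU frame times

    for (double ms : frameTimesMs) {
        double vblanksPerFrame = std::ceil(ms / (1000.0 / refreshHz));
        std::printf("%.0f ms/frame -> %.0f fps with double-buffered vsync\n",
                    ms, refreshHz / vblanksPerFrame);
    }
    return 0;
}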
   