Guru3D.com Forums

Videocards - AMD Radeon Catalyst Drivers Section

(#76)
blppt
Member Guru
 
Videocard: 290X CF, Titan Black SLI
Processor: Core i7 / FX9590
Mainboard:
Memory:
Soundcard:
PSU: EVGA 1300G2
05-08-2017, 17:37 | posts: 72

Quote:
Originally Posted by go4brendon
Final nail in the coffin for me. Had to run ME:A on a single GPU. Busy selling my 295X2; bought a 1080 Ti Seahawk X. Sorry team red, I'm out.

Pretty similar situation here: two 8GB 290Xs, still waiting for Vega. A single 290X just doesn't cut it for this game, even with tessellation turned down to 8x, so there must be some other issue going on here.

I know someone below suggested GameWorks, but would that *really* have any effect on the obvious texture-loading bugs on AMD cards? I know GameWorks saturates tessellation, but I've never seen it cause texture-loading issues, just overall lower framerates until AMD updates their drivers (or you manually lower tessellation in the AMD settings panel).

That these drivers were released with a CFX profile leads me to believe they were tested/developed on an ME:A build that never saw the public light of day, because I can't imagine anybody thinking this is "working".
   
(#77)
Krteq
Master Guru
 
Videocard: R9 290 Tri-X OC
Processor: Core i7 6700K @4,7 GHz WC
Mainboard: GIGABYTE Z170-D3H
Memory: 2x8GB DDR4 3200MHz
Soundcard: Prodigy 7.1 HiFi + HD515
PSU: Seasonic 620W
05-08-2017, 18:19 | posts: 319 | Location: Czech Republic

Quote:
Originally Posted by Romulus_ut3
Playing older games like Assassin's Creed, the Mass Effect trilogy, Need for Speed, Crysis, etc. is a much better experience on Nvidia cards than on AMD. AMD's GPUs have far worse frametimes, resulting in a worse gameplay experience compared to Nvidia's, especially when equipped with less capable CPUs.
Wow, this myth still exists?

Frametimes have been comparable since 2014 (that famous TechReport article), and every trustworthy tech site tests frametimes in HW reviews today. Just check some reviews and stop spreading this misinformation.
   
(#78)
AlleyViper
Master Guru
 
Videocard: MSI RX 480 Gaming X 8G
Processor: PHII 965BE @4.0GHz
Mainboard: DFI LP DK 790FXB-M3H5
Memory: 2x4GB Crucial 1866C9
Soundcard: SB Z
PSU: RM750
05-08-2017, 19:05 | posts: 276

Weird: on my Radeon 7870, Additional Settings doesn't open (a message pops up saying it can't be started because there's nothing available to configure), while on my RX 480 it opens with the few remaining options.

Edit: Btw, regarding the UVD bug, there's some hope it might end up fixed: link
   
(#79)
MerolaC
Ancient Guru
 
 
Videocard: Sapphire NITRO R9 390 8GB
Processor: i7 6700k@Stock@CorsairH60
Mainboard: ASUS Maximus VIII Ranger
Memory: Vengeance LPX 16G@3200
Soundcard: SB Titanium HD
PSU: OCZ Fatal1ty G.G. - 750w
05-08-2017, 19:52 | posts: 2,797 | Location: Argentina

Forza Horizon 3 slow loading times. This sounds good:
https://community.amd.com/message/2797074
   
(#80)
Seren
Master Guru
 
Videocard: Gigabyte 797OC-3GD (1GHz)
Processor: AMD 8320 @ 4.1Ghz
Mainboard: Asus M5A97 Evo R2.0
Memory: 2x8GB Patriot VBM,1866,10
Soundcard: Realtek ALC892
PSU: XFX 750w Core Edition
05-08-2017, 20:07 | posts: 246 | Location: Perth, Western Australia

Quote:
Originally Posted by AlleyViper
Edit: Btw, regarding the UVD bug, there's some hope it might end up fixed: link
That UVD bug is weird. I never experienced it on my previous Windows install (7 -> 8.1 -> 10). I did, however, experience it once, shortly after doing a fresh CU install, before my PC was properly configured (latest C++ redistributables, custom power plan, tweaks, etc.). I haven't seen it since.
   
(#81)
TheDukeSD
Member Guru
 
Videocard: Sapphire HD 7750 1 GB
Processor: Intel Pentium G3260
Mainboard: ASRock B85M Pro3
Memory: 4x 2GB DDR3 Kingmax
Soundcard: C-Media CM108 USB
PSU: Seasonic S12II-380 Bronze
05-08-2017, 20:47 | posts: 113 | Location: Romania

Quote:
Originally Posted by Krteq
Wow, this myth still exists?

Frametimes have been comparable since 2014 (that famous TechReport article), and every trustworthy tech site tests frametimes in HW reviews today. Just check some reviews and stop spreading this misinformation.
With vsync on, some older games might not put enough load on the GPU to keep clocks constant; if the GPU/memory clocks change often, the game might not feel fluid (it depends a lot on the quality of the parts used to build that particular GPU, so it can vary from model to model and manufacturer to manufacturer).

Some people have reasons to keep vsync on:
- they use GCN 1.0
- they see the tearing and find it annoying/distracting

I have no clue what AMD has actually done with the drivers past 16.7.2; until the UVD bug is actually fixed, I won't update.
   
(#82)
PrMinisterGR
Ancient Guru
 
 
Videocard: Sapphire 7970 Quadrobake
Processor: Core i7 2600k@4.5GHz
Mainboard: Sapphire Pure Black P67
Memory: Corsair Vengeance 16GB
Soundcard: ASUS Xonar D2X
PSU: EVGA SuperNova 750 G2
05-08-2017, 20:52 | posts: 6,766

Quote:
Originally Posted by Krteq
Wow, this myth still exists?

Frametimes have been comparable since 2014 (that famous TechReport article), and every trustworthy tech site tests frametimes in HW reviews today. Just check some reviews and stop spreading this misinformation.
It's not a myth. Newer drivers have improved it, but it's quite real.

Quote:
Originally Posted by TheDukeSD
With vsync on, some older games might not put enough load on the GPU to keep clocks constant; if the GPU/memory clocks change often, the game might not feel fluid (it depends a lot on the quality of the parts used to build that particular GPU, so it can vary from model to model and manufacturer to manufacturer).

Some people have reasons to keep vsync on:
- they use GCN 1.0
- they see the tearing and find it annoying/distracting

I have no clue what AMD has actually done with the drivers past 16.7.2; until the UVD bug is actually fixed, I won't update.
The crime here is that AMD isn't implementing enforceable vsync modes/triple buffering, and that they don't allow Chill (which is a surprisingly decent frame limiter) to work with everything. These are literally the only two features their driver lacks.
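
Since Chill came up: at its core a frame limiter is simple. Here's a minimal fixed-cap sketch in C++ (my own illustration of the general technique, not AMD's code; Chill additionally varies the cap with input activity):

Code:
// Minimal fixed-cap frame limiter: sleep off whatever is left of the
// frame budget after rendering. Illustrative only; Chill's real
// implementation lives in the driver and adapts the cap dynamically.
#include <chrono>
#include <thread>

void run_capped(double target_fps) {
    using clock = std::chrono::steady_clock;
    const std::chrono::duration<double> budget(1.0 / target_fps);
    auto deadline = clock::now() + budget;
    for (;;) {
        // render_frame();  // the game's work would happen here
        std::this_thread::sleep_until(deadline);  // pace to the budget
        deadline += budget;  // advance by a fixed step to avoid drift
    }
}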
   
(#83)
Spartan
Master Guru
 
 
Videocard: R9 290 PCS+
Processor: i5 4690K @ 4.2Ghz / H80i
Mainboard: ASUS Z97-PRO GAMER
Memory: 2x8GB HyperXFury 2133Mhz
Soundcard:
PSU: Corsair CS750M
05-08-2017, 22:20 | posts: 675 | Location: United Kingdom

Quote:
Originally Posted by Krteq
Wow, this myth still exists?

Frametimes have been comparable since 2014 (that famous TechReport article), and every trustworthy tech site tests frametimes in HW reviews today. Just check some reviews and stop spreading this misinformation.
Not sure about the rest, but AC Black Flag ran like a dog turd on my PC ~two months ago, even at 1080p.
   
(#84)
blppt
Member Guru
 
Videocard: 290X CF, Titan Black SLI
Processor: Core i7 / FX9590
Mainboard:
Memory:
Soundcard:
PSU: EVGA 1300G2
05-08-2017, 22:48 | posts: 72

Quote:
Originally Posted by TheDukeSD
- they see the tearing and find it annoying/distracting
TBH, I have never understood how somebody can crank up the details in their games, then leave vsync off to tear up all those pretty textures. I can't stand it, and up until FreeSync/G-Sync it was double/triple-buffering nightmares.
   
(#85)
blppt
Member Guru
 
Videocard: 290X CF, Titan Black SLI
Processor: Core i7 / FX9590
Mainboard:
Memory:
Soundcard:
PSU: EVGA 1300G2
05-08-2017, 22:53 | posts: 72

Quote:
Originally Posted by PrMinisterGR View Post
It's not a myth. Newer drivers have improved it, but it's quite real.



The crime here is that AMD isn't implementing enforceable vsync modes/triple buffering, and that they don't allow Chill (which is a surprisingly decent frame limiter) to work with everything. These are literally the only two features their driver lacks.
FWIR, it's currently extremely difficult for AMD and Nvidia to "force" triple buffering in games due to the way Direct3D works, which is why the "enable/disable triple buffering" switch in the NVCP does absolutely nothing in modern DirectX games. At least AMD is honest about it, labeling the switch "OpenGL Triple Buffering".

The easiest way to force triple buffering, AFAIK, is to run the game in borderless fullscreen, which unfortunately disables CFX for those of us with multi-GPU setups, though Nvidia's SLI still works in that situation. I find, though, with a few exceptions, that most new games don't run quite as smoothly in borderless fullscreen as in exclusive fullscreen.
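
Some background on why forcing it is so awkward: in Direct3D the application itself creates the swap chain and bakes the buffer count in at creation time, so a driver toggle has nothing clean to override. A rough D3D11 sketch (standard API calls, error handling omitted):

Code:
// The app, not the driver, decides how many back buffers exist when it
// creates the swap chain; a control-panel switch can't simply "force"
// triple buffering in D3D titles.
#include <d3d11.h>
#pragma comment(lib, "d3d11.lib")

HRESULT create_device_and_swapchain(HWND hwnd, IDXGISwapChain** swap_chain,
                                    ID3D11Device** device,
                                    ID3D11DeviceContext** context) {
    DXGI_SWAP_CHAIN_DESC desc = {};
    desc.BufferCount = 3;  // triple buffering: the application's choice
    desc.BufferDesc.Format = DXGI_FORMAT_R8G8B8A8_UNORM;
    desc.BufferUsage = DXGI_USAGE_RENDER_TARGET_OUTPUT;
    desc.OutputWindow = hwnd;
    desc.SampleDesc.Count = 1;
    desc.Windowed = TRUE;
    return D3D11CreateDeviceAndSwapChain(
        nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
        nullptr, 0, D3D11_SDK_VERSION, &desc,
        swap_chain, device, nullptr, context);
}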
   
(#86)
Only Intruder
Master Guru
 
 
Videocard: Sapphire R9 Fury Nitro
Processor: i5 4690K
Mainboard: Z97P-D3
Memory: 16GB HyperX Savage
Soundcard: Creative X-Fi FPS
PSU: EVGA 750 GQ
05-08-2017, 23:24 | posts: 773 | Location: UK

Quote:
Originally Posted by blppt
"Nvidia is still a POS when it comes to HDTV scaling.
Explain to me why my 8400 GS can scale to my HDTV with zero tweaks, but my GTX 650 Ti, 9800 GTX, or GTX 570 cannot."

Curious as to what problems you're having... I also mirror to an HDTV for movies, etc., and haven't had any issues doing so with either my Nvidia or AMD setups for as long as I can remember.

Bear in mind, though, I'm using a 1080p 60Hz computer monitor and a 1080p 60Hz TV.
I can relate to that. When I had a laptop, it had an 8600M GS, and generally speaking extending monitors was problem-free with it; but when I had a 9600GT, using a TV as an extended monitor was a pain in the arse, requiring a custom resolution that just doesn't play nice with a lot of games/programs. This behaviour is still present in more modern cards too: a friend of mine has a GTX 770 and has the same issue, having to manually define a custom resolution in order to extend to a TV.

So I don't know what's different between a mobile nVidia chip and a desktop card, but you'd expect the behaviour to be the same (though the 8600M GS is an old example now).
   
(#87)
siriq
Master Guru
 
 
Videocard: Evga GTX 570 Classified
Processor: FX 8350@4.8&PII x6 1090T
Mainboard: GA 890FX-UD5
Memory: 16GB DDR3 1600MHz
Soundcard: Asus Xonar D2X PCIE
PSU: Seasonic
05-08-2017, 23:47 | posts: 749 | Location: Earth

Is this UVD bug the one with higher clock speeds? Because if I play a Blu-ray or any video while I have Chrome or any other app open, I get 300 MHz GPU / 300 MHz memory clocks. I asked about this once already, but it was a long time ago.

One more thing: VSR works fine all the way up to 4K, even with a CRT monitor. I have to use one temporarily, and VSR does an excellent job out of the box.

Making a 4:3 CRT monitor capable of 4K via VSR is priceless.

Last edited by siriq; 05-08-2017 at 23:51.
   
(#88)
siriq
Master Guru
 
 
Videocard: Evga GTX 570 Classified
Processor: FX 8350@4.8&PII x6 1090T
Mainboard: GA 890FX-UD5
Memory: 16GB DDR3 1600MHz
Soundcard: Asus Xonar D2X PCIE
PSU: Seasonic
05-09-2017, 00:12 | posts: 749 | Location: Earth

Just for fun, in madVR I turned the image scaling options up to max, and the card (an RX 480) is still not running at max GPU/memory speed. If I stop the player, everything goes back to 300/300.
   
(#89)
blppt
Member Guru
 
Videocard: 290X CF, Titan Black SLI
Processor: Core i7 / FX9590
Mainboard:
Memory:
Soundcard:
PSU: EVGA 1300G2
05-09-2017, 01:42 | posts: 72

Quote:
Originally Posted by Only Intruder
I can relate to that. When I had a laptop, it had an 8600M GS, and generally speaking extending monitors was problem-free with it; but when I had a 9600GT, using a TV as an extended monitor was a pain in the arse, requiring a custom resolution that just doesn't play nice with a lot of games/programs. This behaviour is still present in more modern cards too: a friend of mine has a GTX 770 and has the same issue, having to manually define a custom resolution in order to extend to a TV.

So I don't know what's different between a mobile nVidia chip and a desktop card, but you'd expect the behaviour to be the same (though the 8600M GS is an old example now).
Yeah, I don't use an extended desktop, so maybe that's why I've never had issues.
   
(#90)
Romulus_ut3
Master Guru
 
Videocard: HD 7950 Vapor-X 3GB Boost
Processor: Core i5 3470 @ 3.8 GHz
Mainboard: Gigabyte GA-Z77-D3H Rev 1
Memory: Vengeance 2x4GB 1600 CL9
Soundcard: Creative 24 bit Live! 5.1
PSU: Thermaltake 650W Smart
05-09-2017, 05:56 | posts: 502 | Location: Bangladesh

Quote:
Originally Posted by Krteq
Wow, this myth still exists?

Frametimes have been comparable since 2014 (that famous TechReport article), and every trustworthy tech site tests frametimes in HW reviews today. Just check some reviews and stop spreading this misinformation.
Yes, just check some reviews done by Gamers Nexus, Digital Foundry, or Hardware Unboxed. You'll see who's spreading misinformation. There are lots of titles where the RX 480 yields a higher maximum framerate but loses miserably when it comes to the minimum framerate.

[spoiler: Gamers Nexus review link, posted with a space in the name to get past the forum's URL filter]

Edit: Not sure why Guru3D filters out Gamers Nexus. They do very in-depth reviews and have been using frametime measurements for as long as I can remember. Reviewers who stick to just average framerate are the ones that are misleading, IMO.

Scott Wasson at Tech Report came up with the concept of frametime testing; he's with RTG now and has appeared in numerous Gamers Nexus videos.

I'm sure I've angered a lot of AMD fanboys like you by uttering the truth. The fact remains the same: DX11 or DX12, AMD's minimum framerate suffers. And if you're not using a top-of-the-line Core i7 CPU with your AMD GPU, you'll see worse performance compared to a competing Nvidia card. The GTX 1060 flies past an RX 480 when paired with a Core i3, and fares better when coupled with a Core i5. When these issues are brought up, people like you tell people like me that they don't exist, or to go buy Nvidia. Neither of those responses helps AMD or consumers; the market-share stats speak for themselves.
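
To make the average-versus-minimum point concrete, here's a toy C++ summary of a frametime log (my own sketch, not any outlet's exact method). The same run can post a healthy average FPS while the 99th-percentile frametime exposes the stutter:

Code:
// Average FPS hides spikes; percentile frametimes expose them.
#include <algorithm>
#include <cstdio>
#include <vector>

void summarize(std::vector<double> frametimes_ms) {
    double total = 0.0;
    for (double t : frametimes_ms) total += t;
    double avg_fps = 1000.0 * frametimes_ms.size() / total;

    std::sort(frametimes_ms.begin(), frametimes_ms.end());
    size_t p99 = static_cast<size_t>(0.99 * (frametimes_ms.size() - 1));
    std::printf("avg: %.1f fps, 99th percentile frametime: %.1f ms\n",
                avg_fps, frametimes_ms[p99]);
}

int main() {
    // Mostly smooth 16.7 ms frames, with a 50 ms hitch every 50th frame:
    std::vector<double> log(200, 16.7);
    for (size_t i = 0; i < log.size(); i += 50) log[i] = 50.0;
    summarize(log);  // ~58 fps average, but 50 ms worst-case frames
}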

Last edited by Romulus_ut3; 05-09-2017 at 10:03.
   
(#91)
Romulus_ut3
Master Guru
 
Videocard: HD 7950 Vapor-X 3GB Boost
Processor: Core i5 3470 @ 3.8 GHz
Mainboard: Gigabyte GA-Z77-D3H Rev 1
Memory: Vengeance 2x4GB 1600 CL9
Soundcard: Creative 24 bit Live! 5.1
PSU: Thermaltake 650W Smart
05-09-2017, 06:11 | posts: 502 | Location: Bangladesh

Quote:
Originally Posted by siriq
Just for fun, in madVR I turned the image scaling options up to max, and the card (an RX 480) is still not running at max GPU/memory speed. If I stop the player, everything goes back to 300/300.
Just for fun, I feel like informing you that the UVD bug is mostly present on GCN 1.0, and on GCN 1.1 in some cases. The RX 480 is unaffected.

Last edited by Romulus_ut3; 05-09-2017 at 07:31.
   
(#92)
Romulus_ut3
Master Guru
 
Videocard: HD 7950 Vapor-X 3GB Boost
Processor: Core i5 3470 @ 3.8 GHz
Mainboard: Gigabyte GA-Z77-D3H Rev 1
Memory: Vengeance 2x4GB 1600 CL9
Soundcard: Creative 24 bit Live! 5.1
PSU: Thermaltake 650W Smart
05-09-2017, 07:48 | posts: 502 | Location: Bangladesh

Quote:
Originally Posted by blppt
FWIR, it's currently extremely difficult for AMD and Nvidia to "force" triple buffering in games due to the way Direct3D works, which is why the "enable/disable triple buffering" switch in the NVCP does absolutely nothing in modern DirectX games. At least AMD is honest about it, labeling the switch "OpenGL Triple Buffering".

The easiest way to force triple buffering, AFAIK, is to run the game in borderless fullscreen, which unfortunately disables CFX for those of us with multi-GPU setups, though Nvidia's SLI still works in that situation. I find, though, with a few exceptions, that most new games don't run quite as smoothly in borderless fullscreen as in exclusive fullscreen.
Nvidia Control Panel's triple buffering has worked flawlessly in older DX9/DX10/DX11 titles since the release of Windows 10, whereas AMD's triple buffering, despite being labelled OpenGL-only, doesn't even work in OpenGL titles. Tested in Rage, DOOM 3, Prey, Quake IV, Oni, Quake III, and the list goes on. Over the years, people have had to rely on D3DOverrider or RadeonPro, or switch back and forth between windowed and fullscreen mode with Alt+Enter in titles that allowed it, to force triple buffering on AMD cards, something I believe the end user shouldn't have to deal with. For Nvidia it was way too easy with their control panel setting, which still applies on older operating systems.
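
For what it's worth, the OpenGL split works like this, as I understand it: vsync is app-controllable on Windows through the WGL_EXT_swap_control extension, while the number of back buffers stays with the driver, which is where an "OpenGL Triple Buffering" switch can act. A minimal sketch of the standard app-side call (requires a current GL context):

Code:
// Toggling vsync from the application via WGL_EXT_swap_control.
#include <windows.h>
#pragma comment(lib, "opengl32.lib")

typedef BOOL(WINAPI* PFNWGLSWAPINTERVALEXTPROC)(int interval);

bool enable_vsync() {
    // Only valid with an OpenGL context current on this thread.
    auto wglSwapIntervalEXT = reinterpret_cast<PFNWGLSWAPINTERVALEXTPROC>(
        wglGetProcAddress("wglSwapIntervalEXT"));
    return wglSwapIntervalEXT && wglSwapIntervalEXT(1);  // 1 = sync to vblank
}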
   
(#93)
Spartan
Master Guru
 
 
Videocard: R9 290 PCS+
Processor: i5 4690K @ 4.2Ghz / H80i
Mainboard: ASUS Z97-PRO GAMER
Memory: 2x8GB HyperXFury 2133Mhz
Soundcard:
PSU: Corsair CS750M
05-09-2017, 10:16 | posts: 675 | Location: United Kingdom

Quote:
Originally Posted by Romulus_ut3
Nvidia Control Panel's triple buffering has worked flawlessly in older DX9/DX10/DX11 titles since the release of Windows 10, whereas AMD's triple buffering, despite being labelled OpenGL-only, doesn't even work in OpenGL titles. Tested in Rage, DOOM 3, Prey, Quake IV, Oni, Quake III, and the list goes on. Over the years, people have had to rely on D3DOverrider or RadeonPro, or switch back and forth between windowed and fullscreen mode with Alt+Enter in titles that allowed it, to force triple buffering on AMD cards, something I believe the end user shouldn't have to deal with. For Nvidia it was way too easy with their control panel setting, which still applies on older operating systems.
Not sure what you expect when people vote for a "Radeon Settings built-in performance benchmark" over triple buffering, lel.
   
(#94)
Rambo
Master Guru
 
 
Videocard: RX 480->RX 560 w8 4 Vega
Processor: i5-4690
Mainboard: MSI Z97 GAMING 3
Memory: 4xG.SKILL 2400 CL10
Soundcard: Hail for bananas!
PSU: EVGA 550W
05-09-2017, 10:30 | posts: 153

Quote:
Originally Posted by Spartan
Not sure what you expect when people vote for a "Radeon Settings built-in performance benchmark" over triple buffering, lel.
Benchmarking > playing.
I'd like to know whether there's any point in voting; the poll was added a few months ago.
   
(#95)
OnnA
Ancient Guru
 
 
Videocard: Nitro Fiji-X HBM 1150/570
Processor: ZEN x8 k17 + Nepton 280L
Mainboard: ASUS Crosshair VI Hero
Memory: 16GB 3200 CL16 1T Ripjaws
Soundcard: SB-z Nichicon + Wood 5.1
PSU: Seasonic-X 750W Platinum
05-09-2017, 11:05 | posts: 2,862 | Location: HolyWater Village

Quote:
Originally Posted by Krteq
Wow, this myth still exists?

Frametimes have been comparable since 2014 (that famous TechReport article), and every trustworthy tech site tests frametimes in HW reviews today. Just check some reviews and stop spreading this misinformation.
Yup, it's wrong.

Sometimes you need to delete the game profile in ReLive for a game to work properly!
Sometimes you need to disable the FPS cap (e.g. in R6 Siege, use V-Sync 1 frame).
Sometimes you need to disable Power Efficiency.

etc.
   
(#96)
blppt
Member Guru
 
Videocard: 290X CF, Titan Black SLI
Processor: Core i7 / FX9590
Mainboard:
Memory:
Soundcard:
PSU: EVGA 1300G2
05-09-2017, 18:42 | posts: 72

Quote:
Originally Posted by Romulus_ut3
Nvidia Control Panel's triple buffering has worked flawlessly
It has never done anything for me in either the enabled or disabled position, going back to my Titan Blacks, or even my SLI'd 680s, in D3D games. RadeonPro and D3DOverrider, on the other hand, were able to force triple buffering on/off in some games, with varying success.

If the game engine supported triple buffering with vsync, triple buffering was on; I couldn't disable it or force-enable it using the NVCP. The only way I could force TB was borderless windowed mode, and the reason is that all that presentation is taken over by Windows' DWM, no longer at the mercy of the game engine or the NVCP.
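
For reference, "borderless windowed" is nothing exotic; it boils down to roughly these two Win32 calls (a sketch assuming the primary monitor), after which the game is just another composited window and DWM owns presentation:

Code:
// Strip the window decorations and stretch the window across the
// primary monitor. Presentation then goes through the DWM compositor
// (effectively buffered) instead of the exclusive-fullscreen path.
#include <windows.h>

void make_borderless_fullscreen(HWND hwnd) {
    SetWindowLongPtr(hwnd, GWL_STYLE, WS_POPUP | WS_VISIBLE);
    SetWindowPos(hwnd, HWND_TOP, 0, 0,
                 GetSystemMetrics(SM_CXSCREEN),
                 GetSystemMetrics(SM_CYSCREEN),
                 SWP_FRAMECHANGED | SWP_SHOWWINDOW);
}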
   
(#97)
PrMinisterGR
Ancient Guru
 
 
Videocard: Sapphire 7970 Quadrobake
Processor: Core i7 2600k@4.5GHz
Mainboard: Sapphire Pure Black P67
Memory: Corsair Vengeance 16GB
Soundcard: ASUS Xonar D2X
PSU: EVGA SuperNova 750 G2
05-09-2017, 19:18 | posts: 6,766

Quote:
Originally Posted by blppt
FWIR, it's currently extremely difficult for AMD and Nvidia to "force" triple buffering in games due to the way Direct3D works, which is why the "enable/disable triple buffering" switch in the NVCP does absolutely nothing in modern DirectX games. At least AMD is honest about it, labeling the switch "OpenGL Triple Buffering".

The easiest way to force triple buffering, AFAIK, is to run the game in borderless fullscreen, which unfortunately disables CFX for those of us with multi-GPU setups, though Nvidia's SLI still works in that situation. I find, though, with a few exceptions, that most new games don't run quite as smoothly in borderless fullscreen as in exclusive fullscreen.
No, it isn't. RadeonPro forces both TB and various vsync methods with no issue at all, and it's been abandonware for at least the last three years.
   
(#98)
blppt
Member Guru
 
Videocard: 290X CF, Titan Black SLI
Processor: Core i7 / FX9590
Mainboard:
Memory:
Soundcard:
PSU: EVGA 1300G2
05-09-2017, 20:29 | posts: 72

Quote:
Originally Posted by PrMinisterGR
No, it isn't. RadeonPro forces both TB and various vsync methods with no issue at all, and it's been abandonware for at least the last three years.
Disagree on the "no issue" part. For one thing, forcing it on would prevent certain games from even loading (Guild Wars 2 comes to mind). Secondly, in a game that didn't already have triple buffering + vsync, I was never actually able to force triple buffering on. My guess as to why AMD and Nvidia never looked into what the RadeonPro guy (or the D3DOverrider guy, for 32-bit/DX9 games) was doing with his forced-TB method is that for the number of games/systems it actually worked on, it wasn't worth it.

If you read various threads on the Nvidia reddit, they confirm that the triple buffering setting in the NVCP only affects OGL games. It does absolutely nothing for D3D.

Example: https://www.reddit.com/r/nvidia/comm...ple_buffering/

They also suggest that their "Fast Sync" setting is about the closest you can get to forcing triple buffering in games that aren't coded for it, but I've never tried that.
   
(#99)
Spartan
Master Guru
 
 
Videocard: R9 290 PCS+
Processor: i5 4690K @ 4.2Ghz / H80i
Mainboard: ASUS Z97-PRO GAMER
Memory: 2x8GB HyperXFury 2133Mhz
Soundcard:
PSU: Corsair CS750M
05-09-2017, 21:12 | posts: 675 | Location: United Kingdom

Quote:
Originally Posted by blppt
My guess as to why AMD and Nvidia never looked into what the RadeonPro guy (or the D3DOverrider guy, for 32-bit/DX9 games) was doing with his forced-TB method is that for the number of games/systems it actually worked on, it wasn't worth it.
Guess I'm lucky that RadeonPro works on my system in many older games. Currently playing:

Arkham Origins --> in-game vsync locks me to 30 / RadeonPro: smooth as butter
Serious Sam HD --> in-game vsync causes micro-stutters / RadeonPro: smooth as butter
Portal 2 --> in-game triple buffering causes micro-stutters / RadeonPro: smooth as butter

Many other games have garbage vsync, so RP is a must-have for me.
   
(#100)
blppt
Member Guru
 
Videocard: 290X CF, Titan Black SLI
Processor: Core i7 / FX9590
Mainboard:
Memory:
Soundcard:
PSU: EVGA 1300G2
05-09-2017, 21:23 | posts: 72

Quote:
Originally Posted by Spartan
Guess I'm lucky that RadeonPro works on my system in many older games. Currently playing:

Arkham Origins --> in-game vsync locks me to 30 / RadeonPro: smooth as butter
Serious Sam HD --> in-game vsync causes micro-stutters / RadeonPro: smooth as butter
Portal 2 --> in-game triple buffering causes micro-stutters / RadeonPro: smooth as butter

Many other games have garbage vsync, so RP is a must-have for me.
Aren't those all DirectX 9 games?
   