Discussion in 'Videocards - AMD Radeon' started by OnnA, Jul 9, 2016.
That is called a bait and switch. Very underhanded.
Has there been any kind of time frame announced for them yet?
I haven't seen any so far.
I don't understand how they can actually do that and think no one would notice, especially in this age.
It's amazing actually.
Asus RX Vega 64 ROG Strix Reviews
That's false. Comparing the Fiji launch driver to current ones is a night-and-day situation.
Even a few months after launch, AMD was still fixing huge driver overhead.
Even compute tasks, which are CPU-independent, saw 10~15% gains within a few months.
BF4 saw a 19% avg. fps gain within a few months. WH 40k: Space Marine got a 20% uplift on minimum fps and bigger gains on avg/maximum. (Not important anymore, since minimum fps at 1440p was 130.)
Payday 2 had huge gains at 4K too: avg. ~28%, minimum ~26%.
And that's just comparing the launch driver (June 2015) to 15.200.1023.10 (August 2015).
Yes, there were games which utilized the GPU properly, like Witcher 3. Those saw only minimal gains over time, but most games gained a lot over the last 2 years.
I hope Vega does get a boost with improved drivers. I think the Fury X boosts were down to the new 1st-gen HBM and getting games to work well with it. Vega is 2nd-gen HBM, so you would think they know how to use the hardware, but let's hope the driver team dropped the ball and they're working hard to make it better.
AMD almost always screws up initial drivers. And Fiji was "just" an evolution from a GPU perspective. Vega has some very new and advanced things in the GPU and can be called "a bit on the revolutionary side" in comparison.
I'm almost certain that Vega will end up faster than Pascal, maybe with the exception of the Ti. The power consumption is still ridiculous though.
The order of purchasing for me, at this moment, would be RX 580 < Vega 56 < 1080Ti. Nothing else seems to make much sense.
Considering my recent purchase of a 28" 4K panel, I am encountering the limits of my 390 on some games. DOOM is fine, but DA:I needs some love. I'd like to see how the Vega56 compares to the 390, and maybe a 3rd party card will come with watercooling.
I don't like using this site but this is a good article here. https://m.hardocp.com/article/2017/01/30/amd_video_card_driver_performance_review_fine_wine/1
So basically only one game gained a huge chunk after the launch driver: Fallout 4 (**** game engine). And of course Doom under Vulkan, but that was to be expected and not indicative of the usual "fine wine". With the exception of games that drivers were poorly optimized for, AKA AMD didn't have "game ready" drivers, Fiji did not pull a Hawaii.
Look at where it sits even now. It is a good card but still has not lived up to its theoretical performance. It should be matching the 1080 in DX12 titles, IMO. The only explanation I can think of is a memory bottleneck.
BTW your BF4 example is under Mantle correct?
I would've liked to see AMD's OpenGL improvements in Doom across drivers. Overall, though, a 5-6% difference is OK, but this "fine wine" thing doesn't seem much different from Nvidia's improvements over time, tbh.
In the case of Fiji, no, the improvements were comparable to the normal improvements Nvidia sees. Tahiti and Hawaii, on the other hand, got massive improvements over time, to the tune of some 35%, mostly due to AMD's driver team starting to figure out GCN.
Info is leaking out about an RX Vega hitting 1,980 core and 1,000 memory and beating out the Asus ROG Strix 1080Ti in many games in DX12
....even edging out the Ti in GTA V too.
Though the machine draws 532 Watts at the wall.
Very nice performance though ^^
Thinking the Vega will end up even close to the Ti is a stretch.
That's like a 30-40% gap it would have to close.
Unless drivers are just total ****, but surely they aren't that incompetent.
1980 is not possible.
The driver is displaying the wrong clock.
Buildzoid said the most he could do was like 1780 under LN2.
Rise of the Tomb Raider
Square Enix just revealed that Rise of the Tomb Raider will have Xbox One X support with the game running at native 4K resolution. There are also a bunch of additional enhancements planned.
The update will be available for free on November 7th, when the Xbox One X console launches. Check out the trailer below and read our review of the game here.
Developed in partnership with Nixxes Software, the Xbox One X enhancements for the award-winning game include premiere graphics and other technical enhancements that leverage the full power of Xbox One X with settings that gamers can tailor to their taste.
Xbox One X players can choose from one of three visual modes, including:
Native 4K (full 3840 by 2160) for the highest-fidelity resolution
Enriched Visuals for stunning graphic upgrades
High Frame Rate for the smoothest possible gameplay
New Xbox One X tech enhancements for Rise of the Tomb Raider include:
HDR display support for more vibrant and accurate color representation technology
Spatial audio support, including Dolby Atmos®, for true 3D audio
Enhanced texture resolution for Lara Croft®, NPCs, and environments, leveraging additional memory offered by the Xbox One X
Improved anti-aliasing for immersive realistic details
Additional visual enhancements include:
Improved volumetric lights
Upgraded polygonal detail
Amplified texture filtering
EA Live at Gamescom 2017
Starts at 26 min.
Some funny stuff, and I know it works great when tweaked.
Yes, without GimpWorks
Why oh why?
Fire Strike Custom 1.1 = 26100
~1730/1050 already puts V64 at a GPU score of approximately 25000, 10% from drivers puts it just under 28k.
Those willing to push very high wattage under water may come close to 29/30k, IF, and only IF, there are 10% gains to be had from drivers.
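The projection above is simple proportional scaling, and can be sketched as a quick check (assuming, as a rough approximation, that the GPU score scales linearly with driver gains; the function name is made up for illustration):

```python
# Rough Fire Strike GPU-score projection. Assumes (big assumption)
# linear scaling of GPU score with a fractional driver uplift.
# Baseline numbers come from the posts above.
def projected_score(base_score: float, driver_gain: float) -> int:
    """Scale a baseline GPU score by a fractional driver uplift."""
    return round(base_score * (1 + driver_gain))

# ~1730/1050 V64 baseline of ~25000, plus the hoped-for 10% from drivers:
print(projected_score(25000, 0.10))  # 27500, i.e. just under 28k
```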
Also, from a Vega user:
"Card is now happily chugging along at sustained 1630Mhz/1100Mhz with 0% power target with core @ 1060mV. Truly amazing these cards come out of the factory wayyyyy overvolted."
Wayyyyy overvolted, huh? Fury users know that.
1.168v +8% and a 16,234 3DMark GPU score at 1050/570
I will go for ~1.000v
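A rough sense of why undervolts like the ones quoted above help so much: dynamic power scales roughly with V² × f, so dropping voltage at the same clock cuts power quadratically. A minimal sketch (the voltages are taken from the posts above; the function name is made up for illustration, and this ignores static/leakage power):

```python
# Back-of-the-envelope dynamic-power estimate: dynamic power scales
# roughly with V^2 * f, so at a fixed clock the ratio is (V_new/V_old)^2.
def dynamic_power_ratio(v_new: float, v_old: float,
                        f_new: float = 1.0, f_old: float = 1.0) -> float:
    """Ratio of new to old dynamic power for a voltage/clock change."""
    return (v_new / v_old) ** 2 * (f_new / f_old)

# Dropping core voltage from 1.168 V to 1.060 V at the same clock:
ratio = dynamic_power_ratio(1.060, 1.168)
print(f"{(1 - ratio) * 100:.0f}% less dynamic power")  # prints "18% less dynamic power"
```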