If it doesn't OC well without touching the voltage, adding more voltage won't make a big difference, just more noise, heat, and higher power usage. The default boost for a reference GTX 970 is 1178 MHz, while my card can do 1500 MHz without touching the voltage. The 980 Ti overclocks very similarly. A few mV will not give you +200 MHz; it will barely make any real difference.
Well, you can't compare voltages like that. It would be like saying an Intel CPU needs no extra voltage up to this clock, while an AMD one only gets to that clock and therefore doesn't OC well.
Ummm... what? The ($520, $500 after rebate) Gigabyte GTX 980 G1 Gaming at 1228/1329 vs. the ($580) Asus Strix R9 Fury. Sorry guys, the Fury is a complete bust at that price point. It needs to be $500-$520 to be competitive, and even then just barely. And I don't think it's fair to compare an aftermarket board with a custom setup against a stock 980 rather than against its aftermarket 980 competition. Either compare stock against stock, or custom against custom. This is my complaint about Ryan's review over at Anand as well.
What's your point? It doesn't even come close to the Fury (especially at 4K, with a few exceptions down to specific titles/drivers), even at 1329 MHz vs. a stock Fury, even in your own source with its very limited selection of games.
Ummm... Did you actually open the article and read the review? How many times are we going to make the "titles" and/or "drivers" excuse for AMD? What, is nearly every game now unplayable on AMD cards through everyone's fault but AMD's? Maybe we should buy the AMD cards now and play the current games in three years, when they are finally [maybe] able to run. :bang:
Finally something AMD pushes out to compete with the 980, but a little too late? I wouldn't be surprised if they start slashing the prices on these sooner rather than later, considering their financial circumstances.
It does come close below 4K, which is still what most people game at, hence it should carry a price tag that makes the 980 obsolete; otherwise people will just go for the 980 instead, regardless of the impressive results the card can post at 4K. Besides that, most people who run 4K will demand more fps than any current single GPU can provide, so that case isn't even interesting on its own. Two 390Xs in CF are probably the better choice at that point?
No need for excuses. Even the fact that they included Project Cars makes it far from objective. If you want to demonstrate average performance across all and most-played games, then you should not use only 8 games. And if you do, then you should make sure those games are actually a representative sample of the average, most-played games around. So how average are games like:
- Project Cars (nVidia title, using a special rendering method which requires double the amount of draw calls to do the same thing that would otherwise be lighter on the hardware; GameWorks inside)
- GTA5 (nVidia title, using nVidia-specific rendering methods)
- Witcher 3 (nVidia title, using nVidia GameWorks/benchmarketing utilities)
The only games from their selection which people actually end up playing are GTA5, BF4, and Sid Meier's Civilization: Beyond Earth (and that one has a player base something like 20 times smaller than Civilization V's). Therefore I deem this list misrepresentative. Other than that, they do a great job at measuring stuff and at showing how old the drivers used with each card were.
Seriously? Witcher 3 isn't played? In addition, Civ: BE and BF4 are AMD titles, yet they are also used. Stop making the AMD vs. Nvidia title excuses; they don't work. You can turn off most GameWorks effects (which only enhance the graphics) if you like, and they ran the benchmarks with the effects turned off. In any case, the new 15.7 driver offers no notable performance increases for the Fury line (according to Anandtech and others), so there goes that excuse. You want the graphs without Project Cars? Here: Congrats, the playing field remains the same... Seriously, how long are people going to keep making excuses for what's in the graphs and in the market?
I don't know what you guys are arguing about; we are on guru3d and we should comment on guru3d's results, which show that at stock clocks the Fury greatly outperforms the 980.
But this isn't a "stock" Fury; these are aftermarket Furys with custom boards and coolers being compared. They are also being compared to a card that is currently around $100 cheaper on sites like Newegg ($80 by MSRP). It's only fair, then, to compare against aftermarket 980 boards like the Gigabyte G1, which, despite being significantly cheaper ($60 / about 10% in this case) and using less power, provides equal or better (frametime) performance.
It is as close to a stock/reference design as it gets; this time there is just no noisy blower, deal with it. And it remains a fact that AMD titles do not break the experience for nVidia users, while nVidia GameWorks titles break the user experience on whatever hardware nVidia needs them to break it on. And what I meant by "being played" is simple: a game like Witcher 3 has a player base something like 55 times smaller than DOTA 2's. That is not going to get any better, as most will put it down in under 50 hours, and some will play through the entire 200 hours of content. But those popular multiplayer games had their player bases long before W3 came out and will keep them long after W3 is a thing of the past.
Based on what? People keep talking about these magical driver increases (this goes for Nvidia as well), yet I don't see any empirical evidence of it anywhere. Here is the 290X with launch drivers (Oct 2013, Catalyst 13.11 Beta v5) in Crysis 3, and then on the latest 15.7 drivers: Where are these magical increases? Hexus (covering the entire 2014 driver set) and other sites also did comprehensive driver performance roundups and found only minor increases (around 5% or less) for both Nvidia and AMD.
Didn't he actually mean that the GTX 780, which until recently was 50-60% stronger than the GTX 960, ended up with the same poor (28-30 fps) performance in Witcher 3? Or that while the 780 Ti is roughly 10-15% stronger than the 970 in general, it is 25% weaker in Witcher 3 and about 5% weaker in Project Cars? That is the kind of longevity I would be looking at. Those games are basically benchmarketing projects, and their buyers are victims of a scam. Edit: Look at the GTX 680/770 and HD 7970 (GHz); those were quite equal. But GameWorks easily puts the 770 at 23 fps and the HD 7970 at 26 fps. As unusable as the 780. I do not think owners of those cards appreciate it.
You are finding conspiracies where there are none. Maxwell has a new-generation PolyMorph engine that is much faster than Kepler's: Even Nvidia acknowledges this: Nvidia is just giving users the option to use the new and enhanced effects. Again, you do not have to turn on effects you do not have the power to run; at least those with cards powerful enough to run them can enjoy them. Again, show empirical data or stop shouting out these conspiracies and predictions; it's getting old.
It is stock performance-wise, and let the prices settle a bit; it just came out and availability is low. No one is insulting your beloved Nvidia; they can still compete with a relatively old card like the 980, so that is good for them. And it is good for consumers that AMD is beating the 980, because Nvidia will have to drop its price.
It is not about HairWorks, it is about GameWorks as a whole package. Hilbert's tests were run with HairWorks OFF. GameWorks is tailored to favour just one architecture, and at this moment it suits nVidia that it is Maxwell. Coincidentally, it does not break AMD's GCN as much as it breaks Kepler. So, don't go far: check how even an R9 290 is 10% better in this mess than a GTX 780 Ti, and then explain to 780 Ti owners how well nVidia takes care of them with their GameWorks. What should those people expect from future GameWorks titles? It seems to be common GameWorks behaviour now; Kepler is becoming obsolete by nVidia's choice. You should understand that I am not advocating the Fury (X). I am simply against using broken/unpopular games as an indication of general performance.
Witcher is neither broken nor unpopular, and while I agree it gives strange results in benchmarks, it is still right to include it in the suite. When you are going to buy a card, you do not only want to compare performance numbers against other cards; you also want to see whether some top-selling games like Witcher or GTA5 will be playable.