The 8800 GTX wasn't out when Oblivion launched; the top nVidia card back then was the 7800 GTX. I was running a 6600 GT prior to the Oblivion launch, then upgraded to the second-best single GPU on the market, the 7800 GT. The game still ran like a steaming pile of sh1t. I wouldn't draw conclusions based on Oblivion. In fact, my X1800 XT and X1900 XTX were a massive upgrade for Oblivion; those cards were beasts. EDIT: Actually, thinking back, the 7900 GTX was the top nVidia single-GPU card back then. My bad.
OK, where is the Fury (non-X)? Maybe we can expect similar performance at a cheaper price, just like the 290 did vs the 290X. The Nano is coming as well, so it could get interesting very soon. bigosik, stop posting those links from that Polish site, it's one-sided and no one gives a crap.
What a shame. I was hoping AMD had pulled an ace out of their sleeve instead of releasing a card that doesn't suffice for 4K gaming. Not saying the card is bad, but compared to its competitor it doesn't pack anything extra, rather the opposite. I highly doubt that driver updates/tweaks will be able to fix anything for this card. No reason for me to switch now, but I already suspected the Fury wouldn't be as super as some fake leaks and people's HBM performance assumptions claimed. It just didn't make sense given the rest of the card's stats: you can't push more bandwidth than the rest of the card can make use of, simple as that. Hopefully AMD manages to come up with something that clearly beats NVIDIA with the next generation of cards, because they won't keep surviving on on-par performance with fewer features, which could also lead to even higher prices.
Driver optimizations can make all the difference; the question isn't whether they can, it's whether they will be done. AMD already has a huge driver overhead compared to NVIDIA (lack of multi-threaded DX11). Imagine if AMD did support it? It's all drivers. AMD has some great hardware, and it's always let down by drivers. They've just made their current branch support the new cards, and for some reason that branch has been separated from all the previous models. There's nothing new there.
It seems to shine at 2160p. A shame it doesn't offer downsampling at 120 Hz at that resolution, only 60 Hz, so that forces me to buy from the green team. (Hope I'm wrong.)
No, we didn't expect it to be, but we all hoped it would be faster. Anyway guys, I caved and bought a reference 980 Ti and an EK block.
It's the same on different sites. CPU benchmarks & frametimes: http://www.computerbase.de/2015-04/gta-v-grafikkarten-und-prozessor-benchmarks/2/ The frametimes on NV cards are way better, smoother gameplay. Very high spikes on the 290X, and 10 FPS more for NV in the CPU test.
Well, both Intel and AMD stated they would remove/phase out DVI/VGA/LVDS five freaking years ago: http://newsroom.intel.com/community...digital-display-technology-phasing-out-analog and here is AMD's statement from three years ago: http://www.pcworld.com/article/248421/vga_dvi_display_interfaces_to_bow_out_in_five_years.html So please stop whining and start using DP to DVI-DL adapters and DP 1.2 to HDMI 2.0 adapters (coming this summer). You won't find DVI/VGA on Intel products in the future either, and I also believe some AIB partners will create cards with DVI/HDMI 2.0 on them by using DP 1.2 to HDMI 2.0 chips, or different layouts for DVI-DL (no HDMI, anyone?). Sapphire did this on the "SAPPHIRE R7 250 1GB GDDR5 EYEFINITY EDITION".
Where's your source for that? Back that up, since historically they don't. And just to repeat my evidence: why do people think drivers are going to magically speed up the card by magnitudes? Here's the breakdown for the 290X from the release driver (13.11 Beta 6) to the driver at time of review (Catalyst 14.7 RC1, about 9 months after launch):
- In 3DMark the R9 290X gained 2.5% in Fire Strike and there was no change in Fire Strike Extreme.
- In Battlefield 4 the R9 290X gained 1.3%.
- In Bioshock Infinite the R9 290X saw no change.
- In Metro Last Light the R9 290X gained 1.8%.
- In Sleeping Dogs the R9 290X saw no change.
- In Tomb Raider the R9 290X saw no change.
http://www.ete-knix.com/examining-amds-driver-progress-since-launch-drivers-r9-290x-hd-7970/
And Hexus's test came out the same over the course of a year's driver updates (Jan-Dec 2014):
Average AMD 290: +4.1%
Average Nvidia 780 Ti: +6.2%
http://hexus.net/tech/reviews/graphics/79245-amd-nvidias-2014-driver-progress/?page=6
So where is your claim, and that of several other people here, that drivers massively improve performance coming from?
When the 780 Ti came out, it beat the 290X in almost every game. Now we see it lags behind in almost every game. So perhaps in a year's time the Fury X will beat the 980 Ti...
Tom's Hardware (benchmarks): http://www.tomshardware.com/reviews/amd-radeon-r9-fury-x,4196-4.html In GTA V it doesn't stand a chance, which is sad after all the hype. I hope 16nm comes sooner. Tired of 28nm.
In FC4 it destroys even the Titan X. BF4 shows more or less the same performance, so I'm expecting Battlefront will too.
I don't know whether to laugh or just hold my head in my hands. I laughed first though, I'll give you that. Clutching at those straws, I see.