If I were to buy a new card, I would still get a GTX 970 or 980 despite the news. It's not right that Nvidia lied, but the performance is superb as it is, and that's not that big of an issue...
"Did you buy a Ford Fusion because of fuel economy, performance or passenger room?" "I didn't, but I have a Ford Taurus, and might get Fusion but will probably wait for the Lux Interior 9000".
Purely performance reviews. Yes, I look at the main specs (256-bit, 4 GB memory), but they are not a deciding factor. I bought a 660 Ti (192-bit) when it came out for the same reason: it did better than a 580, even at 1440p. If the 970 had come out as a 3 GB 192-bit card, I would have bought it just the same.
ikr, it was a knee-jerk reaction... that's why I mentioned it, so my vote could be disqualified. Apologies - my bad.
I've been busy lately and haven't been able to read as much as I normally do. What's this business with Nvidia lying? Can someone point me to the relevant thread? Thanks.
I don't think there's a better matching thread than this one: http://forums.guru3d.com/showthread.php?t=396471 The claim is that Nvidia is lying because the advertised specs don't match the actual hardware. Nvidia now says it was a mere misunderstanding between the engineering department and the people who handled reviewer kits / PR, but some believe it to be a deliberate lie to sell more 970s, withholding the true specs on purpose.
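If you want to see what the segmented-memory issue looks like for yourself rather than take the thread's word for it, here is a minimal CUDA sketch in the spirit of the chunk-by-chunk VRAM benchmarks people have been posting. To be clear, this is not any particular tool's actual code; the 128 MiB chunk size and the simple read/write kernel are just illustrative choices. It allocates VRAM in chunks until the card is full and times a pass over each chunk, so on an affected 970 the reported bandwidth should drop sharply once allocations land in the upper ~0.5 GiB segment.

// Hypothetical sketch, not any real benchmark's source: probe per-chunk
// VRAM bandwidth by allocating 128 MiB blocks until cudaMalloc fails.
#include <cstdio>
#include <vector>
#include <cuda_runtime.h>

__global__ void touch(float* p, size_t n) {
    size_t i = blockIdx.x * (size_t)blockDim.x + threadIdx.x;
    if (i < n) p[i] = p[i] * 2.0f + 1.0f;   // force one read and one write
}

int main() {
    const size_t chunkBytes = 128ull << 20;          // 128 MiB per chunk
    const size_t n = chunkBytes / sizeof(float);
    const unsigned blocks = (unsigned)((n + 255) / 256);
    std::vector<float*> chunks;
    cudaEvent_t start, stop;
    cudaEventCreate(&start);
    cudaEventCreate(&stop);

    for (int c = 0; ; ++c) {
        float* d;
        if (cudaMalloc((void**)&d, chunkBytes) != cudaSuccess) break;  // VRAM full
        chunks.push_back(d);

        // Warm-up launch so context/JIT overhead doesn't skew the first chunk.
        touch<<<blocks, 256>>>(d, n);
        cudaDeviceSynchronize();

        cudaEventRecord(start);
        touch<<<blocks, 256>>>(d, n);
        cudaEventRecord(stop);
        cudaEventSynchronize(stop);

        float ms = 0.0f;
        cudaEventElapsedTime(&ms, start, stop);
        // 2x: the kernel reads and writes each byte once.
        double gbps = (2.0 * chunkBytes / 1e9) / (ms / 1e3);
        printf("chunk %2d (%5.2f GiB allocated): %7.2f GB/s\n",
               c, (c + 1) * chunkBytes / double(1u << 30), gbps);
    }
    for (float* d : chunks) cudaFree(d);
    cudaEventDestroy(start);
    cudaEventDestroy(stop);
    return 0;
}

Compile with nvcc and run; on a card without the issue, bandwidth stays roughly flat across chunks, while a 970 should tank on the last ~0.5 GiB. Keep in mind the driver and display reserve some VRAM themselves, so exactly where the drop shows up will shift a bit from system to system.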
Combination. The performance difference over my GTX 660 was insane. If I OC, I can bust out FC at a solid 60 fps with all the Nvidia options cranked. Oh, and I have to turn the stupid Uplay overlay off. I don't know why that thing eats so many frames.
Combo. I'll be keeping it for now, but I will no longer be purchasing the second card I had planned for SLI; it's just not worth it with the now-known 3.5 GB memory limit. If Nvidia offers some kind of compensation for their mistake (a game code, etc.), I'd think about purchasing whatever next-gen card they release, but if they sweep it under the rug, I'll vote with my wallet.
I didn't even read most of what was written in the reviews I looked at. I skipped straight to the frame rate charts, then to the conclusions of each review. I wanted to run this year's and next year's games at 1080p and a solid 60 fps. Settings, AA, and downsampling are all secondary to me - the frame rate is the priority.
How isn't it worth it? What resolution do you run at? Honestly, this might be an unpopular opinion, but the vast majority of people who bought these cards don't understand what the specifications mean anyway. On a forum like this you'll see more people who do, but for the majority that's not the case. They either bought on a store recommendation or based on performance reviews. There's a hell of a bandwagon effect going on at the moment.
Reviews + specs + avoiding AMD. Just getting the best available in that price range while avoiding AMD was a big thing; I've had horrible experiences with AMD drivers. That said, properly functioning RAM is important to me, as I bought the card with a lot of heavy game modding in mind, and tons of texture mods at 1440p really do need the RAM. Despite wanting to avoid AMD, I'm still on the fence about getting a refund on my 970 and buying a 290X. I have about 15 more days to decide. A factor in all this is that I won the OC lottery as far as 970s go. Very few, as far as I can tell, get such a high OC.
Well, back in October I was thinking of buying a 970 to replace my R9 290 because I was having issues with the card, but luckily I was able to fix the issue. I really liked how OCable the 970s are, though.
Even if people bought it based on specs, I highly doubt it was specifically because of the number of ROPs or the amount of L2 cache. I'm willing to bet that 99.9% of 970 owners couldn't have told me how much L2 cache their card had before this incident, and even now I bet 90% of them still can't.
I can match you on the GPU... I haven't really pushed the VRAM yet. I can game at +235 on the GPU and +200 on the VRAM at stock voltage. My starting point was +200 on both GPU and VRAM. I'm still trying to find the max on the GPU.