Discussion in 'Videocards - NVIDIA GeForce' started by alanm, Jan 23, 2015.
Article just got edited and now it's "From the horse's mouth"
I hope things don't have to be recalled for a firmware update, as one of my GTX 970s is an ex-display unit with only a 90-day warranty. So I have no idea if I'd be able to send BOTH my cards back, as one was bought at full retail price and the other is an ex-display unit.
From what Nvidia has said, the performance drop in gaming is barely noticeable, if noticeable at all, as frame rates are high enough to remain playable anyway.
Nvidia should at the very least give something to GTX 970 users, as this is kind of misleading. They sold us a 4GB GPU rated at X.XGB/s of bandwidth at X.XXXMHz with a 256-bit bus. We expect to get a card that has all of that, and on paper it does, but we should have been told that once our games require over 3.5GB of VRAM we will incur a performance hit. No matter how small that hit is, we will still see one.
Basically, the first 3.5GB of VRAM sits on a 208-bit bus, and the remaining 0.5GB on a 48-bit bus. 208 + 48 = 256-bit bus. So technically they are not lying, but they sure as hell are misleading customers.
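The split described in the post above can be turned into rough peak-bandwidth figures. A minimal sketch, assuming the partition widths quoted in this thread (208-bit + 48-bit) and the GTX 970's rated 7 Gbps effective GDDR5 data rate; these are illustrative back-of-the-envelope numbers, not measurements:

```python
# Rough peak-bandwidth model for the split memory layout described above.
# Assumptions (from this thread, not official figures): 208-bit/48-bit
# partition widths, and a 7 Gbps effective GDDR5 data rate.

def peak_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Peak bandwidth in GB/s: bytes per transfer times transfers per second."""
    return bus_width_bits / 8 * data_rate_gbps

fast = peak_bandwidth_gbs(208, 7)  # 3.5GB partition -> 182.0 GB/s
slow = peak_bandwidth_gbs(48, 7)   # 0.5GB partition -> 42.0 GB/s

# Naive capacity-weighted average if a game fills the full 4GB:
blended = (3.5 * fast + 0.5 * slow) / 4.0

print(fast, slow, blended)  # 182.0 42.0 164.5
```

A capacity-weighted average like this is optimistic: real workloads that touch the slow segment can stall on it, which would explain hits showing up as stutter rather than a flat percentage.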
Know about it? They DESIGNED the GPU and memory subsystem...
Err... and judging from the benchmark Nvidia provided, it turns out the driver will only allow the slow 0.5GB to be used IF certain game settings are detected!
The 970 does take a hit of a few percent when a game REALLY REALLY needs 3.5+ GB.
But let's keep this in perspective - they are great value and dirt cheap to begin with.
I wish Nvidia handled this more openly from the start. Yes, technically and even ethically they did nothing wrong, but still...
But I blame reviewers just as much for not catching this earlier. Then again, this is not easily spotted and matters only when super-sampling, i.e. a rare usage case, so... yeah, there you have it, folks.
Based on this, we consider the matter closed
Pretty much. And it even depends on resolution. If you're playing at 1080p, in some games you have pretty much a 3.5GB card no matter what.
NV, you won't see my money again! You lied to us: the card is not 256-bit with 4GB.
It's 208-bit 3.5GB + 48-bit 512MB. LIARS!!!!!
Yeah, what a disappointment when I get my GTX970 back from RMA next week.
As questioned by PC Perspective: "The question remains: does NVIDIA's response appease GTX 970 owners?"
A few posts back, @gtx980 wondered why a GTX 970 was getting ~150 GB/s in the DRAM bench while the GTX 980 got 175 GB/s.
I now have an answer for this. Let us assume the memory runs at an effective 6GHz on both the GTX 970 and 980:
208-bit bus @ 6GHz = 156 GB/s
48-bit bus @ 6GHz = 36 GB/s
And with that, I'm done. Good luck to all finding a fix/solution/answer. I'm officially done.
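The arithmetic in the post above checks out. A quick sketch using the poster's assumed 6 GHz effective data rate (the card's rated speed is higher; 6 GHz is just the assumption used to match the observed ~150 GB/s):

```python
# Verify the bandwidth figures quoted above: bus width in bits divided by 8
# gives bytes per transfer; multiply by the effective data rate in GT/s for GB/s.
def gbs(bits, gtps):
    return bits / 8 * gtps

print(gbs(208, 6))  # 156.0 -- close to the ~150 GB/s measured on the GTX 970
print(gbs(48, 6))   # 36.0  -- the slow 0.5GB segment
print(gbs(256, 6))  # 192.0 -- what a full 256-bit bus would give at the same rate
```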
this might be the best explanation
I have a feeling they are milking us all.... hmmz.
As an AMD user who has to read all the F.U.D. (fear, uncertainty, doubt) from Nvidia fanboys and shills - "AMD overheats heueheuheu", "AMD uses moar power", "AMD drivers crash"...
It's really amusing to hear some of you try and defend this,
"oh those games are broken, you can't use these to test with"
"who cares, its still fast"
"I play CS:GO and have NO problems"
All the stuttering posts about 970s make sense now.
This thread is officially done as well.
You're just NOW getting that feeling? The entire PC industry has been milking us for decades. Every industry milks customers whenever possible.
.........You know, I kinda wanna cry right now. I spent an entire month's salary on this card, after saving for a few months.
A few years ago, when the 8800 GTX bump-material defect was discovered, Nvidia outright denied any issue... only to admit it after a chip-analysis company tore down the GPU die and did some spectrographic work. Even then, Nvidia never 100% admitted it was responsible. Rumour has it Nvidia already knew about the problem before consumers even reported it.
Fast forward to today... and I see similar behavior from Nvidia: only technically admitting the issue after someone created a benchmark (not to say the benchmark is 100% perfect; my thanks to Nai and VultureX for their efforts in discovering the cause).
You know, the moment I purchased this card, I expected to keep it in my desktop for 4 to 5 years. Darn it......
Follow-up question - are those having issues with SoM at >3.5GB at 1080p running the Ultra textures or High?
To be fair most of that fud was recycled from AMD owners after Fermi cards appeared.
Nice to see this finally looking to be resolved. Nvidia have not been great, though, and while I've personally not noticed this, it might become more apparent in the future.
The cards are still fully functional, as designed.
You really should avoid reading the errata for your processor.....otherwise you'll feel even worse knowing what programmers have to go through to get their software to run right.
Every product has flaws. Some intentional, some not. At least with processors, errata exist that are publicly accessible, so users can find out exactly what flaws AMD and Intel knew about prior to release. You'd be surprised by some of the flaws CPUs actually have. Nehalem actually had a TLB bug similar to the one in AMD's original Phenom processors. Funny enough, nobody seemed to care that Nehalem had a TLB bug, but everyone made a huge deal about Phenom having one... Both were easy to work around in code, if developers had cared to do so.
Almost forgot! Why aren't the GTX 980M and GTX 970M, with either 4GB or 8GB, affected?
Where is this official Nvidia statement taken from? I see no reference, no mention on the GeForce forum, no names attached to the statement... nothing.
All right here