Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Oct 14, 2022.
lmao, I guess they saved themselves an even bigger embarrassment by doing this...
Imo, it should've been illegal for them to sell an entirely different product under the "4080" name. It would've been fun to buy one on Amazon (or wherever) and return it for a misleading description lol.
All this talk of RDNA3 being weak is seriously weird to me.
The 6900XT had 5120 cores and the 3090 had 10496.
Sure, the architectures are totally different, but AMD still managed to nearly match or beat the 3090 in general raster performance with that architecture.
The 7900XT is rumoured to have around 12288 cores. That is OVER DOUBLE the 6900XT's count. Now factor in the other architecture changes (MCM) and clock speed increases. The 6900XT could already hit 2.6GHz easily, so 3GHz+ isn't out of the question on the 7900XT.
Now go to the 4090 with 16k cores, which is nowhere near double the 3090's. Factor in architecture changes and, more importantly, the huge clock speed increases and insane power draw, and it's easy to see where the 40 series gets its performance from.
Unless something majorly went wrong in RDNA3 development, given the currently rumoured specs it would be extremely surprising to see them not at least match the 4090 in raster performance.
RT performance is another story but with the MCM design maybe AMD will use more of the die space for RT enhancements.
DLSS3 with its frame generation is, I think, where Nvidia could really pull ahead of AMD.
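The cores-times-clocks reasoning above can be put into numbers. A quick back-of-the-envelope sketch, using only the rumoured/quoted figures from this post (5120 cores at 2.6GHz for the 6900XT, a speculative 12288 cores at 3GHz for the 7900XT) and deliberately ignoring IPC, memory bandwidth, and every other architecture difference:

```python
# Naive raster-throughput proxy: shader cores x clock (GHz).
# All figures are the rumoured/quoted numbers from the post above;
# this ignores IPC, bandwidth, and architecture entirely.

def throughput_proxy(cores: int, clock_ghz: float) -> float:
    return cores * clock_ghz

rx_6900xt = throughput_proxy(5120, 2.6)    # post: 2.6 GHz was easy
rx_7900xt = throughput_proxy(12288, 3.0)   # rumoured cores, speculative 3 GHz

core_ratio = 12288 / 5120
proxy_ratio = rx_7900xt / rx_6900xt

print(f"7900XT/6900XT core ratio:  {core_ratio:.2f}x")   # 2.40x
print(f"7900XT/6900XT proxy ratio: {proxy_ratio:.2f}x")  # ~2.77x
```

On those assumptions the naive proxy more than doubles gen-on-gen, which is the whole basis of the "should at least match the 4090 in raster" argument - and also why it falls apart if real clocks or per-core scaling come in lower.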
For fairness' sake: if you review an x070, you don't expect performance anywhere near that of an x080. So Nvidia wouldn't need to be that embarrassed about the performance, whatever it turns out to be. The only mistake was calling it a 4080 when there already was a stronger 4080. As for the prices, as we have seen, they are unprecedented, so it is what it is. Nvidia alone decides the prices. Nothing prevents them from making a 4070 cost 1000 euros. In that case consumers can only pray AMD will offer something equal for a cheaper price. If virtually nobody is willing to pay the price and 1000-euro 4070s collect dust on shop/warehouse shelves, it will send Nvidia the only message that matters.
They better do that.
...hahaha...haha, haha, haha
Sorry, I'll stop.
... I can't stop. HAHAHAHAHA!
I have leaked photos of the new card:
Well, it doesn't always translate between architectures, as you know. The 2080 Ti has 4352 cores, whereas the 3090 has 10496... yet the 3090 is "only" about 70% faster. Meanwhile, from what I've tested, my 4090 Strix is 85% faster than my 3090 Strix, despite having only 60% more cores.
The 7900XT might end up with faster rasterization performance than the 4090, but I'm just saying you can't conclude that from the specs alone.
And fyi, the 4090 doesn't actually have crazy power draw - my 4090 Strix draws less power than my 3090 Strix.
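The point that cores don't translate directly can be made concrete. A small sketch using only the numbers quoted in this post (the poster's own informal tests, so rough anecdotes rather than benchmarks): compute how much performance each generation gained per unit of core-count growth.

```python
# Per-core scaling from the figures quoted above:
#   2080 Ti -> 3090: 10496/4352 = ~2.41x the cores, but only ~1.70x the perf
#   3090 -> 4090:    ~1.60x the cores, yet ~1.85x the perf (poster's tests)

def per_core_scaling(core_ratio: float, perf_ratio: float) -> float:
    """Perf gained per unit of core-count growth; 1.0 = perfect scaling."""
    return perf_ratio / core_ratio

ampere = per_core_scaling(10496 / 4352, 1.70)
ada = per_core_scaling(1.60, 1.85)

print(f"2080 Ti -> 3090: {ampere:.2f}")  # ~0.70, cores scaled poorly
print(f"3090 -> 4090:    {ada:.2f}")     # ~1.16, perf outran core count
```

Same metric, opposite outcomes across two generations - which is exactly why extrapolating the 7900XT from core counts alone is shaky.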
Now make it a 600€ 4070 and it just might sell
Are there any estimates on the sales of 4090 compared to previous launches?
Somehow I have a feeling not many are willing to shell out upwards of €2000 for a GPU these days.
US is probably not faring any better...
I reckon the board partners can order stylish stickers from China, delivered in a week, that are handy to cover the "4080" and replace it with "4070". Adventurous buyers can take a hairdryer and carefully peel the sticker off, instantly feeling better about their overpriced 4070 when it transforms into a 4080, at least as far as the label goes.
The 4080 12GB was too gimped to be in the 4080 stack: reduced memory bandwidth, reduced cores, reduced RT cores, reduced tensor cores, reduced cache - and for what? Performance nVidia is already selling, just with DLSS3 added, which has its own problems and is something I would not use anyway because the latency hit is too much.
I guess Jensen bit off more than he could chew, and he also wasn't willing to sell the two-year-old 3000 series below MSRP. Serves him right.
I also think they are going to rename it 4070, like it was supposed to be initially.
After all, they are not going to throw away the chips they stored for the fake 4080.
In the end, I think Nvidia did this because of all the backlash so far regarding the card, so this means we still have (some) power!
I think it's a mix of both: the backlash, and the risk of being hit for misleading advertising by local regulators - since the name implied the only difference was memory capacity, which clearly wasn't the case.
There was no FE "4080 12GB" aka 4060, so it's the AIB partners that have to eat the cost of this "unlaunch" (whoever came up with that weasel word please jump off a cliff onto jagged rocks).
EVGA are probably laughing pretty hard right now.
Yeah, I totally agree - it was more of a general comparison based on the previous generation and how each side compared in general raster performance. Very basic and rudimentary analysis.
OMG! We all knew it was not an RTX 4080 all this time lol - you can barely call it an RTX 4070. Mr. Leather Jacket is smoking some good crap. What an epic fail for ngreedia! And yes, EVGA made the right choice at the right time, and they are having the last laugh.
… May I have your attention, please?
May I have your attention, please?
Will the real Forty-Eighty please stand up?
I repeat, will the real Forty-Eighty please stand up?
We're gonna have a problem here.
The reality is that none of these cards are real 4080s... one is the 4060 (Ti?) and one is the 4070 Ti. That AD103 chip doesn't even have all of its SMs enabled (the full die probably has 80; this one has 76). Close, but no cigar.
And the performance "uplift" is a disaster, it appears to be barely on par with the 3080-12GB at anything less than max settings 4K.
EPIC FAIL indeed.
Good, I'm glad they cancelled that name for it.
Would have been confusing af for some people. Just the online benchmarks and videos of the 16GB card not matching a customer's 12GB card would have been a PR nightmare.
But does this mean we will see the most expensive 4070/4070 Ti and 4060 Ti ever? Whatever happened to midrange cards?
At this point, I would totally understand people flocking back to the latest consoles. Hopefully AMD releases something truly competitive price and performance wise.