Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Sep 1, 2020.
No one ever likes marketing. Unless you’re GEICO.
What part of "the 3090 is a Titan, and the Titan has never been a linear price increase because of its more workstation-oriented use (hence the RAM)" do you not understand?
And before someone says it: yes, that was harsher than it needed to be.
It's an honest question.
The Titan RTX cost $2,500 and offered what, a 10% increase over the RTX 2080 Ti at best? Yet it cost up to 2.5 times as much.
The only differences between the RTX 3090 and a Titan card are the name, plus the fact that it'll actually bring a decent bump in gaming performance if you're interested in going for it. Otherwise it's the cheapest Titan-class GPU we've seen in years.
So my question is honest: What is not understood about this? And realistically, what is there to complain about?
It is coming to PC but it’s unclear as of now whether or not you’ll need a motherboard that supports it or if you’ll be able to pop a drive into a PCIe slot. Here’s a Microsoft blog post about it from yesterday:
Yes, but applying that 2x to the CUDA core count and TFLOPS sure looks great against the consoles: "The PS5 is 9, the Xbox Series X is 12, but the 3070 is 20 TFLOPS!!! For the same $500 price."
It makes their IPC look like it went backwards hard, and that's not smart. If AMD can come close to or beat the performance of the 3080, and Lisa Su can state on stage that AMD gets the same level of performance with only half the core count of Nvidia, yikes, it won't be pretty.
Imagine Intel beating the 3950X in Cinebench multi-core with an 8-core CPU...
Edit: If I were working in Nvidia's marketing department, I would have kept the usual way of counting cores and simply stated that Ampere delivered one of the biggest IPC uplifts in computing history...
I like the $699 price for the 3080; it's 1080 Ti kind of money. But I hope there won't be a shortage (f-ing miners) followed by a price hike this time around.
As long as you're ready to hit 'pre-order' at 12 AM on the 17th, I believe you'll stand a good chance, at least for a 3080.
What matters is performance per transistor, per clock, and per watt. Per-clock throughput makes "worse IPC" a negative contributor, but performance per transistor invested may have improved within the same technology change.
Think of increasing the transistor count by 20% while the GPU gains 35% performance at the same clock.
So that one point of view alone is not as important.
In the same way, Nvidia's TFLOPS increase alone does not make a GPU great. It not only needs to be balanced in terms of shading, TMUs, ROPs, RT, discard, dispatch, and so on; it also needs adequate memory bandwidth.
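For context on where the headline numbers in this argument come from, here's a back-of-the-envelope sketch of the standard theoretical-peak calculation (FP32 units × boost clock × 2 FLOPs per FMA). The 3070's boost clock used below is an approximation, and the function name is just for illustration:

```python
def peak_tflops(fp32_units, boost_ghz):
    """Theoretical peak FP32 throughput.

    Each unit can retire one fused multiply-add (FMA) per clock,
    which counts as 2 floating-point operations.
    """
    return fp32_units * boost_ghz * 2 / 1000.0  # GFLOPS -> TFLOPS

# RTX 3070: 5888 FP32 units under Ampere's doubled-core counting,
# at roughly a 1.73 GHz boost clock (approximate figure).
print(round(peak_tflops(5888, 1.73), 1))  # ~20.4
```

This is exactly why the counting change is contentious: doubling the number you plug in for `fp32_units` doubles the marketing TFLOPS, even though the other parts of the pipeline (TMUs, ROPs, memory bandwidth) didn't double with it.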
You're right, but I wasn't speaking of the architecture changes themselves, but of how Nvidia is advertising them and making those TFLOPS claims. By the way, most AIBs didn't have the "2x" applied in their marketing materials, so Nvidia's marketing decision must have been pretty last-minute.
Looks like the early pre-order pricing is there already though: roughly 1,200 US dollars for the 3080s and 2,200 for the 3090s. Will be interesting to see what the final pricing lands on here, ha ha.
Will be interesting to compare with other EU country pricing levels as well, has to be some place worse than Sweden for hardware pricing.
A few obvious placeholder prices as well, at 99,999 SEK. More models listed than I was expecting to see already, but no 3070s yet, so no idea what those might land at initially.
But yeah at least the mining price hike isn't a thing this time.
EDIT: Ah, but it does match up a bit better if you convert at roughly 1 USD = 1 EUR.
That gives 850 - 900 EUR for the 3080, although the 3090 slips a bit to 1800 - 1900 EUR against its MSRP of 1500, compared to around 800 for the 3080, if I remember the numbers at all.
(Or anything else really, yay memory.)
EDIT: Oh, and I need to subtract the 25% Very (or Value) Added Tax too here; I keep forgetting about that bit.
Yeah that works out, roughly 1500 for the 3090 then and around 700 for the 3080 which puts it on par with pricing estimates.
(Which means it must be around its MSRP of 700 then, not 800.)
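The arithmetic above is easy to get wrong: stripping a 25% VAT means dividing the tax-inclusive price by 1.25, not subtracting 25% of it. A quick sketch using the midpoints of the listed ranges (the function name and exact inputs are illustrative):

```python
VAT_RATE = 0.25  # Swedish VAT

def price_ex_vat(price_inc_vat):
    """Recover the pre-tax price from a VAT-inclusive listing.

    VAT is added on top of the net price, so removing it is a
    division by (1 + rate), not a subtraction of 25%.
    """
    return price_inc_vat / (1 + VAT_RATE)

# Midpoints of the listed ranges: 1800-1900 EUR (3090), 850-900 EUR (3080)
print(round(price_ex_vat(1875)))  # 1500 -> matches the 3090 MSRP
print(round(price_ex_vat(875)))   # 700  -> matches the 3080 MSRP
```

So the "over MSRP" impression from the raw listings mostly evaporates once the tax is taken out.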
EDIT: Should just call it the quarter pounder really because it's a quarter of the pricing total and your wallet certainly takes a pounding from that additional cost increase hah.
(Eh the money is used for something, that's at least something whatever said something actually is ha ha.)
Maybe, maybe not. It's entirely possible Nvidia didn't give out the full information for the simple reason that they didn't want it leaked.
I wonder what that means for future driver optimization. It wouldn't be the first time that Nvidia tries to "break" an API to deliver more performance (look at Reflex and the Mantle-era DX11 driver).
Sorry, just saw this; apologies for the delay in replying.
It's not about whether people are forced to buy it or not, it's just that 'in these times' someone thinks charging this much for something is absolutely and perfectly acceptable.
Stepping into this: they are marketing the 3090 as a gamer card, so, what is it about the question that you do not get?
IF it were marketed as a Titan card for workstation usage... then your point would have merit, but it isn't and it doesn't.
Someone didn't watch the announcement. This much is clear.
Every Titan card has been marketed as the fastest gaming GPU they have, even though gaming isn't its intended use.
The only difference with this one is that it's more well-rounded: aimed at workstations, but also delivering a very big bump in gaming performance.
They haven't announced the titan 30XX release, what am I missing here?
You're not missing anything. The 3090 is the top-of-the-line "consumer" gaming card as I see it, the equivalent of the 2080 Ti. The fact that Nvidia are allowing third parties to produce custom designs of it also makes it more the equivalent of the 2080 Ti, not the Titan. The price of the 3090 is also fairly similar to a 2080 Ti.
You mean the RTX 3080, not the RTX 3090 (the Titan equivalent).
No, I meant what I wrote.
Just wondering: why did JHH refer to the 3080 as the 'flagship' gaming card? It's like the 3090 is in a separate segment... like the Titan.