Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Apr 29, 2020.
Those benchmarks are from January 2019, and it's only two games.
I have about £550 put aside for a GPU upgrade from my still-faithful 980 Ti; the way Nvidia prices are going, I might be able to afford the 3050 ... ;_;
I rarely look at benchmarks besides the first ones for a product, so normally Founders card reviews. It's expected, however, that the cards perform better over time too.
I actually found a video comparing all three cards; the results seem mixed, but indeed the GTX 1080 is closer to a 2060.
It would be sad if the formula stays the same. Either way, I wanna see the prices first, and I'll have to become a console pleb if it's expensive (will keep the GTX 1080 ofc).
yeah, that gap between the 3080 & 3080 Ti is too much. But I guess if it has 1024 tensor cores then it needs to be that big.
Also, I saw some news a while ago saying NV will drop the xx80 and xx80 Ti prices a bit, by $100, and the xx70 model by $50.
I was looking at the normal 3080, so those specs put it at ~2080 Ti level or a little more, which imo is to be expected.
The RT core count is good: 2x more.
Historically 30-50% uplift per gen. I don't see that changing. What was different this time was the pricing. We didn't get a 2080 Ti at $699.
PS: No they didn't.
GPU's are not CPU's.
I don't think it's far-fetched for the 3070 to be as fast as the 2080 Ti. I mean, the 1070 wasn't far off the 980 Ti (I think it was even better in some games, but I'm vague on this), so it's not far-fetched for the 3070 to match the 2080 Ti, especially since the 2080 Ti wasn't a massive leap over the 1080 Ti, and this time we're on a die shrink & a new architecture to boot. So I would actually be very disappointed if the 3070 weren't as fast as the 2080 Ti. I mean, I want it to be faster to make upgrading more worth my while, considering the 2080 Ti wasn't far above the 1080 Ti.
I agree. If we assume that CUDA cores keep the same per-core performance, then the 3070 has more cores than the RTX 2080, so it will be at least better than that. But I doubt Nvidia won't touch the cores at all, given this long span of development time. So I expect the RTX 3070 to be ~RTX 2080 Ti, which is just ... massive.
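The "more cores, so at least 2080-level" claim can be roughed out with a naive cores × clock proxy. The 2080 figures are the reference specs; the 3070 core count and clock below are rumoured numbers from this thread, not confirmed specs:

```python
# Naive throughput proxy: cores x clock, assuming unchanged per-core performance.
# The 3070 numbers are from the rumour under discussion; treat them as guesses.
cores_2080, clock_2080 = 2944, 1710   # RTX 2080, reference boost clock (MHz)
cores_3070, clock_3070 = 3072, 1950   # rumoured 3070 figures (assumption)

ratio = (cores_3070 * clock_3070) / (cores_2080 * clock_2080)
print(f"3070 / 2080 raw throughput: {ratio:.2f}x")  # ~1.19x
```

A ~19% raw uplift over the 2080 lands roughly in 2080 Ti territory, which is consistent with the expectation above.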
AMD fanboys are happy that RDNA 2 will bring RTX 2080 Ti performance, but Nvidia is moving that performance tier to mainstream, lol. AMD will not have the same luck with Nvidia as they had with Intel, which basically stuck with the same product for 5 years. No crap they managed to beat Intel, because Intel is still using 5-year-old parts to compete with AMD's latest and greatest. And they still compete on performance, so I don't know. Call me a hater, but AMD's success on the CPU side is to a big degree related to Intel being stuck in the past.
As for the 8000+ core behemoth, that will be reserved for HPC. The RTX 3080 Ti will probably have fewer cores, something in the region of 5000, which should bring at least 50% better performance compared to the RTX 2080 Ti. Sounds good.
Mate, look at the economics of GPU production. Big means expensive to make. And nVidia uses more transistors than AMD for GPUs of the same performance.
And they are about to cram in many more tensor cores and other stuff, making AMD's advantage even bigger when facing RDNA2.
My next GPU has to offer a 50% performance increase over my 1080 Ti and can't cost more than 500€!
Sorry, my fault; indeed the Xbox X is 405, not 495 mm², and the MI60 is 331 mm².
Yet a direct Nvidia shrink of the 754 mm² 12nm die still comes to roughly 460-500 mm² at 7nm; doubling the cores per this rumour nearly doubles that again to 800 mm²+ (I leave some margin since the shrink won't be a full direct 40%).
So regardless, no Nvidia GPU can be made at this size, which is my argument.
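The area argument above can be sketched with two multiplications. The TU102 area is the known 2080 Ti die size; the 12nm-to-7nm scaling factor is an assumption chosen to land in the 460-500 mm² range quoted above, not a foundry figure:

```python
# Rough die-area estimate; the shrink factor is an illustrative assumption.
tu102_area = 754     # mm^2, TU102 (RTX 2080 Ti class die) on 12nm
shrink = 0.63        # assumed 12nm -> 7nm area scaling (less than an ideal shrink)

shrunk = tu102_area * shrink   # direct shrink, same core counts
doubled = shrunk * 2           # rumour: double the cores, roughly double the area

print(f"direct shrink: ~{shrunk:.0f} mm^2")   # falls in the 460-500 mm^2 range
print(f"doubled cores: ~{doubled:.0f} mm^2")  # well past 800 mm^2
```

Even with generous margin, a doubled-core die blows past reticle-friendly sizes, which is the point being made.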
Well yeah, maybe now we'll get an upgrade for the price, whereas the difference between the 1080 (Ti) and 2080 (Ti) really wasn't worth it.
3080ti looks like a beast. The 3080 also looks faster on paper than the 2080ti. I'm ready.
haha I love how you make Nvidia pay twice for their ray-tracing hw, while giving AMD a freebie for their future implementation of the same feature:
Also, AMD's advantage(??) will be even bigger when they are on the same node... yet they can barely compete while having a full node advantage... fascinating stuff.
Here is that "advantage":
this is perf vs # transistors vs power, Turing vs RDNA, leveled ground - both without ray tracing (1660Ti vs RX 5500 XT):
If those specs are true, it's gonna wipe the floor with your bank account.
Now some reality check.
Are those GPUs being praised as new mainstream cards with higher-than-2080 Ti performance going to come without RT-related parts? No.
Is AMD's implementation, which enables RT-related tasks, based on separate (and therefore extra) building blocks requiring their own data path with in/out caches? No.
Is it expected that the 1660 Ti, with 50% more ROPs and 50% higher memory bandwidth, performs better than the 5500 XT? Yes.
Is it expected that the 5500 XT, which compensates with higher clocks, is less power efficient? Yes.
Is the only real representative sample (RX 5700 XT) outperforming its closest (and more expensive) competitor, the RTX 2070, while having fewer transistors? Yes.
When people talk about big-ass GPUs, take the closest representative example, not the worst possible low-end card you can find.
- - - -
So what are limitations here?
AMD's statement is that RDNA2 is going to deliver about the same performance-per-watt jump as RDNA1 did over the previous comparable generation, which meant 50%.
Can nVidia deliver a more power-efficient GPU? Yes.
Will it matter? Only for cards at the 300W limit.
What will matter is performance per $, as always. And features. Maybe this time the lower-end (read: mainstream) RTX cards will not be half useless for the RT that people had to pay quite some premium for.
- - - -
If this rumor is to be taken seriously, then the 3070 (1950MHz) will be able to deliver 90% of the RT performance the 2080 Ti has at 1750MHz.
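That 90% figure is consistent with naive RT units × clock scaling. The 2080 Ti's 68 RT cores are its published spec; the 3070 count below is back-solved to make the ratio work out, purely for illustration, not a confirmed number:

```python
# Back-of-envelope RT scaling: perf ~ RT units x clock.
# The 3070 RT-core count is an illustrative assumption, not a spec.
def rt_throughput(rt_units, clock_mhz):
    return rt_units * clock_mhz

perf_2080ti = rt_throughput(68, 1750)  # 2080 Ti: 68 RT cores (known spec)
perf_3070 = rt_throughput(55, 1950)    # hypothetical 3070 count at rumoured clock

ratio = perf_3070 / perf_2080ti
print(f"3070 / 2080 Ti RT ratio: {ratio:.2f}")  # ~0.90
```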
But then you can look at the list and make educated guess on which of those chips will be full and which will be cut down GPUs. (Mind the cost.)
And guess relative price of those cards.
And a nice example touching current nVidia GPUs is again the RTX 3070. In terms of building blocks it's around the middle of the road between the 2080 and 2080 Ti. It may have some 16B transistors.
Take the prices of those two existing cards. Approximate how expensive the 3070 would be on the same node. Then translate it to the new node and tell me how big a "leap" in performance per $ the 3070 will deliver over the 2080 Ti.
Even if you factor in the exponential growth of price with transistor count, make the old-node approximation for the 3070 quite cheap, and then add just 10% for the cutting-edge node, you'll end up with only about 10% higher performance per $ than the 2080 Ti.
- - - -
Then comes the harsh reality of this rumor. The 3080 will have some 2.5B more transistors than the 2080 Ti. And I doubt anyone expects it to be cheaper on a cutting-edge node.
That 3080 Ti is almost a joke in terms of transistor count, and therefore price, unless it is chiplet-based.
wall of text...
once again giving AMD a freebie: 5700 (no ray tracing) vs 2070 (ray tracing), while pretending the 2070S does not exist.
insisting on comparing apples and oranges, while glossing over the perfectly leveled-ground example (1660 Ti vs 5500 XT)...
You're unfairly counting the transistor cost of dozens of Nvidia's features over RDNA1 that you aren't factoring into the "performance" comparison of the two cards. How does NVENC quality measure up to VCE? VRS? DLSS? Ray tracing? Mesh shaders? These are all features that Nvidia has over AMD (and probably more) that would add to or change the transistor count but aren't being factored into your "performance" argument here.
Until you have actual concrete general performance, RT performance, feature set, and transistor count for a discrete RDNA2 chip, it's pretty meaningless.
"You get what you pay for" is a lie. People paid a premium and had a hard time enjoying RTX features anyway.
The next generation of nVidia's cards will not deliver any kind of price revolution. The lowest RTX 3060 is practically a 2070S in terms of rasterization but has double the RT core count plus some 15% higher clock, if we look only at the cheapest 2070S cards.
That's a nice RT performance boost, one that will be gobbled up quite fast and without extra effort. And as that happens, even the 2080 Ti will become a worse performer in RT games developed for the next generation.
But at least RTX 3060 will have reasonable RT performance and people may finally get what they paid for. (That's if rumor is true.)
If you look at the post I replied to originally, which @Noisiv kind of defends for no good reason, you may or may not agree that those "specs" (true or not) will not wipe the floor with anything other than wallets.
True or not, good or bad, all I have to say is if it costs more than $600 it doesn't exist for me.