Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Jun 5, 2019.
I NEED HDMI 2.1 ASAP
Well, the next release from nVidia is going to be a tricky one.
Right now, an RTX 2080 Ti can only do 60+ fps with RT on at 1080p (and decent 1440p), and that's paired with an i9-9900K.
Since nVidia will certainly price Ampere way above Turing, they must deliver 60fps@4K. I don't think people would spend ~$1,500 just to get 75fps@1440p instead of 60fps.
Plus, Samsung's 7nm process has not been tested yet (besides their own test samples), so that's a big question too.
My thoughts exactly.
Yeah, that's not going to happen until someone competes with them at the high end, and it doesn't look like that's happening anytime soon. The mid-range and lower is going to see a huge shake-up in the next year, but that 2080Ti is going to remain untouchable. There probably isn't enough market in the above $700 range for AMD to even bother trying (unless it also has some sort of datacenter application they can make real money from).
As for Intel Xe, I'll believe it when it's in a reviewer's hands telling me it is faster than a 1080p@60-capable card when they ship in a year or two. My bet is that's about all they are going to manage, and that's completely dependent on them getting a process shrink working that can actually turn out enough wafers. They already dedicate a pretty big chunk of silicon on their current CPUs to graphics, and their top-end Iris is an order of magnitude slower than even the lowest-end dedicated GPUs from AMD and nVidia. It's possible that they somehow have a miracle waiting in the wings, but it just seems... unlikely.
20 to 30%, maybe more, is possible, but steering the chip design towards ray-tracing improvements will cut that percentage down a lot. The real problem with this year's new RTX gen is pricing.
Why would that be the case?
Honestly, if they had made a GTX 2080 Ti at the price of the RTX 2080, I would probably have upgraded from my 1080 Ti, but the pricing of the 2080 Ti makes it a no-go.
My Gigabyte 1080 Ti Gaming OC cost me $629.00 new. Double the price for a 30 to 40% gain with the 2080 Ti? No thanks.
I wonder if, in the future, they'll just have a dedicated ASIC that does ray tracing, for very fast performance on RT tasks.
If Nvidia keeps up this nonsense pricing, it will reliably kill its new-gen products.
€750 for the Lightning is a good offer.
Just to nitpick, but wasn't IBM NVIDIA's main manufacturer for the infamous FX series back in 2003?
I wonder if this ridiculous, impossible idea will ever stop being postulated.
The laws of physics may suck, but they're there and can't be overcome; better make peace with them...
Yep, and they got burned...
This is very exciting. Turing was a huge architectural change for NVIDIA; it made the die huge and expensive, so performance per $ wasn't great, but it's a very advanced piece of hardware that begs for a die shrink. The last time we got a real die shrink (TSMC's 12nm and 16nm are basically the same) was the 1000 series, and performance roughly doubled. For instance, the GTX 970, which did 3.49 TFLOP/s on a 398mm² die at 28nm, became the GTX 1070, which did 6.46 TFLOP/s on a smaller 314mm² die at 16nm. That's 85% more performance on a 21% smaller die. So I have very high hopes for the 3000 series: the RTX 3070 could have 2080 Ti levels of performance, and I'm expecting lower prices considering the 2000 series hasn't sold well. So an RTX 3070 with 2080 Ti performance for $400-$500? I'm in.
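For what it's worth, here's a quick back-of-the-envelope check of those 970 → 1070 numbers (the TFLOP/s figures and die sizes are the published specs; the percentages are derived from them):

```python
# Sanity check of the GTX 970 -> GTX 1070 scaling quoted above.
gtx_970 = {"tflops": 3.49, "die_mm2": 398}   # GM204, 28nm
gtx_1070 = {"tflops": 6.46, "die_mm2": 314}  # GP104, 16nm

perf_gain = gtx_1070["tflops"] / gtx_970["tflops"] - 1      # ~0.85 -> 85% faster
area_shrink = 1 - gtx_1070["die_mm2"] / gtx_970["die_mm2"]  # ~0.21 -> 21% smaller
density_gain = (gtx_1070["tflops"] / gtx_1070["die_mm2"]) / (
    gtx_970["tflops"] / gtx_970["die_mm2"]
)                                                           # ~2.35x throughput per mm^2

print(f"performance gain: {perf_gain:.0%}")
print(f"die shrink:       {area_shrink:.0%}")
print(f"TFLOP/s per mm^2: {density_gain:.2f}x")
```

Throughput per mm² of silicon roughly 2.35x'd across that shrink, which is why hopes for the next real node jump run so high.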
People claiming that RT made the die huge are wrong and need to stop speaking.
Isn't the Turing die much larger than Pascal's? I wonder why, if it's not the RT.
I did and they said it's the RT.
FWIW, I don't think it's the RT; I think it's the Tensor cores, which aren't required by RT. But telling people to shut up and google the answer, when 99.9% of the internet thinks the answer is RT, is pretty dumb. There have been multiple threads and posts, and even AnandTech kind of concludes that a major part of the die difference between the 1660 and 2060 is RT/Tensor. So I'm not sure what you expect them to find but RT.
The shrink will help with yields, but the cost per transistor on 7nm EUV is higher than on 16/14/12nm, which will drive up the cost of chips regardless. I don't think 7nm is going to be as big of a jump as people think without the price inflating significantly.
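A rough illustration of that cost-per-transistor point, with purely hypothetical numbers (real wafer pricing isn't public, so treat every figure below as a placeholder): if density doubles but the wafer price more than doubles, each transistor still gets more expensive.

```python
# Purely illustrative: why a denser node can still cost more per transistor.
# Wafer prices and densities below are hypothetical placeholders, not real foundry figures.

def cost_per_mtransistor(wafer_cost_usd: float, mtr_per_mm2: float,
                         usable_mm2: float = 60000.0) -> float:
    """Dollars per million transistors, ignoring yield differences for simplicity."""
    return wafer_cost_usd / (mtr_per_mm2 * usable_mm2)

old_node = cost_per_mtransistor(wafer_cost_usd=4000, mtr_per_mm2=25)  # stand-in for 16/12nm
new_node = cost_per_mtransistor(wafer_cost_usd=9000, mtr_per_mm2=50)  # stand-in for 7nm EUV

print(f"old: ${old_node:.4f}/Mtr  new: ${new_node:.4f}/Mtr")
# Density doubles, but the wafer costs 2.25x more, so $/transistor rises ~12.5%.
```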
RT cores are tiny fixed function units.
The size comes from the increased cache, the Tensor blocks (which double as the dedicated FP16 cores), and a hell of a lot more SMs per GPC.