Unlike Intel, Nvidia isn't dropping the ball. That said, for as long as AMD doesn't compete, I don't think this kind of performance will replace 'current gen'; at best it will be sold 'beyond current gen' for a premium price, as the $3000 MSRP suggests. The die is huge and yields are likely not the best in the world, but $3000 has tons of margin in it. At this rate I wouldn't be surprised if we have to wait another 2~3 years until this trickles into the mainstream segment (if it even does; it all depends on AMD). The world has gotten so used to the 10~30% jump every generation that it's easy to blow people's minds by showing something like the Titan V, and a fair share of folks will find the price tag acceptable. Jumps of 50~100%(+) per generation weren't entirely uncommon a decade ago.
Oh dear, why are so many whining about a card that clearly isn't designed for gaming? Why not treat this story as an amusing project that none of us gamers are going to waste our money on? Let's see what elements of this prosumer card are brought into the 1080 Ti's replacement. Employ a GSOH and don't take an irrelevance (to us gamers) so seriously! I'm fascinated to see what the next gen of TOTL Nvidia gaming cards will achieve; perhaps then I can FINALLY upgrade my SLI 780s and have a single card capable of driving a 4K 144Hz monitor properly? Well, hopefully the custom Ti version will.
They were? The Titan X was the best-selling deep learning card, and Nvidia purposely removed the GeForce branding from them. The original Titan was the only Kepler model that kept high-performance FP64. Titans have always been a card that is the best of both worlds; this time it happens to lean much further into the compute/workstation sector, with both the Tensor cores and high-performance FP64. Nvidia is going to bin their chips where they feel they can extract the most profit from them. Like I said previously, it's pretty clear that GV100 has an issue with failed HBM stacks, so they are simply taking those chips, rebranding them as Titan Vs, and selling them for far less than GV100 - but still with a huge margin. Those margins subsidize the costs of the gaming cards that come later. Also, as for Stevethegreat's 1080 Ti comment: you can't run 1080 Tis in TCC mode, so they are essentially useless for large-scale compute clusters. Which is why the Titan X was so popular.
And sat at the very top of gaming performance by a wide margin until they released the Ti model of the top-end chip...
This one does as well? People are just upset that it costs 3x more, but it's also capable of nearly 4x the performance in specific workloads; it's just that those workloads aren't gaming. But it's not a gaming card: it was never marketed as a gaming card and doesn't even carry the GeForce branding anymore. It's been five generations of Titans, and with each and every release people come here completely baffled by the product - I just don't understand that.
I'm not baffled in the slightest - except maybe by Nvidia's marketing department and naming schemes.
How would a Titan card fare in a workstation situation, like the CAD-based work a Quadro would be geared for? I ask because the Titan could be a cheaper workstation card versus the Quadro. Also, if you look at the standard Vega cards (not the RX Vega): AMD made those cards with game developers in mind, because the standard Vega cards have a game mode you can toggle on or off in the control center. I'm not sure where the Titan cards would fall as far as usage goes.
Here are a few quick tests of the Titan V: https://www.reddit.com/r/nvidia/comments/7iq5tk/just_got_a_titan_v_this_morning_ask_me_to/ Hopefully Otoy and the Blender Foundation will update their CUDA support for this GPU as well; I want to see how it performs there. Hope this helps. Thanks, Jura
From the Reddit benchmark link above, SPECviewperf 12.1 results:
3dsmax-05: 180.12
catia-04: 206.16
creo-01: 145.23
energy-01: 27.83
maya-04: 198.07
medical-01: 90.15
showcase-01: 164.45
snx-02: 224.08
sw-03: 123.48
(If you're a professional user, these numbers are better than a Quadro P6000 in everything except sw-03.)
Oh, OK, thanks for the info. That more than supports my theory that the Titan V can be used as a cheaper workstation-card alternative to the Quadro.
New Time Spy score at approx. 2.0 GHz: 35% faster than a 1080 Ti at similar clocks. https://www.3dmark.com/spy/2911339
Why no 4K benchmarks? I would think anyone crazy or rich enough to buy this "overpriced" card for gaming would consider using it for 4K. I wonder when I'm going to get a real 4K GPU; at the moment there isn't anything. The 1080 Ti is on the edge, but I've got an overpriced GTX 1080 that works OK. I want more power and stable fps above 75.
Unfortunately, I suspect the clock speeds are misleading, since a reference 1080 Ti reporting 2.0-2.1 GHz in 3DMark is power throttled unless it's using a higher-TDP vBIOS. The problem would be considerably worse on something like the Titan V. We won't really know how much headroom it has until someone with the balls does a shunt mod. This thing could very well need in excess of 500 W to hit its peak performance.
Nvidia's Titan line is expensive, water is wet, the sky is blue, etc. It has been synonymous with expensive since its inception.
Considering the 1080 Ti was about 75% faster than the 980 Ti, this being about 40% faster is NOT astonishing at all. The FP32 compute bump is really nice, though.
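For what it's worth, those generational percentages compound multiplicatively rather than adding up. A quick sketch (the 75% and 40% figures are the rough ones quoted above, not measurements of mine):

```python
# Relative speedups compound multiplicatively, not additively.
# Figures below are the rough ballpark numbers quoted in this thread.
gain_1080ti_over_980ti = 1.75    # 1080 Ti ~75% faster than 980 Ti
gain_titanv_over_1080ti = 1.40   # Titan V ~40% faster than 1080 Ti

# Chaining the two gives Titan V relative to the 980 Ti.
gain_titanv_over_980ti = gain_1080ti_over_980ti * gain_titanv_over_1080ti
print(f"Titan V vs 980 Ti: ~{(gain_titanv_over_980ti - 1) * 100:.0f}% faster")
# prints: Titan V vs 980 Ti: ~145% faster
```

So even a "merely" 40% gaming uplift still leaves the card roughly 2.45x a 980 Ti two generations back.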