Gotta somehow sell the Titan. It's for people who'd rather spend an extra half a grand than put up with twitching eyes. --- The GTX 1080 Ti is an interesting card. Definitely for higher-refresh-rate 1440p panels, and a reasonable counter to SLI/Crossfire setups in the same price range. I just keep thinking... unless CPUs can stably drive 144fps, and you're not really into sliding every video setting to max, do you really need a 1080 Ti? Just by lowering MSAA you can gain something like 40-60 fps alone. Are GPUs really the limiting factor once we aim for a reasonable 144 fps? Imho, CPUs aren't cutting it anymore. Not sure if it's games that can't exploit the full potential of CPUs, or something else. We see frame drops on Intel (quad-core) CPUs; Ryzen does better here with more cores, but needs more optimization. Average FPS might be over 144, but once the engine has to render lots of things, the CPU just drops a lot of frames while the GPU is unaffected, churning out rock-solid stable frames.
Very good results out of the 1080 Ti. I can see this as a 1440p card, and as a 4K card with those results. Also, to me the extra 500-600 bucks isn't worth it for 1 more GB of VRAM. Makes me wonder how good the OC potential of the custom-cooled cards would be, or even a liquid-cooled 1080 Ti.
Great review. Total monster of a GPU. I'll still wait for Vega since I don't have my 4K display yet, but if Vega is the same or better, it will be a win-win for everyone.
Just waiting for my Asus card to arrive, can't wait. My current 1080 sometimes struggles at 3440x1440 in games like BF1. The only problem for me is that the EKWB waterblock for the Titan XP is currently out of stock here in Denmark, and according to EKWB it's the only block that fits the 1080 Ti. Not sure if that's true though (probably not :nerd:). So I guess I'll be sitting and staring at a box with a 1080 Ti in it until local retailers get those blocks in stock :-D Thanks for the review. I FINALLY registered here on Guru3D so I can thank you, H, for those always-great reviews.
The market is heading towards 32" 4K and 34" ultrawide monitors becoming the new standard, the same way 23" 1080p became the standard years ago. It seems like 27" QHD is just a stepping stone along the way. So if we're going to do 4K reliably, we need the GPU horsepower, especially if we start looking at 144Hz 4K panels in a few years.
To the people saying this is not a 1080p card: I disagree. People forget high refresh rates. More often than not I'm GPU-limited with my 980 Ti and can't reach 120FPS, and 60FPS doesn't cut it for me anymore; with many games I have to play at 80Hz. In fact, only a very few of my games can reach 120FPS at 1080p; the rest hit 99% GPU utilization way before that. Yes, for 60FPS gaming at 1080p this new GPU is a waste of money. But if you want a good 120FPS/120Hz sync, or at least 100FPS/100Hz as a minimum, this GPU makes a lot of sense. I'm considering getting one at this point. If you've ever played a game at 120Hz/120FPS, you're simply incapable of ever perceiving 60FPS as "good" again. And no, it's not just for first-person shooters. Every game where you can look around with the mouse benefits in exactly the same way from 120Hz. It's just as important in an RPG like Witcher 3 as it is in an FPS like Doom 4. In addition, people also forget that new games get GPU-heavier, so frame rates will naturally decrease as newer games use more complex graphics.
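To put those targets in perspective, here's a minimal sketch (plain Python, just the arithmetic, nothing taken from the thread itself) of how much frame-time budget each refresh-rate target actually leaves the GPU:

```python
# To hold a steady N FPS, every single frame must finish in 1000/N ms.
def frame_budget_ms(fps: float) -> float:
    """Milliseconds available per frame at a given FPS target."""
    return 1000.0 / fps

for fps in (60, 80, 100, 120, 144):
    print(f"{fps:>3} FPS -> {frame_budget_ms(fps):5.2f} ms per frame")
```

Going from 60 to 120 FPS halves the budget from roughly 16.67 ms to 8.33 ms per frame, which is why a card that is comfortable at 60Hz can hit 99% utilization long before reaching 120.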
^Lol, that's why ignorance is bliss. I'm sure I could appreciate a higher refresh rate, but I don't want to have to upgrade my hardware as often as Intel/Nvidia/AMD would like me to. I keep modest expectations and am pleasantly surprised the vast majority of the time with an aging system and only a "newish" GPU. And no, even this card at 1080p/60fps will probably not be overkill within the next year for those who absolutely must turn on all of the proprietary Nvidia shenanigans. I'm quite sure that by year's end there'll be people with this card scratching their heads and drooling over the next Ti because they couldn't enable 8192-resolution HFTS shadows in Far Cry 10, or the highest level of HairWorks in Assassin's Creed 20 that enables PhysX on the unseen pubic hairs of every NPC in the game, without dropping below 60 fps.
Depends on the game. Sometimes the highest core is at 60%, sometimes 70%, sometimes just 40%; it varies. Some games are CPU-limited, and on those I do get 90+% CPU usage while the GPU isn't doing much. But in most games it's the GPU that limits me, not the CPU, and obviously those are the games where the 1080 Ti makes sense. Yeah, sure, I can lower shadow quality, disable ambient occlusion and such to get my 120FPS, but... you know, I don't want to :cry: (That's actually a lie. I *do* lower these things if I can't get at least 100FPS. And sometimes I'll even accept dropping the resolution to 900p to get there. Which is why I'm tempted to get a 1080 Ti...)
Thanks for the replies. Maybe I'm the only one hitting a CPU bottleneck in titles like BF4. A GTX 970 with an OC can drive it at 1440p@96fps without issue on high settings with medium shadows, but the CPU starts to lag and hit its limits. The CPU is OCed with no thermal issues. It still feels weird that GPU load can easily be managed with video settings, while almost nothing helps with the CPU. And yet some people just refuse to adjust settings and go full eye candy because there's a bar that can be pushed all the way right. G-Sync just let gamers max out every graphics setting and enjoy a smooth-frames bumpfest; maybe that's the cool thing to do nowadays. Gaming is not benchmarking. Just because reviewers run games on max settings to bench hardware doesn't mean that's the only way the game has to be played; it only shows capability. Regarding 4K: even when 4K monitors start shipping with 144Hz panels, there's no way the 1080 Ti will come close to driving them. That's a totally different level.
That's an online game, and those are usually heavy on the CPU. The only online game I play is CSGO, and that's CPU-limited too; the 980 Ti doesn't do much in that game, it just sits there idling.
I hope so... but Nvidia already has a GPU above the GTX 1080 Ti in the pro segment (just in case Volta isn't ready in time for Vega). Titan users are not GTX users; it's like buying an iPear phone.
The majority of Titan sales are in servers; they are the most popular deep-learning GPU. Both AnandTech and PC Perspective talk about this quite frequently. They actually removed the "GeForce" badge with the latest one to further emphasize that it's not targeted at gamers.
It has no real hardware differences from their "GTX" line; for more compute capability you actually need a Tesla card. They just offer the "Titan" models earlier, at a huge margin, to effectively fund the rest of their line. The delay between the Titan and the immensely more cost-effective "Ti" models only serves as an excuse for over-spending IT departments. Whoever really cares about HPC will go for Tesla anyway, so Titans look more and more like a bad expense for people who can't wait six months.
Don't you think these are pointless debates? Buy or don't buy. Why do people feel the need to protest about something they can't afford, or that comes from a vendor they don't side with? It's all a bit daft really.