So basically I'd need two to outdo my three. At its current price, I'd have to be insane. Oh well. Here's to waiting for the 7XX series.
Because Crysis and FPS games are the only games, right? I'd like to see how it stacks up against The Witcher 2 with ubersampling.
I haven't had a chance to read much, but from what I have read it is very good; the price tag is a downer. I think they are going to sell like hotcakes even at that price.
My cards are watercooled and run 24/7 at 1200 core and 1800 mem. My experience with Crossfire is great overall. Of course there have been problems with drivers, but nothing major with games. Most of the problems over the years have been with clocks at idle, and that was resolved a long time ago. When it comes to games, the only one I remember not having a workaround for was The Witcher 2, and it took about two weeks to fix if I remember correctly.

Anyway, I have had Crossfire since the 4870X2 + 4890, then a 5970 (my favorite card) + 5870, and now 7970 x2, and I am a happy customer. Support is way better nowadays and I could never see/feel microstutter. CFX is a must for me with a 2560x1600 screen. RadeonPro is great, including a lot of excellent features like FXAA/SMAA, HBAO, CFX profiles, SweetFX, etc.

Sorry for the off-topic. Titan looks like a great card, but at that price it's not for me. I think my cards at 1300 MHz can stand a fight :nerd:
Yep, I think they are going to sell out real quick. I got a tool that checks for available stock on Newegg and Amazon every 20 seconds and sends me a text when there is stock, so I'll get one.
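For anyone curious, a stock watcher like that is basically just a polling loop: fetch the product page, look for an in-stock marker, and fire a notification. A minimal sketch in Python, where the URL, the out-of-stock marker text, and the notify step are all placeholder assumptions (the real markers depend on the retailer's actual HTML):

```python
import time
import urllib.request

CHECK_INTERVAL = 20  # seconds between polls, like the tool described above

# Placeholder values -- not Newegg's or Amazon's real page URL or markup.
PRODUCT_URL = "https://example.com/titan-product-page"
OUT_OF_STOCK_MARKER = "OUT OF STOCK"

def page_shows_stock(html: str) -> bool:
    """Heuristic: the item is in stock when the marker is absent."""
    return OUT_OF_STOCK_MARKER not in html.upper()

def notify(message: str) -> None:
    # Stand-in for a real alert (e.g. an email-to-SMS gateway).
    print(message)

def watch() -> None:
    """Poll the product page until stock appears, then alert once."""
    while True:
        with urllib.request.urlopen(PRODUCT_URL) as resp:
            html = resp.read().decode("utf-8", errors="replace")
        if page_shows_stock(html):
            notify("In stock! " + PRODUCT_URL)
            break
        time.sleep(CHECK_INTERVAL)
```

A real version would also want a timeout on the request and some error handling so a failed fetch doesn't kill the loop.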
It's average at 1920x1080; at anything higher it's still too much. http://forums.guru3d.com/showpost.php?p=4534174&postcount=90
I was reading other reviews yesterday and it made me laugh when I came to the HardOCP review, where they were excited to be able to play Crysis 3 maxed out at 5760x1080 at 48 fps with, wait for it, 3x Titans in SLI. That's £2,500 worth of graphics cards to run a game across three screens at less than 60 fps. I don't know what they were smoking but, if that were me, I'd demand nothing less than 60 fps even at that resolution.

I finally got to play Crysis 3 for myself in the early hours of this morning when it unlocked here in the UK, and it is clear that 2 GB of VRAM is a hindrance in this game even at 1920x1200 when using maxed-out (Very High) settings and TXAA or MSAA (which, IMO, offer superior anti-aliasing to FXAA or SMAA). I saw dips to 18-22 fps during the opening section, but when VRAM wasn't being stressed I was getting anything from 30-60 fps, which is at least playable.

This isn't enough to convince me to go out and buy a Titan, but my next graphics card will almost certainly be one with 4 GB or more of VRAM.
By the way, why is the GPU clock so much lower than on the GTX 670/680? Is it because of the 384-bit bus, temperatures, what? I know review samples can be overclocked to 1,100+ MHz, but why aren't they clocked at 1 GHz or more to start with?
That makes sense, I guess, but it does seem strange to release this really expensive high-end product with lower clocks than the much cheaper GTX 680. If it had been 1,006 MHz out of the box, its performance would have been even more impressive, further justifying the asking price. Yeah, I know you can overclock, but reviews are mostly done at default clocks, as you know, with the overclocking benchmarks added at the end.
Nice review indeed. This is very much expected; let's just wait and see how AMD and the upcoming HD 7990 will respond to this... I wonder about the price too...
I don't know about a goldmine, as I don't expect they will sell too many of these; it'll probably end up being a kind of limited-edition run of 20-30,000. Of course, I'm sure it'll still be profitable for them. Me, I baulk at the idea of paying over £400 for a graphics card, and the only reason I bought a second GTX 680 was that I was curious how SLI compared with my previous HD 5870 CFX experience, having never tried it before. It's been a mostly pleasant experience, I'm happy to say, but I want to go back to using a single GPU for games for my next card, as I cannot afford to be spending £800 on two cards every 18 months or so. I'm fine with paying up to £400 for one, though.
AMD's fps numbers are overstated. They figured out a trick with runt frames: frames which are not actually fully rendered still trigger the fps monitor as real, fully rendered frames. This is a real problem for AMD, much worse than the latency problem. Crossfire is a disaster, which is why numerous reviewers, including Tech Report, have written that Crossfire produces higher fps but feels less smooth than Nvidia. Check this article out. http://www.pcper.com/reviews/Graphi...ance-Review-and-Frame-Rating-Update/Frame-Rat
I'd actually like to see them do a comparison with previous Nvidia cards -- the 8800 GTXs of the time were terrible for stutter and frame-latency problems. I think the 600 series is the first time Nvidia has managed to get it right. My SLI experience with the 690 has been nothing short of fantastic.
The frame-time issue is well known. It's not some trick to score higher in FPS benchmarks. AMD has been pretty forthcoming about working it out, though many users don't even notice it. http://techreport.com/review/24218/a-driver-update-to-reduce-radeon-frame-times/5 http://techreport.com/review/24218/a-driver-update-to-reduce-radeon-frame-times/2 You can see the explanations, what is being done to correct the issue, and how the fix affects three specific games. Frame times are smoothed; FPS is unaffected. EDIT: I won't edit the above, but it looks like the fix shaves a bit off the Guild Wars 2 benchmark. Either way, it's a problem almost across the board, and it doesn't have to affect FPS, though it might. But it's definitely not a "trick".
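To see why FPS and frame times tell different stories, here's a toy calculation (the numbers are invented for illustration, not measurements from any card): a sequence where every other frame is a near-instant "runt" averages the same FPS as perfectly smooth delivery, but the slowest frames reveal the stutter you actually feel.

```python
def average_fps(frame_times_ms):
    """FPS as typically reported: frames rendered divided by elapsed time."""
    total_s = sum(frame_times_ms) / 1000.0
    return len(frame_times_ms) / total_s

def worst_frame_fps(frame_times_ms):
    """Rate implied by the slowest single frame -- closer to perceived smoothness."""
    return 1000.0 / max(frame_times_ms)

# Smooth delivery: every frame takes 20 ms.
smooth = [20.0] * 10   # average 50 fps, worst-frame 50 fps

# Runt pattern: a 1 ms runt after every 39 ms frame. Same elapsed time,
# so the same 50 fps average -- but the 39 ms frames feel like ~25 fps.
runty = [39.0, 1.0] * 5
```

This is essentially the argument for frame-time (or frame-rating) analysis over plain FPS counters: the average hides the spikes.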