Discussion in 'Videocards - NVIDIA GeForce' started by Netherwind, May 31, 2016.
Check the price on that one in NE and say that again.
1070 is a great card. Perfect fit for 1440p, it seems. If I were building now, I'd go for it.
I tried, but I can't find it.
The MSI one is around €530 (€500 from Europe), so hopefully this one will be something like that.
Shop trying to take advantage of the situation
you mean 1080p :nerd:
Imo 1080GTX is that 1440p card.
The 1070 can do 1440p if you lower the eye candy in the game. Yeah, the 1080 can do it better, or at higher presets, than the 1070.
Well so is a 780Ti or 980Ti then, even at 4K.
I'm thinking near-max or max settings, and for that it's a 1080p card. I know what a 980 Ti does; mine has an even higher OC and I still feel like it's a 1080p card in very demanding games.
My 1070 can hit 21.5k in GPU score when the memory is overclocked by 900+. But there's no point, because Firestrike shows blinking green stars as artifacts. It would just be a high score without substance, and borderline cheating.
My Firestrike 17205 (GPU 20828) is the best score I can get without any funny artifacts while benchmarking (core clock +238, mem clock +732).
My question to fellow overclockers....
Are you benchmarking Firestrike just for e-bragging, or are you being honest about it and discarding a score if there's artifacting?
Speaking for myself, any score that comes with ANY sort of issue (artifacts, freezes, TDRs, lock-ups, etc.) does not qualify as a valid OC, and should not be mentioned or considered at all.
Edit: Is your 6700K at stock or OC'd? If OC'd, what's your CPU turbo frequency? Nice 14890 physics score! (DDR4 helps with that too.)
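For anyone curious how the 17205 overall relates to the 20828 GPU and 14890 physics sub-scores: 3DMark's Fire Strike overall is a weighted harmonic mean of the graphics, physics, and combined sub-scores. A rough sketch, with the weights taken as an assumption from the 3DMark technical guide (they're not stated anywhere in this thread):

```python
# Fire Strike overall = weighted harmonic mean of its sub-scores.
# Weights assumed from the 3DMark technical guide: graphics 0.75,
# physics 0.15, combined 0.10 -- treat these as an assumption.
def firestrike_overall(graphics, physics, combined):
    return 1.0 / (0.75 / graphics + 0.15 / physics + 0.10 / combined)

# With the posted graphics (20828) and physics (14890) sub-scores,
# a combined sub-score of roughly 8300 lands near the posted 17205 overall.
print(round(firestrike_overall(20828, 14890, 8300)))
```

The harmonic mean is why the overall sits well below the graphics sub-score: the weakest sub-score drags the total down more than a plain weighted average would.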
Having to lower settings from the highest level for the 1070 to play 1440p does make it a 1080p card, indeed. And like you mentioned, a 970 can do 1440p, but not at full settings.
I often see people call it "moronic" to say a card is "overkill" for 1080p, when some games, with the original in-game settings cranked up, can bring even a 980 Ti to its knees, even at the 60 fps level.
I wish the pricing were better. Seems like it's about 20-25% more expensive this round.
I bought an MSi 1070 Gaming X for $459 USD. All 1070s are in the $400+ range.
Within 3 weeks of the 970 release (11/2014), I bought an EVGA 970 SC for $329.
A month later, I bought an MSi 970 Gaming 4G for $349. You would really have to look to find a $400+ 970 back then.
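A quick back-of-the-envelope check on those launch prices (just arithmetic on the figures quoted above, not an official MSRP list):

```python
# Generational price jump, using the launch prices mentioned in this thread.
prices_970 = {"EVGA 970 SC": 329, "MSI 970 Gaming 4G": 349}
price_1070 = 459  # MSI 1070 Gaming X, USD

for model, p in prices_970.items():
    jump = (price_1070 - p) / p * 100
    print(f"1070 Gaming X vs {model}: +{jump:.0f}%")
```

Against these particular models the jump works out to roughly +32% to +40%, so the 20-25% figure is, if anything, on the charitable side.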
Comes with free lube perhaps?
Still, the option of buying a used 980 Ti for $500 or more is trumped by the very similarly performing 1070 at the same price or cheaper.
It's overclocked to 4.6 GHz turbo. Anything above that causes heating issues.
DDR4 is running at 2700 MHz (2133 MHz is stock). I have that ill-fated Ripjaws 4 DDR4 RAM, which is not fully compatible with the Z170 chipset and won't let the PC boot when run at its rated 3000 MHz profile. G.Skill released Ripjaws V for Z170 after the problems with the memory XMP profile emerged.
You have to ask Xenthor that. Maybe he had artifacting when he hit a 21.5k GPU score, I don't know. I can hit 20.8k on my 980 Ti without a single artifact, though. Maybe I could even push it higher. I haven't really tried, plus my 980 Ti isn't the most amazing overclocker (haven't tried to push it past 1450, really).
I think most will consider a score valid, artifacts or not.
Interesting. Just checked OcUK, MSI Gaming 980ti costs £389, MSI Gaming 1070 costs £419. Overall, the 1070 seems to be around £50 more expensive than the 980ti over here atm, if you look at same models from each side. 1070 needs a price drop. Still, if I was buying new now, I'd go with the 1070. I got my 980ti a fair bit cheaper though so I went with that.
Both are great cards though, no doubt and they both perform pretty much identically atm.
Here is my best GPU score so far for 980 Ti, no artefacts during run:
(For some reason it says "unknown vga"; iirc I was using a hotfix driver at the time. Clocks show correctly, though.)
So I got a 1070 the other day, but I'm noticing stuttering and intermittent audio loss in both Hitman (2016) and Witcher 3, the latter doesn't have the audio problem as bad as Hitman. I'm not sure what the cause was as I never experienced these issues in Witcher 3 on my old card (770), and I'd never tried Hitman on it so I don't know if it would have been different. Obviously I can't roll back on drivers since there are only one set of drivers out currently for the 1070. Any idea what could be causing this? I hope to god the hardware isn't defective because getting it sent back at this stage would probably take ages.
No, really. 980-Ti and 1070 are the go to 1440p 144Hz imo.
The 1080 is overkill for 1440p, and not yet perfect for 4K.
The only reason to buy a 1080 would be DSR 4K, maybe? :/
I've been on a G-Sync 1440p 144Hz monitor for a year. And with my overclocked 980 Ti, similar to an overclocked 1070, there's just not a single game I cannot run maxed out with awesome playability at 1440p.
Not a single one of my benchmark runs has artifacts; I stop any run with artifacts for fear of damaging the card.
This is my permanent score for any game:
And those are the scores I would have shared if I wanted to brag:
They don't have artifacts either, but I don't feel confident using them 24/7. They might be stable if I were only gaming, but I like to have a movie running with a web browser open on the 2nd screen, and to be able to record my gameplay.
Well, I can't max out Rise of the Tomb Raider @ 1080p (including SSAA) and get 60fps+, nor Killing Floor 2 @ 1440p+, nor GTA5 with extra MSAA and grass at ultra, and a few more.
So yeah, I still stand by that: the GTX 1070/980 Ti are 1080p cards, the GTX 1080 a 1440p card.
Many have been buying 1440p displays since Kepler and before. 95% or more of games play just fine, even with a 970 as in my case. If one doesn't, adjusting a few settings is all you need, and it will still look better than higher settings at 1080p. If someone demands 8xAA, SSAA, and 60FPS or more even in slow-paced games, then yeah, I guess they will be stuck on 1080p for a long time. Personally, I can never go back to smaller, lower-res monitors. Even more so since gaming does not occupy the major part of my overall PC usage.
Well, in the case of Rise of the Tomb Raider, I'd rather play at native 1080p with 2xSSAA than DSR 1440p + FXAA. It looks a lot better with 2xSSAA than with FXAA+DSR. But to each their own.
In other, less demanding games I DSR to 1620p, which looks like the sweet spot from 1080p. But yeah, for the very latest and future games I'm sure I won't be able to use it as much; I saw what the GTX 1080 does, and that extra 10-15 fps gap is just enough at 1440p in those latest/future games.
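On the 1620p-from-1080p point: a DSR factor is just the ratio of rendered pixels to native pixels, which is why 1620p on a 1080p panel corresponds to NVIDIA's 2.25x setting. A small sketch of that arithmetic:

```python
# DSR renders the game at a higher resolution and downscales to the
# native display. The "factor" is rendered pixels / native pixels.
def dsr_factor(native, rendered):
    nw, nh = native
    rw, rh = rendered
    return (rw * rh) / (nw * nh)

print(dsr_factor((1920, 1080), (2880, 1620)))  # 1620p from 1080p -> 2.25
print(dsr_factor((1920, 1080), (2560, 1440)))  # 1440p from 1080p -> ~1.78
print(dsr_factor((1920, 1080), (3840, 2160)))  # 4K from 1080p -> 4.0
```

So the "sweet spot" at 1620p means pushing 2.25x the native pixel count, which is why very demanding games start to choke on it before they do at native 1080p.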