Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Dec 10, 2020.
You want those 8c/16t cpus for this game.
Hoping my setup can do a consistent 50, that's all I'm asking for.
I mean a min. of 50.
Wish there was a 4c/8t CPU in the test to compare against the 9600K. 6c/6t gets absolutely destroyed; the 10600K has a 1.45x higher min fps just for having HT.
It like tries to add some kind of shadow smoothing/AA to shadows, and that's all shadows all around. Def hurts fps.
Great comeback, you post 720p benchmarks and call me a boomer, lol.
You would not make a great standup comedian lol.
But my point is they chose a CPU-limited scenario for a CPU test.
What a shocker.
Must call people fanboys just 'cause I posted it.
This game will expose any CPU that comes up short on threads or memory latency; even if you play at 720p you're not getting 60 in that case.
Is that an interpretation you like more? It's not like you're getting 60 at 1440p when your CPU is good for 45 at 720p.
There's just a 6% difference between the 10400F and the 10600K, just 'cause they're running 2666 RAM. Are you kidding? This is the most interesting CPU bench I've seen in a long while.
Btw, does 50 count as boomer?
Indeed, even though my very old 4930K (albeit OC'ed) does very well at 60 Hz.
I mean I do like it, but like many my hype was through the roof. I don't think anybody could have delivered on the level I was expecting. I'm gonna play it through at least a couple of times, and it will only get better over time with new hardware and software updates, so I definitely got my money's worth.
It's not the pace that's bothering me, it's the user interface. Those braindance sequences frustrated the hell out of me. I'm gonna try my hardest just to stick with melee and gun combat.
My laptop with a 4GB 1050 Ti can get "almost" 60 fps at 720p with EVERYTHING off or on low. Looks like poo, but it's playable.
No, this game is like RDR2, it's almost entirely GPU-dependent.
PS4 owners are jealous.
There is some very high CPU usage from what I've seen.
Well, I'm still playing this on a 1080 Ti @ 3440x1440, mix of low/medium/high, ~45 fps avg with G-Sync.
I admit it's a bit of a stretch, but given the 3000 series situation, I'll have to put up with it for my first playthrough. Good thing the game is very slow-paced, so it's playable even with lower FPS. The game itself so far is awesome though, so multiple playthroughs over the next year will happen for sure. Can't wait to enjoy it cranked up after I finally get my hands on a 3000 card...
I have no clue what age a boomer is. Your comedic skills ain't that bad.
I can say that Cyberpunk 2077 has not drawn me in yet. Need more time to get hooked; performance is acceptable for me. The bugs are immersion-breaking, especially shadows from people freezing, or no shadows at all. The game looks stunning at times.
I'm saving this until I can play it properly, with RT.
Not from what I've seen.
Witcher 3 was a bugfest and a performance drag at first too.
Waiting for the GOTY edition and expansions, bug and perf fixes, RTX cards.
Still, the day-one perf review is an interesting read.
From what I can gather from reviews and opinions, the game is so taxing on the GPU that most people won't get to see how much CPU power it takes to run it at a constant, let's say, 70 fps.
* TURN THE BASS AND TREBLE @ MAXIMUM......WHILE READING THE REVIEW ONE MORE TIME....
Yeah, I can't imagine playing it on the first-gen PS4 or the Xbone. I mean I might be able to up the res to 1080p if I cut my FPS expectations down to 30, but I'm far too much of a PC elitist to accept 30 fps in a game like this. So I'll take "looks like poo" over lower FPS with better graphics.
Now this I have to respectfully address. I think we can all agree that Witcher 1 was a disaster at launch, with up to a minute of loading time just going inside a building, but it was fixed with the Enhanced Edition free of charge. The Witcher 2 and 3 imo were flawless; I didn't have any issues with them at launch. Cyberpunk feels like Witcher 1: it needs a lot more polishing, and I kinda wonder what the hell they've been doing for the last year that resulted in the delays.