Discussion in 'Games, Gaming & Game-demos' started by SerotoNiN, Jul 15, 2019.
2560 x 1440 (typical monitor resolution)
2048 x 1080 (official cinema resolution)
2K is half of 4K; I don't care what nonsense DCI wants to come up with on the matter. It's their own fault they made 4K mean both 3840x2160 and 4096x2160.
1440p is not a 2K resolution in any case.
I hate the 2K term. It should be 2.5K.
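For what it's worth, the pixel math behind the naming argument is easy to check. A quick sketch using the figures quoted above (labels are just my own shorthand):

```python
# Pixel counts for the resolutions discussed in this thread.
# "DCI" rows are the official cinema containers; the rest are consumer formats.
resolutions = {
    "1080p": (1920, 1080),
    "DCI 2K": (2048, 1080),
    "1440p (often mislabeled 2K)": (2560, 1440),
    "UHD 4K": (3840, 2160),
    "DCI 4K": (4096, 2160),
}

for name, (w, h) in resolutions.items():
    print(f"{name:28s} {w}x{h} = {w * h:>10,} pixels")
```

Both "4K" variants are exactly double their "2K"/1080p counterparts per axis (so 4x the pixels), which is why calling 2560x1440 "2K" doesn't line up with either convention.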
You probably can't tell the difference if it's on a small monitor. Once you go to a large display (32-40"), you begin to appreciate all the visible details 4K can bring, i.e., small objects on a small 1080p screen come to life on a larger, higher-res screen. A red apple might appear like a small, indistinct cherry on a 24" screen but looks far more lifelike, with all its fine textures, on a large 40" 4K screen.
Resolution is highly tied to screen size to bring out its full potential. E.g., 24" screens don't benefit much from going higher than 1080p (what's the point if things appear even smaller as higher res scales things down?). At 27", 1440p is the sweet spot. At 30" and above, 4K comes alive and brings maximum visual fidelity to anything you view or play, and only then may you begin to appreciate what you have been missing. Again, 4K does not live up to its full potential on small screens.
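The size argument above is really a pixel-density argument, and it is easy to put rough numbers on. A minimal sketch using the standard PPI formula, with the size/resolution pairings from the post (diagonal sizes assumed, since exact models aren't stated):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count divided by diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

combos = [
    ('24" 1080p', 1920, 1080, 24),
    ('27" 1440p', 2560, 1440, 27),
    ('27" 4K',    3840, 2160, 27),
    ('40" 4K',    3840, 2160, 40),
]
for name, w, h, d in combos:
    print(f"{name}: {ppi(w, h, d):.0f} PPI")
```

A 40" 4K panel lands at roughly the same density as a 27" 1440p panel (~110 PPI), so you keep familiar apparent sizes while gaining a much larger, more detailed image, which matches the experience described above.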
Finally, 4K is not as difficult to run if you experiment with the game settings. In probably 95% of games you can get 60+ FPS on a 1080 Ti or 2080 without reducing settings at all; only in really demanding AAA titles may you need to dial things back. 4K gets a bad rap as difficult when reviewers (or players) just set everything to ultra and assume any reduction of settings is a big visual compromise. It. Ain't.
The only game I haven't been able to run properly at 4K is the horribly optimized ARK: Survival Evolved. Everything else runs fine and looks far, far better than anything maxed out on low-res, tiny screens.
I refuse to lower my graphics options from ultra just to maintain 60-anything.
Back in the day, 2K was 2048x1536@60Hz. I used that resolution for the last 15 years.
Dunno how a 2060 would fare if you want ultra; it might be OK on high. Don't get me wrong, I am happy with what I am getting. It really depends on the game as well, and on how well it's made and coded, along with decent drivers.
The jump from 1080p was not drastic for me. And remember what I said about ReShade in another thread? You can greatly increase the clarity of a game; of course it's not on the scale of a solid 4K monitor, but it's a nice jump over a game's standard released visuals.
Same. This is how I play AC Odyssey, with a nice ReShade preset and a cool DOF effect. It may not be above 100 FPS at all times, but as long as it is visually gorgeous, with frames that are playable and not a slideshow, I'm OK.
Of course with a competitive FPS this wouldn't be the case (for you too, I think, unless you have the patience).
I've been on 1080p for ages now. It's actually OK for me, but I think 2020 will be the year I'll upgrade my rig to an AMD-based CPU with a new 27" 1440p monitor and a newer GPU. I haven't upgraded my PC in a while.
I game on an LG 52" 4k OLED panel lol.
Awesome. And you can't tell the difference between that and a 1440p display? I would assume, because of its size, you are sitting a fair distance away, and not up close as one would with a regular 1440p display. I use and game on a 40" Samsung UHD at about 24" away, and I can definitely tell the difference between it and my 1440p display.
No I'm right in front of it on my desk lol.
With the post-processing done in most games, I cannot tell much of a difference. Sure, 4K definitely has a small edge from what I can make out, but to me it's not worth the performance degradation just for the small amount of extra detail I can see.
Really? I have a relatively small 4K monitor (27"), but I was still able to make out a significant IQ difference between 2.5K and 4K (when playing a heavily modded Fallout 4). I generally agree that the performance impact isn't really worth it, though.
4K (3840x2160) at 60Hz on a 43-inch 4K TV here. I play everything on ultra with AA turned down to the lowest setting (but not off), or higher if it still maintains 60.
Too many variables really make this topic too difficult to arrive at any general consensus on. No. 1, of course, is visual acuity, which could be all over the place among this thread's participants alone.
Then we have the other variable of game settings. I think we can all agree that very demanding games have settings that cost far more performance than the visual benefit they bring, while other settings are of course worth keeping on ultra. Then other compromises factor in, i.e., how much FPS to maintain vs. chosen settings. And when does motion fluidity surpass visual appreciation, or vice versa, and which games does that apply to more than others (fast-paced or slow)?
My main argument (or preference) in terms of resolution is that it fleshes out things you can't normally see at lower resolutions (on well-sized displays). It brings me greater immersion, in the same way that watching a movie is more pleasurable on a large screen than a tiny one. And I think this aspect at least applies to the majority of people; if it didn't, they would not be building 4K TVs for mass entertainment! For PC users, it's a whole different kettle of fish. Most gamers just want to frag their opponents and win, screen res, size, etc. be damned. Then there are others who want a bit of both; these will be the most tormented lot, lol. I'm on the other end, enjoying games' aesthetics at my own slow, but highly immersed, pace. So, to each their own, basically.
I have been on 1440p for a few years now, but am starting to realize 1080p might be the way to go to maintain top FPS in fast-paced games. Still debating, but 1440p until then.
Depends on the game.
1080p and 1440p are the ones I choose most of the time.
I also use a 120-135% resolution scale in games that support it.
But I prioritize FPS over resolution any day.
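As a side note on resolution scale: most games apply the slider per axis, so the pixel count (and roughly the GPU cost) grows with the square of the factor. A minimal sketch of that arithmetic, assuming per-axis scaling and a 1440p base (both assumptions mine, not stated in the post):

```python
def scaled_resolution(width: int, height: int, scale_pct: int) -> tuple[int, int]:
    """Internal render resolution under a per-axis resolution-scale percentage."""
    return (width * scale_pct // 100, height * scale_pct // 100)

base = (2560, 1440)
for pct in (100, 120, 135):
    w, h = scaled_resolution(*base, pct)
    cost = (pct / 100) ** 2  # pixel count grows with the square of the scale
    print(f"{pct}%: renders at {w}x{h} (~{cost:.2f}x the pixels)")
```

So a 135% scale at 1440p renders nearly as many pixels as native 4K, which is why it can be a meaningful image-quality bump on games with performance headroom.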