Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Sep 4, 2019.
Intel, give us Ice Lake instead, it's time!
Yeah, I noticed the same thing with console games - NES games looked great on a TV but looked like crap on a computer monitor. I read it has something to do with the imperfections of the display (TV), but I'm not really sure. Also, real life doesn't have jaggies.
Lol, a quick web search and a PPI calculator would have shown you that at 1080p, 24in is about as big as the screen can get before I can see pixels messing up the image at desktop distance, and that works out to 91.79 PPI.
Feel free to calculate that yourself for higher resolutions and/or larger screens, to see what does and what doesn't make sense.
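If you'd rather skip the web calculator, the math is just the pixel diagonal divided by the physical diagonal. A quick Python sketch (the example sizes are mine, not from any specific calculator):

import math

def ppi(width_px, height_px, diagonal_in):
    # Pixels per inch = diagonal resolution in pixels / diagonal size in inches.
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(1920, 1080, 24), 2))   # 91.79, the figure quoted above
print(round(ppi(3840, 2160, 27), 2))   # ~163 for a 27in 4K panel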
And with 20 of my friends being medical doctors, a handful of whom I've known for 25+ years,
and having gone through the basic numbers, I know that "we" can see up to 32K.
As others said, it also depends on distance whether you can actually see the difference with real content.
Because on monitors the pixels are small enough to show all the detail,
while 720/1080 TVs have such large pixels that they're basically unable to show enough detail to reveal how crappy the game actually looks.
It's partially connected to distance too, and it's why 10in tablets need at least 1080p for readable web browsing.
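To put a number on the distance part: the textbook figure for 20/20 vision is roughly 1 arcminute of resolving power, which lets you estimate the distance beyond which individual pixels blur together. A rough Python sketch (the 1-arcminute assumption is the usual rule of thumb, not a hard limit):

import math

ARCMIN = math.radians(1 / 60)   # 1 arcminute in radians

def pixel_blend_distance_in(ppi):
    # Viewing distance (inches) at which one pixel subtends 1 arcminute;
    # sit farther than this and individual pixels stop being resolvable.
    return (1 / ppi) / math.tan(ARCMIN)

print(pixel_blend_distance_in(91.79))    # ~37in: a 24in 1080p panel still shows pixels at desk distance
print(pixel_blend_distance_in(163.18))   # ~21in: a 27in 4K panel is clean by arm's length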
Same reason I started using DSR and recording at 4/5K with ShadowPlay,
so I don't see pixels when demoing for people gaming on 4K TVs.
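For the curious, the reason downsampled footage looks cleaner is simple averaging: render (or record) at a higher resolution, then scale down with a decent filter so edge pixels get blended. A toy illustration with Pillow (filenames are placeholders, and this is only an analogy for what DSR does in the driver, not its actual implementation):

from PIL import Image

frame = Image.open("frame_5k.png")   # e.g. a 5120x2880 capture
frame.resize((3840, 2160), Image.LANCZOS).save("frame_4k.png")   # average down to 4K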
Finally .... someone!
I get that there are industries born out of irrationality and stupidity, but devil's advocate aside, entire industries aren't wasting their hard-earned money on research and development for moot points. Well, sometimes, but generally that's not fiscally responsible. With so many consumer reviews these days and the internet at everybody's fingertips, if 4K monitors had a huge placebo effect going on, it'd be a widely published news item.
Seems a bit to me that Intel is really trying to push the "look, I'm doing 5GHz on all cores" angle,
knowing about their "shortcomings" outside 720/1080 gaming and regarding cores/price,
and knowing very well that the days of increasing (max) clocks with each new cycle are numbered.
At least in the long run I don't see clocks going past 5.xx, at least for 90%+ of consumer CPUs used in desktops/laptops.
Maybe an exception here and there, when they release some platinum/titanium/iridium Pentium dual core in space grey ;-)