LG Gets Ready for 8K Quad UHD

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Apr 2, 2015.

  1. orky87

    orky87 Guest

    Messages:
    156
    Likes Received:
    0
    GPU:
    R9_Nano
    Never mind the HW, that's the least of the troubles; try delivering a 6GB/s signal to drive 8K to an average house lol..
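    For reference, a rough back-of-the-envelope check of that 6GB/s figure (a sketch assuming uncompressed 7680x4320 at 60Hz with 24-bit colour and no chroma subsampling; none of those numbers are stated in the post):

```python
# Back-of-the-envelope check of the "6GB/s" figure for uncompressed 8K video.
# Assumptions (not from the post): 7680x4320, 60 Hz, 24-bit RGB, no compression.
width, height = 7680, 4320
fps = 60
bytes_per_pixel = 3  # 24-bit RGB

bytes_per_second = width * height * fps * bytes_per_pixel
print(f"{bytes_per_second / 1e9:.2f} GB/s")        # ~5.97 GB/s
print(f"{bytes_per_second * 8 / 1e9:.1f} Gbit/s")  # ~47.8 Gbit/s
```

    For comparison, HDMI 2.0 (current at the time of the article) tops out at 18 Gbit/s, so an uncompressed 8K60 stream would need several times that link.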
     
  2. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    First of all, this is completely wrong on so many levels.
    CRT!!! You see a nearly flicker-free image at 60Hz because it uses a phosphor (a material that stays bright for a certain amount of time after being hit by the electron beam). But within 16ms (the time between refreshes at 60Hz) it dims enough for people to notice.
    "Flicker-free" CRTs ran at 90Hz+ and only got away with it thanks to that phosphor persistence.

    I can see flicker on my 120Hz screen if I enable backlight strobing.
    If I take a real stroboscope, I have no problem distinguishing between flashes at over 300Hz.

    If someone does not see distinct flashes at high frequencies, there are two possible causes:
    - The time the light-emitting material takes to go dark is longer than the delay between flashes
    - The person has some neurological disease/damage
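    A minimal sketch of that persistence argument (the decay times below are purely illustrative assumptions, not measurements): a display only looks flicker-free when the emitter keeps glowing for longer than the gap between refreshes or flashes.

```python
# Sketch: flicker is masked when persistence outlasts the gap between refreshes.
def looks_flicker_free(refresh_hz: float, persistence_ms: float) -> bool:
    gap_ms = 1000.0 / refresh_hz     # time between refreshes/flashes
    return persistence_ms >= gap_ms  # light hasn't dimmed before the next one arrives

for hz in (60, 90, 120, 300):
    print(hz, "Hz -> gap", round(1000.0 / hz, 1), "ms,",
          "masked by 5 ms persistence:", looks_flicker_free(hz, 5.0))
```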

    Edit: viz. bold text
     
    Last edited: Apr 2, 2015
  3. Extraordinary

    Extraordinary Guest

    Messages:
    19,558
    Likes Received:
    1,638
    GPU:
    ROG Strix 1080 OC


    ^^ If you can see flicker at 120Hz, and 60Hz is only almost flicker-free, how can I be wrong on so many levels when I say I remember seeing flicker at 40-50Hz?

    You just contradicted yourself
     
  4. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    I explained why people thought that 60Hz CRTs were close to "flicker free" (they used artificial pixel light persistence).

    You can read some here: http://xcorr.net/2011/11/20/whats-the-maximal-frame-rate-humans-can-perceive/

    And it does not state that 300Hz is the limit at all.
     

  5. Extraordinary

    Extraordinary Guest

    Messages:
    19,558
    Likes Received:
    1,638
    GPU:
    ROG Strix 1080 OC
    You completely read what you wanted to and failed to read my post correctly

    I was talking about STROBE LIGHTS

    Only the last sentence was about CRTs

    Read my post again, properly
     
  6. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    Then my reply was dead on. Because with your "strobe light" (read: stroboscope) you should be able to distinguish between flashes even at very high frequencies.
    Unless that strobe light uses a light-source material which takes more time to dim than the delay between strobes (as I wrote earlier).

    Industrial inspection of rotating devices is done while they are running by using high-frequency stroboscopes.
    If something rotates at 377.35 rpm, you can visually bring it to a stop by strobing at that frequency.
    Yes, you can use a fraction of the frequency and let the device do 2/3/4/5/... rotations between flashes if its rpm is in the few thousands.
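    The arithmetic behind that is simple; a small sketch (the 377.35 rpm figure is from the post above, the 6000 rpm example is just an illustration):

```python
# Strobe frequency needed to visually "freeze" a part spinning at a given rpm:
# flash once per revolution, or once every n revolutions for fast parts.
def strobe_hz(rpm: float, revolutions_per_flash: int = 1) -> float:
    return (rpm / 60.0) / revolutions_per_flash

print(strobe_hz(377.35))   # ~6.29 flashes per second, one flash per revolution
print(strobe_hz(6000, 4))  # ~25 Hz, letting a 6000 rpm part spin 4 times per flash
```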
     
  7. Seketh

    Seketh Ancient Guru

    Messages:
    1,899
    Likes Received:
    6
    GPU:
    RX 580 8GB
    Portuguese generalist TV channels still broadcast in SD (576i), even though the IPTV service comes over fibre...

    No 4K content in sight.

    Shows like Walking Dead have pretty ****ty quality for 1080p.

    So, yeah...

    I wish the whole industry moved as fast as the technology does.
     
  8. evasiondutch

    evasiondutch Guest

    Messages:
    207
    Likes Received:
    0
    GPU:
    2x-MSI gaming 290x 4gb OC
    They're just seeking new ways to get the money out of your wallet, and even if it's all bull crap there are still plenty who will buy it.

    I'll stick with my 1080p TV for a couple of years before I even consider upgrading.

    And for PC, well, I have 1440p and it looks great; my card can handle it fine, but I doubt it would at 4K, the drop in fps would be huge, so 8K is ridiculous. There are no video cards on the market that can even handle 4K at a steady 60 fps, let alone 8K.

    Just a marketing trick so people upgrade again, that's capitalism for yah.
     
  9. warezme

    warezme Master Guru

    Messages:
    237
    Likes Received:
    37
    GPU:
    Evga 970GTX Classified
    No one beyond proof of concept is ready for 8K

    Cable providers are already charging through the nose for HD content, which for the most part isn't much beyond 1080p. Movie studios aren't producing 8K media for distribution, and barely 4K, because storage starts to become iffy. ISPs would love to see full-on 4K and 8K, since bandwidth requirements would shoot up astronomically and they have a chokehold on already severely overpriced per-MB pricing.

    Even if you woke up tomorrow and suddenly ISPs were super cheap, and 4K and 8K content and storage were everywhere and media was dirt cheap, people would still have to shell out for expensive 4K and 8K screens and monitors.

    It'll happen, I imagine, just not soon, and not without ISPs and the industry getting their act together; otherwise it may never.
     
  10. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,020
    Likes Received:
    4,397
    GPU:
    Asrock 7700XT
    It isn't just a matter of how close you sit but how small the object is that you're looking at. On a phone, everything is tiny, so every increase in pixel density can be noticed. Meanwhile, my 1080p 32" display has a stuck pixel... a pink one... and I almost never notice it when watching a movie or playing a game. The finer levels of detail really don't matter unless you have something that requires that extra detail. That being said, if I'm playing an FPS and trying to snipe someone, yeah, the not-so-spectacular PPI really stands out. But in 99% of all other cases, I don't think having a higher pixel density would really improve my experience.
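    A quick sketch of that point, assuming a 32" 1080p panel and the commonly quoted ~1 arcminute acuity limit (both are assumptions for illustration, not measurements of any particular setup):

```python
# How far away does a 32" 1080p panel need to be before one pixel drops below
# the ~1 arcminute visual-acuity figure?
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    return math.hypot(width_px, height_px) / diagonal_in

def distance_for_1_arcmin(pixel_pitch_mm: float) -> float:
    # distance (mm) at which one pixel subtends 1/60 of a degree
    return pixel_pitch_mm / math.tan(math.radians(1 / 60))

density = ppi(1920, 1080, 32)   # ~69 ppi
pitch_mm = 25.4 / density       # ~0.37 mm per pixel
print(round(density, 1), "ppi")
print(round(distance_for_1_arcmin(pitch_mm) / 1000, 2), "m")  # ~1.3 m viewing distance
```

    Beyond roughly that distance, individual pixels (and a single stuck pixel) stop being resolvable.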

    From what I recall, 40FPS is the maximum number of frames human brains can process. In other words, once more frames are added, our brains can't keep up and distinguish the individual frames. You CAN see a difference between 40-60FPS, depending on what you're viewing and how the animation is rendered. But, if an animation is done properly, you can't tell the difference between something like 60FPS and 90FPS. There's a chance you could MAYBE see a difference if you have the displays side by side, and you can definitely see a difference depending on what kind of post-processing effects are (or aren't) used. To me, the main appeal of 90Hz+ monitors, or 4K+ monitors, is the ability to play almost any game and have it still look good. But to me, that's not a price I'm willing to pay for the performance I have to sacrifice.
     
    Last edited: Apr 2, 2015

  11. vbetts

    vbetts Don Vincenzo Staff Member

    Messages:
    15,140
    Likes Received:
    1,743
    GPU:
    GTX 1080 Ti
    My baseball games are in 1080p, that is all I care about.
     
  12. Iggyblack

    Iggyblack Guest

    Messages:
    4,407
    Likes Received:
    2
    GPU:
    PNY GTX 960 1330/1790
    Film is easily scanned to at least 10MP, that is, if it's shot on at least 35mm, especially considering directors probably aren't using Kodak Gold quality stock.
    Movies filmed in the IMAX format can easily go over 30MP.
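    Rough megapixel arithmetic behind those numbers; the 2K/4K sizes are the common full-aperture DPX scan dimensions, while the IMAX scan size here is only an assumed illustration:

```python
# Megapixel counts for typical film scan resolutions.
scans = {
    "35mm, 2K scan": (2048, 1556),          # full-aperture 2K DPX
    "35mm, 4K scan": (4096, 3112),          # full-aperture 4K DPX -> ~12.7 MP
    "IMAX 15/70, ~8K scan (assumed)": (8192, 5464),
}
for name, (w, h) in scans.items():
    print(f"{name}: {w * h / 1e6:.1f} MP")
```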
     
  13. Dazz

    Dazz Maha Guru

    Messages:
    1,010
    Likes Received:
    131
    GPU:
    ASUS STRIX RTX 2080
    lol, 8K Quad UHD, can the name get any longer? What's next, 16K quad quad UHD? lmfao

    Funny how they make 1080p look blurry in the demos, like they did with 480p back in the day; I don't remember 1080p getting worse over time. All these ultra-high resolutions are getting stupid: they're only really useful on massive displays, and you need to be fairly close to notice the difference anyway.
     
  14. tsunami231

    tsunami231 Ancient Guru

    Messages:
    14,750
    Likes Received:
    1,868
    GPU:
    EVGA 1070Ti Black
    That's because most stuff isn't 1080p, and even the stuff that is 1080p is still compressed.

    Uncompressed 1080p looks extremely nice, provided you're not watching it on a 60+ inch screen.

    It's great that the tech is advancing, but what's the point if most stuff is still 720p, in the case of most cable/satellite/OTA TV, with the odd 1080p channel that is horribly compressed?

    A lot of the "HD" channels I've seen on all of the above look horribly compressed and/or upscaled.

    Which raises the question: what is the point if the HD we have now is all horribly compressed or upscaled? Those higher resolutions will just make it look that much worse. Now, if that all changes and we actually get native feeds in those resolutions with no upscaling, and even better no compression, then this is all great news.
     
  15. Valken

    Valken Ancient Guru

    Messages:
    2,924
    Likes Received:
    901
    GPU:
    Forsa 1060 3GB Temp GPU
    I would totally want 8K now, after properly seeing a 60" 4K curved TV with real 4K content at store demos. It totally blows FullHD and 2560x1440 out of the water.

    While it is technically not fully realized NOW, I would predict that in about 5 years it takes its place. The same thing happened with the 720p and 1080p formats back then.

    And if Japan is really going to push 8K, it will actually benefit 4K faster, since downsampling 8K to 4K will look really good, and it will probably push 4K prices down while keeping 8K high-end until 8K transmission and media formats become available and mainstream.
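    A minimal sketch of why downsampled 8K can look so good on a 4K panel: each 4K pixel can simply be the average of a 2x2 block of 8K pixels (real scalers use better filters; this only shows the idea).

```python
# 2x2 box-filter downsample: every output pixel averages a 2x2 block of input pixels.
import numpy as np

def box_downsample_2x(frame: np.ndarray) -> np.ndarray:
    h, w, c = frame.shape  # expects even height and width
    return frame.reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3))

frame = np.random.rand(432, 768, 3)    # small stand-in; a real 8K frame would be (4320, 7680, 3)
print(box_downsample_2x(frame).shape)  # (216, 384, 3) -- 8K in would give 4K (2160, 3840, 3) out
```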

    Once we have the right content, the technology will be in demand, and then with mass adoption prices will fall. We really need that 4K/8K killer app!

    I can say now I WANT 8K, even if only for digital photography reasons. It's really amazing if you demo the right content on it.

    Those countries still slumming it at sub-1080p streaming, cable, TV or pay-per-view movie services would probably jump straight to 4K, or maybe consider 8K infrastructure instead, since that cost also allows scaling down and sets them up nearly future-proof (until 16K is released).

    It would not make any business sense to go from SD to FullHD when everyone knows better formats are in development.
     

  16. A M D BugBear

    A M D BugBear Ancient Guru

    Messages:
    4,424
    Likes Received:
    643
    GPU:
    4 GTX 970/4-Way Sli
    Oh please, Mr. Hilbert, please do 3/4 Titan Xs (overclocked) with this monitor and post some benches, oh my, JEEZUS.

    LOL, 8K would bury alive any video card we currently have.
     
  17. Neo Cyrus

    Neo Cyrus Ancient Guru

    Messages:
    10,793
    Likes Received:
    1,396
    GPU:
    黃仁勳 stole my 4090
    That link is absolute garbage and a half. It concludes that roughly 30fps, according to some questionable study done at ultra-low resolution in 2006, is good enough for FPS games. What a joke; the difference in feel between 60fps and 120 is night and day, especially in fighting games and FPS games. And yes, I know fighting games are typically capped at 60fps. It's not just feel either; obvious visual information is there, and you can see the extra frames super clearly, such as when a character turns.

    Even something as lag-infested (super high ping, no eastern server) as League of Legends benefits from high frame rates. Character movement looks choppy at 60fps versus 200+, especially when turning.

    60fps is the absolute minimum, and that crap tries to conclude that 30 is fine for humans. Maybe for 90-year-olds with serious degenerative problems, but I can consistently time my responses to within 1 frame at 60fps, both now that I have 20-30ms of total lag and before, when I had close to zero. I can see individual frames just fine at a rate of 60, and I'm not even young.
     
    Last edited: Apr 2, 2015
  18. Fox2232

    Fox2232 Guest

    Messages:
    11,808
    Likes Received:
    3,371
    GPU:
    6900XT+AW@240Hz
    It is, and it is not. They took many studies and presented their conclusions. While one concluded that the eye's nerves are quite insensitive to flicker above 60Hz, there is also the part (which most of us know to be true) about the rotating RGB colour wheels in DLP projectors, which is proof that you can actually perceive a difference even for stuff running at 300 fps.
     
  19. Neo Cyrus

    Neo Cyrus Ancient Guru

    Messages:
    10,793
    Likes Received:
    1,396
    GPU:
    黃仁勳 stole my 4090
    That article concluded 30fps is fine. Therefore the article is garbage.
     
  20. Bansaku

    Bansaku Guest

    Messages:
    159
    Likes Received:
    7
    GPU:
    Gigabyte RX Vega 64
    +1 on this and your source quote. Has anyone seen 4K in action beside a really good OLED? Most demos of 4K involve static images with very little movement (think maybe a boat on the water in front of a city scene, or a person walking in a garden). When 4K is in motion and the viewer gets immersed in the subject matter, your eyes quickly adjust and any visible gains are quickly forgotten (filtered out). 4K, or 8K for that matter, DOES look sharper and more detailed on larger displays, even in motion, but with the average display being 46" I'd rather get me an OLED.

    Anyone who claims there is a noticeable difference on cellphones/tablets needs to consider that it is their brain telling them it's better, as the human eye cannot see that fine a level of detail on such a small display.

    :banana:
     
