Official requirements for new 8K standard published

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Sep 24, 2019.

  1. Denial

    Denial Ancient Guru

    Messages:
    14,206
    Likes Received:
    4,118
    GPU:
    EVGA RTX 3080
    8K TVs are the horse that pulls the cart here. Whether you agree that it's useful or not, no one is going to film/master 8K content when there are no playback devices available.
     
    MonstroMart likes this.
  2. kakiharaFRS

    kakiharaFRS Master Guru

    Messages:
    987
    Likes Received:
    370
    GPU:
    KFA2 RTX 3090
    The only thing I know is that I'm buying the largest 4K OLED TV I can before they start switching to that idiotic 8K.

    Almost no one is filming in 4K already, FYI; it takes too much time in post-production. Just like in the video-game industry, movie studios waste time and then rush the final cut. In Avengers: Endgame the actors didn't even wear real suits, just green clothing, and their white gear was added later in post-production. With such a dumb way of making movies (are suits too complicated in 2019?), I can hardly see 8K being used for anything other than "direct to TV" productions with little post-processing.
     
  3. Deasnutz

    Deasnutz Guest

    Messages:
    174
    Likes Received:
    0
    GPU:
    Titan X 12GB
    The TVs have to come first, then disc/streaming formats, followed by everything else. They're just getting the ball rolling: as 4K TVs become ubiquitous in the market, they need a new way to sell TVs.
     
  4. FeDaYin

    FeDaYin Guest

    Messages:
    72
    Likes Received:
    10
    GPU:
    MSI GTX 780 ref.
    Current OLEDs and IPS panels (except the 8000/9000 series) use WRGB; in colour terms that's less than 3K of resolution, roughly 2880x2160. Anyway, people don't care that they pay thousands of dollars for little more than Full HD resolution. They want deep black levels, diluted colors and 150 nits of brightness. I had an LG OLED 55EC930V and was shocked at the colors that thing could produce; then I saw a C6 at a friend's and it looked AWFUL.

    https://www.hdtvtest.co.uk/news/rgbw-201510084189.htm
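
    For reference, the arithmetic behind that 2880x2160 figure (a rough sketch, assuming the RGBW stripe layout described in the linked article, where every fourth subpixel is white rather than coloured):

    Code:
    # Effective colour resolution of an RGBW panel: only 3 of every 4
    # subpixels carry colour, so horizontal colour resolution drops to
    # 3/4 of the nominal pixel count.
    nominal_w, nominal_h = 3840, 2160
    effective_w = nominal_w * 3 // 4
    print(f"Effective colour resolution: {effective_w}x{nominal_h}")
    # Effective colour resolution: 2880x2160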
     
    Last edited: Sep 25, 2019

  5. Loobyluggs

    Loobyluggs Ancient Guru

    Messages:
    5,219
    Likes Received:
    1,589
    GPU:
    RTX 3060 12GB
    That is an interesting point, yet I would add there is more than enough tech as yet unreleased by teevee manufacturers, such as:

    HDMI 2.1
    HDCP 2.2
    ALLM (Auto Low Latency Mode)
    802.11ac
    HDR
    HLG
    HDR10
    DolbyVision


    These are dispensed ad hoc by manufacturers, mostly using prior-release technology such as HDMI 1.4, and I have been most disappointed by television manufacturers not keeping their televisions up to date with the television standards across their range(s).

    Some of this can be updated via firmware, yet anything involving physical connectivity, such as HDMI, cannot be.

    When we have entire ranges using this tech, then we can all seek to find a television that suits us and will be compliant with content providers.
     
  6. Truder

    Truder Ancient Guru

    Messages:
    2,392
    Likes Received:
    1,426
    GPU:
    RX 6700XT Nitro+
    I'm surprised people are forgetting how 1080p (and 720p) were originally marketed.

    They were "FullHD" and "HD Ready" respectively.

    It's also worth noting that 4K and UHD are not the same: they are 4096x2160 and 3840x2160 respectively. The two are usually distinguished as a cinema format vs a broadcast resolution, which doesn't exactly help clarify the naming, particularly with this new standard either... (Reminds me of USB 3....)
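
    For reference, a quick sketch of the two resolutions being conflated (the pixel counts are close; the naming is the messy part):

    Code:
    # The two "4K" resolutions commonly conflated:
    formats = {
        "DCI 4K (cinema/projection)": (4096, 2160),
        "UHD (TV/broadcast)": (3840, 2160),
    }
    for name, (w, h) in formats.items():
        print(f"{name}: {w}x{h} = {w * h:,} pixels")
    # DCI 4K (cinema/projection): 4096x2160 = 8,847,360 pixels
    # UHD (TV/broadcast): 3840x2160 = 8,294,400 pixels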
     
    Fender178 likes this.
  7. Fender178

    Fender178 Ancient Guru

    Messages:
    4,194
    Likes Received:
    213
    GPU:
    GTX 1070 | GTX 1060
    Very true. I do remember 1080p being FullHD on 1080p TVs, and HD Ready being on 720p TVs as well. The 4096x2160 resolution is for movies (projectors etc.), while 3840x2160 is for TVs/monitors. Strictly, 3840x2160 is UHD; it just gets called 4K on display types that are not movie projectors.
     
  8. MonstroMart

    MonstroMart Maha Guru

    Messages:
    1,397
    Likes Received:
    878
    GPU:
    RX 6800 Red Dragon
    The requirement will be 8 x 2080 Ti in octo-SLI mode ;)
     
  9. Loobyluggs

    Loobyluggs Ancient Guru

    Messages:
    5,219
    Likes Received:
    1,589
    GPU:
    RTX 3060 12GB
    That is ever so slightly incorrect.

    FullHD included anything with 1,080 vertical lines. The reason was that some signals were interlaced and some were not, a hold-over from pre-DVD-era broadcast standards: different countries displayed the upper alternate lines first and the lower alternate lines second, because the broadcast signal could not carry a full screen every 50/60 frames per second, so it transmitted half of the screen per frame via interlacing.

    So, in truth, the 'FullHD' name meant 1,080 lines with either an "i" for interlaced or a "p" for progressive.

    The name "HD Ready" simply meant that when the broadcast standard was in HD (720 lines), the television had the screen resolution to support it; it was 'ready' for when that day arrived. This also came in "i" and "p" variants, and even to this day, broadcast television is still 720 lines.

    As a side note, some "HD Ready" televisions actually used a resolution of 1366×768, a cheap hack using stock PC TFT LCD panels to push "HD Ready" televisions onto unsuspecting purchasers: a slightly higher resolution, downsampled into 720, and of course, being TFT LCD, it looked utterly dreadful. Yet at least the Joneses were impressed... "look darling, it has got HD on the side of it!"

    I only mention it because this type of conversation is being had right now, today, as we read and type this: "look darling, it has got 4K written on the side of it!"

    /faceplam? Facepalm.

     
  10. Truder

    Truder Ancient Guru

    Messages:
    2,392
    Likes Received:
    1,426
    GPU:
    RX 6700XT Nitro+
    Indeed, very true, though I only wanted to illustrate the terms being used relative to their resolutions, since "1080p" was never the marketing term. Interlaced and progressive systems were common back then due to the transmission mediums in use (many setups ran over analogue component cables, for example, and of course FullHD television today is still typically broadcast in 1080i). This time around with "4K" we do have separate marketing terms, in that products can be labelled 4K, UHD, or both. Nevertheless, it undermines clarity, just as you've demonstrated with how the original "HD" resolutions also had misleading or inaccurate details in their marketing.

    Edit: It's always worth remembering that marketing never truly portrays the full details of a product, typically just the ideal features.
     
    Last edited: Sep 25, 2019

  11. Loobyluggs

    Loobyluggs Ancient Guru

    Messages:
    5,219
    Likes Received:
    1,589
    GPU:
    RTX 3060 12GB
    As David Packard once said:

    [image: quote attributed to David Packard]
     
    Undying likes this.
  12. craycray

    craycray Member Guru

    Messages:
    168
    Likes Received:
    43
    GPU:
    3080 Gaming X Trio

    Wasn't this fixed in later revisions? That article is about 5 years old now, which is 4 TV generations ago.
     
  13. Mufflore

    Mufflore Ancient Guru

    Messages:
    14,691
    Likes Received:
    2,671
    GPU:
    Aorus 3090 Xtreme
    I don't know about the resolution he mentioned being correct, as every single pixel is WRGB; there is no loss of pixels due to WRGB.

    OLEDs, though, are still using the white subpixel to exceed 400 nits of brightness for HDR, and even when doing so, ABL kicks in to dim the image.
    The additional white OLED reduces colour saturation/colour volume, making the image look more washed out, and the lack of overall brightness makes HDR much darker than it should be as well.
    It is not the best way to watch HDR.

    This is why I also hope uLED (MicroLED) comes to the rescue, for PC use's sake, to give us the best parts of OLED and QLED.
    Samsung's latest venture with an OLED mix could be a good stepping stone.
     
    Last edited: Oct 19, 2019
    craycray likes this.
  14. Neo Cyrus

    Neo Cyrus Ancient Guru

    Messages:
    10,780
    Likes Received:
    1,393
    GPU:
    黃仁勳 stole my 4090
    Still waiting for higher-than-1080p Netflix content to actually be available... and for them to stop stonewalling their users with stupid DRM, so PC users can actually view higher than 1080p content, and higher than 720p in browsers less garbage than Edge or their crappy, crappy, absolutely crap desktop app.
     
  15. 0blivious

    0blivious Ancient Guru

    Messages:
    3,301
    Likes Received:
    824
    GPU:
    7800 XT / 5700 XT
    1080p:  2,073,600 pixels
    4K:     8,294,400 pixels
    8K:    33,177,600 pixels

    That is a lot of pixels to bring a GPU to its knees, even in a few years. Assuming we go there, which we will.
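
    The figures check out; a quick sketch of the arithmetic and the load relative to 1080p:

    Code:
    # Pixel counts per resolution, and the rendering load relative to 1080p.
    resolutions = {"1080p": (1920, 1080), "4K": (3840, 2160), "8K": (7680, 4320)}

    base = 1920 * 1080
    for name, (w, h) in resolutions.items():
        pixels = w * h
        print(f"{name}: {pixels:>10,} pixels ({pixels // base}x 1080p)")
    # 1080p:  2,073,600 pixels (1x 1080p)
    # 4K:     8,294,400 pixels (4x 1080p)
    # 8K:    33,177,600 pixels (16x 1080p)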
     

  16. Mufflore

    Mufflore Ancient Guru

    Messages:
    14,691
    Likes Received:
    2,671
    GPU:
    Aorus 3090 Xtreme
    It will be more than a few years before 8K is manageable, except on high-end multi-card systems, and that's assuming games can use them.
    It's made worse when the focus is on spending more silicon on Ray Tracing, which at the moment brings resolution down for most people and definitely reduces frame rate.
     
  17. wavetrex

    wavetrex Ancient Guru

    Messages:
    2,449
    Likes Received:
    2,542
    GPU:
    TUF 6800XT OC
    I think eye-tracking will be the norm before 8K computer displays and TVs are commonplace.

    A GPU only needs to render in high detail at the location you're looking at; everything else can run at a 20% shading rate or less...
    This should have been done for 4K too, but I guess the tech wasn't ready.

    For 8K it will be pretty much mandatory... HFR at 8K with 100% render coverage of all 33 million pixels is near impossible (and also completely wasteful) until we get to carbon nanotube transistors or something like that.
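
    A back-of-the-envelope sketch of the saving (the 5% foveal area and 20% peripheral shading rate below are illustrative assumptions, not figures from any shipping implementation):

    Code:
    # Rough estimate of shading work saved by eye-tracked (foveated)
    # rendering at 8K, under assumed (not measured) parameters.
    total_pixels = 7680 * 4320   # 33,177,600
    fovea_fraction = 0.05        # assumed share of screen rendered at full detail
    peripheral_rate = 0.20       # assumed shading rate everywhere else

    shaded = total_pixels * (fovea_fraction + (1 - fovea_fraction) * peripheral_rate)
    print(f"{shaded:,.0f} of {total_pixels:,} pixels shaded "
          f"({shaded / total_pixels:.0%} of the full-rate work)")
    # 7,962,624 of 33,177,600 pixels shaded (24% of the full-rate work)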
     
  18. alanm

    alanm Ancient Guru

    Messages:
    12,232
    Likes Received:
    4,435
    GPU:
    RTX 4080
    I doubt there will ever be a PC or gaming market for 8K, since you would need massive screens to tell the difference vs 4K.
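
    The claim follows from visual-acuity arithmetic; a rough sketch, assuming the commonly cited ~60 pixels-per-degree limit of 20/20 vision and a 16:9 panel:

    Code:
    import math

    # At ~60 pixels per degree, finer pixels are indistinguishable to the eye.
    # Find how close you must sit before a 4K panel of a given size stops
    # out-resolving your vision -- only inside that distance can 8K add anything.
    ACUITY_PPD = 60  # assumed acuity threshold, pixels per degree

    def max_useful_distance_m(diagonal_in, horizontal_px):
        width_m = diagonal_in * 0.0254 * 16 / math.hypot(16, 9)  # 16:9 width
        pixel_m = width_m / horizontal_px
        # Distance at which one pixel subtends 1/60 of a degree:
        return pixel_m / math.tan(math.radians(1 / ACUITY_PPD))

    for size in (55, 65, 85):
        print(f'{size}" 4K: 8K only helps closer than '
              f"{max_useful_distance_m(size, 3840):.1f} m")
    # 55" 4K: 8K only helps closer than 1.1 m
    # 65" 4K: 8K only helps closer than 1.3 m
    # 85" 4K: 8K only helps closer than 1.7 m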
     
    Deleted member 271771 likes this.
  19. Mufflore

    Mufflore Ancient Guru

    Messages:
    14,691
    Likes Received:
    2,671
    GPU:
    Aorus 3090 Xtreme
    Only if you lead a very lonely life or are wearing a VR headset; other people watching would have a crap time.
    It also depends how far away you sit from the display (as well as its size): sitting further away puts more of the screen in your central view.
    I don't see this becoming the norm for PC displays.
     
  20. wavetrex

    wavetrex Ancient Guru

    Messages:
    2,449
    Likes Received:
    2,542
    GPU:
    TUF 6800XT OC
    People gaming on their PC are usually gaming by themselves, without somebody watching them play...

    Don't confuse "being alone" with "gaming alone" (which might not even be alone, but with 99 other people on a giant map that keeps shrinking, or something...)

    The point was: our biological vision can't take in 33 megapixels of razor-sharp image everywhere at once. That much detail is not needed.
     
