ASUS ROG Announces World's First HDMI 2.1-Certified 4K 120Hz Gaming Monitor

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Aug 5, 2020.

  1. Agonist

    Agonist Ancient Guru

    Messages:
    4,284
    Likes Received:
    1,312
    GPU:
    XFX 7900xtx Black
    I'll take any IPS or VA 21:9 or 32:9 over the best possible 16:9 monitor. 4K is overrated, and 16:9 is so 2006.
     
  2. Size_Mick

    Size_Mick Master Guru

    Messages:
    630
    Likes Received:
    463
    GPU:
    Asus GTX 1070 8GB
    I wonder what CRT monitors would be like if they were still making them...

    I'm guessing they'd look fantastic, but weigh 2500 pounds and heat your entire house through the winter. Oh, and take up half of the living room.

    Still, it'd be cool if someone could make use of CRT technology but in some new form that was low power and lightweight and didn't have to be the same depth as the width.

    Seriously though, for some time I've wondered why we don't just move beyond the whole scanline thing for making images. With all digital equipment and a modern display panel, they should be able to get rid of "frames" entirely and just update individual pixels as they need to. Then we wouldn't need to deal with things like refresh rates at all.
     
    Last edited: Aug 6, 2020
    DannyD and ZXRaziel like this.
  3. kakiharaFRS

    kakiharaFRS Master Guru

    Messages:
    987
    Likes Received:
    370
    GPU:
    KFA2 RTX 3090
    "Gaming" and VA should not be allowed on the same package; it's misleading, not to say a scam.
    VA cannot handle movement at all, not even a mouse cursor. Great panels for a digital picture frame, not much more.

    @Size_Mick In 2000 or so I plugged in my old 19" CRT and it was awful. We all forgot how blurry they were. Super mega blurry.
     
  4. Michael Lang

    Michael Lang Active Member

    Messages:
    74
    Likes Received:
    31
    GPU:
    Suprim Liquid x4090
    VA panels have come a long way over the years, like the latest Samsung G7 and G9 monitors @240Hz. I bought two of the 32" G7s for my son and they are great monitors. I also have two of the Samsung CRG9 49" 120Hz monitors, with two of the new 49" G9s on preorder. I also have a 38" IPS right next to the 49" VA and honestly don't miss using the 38" IPS for gaming at all. I know, to each his own, but just because it's VA doesn't mean it's crap, and a hint of that is all the new VA panels coming out. The last 43" 4K Asus did was way overpriced and I'm sure this one will be as well. To say VA can't even handle a mouse cursor is a joke; the VA you're referring to must be as old as that CRT screen. I have been to 4K gaming and can't say I miss it either after going wide and super ultrawide. I would honestly like to see a Super Ultra Wide equal to two 32" 2560x1440p @240Hz or something, instead of two 27". Anyways, to each his own...
     
    Last edited: Aug 6, 2020

  5. asturur

    asturur Maha Guru

    Messages:
    1,372
    Likes Received:
    503
    GPU:
    Geforce Gtx 1080TI
    I remember them being sharper than the first LCDs over a VGA cable.
    DVI changed the game entirely.
     
    Exodite likes this.
  6. asturur

    asturur Maha Guru

    Messages:
    1,372
    Likes Received:
    503
    GPU:
    Geforce Gtx 1080TI
    I think 1440p on a 4K screen is super blurry.
     
  7. Size_Mick

    Size_Mick Master Guru

    Messages:
    630
    Likes Received:
    463
    GPU:
    Asus GTX 1070 8GB
    Your focus may have been thrown out of whack. I know they aren't as sharp as a good LCD (especially 4K), but "super mega blurry" sounds like it perhaps needed some adjustment. I do know that shadow mask tubes usually aren't as sharp as aperture grille tubes. But aperture grille tubes often have visible horizontal support lines in them, which is a bummer.

    I don't know how small they can make the individual pixels on a CRT, or if there are limits. But they can certainly make a huge tube, and maybe even one with a wide aspect ratio. It would just need as much power as fifteen or twenty comparable LCD panels. What I was really thinking was: if they could somehow make a lot of *small* CRTs (I mean really, really small, like millimeters) and array them into one big display, that'd be pretty neat. Of course this is all just random speculation from a truly bored human being with lots of idle time on his hands, hence the pointless nature of my observations in the first place.
     
    Solfaur likes this.
  8. ZXRaziel

    ZXRaziel Master Guru

    Messages:
    425
    Likes Received:
    134
    GPU:
    Nvidia
    I wish that CRTs were still available. I know the tech was bulky and heavy, but they could have been much better by now if they hadn't been forgotten by most.
    I will go IPS next, as I like nice colours and viewing angles. I know they are not perfect, but that is the state of current monitors, sadly.
     
    kokotas likes this.
  9. ZXRaziel

    ZXRaziel Master Guru

    Messages:
    425
    Likes Received:
    134
    GPU:
    Nvidia
    I am certain the tech would have improved over the last 15 years; weight and size were the biggest problems, and 4K was possible, IMO. It's just the motion blur on all the LCD types of displays that looks bad to me. It's much better now, but still there even on high-refresh displays, sadly.
     
  10. Motion blur is, where it occurs, more noticeable on high-refresh-rate LCDs, unfortunately, if you get what I mean.

    EDIT: I'll qualify that by adding: great strides have been made.
     
    Last edited by a moderator: Aug 6, 2020

  11. tsunami231

    tsunami231 Ancient Guru

    Messages:
    14,725
    Likes Received:
    1,855
    GPU:
    EVGA 1070Ti Black
    27"? I'm interested, but I need specs beyond what was mentioned.

    Motion blur on these fixed panels is atrocious to me. Better than it was, but still nothing compared to CRT; no one will change my mind on that,

    and those 120/240Hz panels don't come close, IMO.
     
    Last edited: Aug 6, 2020
  12. MegaFalloutFan

    MegaFalloutFan Maha Guru

    Messages:
    1,048
    Likes Received:
    203
    GPU:
    RTX4090 24Gb
    Nice but useless. We have 48-inch OLEDs now, and I'm doing fine with 55, actually; you just need to get used to it.

    Samsung propaganda, nothing more, nothing less.

    I've been on OLED as a PC monitor since 2016, as have many other people. If you use it right, there won't be any burn-in.
     
  13. itpro

    itpro Maha Guru

    Messages:
    1,364
    Likes Received:
    735
    GPU:
    AMD Testing
    Yeah, OK, sure. Samsung is a global leader; they don't need propaganda to sell products. Search Google for the hundreds of burn-ins reported since OLED went public for the masses. I am a real working consumer, not a bot. Bots stay mindful of their usage and don't care about the money they spent. I can't lose my time policing content, minutes and hours, just to avoid leaving a softly printed image on my panel. QLED is better; I own both OLED and QLED. Really, what does LG have to offer beyond Dolby Atmos and Dolby Vision? Tell us. Did you already forget the problems plasma TVs had in the past? Stop the LG propaganda. They could easily give 5-10 year guarantees, yet nobody is left unhurt, with burn-in striking randomly anywhere from 0.5 to 5 years in. OLED users are not the TV master race but the testing race.
     
  14. MegaFalloutFan

    MegaFalloutFan Maha Guru

    Messages:
    1,048
    Likes Received:
    203
    GPU:
    RTX4090 24Gb

    What are you talking about? Samsung is the worst offender in the corporate wars. They don't have OLED TVs, yet they make videos about burn-in. Or how about all their Apple-attacking ads, or Sony vs. MS?
    Corporations always spread misinformation about rivals.

    OLED burn-in is a myth spread by Samsung corporate shills. I had a C6 from 2016 to 2019 working all day long as a PC monitor, showing web browsers and tabs; then I got a C9.
    RTINGS has a multi-year burn-in test; google "Real Life OLED Burn-In Test on 6 TVs". Those are extreme-case tests.
    Only the first-gen 2016 B6 OLEDs that had been running 20 hours a day on patterns with a lot of static content developed burn-in.
    The 2017 C7 TVs running varied content that switches between movies, news, sports, etc. have no burn-in.
    We are now at the CX generation, three generations of upgrades and anti-burn-in development beyond the C7, so if the C7 has no issues, then the C9 and CX for SURE won't have anything.

    I don't do anything on my TV to avoid burn-in. I have a Reading preset for when I do PC stuff [not video or games] with the OLED light at 30.
    For movies and games I use the Movie and Game presets with the OLED light at 100%, same for HDR.

    I don't care about the past; I never had a plasma. The past is past, it's irrelevant. My C9 has HDMI 2.1, and right now, as I type this, my desktop is set to 4K/120Hz + 10-bit + RGB/Full.
    I have G-Sync, I have Auto Low Latency Mode.

    This TV has every gaming feature of a top-end PC monitor, and I don't need to squint at a tiny 32-incher or a desk-eating ultrawide that's only as tall as a 27-inch. I tried that for a week with the ROG SWIFT PG35VQ, then sent it back to Newegg.
    And that monitor cost me 50% MORE than I paid for my 55-inch C9, for a tiny VA monitor with G-Sync Ultimate and 512 FALD zones.
     
  15. kokotas

    kokotas Member

    Messages:
    11
    Likes Received:
    4
    GPU:
    gtx 690
    Could be; I don't have the evidence to back it up. That said, I have burn-in on my own three-year-old Asus ROG Swift TN panel, and I have seen burn-in on my friend's recently purchased 65" LG OLED TV, maybe three months old. I'm sure other people could get technical as to why OLEDs are more prone to this.
     

  16. Reddoguk

    Reddoguk Ancient Guru

    Messages:
    2,661
    Likes Received:
    594
    GPU:
    RTX3090 GB GamingOC
    Forget kidneys, you're gonna lose an arm and a leg on this "TV". ^^
     
  17. Sitting here wondering how much the new console/hardware generation will saturate these high-refresh displays. Or just existing ones in general, for that matter. Granted, graphical settings on PC have always affected that; consoles have been more limited in that arena...
     
  18. Mufflore

    Mufflore Ancient Guru

    Messages:
    14,691
    Likes Received:
    2,673
    GPU:
    Aorus 3090 Xtreme
    LG won't provide any warranty for burn-in with OLED, which makes it a non-starter for me.

    Another issue is the max brightness OLED can achieve.
    It's pretty low unless they use an extra white OLED subpixel to boost brightness, which reduces colour volume.
    Even with that, it's not bright enough, IMO.
    I much prefer 2000-nit HDR and would like even brighter, but I'm waiting for it to reach mainstream pricing.
     
    Deleted member 213629 likes this.
  19. yasamoka

    yasamoka Ancient Guru

    Messages:
    4,875
    Likes Received:
    259
    GPU:
    Zotac RTX 3090
    1) Pixel luminance decays in LCD displays and needs to be refreshed periodically, even for a static image.
    2) Addressing individual pixels requires sending that pixel's address in addition to the (existing) 8 bits per color channel (at least). Even on a 1920x1080 display, you need 21 bits for the pixel index (1920x1080 = 2,073,600, which is between 2^20 and 2^21). That's a total of 21 + 24 = 45 bits per pixel instead of 24 bits per pixel. There goes nearly half your bandwidth if you need to refresh most or all of the pixels most of the time, and now you can't hit the high refresh rates you wanted pixel addressing to enable, so the advantage is lost.

    That's aside from the difficulties (in terms of performance mostly) in keeping track of which pixels exactly changed and which stayed the same - given the asynchronous nature of GPU workloads on massive amounts of data.

    It's just not worth it.
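
    The overhead arithmetic above can be sketched in a few lines. This is purely illustrative; `addressed_pixel_bits` is a made-up helper, and real display links add their own framing on top of the raw per-pixel payload.

    ```python
    import math

    def addressed_pixel_bits(width, height, color_bits=24):
        """Bits per pixel update if each update must carry the pixel's index."""
        index_bits = math.ceil(math.log2(width * height))  # 21 bits at 1080p
        return index_bits + color_bits

    # For a mostly full-screen update, the index is pure overhead vs. plain scanout.
    per_pixel = addressed_pixel_bits(1920, 1080)  # 21 + 24 = 45 bits
    overhead = per_pixel / 24                     # 1.875x the bandwidth of 24bpp
    ```

    At 4K the index grows to 24 bits (3840x2160 = 8,294,400, just under 2^23? no: between 2^22 and 2^23, so 23 bits), and the overhead stays close to 2x, which is the point: per-pixel addressing only pays off when a small fraction of the screen changes per update.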
     
  20. Mufflore

    Mufflore Ancient Guru

    Messages:
    14,691
    Likes Received:
    2,673
    GPU:
    Aorus 3090 Xtreme
    Pssst.
    You mean CRT :)
     
