OLED, QD-OLED, QDEL + Mini, MLA & microLED -> Monitors Super Thread

Discussion in 'Computer Monitor Forum' started by OnnA, Apr 8, 2022.

  1. DirtyDee

    DirtyDee Master Guru

    Messages:
    540
    Likes Received:
    297
    GPU:
    MSI RTX 4090 Gaming
    DSR at 4K won't look as good as native 4K, but in-game it is an improvement over a lower native resolution. Back when I had a native 1440p 32" I'd enable 4K through DSR; this sharpened the games' image quality by rendering at 4K and scaling back down to the panel. As far as desktop use goes, text clarity with DSR isn't great as it's either pixelated or blurry.
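    The desktop blur makes sense when you look at the scale factor; a toy calculation (just the standard resolutions, nothing from NVIDIA's docs):

    Code:
    # 4K DSR on a 1440p panel: the per-axis scale factor is 1.5x, not an
    # integer, so a rendered pixel straddles physical pixels and text ends
    # up either filtered (blurry) or snapped (pixelated), while 3D scenes
    # still benefit from the extra samples.
    native = (2560, 1440)
    dsr = (3840, 2160)

    sx = dsr[0] / native[0]
    sy = dsr[1] / native[1]
    print(f"{sx}x per axis -> {sx * sy:.2f}x total pixels (the '2.25x' DSR factor)")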
     
    kx11 likes this.
  2. Darren Hodgson

    Darren Hodgson Ancient Guru

    Messages:
    17,284
    Likes Received:
    1,593
    GPU:
    NVIDIA RTX 4080 FE
    Be careful in your choice of 27" QD-OLED or OLED monitor if you want to do this, because DSR is not supported on many models: they have no option to disable DSC (Display Stream Compression), which is needed to output 1440p at 240 Hz and 360 Hz on these monitors.

    For example, my own Alienware AW2725DF QD-OLED, which I updated to firmware M3B103 over the weekend, does not support NVIDIA's DSR even though the new firmware seems to disable DSC at 120 Hz and lower refresh rates, at least according to the monitor menu's own Display Info stats, which show DSC in use at 360 Hz but not at 120 Hz. Even after restarting my PC, the NVIDIA DSR options did not appear, and there is no option on the monitor itself to disable/enable DSC manually.

    This is with DisplayPort 1.4, by the way. It may work over HDMI, but the AW2725DF is limited to HDMI 2.0 bandwidth, so the maximum refresh rate available there is only 120 Hz anyway. 360 Hz is only available on its two DisplayPort connections.
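    For what it's worth, the bandwidth arithmetic behind the DSC requirement is easy to sanity-check. A minimal sketch, assuming 10-bit RGB and roughly 8% blanking overhead (my assumptions, not figures from Dell's spec sheet):

    Code:
    # Uncompressed bandwidth for 2560x1440 vs. what DP 1.4 can carry.
    DP14_EFFECTIVE_GBPS = 25.92  # HBR3: 32.4 Gbit/s raw minus 8b/10b coding

    def required_gbps(width, height, hz, bpp, blanking=1.08):
        """Approximate uncompressed link rate in Gbit/s."""
        return width * height * hz * bpp * blanking / 1e9

    for hz in (120, 240, 360):
        need = required_gbps(2560, 1440, hz, 30)  # 30 bpp = 10-bit RGB
        verdict = "fits uncompressed" if need <= DP14_EFFECTIVE_GBPS else "needs DSC"
        print(f"{hz:3d} Hz: {need:5.1f} Gbit/s -> {verdict}")

    That works out to roughly 14 Gbit/s at 120 Hz (fits), 29 Gbit/s at 240 Hz and 43 Gbit/s at 360 Hz (both need DSC), which lines up with the firmware reporting DSC off at 120 Hz and on at 360 Hz.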
     
    Last edited: Apr 15, 2024
    tunejunky and Trunks0 like this.
  3. Trunks0

    Trunks0 Maha Guru

    Messages:
    1,389
    Likes Received:
    890
    GPU:
    PC RedDevil 7900XTX
    OOoooo, an oddball requirement I was completely unaware of. Good to know!

    A little googling suggests VSR on Radeons does not have the same requirement. But this is something we should all keep in mind if banking on downscaling.
     
    tunejunky likes this.
  4. Darren Hodgson

    Darren Hodgson Ancient Guru

    Messages:
    17,284
    Likes Received:
    1,593
    GPU:
    NVIDIA RTX 4080 FE
    I've read that even if the monitor allows DSC to be disabled, it doesn't guarantee that DSR will work, even at lower refresh rates. I mean, technically DSR should be working on my monitor, as 120 Hz does not appear to be using DSC, but the DSR options just do not appear at all in the NVIDIA Control Panel, at least not over DisplayPort (1.4).

    Unfortunately, DSR isn't something professional reviewers test in their reviews, so it is not easy to find out whether a specific monitor supports it unless you ask on a forum and get a reply from someone who owns the monitor you are interested in.

    The single most annoying thing about the AW2725DF for me, despite it being an otherwise superb monitor, is the VRR flickering when using G-SYNC. It is barely covered by professional reviews, if at all; at best most will state whether the monitor supports G-SYNC and/or FreeSync but rarely mention how those actually perform in games. Dragon's Dogma 2 is especially bad with VRR flicker as it is a dark game with highly variable framerates. Not only is there flickering in the loading screens and menus but also in the game itself, which has caused eye strain for me. Even if I cap the refresh rate at 60 Hz the flickering is just as bad. Unfortunately, as I've found out, this is a very common issue with OLED and QD-OLED screens, yet few reviews bother to mention it.
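    As I understand it (this is the usual explanation, not something I have measured), under VRR the panel's refresh rate tracks each frametime, and OLED brightness/gamma near black shifts slightly with refresh rate, so frametime spikes become visible brightness steps. A toy illustration with made-up frametimes:

    Code:
    # Toy numbers only: map hypothetical frametime spikes to the refresh
    # rates a VRR panel would follow. Each big jump in Hz is a potential
    # gamma/brightness step the eye can catch in dark scenes.
    frametimes_ms = [8.3, 8.5, 16.9, 9.1, 25.0, 8.7]  # e.g. a loading-screen stutter

    for ft in frametimes_ms:
        print(f"{ft:5.1f} ms frame -> panel refresh ~{1000.0 / ft:5.1f} Hz")

    That would also explain why my 60 Hz cap didn't help: the frametimes still vary underneath the cap, so the refresh rate does too.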
     

  5. OnnA

    OnnA Ancient Guru

    Messages:
    18,511
    Likes Received:
    7,253
    GPU:
    TiTan RTX Ampere UV
    650 USD for an OLED :) (2K)

     
    tunejunky likes this.
  6. Lurk

    Lurk Master Guru

    Messages:
    300
    Likes Received:
    72
    GPU:
    PNY RTX4080 VertoOC
    Dell just released new firmware for the AW3423DWF. Interesting release notes, including another fix for 'HDR behavior' (which looked fine to me, btw).
     
    AuerX likes this.
  7. AuerX

    AuerX Ancient Guru

    Messages:
    3,154
    Likes Received:
    3,163
    GPU:
    Moravec's paradox
    BlackFireHawk likes this.
  8. bballfreak6

    bballfreak6 Ancient Guru

    Messages:
    1,928
    Likes Received:
    497
    GPU:
    MSI RTX 4080
    Argh, I only just updated to the 106 firmware a few weeks ago haha. I wonder if the HDR behaviour fix was targeted at AMD GPUs; from memory, the original HDR fix only applied to NVIDIA cards and not AMD.
     
  9. tunejunky

    tunejunky Ancient Guru

    Messages:
    4,940
    Likes Received:
    3,601
    GPU:
    7900xtx/7900xt
    grain of salt...
    but 1st, i own a hell of a lot of Asus gear and have LG as well. i also have an LG C2 65" and a ROG PG42 (same gen, same panel).

    a really big grain of salt is me saying the quiet part out loud - Asus owners tend to be snobby.
    for an OLED, Asus do have a unique position: they're the only brand flogging both W-OLED and QD-OLED, and they add a really, really nice passive heatsink to both for around a +$200 premium over the LG. for QD-OLED the "deal" is better, as the cost of the heatsink doesn't factor in - the premium for QD-OLED takes its place. that said, the ROG QD-OLEDs are the best of all the QD-OLEDs and will have a longer lifespan (QD is still more susceptible to burn-in and brightness loss over time; the ROG heatsink is worth every penny in this case). the ROG QD-OLED is even price-competitive w/ Alienware, Samsung, and others. note: some other brands have active (fan) cooling and some do nothing extra at all.

    until this generation it was generally true that W-OLED was more accurate but dimmer, while QD-OLED had more color volume but overdrove the reds. this is no longer true. all current-gen LG-based panels have MLA and are just as bright or brighter, and QD-OLED has gotten much better in color accuracy - this gen actually shows subtlety in the red spectrum. but don't worry, the colors still pop.

    however, what you probably "hear" about LG vs ROG is based on the anti-image-retention settings.
    with both you can control the settings; the ROG has a "widget" for the desktop, which is nice, but ROG also ships a lot of desktop bloatware.
    the LG can be a little bit annoying if you're not used to it, but its anti-image-retention is the industry's best. right now the only irritating thing about the LG is a firmware issue, soon to be fixed, that queries whether you're at native resolution...but that goes away
     
  10. Darren Hodgson

    Darren Hodgson Ancient Guru

    Messages:
    17,284
    Likes Received:
    1,593
    GPU:
    NVIDIA RTX 4080 FE
    Nice to see that RTINGS.com have updated their testing suite to finally include proper testing for VRR, including flickering, on monitors. They scored the AW2725DF 5.3 out of 10 for VRR Flickering which I’d agree with based on my own disappointing experiences with it. This is admittedly coming from a perfect flicker-free G-SYNC experience on an ASUS PG279Q IPS monitor for 7 years but, even so, the way VRR works on this monitor is easily its weakest point by far. It’s mediocre at best with flickering occurring to varying degrees in almost every game I have personally tested (which is around 100 so far with Dragon’s Dogma 2 being notably bad in my opinion).

    https://www.rtings.com/monitor/reviews/dell/alienware-aw2725df

    Also, the AW3225QF gets just 5.5 out of 10 for the same feature, which finally answers my question of what NVIDIA’s G-SYNC Certification actually means on these displays. It means absolutely nothing: the AW3225QF is certified (it even has the sticker on the stand, I believe) while the AW2725DF is not, yet they both flicker and get essentially the same score (I put the 0.2 difference down to random variance)! NVIDIA even claim, if I recall correctly, that the certification includes testing for flicker!!!

    So personally, knowing what I know now, I would say to anyone looking for a good VRR experience in their games: if you are sensitive to or annoyed by flickering, avoid OLED and QD-OLED displays and look elsewhere…
     
    Last edited: Apr 20, 2024

  11. OnnA

    OnnA Ancient Guru

    Messages:
    18,511
    Likes Received:
    7,253
    GPU:
    TiTan RTX Ampere UV
    ^^ No - Pick OLED and do Our magic Tweaks -> Great performance so far (it's almost a Year now BTW :D)
    Pure IQ, just like with my second great 10-bit IPS - no flicker at all.

    Note:
    OLED is just too good to pass by IMHO. We are enthusiasts; we do tweaks to get the best out of PC gaming.
     
    tunejunky likes this.
  12. NvidiaFreak650

    NvidiaFreak650 Master Guru

    Messages:
    755
    Likes Received:
    683
    GPU:
    Nvidia RTX 4080 FE
    After years of using TN, IPS and VA panels I finally switched to OLED. Using an LG C3 48" as a PC monitor is just awesome, and the HDR is top of its class even at 800 nits: no more IPS/VA corner glow, no more blooming, no more high input lag from local dimming zone settings.
     
    tunejunky and OnnA like this.
  13. OnnA

    OnnA Ancient Guru

    Messages:
    18,511
    Likes Received:
    7,253
    GPU:
    TiTan RTX Ampere UV
    OLED is the future, I'm 100% sure of that :p
     
    tunejunky likes this.
  14. OnnA

    OnnA Ancient Guru

    Messages:
    18,511
    Likes Received:
    7,253
    GPU:
    TiTan RTX Ampere UV
     
    tunejunky likes this.
  15. tunejunky

    tunejunky Ancient Guru

    Messages:
    4,940
    Likes Received:
    3,601
    GPU:
    7900xtx/7900xt
    so while i'm waiting on my tax return (in the US, April 15 is tax day) i've been shopping for a replacement for the monitor in my remote system (200 miles north, across the street from my mother). because this is in the outer fringes of wine country there's not a lot to do once i'm finished with the daily visit (3 weekends a month), as there's only one crappy movie theater and zero nightlife.
    so this monitor gets a lot of work for a 3rd system, and all i do on it is gaming and media content. if work intrudes i just use the laptop at my mom's (just to show her i'm actually busy but take time out for her).
    but this remote rig is getting too old.
    my laptop budget usually runs $2k-2.5k, but because of the massive gulf in performance i'm building a dedicated WQHD system in a "lan box" style.
    which brings me to the topic of this thread:
    the ability to build a high refresh WQHD rig with an OLED monitor for less than my typical laptop.

    the goal is the WQHD OLED so some compromises were made to fit into the budget.
    what i'm buying / have bought: Ryzen 5700X3D, Aorus Pro B550 ITX, Radeon RX 7900 GRE, Hyte Revolt 3 (w/ 700 W PSU), TeamGroup 32GB RAM @ 3600 MHz, Silicon Power 4TB PCIe 4.0, ID-Cooling Zoomflow 280mm AIO, AND the Philips Evnia 34M2C8600 QD-OLED (same panel as the Alienware)
    all of that is less than $2,300, and a similar prebuilt is $3k-ish without a monitor. to get a comparable laptop one would have to buy a 4080m laptop with less performance for more money, or a 4090m laptop with better performance for over $3,400

    when the Evnia arrives i will share...but i love the "ambilight" feature of the rear LEDs...it should mean less eye strain in long sessions
     

  16. DirtyDee

    DirtyDee Master Guru

    Messages:
    540
    Likes Received:
    297
    GPU:
    MSI RTX 4090 Gaming
    Everyone's been ragging on LG for using a semi-gloss screen with an anti-glare coating, yet surprisingly it isn't as bad as they made it out to be. For me, it's a significant improvement over the LG 27GR95QE-B that it replaced. It's brighter, and the anti-glare coating doesn't look as "dirty" on white & light colors. Honestly, it's not even that noticeable unless you're staring at one spot while scrolling a web page/text document. Text clarity is also fine, no fringing at all (unless maybe you're looking at things under a microscope).

    Anyway, I think I might just keep it as it suits my needs (32" 4K 240Hz OLED) and it doesn't flicker while using VRR.
     
    Last edited: Apr 20, 2024
    tunejunky likes this.
  17. tunejunky

    tunejunky Ancient Guru

    Messages:
    4,940
    Likes Received:
    3,601
    GPU:
    7900xtx/7900xt
    yeah, i understand it's a personal preference, but the only place a glossy monitor is suitable for me is where my 65" tv is.
    my house is an updated mid-(20th-)century modern with an open floor plan and a southern exposure in California, so sun, sun, sun.
    even my office in the back gets tons of ambient light.

    the flip side was me going nuts when folks said OLEDs don't like bright rooms...
     
    DirtyDee likes this.
  18. Mufflore

    Mufflore Ancient Guru

    Messages:
    14,882
    Likes Received:
    2,848
    GPU:
    Aorus 3090 Xtreme
    Some info about a heat source at the rear of the Samsung S95C TV panel:

    After using a heat gun on the panel and finding a central heat source covering the central 1/3 of the screen on both axes, I placed a 230mm Bitfenix Spectre Pro fan at 1/2 speed blowing up at the centre of the TV's rear.
    Viewing SDR TV at lowish brightness (15), this gave at most a 1C drop in central panel temp, measured from the front.
    Temps before using the fan:
    Ambient 18C
    Screen centre 27C
    Screen edges 20C

    The fan will have more effect with HDR and/or during summer use, but for now with SDR TV it isn't worth the bother.
    There's no way I can see to get airflow within the TV, so only the plastic cover at the TV's rear can be cooled.


    The source of heat is very small, a few cm wide, just above the top centre of the stand.
    At 18C ambient this point was 32C, with only mid/low brightness SDR.
    This seems like poor design for a very thin, heat-sensitive OLED screen: the central area of the panel runs 7C hotter than the screen edges with just SDR TV.
    Also the centre of the screen sees the most use (movies with top/bottom black bars, 4:3 TV with left/right black bars) and is the brightest image area.

    This single small heat source must have a heatsink covering about 1/9 of the total screen area at the centre of the screen, because the front panel is at practically the same elevated temperature over that whole area.
    I tried to find a breakdown/teardown of the S95C to get a better look but came up with nothing.
    So at the moment I'm not sure what can be done to lower this heat source's temp.

    This TV has a One Connect box, so it's unexpected to see something in the centre affecting panel temp so much.
    The OLEDs themselves generate heat all over the panel, but it is minimal in comparison: the panel edges are only 2C above ambient vs the centre's 9C above ambient (with low-brightness TV).
    Maybe the heat is from a power converter placed there for better local voltage stability to the OLEDs, rather than sourcing power directly from the One Connect box over long wires?

    Does anyone have evidence of what this heat source is or any other info that could be useful?
    This is with a view to making the panel temp more uniform over the whole screen.
    Thanks.
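    In case anyone wants to compare readings taken at a different room temperature, the useful number is the delta over ambient rather than the absolute temperature. Trivially (the figures are the ones above; the layout is just mine):

    Code:
    # Normalise spot temps to ambient so measurements from rooms at
    # different temperatures can be compared; values from the post above.
    readings_c = {"centre": 27.0, "edges": 20.0, "hotspot": 32.0}
    ambient_c = 18.0

    for spot, temp in readings_c.items():
        print(f"{spot:8s}: {temp - ambient_c:+.1f} C over ambient")
    # centre +9.0, edges +2.0, hotspot +14.0 -- it's the non-uniform delta,
    # not the absolute temperature, that matters for even panel ageing.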
     
    Last edited: Apr 22, 2024
    signex and tunejunky like this.
  19. signex

    signex Ancient Guru

    Messages:
    9,150
    Likes Received:
    367
    GPU:
    RTX 3080 Ti 16GB
    Weird, ever since I used a 2080 Ti I did not get the OLED care message; now with the new 4070 Super it came back.

    I can't get 3440x1440 at 175 Hz with 10-bit color through DisplayPort. I have the MSI MEG 342C monitor.
     
  20. tunejunky

    tunejunky Ancient Guru

    Messages:
    4,940
    Likes Received:
    3,601
    GPU:
    7900xtx/7900xt
    possibly stupid question: is your DP cable 1.4a? the reason i ask is that there's a more advanced DSC codec that both the 4070S and the MSI use.

    because you had a 2080 Ti, i was thinking you might still be using the same DP cable
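    same back-of-the-envelope bandwidth check as earlier in the thread (my assumptions: 10-bit RGB, ~8% blanking overhead):

    Code:
    # Does 3440x1440 @ 175 Hz at 10-bit fit in DP 1.4 without DSC?
    DP14_EFFECTIVE_GBPS = 25.92  # HBR3 effective payload rate

    need = 3440 * 1440 * 175 * 30 * 1.08 / 1e9  # 30 bpp = 10-bit RGB
    print(f"need {need:.1f} Gbit/s vs {DP14_EFFECTIVE_GBPS} available")
    # ~28.1 > 25.92: the link has to engage DSC (or drop to 8-bit colour),
    # and a cable that can't sustain HBR3 rates breaks exactly that mode.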
     
    signex likes this.
