
Review: Asus ROG Swift PG27UQ - Finally 4K G-SYNC 144Hz Gaming!

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Aug 10, 2018.

  1. Denial

    Denial Ancient Guru

    Messages:
    11,806
    Likes Received:
    840
    GPU:
    EVGA 1080Ti
    It seems to me this monitor is simply too early to market. New GPUs are coming, but currently nothing can really drive it in all titles. The physical connectors limit its functionality to an extent. The beefy FPGA required to drive the G-Sync module not only requires active cooling but reportedly adds $500 directly to the cost of the monitor. MicroLED is right around the corner and will solve the FALD resolution problem, plus bring a bunch of other improvements to refresh rates, contrast, blooming, etc.

    Unless you have a considerable amount of money to spend I'd avoid this monitor and derivatives built on the same panel. Nearly all the technology in it will be obsolete in two years.

    Edit: I also want to use this thread as a soapbox and say it's time for Nvidia to end G-Sync. There are currently no QHD HDR G-Sync monitors on the market, but plenty of FreeSync/Adaptive Sync ones, and now even TVs. At this point G-Sync offers no features over Adaptive Sync aside from arguably making this monitor possible... a monitor almost no one can afford. The only thing it does is limit my choice of monitors and increase the overall cost of the monitors it's featured in. When G-Sync offered noticeable advantages I was fine with it, but now it's just pointless vendor lock-in. I always assumed Nvidia would add features and make G-Sync a premium option while using Adaptive Sync for budget models, but thus far they've done nothing to differentiate it. I'm not happy about it.
     
    Last edited: Aug 10, 2018
    airbud7, Maddness, ubercake and 2 others like this.
  2. Eyeer

    Eyeer New Member

    Messages:
    1
    Likes Received:
    0
    GPU:
    1080
    What about Display Stream Compression (DSC)?
     
  3. Denial

    Denial Ancient Guru

    Messages:
    11,806
    Likes Received:
    840
    GPU:
    EVGA 1080Ti
    From a reddit thread where a guy explained it - I don't know enough about it to know whether he is right or not:

     
  4. Lucifer

    Lucifer Master Guru

    Messages:
    200
    Likes Received:
    3
    GPU:
    Colorful GTX 1060
    Let's see: 144 Hz with an asterisk, 120 Hz with an asterisk.
    They should've advertised it as a 98 Hz monitor and priced it a lot lower.

    As always, an overpriced gimmick from Asus.
     
    Solfaur likes this.

  5. Dragam1337

    Dragam1337 Master Guru

    Messages:
    905
    Likes Received:
    376
    GPU:
    1080 Gaming X SLI
    The only thing keeping me from buying it is the fan... I won't ever buy a product with a noisy fan, a monitor least of all.
     
    ubercake likes this.
  6. vazup

    vazup Master Guru

    Messages:
    280
    Likes Received:
    2
    GPU:
    r9 280X
    Is it possible to turn local dimming on while in SDR mode?
     
  7. JamesSneed

    JamesSneed Master Guru

    Messages:
    401
    Likes Received:
    109
    GPU:
    GTX 1070
    Prince Valiant likes this.
  8. alanm

    alanm Ancient Guru

    Messages:
    8,228
    Likes Received:
    710
    GPU:
    1070 AMP!
    I'll bet in a couple of years we'll see better monitors than this, with full-spec performance (HDMI 2.1), for less than $1000.
     
  9. StewieTech

    StewieTech Chuck Norris

    Messages:
    2,537
    Likes Received:
    844
    GPU:
    MSI gtx 960 Gaming
    Price is pretty gangsta; I'm happy with my 1440p monitor just fine. I think 1440p is the sweet spot for gaming anno 2018.
     
  10. wavetrex

    wavetrex Master Guru

    Messages:
    457
    Likes Received:
    212
    GPU:
    Zotac GTX1080 AMP!
    So, the most technologically advanced display in existence, and they put that ugly thick bezel on it.
    Why? It's direct-lit LED, with no need to house LEDs along the side of the panel, which is what usually requires a bezel.

    Also, the price is insane; it's more expensive than 55"-65" OLED UHD HDR TVs... (yes, those are missing 144Hz, but still... damn!)
     

  11. Reddoguk

    Reddoguk Ancient Guru

    Messages:
    1,607
    Likes Received:
    97
    GPU:
    Guru3d GTX 980 G1 Gaming
    When you're first to the party, of course you're gonna charge an arm and a leg, but having to sell a kidney as well is one organ too far.

    If it was a 30/32" then maybe 1500 squid would be reasonable, but 2k for a 27", even with all those good stats? Nah, thanks.
     
  12. fleggy

    fleggy Active Member

    Messages:
    93
    Likes Received:
    0
    GPU:
    MSI 2080 ti X Trio
    It is only configurable in SDR mode.

    EDIT: to be exact - you can turn FALD off in SDR but not in HDR. It is always on in HDR.
     
  13. RealNC

    RealNC Ancient Guru

    Messages:
    2,674
    Likes Received:
    929
    GPU:
    EVGA GTX 980 Ti FTW
    AFAIK it's always used. Local dimming isn't actually controlled by HDR. If a bunch of pixels are supposed to be black somewhere, the LEDs at that location are turned off.

    But it's better to get some verification on that first.
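    What RealNC describes can be sketched as a toy per-zone pass. This is only an illustration under assumed numbers (a 16x24 zone grid, i.e. the 384 zones from the review); real FALD controllers add spatial and temporal filtering on top of this to hide blooming:

    ```python
    import numpy as np

    def zone_levels(frame, rows=16, cols=24):
        """One backlight level per zone, driven by the brightest pixel in it.
        Zones whose pixels are all black get level 0, i.e. LEDs off."""
        h, w = frame.shape
        zh, zw = h // rows, w // cols
        cropped = frame[:rows * zh, :cols * zw]          # drop remainder pixels
        return cropped.reshape(rows, zh, cols, zw).max(axis=(1, 3))

    frame = np.zeros((2160, 3840))   # all-black 4K frame
    frame[100, 100] = 1.0            # a single bright pixel
    lit = (zone_levels(frame) > 0).sum()
    print(lit)                       # -> 1: only the zone containing that pixel lights up
    ```

    On an all-black frame every zone reads 0, so the panel can go fully dark regardless of whether HDR is active.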
     
  14. H83

    H83 Ancient Guru

    Messages:
    2,528
    Likes Received:
    279
    GPU:
    Asus ROG Strix GTX1070
    So it's only a 27" screen? It uses color compression?? And it has a fan you can hear?? And it costs how much??? This is insane from Asus and Nvidia...

    This thing makes my screen look like a super bargain, despite the fact that I almost cried paying 500€ for it...

    P.S. Hilbert, can you tell Asus that those illuminated logos are silly...

    Great review as always!
     
    Last edited: Aug 10, 2018
    -Tj- likes this.
  15. Moonbogg

    Moonbogg Member Guru

    Messages:
    117
    Likes Received:
    28
    GPU:
    GTX1080Ti@2.1GHz
    They got the resolutions all backwards with this display. The LCD res is way too high and the FALD res is way too low. 3440x1440 LCD res with something like 2500 zones might be better for small, detailed highlights in dark scenes. Even then it won't be enough IMO. You need as close to per pixel light control as you can get. Even 10,000 zones would be lacking for fine HDR detail.
    As it is, I don't see how you can have bright, fine details surrounded by deep black with those huge 1-inch gaps between backlights. 300-1000 nits will bleed through the LCD's surrounding areas, right? I'm sure this looks great for what it is, but this isn't the HDR I'm hoping for. It needs a LOT more zones, or OLED, or something similar. Also, $2000 is absolutely hilarious for this, lol.
     

  16. Denial

    Denial Ancient Guru

    Messages:
    11,806
    Likes Received:
    840
    GPU:
    EVGA 1080Ti
    We already covered this in the last thread; it's not happening without MicroLED or OLED. You're saying the price is too high, but then you're asking them to increase the manufacturing complexity by 600%. There is a trade-off between quality and price, and I'm sure some engineer sat there and said 384 zones will give good enough brightness control without making the thing unmanufacturable. Its FALD density is already higher than that of most 4K TVs on the market, at ~1 zone per square inch; the only TV I know of that's better is the ZD9. Having some zones is better than none - it's just not what you're looking for - that doesn't make the design bad.
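    The ~1 zone per square inch figure checks out with quick back-of-the-envelope math, assuming a 27" 16:9 panel and the 384 zones stated in the review:

    ```python
    import math

    diag, zones = 27.0, 384              # 27" 16:9 panel, 384 FALD zones
    w = diag * 16 / math.hypot(16, 9)    # width  ~= 23.5 in
    h = diag * 9 / math.hypot(16, 9)     # height ~= 13.2 in
    density = zones / (w * h)            # zones per square inch
    pitch = math.sqrt(w * h / zones)     # side of one (square-ish) zone, in inches
    print(round(density, 2), round(pitch, 2))  # -> 1.23 0.9
    ```

    So each zone covers roughly a 0.9" square, which is also where the "1-inch gaps between backlights" complaint above comes from.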
     
  17. Smovs

    Smovs New Member

    Messages:
    4
    Likes Received:
    0
    GPU:
    MSI Gaming X 1080TI
    Thanks for the review, really nice, but I must agree the price is crazy even for 4K HDR :eek:
     
  18. zimzoid

    zimzoid Maha Guru

    Messages:
    1,386
    Likes Received:
    7
    GPU:
    2xEVGA980TiSC+(H20) Swift
    Wow, that price... And I thought my ROG 1440p G-Sync screen was expensive.
     
  19. ubercake

    ubercake Master Guru

    Messages:
    212
    Likes Received:
    44
    GPU:
    EVGA 2080 Ti XC Blk
    I agree. This whole proprietary G-Sync thing has begun to irk me as a consumer. There are a few things keeping me from going back to AMD:

    1) The piss-poor drivers in the late 2000s and early 2010s.
    2) Frame time issues in the late 2000s and early 2010s, and the scars they left on my pocketbook.
    3) AMD mods banned me for seeking answers on their forum about what was later revealed as the frame time issues that plagued their entire lineup FOR YEARS!!! (Pisses me off just thinking about it!!! They said I was doing nothing but trolling and closed every thread I opened before banning me?!)
    4) Can AMD really even come close to 1080 Ti performance with anything???

    It's hard for me to want to invest in AMD again, but I can't say I haven't begun to think about it because of the premium on these G-Sync monitors.
    I wonder how little it would take on Nvidia's part (and what a grand gesture it would be!) to update current hardware to support FreeSync. Would it simply be a firmware update on the monitor and a driver update? Is that possible, or would it take new hardware altogether?
    That said, I've had minimal issues since I switched to my GTX 580s, and I've continued with Nvidia cards ever since. The only issues I had were when I ran SLI setups and the drivers weren't updated for a particular title's release date; within two weeks the problem was solved. These days, with a 1080 Ti and a 1440p setup, I only need one video card to tap out my 144Hz monitor with everything cranked in BF1.
     
    Last edited: Aug 10, 2018
  20. waltc3

    waltc3 Master Guru

    Messages:
    738
    Likes Received:
    123
    GPU:
    XFX 590 8GB XFire
    I really appreciate your work here HH--great review, and thank you!--comments are all mostly ones I surely agree with...;) Reminds me of the teeth-pulling I was doing a couple weeks back when trying to decide which monitor to go with, myself. Some thoughts...

    You mentioned, I thought, that this ROG monitor uses an AOC panel...? Don't know if it is of interest, but this is the panel my AOC U3277PWQU uses: http://panelone.net/en/31-5-inch/INNOLUX_M315DJJ-K30_31.5_inch-datasheet I thought that was an interesting comment in the review...!

    Agree completely about the size--my ~32" 4k (3840x2160) monitor rings the dot pitch bell @ ~.18, but my former 27" monitor (now gracing the wife's desk) has a gorgeous display even @ its ~.27 dot pitch--hers I think sometimes is superior to mine in terms of the Windows display(!)--sort of *cough*--but she maxes out @ QHD 2560x1440--imo, the *perfect* res for 27". My own preference is as yours--for 4k, ~32" is just about ideal--27" is not enough screen real estate for anything > QHD. Totally subjective opinion--as you say. As are all monitor reviews, imo...!

    But really none of that gets in front of the huge negative of the price of this thing-- ~$2700? Not a chance, Lance--not my new toy, Roy, not a--er, yeah! With a *power brick*? Gaaa! *and* a fan? double-gaa-googoo...:eek: I guess the 180W TDP is not mind bending until maybe it's compared with the 50W of the U3277PWQU--that doesn't require the brick, or the fan. *cough*

    And about the bat-cave signal projection...in the clever form of the coded ROG dragon logos...yesss-s-s-s-sss... This (below) might have happened to me had I bought one of the few of these that will ever be made available...

    ______________

    Guest: "Thanks for letting me look at this $2700 monitor you've been bragging about all week, Walt, bragging so much, in fact, that I wanted to pop you from here to the Tower of London with my thumb! Finished, yet? Ah-ha...so, uh, tell me, did you pay extra for the secret ROG bat-signal amplifiers I see here?....:p Oh, I see it on your wall--how, uh, nice--! And, oooo--a coded, ultra dragon-like bat-signal on your desk, too? *cough* Visible from miles away, is it?"

    My reply: "Please don't laugh at me-yeah, I know...please, I can't take anymore....just stop--kay?"

    _________________

    In all seriousness, I did have a thought as to the HDMI input being the only available sound input for the speakers--no extra audio input anywhere around, I guess? Well...shucks...who pays $2700 for a monitor and expects to use the speakers, right? Naaaah....;)

    Really really fine review, HH. I just get a bit tickled whenever I think about that price....
     
