
Review: Asus ROG Swift PG27UQ Finally 4K G-SYNC 144Hz Gaming!

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Aug 10, 2018.

  1. Denial

    Denial Ancient Guru

    Messages:
    11,452
    Likes Received:
    431
    GPU:
    EVGA 1080Ti
    It seems to me like this monitor is simply too early to market. New GPUs are coming, but currently nothing can really drive it in all titles. The physical connectors limit its functionality to an extent. The beefy FPGA required to drive the G-Sync module not only requires active cooling but reportedly adds $500 directly to the cost of the monitor. MicroLED is right around the corner and will solve the FALD resolution problems, plus bring a bunch of other improvements to refresh rates, contrast, blooming, etc.

    Unless you have a considerable amount of money to spend, I'd avoid this monitor and derivatives built on the same panel. Nearly all the technology in it will be obsolete in two years.

    Edit: I also want to use this thread as a soapbox and say it's time for Nvidia to end G-Sync. There are currently no QHD HDR G-Sync monitors on the market, but plenty of FreeSync/Adaptive-Sync ones, and now even TVs. At this point G-Sync offers no features over Adaptive Sync aside from arguably making this monitor possible... a monitor almost no one can afford. The only things it does are limit my choice of monitor and increase the overall cost of the monitors it's featured in. When G-Sync offered noticeable advantages I was fine with it, but now it's just pointless vendor lock-in. I always assumed Nvidia would add features and make G-Sync a premium option while using Adaptive Sync for budget models, but thus far they've done nothing to differentiate it. I'm not happy about it.
     
    Last edited: Aug 10, 2018
    airbud7, Maddness, ubercake and 2 others like this.
  2. Eyeer

    Eyeer New Member

    Messages:
    1
    Likes Received:
    0
    GPU:
    1080
    What about Display Stream Compression (DSC)?
     
  3. Denial

    Denial Ancient Guru

    Messages:
    11,452
    Likes Received:
    431
    GPU:
    EVGA 1080Ti
    From a reddit thread where a guy explained it - I don't know enough about it to know whether he is right or not:

     
  4. Lucifer

    Lucifer Member Guru

    Messages:
    196
    Likes Received:
    1
    GPU:
    Colorful GTX 1060
    Let's see: 144 Hz with *, 120 Hz with *.
    They should've advertised it as a 98 Hz monitor and priced it a lot lower.

    As always, an overpriced gimmick from Asus.
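The asterisks above come down to the DisplayPort 1.4 bandwidth ceiling, and the 98 Hz figure can be checked with a back-of-the-envelope calculation. This sketch assumes an HBR3 link (32.4 Gbit/s raw, 8b/10b coding) and ignores blanking overhead, so the real limits sit slightly lower; the ~3:1 DSC ratio in the last case is an assumption based on how DSC is typically quoted, since this monitor does not actually implement DSC:

```python
# Which 3840x2160 modes fit uncompressed over DisplayPort 1.4?
# Assumption: HBR3 = 32.4 Gbit/s raw, 8b/10b coding -> 25.92 Gbit/s payload.
# Blanking overhead is ignored, so real-world limits are a bit lower.

DP14_PAYLOAD_GBPS = 32.4 * 8 / 10  # 25.92 Gbit/s

def needed_gbps(width, height, hz, bits_per_pixel):
    """Raw video bandwidth in Gbit/s, ignoring blanking."""
    return width * height * hz * bits_per_pixel / 1e9

cases = [
    (144, 30, "144 Hz 10-bit RGB (4:4:4)"),
    (120, 24, "120 Hz 8-bit RGB (4:4:4)"),
    (98,  30, "98 Hz 10-bit RGB (4:4:4)"),
    (144, 20, "144 Hz 10-bit 4:2:2"),
    (144, 10, "144 Hz 10-bit RGB with ~3:1 DSC (hypothetical)"),
]

for hz, bpp, label in cases:
    need = needed_gbps(3840, 2160, hz, bpp)
    verdict = "fits" if need <= DP14_PAYLOAD_GBPS else "does NOT fit"
    print(f"{label}: {need:.1f} Gbit/s -> {verdict}")
```

By this estimate 144 Hz 10-bit 4:4:4 needs about 35.8 Gbit/s, well over the ~25.9 Gbit/s payload, while ~98 Hz is the highest 10-bit 4:4:4 rate that squeezes under it; hence the asterisks. It also speaks to the DSC question earlier in the thread: with roughly 3:1 compression, even the full 144 Hz 10-bit signal would fit comfortably.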
     
    Solfaur likes this.

  5. Dragam1337

    Dragam1337 Master Guru

    Messages:
    388
    Likes Received:
    137
    GPU:
    1080 Gaming X SLI
    The only thing keeping me from buying it is the fan... I won't ever buy a product with a noisy fan, a monitor least of all.
     
    ubercake likes this.
  6. vazup

    vazup Master Guru

    Messages:
    273
    Likes Received:
    1
    GPU:
    r9 280X
    Is it possible to turn local dimming on while in SDR mode?
     
  7. JamesSneed

    JamesSneed Master Guru

    Messages:
    314
    Likes Received:
    80
    GPU:
    GTX 1070
    Prince Valiant likes this.
  8. alanm

    alanm Ancient Guru

    Messages:
    7,922
    Likes Received:
    463
    GPU:
    1070 AMP!
    I'll bet that in a couple of years we'll see better monitors than this, with full-spec performance (HDMI 2.1), for less than $1,000.
     
  9. StewieTech

    StewieTech Chuck Norris

    Messages:
    2,366
    Likes Received:
    392
    GPU:
    MSI gtx 960 Gaming
    Price is pretty gangsta; I'm happy with my 1440p monitor just fine. I think 1440p is the sweet spot for gaming anno 2018.
     
  10. wavetrex

    wavetrex Master Guru

    Messages:
    386
    Likes Received:
    127
    GPU:
    Zotac GTX1080 AMP!
    So, the most technologically advanced display in existence, and they put that ugly thick bezel on it.
    WHY? It's direct-lit LED, with no need to house LEDs at the side of the panel requiring a bezel.

    Also, the price is insane. It's more expensive than 55"-65" OLED UHD HDR TVs... (yes, those are missing 144 Hz, but still... damn!)
     

  11. Reddoguk

    Reddoguk Ancient Guru

    Messages:
    1,500
    Likes Received:
    60
    GPU:
    Guru3d GTX 980 G1 Gaming
    When you're first to the party, of course you're gonna charge an arm and a leg, but having to sell a kidney as well is one organ too far.

    If it were a 30/32" then maybe 1,500 squid would be reasonable, but 2k for 27", even with all those good stats... nah, thanks.
     
  12. fleggy

    fleggy Active Member

    Messages:
    90
    Likes Received:
    0
    GPU:
    1080ti hybrid cooling
    Toggling it is possible ONLY in SDR mode.

    EDIT: to be exact - you can turn FALD off in SDR but not in HDR. It is always on in HDR.
     
  13. RealNC

    RealNC Ancient Guru

    Messages:
    2,024
    Likes Received:
    417
    GPU:
    EVGA GTX 980 Ti FTW
    AFAIK it's always used. Local dimming is not actually controlled by HDR. If a bunch of pixels are supposed to be black somewhere, the LEDs on that location are turned off.

    But it's better to get some verification on that first.
     
  14. H83

    H83 Ancient Guru

    Messages:
    2,274
    Likes Received:
    140
    GPU:
    Asus ROG Strix GTX1070
    So it's only a 27" screen? It uses color compression?? And it has a fan you can hear?? And it costs how much??? This is insane from Asus and Nvidia...

    This thing makes my screen look like a super bargain, despite the fact that I almost cried paying 500€ for it...

    P.S. Hilbert, can you tell Asus that those illuminated logos are silly...

    Great review as always!
     
    Last edited: Aug 10, 2018
    -Tj- likes this.
  15. Moonbogg

    Moonbogg Active Member

    Messages:
    85
    Likes Received:
    12
    GPU:
    GTX1080Ti@2.1GHz
    They got the resolutions all backwards with this display. The LCD res is way too high and the FALD res is way too low. A 3440x1440 LCD res with something like 2,500 zones might be better for small, detailed highlights in dark scenes. Even then it won't be enough, IMO. You need as close to per-pixel light control as you can get; even 10,000 zones would be lacking for fine HDR detail.
    As it is, I don't see how you can have bright, fine details surrounded by deep black with those huge 1-inch gaps between backlights. 300-1000 nits will bleed through the LCD's surrounding areas, right? I'm sure this looks great for what it is, but it isn't the HDR I'm hoping for. It needs a LOT more zones, or OLED, or something similar. Also, $2,000 is absolutely hilarious for this, lol.
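    For scale, the zone coarseness described above can be put into numbers. This is a rough sketch: the 24x16 zone grid is an assumption based on commonly reported specs for this panel (24 × 16 = 384), and the geometry assumes an exact 27" 16:9 diagonal:

    ```python
    import math

    # How coarse is a 384-zone FALD backlight behind a 4K panel?
    # Assumption: zones are arranged in a 24x16 grid (24 * 16 = 384).
    width_px, height_px = 3840, 2160
    zones_x, zones_y = 24, 16

    pixels_per_zone = (width_px * height_px) // (zones_x * zones_y)

    # Physical geometry of a 27" 16:9 panel.
    diag_in = 27
    w_in = diag_in * 16 / math.hypot(16, 9)   # ~23.5 in wide
    h_in = diag_in * 9 / math.hypot(16, 9)    # ~13.2 in tall
    zone_w_in = w_in / zones_x                # ~0.98 in
    zone_h_in = h_in / zones_y                # ~0.83 in
    zones_per_sq_in = (zones_x * zones_y) / (w_in * h_in)  # ~1.2

    print(f"{pixels_per_zone} pixels share each backlight zone")
    print(f'each zone is ~{zone_w_in:.2f}" x {zone_h_in:.2f}"')
    print(f"~{zones_per_sq_in:.2f} zones per square inch")
    ```

    Under these assumptions, 21,600 pixels share each zone and the zone pitch works out to roughly one inch (about 1.2 zones per square inch), which is where the "1-inch gaps" estimate comes from.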
     

  16. Denial

    Denial Ancient Guru

    Messages:
    11,452
    Likes Received:
    431
    GPU:
    EVGA 1080Ti
    We already covered this in the last thread; it's not happening without MicroLED or OLED. You're saying the price is too high, but then you're asking them to increase the manufacturing complexity by 600%. There is a trade-off between quality and price, and I'm sure some engineer sat there and said 384 zones would give good enough brightness control without making the thing unmanufacturable. Its FALD density is already higher than that of most 4K TVs on the market, at ~1 zone per square inch; the only TV I know of that's better is the ZD9. Having some zones is better than none - it's just not what you're looking for - that doesn't make the design bad.
     
  17. Smovs

    Smovs New Member

    Messages:
    4
    Likes Received:
    0
    GPU:
    MSI Gaming X 1080TI
    Thanks for the review, really nice, but I must agree the price is crazy, even for 4K HDR :eek:
     
  18. zimzoid

    zimzoid Maha Guru

    Messages:
    1,358
    Likes Received:
    2
    GPU:
    2xEVGA980TiSC+(H20) Swift
    Wow, that price... And I thought my ROG 1440p G-Sync screen was expensive.
     
  19. ubercake

    ubercake Master Guru

    Messages:
    200
    Likes Received:
    41
    GPU:
    EVGA 1080 Ti SC2
    I agree. This whole proprietary G-Sync thing has begun to irk me as a consumer. There are a few things keeping me from going back to AMD:

    1) The piss-poor drivers in the late 2000s and early 2010s.
    2) Frame-time issues in the late 2000s and early 2010s, and the scars they left on my pocketbook.
    3) AMD mods banned me for seeking answers on their forum about what was later revealed as the frame-time issues that plagued their entire lineup FOR YEARS!!! (Pisses me off just thinking about it!!! They said I was doing nothing but trolling and closed every thread I'd open prior to banning me?!)
    4) Can AMD really even come close to 1080 Ti performance with anything???

    It's hard for me to want to invest in AMD again, but I can't say I haven't begun to think about it because of the premium on these G-Sync monitors.
    I wonder how little it would take on Nvidia's part (and what a grand gesture it would be!) to update current hardware to use FreeSync. Simply a firmware update on the monitor and a driver update? Is that possible? Or would it take new hardware altogether?
    That said, I've had minimal issues since I switched to my GTX 580s, and I've continued with Nvidia cards ever since. The only problems I had were when I ran SLI setups and the drivers weren't updated on a particular title's release date; within two weeks the problem would be solved. These days, with a 1080 Ti and a 1440p setup, I only need one video card to max out my 144Hz monitor with everything cranked in BF1.
     
    Last edited: Aug 10, 2018
  20. Solareus Prime

    Solareus Prime Member Guru

    Messages:
    108
    Likes Received:
    6
    GPU:
    MSI 7870 2gig x 2
    #snooze... wake me when we're on wafered displays.

    Redundancy of the monitor is at a plateau.
     
