NVIDIA GSYNC in da House

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Dec 6, 2013.

  1. harkinsteven

    harkinsteven Guest

    Messages:
    2,942
    Likes Received:
    119
    GPU:
    RTX 3080
    Hilbert, are these 120Hz panels? Going to sell my BenQ monitor and get one if they are.
     
  2. moab600

    moab600 Ancient Guru

    Messages:
    6,660
    Likes Received:
    557
    GPU:
    PNY 4090 XLR8 24GB
    My S23A700D is so perfect, but G-Sync is a game changer. If it is, then I'll have to pull the trigger.
     
  3. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    48,561
    Likes Received:
    18,886
    GPU:
    AMD | NVIDIA
    The previews you guys will be seeing this week are the initial ASUS panels. All samples sent to the press are prototype versions. Obviously, over time there will be IPS screens and higher-res panels, as there is no real limitation; I know G-SYNC can support 4K resolutions. The previews you guys will be seeing are merely an introduction to the technology. So yeah, there will be WQHD (2560x1440) IPS LED panels, no doubt there.

    And to answer one of the other questions in this thread: the monitor is actually a 144 Hz panel, so I have enabled it at that refresh rate.

    And to answer another remark: running with VSYNC off and G-SYNC on, there is no screen tearing at all, as it is eliminated ... it's just really, really good.
     
  4. tsunami231

    tsunami231 Ancient Guru

    Messages:
    14,751
    Likes Received:
    1,868
    GPU:
    EVGA 1070Ti Black
    Can't wait to see the review of the tech; I really hope it does what it claims and replaces the flawed VSync. I love my Asus VS248H. I personally don't like IPS panels, as their response times are just too slow; they might have better color reproduction, but they can't handle fast motion. The 2ms of my monitor is too slow too, but IPS is just worse. I will take less motion blurring over better color reproduction any day, much like I will take a tear-free gaming experience over better image quality and faster framerates. I will buy a new monitor just to have no tearing without the performance hit.
     
    Last edited: Dec 6, 2013

  5. yasamoka

    yasamoka Ancient Guru

    Messages:
    4,875
    Likes Received:
    259
    GPU:
    Zotac RTX 3090
    If 2ms is slow, then it's not the response time that's contributing to the blur you see; it's the sample-and-hold effect. LightBoost solves that.
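
    A rough back-of-the-envelope sketch of that point (my own assumed numbers, not measurements): on a sample-and-hold display, the perceived blur while your eyes track motion is roughly tracking speed times how long each frame stays lit, so the hold time dwarfs a 2ms pixel response.

    ```python
    # Rough model: blur trail length (px) ~= tracking speed * persistence.
    # All numbers below are illustrative assumptions, not measurements.

    def blur_px(track_speed_px_per_s: float, persistence_ms: float) -> float:
        """Approximate blur trail length in pixels for an eye-tracked moving object."""
        return track_speed_px_per_s * persistence_ms / 1000.0

    speed = 1000.0  # px/s pan, a brisk but realistic tracking speed (assumed)

    print(blur_px(speed, 16.7))  # 60Hz sample-and-hold: ~16.7 px of blur
    print(blur_px(speed, 8.3))   # 120Hz sample-and-hold: ~8.3 px
    print(blur_px(speed, 1.4))   # LightBoost-style 1.4ms strobe: ~1.4 px
    ```

    The 2ms GtG response adds comparatively little on top of the 8 to 17 px that the hold time itself contributes.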
     
  6. mitzi76

    mitzi76 Guest

    Messages:
    8,738
    Likes Received:
    33
    GPU:
    MSI 970 (Gaming)
    No tearing here on my BenQ. Ofc I run with VSync 24/7... (well, at least from what I can see... I have turned VSync off a few times and it's a huge difference, on my screen anyway; so much so that I can't game without it).

    But the Koreans... mmm, that's something I'd really be interested to see in the flesh.

    eclap, when am I coming over? I'll bring you a bottle of whiskey...
     
  7. yasamoka

    yasamoka Ancient Guru

    Messages:
    4,875
    Likes Received:
    259
    GPU:
    Zotac RTX 3090
    mitzi76, a higher refresh rate means that there is less tearing to be seen. That's why you always read 120Hz users stating "no tearing here".

    There *is* tearing in certain cases, but it's much, much less. I was panning (in the most horrible of ways) around in Batman: Arkham City yesterday, with the framerate running over 110 FPS (refresh rate = 110Hz) and a dip in one place to ~100 FPS, and I was actively searching for tearing. I could see none. Panning like that would very likely introduce noticeable tearing @ 60Hz, but I haven't tried 60Hz.

    For me, the importance of using G-Sync lies in its advantages in smoothness at lower FPS, and in input lag when refresh rate = FPS, vs. 120/144Hz monitors, including the Korean monitors.
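
    To put a rough number on that (a simplified model of my own, not a measurement from any game): the visible jump at a tear line is roughly the pan speed divided by the framerate, since the two frame slices meeting at the tear were rendered about one frame time apart.

    ```python
    # Simplified model of the visible discontinuity at a tear line:
    # the slices meeting at the tear were rendered ~1/fps apart, so a horizontal
    # pan shifts the image by pan_speed / fps pixels between them.
    # Numbers are illustrative assumptions.

    def tear_offset_px(pan_speed_px_per_s: float, fps: float) -> float:
        """Approximate horizontal jump at a tear line, in pixels."""
        return pan_speed_px_per_s / fps

    pan = 2000.0  # px/s, a fast mouse pan (assumed)

    print(tear_offset_px(pan, 60))   # ~33 px jump at 60 FPS on 60Hz
    print(tear_offset_px(pan, 110))  # ~18 px at 110 FPS on 110Hz
    ```

    On top of the smaller jump, each torn refresh is also on screen for roughly half as long at 110-120Hz, which is why it is so much harder to spot.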
     
  8. Whiplashwang

    Whiplashwang Ancient Guru

    Messages:
    2,460
    Likes Received:
    397
    GPU:
    RTX 4090 PNY
    Hilbert, do you know when the G-Sync modification kit will be released so those of us who already own this monitor can mod it?
     
  9. eclap

    eclap Banned

    Messages:
    31,468
    Likes Received:
    4
    GPU:
    Palit GR 1080 2000/11000
    Come on Tuesday, make it a whisky. Sainsbury's are selling Aberlour 12yo for £20 lol, bargain. Honestly though, I run 110Hz in games; the tearing in BF4 is pretty much non-existent at that refresh rate, while it's horrible at 60Hz. I might just run games at 120Hz tbh, haven't tested much at 120Hz.
     
  10. tsunami231

    tsunami231 Ancient Guru

    Messages:
    14,751
    Likes Received:
    1,868
    GPU:
    EVGA 1070Ti Black
    So people would like to think, but the blur and loss of detail is there on ALL LCD/LED panels, even those that use LightBoost (in this respect LCD/LED will never compare to CRT); it's just that some people notice it a lot more than others. Give me your top-end, best LCD/LED and I will be able to point out blurring that is not there on any CRT, even on those precious LightBoost monitors.

    This is like saying there is no tearing at 120Hz with no VSync when there is. It's just that some PEOPLE notice it a lot more than others.
     
    Last edited: Dec 6, 2013

  11. yasamoka

    yasamoka Ancient Guru

    Messages:
    4,875
    Likes Received:
    259
    GPU:
    Zotac RTX 3090
    LightBoost with 1.4ms strobes can have less motion blur than some CRTs, which have a phosphor decay considerably longer than 1ms. Better low-persistence modes can go below 1ms, giving us less motion blur than CRTs.

    Have you tried a LightBoost monitor in person? Or anything similar, such as a projector with low-persistence mode or those proprietary techs used by Samsung / BenQ / etc... generally before LightBoost became more common?

    I doubt one pixel of motion blur would irritate anyone, and that, coupled with the advantages of LCD over CRT, makes a pretty convincing argument for finally going LCD for those coming from CRTs who are deeply affected by motion blur.

    CRTs have other issues too, like the ghosting seen behind a pointer (trail) as the phosphors keep emitting some light even after they are considered to have "decayed".

    And no, tearing at 120Hz is not about some people noticing it more than others. Tearing is so much less that even when you're looking for it, there's very little. You would be right in saying that there is tearing, but it's generally a non-issue at 120Hz these days.
     
    Last edited: Dec 6, 2013
  12. tsunami231

    tsunami231 Ancient Guru

    Messages:
    14,751
    Likes Received:
    1,868
    GPU:
    EVGA 1070Ti Black
    Tearing is still there at 60Hz, 120Hz, etc.; some people notice it, others don't, and others don't care. Blur and loss of detail are there even with 1ms monitors and LightBoost; some notice it less, some don't notice it at all, others don't care. That doesn't mean it's not there. CRTs are king when it comes to how motion is handled. There are reasons why some gamers still have and use CRTs.

    LCDs have their advantages: slim, light, and better PQ, especially with still or slow-moving images, but handling fast motion is not one of them.

    This topic is not about CRT vs. LCD, it's about G-Sync.
     
    Last edited: Dec 7, 2013
  13. yasamoka

    yasamoka Ancient Guru

    Messages:
    4,875
    Likes Received:
    259
    GPU:
    Zotac RTX 3090
    I think you think you're Superman, basically :nerd:

    But I digress, this topic is indeed about G-Sync.

    Yet G-Sync also coincides with a better low-persistence mode, starts out on TNs, and starts out with LCDs, so I'd say the discussion is pretty relevant to G-Sync.
     
  14. xodius80

    xodius80 Master Guru

    Messages:
    790
    Likes Received:
    11
    GPU:
    GTX780Ti
    Hi Hilbert, my most eager question is: does this new tech make chasing a higher-horsepower video card less worthwhile?

    What I mean is: if we all seek smoothness in video games, doesn't this make a higher-end graphics card overkill?

    Let's say I can buy a decent GTX 660 or GTX 760 or another mid-range card, max out settings on a game, hook that up with G-Sync, and have the experience of a higher-end model without spending the hard cash.

    But in the end, isn't NVIDIA shooting itself in the foot?
     
  15. yasamoka

    yasamoka Ancient Guru

    Messages:
    4,875
    Likes Received:
    259
    GPU:
    Zotac RTX 3090
    I think I can answer that. At a locked 30/60/120/144 FPS, VSync vs. G-Sync gives the same smoothness, but G-Sync should have less input lag.

    It's the dips in framerate that are perceived as "stutter"; G-Sync decreases the impact of those dips by still keeping things as smooth as the FPS it dipped to allows. Meaning, when you dip to 37 FPS, it feels the way 37 FPS would feel at 37Hz with VSync on, which is certainly smoother than 37 FPS on a 60Hz display, where frames and refreshes are desynchronized.
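
    A toy sketch of that frame-pacing difference (my own simplification, not NVIDIA's actual scheduling): with VSync on a 60Hz panel, a frame that misses a refresh has to wait for the next one, so a steady 37 FPS turns into an uneven mix of 16.7ms and 33.3ms steps, while with G-Sync the panel refreshes the moment the frame is done, so the steps stay even.

    ```python
    import math

    # Toy comparison of when frames become visible at a steady 37 FPS render rate:
    # - VSync @ 60Hz: a finished frame waits for the next 16.67ms refresh boundary.
    # - G-Sync: the display refreshes as soon as the frame is finished.
    # Purely illustrative; real drivers and panels add buffering and other details.

    FRAME_TIME = 1.0 / 37.0  # seconds to render one frame at 37 FPS
    REFRESH = 1.0 / 60.0     # 60Hz fixed refresh interval

    def vsync_display_times(n_frames):
        """Completion times rounded up to the next fixed refresh tick."""
        return [math.ceil(i * FRAME_TIME / REFRESH) * REFRESH for i in range(1, n_frames + 1)]

    def gsync_display_times(n_frames):
        """Frames shown the moment they finish rendering."""
        return [i * FRAME_TIME for i in range(1, n_frames + 1)]

    v, g = vsync_display_times(6), gsync_display_times(6)
    print([round((b - a) * 1000, 1) for a, b in zip(v, v[1:])])  # uneven 16.7 / 33.3 ms steps
    print([round((b - a) * 1000, 1) for a, b in zip(g, g[1:])])  # steady ~27.0 ms steps
    ```

    The uneven 16.7/33.3ms cadence is what reads as stutter, even though the average framerate is identical in both cases.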
     

  16. Poor Tom

    Poor Tom Guest

    Messages:
    65
    Likes Received:
    5
    GPU:
    6900XT OC
    Does anyone know if this will be subject to any incompatibility issues? Is it reliant on drivers, or is it completely hardware-driven? Because if it just works, period, then it's going to be orgasmic in any of the 1440p 120Hz monitors.
     
  17. CoMa666

    CoMa666 Master Guru

    Messages:
    439
    Likes Received:
    10
    GPU:
    rtx 3090 Fe
    When will the kits be out?
     
  18. PhazeDelta1

    PhazeDelta1 Guest

    Messages:
    15,607
    Likes Received:
    14
    GPU:
    EVGA 1080 FTW
    It's a combo of both hardware and drivers.
     
  19. PhazeDelta1

    PhazeDelta1 Guest

    Messages:
    15,607
    Likes Received:
    14
    GPU:
    EVGA 1080 FTW
    Hey Hilbert. I was just informed that G-Sync needs DisplayPort to work correctly. Is there any truth to this?

    EDIT:

    did some digging and found this.

    http://www.anandtech.com/show/7436/nvidias-gsync-attempting-to-revolutionize-gaming-via-smoothness

    I thought the max refresh for DP was 60Hz?
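
    On the 60Hz worry: DisplayPort is not limited to 60Hz. A rough bandwidth check (approximate figures; the blanking overhead is an assumption, exact timings depend on the monitor's mode) shows 1080p at 144Hz fits comfortably within DisplayPort 1.2:

    ```python
    # Rough check that 1920x1080 @ 144Hz, 24-bit color fits within DisplayPort 1.2.
    # Blanking overhead is an assumed ballpark; exact timings depend on the mode.

    H, V = 1920, 1080
    REFRESH_HZ = 144
    BITS_PER_PIXEL = 24
    BLANKING_OVERHEAD = 1.2      # ~20% extra for blanking intervals (assumed)

    needed_gbps = H * V * REFRESH_HZ * BITS_PER_PIXEL * BLANKING_OVERHEAD / 1e9
    DP12_EFFECTIVE_GBPS = 17.28  # HBR2, 4 lanes, after 8b/10b encoding

    print(f"need ~{needed_gbps:.1f} Gbit/s of {DP12_EFFECTIVE_GBPS} Gbit/s available")
    ```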
     
    Last edited: Dec 7, 2013
  20. Spets

    Spets Guest

    Messages:
    3,500
    Likes Received:
    670
    GPU:
    RTX 4090
    Looking forward to the full review :D
    Don't think Fox2232 will be too happy though.
     
