Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Dec 6, 2013.
Hilbert, are these 120Hz panels? Going to sell my BenQ monitor and get one if so.
My S23A700D is so perfect, but G-Sync is a game changer. If these are 120Hz, then I'll have to pull the trigger.
The previews you guys will be seeing this week are the initial ASUS panels. All samples sent to press are prototype versions. Obviously, over time there will be IPS screens and higher-res panels, as there is no real limitation; I know GSYNC can support 4K resolutions. The previews you guys will be seeing are merely the introduction to the technology. So yeah, there will be WQHD (2560x1440) IPS LED panels, no doubt there.
And to answer one of the other questions in this thread. The monitor is actually a 144 Hz. So I have enabled it at that refresh rate.
And as an answer to another remark: running VSYNC off with GSYNC there isn't any screen tearing, as it is eliminated ... it's just really, really good.
Can't wait to see the review of the tech; I really hope it does what it claims and replaces the flawed VSync. I love my Asus VS248H. I personally don't like IPS panels, as their response times are just too slow; they might have better colour reproduction, but they can't handle fast motion. The 2ms my monitor has is too slow as well, but IPS is just worse. I will take less motion blurring over better colour reproduction any day, much like I will take a no-tearing game experience over better image quality and faster framerates. I will buy a new monitor just to have no tearing without the performance hit.
If 2ms is slow then it's not the response time that's contributing to the blur you see, it's the sample-and-hold effect. LightBoost solves that.
No tearing here on my BenQ, though of course I run with VSync 24/7... (well, at least from what I can see; I have turned VSync off a few times and it's a huge difference, on my screen anyway, so much that I can't game without it).
But on the Koreans... mmm, that's something I'd really be interested to see in the flesh.
eclap, when am I coming over? I'll bring you a bottle of whiskey...
mitzi76, a higher refresh rate means that there is less tearing to be seen. That's why you always read 120Hz users stating "no tearing here".
There *is* tearing in certain cases but it's much, much less. I was panning (in the most horrible of ways) around in Batman: Arkham City yesterday, FPS running over 110FPS (refresh rate = 110Hz) with a dip in one place to ~100FPS, and I was actively searching for tearing. I could see none. Panning like that would very likely introduce noticeable tearing @60Hz but I haven't tried 60Hz.
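To put rough numbers on why tearing fades at higher refresh rates, here's a back-of-envelope sketch. This is idealized arithmetic, not a display-engineering model, and the 1000 px/s panning speed is just an assumed test value:

```python
# Back-of-envelope tearing estimate, with VSync off and FPS matched to
# the refresh rate: the content offset across a tear line between two
# consecutive frames is roughly pan_speed / fps pixels, and each torn
# image is only on screen for 1 / refresh_hz seconds.

def tear_offset_px(pan_speed_px_s: float, fps: float) -> float:
    """Approximate pixel offset across a tear line between consecutive frames."""
    return pan_speed_px_s / fps

def on_screen_time_ms(refresh_hz: float) -> float:
    """How long one scanned-out (possibly torn) image stays visible."""
    return 1000.0 / refresh_hz

# Assumed panning speed of 1000 px/s; both the offset and the time the
# tear is visible shrink as refresh rate (and matching FPS) go up.
for hz in (60, 110, 120):
    print(f"{hz:>3} Hz: offset ~{tear_offset_px(1000, hz):.1f} px, "
          f"visible for ~{on_screen_time_ms(hz):.1f} ms")
```

So at 120Hz a tear is both half the size and on screen half as long as at 60Hz, which matches the "I was actively searching and couldn't see it" experience above.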
For me, the importance of using G-Sync lies in the advantages it has in smoothness at lower FPS, and in input lag at refresh rate = FPS, vs. 120 / 144Hz monitors, including the Korean ones.
Hilbert, do you know when the G-Sync modification kit will be released so those of us who already own this monitor can mod it?
Come on Tuesday, make it a whisky. Sainsbury are selling Aberlour 12yo for £20 lol, bargain. Honestly though, I run 110Hz in games; the tearing in BF4 is pretty much non-existent at that refresh rate, while it's horrible at 60Hz. I might just run games at 120Hz tbh, haven't tested much at 120Hz.
So, people would like to think otherwise, but the blur and loss of detail is there on ALL LCD/LED panels, even those that use LightBoost (in this respect LCD/LED will never compare to CRT); it's just that some people notice it a lot more than others. Give me your top-end, best LCD/LED and I will be able to point out blurring that is not there on any CRT, even on those precious LightBoost monitors.
This is like saying there is no tearing at 120Hz with no VSync, when there is. It's just that some PEOPLE notice it a lot more than others.
LightBoost with 1.4ms strobes can have less motion blur than some CRTs, whose phosphor decay is quite a bit longer than 1ms. Better low-persistence modes can break below 1ms, giving us motion blur less than CRTs.
Have you tried a LightBoost monitor in person? Or anything similar, such as a projector with low-persistence mode or those proprietary techs used by Samsung / BenQ / etc... generally before LightBoost became more common?
I doubt one pixel of motion blur would irritate anyone, and coupled with the advantages of LCD over CRT, that makes a pretty convincing argument for finally going LCD, even for those deeply affected by motion blur when coming from CRTs.
CRTs have other issues too, like the ghosting seen behind a pointer (trail) as the phosphors keep emitting some light even after they are considered to have "decayed".
And no, tearing at 120Hz is not about some people noticing it more than others. Tearing is so much reduced that even when you're looking for it, there's very little. You would be right in saying that there is tearing, but it's generally a non-issue at 120Hz.
Tearing is still there at 60Hz, 120Hz, etc.; some people notice it, others don't, and others don't care. Blur and loss of detail are there even with 1ms monitors and LightBoost; some notice it less, some don't notice it at all, others don't care. That doesn't mean it's not there. CRTs are king when it comes to how motion is handled. There are reasons why some gamers still have and use CRTs.
LCDs have their advantages, slim/light and better picture quality, especially with still or slow-moving images, but handling fast motion is not one of them.
This topic is not about CRT vs. LCD, it's about G-Sync.
I think you think you're Superman, basically :nerd:
But I digress, this topic is indeed about G-Sync.
Yet, G-Sync also coincides with a better low-persistence mode, starts out on TNs, and starts out with LCDs, so I bet the topic is pretty relevant to G-Sync.
Hi Hilbert, my most eager question is: does this new tech make seeking a more powerful video card less worthwhile?
What I mean is, if we all seek smoothness in video games, doesn't this make a higher-end graphics card overkill?
Let's say I can buy a decent GTX 660 or GTX 760, or another mid-range card, max out the settings on a game, hook that up with G-Sync, and have the experience of a higher-end model without spending the hard cash.
But in the end, isn't NVIDIA shooting itself in the foot?
I think I can answer that. 30/60/120/144FPS locked VSync vs. G-Sync, same smoothness, but G-Sync should have less input lag.
It's the dips in framerate that are perceived as "stutter"; G-Sync decreases the impact of those dips by still keeping things as smooth as the FPS that was dipped to allows. Meaning, when you dip to 37FPS, it feels like 37FPS would feel at 37Hz with VSync on, which is smoother than 37FPS at 60Hz for sure, due to the desynchronization.
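A little sketch of why that desynchronization feels worse, assuming a perfectly steady 27ms (~37FPS) render time and an idealized 60Hz VSync swap chain; this is just arithmetic on idealized timings, not how any real driver schedules flips:

```python
# Fixed-refresh VSync quantizes each flip to the next vblank, so a steady
# ~27 ms render time turns into an uneven mix of 16.7 and 33.3 ms on-screen
# times (judder), while a variable-refresh (G-Sync-style) display would
# simply show every frame for a uniform ~27 ms.

import math

def vsync_display_times_ms(render_ms: float, refresh_hz: float, frames: int):
    """On-screen duration of each frame under idealized fixed-refresh VSync."""
    interval = 1000.0 / refresh_hz
    times, ready, shown_at = [], 0.0, 0.0
    for _ in range(frames):
        ready += render_ms                             # frame finishes rendering
        flip = math.ceil(ready / interval) * interval  # wait for the next vblank
        times.append(flip - shown_at)                  # previous frame's screen time
        shown_at = flip
    return times

times = vsync_display_times_ms(render_ms=27.0, refresh_hz=60.0, frames=6)
print([round(t, 1) for t in times])  # → [33.3, 33.3, 16.7, 33.3, 33.3, 16.7]
print("G-Sync-style:", [27.0] * 6)   # uniform 27 ms per frame instead
```

Same average framerate, but the VSync case keeps lurching between two different frame durations, which is the "37FPS at 60Hz" stutter described above.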
Does anyone know if this will be subject to any incompatibility issues? Is it reliant on drivers, or is it completely hardware-driven? Because if it just works, period, then it's going to be orgasmic in any of the 1440p 120Hz monitors.
When will the kits be out?
It's a combo of both hardware and drivers.
Hey Hilbert. I was just informed that G-Sync needs DisplayPort to work correctly. Is there any truth to this?
did some digging and found this.
I thought the max refresh for DP was 60Hz?
Looking forward to the full review
Don't think Fox2232 will be too happy though.