On Friday NVIDIA announced G-Sync, and considering how little detail is available out there I wanted to write a quick follow-up on this new technology, as it really is a big announcement - a really big ... NVIDIA G-Sync explained (article)
Good article. I hope this becomes available for everything some year; otherwise it won't get mainstream attention, just like the 3D Vision stuff.
Nice article, but I think this part is not really true: 20fps doesn't mean that your panel will flicker at 20Hz. LCDs do not flicker. Their backlight does, but not like CRTs, which have a physical refresh. And the backlight is not tied to screen updates at all. Even if your video card delivers 3 frames per second, it will be a slideshow, but a perfect one. When a new frame arrives, it will be drawn in 5ms (or 2ms, or 1ms) - according to the monitor's specs.
This will be great when it's on IPS panels; otherwise it can go f*ck off like the rest of the proprietary Nvidia crap, no offence.
They also stated that the minimum variable refresh rate is 30Hz. Anything below that, they duplicate frames. And yep, obviously it will not flicker.
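A minimal sketch of how that frame-duplication floor might work (this is my guess at the logic, not NVIDIA's documented algorithm): below the panel's minimum variable refresh rate, each frame is simply shown an integer number of times so the effective refresh rate stays within the panel's range.

```python
def gsync_refresh(fps, min_hz=30, max_hz=144):
    """Return (panel_refresh_hz, repeats_per_frame) for a given game frame rate.

    Below min_hz, each frame is repeated enough times to keep the panel's
    refresh rate inside its variable range. Assumed logic for illustration
    only; the min_hz/max_hz values are just example panel limits.
    """
    if fps >= min_hz:
        # In range: panel refreshes once per frame, capped at max_hz.
        return min(fps, max_hz), 1
    repeats = 1
    while fps * repeats < min_hz:
        repeats += 1
    return fps * repeats, repeats

# e.g. at 20 fps the panel would refresh at 40 Hz, showing each frame twice;
# at 10 fps it would refresh at 30 Hz, showing each frame three times.
print(gsync_refresh(20))  # (40, 2)
print(gsync_refresh(10))  # (30, 3)
```

Since the panel still redraws at 30Hz or above, there is nothing to flicker even when the game runs well below that.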
They are saying this can be retrofitted to certain monitors; they give a diagram and everything for hard-wiring it in. Hmm, I could probably do it no bother, but do I want to force open my monitor to solder this module to it? My last monitor's OSD button broke/stuck and the monitor was a total nightmare to get into. Front bezels are usually locked in with really weak little pieces of plastic - I broke a few of them on the last monitor.
I said this in one other thread, but monitors should already just work this way. We shouldn't need to buy extra stuff.
This looks so awesome. It should improve the gaming experience a lot for everyone whose rig can't feed a v-synced monitor at a sustained 60fps with no drops - and even then the input lag can be nasty. It should also make gaming @ 4K much more comfortable. Now, when will we be able to get a 4K monitor with fallback to FullHD @ 120Hz, and G-Sync, at <$1.5k?
It does look awesome, but I just got a VG248QE a while back. Now I need to turn around and sell it if I want yet another proprietary Nvidia feature - or the feature in general? I take it you won't be able to add this piece of hardware to the panel; you'll have to buy a new one. Though IIRC I saw them showcasing it with the VG248QE.
What makes you think I care about Mantle? I just stated a fact and what needs to be there so I can buy it and make use of a great thing. And I hope this is a little more open than what I'm used to seeing Nvidia do in the past with their technology. At the end of the day this has nothing to do with Mantle. Sometimes fantards really do make me facepalm.
Spoken like a typical AMD owner. Much like with PhysX: AMD owners wanted it, then found out it's proprietary, and immediately they all seemed to have received talking points saying it does nothing, is a gimmick, and nV is greedy and should give it away for free. lol.
I recall Nvidia being in the news: they did offer PhysX to ATI/AMD in a licensing deal, but they point-blank refused. Who knows - if they had agreed, it would probably be available on the PS4/Xbone and implemented in lots of games by now. Them refusing all those years ago basically killed it from being widely adopted. What goes around comes around. Really think Nvidia will license Mantle? Nope, I can see a similar fate for it. I see the outcome for G-Sync being totally different, simply because it's something everyone wants... it will be a must-have feature.
As far as I remember, that talk about AMD being "offered" PhysX has been brought up a few times. Basically there were some terms that AMD didn't like, so they declined - or so it's rumored, at least; I don't think we'll ever get an official explanation. Similarly, I can fully see AMD wanting some terms for letting Nvidia use Mantle; they aren't going to just give it away to the competition. EDIT: That said, it would of course be awesome for us end-users if AMD and Nvidia could somehow get along and implement CUDA, PhysX, Mantle, G-Sync and whatever else (3D viewing?) together, but I doubt that will happen anytime soon. (I guess it's similar between AMD and Intel with their x86 / x64 stuff and extensions, but I don't have much insight into these things, so what would I know.)