NVIDIA G-Sync explained (article)

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, Oct 20, 2013.

  1. Hilbert Hagedoorn

    Hilbert Hagedoorn Don Vito Corleone Staff Member

    Messages:
    48,392
    Likes Received:
    18,564
    GPU:
    AMD | NVIDIA
    On Friday NVIDIA announced G-Sync, and considering the little details available out there I wanted to write a quick follow-up on this new technology, as it really is a big announcement - a really big ...

    NVIDIA G-Sync explained (article)
     
  2. Ryu5uzaku

    Ryu5uzaku Ancient Guru

    Messages:
    7,547
    Likes Received:
    608
    GPU:
    6800 XT
    Good article. I hope this becomes available for everything some year, else it won't get mainstream attention, just like the 3D Vision stuff.
     
  3. Spets

    Spets Guest

    Messages:
    3,500
    Likes Received:
    670
    GPU:
    RTX 4090
    Nice article Hilbert :)
     
  4. hasherr

    hasherr Guest

    Messages:
    6
    Likes Received:
    0
    GPU:
    GTX 770
    Nice article, but I think this part isn't really true:
    20 fps doesn't mean that your panel will flicker at 20 Hz. LCDs don't flicker :). Their backlight does, but not like CRTs, which have a physical refresh rate, and the backlight isn't related to screen updates at all.
    Even if your video card delivers 3 frames per second, it will be a slideshow, but a perfect one. When a new frame arrives, it will be drawn in 5 ms (or 2 ms, or 1 ms), according to the monitor's specs.
     

  5. Xendance

    Xendance Guest

    Messages:
    5,555
    Likes Received:
    12
    GPU:
    Nvidia Geforce 570
    Agreed, and the whole sentence doesn't make that much sense either. Is it missing a few commas or what? :D
     
  6. AC_Avatar100400

    AC_Avatar100400 Guest

    Messages:
    340
    Likes Received:
    0
    GPU:
    WC GTX 780
    This will be great when it's on IPS panels; otherwise it can go f*ck off like the rest of the proprietary Nvidia crap, no offence.
     
  7. Anarion

    Anarion Ancient Guru

    Messages:
    13,599
    Likes Received:
    386
    GPU:
    GeForce RTX 3060 Ti
    They also stated that the minimum variable refresh rate is 30 Hz. Anything below that and they will duplicate frames. And yep, obviously it will not flicker.
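    In pseudo-code, that frame-repeat behaviour might look something like this. The 30 Hz floor is the figure stated above; the repeat logic itself is just a guess at how such a module could keep the panel within its refresh range, not a description of the actual G-Sync hardware:

    ```python
    import math

    # Sketch: keep the panel's refresh rate at or above its minimum
    # by scanning out each frame more than once when the game runs slow.
    MIN_REFRESH_HZ = 30.0  # stated minimum; the rest is assumption

    def refresh_plan(fps: float):
        """Return (repeats_per_frame, effective_refresh_hz)."""
        if fps >= MIN_REFRESH_HZ:
            return 1, fps  # one panel refresh per new frame
        # Repeat each frame just enough times to stay at/above the floor.
        repeats = math.ceil(MIN_REFRESH_HZ / fps)
        return repeats, fps * repeats

    print(refresh_plan(60))  # (1, 60)
    print(refresh_plan(20))  # (2, 40): each frame scanned out twice
    ```

    So at 20 fps the panel would still refresh 40 times a second, which is why it doesn't flicker.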
     
  8. SLI-756

    SLI-756 Guest

    Messages:
    7,604
    Likes Received:
    0
    GPU:
    760 SLI 4gb 1215/ 6800
    You'll have your Mantle teat to suck on soon.
     
  9. rflair

    rflair Don Coleus Staff Member

    Messages:
    4,854
    Likes Received:
    1,725
    GPU:
    5700XT
    Where was this shown?
     
  10. k1net1cs

    k1net1cs Ancient Guru

    Messages:
    3,783
    Likes Received:
    0
    GPU:
    Radeon HD 5650m (550/800)
    Damn...you just broke my bullsh!t-o-meter.
     

  11. Reddoguk

    Reddoguk Ancient Guru

    Messages:
    2,660
    Likes Received:
    593
    GPU:
    RTX3090 GB GamingOC
    They're saying this can be retrofitted to certain monitors.
    They give a diagram and everything to hard-wire it to monitors.

    Hmm, I could probably do it no bother, but do I want to force open my monitor to solder this module to it?

    My last monitor's OSD button broke/stuck and the monitor was a total nightmare to get into. Front bezels are usually locked in with really weak small pieces of plastic.
    I broke a few of them on the last monitor.
     
  12. Anarion

    Anarion Ancient Guru

    Messages:
    13,599
    Likes Received:
    386
    GPU:
    GeForce RTX 3060 Ti
    Anandtech.
     
  13. Lowki

    Lowki Master Guru

    Messages:
    631
    Likes Received:
    14
    GPU:
    RX 7800 Xt
    I said this in one other thread, but monitors should already just work this way. We shouldn't need to buy extra stuff.
     
  14. Xendance

    Xendance Guest

    Messages:
    5,555
    Likes Received:
    12
    GPU:
    Nvidia Geforce 570
    Well, they don't. ;)
     
  15. Ven0m

    Ven0m Ancient Guru

    Messages:
    1,851
    Likes Received:
    31
    GPU:
    RTX 3080
    This looks so awesome. It should improve the gaming experience a lot for everyone whose rig can't sustain filling a v-synced monitor at 60 fps with no drops. Even then the input lag could be nasty. It should also make gaming at 4K much more comfortable.

    Now, when will we be able to get a 4K monitor with fallback to Full HD @ 120 Hz, and G-Sync, for under $1.5k?
     

  16. Blackops_2

    Blackops_2 Guest

    Messages:
    319
    Likes Received:
    0
    GPU:
    EVGA 780 Classified
    It does look awesome, but I just got a VG248QE a while back. Do I now need to turn around and sell it if I want yet another proprietary Nvidia feature, or the feature in general? I take it you won't be able to add this piece of hardware to a panel yourself; you'll have to buy a new one. Although IIRC I saw them showcasing it with the VG248QE.
     
  17. AC_Avatar100400

    AC_Avatar100400 Guest

    Messages:
    340
    Likes Received:
    0
    GPU:
    WC GTX 780
    What makes you think I care about Mantle? I just stated a fact and
    what needs to be there before I'd buy this and make use of a great thing.
    And I hope this is a little more open than what I'm used to seeing Nvidia do
    in the past with their technology.

    At the end of the day this has nothing to do with Mantle. Sometimes fantards really do make me facepalm.
     
    Last edited: Oct 21, 2013
  18. Andrew LB

    Andrew LB Maha Guru

    Messages:
    1,251
    Likes Received:
    232
    GPU:
    EVGA GTX 1080@2,025
    Spoken like a typical AMD owner. Much like with PhysX: AMD owners wanted it, then found out it's proprietary, and immediately they all seemed to receive talking points saying it does nothing, is a gimmick, and nVidia is greedy and should give it away for free. lol.
     
  19. k3vst3r

    k3vst3r Ancient Guru

    Messages:
    3,702
    Likes Received:
    177
    GPU:
    KP3090
    I recall Nvidia being in the news: they did offer PhysX to ATI/AMD in a licensing deal, but they point-blank refused. Who knows, if they had agreed it would probably have been available in the PS4/Xbone and implemented in lots of games by now. Them refusing all those years ago basically killed it from being widely adopted.

    What goes around comes around. Really think Nvidia will license Mantle? Nope, I can see a similar fate for it.

    I can see the outcome for G-Sync being totally different, simply because it's something everyone wants... it will be a must-have feature.
     
  20. JonasBeckman

    JonasBeckman Ancient Guru

    Messages:
    17,564
    Likes Received:
    2,961
    GPU:
    XFX 7900XTX M'310
    As far as I remember, that talk about AMD being "offered" PhysX has been brought up a few times; basically there were some terms that AMD didn't like, so they declined, or so it's rumored at least. I don't think we'll ever get an official explanation. :)

    Similarly, I can fully see AMD wanting some terms for letting Nvidia use Mantle; they aren't going to just give it away to the competition.

    EDIT: That said, it would of course be awesome for us end-users if AMD and Nvidia could somehow get along and implement CUDA, PhysX, Mantle, G-Sync and whatever else (3D viewing?) together, but I doubt that will happen anytime soon.
    (I guess it's similar between AMD and Intel with their x86/x64 stuff and extensions, but I don't have much insight into these things, so what would I know.)
     
    Last edited: Oct 21, 2013