NVIDIA: G-SYNC Certification Runs into 94% failure rates

Discussion in 'Frontpage news' started by Hilbert Hagedoorn, May 30, 2019.

  1. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
    I get that, but both of them add features and have their own requirements on top of Adaptive Sync. For example, you can have an Adaptive Sync monitor without LFC (low framerate compensation) - it's still Adaptive Sync, but it won't qualify as G-Sync Compatible.
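    Rough sketch of what LFC does, with made-up numbers (the 48-144 Hz window and the function are just illustrative, not any vendor's actual code; LFC generally needs the max refresh to be at least double the min):

    ```python
    import math

    def lfc_refresh(frame_rate_hz, vrr_min_hz=48, vrr_max_hz=144):
        """Refresh rate the panel is driven at for a given game frame rate."""
        if frame_rate_hz >= vrr_min_hz:
            # Inside the VRR window: refresh simply tracks the frame rate.
            return min(frame_rate_hz, vrr_max_hz)
        # Below the window: repeat each frame enough times to land back inside it.
        repeats = math.ceil(vrr_min_hz / frame_rate_hz)
        return min(frame_rate_hz * repeats, vrr_max_hz)

    print(lfc_refresh(30))   # 30 fps -> each frame shown twice -> 60 Hz
    print(lfc_refresh(20))   # 20 fps -> each frame shown three times -> 60 Hz
    print(lfc_refresh(90))   # inside the range -> 90 Hz
    ```

    Without LFC, a monitor in that situation just falls out of its sync range and you're back to tearing or vsync stutter, which is why Nvidia makes it a requirement for the Compatible badge.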
     
  2. Michal Turlik 21

    Michal Turlik 21 Active Member

    Messages:
    97
    Likes Received:
    32
    GPU:
    Geforce RTX 2080Ti
    Just as Denial said, it is not exactly the same... If I am not mistaken, VESA Adaptive Sync can be described as a method for dynamically synchronizing the vertical refresh, while also allowing further refresh-rate synchronization schemes to be built on top of it. The VESA standard can be seen as a set of primitives a vendor can use to implement its own, more advanced refresh-rate synchronization. That is where AMD's VRR (variable refresh rate) implementation comes into play, which is part of AMD's proprietary FreeSync standard.
    Technically, a monitor that is VESA Adaptive Sync capable is not necessarily AMD FreeSync capable, due to the lack of the proprietary "extension".
    Take what I have written with a grain of salt; I am not 100% sure it is correct.
    The good news, however, is that NVIDIA does support the VESA standard - I just never checked it :)
    https://www.geforce.com/hardware/technology/adaptive-vsync/technology
     
    Last edited: May 30, 2019
  3. gerardfraser

    gerardfraser Guest

    Messages:
    3,343
    Likes Received:
    764
    GPU:
    R9 290 Crossfire
    Nvidia is fooling people and doing a good job of it - top-class propaganda. Soon there will be no FreeSync and only G-Sync, LOL. Anyway, I am glad Nvidia finally accepted Adaptive Sync. Good job for doing so. Trying to discredit FreeSync monitors when they can actually work well with Nvidia is, well, typical Nvidia. It does not matter to me, because I made an informed decision based on experience with both FreeSync and G-Sync monitors.

    For me, a FreeSync monitor on an Nvidia card is a better experience than a G-Sync monitor on an Nvidia card. Simple as that.
     
  4. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,029
    Likes Received:
    4,403
    GPU:
    Asrock 7700XT
    That is a shockingly high number of failures, but I think it's good that Nvidia is so picky about certification. At this rate, the only reason to pay the high premium for G-Sync is knowing you're getting the best possible experience.
    However, I think it'd be worth it for Nvidia to have three separate certification tiers. So 94% of displays might fail for a gold rating, but a 600-nit display should still qualify for a bronze.
     

  5. ETAxDOA

    ETAxDOA Member

    Messages:
    23
    Likes Received:
    6
    GPU:
    GTX 1080 SLI
    Who cares... I forked over $$$ for the Nvidia G-Sync/RTX experience, and nearly three months after buying the card I finally got a replacement for a faulty card, only to find the Metro Exodus game code is no longer valid. Nvidia gave less than zero fcuks that they'd enticed me in with faulty cards and offers they won't honour... I can accept the practicality of the codes being expired/invalid, but the total disregard for the consumer experience has pushed me across the red/green line towards a big FU Nvidia.
     
    BlackZero likes this.
  6. Michal Turlik 21

    Michal Turlik 21 Active Member

    Messages:
    97
    Likes Received:
    32
    GPU:
    Geforce RTX 2080Ti
    There have been a few series in NVIDIA's whole lineup that could be completely skipped; one was the 7xx series, and the latest is the RTX 20xx. Really sorry about your experience. I was also thinking of throwing all that money at an RTX 2080 Ti, but after a few investigations, some consideration, and plenty of pondering, I decided to wait for Intel's entry instead, even if I am not inclined to buy Intel stuff anymore. By that time, I am almost sure, serious GPUs will magically appear.
     
    ETAxDOA likes this.
  7. Luc

    Luc Active Member

    Messages:
    94
    Likes Received:
    57
    GPU:
    RX 480 | Gt 710
    The G-Sync module does only one thing, and it costs consumers a premium equal to a small computer: APU 50€, motherboard 50€, 2x4 GB DDR4 RAM 50€, 250 GB NVMe M.2 disk 50€...

    People should have learned something about Nvidia's marketing machine after the GTX 970 fiasco, or the GPP nonsense...
     
  8. RealNC

    RealNC Ancient Guru

    Messages:
    5,134
    Likes Received:
    3,398
    GPU:
    4070 Ti Super
    They would never publish that, since it would damage their partners if products they sell ended up on a shitlist.
     
  9. gerardfraser

    gerardfraser Guest

    Messages:
    3,343
    Likes Received:
    764
    GPU:
    R9 290 Crossfire
  10. ruthan

    ruthan Master Guru

    Messages:
    573
    Likes Received:
    106
    GPU:
    G1070 MSI Gaming
    It would be nice if Nvidia could release some testing utility. I know that not everything can be tested in software alone, but it would be a good start. Otherwise, I'm glad that someone is really pushing display quality.
     

  11. sinnedone

    sinnedone Guest

    Messages:
    1
    Likes Received:
    0
    GPU:
    XFX HD 6870 1GB XFire
    What everyone is forgetting is that Intel is supporting adaptive sync as well.

    Nvidia is simply trying to stain the technology to paint their ecosystem in a better light.

    Nice try, but I'll definitely be voting with my wallet.

    To those saying adaptive sync is just a ploy to get you to purchase a new monitor: you definitely need to experience it. I would put it up there with a GPU upgrade.

    I'm going to be honest: if you have a 144 Hz+ monitor and have your settings dialed in such that your lowest frame rates are still in the triple digits, then it won't be as noticeable.

    BUT if the games you play dip toward the lower end of the FreeSync range, you will get very smooth gameplay. Like vsync on, but without the input delay.
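    Quick toy comparison of why that dip matters (my own numbers, not from any review): a fixed 60 Hz panel with vsync versus a VRR panel, with the game rendering at a steady 45 fps.

    ```python
    import math

    refresh_interval_ms = 1000 / 60   # 16.7 ms between scanouts at a fixed 60 Hz
    frame_time_ms = 1000 / 45         # 22.2 ms to render each frame at 45 fps

    # Double-buffered vsync: each frame is held for a whole number of refresh
    # intervals, so a 22.2 ms frame ends up occupying two scanouts.
    vsync_hold_ms = math.ceil(frame_time_ms / refresh_interval_ms) * refresh_interval_ms
    print(round(vsync_hold_ms, 1))    # 33.3 -> effectively 30 fps pacing (judder)

    # VRR: the panel simply refreshes whenever the frame is ready.
    print(round(frame_time_ms, 1))    # 22.2 -> even pacing at 45 Hz
    ```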
     
  12. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
    In an alternate universe where it was 2017 again and Nvidia was going to support Adaptive Sync displays after G-Sync already launched, what should they have done differently from what they've done here?
     
  13. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,029
    Likes Received:
    4,403
    GPU:
    Asrock 7700XT
    They could've worked together with VESA for Adaptive Sync from the very beginning, rather than take their usual route of exclusive proprietary tech.
     
  14. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
    Okay, but by 2017 AMD had already done that for them. So now that Nvidia wants to support VESA's sync standard, what do they do differently?
     
  15. Reddoguk

    Reddoguk Ancient Guru

    Messages:
    2,666
    Likes Received:
    597
    GPU:
    RTX3090 GB GamingOC
    I think 144 Hz/144 fps is still very difficult for today's graphics cards to reach in most recent AAA games.

    That's why I went for a 75 Hz monitor: I can do 75 Hz/75 fps in most games, and somehow it's so much better than 60 Hz, but reaching double that would need a beast of a system.

    Tell me, those of you with a 2070/2080: can you reach 144 fps in, say, Rage 2?
     

  16. Dribble

    Dribble Master Guru

    Messages:
    370
    Likes Received:
    142
    GPU:
    Geforce 1070
    I'd go the other way - the current spec can stay the minimum; there's no point lowering quality, and I can't see why all future displays couldn't support the Nvidia spec. However, if you want HDR etc., then there should be higher ratings for that. Get rid of the original G-Sync altogether, as it's dead now. I mean, now that you've got Nvidia supporting FreeSync, who will actually buy a G-Sync display? Why would you buy a monitor that locks you into one GPU vendor?
     
  17. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,029
    Likes Received:
    4,403
    GPU:
    Asrock 7700XT
    Not go against the grain by creating their own proprietary version of something that does pretty much the same thing? I'm not really sure what you're asking here or implying.

    To put it in a different way:
    Adaptive Sync and G-Sync mostly accomplish the same goal. They're similar enough that Nvidia eventually ended up supporting Free/Adaptive Sync with good results, showing that they never needed to make G-Sync in the first place. If they had cooperated with VESA to perfect Adaptive Sync from the very beginning, everyone would have benefited, including Nvidia. But now that Free/Adaptive Sync is actually gaining some traction, people are questioning "why spend extra for G-Sync?", and pretty much the only answer I can come up with is "you get a certified premium experience", which other displays can't guarantee.
     
  18. Denial

    Denial Ancient Guru

    Messages:
    14,207
    Likes Received:
    4,121
    GPU:
    EVGA RTX 3080
    I'm just asking how they should have entered the adaptive sync market without people thinking they were hijacking it. You are saying they should never have made G-Sync proprietary in the first place... but they did, and that was done. After that happened, AMD was like "hey, we can do that without the module and we're going to make it a standard" - so what should Nvidia's answer have been? Keep in mind that by this point G-Sync as a brand was already established and had clearly defined rules in terms of supported frequencies, LFC, behavior above the sync range, etc.

    People, for example the poster I quoted, keep saying things like "Nvidia is trying to stain the technology" because they have certifications and are using their brand name... what were they supposed to do, given that they had already invented G-Sync as a proprietary technology?
     
  19. MonstroMart

    MonstroMart Maha Guru

    Messages:
    1,397
    Likes Received:
    878
    GPU:
    RX 6800 Red Dragon
    Still not working on my MSI Optix MAG27CQ
     
  20. schmidtbag

    schmidtbag Ancient Guru

    Messages:
    8,029
    Likes Received:
    4,403
    GPU:
    Asrock 7700XT
    Adaptive Sync is an industry standard; anyone accusing Nvidia of hijacking it would be a moron. If anything, AMD is the one who hijacked it, with their own FreeSync branding. Since Intel already supported "vanilla" Adaptive Sync, Nvidia could have done the same, at which point it wouldn't make sense to accuse them of anything negative.
    I'm not saying G-Sync shouldn't have been proprietary; I'm saying it shouldn't have existed, period (or, if it were to exist, it should have been a fork of Adaptive Sync, in the same way FreeSync is).
    We're talking about hypothetical situations here, so I don't really get why you're drawing the line at the point where G-Sync was already released. In other words, if my point all along is that G-Sync never needed to exist and Nvidia should have worked with VESA from the beginning (as in, the beginning of the technology that lets displays dynamically adapt to frame rate), then asking what Nvidia's answer should have been after G-Sync was already established doesn't really make sense to me in this context. That being said:
    You need to back up the timeline; the issue at hand isn't from 2017. I don't know exactly what was in development first or in what year, but for argument's sake, let's say Nvidia was developing G-Sync before Adaptive Sync and started development in 2014. Nvidia chose to keep their development all to themselves. Considering they were footing the bill, it obviously makes sense that they would do that. However, they didn't have to. Before they even started working on G-Sync, Nvidia could have proposed the idea to VESA, where they, along with Intel, AMD, Qualcomm, and probably other big names like Microsoft or Apple, would all contribute toward the creation of Adaptive Sync. It's worth pointing out that the creation of Adaptive Sync was inevitable. So Nvidia needlessly spent all this time and money working on a technology that pretty much everyone else worked together to replicate.

    This is what I meant by "working with VESA from the beginning". If Nvidia had gone to VESA before just going solo, they wouldn't be in this situation where they've got a technology that doesn't appear to be paying for itself and is now heading toward irrelevancy.
     
    Last edited: May 30, 2019
