Discussion in 'Frontpage news' started by Hilbert Hagedoorn, May 30, 2019.
Are you really such a fanboy that you would make such a blatantly false statement?
Nvidia developed G-Sync first. AMD looked at it (literally at the event it was announced at - I remember the articles where tech sites were asking AMD what they thought of Nvidia's announcement), said "we don't need a module to do this," spent a year developing Freesync, and launched it minus a few key features that it's added over time. During that year, they proposed adding Adaptive Sync to VESA's DisplayPort 1.2a standard - while 1.2a was released in 2013, the Adaptive Sync option for it didn't come until over a year later, in May 2014, a year after G-Sync. It wasn't in development prior to G-Sync and it was entirely added by AMD. You say the creation of Adaptive Sync was inevitable, but its creation spawned from G-Sync and Freesync being out. VBLANK has been a thing far longer than G-Sync, but using it to sync the display and the framerate wasn't. I guess theoretically someone would have come up with it eventually, but no one was talking about it prior to Nvidia.
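For anyone curious what "using VBLANK to sync the display and the framerate" actually means mechanically, here's a toy Python sketch. This is my own simplification, not Nvidia's or AMD's actual implementation - the 30-144Hz window and the frame times are made up for illustration - but it shows the core idea: a fixed-refresh panel makes a finished frame wait for the next tick, while a variable-refresh panel stretches the blanking interval so the refresh fires when the frame is ready.

```python
# Toy sketch, not any vendor's implementation: a variable refresh display
# stretches the vertical blanking interval (VBLANK) so each refresh starts
# when a new frame is ready, instead of on a fixed clock tick.
import math

def fixed_refresh_times(frame_done_times, period=1/60):
    """Each finished frame waits for the next fixed 60Hz refresh tick."""
    return [math.ceil(t / period) * period for t in frame_done_times]

def adaptive_refresh_times(frame_done_times, min_period=1/144, max_period=1/30):
    """The display refreshes as soon as a frame is ready, clamped to the
    panel's supported range (a hypothetical 30-144Hz 'VRR window' here)."""
    shown_times, last = [], 0.0
    for t in frame_done_times:
        earliest = last + min_period  # panel can't refresh faster than 144Hz
        latest = last + max_period    # ...or wait longer than a 30Hz interval
        shown = min(max(t, earliest), latest)
        shown_times.append(shown)
        last = shown
    return shown_times

# Frames finishing at an uneven ~75 fps pace:
done = [0.013, 0.027, 0.041, 0.056, 0.068]
wait_fixed = sum(f - d for f, d in zip(fixed_refresh_times(done), done))
wait_vrr = sum(f - d for f, d in zip(adaptive_refresh_times(done), done))
print(wait_vrr < wait_fixed)  # True: adaptive sync adds less display-side wait
```

Same principle either way you build it - G-Sync just put that timing logic in a dedicated scaler module, while Adaptive Sync bakes it into the DisplayPort protocol.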
Regardless, Nvidia does what Nvidia does best and made the tech proprietary. I agree it would have been better if they didn't, but they did. Now the VESA standard is out and Nvidia has to switch. My question, to the specific person I quoted (because he mentioned the hijacking part), was simply: what could they have done better than they have now, after they made the decision to support Adaptive Sync? The only thing I wish they'd done is come out with support earlier... other than that, I think branding it G-Sync along with having a certification program for it are both fine.
None of that changes my point (I figured G-Sync came out first anyway, but thanks for the confirmation). Adaptive Sync's creation was inevitable for 2 reasons:
1. Because Nvidia could have pitched it to VESA so it could be an industry standard (rather than going solo with G-Sync).
2. Because Nvidia wanted to keep G-Sync to themselves and competitors (like AMD) weren't about to be left behind.
So, because Adaptive Sync was going to exist no matter what, Nvidia could have just saved themselves the time and money by working together with VESA to create it, rather than create G-Sync by themselves and wait for the rest of the industry to come up with a response. Nvidia would be naive if they thought they were going to be the only ones with this type of technology.
My response to you was the answer to that question: the better option for Nvidia would have been to never go solo from the moment of G-Sync's inception, and instead propose the idea to VESA. If you're going to ask about a hypothetical alternative timeline, why does it have to begin after G-Sync was released? I still don't fully understand why that's the moment where you draw the line.
However, if we are to ask what Nvidia could have done differently in 2017, I would agree there isn't anything - they had already committed to their decision; a decision I personally would say was a mistake.
I agree with this.
It doesn't mean that it doesn't work.
Please make yourself more clear.
Am I an Nvidia fanboy or an AMD fanboy? I own lots of gear from both companies. I also own lots of Intel stuff, so you can include Intel too if it suits your agenda.
Please do tell me which company I'm fanboying over first, or are you just calling me a blatant liar because my opinions and experience with both G-Sync and Freesync don't matter? Or does calling statements you don't agree with "blatantly false" make you the fanboy?
EDIT: I just reread your comment. Thanks, I'm now an AMD fanboy, so I was saving some of my USD/Canadian money for some old school gear.
So I dug into my USD stash
Then I dug into my Canadian stash
Then I got the payoff with all my fanboy shill money
I don't recall saying or even implying that it doesn't work, just that Freesync and Adaptive Sync aren't identical. Freesync is an extension of Adaptive Sync and offers additional features - G-Sync "Compatible" is the same way. That's it.
Because the person I quoted is claiming that Nvidia is staining the technology - maybe I'm reading into his post a little too much, but presumably he finds that the G-Sync certification program is intentionally putting down Freesync/Adaptive Sync displays in order to bolster Nvidia's proprietary G-Sync solution. I'm asking how they could have done this differently. Should they not have the certification program? Should they not label the G-Sync "Compatible" monitors with G-Sync at all, to differentiate them more? Would this include loosening the standards and allowing monitors with really bad ranges to be labeled whatever it is they come up with? My question was specific to what he was claiming and not really intended to ask whether Nvidia should have made G-Sync proprietary to begin with - which I agree they shouldn't have.
Ahh ok, I see what you mean now. Yeah, I agree with you there.
Gsync, like all proprietary nVidia "technologies," exists for one purpose only--to turn a profit for nVidia. AMD comes along and offers a similar technology that does the same thing but is open-sourced instead of proprietary, courtesy of AMD--meaning that anyone can adopt it commercially without charge--even nVidia--and so of course nVidia bucks the trend because it's Gsync--not AMD's open-sourced Freesync--that earns nVidia a profit. Notice that no one ever claims that nVidia's expensive hardware Gsync implementations are superior to AMD's open-standard FS1&2--not even nVidia makes that claim! But nVidia can't make the claim, because, of course, Gsync isn't superior to Freesync at all, as both are simply different approaches to doing the very same thing.
Instead, but typically, and unfortunately for us, nVidia undertakes a 'Baghdad-Bob' type of ad campaign built around something grandly entitled "nVidia monitor certification"--a "certification" that has as its singular goal to "demonstrate" why people should purchase hardware Gsync BECAUSE "...94% of all the monitors on earth fail nVidia's certification" for open-source software Gsync support (Freesync1/2)! --While interestingly enough, that same 94% (or better) might easily pass, apparently, an AMD Freesync1/2 "certification"... nVidia's approach, called Gsync, consists of physical, proprietary monitor circuitry that is available for a price and that is permanent for the life of the monitor, meaning that although your monitor may surpass the longevity of your current Gsync hardware and/or your current GPU, or both, alas, your next GPU must be an nVidia GPU and it must support whatever new version of Gsync hardware exists, and etc. and etc. ad infinitum! AMD's approach, called Freesync 1 or 2, is open source, requires no proprietary physical circuitry and is software-upgradable, etc. and etc. Somewhat comically, however, nVidia seems to be stretching to its limits to discover new and ever-more impractical and expensive ways in which to gouge its apparently unsophisticated customers *bur-r-r-rrrrrp!* (It is not clear whether nVidia merely thinks its customers actually are unsophisticated or whether they are indeed unsophisticated, however--[copied from internal nVidia corporation memo #22566, 5/30/2019--for internal consumption only].)
So, what's happening? Why all of this weirdness from nVidia about proprietary "monitor certifications" from on high...?
Seems pretty simple to understand. nVidia is comparing its proprietary hardware Gsync specs (that have no bearing whatsoever on the quality of a hardware Gsync display versus a software Freesync1/2, "software Gsync," display) to the general specification list of monitors that have no hardware Gsync circuitry onboard! It is the display quality from a monitor that counts in the end, certainly, not any proprietary specifications compliance... The sole reason for nVidia's insistence on specifications compliance instead of image quality appears to be so that nVidia can find a pretext for "failing" a monitor! So, yes, these monitors may fail nVidia's contrived spec tests--but that is not the same thing as saying that these monitors are not capable of supporting Freesync1/2 to the extent that they look just as good if not better than nVidia's costly proprietary hardware Gsync implementation would look in the same monitor!...Jeez. Ex:
Monitor #246 in today's test batch has failed compliance. Please remove it from the network line, thank you, and mark accordingly. Monitor #246 has output compliance instruction #12Ae when the contiguity factor of the non-specific instructional underwrite should be #11.75Aef. Although this error normally does not affect screen and/or pixel output, and is not germane to function, it does however signal that Monitor #246 has failed compliance-regulation testing.
So, there are a couple of points that should be absorbed here, imo:
*Failing a monitor specifications test contrived by nVidia as a non-Image-Quality test to measure proprietary hardware Gsync support tells us nothing about the quality of nVidia's hardware Gsync implementation for that monitor, compared to a Freesync 1/2 implementation. We learn the salient but wholly unimportant fact that a given monitor fails the nVidia hardware Gsync certification compliance test, but are told nothing at all about the monitor's image quality! nVidia seemingly wishes to focus the attention of its prospective customers on specification compliance results as opposed to any actual image quality differences...
*If nVidia were to actually show how well the open-source Freesync1/2 works, all without forcing the customer to buy expensive proprietary "solutions" like Gsync, well, soon it would become apparent to even the slowest among us that the Gsync hardware approach to adaptive sync is the far inferior of the two.
I mean sure, if you are fine with that then good for you.
But you have no idea what you're missing out on with high refresh screens.
I don't know what you mean by early adopters - if you mean G-Sync, whatever, okay I guess - but anyone using their PC for more than Minesweeper and reading email should have a high refresh rate screen if they have more than a potato PC.
Thanks for the reply.
You always write the most contrived nonsense... you literally "burped" in your post, who does that?
Why would Nvidia certify a G-Sync Compatible display for image quality? The entire point is that they are guaranteeing adaptive refresh to work - not that it meets some quality metric. There are people who have reported issues on non-certified monitors when force-enabling it - which you can do regardless of whether it's certified or not, and test it yourself. Also, they are showing how well Freesync works - every single monitor that is certified works fine across brands. Done, shown. I know that if I buy a monitor on that list, it's going to work on Nvidia and AMD and that it's not going to have any problems on either.
All I have to say about this is:
People have brains, so use them.
Don't listen to what a company tells you; use your own head.
Yeah I understand that. Prior to upgrading recently I just couldn't see how much better a gaming monitor could be over a decent Dell Ultrasharp I was using.
How wrong I was. My new monitor is far far superior in every way. The black level, less bleed, less IPS glow, fabulous colours. And games like Doom in high refresh rate are simply amazing!
Good execution. I give it 3.5/4
You've also got to remember that in the early days the Nvidia hardware was required - none of the monitors could do it. Nvidia took a monitor and built the scaler themselves to make it work, then released it. If Nvidia hadn't done that, the monitor makers wouldn't have done it themselves. It took years - first there were only Nvidia's G-Sync scalers, then AMD put in a spec for Freesync, but we didn't instantly end up with amazing Freesync displays; all the early ones sucked. Then, given a few years and a fair amount of copying Nvidia (all the early decent Freesync displays were G-Sync displays too), we've got to where we are today.
Same with 120Hz monitors - there were none until Nvidia invented 3D Vision and told monitor makers they needed 120Hz monitors to make it work. 3D Vision has now died, but the whole high refresh rate, low latency monitor boom that's now so key to fast-paced gaming was kick-started by it. Before then, despite there having been 120Hz TVs for years, monitor makers hadn't lifted a finger to transfer that to the PC market.
AMD basically did the same thing with multi-monitor support and Eyefinity - it existed in the pro world, but AMD pushed it into the everyday PC and gaming world.
Anyway, we've got to thank Nvidia and AMD for where we are today - multi-monitor, high refresh rate, low latency, and variable sync. No point trolling them for investing in future technologies and trying to make money off them. It's the same with all tech - the company inventing the tech only does it to make money, so to pay for that investment they charge the earth, but eventually over time the costs drop. However, without them putting the upfront money in to invent something new, we wouldn't have it today at all.
You are right, but it would never feel smooth to someone like me. You can see the difference in mouse movement between 60 and 90Hz.
Of course. I would never give up my 144Hz GSYNC screen either. But if someone thinks this is all BS, let him suffer from a worse experience.
Some posts here, damn you need to calm down.
No wonder many monitors are failing certification - tons of Freesync monitors are simply bad.
Especially the early ones. Many Freesync monitors have high input lag, limited working ranges, and simply way more blur. I've tried tons of high refresh rate monitors, and G-Sync monitors are pretty much always premium.
Freesync 2 tries to address this.
Meh, I've actually seen better trolling, tbh. Trolling nonetheless. Keep it up, please.