I sold a lot of these monitors from Asus and I can tell you there's no tearing as long as your fps doesn't go over 144. But games like Ghost Recon Online, which are capped at 60 fps, do have tearing because the frame rate is capped! With V-sync off it looks as bad as in the presentation.
No, the trick to real smoothness is to have motion blur exactly as long as the time between two frames; then it feels as smooth as possible. Remember Split Second, the racing game with destructible maps? It was a console port, capped at 30 fps, with motion blur exactly like you would get from a camera with a 33 ms shutter. That game was very impressive in the smoothness department. Games like Crysis set motion blur over a very long time interval hoping to give a real-life feeling, but they fail.

In general, as I play CS at 120 Hz I can see separate frames as I turn around, because the difference between them is too big. That proves there is room for 240 Hz+ screens in the gaming industry. But if CS had motion blur with a length of 1/120 s = 8.33 ms, I would effectively be unable to see distinct frames. A successful game should take the time between frames and apply motion blur at exactly that level, because no V-sync or G-Sync can give you a feeling of smoothness in a 40 fps game without motion blur. So while nVidia probably showed very good tearless frame pacing, they made sure to use software rendering techniques which conceal the low-fps effect of seeing distinct images for too long. The reality is that no matter what V/G-sync you use at 40 fps, you end up with a worse experience than 120fps@120Hz or 144fps@144Hz.
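The rule being argued here, "blur length = time between two frames", is just 1/fps. A minimal sketch of that arithmetic (the function name is my own, purely illustrative) reproduces both numbers from the comment:

```python
def frame_interval_ms(refresh_hz: float) -> float:
    """Time between two consecutive frames, in milliseconds.

    Per the argument above, this is also the motion blur
    (camera shutter) duration that would hide discrete frames.
    """
    return 1000.0 / refresh_hz

# Split Second's 30 fps console cap -> ~33.3 ms shutter
print(round(frame_interval_ms(30), 1))   # 33.3

# A 120 Hz screen -> ~8.33 ms of blur per frame
print(round(frame_interval_ms(120), 2))  # 8.33
```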
Well, I'll be looking forward to Hilbert's review when he gets a unit in December (hopefully). I don't think they fooled people; I think it works.
This will hopefully at least make AMD get more serious with their frame pacing tech, or come up with something better. For now, capping fps at ~60 and forcing V-sync on works perfectly fine, as long as fps doesn't drop below 60.
It needs an open standard. I never support exclusive stuff; I don't even buy PhysX games, and I'm an Nvidia owner.
Open standards nearly always fail though, and Nvidia will likely see this as another huge money maker. I'm a TV gamer, however; not a chance in hell I would go back to a small monitor. It will be interesting to see how this technology grows, though.
What happens when your fps drops below 60? Just curious, because when mine drops it stutters and skips frames like crazy; if it's not a solid 60 fps, most games are unplayable for me. I feel I should have saved my money for a PS4...
A $300 monitor? No thanks. My card can't even run BF3/4 at a stable 60 fps (I'm using a cheap $100 AOC monitor).
Huge is an understatement; this has the potential to make them extremely rich. Imagine this being licensed, so Samsung can put one in every Samsung S6. Imagine Apple licensing it; imagine it being built into Tegra 6 devices. Now imagine if they could secure licensing for all 4K screens. Getting this tech into everything with a screen, all that licensing cash. They could even license it to AMD eventually :infinity: Also, because no coding is required, it doesn't matter what OS you run; it works across everything. They could in theory put the tech in a PS4 Slim or Xbone Slim later on, paired with televisions that have the chip. This could be a real winner for them.
So shortly we will have NV bundles to pick from: GTX 780 + G-Sync monitor [TN] + 2 free games, a great deal at only $1200.00. lol
Lol, you mean like PhysX? Yeah, that's been great for Nvidia... just look at all the games using GPU PhysX... there must be at least 20? And only after, what, 10 years? Success! <sarcasm> Kinda like when they made SLI proprietary... they practically have to fight off the customers. Want another example of a proprietary/IP fail? How about USB vs. Thunderbolt and FireWire...
Nvidia can use Mantle, just not fully. So is there any point in Nvidia using it? Probably not. AMD can't use G-Sync, period.
AMD can just as well make a real solution to this, and guess what? We would likely see it in the next VESA standard, for everyone. One really has to wonder why the tech demo had so much motion blur when they switched down to 40fps@40Hz. I guess G-Sync would win greatly in their setup even at 30fps@30Hz vs. V-synced 30fps@60Hz, while both screens should be showing the same thing in that scenario. And I really loved how nicely they designed the demo application so that with V-sync turned off the tear would always be in the middle of the screen, where even clueless people could be pointed to it. Nice, very noticeable. But that was made by the designers, not by the technology. I am quite happy with my miraculous 120 Hz screen, which has noticeable tearing only if fps goes over 120, because it's a 120 Hz screen. Sometimes I get tearing at lower fps than the refresh rate, but that's always the result of alt+tabbing several times in an Unreal Engine based game.