My brother has an MSI GTX 980. He owned this card in a different build for about 7 months, but in his new build he is getting a tearing effect. His FPS and settings are fine, and he has not overclocked the card. He ran GPU-Z and at idle his GPU temps are 15C higher than my heavily overclocked GTX 660 Ti cards. Something is wrong. He tells me his GPU fans are not spinning with this new 1151 MOBO, which makes me think the GPU must be overheating during gameplay. Any ideas why his GPU fans are not turning on with this 1151 MOBO? He has the latest Nvidia certified drivers installed, so that potential issue is covered.
I'm pretty sure what he means is that the fans don't spin up at idle. As soon as the temp goes over 60°C they'll spin up.
Yes... correction. He told me his fans do spin up when he starts any game, and he reports 34C during gameplay, but still tearing. So maybe it's not an overheating issue. Could this be some weird thing with the MSI 1151 MOBO? We tried another test monitor with the same result, so we know the monitor isn't the issue. This exact same card did not behave this way with his older AMD quad-core setup. I'm stuck for a reason. It's not a blur, refresh-rate, or FPS issue; it's a well-working Asus 2ms gaming monitor, and I even tested this monitor on my gaming PC and all is fine. Maybe this 1151 socket MOBO needs a BIOS update??? Any ideas?
Exclusive fullscreen, vsync off = tearing
Exclusive fullscreen, vsync on = no tearing
Windowed/borderless, vsync off = microstutter, no tearing, more input lag
Windowed/borderless, vsync on = no microstutter, no tearing, more input lag

Pick your poison. Also, most modern Nvidia cards use passive cooling when there's no GPU load. That means the fans are powered off when not playing, or when playing games with a low GPU load. Once the temp gets to 60C, the fans are powered on.
I'd say that often the mouse lag feels much worse in exclusive fullscreen mode. In borderless windowed mode the cursor should move just like on desktop. Vsync will add input lag in all cases but for some reason at least in my opinion borderless mode seems to be better (especially if the game has shoddy vsync implementation).
Setting vsync did help with the tearing issue, but the image is still grainy. He's using a GTX 980 with this new MSI Arctic 1151 MOBO, his GPU drivers are updated, and the MOBO has the latest BIOS. He's using a newer Asus 24" 1920x1080 2ms gaming monitor. His FPS are fine. He's tried every adjustment on the monitor, but it's still grainier than it was on his 7-year-old AMD build. I told him to try his old monitor, but now that one is grainy too. Any ideas???
What fps/Hz are you at? Try limiting fps to 1-2 below your monitor's refresh rate (at the fps you're playing at). I'm assuming the fps is maxed all the time? Tearing happens because the monitor doesn't keep up with the high frame rate. You can limit the fps either with the game's console, if it has one, or with Afterburner (Rivatuner Statistics Server, to be exact).
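For what it's worth, the cap RTSS applies is conceptually just a sleep-until-deadline loop. Here's a toy Python sketch of the idea (the names and structure are my own illustration, not RTSS internals):

```python
import time

def run_capped(render_frame, fps_cap, num_frames):
    """Run num_frames, sleeping so we never exceed fps_cap.
    Toy illustration of a frame limiter; real limiters like RTSS
    busy-wait near the deadline for sub-millisecond accuracy."""
    frame_time = 1.0 / fps_cap            # time budget per frame, seconds
    next_deadline = time.perf_counter()
    for _ in range(num_frames):
        render_frame()                     # the game's work for this frame
        next_deadline += frame_time
        sleep_for = next_deadline - time.perf_counter()
        if sleep_for > 0:                  # only sleep if ahead of schedule
            time.sleep(sleep_for)

# e.g. cap at 58 fps on a 60 Hz monitor (1-2 below refresh, as suggested):
# run_capped(my_render_function, 58, 1000)
```

The point of capping 1-2 below refresh is that the GPU never races ahead of the monitor, so frames arrive at a steady pace instead of piling up.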
1 - Grainy (video?): Go to Nvidia Control Panel -> Adjust video colour settings -> choose the "Advanced" tab -> select Dynamic Range: Full (0-255)
2 - Use vsync to avoid tearing. Tearing has nothing to do with temperatures. The GPU simply starts drawing the next frame when it's ready, even if the previous one hasn't fully been drawn yet. Pixels are drawn from top to bottom, so if the game is in motion, the bottom part of the screen will be somewhat ahead of the top part (as it's the newer image), creating a tear. With higher-than-refresh-rate fps you can get multiple tears in a single image, because the GPU can draw multiple frames within a single refresh. If your GPU overheats, you'll get artifacts caused by calculation errors. Example: if the screen is "grainy" like this, the memory on the GPU is faulty:
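A rough back-of-envelope for the multiple-tears case (my own illustration, not an exact model of scanout timing):

```python
def tears_per_refresh(fps, hz):
    """Approximate tear lines per displayed image with vsync off.
    Each buffer swap mid-scanout leaves one tear, and the GPU swaps
    roughly fps/hz times per refresh. Below the refresh rate you can
    still get a tear, just not an extra one per refresh."""
    if fps <= 0:
        return 0.0
    return max(fps / hz, 1.0)

print(tears_per_refresh(120, 60))  # 2.0 - two frames land during one 60 Hz scanout
print(tears_per_refresh(45, 60))   # 1.0 - still one tear, since swaps aren't synced
```

So dropping from 200 fps to 60 fps reduces how many tears you see per image, but only vsync (or G-Sync/FreeSync) removes them entirely.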
WOW! No, it's nothing like that kind of grainy effect. The tearing was corrected by members' suggestions to turn vsync on. When he says "grainy" he means sort of like the difference between watching a movie in poor-quality standard def and watching great-quality HDTV. Like the poor resolution you'd get if someone texted a 10-second video to your smartphone. It's rather poor and grainy; "compressed-looking" would be a better word, I guess. I had him test the HDMI output on his GTX 980 to the Asus monitor and even tried a different monitor, thinking it might be the VGA cable. Next I will try the suggestion of switching Dynamic Range values in the Nvidia Control Panel. Only the CPU and the MSI 1151 micro-ATX MOBO were changed in this upgrade; there was no issue until then. Maybe the GTX 980 just isn't playing well with this new 1151 MOBO.
As I said earlier, use a DVI or DisplayPort cable, not HDMI. Also make sure your desktop resolution is correct. On a 1080p monitor, anything other than 1920x1080 will look bad.
Yeah, (DVI) not VGA. The HDMI was just a test. He's moved back to VGA. He's set at 1920 res. We've tried every imaginable setting in the Nvidia Control Panel and every monitor setting without resolving the grainy issue. It must be the MSI 1151 Arctic micro-ATX MOBO, since that was the only change in the upgrade. I suppose it could be his GTX 980, except it was great before the upgrade from the AMD setup. We've ruled out the monitor because even his old monitor now shows the same grainy visual. Maybe his GTX 980 took a dump during the rebuild. At this point I have no clue.
(Correction: DVI.) He's hooked up to the primary DVI, you know, the main output on the GTX 980.
Was VGA used before the upgrade? VGA can look a bit "smudged" if you don't calibrate it in the OSD ("clock phase" and whatnot, or pressing the "auto adjust" button/function on a proper background pattern), but DVI-D is a digital connection and gets you the sharpest image possible, with no loss and by default without the need to calibrate anything. Could it be that you got used to the analog VGA smudginess, and now that you're seeing a properly sharp image, you think it's "grainy"?
That's true for every game, not just that particular one. You can't fit anything other than 60, 30, 20, 15, 12, 10, 6, 5, 4, 3, 2, and 1 FPS into 60Hz evenly. So anything between 30 and 60 is going to have microstutter. That's one of the reasons why G-Sync and FreeSync were invented.
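That list is just the divisors of the refresh rate, so you can check any monitor yourself. Quick Python sketch (plain integer arithmetic, nothing vendor-specific):

```python
def even_fps(refresh_hz):
    """Frame rates that fit an integer number of refreshes per frame,
    i.e. the divisors of the refresh rate, highest first. Anything
    else forces uneven frame pacing (microstutter) with plain vsync."""
    return [fps for fps in range(refresh_hz, 0, -1) if refresh_hz % fps == 0]

print(even_fps(60))   # [60, 30, 20, 15, 12, 10, 6, 5, 4, 3, 2, 1]
print(even_fps(144))  # a 144 Hz panel adds nicer steps like 72 and 48
```

G-Sync and FreeSync sidestep the whole problem by letting the monitor refresh whenever a frame is ready, instead of forcing frames onto a fixed grid.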