https://imgur.com/a/oUSPFsf Here are some photos of FRTC showing a perfectly flat frame time graph, but RTSS shows it differently, even reporting higher than 16.67 ms when locked at 60 FPS. Can someone explain what is going on here? I also made a quick video test showing a very consistent screen tear using only FRTC, which indicates stable frame timing without RTSS involved.
FPS is frames per second: an average calculated over a window of 1000 ms. Frametime is the time in milliseconds a single frame needs to be rendered. As with any calculated average, some individual frametimes fall below it and some above it.
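To make that relationship concrete, here is a minimal C++ sketch with hypothetical per-frame numbers: the individual frametimes wobble around the 16.67 ms target, yet the averaged figure still works out to exactly 60 FPS.

    // Minimal sketch (hypothetical numbers): FPS is an average over a
    // 1000 ms window, while each frame has its own individual frametime.
    #include <iostream>
    #include <numeric>
    #include <vector>

    int main() {
        // Hypothetical per-frame render times in milliseconds; note the
        // spread around the 16.67 ms target even though the average sits on it.
        std::vector<double> frametimes_ms = {16.2, 17.1, 16.5, 16.9, 16.6, 16.7};

        double avg_ms = std::accumulate(frametimes_ms.begin(),
                                        frametimes_ms.end(), 0.0)
                        / frametimes_ms.size();
        double fps = 1000.0 / avg_ms; // FPS is derived from the averaged frametime

        std::cout << "average frametime: " << avg_ms << " ms, "
                  << "FPS: " << fps << '\n'; // prints ~16.67 ms, ~60 FPS
    }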
FRTC seems to use a lower resolution for its graph? The graph tops out at 70 ms, so I assume that's not enough resolution to show small variations, and as a result it looks perfectly flat.
It says 70.8 ms at the top, but the frame-to-frame latency reads ~16.67 ms, which is more precise than RTSS's 16.6 ms readout. I've run the test again with FRTC at a 61 FPS cap and a 60 FPS cap:
61 FPS cap
60 FPS cap
Then I guess they are measuring differently, or at different points. RTSS measures the intervals between frame presentation calls in the process it has injected itself into. Do we know what FRTC is measuring? Maybe it measures what the driver does rather than what the game does?
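For illustration, here is a rough C++ sketch of what "measuring frame presentation call intervals" means: timestamp each present call and take the delta to the previous one. present_frame() here is a hypothetical stub standing in for the game's real Present()/SwapBuffers() call, which RTSS hooks via injection; this is only a conceptual model, not RTSS's actual code.

    // Conceptual sketch: time the gap between consecutive present calls.
    #include <chrono>
    #include <cstdio>
    #include <thread>

    // Hypothetical stand-in for the game's present call: simulate ~60 FPS.
    void present_frame() {
        std::this_thread::sleep_for(std::chrono::milliseconds(16));
    }

    int main() {
        using clock = std::chrono::steady_clock;
        auto last = clock::now();

        for (int frame = 0; frame < 10; ++frame) {
            present_frame();
            auto now = clock::now();
            double interval_ms =
                std::chrono::duration<double, std::milli>(now - last).count();
            last = now;
            // This per-call interval is what an RTSS-style overlay graphs;
            // a driver-level limiter like FRTC may time a different stage of
            // the pipeline, which could explain the mismatched graphs.
            std::printf("present interval: %.2f ms\n", interval_ms);
        }
    }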