andrelip wrote: ↑03 Oct 2020, 20:30
When I talk about the CPU time, I'm referring to the time spent on the CPU during the lifetime of a given frame, not to timer resolution, HPET, and such.
You have three vital benchmarks every frame:
- CPU Time
- GPU Time
- Total Time
You could have a constant total time of 16ms, for example, but if the CPU time is alternating between 2ms and 14ms, then the animation will be really messed up.
You can sync the presentation with your monitor, which helps to alleviate video artifacts, but it does not mean a perfectly smooth animation, as you cannot control the CPU time directly. The limiters only add artificial latency to stabilize total time and help the CPU processing start at a constant interval, but they will not guarantee that the content being rendered is smooth.
Yes, I agree. I call this "Gametime-to-photons" consistency, which is how I helped a game developer fix Cloudpunk stutters (even on VRR):
Unity Developers: Easy G-SYNC & FreeSync Support in only 3 lines of programming code!
It's another one of those Blur Busters textbook "Milliseconds Matters" issues.
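For context, here's a minimal Unity-style C# sketch of the underlying principle (not the actual code from the linked article -- see the article for that): step animation by the measured frame delta so gametime stays locked to actual frame delivery.

```csharp
using UnityEngine;

// Hypothetical illustration only -- not the code from the linked article.
// The principle: advance gametime/animation by the *measured* frame delta,
// so motion tracks actual frame delivery. If animation steps by a fixed
// amount while real frame delivery varies (or vice versa), the
// gametime-to-photons relationship diverges and you see stutter even on VRR.
public class ConsistentMotion : MonoBehaviour
{
    public float speedUnitsPerSecond = 5f; // assumed example speed

    void Update()
    {
        // Step by speed * actual elapsed frame time, never by a fixed per-frame step.
        transform.Translate(Vector3.right * speedUnitsPerSecond * Time.deltaTime);
    }
}
```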
andrelip wrote: ↑03 Oct 2020, 20:30
- Use CSGO's `startmovie` command to export images of a demo at 30 fps and then create a video from those images.
- Then run the same demo with RTSS and make sure you have a "perfect" 33.33ms frametime cap.
- Compare both.
The video will be butter smooth even at 30 fps, while the second one will feel weird. That's because the game exports the images with truly perfect frame pacing.
Yes, the gametime may jitter relative to photons (e.g. engine issues, VSYNC misses, other system clocks, etc.). Video recording re-times the frames, so the resulting video can misrepresent what was originally seen on the monitor.
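To make that contrast concrete, here's a small hypothetical C# sketch (assumed names, not CSGO or RTSS code) of why the offline export looks perfect: each exported frame advances gametime by exactly 1/30 of a second regardless of how long it actually took to render, whereas a real-time run has to advance gametime by whatever the wall clock says.

```csharp
using System.Diagnostics;

// Hypothetical illustration -- not actual CSGO or RTSS code.
static class GametimeStepping
{
    // Offline "startmovie"-style export: every frame is timestamped exactly
    // 1/30 s apart, no matter how long it took to render, so the resulting
    // video has perfect frame pacing by construction.
    public static double OfflineStepSeconds() => 1.0 / 30.0;

    // Real-time play: gametime must advance by the measured wall-clock delta,
    // so any CPU/GPU time variance shows up directly as visible stutter.
    public static double RealtimeStepSeconds(Stopwatch frameClock)
    {
        double elapsed = frameClock.Elapsed.TotalSeconds;
        frameClock.Restart();
        return elapsed;
    }
}
```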
andrelip wrote: ↑03 Oct 2020, 20:30
That's why I said that frame pacing, the way it is measured, is simply an illusion.
Now that you are specific, you are correct. RTSS can be glassfloor while the photontime is still varying, because of blackbox issues beyond Present() -- I understand Present()-to-photons behaviour in ways most people don't. You'd need photodiode analysis, measuring multiple points of the screen, to find out whether gametime-to-photontime is staying correctly time-relative and consistent.
RTSS can be reasonably accurate only up to Present(), but what happens from Present() to photons is not easily loggable without a photodiode.
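As a rough sketch of what software-side logging can and cannot see (assumed class and method names, not RTSS internals): you can timestamp the moment Present() returns each frame and compare those deltas against your gametime deltas, but anything downstream of that point -- GPU queue, cable, scaler, panel -- stays invisible without a photodiode.

```csharp
using System;
using System.Diagnostics;

// Hypothetical illustration of software-visible frame pacing; everything
// after Present() returns (GPU queue, cable, scaler, panel) is invisible here.
public sealed class PresentTimeLogger
{
    private readonly Stopwatch _clock = Stopwatch.StartNew();
    private double _lastPresentSeconds = double.NaN;

    // Call once per frame, immediately after the swapchain Present() returns.
    public void OnPresentReturned(double gametimeDeltaSeconds)
    {
        double now = _clock.Elapsed.TotalSeconds;
        if (!double.IsNaN(_lastPresentSeconds))
        {
            double presentDeltaMs = (now - _lastPresentSeconds) * 1000.0;
            double divergenceMs = presentDeltaMs - gametimeDeltaSeconds * 1000.0;
            Console.WriteLine(
                $"present-to-present: {presentDeltaMs:F2} ms, " +
                $"gametime: {gametimeDeltaSeconds * 1000.0:F2} ms, " +
                $"divergence: {divergenceMs:F2} ms");
        }
        _lastPresentSeconds = now;
    }
}
```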
In an ideal world, the return from a waitable-swapchain Present() (VRR or VSYNC ON) is time-relative to photons for a given pixel, but in many cases there will be unexpected timing jitter in the chain between the GPU and the display for many obscure reasons. Even mundane things like power management on low-GPU-utilization graphics cards can cause photon timing jitter. I found many potential timing-jitter error conditions during my raster beam racing experiments with Tearline Jedi (I found a way to program a modern equivalent of "raster interrupts" on GeForces/Radeons), where I use microsecond-precise timing in my C# programming to steer VSYNC OFF tearlines into exact locations.
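For the curious, here's a much-simplified sketch of the timing math behind that tearline steering (assumed parameter names, not the Tearline Jedi source): the raster position can be approximated from the time elapsed since the last VBLANK, so a microsecond-precise busy-wait before a VSYNC OFF Present() lands the tearline near a chosen scanline.

```csharp
using System;
using System.Diagnostics;

// Simplified, hypothetical sketch of the beam-racing timing math.
static class TearlineSteering
{
    // Busy-wait until the raster is approximately at targetScanline, then present.
    public static void PresentAtScanline(
        double lastVblankSeconds,     // timestamp of the most recent VBLANK, on 'clock'
        double refreshPeriodSeconds,  // e.g. 1.0 / 144 for a 144 Hz signal
        int verticalTotal,            // total scanlines including blanking interval
        int targetScanline,
        Stopwatch clock,
        Action present)               // e.g. a VSYNC OFF swapchain Present() call
    {
        // Raster position is roughly proportional to time since VBLANK.
        double targetTime = lastVblankSeconds
            + refreshPeriodSeconds * targetScanline / verticalTotal;

        // Spin-wait for microsecond-level precision; Sleep() is far too coarse.
        while (clock.Elapsed.TotalSeconds < targetTime) { }

        present();
    }
}
```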
Now, in real-world games, I can sometimes notice single-millisecond time-divergences in certain situations (e.g. ULMB, where 1ms MPRT has less motion blur than the time-divergence, aka stutter, of gametime-to-photons). A 1ms timing error at 4000 pixels/second motion is a 4-pixel offset, which is noticeable whenever the motion blur is smaller than the stutter jump caused by that timing error (gametime-to-photontime).
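The arithmetic behind that, as a tiny snippet (values from the example above):

```csharp
// Stutter jump in pixels = motion speed (px/s) × timing error (s).
double motionSpeedPxPerSec = 4000.0;  // on-screen motion speed
double timingErrorSec      = 0.001;   // 1 ms gametime-to-photontime error
double stutterJumpPixels   = motionSpeedPxPerSec * timingErrorSec;  // = 4 px
System.Console.WriteLine($"stutter jump: {stutterJumpPixels} px");
```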
When I originally replied, I didn't notice the "the way it is measured" portion. Even VRR bugs (e.g. monitor firmware bugs that erratically delay the start of scanout) can also produce photontime jitter that is not measurable by any software tools, even deep in the GPU driver.