Special K can drastically reduce latency

Everything about latency. Tips, testing methods, mouse lag, display lag, game engine lag, network lag, whole input lag chain, VSYNC OFF vs VSYNC ON, and more! Input Lag Articles on Blur Busters.
Chief Blur Buster
Site Admin
Posts: 11653
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada

Re: Special K can drastically reduce latency

Post by Chief Blur Buster » 02 Oct 2020, 11:28

andrelip wrote:
02 Oct 2020, 11:10
Also, frame pacing is almost always an illusion. Your frame could be presented to the display at a fixed interval, but if the CPU timer differs too much from frame to frame, then your animation will be all messed up.
I'd reinterpret it slightly:

Frame pacing is NOT an illusion.

It's simply drowned out in the cacophony, like a quiet tuning fork in the middle of loud metal music.

Yes, like CPU timer problems.

But if you quiet down the metal music a lot (use an 8000Hz mouse, good CPU timers, a 360Hz monitor, cherrypick/tweak the game, a powerful GPU, high frame rates, high resolutions, low-motion-blur features, etc.), the weak links really begin to show. It's part of the Vicious Cycle Effect concept, where improvements make it easier to see the weak links.

Stutter is an onion of many causes (system stutter, game stutter, sync-tech-related stutter including fps-vs-Hz stutter, poll-vs-Hz stutter, disk stutter, DPC events, etc.). But when reduced to two tuning forks (two dominant causes above the noise floor), they beat-frequency against each other. Frame pacing is a major cause of stutter in the stutter onion, but it's often hidden below the noise floor by many other weak links, including the game software itself.
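To illustrate the beat-frequency effect with a quick sketch (hypothetical numbers, not measurements): when two periodic "tuning forks" are nearly but not exactly matched, the visible beat repeats at the difference of their rates.

```python
def beat_period_seconds(rate_a_hz: float, rate_b_hz: float) -> float:
    """Time between visible beats (e.g. a repeated/dropped frame or a
    micro-stutter pulse) when two nearly-matched rates drift against each other."""
    beat_hz = abs(rate_a_hz - rate_b_hz)
    return float("inf") if beat_hz == 0 else 1.0 / beat_hz

# 59.94 fps content on a 60 Hz display: one duplicate frame roughly every ~16.7 s
print(beat_period_seconds(60.0, 59.94))

# A 997 fps cap beating against a 1000 Hz mouse poll: a pulse roughly every ~0.33 s
print(beat_period_seconds(1000.0, 997.0))
```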

That said, time-relative-accurate input-to-photons is key. Whac-a-mole all the weak links in this refresh rate race to retina refresh rates.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter

Forum Rules wrote:  1. Rule #1: Be Nice. This is published forum rule #1. Even To Newbies & People You Disagree With!
  2. Please report rule violations. If you see a post that violates forum rules, then report the post.
  3. ALWAYS respect indie testers here. See how indies are bootstrapping Blur Busters research!

jorimt
Posts: 2484
Joined: 04 Nov 2016, 10:44
Location: USA

Re: Special K can drastically reduce latency

Post by jorimt » 02 Oct 2020, 12:21

Chief Blur Buster wrote:
02 Oct 2020, 11:21
jorimt wrote:
01 Oct 2020, 07:56
An update, as promised; I'm on the list for the next shipment.
Me too.
Good to hear.

I'm aware LDAT is probably more straightforward and simplified than your device (LDAT is obviously intended more as a direct replacement for high-speed testing, whereas yours is more multi-purpose), but it would be interesting to see how you ultimately find they compare in like-for-like conditions.

Again, Gamers Nexus did a pretty in-depth comparison of their 1000 (EDIT: 240) FPS high-speed setup vs. the LDAT a few weeks back, and it all checked out, which saves me the work of doing similar verification once I receive mine.
Chief Blur Buster wrote:
02 Oct 2020, 11:21
Also, I have an internal Blur Busters-developed LDAT-like tool (that works on all GPUs) that I am considering commercializing, but right now it's used exclusively for my contracting services and Blur Busters Approved.
As you and I have discussed previously, I'm open to testing your device myself whenever you make them available for something such as an alpha/beta phase.
(jorimt: /jor-uhm-tee/)
Author: Blur Busters "G-SYNC 101" Series

Displays: ASUS PG27AQN, LG 48CX VR: Beyond, Quest 3, Reverb G2, Index OS: Windows 11 Pro Case: Fractal Design Torrent PSU: Seasonic PRIME TX-1000 MB: ASUS Z790 Hero CPU: Intel i9-13900k w/Noctua NH-U12A GPU: GIGABYTE RTX 4090 GAMING OC RAM: 32GB G.SKILL Trident Z5 DDR5 6400MHz CL32 SSDs: 2TB WD_BLACK SN850 (OS), 4TB WD_BLACK SN850X (Games) Keyboards: Wooting 60HE, Logitech G915 TKL Mice: Razer Viper Mini SE, Razer Viper 8kHz Sound: Creative Sound Blaster Katana V2 (speakers/amp/DAC), AFUL Performer 8 (IEMs)

Chief Blur Buster
Site Admin
Posts: 11653
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada

Re: Special K can drastically reduce latency

Post by Chief Blur Buster » 02 Oct 2020, 19:24

jorimt wrote:
02 Oct 2020, 12:21
As you and I have discussed previously, I'm open to testing your device myself whenever you make them available for something such as an alpha/beta phase.
Yes, you're on the waiting list for any first "out-in-the-wild" Blur Busters lag testing hardware. Mind you, we're cognizant of the risks of becoming a manufacturer ourselves. We never, ever want to be a failed Kickstarter, y'know?

jorimt
Posts: 2484
Joined: 04 Nov 2016, 10:44
Location: USA

Re: Special K can drastically reduce latency

Post by jorimt » 02 Oct 2020, 21:41

Chief Blur Buster wrote:
02 Oct 2020, 19:24
We never, ever want to be a failed Kickstarter, y'know?
Indeed :)

andrelip
Posts: 162
Joined: 21 Mar 2014, 17:50

Re: Special K can drastically reduce latency

Post by andrelip » 03 Oct 2020, 20:30

Chief Blur Buster wrote:
02 Oct 2020, 11:28
andrelip wrote:
02 Oct 2020, 11:10
Also, frame pacing is almost always an illusion. Your frame could be presented to the display at a fixed interval, but if the CPU timer differs too much from frame to frame, then your animation will be all messed up.
I'd reinterpret it slightly:

Frame pacing is NOT an illusion.

It's simply drowned out in the cacophony, like a quiet tuning fork in the middle of loud metal music.

Yes, like CPU timer problems.

But if you quiet down the metal music a lot (use an 8000Hz mouse, good CPU timers, a 360Hz monitor, cherrypick/tweak the game, a powerful GPU, high frame rates, high resolutions, low-motion-blur features, etc.), the weak links really begin to show. It's part of the Vicious Cycle Effect concept, where improvements make it easier to see the weak links.

Stutter is an onion of many causes (system stutter, game stutter, sync-tech-related stutter including fps-vs-Hz stutter, poll-vs-Hz stutter, disk stutter, DPC events, etc.). But when reduced to two tuning forks (two dominant causes above the noise floor), they beat-frequency against each other. Frame pacing is a major cause of stutter in the stutter onion, but it's often hidden below the noise floor by many other weak links, including the game software itself.

That said, time-relative-accurate input-to-photons is key. Whac-a-mole all the weak links in this refresh rate race to retina refresh rates.
When I talk about CPU time, I'm referring to the time spent on the CPU during the lifetime of a given frame, not timer resolution, HPET, and so on.

You have three vital benchmarks every frame:
- CPU Time
- GPU Time
- Total Time

You could have a constant total time of 16 ms, for example, but if the CPU time is alternating between 2 ms and 14 ms, then the animation will be really messed up.

You can sync the presentation with your monitor, which helps to alleviate video artifacts, but it does not mean a perfectly smooth animation, as you cannot control the CPU time directly. Frame limiters only add artificial latency to stabilize total time and help the CPU processing start at a constant interval, but they will not guarantee that the content being rendered is smooth.
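A minimal simulation of this (my own illustration with made-up numbers, not from any specific game): presents are perfectly paced at 60 Hz, yet the animation step still varies because the gametime sampled on the CPU jitters.

```python
import random

# Hypothetical numbers: a 60 Hz display presented at a perfectly constant
# interval, while the CPU timestamp used for animation jitters.
PRESENT_INTERVAL = 1 / 60          # perfectly paced presents
SPEED_PX_PER_SEC = 1000            # object moving 1000 px/s

random.seed(1)
gametime = 0.0
prev_pos = 0.0
for frame in range(8):
    # The engine samples its clock somewhere inside the frame's CPU work,
    # so the gametime step is not the same as the present-to-present step.
    cpu_jitter = random.uniform(-0.006, 0.006)   # +/- 6 ms of gametime jitter
    gametime += PRESENT_INTERVAL + cpu_jitter
    pos = gametime * SPEED_PX_PER_SEC
    step_px = pos - prev_pos
    prev_pos = pos
    print(f"frame {frame}: presented on time, but object moved {step_px:5.1f} px")
```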

--- Experiment to prove my argument ---

- Use the CSGO `startmovie` command to export images of a demo at 30 fps, then create a video from those images.
- Then run the same demo live with RTSS, making sure you have a "perfect" 33.33 ms frametime.
- Compare both

The video will be butter smooth even at 30 fps, while the second one will feel weird. That's because the game exports the images with truly perfect frame pacing.

Just to reject any argument that the video is being helped by the encoder and motion blur, try recording the game with any capturing tool while it is running at 33.33 ms and play back the video.
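For the first half of the experiment, the exported image sequence can be assembled into a perfectly paced 30 fps video with ffmpeg; a sketch below (the filename pattern, path, and TGA format are assumptions — match them to whatever `startmovie` actually produced on your system):

```python
import subprocess

# Hypothetical filenames/paths: `startmovie` typically dumps a numbered image
# sequence (e.g. mydemo0001.tga, mydemo0002.tga, ...) into the csgo folder.
# Adjust the pattern and path to whatever your export actually produced.
FRAME_PATTERN = r"C:\csgo\mydemo%04d.tga"
OUTPUT = "mydemo_30fps.mp4"

# Re-assemble the exported frames at exactly 30 fps so every frame gets an
# identical 33.33 ms slot, i.e. perfect frame pacing by construction.
subprocess.run([
    "ffmpeg",
    "-framerate", "30",        # input timing: one frame per 1/30 s
    "-i", FRAME_PATTERN,
    "-c:v", "libx264",
    "-pix_fmt", "yuv420p",     # broad player compatibility
    OUTPUT,
], check=True)
```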

That's why I said that frame pacing, the way it is measured, is simply an illusion.

Note: CPU time is the sum of many functions, and the timestamps may be fetched at different moments depending on the engine. So it's more complicated: motion may not be perfect even with perfectly constant frame pacing.

--- OFF TOPIC ---

Don't trust CapFrameX to measure these things. It always showed the NV driver cap as having lower input delay than uncapped in CSGO (avg of 0.2% flip-queue), while FrameView / GPUView / pixel-to-bottom-delay tests show the opposite result (render-present delay going from ~0.2 ms to 4 ms @ 240 Hz). They had a discussion a few months ago about the calculation: https://github.com/GameTechDev/PresentMon/issues/73

Chief Blur Buster
Site Admin
Posts: 11653
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada

Re: Special K can drastically reduce latency

Post by Chief Blur Buster » 05 Oct 2020, 14:04

andrelip wrote:
03 Oct 2020, 20:30
When I talk about CPU time, I'm referring to the time spent on the CPU during the lifetime of a given frame, not timer resolution, HPET, and so on.

You have three vital benchmarks every frame:
- CPU Time
- GPU Time
- Total Time

You could have a constant total time of 16 ms, for example, but if the CPU time is alternating between 2 ms and 14 ms, then the animation will be really messed up.

You can sync the presentation with your monitor, which helps to alleviate video artifacts, but it does not mean a perfectly smooth animation, as you cannot control the CPU time directly. Frame limiters only add artificial latency to stabilize total time and help the CPU processing start at a constant interval, but they will not guarantee that the content being rendered is smooth.
Yes, I agree. I call this "Gametime-to-photons" consistency, which is how I helped a game developer fix Cloudpunk stutters (even on VRR):
Unity Developers: Easy G-SYNC & FreeSync Support in only 3 lines of programming code!
It's another one of those Blur Busters textbook "Milliseconds Matters" issues.
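The core idea, sketched in Python rather than the Unity/C# of the linked article (so treat it as an illustration of the principle, not the article's actual code): advance the animation by the measured time between presents, instead of assuming a fixed timestep, so gametime stays proportional to the time each frame is actually visible.

```python
import time

# A minimal sketch of the "gametime-to-photons" idea: animation steps are
# driven by the real measured frametime rather than an assumed 1/fps.
SPEED_PX_PER_SEC = 1000.0

def run_frames(num_frames: int, present) -> None:
    pos = 0.0
    prev = time.perf_counter()
    for _ in range(num_frames):
        now = time.perf_counter()
        delta = now - prev                 # measured frametime, not an assumed constant
        prev = now
        pos += SPEED_PX_PER_SEC * delta    # object advances by real elapsed time
        present(pos)

# Stand-in for a real Present()/swap call; here it just simulates a vsync wait.
def fake_present(pos: float) -> None:
    print(f"presented object at {pos:7.1f} px")
    time.sleep(1 / 60)

run_frames(5, fake_present)
```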
andrelip wrote:
03 Oct 2020, 20:30
- Use the CSGO `startmovie` command to export images of a demo at 30 fps, then create a video from those images.
- Then run the same demo live with RTSS, making sure you have a "perfect" 33.33 ms frametime.
- Compare both

The video will be butter smooth even at 30 fps, while the second one will feel weird. That's because the game exports the images with truly perfect frame pacing.
Yes, the gametime may jitter relative to photons (e.g. engine issues, VSYNC misses, other system clocks, etc.). Video recording that re-times the frames can distort what was originally seen on the monitor.
andrelip wrote:
03 Oct 2020, 20:30
That's why I said that frame pacing, the way it is measured, is simply an illusion.
Now that you are specific, you are correct. RTSS can show a glass-floor frametime graph, but the photontime may still be varying because of black-box issues beyond Present() -- I understand Present()-to-Photons in ways most people cannot. You'd need photodiode analysis, measuring multiple points of the screen, to find out whether gametime-to-photontime is staying correctly time-relative.

RTSS can be reasonably accurate only up to Present(), but what happens from Present() to photons is not easily loggable without a photodiode.

In an ideal world, the return from a waitable-swapchained Present() (VRR or VSYNC ON) is time-relative to photons for a given pixel, but in many cases there will be unexpected timing jitter in the chain between GPU and display for many obscure reasons. Even mundane things like power management on low-GPU-utilization graphics cards can cause photon timing jitter. I found many potential timing-jitter error conditions during my raster beam racing experience with Tearline Jedi (I found a way to program a modern equivalent of "raster interrupts" on GeForce/Radeons), where I use microsecond-precise timing in my C# programming to steer VSYNC OFF tearlines into exact locations.

Now, in real-world games, I can sometimes notice single-millisecond time-divergences in certain situations (e.g. ULMB, where a 1ms MPRT has less motion blur than the gametime-to-photons time-divergence, aka stutter). A 1ms timing error at 4000 pixels/second motion is a 4-pixel offset, which is noticeable whenever the motion blur is smaller than the stutter jump of that timing error (gametime-to-photontime).
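The arithmetic behind that rule of thumb, as a quick sketch (numbers chosen to match the example above):

```python
# Illustrative arithmetic, not a measurement: a timing error becomes visible
# roughly when the stutter jump it causes exceeds the motion blur trail (MPRT).

def stutter_jump_px(speed_px_per_sec: float, timing_error_ms: float) -> float:
    return speed_px_per_sec * timing_error_ms / 1000.0

def blur_trail_px(speed_px_per_sec: float, mprt_ms: float) -> float:
    return speed_px_per_sec * mprt_ms / 1000.0

speed = 4000.0            # pixels per second of on-screen motion
error = 1.0               # 1 ms gametime-to-photontime error
mprt = 1.0                # 1 ms MPRT strobed mode, e.g. ULMB

jump = stutter_jump_px(speed, error)   # -> 4.0 px
blur = blur_trail_px(speed, mprt)      # -> 4.0 px
print(f"stutter jump {jump:.1f} px vs blur trail {blur:.1f} px",
      "-> visible" if jump > blur else "-> hidden by blur")
```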

When I originally replied, I hadn't registered the "the way it is measured" portion. Even VRR bugs (e.g. monitor firmware bugs that erratically delay the start of scanout) can also produce photontime jitter that is not measurable by any software tool, even deep into the GPU driver.

andrelip
Posts: 162
Joined: 21 Mar 2014, 17:50

Re: Special K can drastically reduce latency

Post by andrelip » 06 Oct 2020, 11:55

Chief Blur Buster wrote:
05 Oct 2020, 14:04
andrelip wrote:
03 Oct 2020, 20:30
When I talk about CPU time, I'm referring to the time spent on the CPU during the lifetime of a given frame, not timer resolution, HPET, and so on.

You have three vital benchmarks every frame:
- CPU Time
- GPU Time
- Total Time

You could have a constant total time of 16 ms, for example, but if the CPU time is alternating between 2 ms and 14 ms, then the animation will be really messed up.

You can sync the presentation with your monitor, which helps to alleviate video artifacts, but it does not mean a perfectly smooth animation, as you cannot control the CPU time directly. Frame limiters only add artificial latency to stabilize total time and help the CPU processing start at a constant interval, but they will not guarantee that the content being rendered is smooth.
Yes, I agree. I call this "Gametime-to-photons" consistency, which is how I helped a game developer fix Cloudpunk stutters (even on VRR):
Unity Developers: Easy G-SYNC & FreeSync Support in only 3 lines of programming code!
It's another one of those Blur Busters textbook "Milliseconds Matters" issues.
andrelip wrote:
03 Oct 2020, 20:30
- Use the CSGO `startmovie` command to export images of a demo at 30 fps, then create a video from those images.
- Then run the same demo live with RTSS, making sure you have a "perfect" 33.33 ms frametime.
- Compare both

The video will be butter smooth even at 30 fps, while the second one will feel weird. That's because the game exports the images with truly perfect frame pacing.
Yes, the gametime may jitter relative to photons (e.g. engine issues, VSYNC misses, other system clocks, etc.). Video recording that re-times the frames can distort what was originally seen on the monitor.
andrelip wrote:
03 Oct 2020, 20:30
That's why I said that frame pacing, the way it is measured, is simply an illusion.
Now that you are specific, you are correct. RTSS can show a glass-floor frametime graph, but the photontime may still be varying because of black-box issues beyond Present() -- I understand Present()-to-Photons in ways most people cannot. You'd need photodiode analysis, measuring multiple points of the screen, to find out whether gametime-to-photontime is staying correctly time-relative.

RTSS can be reasonably accurate only up to Present(), but what happens from Present() to photons is not easily loggable without a photodiode.

In an ideal world, the return from a waitable-swapchained Present() (VRR or VSYNC ON) is time-relative to photons for a given pixel, but in many cases there will be unexpected timing jitter in the chain between GPU and display for many obscure reasons. Even mundane things like power management on low-GPU-utilization graphics cards can cause photon timing jitter. I found many potential timing-jitter error conditions during my raster beam racing experience with Tearline Jedi (I found a way to program a modern equivalent of "raster interrupts" on GeForce/Radeons), where I use microsecond-precise timing in my C# programming to steer VSYNC OFF tearlines into exact locations.

Now, in real-world games, I can sometimes notice single-millisecond time-divergences in certain situations (e.g. ULMB, where a 1ms MPRT has less motion blur than the gametime-to-photons time-divergence, aka stutter). A 1ms timing error at 4000 pixels/second motion is a 4-pixel offset, which is noticeable whenever the motion blur is smaller than the stutter jump of that timing error (gametime-to-photontime).

When I originally replied, I hadn't registered the "the way it is measured" portion. Even VRR bugs (e.g. monitor firmware bugs that erratically delay the start of scanout) can also produce photontime jitter that is not measurable by any software tool, even deep into the GPU driver.
I'm thinking we were still talking about different things.

What I'm stating is that the only thing that matters for the animation is the CPU time.

Consider that:

The CPU captures the inputs and decides the positions and all the details of the scene. It then hands some meta-parameters to the GPU to render the scene.

In some sense, the GPU is just a post-processing step that turns this already-generated scene into something humans can see. For the animation to be smooth, the CPU time needs to be smooth, and the CPU + GPU need to be in sync, or at least harmonic, with the display.

That is what happens with video playback and video recording. That's the main reason they are smoother at low framerates while 3D games usually aren't, even when using RTSS with scanline sync or VRR. Blur and other things help, but the perfectly paced animation is the biggest contributor.

- About the CSGO example:
The startmovie command fetches the timestamps at a perfect interval directly from the tick and generally has ideal interpolation. So the frames will be perfectly paced.

When you record gameplay without that command, the recording re-times the frames, but the captured images already have jitter baked into the motion.
Last edited by andrelip on 07 Oct 2020, 04:55, edited 1 time in total.

Chief Blur Buster
Site Admin
Posts: 11653
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada

Re: Special K can drastically reduce latency

Post by Chief Blur Buster » 06 Oct 2020, 14:48

andrelip wrote:
06 Oct 2020, 11:55
What I'm stating is that the only thing that matters for the animation is the CPU time.
(Terminology disambiguation: CPU time is the same thing as gametime.)
andrelip wrote:
06 Oct 2020, 11:55
The CPU captures the inputs and decides the positions and all the details of the scene. It then hands some meta-parameters to the GPU to render the scene.

In some sense, the GPU is just a post-processing step that turns this already-generated scene into something humans can see. For the animation to be smooth, the CPU time needs to be smooth, and the CPU + GPU need to be in sync, or at least harmonic, with the display.
I definitely agree.

There are so many things that can go out of sync, including what you explain. Even mere rendertime variances can jitter gametime (CPU time) away from frame presentation time, and so can anything else that interferes, including the reasons you mention and the reasons I mention. And tools don't necessarily encompass the whole chain, so it often doesn't show up in recordable statistics.

Blur Busters is all about temporal technology (Hz, GtG, lag, MPRT, clocks, VRR, jitter, stutter, frame pacing, synchronization, etc.). It's such a big universe that it's easy to make wrong assumptions about which temporals we are talking about, since there are literally zillions of temporal factors injecting jitter everywhere.
andrelip wrote:
06 Oct 2020, 11:55
The startmovie command fetches the timestamps at a perfect interval directly from the tick and generally has ideal interpolation. So the frames will be perfectly paced.

When you record gameplay without that command, the recording re-times the frames, but the captured images already have jitter baked into the motion.
Indeed, the different methods of recording can change the variables that increase/decrease jitter (e.g. the gameplay video being smoother or stutterier than the original gameplay, depending on the method/settings of the recording).

andrelip
Posts: 162
Joined: 21 Mar 2014, 17:50

Re: Special K can drastically reduce latency

Post by andrelip » 07 Oct 2020, 05:06

Chief Blur Buster wrote:
06 Oct 2020, 14:48
Indeed, the different methods of recording can change the variables that increase/decrease jitter (e.g. the gameplay video being smoother or stutterier than the original gameplay, depending on the method/settings of the recording).
Exactly. Imagine a car moving 1000 pixels per second at 10 frames per second. If the CPU time has too much variation, then instead of presenting 100 pixels of movement each frame, it will do 90/150/230/60/80. So even if you record those frames and play them back in a perfect present cycle, the animation inside will still be moving 90/150/230/60/80 pixels per frame. You just alleviate the tearing and other video artifacts.
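The same car example as a few lines of Python (step values copied from the numbers above, everything else illustrative): the uneven displacement is baked into the rendered frames, so perfectly re-timed playback cannot smooth it out.

```python
# Displacement is baked into each rendered frame from the jittered gametime,
# so re-timing playback to a perfect 100 ms cadence cannot fix it.

SPEED_PX_PER_SEC = 1000.0
IDEAL_STEP_S = 0.100                       # 10 fps -> 100 ms per frame

# Jittered gametime steps producing the 90/150/230/60/80 px example above
jittered_steps_s = [0.090, 0.150, 0.230, 0.060, 0.080]

for step in jittered_steps_s:
    moved_px = SPEED_PX_PER_SEC * step     # what got rendered into the frame
    ideal_px = SPEED_PX_PER_SEC * IDEAL_STEP_S
    print(f"frame advances {moved_px:5.1f} px (should be {ideal_px:.0f} px)"
          f" even if presented exactly {IDEAL_STEP_S * 1000:.0f} ms apart")
```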

If someone wants to create a limiter to enhance the experience, it needs to hook not only Present() but also the main get_timestamp() functions used by the game. But it would be very game-dependent, as within a single frame the engine could be fetching the time multiple times.
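A conceptual sketch of what such a limiter would effectively be doing (this is only a simulation of the idea in Python; a real implementation would have to hook the game's actual timer calls, e.g. QueryPerformanceCounter on Windows, which is exactly the risky, game-dependent part):

```python
import time

# Conceptual sketch only (not a real hook): quantize the timestamps the game
# sees to the present cadence, so each frame's simulation step is exactly one
# frame period regardless of CPU-side jitter.

FRAME_PERIOD = 1 / 60.0

class RetimedClock:
    def __init__(self) -> None:
        self._start = time.perf_counter()

    def now(self) -> float:
        """Return the real elapsed time, snapped to whole frame periods."""
        elapsed = time.perf_counter() - self._start
        return round(elapsed / FRAME_PERIOD) * FRAME_PERIOD

clock = RetimedClock()
for _ in range(3):
    time.sleep(FRAME_PERIOD * 1.2)               # deliberately jittery frame loop
    print(f"game sees t = {clock.now():.4f} s")  # always a multiple of 1/60 s
```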

Chief Blur Buster
Site Admin
Posts: 11653
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada

Re: Special K can drastically reduce latency

Post by Chief Blur Buster » 07 Oct 2020, 20:38

Hooking Present() is frequently done by software such as RTSS, so that's doable. Windows already provides hooks for that.

Hooking timestamp APIs will probably be a no-no, because timestamps are used for many things such as monitoring other system processes.

One theoretical exception: proper strong virtualization software (like VMware) could automatically throttle/speed up the relative time of the entire "computer", to allow safe distortion of get_timestamp(). But that's a hugely complex manoeuvre -- like making the passage of time really slow versus really fast (VMware-style virtualized synchronous speedups/slowdowns of ALL clock sources, including BIOS clocks, GPU clocks, CPU clocks, execution speeds, and timestamping APIs of all kinds throughout the entire system, so they never get out of sync with each other). One could attempt some distortions, but it would be very hard -- like writing a console HLE emulator that works with all console games -- because of all the huge API surfaces (relating to timestamps) that need to be accommodated for more universal compatibility with all games.
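A toy illustration of the time-distortion idea (my own simplification, nothing like how VMware actually virtualizes clocks): a clock that runs at a scaled rate relative to the host. The hard part described above is doing this consistently for every clock source a game can observe, which a wrapper like this cannot do.

```python
import time

# A virtual clock that runs at a scaled rate relative to the host clock.
class ScaledClock:
    def __init__(self, scale: float) -> None:
        self.scale = scale                        # 0.5 = half speed, 2.0 = double
        self._origin = time.perf_counter()

    def now(self) -> float:
        real_elapsed = time.perf_counter() - self._origin
        return real_elapsed * self.scale          # virtual seconds since origin

slow_world = ScaledClock(scale=0.5)
time.sleep(1.0)
print(f"host waited 1.0 s, the slowed 'computer' saw {slow_world.now():.2f} s")
```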
