grieverheart wrote: ↑04 Aug 2023, 07:44
So, if I understand correctly, there will always be an input latency of somewhere between 0 and 1 frame. This is because we need to wait for the "beam" to reach the correct position before we swap/present. In emulation you can just pause the emulation until this is achieved, so the input lag is reduced to just a frame slice, but what if you have an input source you want to present? Is there a way to e.g. reset the beam position so as to minimise the difference between the input and the desired output raster positions?
No, you cannot reset a raster.*
*Except on VRR displays: if the display is idling, the next frame presentation event starts the raster incrementing immediately. But once the display starts refreshing, it scans out at a constant speed based on the horizontal scanrate. Once it's scanning out, you can beamrace it. (WinUAE Lagless VSYNC can beamrace VRR refresh cycles too!)
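The beamracing math is simple dead-reckoning: given a recent VSYNC timestamp and the refresh interval, you can estimate where the "beam" currently is. A minimal sketch (the function name and the scanline count are illustrative assumptions, not from any real API):

```python
# Estimate the current raster scanline by dead-reckoning from a VSYNC timestamp.
# Illustrative sketch only: real beamracing code must use the display's actual
# total scanline count (visible lines + vertical blanking interval).

def estimate_scanline(now_s, last_vsync_s, refresh_interval_s, total_scanlines):
    """Dead-reckon the beam position as a fraction of the refresh cycle.

    total_scanlines includes the vertical blanking interval, so values near
    the top of the range mean the beam is inside the VBI.
    """
    elapsed = (now_s - last_vsync_s) % refresh_interval_s
    return int(elapsed / refresh_interval_s * total_scanlines)

# Example: 60 Hz display, 1125 total scanlines (1080 visible + VBI),
# ~8.33 ms after the last vsync -> beam is roughly halfway down the screen.
line = estimate_scanline(now_s=0.00833, last_vsync_s=0.0,
                         refresh_interval_s=1 / 60.0, total_scanlines=1125)
```

This is why the VSYNC timestamps matter so much: the whole estimate hinges on knowing when the refresh cycle started and how long it lasts.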
For traditional rendering, you have to use techniques like SpecialK or RTSS Scanline Sync to hide the VSYNC OFF tearline. Delay your input read until as close as possible to render time/display time, so that the freshest input is displayed.
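The "delay your input read" trick boils down to scheduling: instead of polling input at the start of the frame, sleep until just before the latest moment that still leaves enough time to render. A hedged sketch of the arithmetic (the budget and margin values are assumptions you'd measure and tune per game/system):

```python
# Sketch of the "late input read" timing calculation.
# Instead of reading input at frame start, wake up just before the deadline
# so the rendered frame contains the freshest possible input.

def input_read_deadline(next_vsync_s, render_budget_s, safety_margin_s):
    """Latest safe time to poll input and still finish rendering before vsync."""
    return next_vsync_s - render_budget_s - safety_margin_s

# Example: 60 Hz frame (vsync at ~16.7 ms), 4 ms render budget, 1 ms margin.
# The input read can happen ~11.7 ms into the frame instead of at 0 ms,
# shaving that much off the input-to-photon latency.
deadline = input_read_deadline(next_vsync_s=0.016667,
                               render_budget_s=0.004,
                               safety_margin_s=0.001)
```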
grieverheart wrote: ↑04 Aug 2023, 07:44
Also, I think it's been a few years since you first posted the method. Have you perhaps released the source code you mentioned before? You say it's cross-platform, but I'm not sure how you acquire the vsync timestamps on the different platforms like Linux and MacOS.
I'm releasing piecemeal for now -- there's a new VSYNC dejitterer with a raster estimator here:
https://github.com/blurbusters/RefreshRateCalculator
Possible APIs to get timestamps:
https://github.com/blurbusters/RefreshR ... 1651063875
On Linux, the best one to use is XRRGetCrtcInfo(), which returns an XRRCrtcInfo structure that includes a timestamp.
On macOS/iOS, it's the CVDisplayLink (macOS) or CADisplayLink (iOS) APIs.
But you have to use VSYNC OFF mode and steer the tearlines out of view yourself. One approach is to temporarily use VSYNC ON at startup to measure the refresh interval, then switch to VSYNC OFF. A roughly 30-second startup calibration (dejittered with RefreshRateCalculator.js) tends to dead-reckon/extrapolate accurately for about 30 minutes thereafter.
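The calibration idea can be sketched as a line fit: the VSYNC ON timestamps are jittered, but fitting timestamp against frame index recovers the true refresh interval, which you then extrapolate after switching to VSYNC OFF. This is a rough illustration of the principle, not the actual RefreshRateCalculator.js algorithm:

```python
# Rough sketch of startup calibration (not RefreshRateCalculator itself):
# fit a line through jittered VSYNC timestamps to recover the true refresh
# interval, then dead-reckon future vsync times after going VSYNC OFF.
import random

def fit_refresh_interval(timestamps):
    """Least-squares slope of timestamp vs. frame index = refresh interval."""
    n = len(timestamps)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(timestamps) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, timestamps))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

def predict_vsync(t0, interval, frame_index):
    """Extrapolate the timestamp of a future refresh cycle."""
    return t0 + interval * frame_index

# Simulate ~30 seconds of a 60 Hz display with +/-0.5 ms timestamp jitter.
random.seed(1)
true_interval = 1 / 60.0
samples = [i * true_interval + random.uniform(-0.0005, 0.0005)
           for i in range(1800)]
interval = fit_refresh_interval(samples)
```

With ~1800 samples the slope estimate averages out the per-timestamp jitter, which is why a 30-second calibration can stay accurate for a long stretch of dead-reckoning afterwards (until thermal drift changes the clock).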
ad8e managed to find the Linux VSYNC listener and beamrace tearlines on Linux with this github project:
https://github.com/ad8e/vsync_blurbusters
(It compiles under Windows and Linux, with a stationary VSYNC OFF tearline capability)
For the Linux #ifdef, it uses the xrandr XRRCrtcInfo timestamp.
Some operating systems (e.g. Windows 11) let me run VSYNC ON and VSYNC OFF simultaneously in separate windows; the VSYNC ON window can stream timestamps to the VSYNC OFF context, and the dejitterer can undo any jitter added by API overhead / RPC / async handoffs.
P.S. I recently got Kefrens Bars working in a Google Chrome window in VSYNC OFF mode! I haven't released the URL because it requires a custom browser command line for VSYNC OFF, and I haven't yet implemented the websocket relay it needs between a VSYNC ON profile and a VSYNC OFF profile. It's not possible to screenshot it, so it's true raster beamracing.
This is currently an unpaid hobby side project, so I haven't done much work in this territory lately. However, I'd love to help someone start an open-source VSYNC listener with a bunch of #ifdefs for different platforms -- a VSYNC timestamps daemon, if you will. I'd perhaps throw in some code contributions. It might require access to specific driver registers on certain platforms, while APIs already exist on others.
Do you want to help create a new MIT / Apache 2.0 project that is kind of a cross platform VSYNC daemon?
There is a chicken-and-egg problem:
- Modern GPU programmers don't understand rasters
- Old-fashioned programmers don't understand modern GPU programming
So, it's been hard to get volunteers to help out with such hobby projects.