ezQuake: Just-in-Time VSYNC

Talk to software developers and aspiring geeks. Programming tips. Improve motion fluidity. Reduce input lag. Come Present() yourself!
Ashun
Posts: 7
Joined: 06 Jan 2014, 21:12

ezQuake: Just-in-Time VSYNC

Post by Ashun » 06 Jan 2014, 22:32

The discussions here by Mark and Ahigh prompted me to find a semi-modern PC game that used just-in-time rendering, and I came across a post by Tonik, one of the programmers of ezQuake, documenting the game's vsync lag fix:

http://www.quakeworld.nu/forum/topic/24 ... g-solution

The whole thread is interesting, but the important part is here:

The essence of the vsync fix is that the processing of each game frame is timed in such a way that rendering finishes just in time for the vertical retrace and SwapBuffers() fires without waiting.

My code keeps history of the render times of the last 5 frames, and it uses the maximum of these in calculating the right time to start processing the new frame. But if rendering takes longer than expected (say you were facing a wall and then turned around, render time soars), SwapBuffers will be called too late and it'll have to wait a whole new monitor refresh cycle until the next vertical retrace. You may notice occasional fps drops if you set cl_vsync_fix_tweak too low.
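For readers who want the shape of the algorithm, here is a minimal sketch of the timing logic Tonik describes -- this is not ezQuake's actual code; the function names and millisecond units are my own invention:

```c
#define HISTORY 5  /* ezQuake keeps the last 5 frame render times */

static double render_history[HISTORY];  /* render times, in ms */
static int history_pos;

/* Record how long the last frame took to render. */
void record_render_time(double ms)
{
    render_history[history_pos] = ms;
    history_pos = (history_pos + 1) % HISTORY;
}

/* Worst-case render time over the history window. */
double max_render_time(void)
{
    double worst = 0.0;
    for (int i = 0; i < HISTORY; i++)
        if (render_history[i] > worst)
            worst = render_history[i];
    return worst;
}

/* Time at which frame processing should begin so SwapBuffers()
   fires just before the retrace.  tweak_ms plays the role of the
   cl_vsync_fix_tweak safety margin. */
double frame_start_time(double next_retrace_ms, double tweak_ms)
{
    return next_retrace_ms - max_render_time() - tweak_ms;
}
```

If a frame ever takes longer than the recorded worst case, the swap misses the retrace and waits a full refresh -- exactly the occasional fps drop Tonik warns about when cl_vsync_fix_tweak is set too low.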


At the time (2007), rendering times may have been an issue, but now it's relatively easy to peg the game at 1000 FPS, so I wanted to see how this compared to VSYNC off.

I'm using a CRT, so I tried several different resolutions and refresh rates: 800x600 at 160 Hz, 1260x945 at 120Hz, 1600x1200 at 96 Hz, and finally 2560x1920 at 57Hz.

Playing at 160Hz with VSYNC off, hitting 1000 FPS (with a corresponding mouse polling rate), the immediacy is exhilarating. I'm not sure I could distinguish 1000 FPS from 500 or even 300, so I'd need to do some ABX testing, but it's scary good. Tearing is not an issue here. I suspect this would sound crazy anywhere other than BlurBusters, but I still want a higher framerate; I can distinguish individual frames during fast mouse movements.

ezQuake's VSYNC fix is the best implementation I've experienced. Tonik actually challenged responders in that thread to check whether or not they could tell a difference. The testing cfg file is no longer there, but I can see how, at lower mouse sensitivities, you might not be able to distinguish. Fast mouse flicks still feel slightly dulled.

That slight VSYNC lag remains regardless of the refresh rate, so as those rates decrease down to 57 Hz, the tradeoff is how annoyed you are with the screen tearing. I should have tried something between 57 and 96, but I found at 96+ Hz, tearing is unobtrusive. At 57Hz, the tearing is pretty conspicuous, and VSYNC on looks much better.

So if you have your old Quake files, grab ezQuake here and try it out:

http://ezquake.sourceforge.net/

User avatar
Chief Blur Buster
Site Admin
Posts: 6480
Joined: 05 Dec 2013, 15:44

Re: ezQuake: Just-in-Time VSYNC

Post by Chief Blur Buster » 06 Jan 2014, 22:47

Ashun wrote:Playing at 160Hz with VSYNC off, hitting 1000 FPS (with a corresponding mouse polling rate), the immediacy is exhilarating. I'm not sure I could distinguish 1000 FPS from 500 or even 300, so I'd need to do some ABX testing, but it's scary good. Tearing is not an issue here.
You got that right -- these numbers aren't considered crazy around here at Blur Busters.

1000fps VSYNC OFF would probably become practically indistinguishable from VSYNC ON.

VSYNC OFF microstutters have always annoyed me; even at ~200fps I can still see the VSYNC OFF microstutters. Capping framerate near refresh rate multiples (e.g. fps_max 120 or fps_max 240 or fps_max 360) solves the VSYNC OFF microstutter problem, but introduces tearing that lingers in the same places on the screen (rather than fleeting amounts of tearing). One solution is to cap it a few fps off (e.g. fps_max 118, 119, 121, 122, or near a multiple, 238, 239, 241, 242) as a compromise between VSYNC OFF microstutters, tearing, and input lag. Very game dependent.

At 200fps, during 1000 pixels/second movement (pan/strafe/turn), microstutters can be as much as 1000/200 = 5 pixels off -- rapid microstutters that vibrate back and forth with an amplitude of 5 pixels. At 1000fps, during the same 1000 pixels/second movement, microstutters are only 1 pixel off. That becomes pretty much unnoticeable.
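The arithmetic above reduces to a one-liner -- my restatement of the numbers in this post, not code from any engine:

```c
/* VSYNC OFF microstutter amplitude: each frame's positional
   error is bounded by one frame-time's worth of motion. */
double microstutter_px(double speed_px_per_sec, double fps)
{
    return speed_px_per_sec / fps;
}
```

So 1000 px/s at 200fps gives a 5-pixel amplitude, and at 1000fps only 1 pixel.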

If we had a 1000Hz monitor, 1 frame of VSYNC ON input lag would be only 1 millisecond.
But that's much harder to accomplish than either G-SYNC or OLED.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter

       To support Blur Busters:
       • Official List of Best Gaming Monitors
       • List of G-SYNC Monitors
       • List of FreeSync Monitors
       • List of Ultrawide Monitors

User avatar
Chief Blur Buster
Site Admin
Posts: 6480
Joined: 05 Dec 2013, 15:44

Re: ezQuake: Just-in-Time VSYNC

Post by Chief Blur Buster » 06 Jan 2014, 22:50

Ashun wrote:ezQuake's VSYNC fix is the best implementation I've experienced. Tonik actually challenged responders in that thread to check whether or not they could tell a difference. The testing cfg file is no longer there, but I can see how, at lower mouse sensitivities, you might not be able to distinguish. Fast mouse flicks still feel slightly dulled.
The ezQuake VSYNC fix appears to be a really interesting way of doing things.
Just-in-time VSYNC is a good solution for old Quake engines, because frames render so fast that you can simply wait until 2ms before VSYNC, render the frame in 1/1000sec, and still have plenty of time (another 1ms) to deliver on time for VSYNC.

Lagless VSYNC ON has been very rare since the days of Mortal Kombat and Super Mario Brothers (believe it or not, those classic games ran VSYNC ON and we never complained about input lag).

User avatar
RealNC
Site Admin
Posts: 2821
Joined: 24 Dec 2013, 18:32
Contact:

Re: ezQuake: Just-in-Time VSYNC

Post by RealNC » 07 Jan 2014, 02:15

Chief Blur Buster wrote:Lagless VSYNC ON has been very rare since the days of Mortal Kombat and Super Mario Brothers (believe it or not, those classic games ran VSYNC ON and we never complained about input lag).
I think that's because they were using controllers, not mice. Input lag is never a problem for me when playing anything that doesn't involve big camera movements with the mouse. For example, Skyrim with a 360 gamepad feels lag-free. The input lag is still there, of course; you just can't notice it because of the controls. I really believe the mouse is the only input device at the moment (the Oculus Rift doesn't count) that makes even the slightest input lag apparent (as long as it's above the human perception limit, of course).
The views and opinions expressed in my posts are my own and do not necessarily reflect the official policy or position of Blur Busters.

User avatar
Chief Blur Buster
Site Admin
Posts: 6480
Joined: 05 Dec 2013, 15:44

Re: ezQuake: Just-in-Time VSYNC

Post by Chief Blur Buster » 07 Jan 2014, 13:02

RealNC wrote:I think that's because they were using controllers, not mice.
Actually, it's because of 3D graphics.

Yesterday's sprite-based / tile-based games didn't need frame buffering the way 3D graphics do.

I recently did some tests (for the upcoming GSYNC review Part #2) and a gaming mouse in a modern game contributes less than 5% to the whole input lag chain (button-to-eyeballs).

HeLLoWorld
Posts: 33
Joined: 07 Jan 2014, 18:44

Re: ezQuake: Just-in-Time VSYNC

Post by HeLLoWorld » 07 Jan 2014, 21:10

I thought I'd add that the other day I successfully set up Quake Live at 200Hz on a cheaply acquired CRT, plus a solid 250fps and a 500Hz mouse.

Really, I think the more the better. It's amazing. I was able to land a phenomenal number of almost supernatural rail shots in instagib; there are just no words. Every single other display system I have tried is just garbage once you've seen that :)

I also like to believe it's better with vsync on... At this rate we're at 4ms frames, with no tearing. I'm not sure, but maybe it's better to only have fully correct, non-distorted images at that point.
Suppose we had infinite generated fps, or as many frames as there are lines in one refresh...

I wonder if the image being distorted in real time to follow the inputs would be better (you could push the hypothesis to the point where there are as many frames as there are pixels, i.e. "racing the beam," as Abrash or Carmack put it). But we don't reach that point yet, so we get tears. And tears are incorrect, incoherent images reaching your eyes, period. Slightly outdated but coherent images are not as incorrect to me, even though a tracking eye could get slight blur from them (I think your eye would have to move as fast as the vertical refresh for that; the 1-2ms phosphor fade is not negligible anymore at that point).


Then I successfully ran Warsow at 500Hz vsync on (ok, 499 I admit, Windows wouldn't let me -- amazingly, that was the only bottleneck so far), a solid 500fps from the engine, and a 500Hz mouse.

I wonder how many people have tried this; the full chain has cycles of 2 milliseconds. Not much use for just-in-time rendering now :) -- and with no tears, not even one pixel, and maybe even less stutter than vsync off, since the engine is not racing like mad to fill render-ahead queues that potentially get distorted. Just a clean 500Hz cycle: input, render, wait, display; input, render, wait, display. In fact there should be just NO stutter. Zero.

With a high-speed latency tester one could theoretically measure a maximum 4ms total lag at the top of the screen, maybe 3ms average at the top, and a maximum 6ms at the bottom. Ridiculously good if you ask me :) (and you could still gain 1ms with a 1000Hz mouse). However, I wouldn't be surprised if the real maximum lag were far worse, because of OS overhead and the scheduler multitasking things away.

You shake the mouse in circles, and you see the number of frames forming the circles on screen...Man, so many frames...At 2ms when the next frame comes the fading of the phosphor just ended :)

This was available quite a few years ago already.

Obviously the future is smoother than we've ever seen, we're not there yet.

But we will need mips.

A lot.

User avatar
Chief Blur Buster
Site Admin
Posts: 6480
Joined: 05 Dec 2013, 15:44

Re: ezQuake: Just-in-Time VSYNC

Post by Chief Blur Buster » 07 Jan 2014, 21:52

HeLLoWorld wrote:I thought I'd add that the other day I successfully set up Quake Live at 200Hz on a cheaply acquired CRT, plus a solid 250fps and a 500Hz mouse.
Firstly, welcome to Blur Busters -- and an excellent first post, I must say! Your thoughts in this first post are already Area 51 stuff. I'm betting you're a programmer, based on your writing, and your username!
HeLLoWorld wrote:I also like to believe it's better with vsync on...At this rate we're at 4ms frames, with no tear. I'm not sure but maybe it's better to only have fully correct, non-distorted images at that point.
That's why I'd love to see 240Hz GSYNC -- 4ms frames with the fully lovely VSYNC ON look without the lag compromises. And GSYNC would allow stutter-free during slowdowns (most GPU's won't be able to sustain 240fps, even in games like Counterstrike in moments of heat).
HeLLoWorld wrote:I wonder if the image being distorted in real time to follow the inputs would be better (you could push the hypothesis to the point where there are as many frames as there are pixel ie "racing the beam" as Abrash or Carmack said)...
Infinite framerate on a finite framerate display -- where each pixel is a real-time representation of the 3D scene, as the display scans out. Interesting thought.

Better yet, theoretical: infinite framerate on an infinite-framerate display (if it were possible). Low persistence without motion blur and without strobing. (A scientific challenge -- it's very difficult to create low persistence without some form of flicker, phosphor, strobing, PWM, light modulation, or other non-continuous light output.)
HeLLoWorld wrote:1-2ms phosphor fade is not negligible anymore at that point).
I've done many tests, including for adjustable-persistence strobe backlights: LightBoost 10% vs 50% vs 100%. LightBoost=10% is a 1.4ms strobe backlight flash, and LightBoost=100% is a 2.4ms strobe backlight flash. I can easily tell the difference -- 1 millisecond of persistence translates to 1 pixel of motion blurring during 1000 pixels/second motion. I consider this "Blur Busters Law" because it's a beautifully simplified version of existing science and display experience. You have probably already read some of my other posts about this, but good animations to study include http://www.testufo.com/eyetracking and http://www.testufo.com/blackframes -- to understand the relationship of persistence versus motion blur. (LCD GtG doesn't cause motion blur; persistence does.) The only way to shorten persistence is to shorten the visible frame: more frames, black periods between frames, shorter-persistence phosphor decay, a shorter strobe backlight flash, etc. As we know, short-persistence phosphor has less motion blur on CRTs than long-persistence phosphor (ala radar scope CRTs).
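The "Blur Busters Law" relationship can be written as a tiny formula -- my restatement of the rule above, with assumed units of milliseconds and pixels/second:

```c
/* Motion blur (in pixels) equals persistence (the visible frame
   time) multiplied by eye-tracking speed. */
double motion_blur_px(double persistence_ms, double speed_px_per_sec)
{
    return persistence_ms * (speed_px_per_sec / 1000.0);
}
```

So a 1.4ms LightBoost=10% flash gives roughly 1.4 pixels of blur at 1000 px/s tracking, versus about 2.4 pixels for the 2.4ms LightBoost=100% flash.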
HeLLoWorld wrote:maybe even less stutter than vsync off
It probably does. VSYNC OFF has always had microstutter of (1/framerate) relative to motion. So 1000 pixels/second VSYNC OFF motion at 250fps can have microstutters that vibrate back and forth by 1/250th of 1000 pixels/second -- stuttering edges that vibrate back and forth with an amplitude of 4 pixels.

VSYNC OFF microstutter amplitude calculations are easier to understand when playing back VSYNC OFF in super slow motion, with a continuously moving dot (simulated eye-tracking position) while the screen refreshes top-down and splices to the next refresh (tearlines) at various points in the scanout. It then becomes easy to see that VSYNC OFF microstutter equals (1/framerate) for the amplitude of the vibration of object position relative to eye position.

-- When VSYNC ON runs, you get no microstutters (at full framerate) but you get disruption during slowdowns; a microstutter error of up to (1/Hertz)
-- When VSYNC OFF runs, you always only get microstutter error of (1/framerate)
-- When VRR tech (GSYNC/FreeSync) runs, you get no microstutter error during slowdowns, assuming frame rendertime stays in sync with refresh presentation time (view motion-interpolated gsync simulation in a stutterfree browser)

So VSYNC OFF still always has microstutters, even at 500fps. The higher framerate you go, the harder microstutters are to notice at VSYNC OFF, especially when mouse microjitters starts to become your microstutter limiting factor. At 500fps, the VSYNC OFF microstutter is only a 2 pixel amplitude during 1000 pixel/second motion. These are pretty simple to my brain, but hard for a lot of people to grasp -- further involved discussion about this, we can continue in another thread ("Mathematically Calculating VSYNC OFF microstutters") -- right here in Area 51 :) ... This topic is probably too complicated for most mainstream readers, and might distract from this topic. Sometime later this year, I plan to develop a slow-motion simulated VSYNC OFF animation, that shows this.
HeLLoWorld wrote:In fact there should just be NO stutter. Zero.
Perfect VSYNC ON framerate=refreshrate motion (when the GPU is powerful enough to have no frame rate drops), is completely stutterfree in properly written game engines, at least via keyboard strafe left/right.

Even at 60fps@60Hz on a CRT, strafing left/right in FPS games becomes as smooth and blur-free as panning in old platformers such as Super Mario Brothers. 60fps@60Hz, 90fps@90Hz, and 200fps@200Hz are all perfectly stutter-free when doing VSYNC ON framerate==refreshrate, as long as frame rate always stays matched to refresh rate -- especially on strobed displays, for the zero-motion-blur CRT effect. Even the best 1000Hz mouse will always introduce microstutters (even if only a fraction of a pixel off); the best mice try to keep these microstutters below the human detectability threshold, but it is hard to make mouse left/right turns feel as smooth as keyboard strafe left/right.
HeLLoWorld wrote:You shake the mouse in circles, and you see the number of frames forming the circles on screen...Man, so many frames...At 2ms when the next frame comes the fading of the phosphor just ended :)
That said, stare directly at the center of the screen while waving the mouse pointer in circles, without tracking it with your eyes. You will still see mouse-dropping effects even at 240Hz. Since your eyes aren't tracking, the pixels in between the mouse droppings aren't illuminated, so you still see mouse droppings at 240Hz, even 500Hz. This problem is hard to solve without an infinite-framerate or continuous-motion display (to eliminate wagonwheel effects, stroboscopic effects, phantom array effects, etc).

Very interesting reading, that your brain probably would grasp (based on my first post impression), include:
- Michael Abrash of Valve Software: Down the VR Rabbit Hole (talk of 1000fps @ 1000Hz displays)
- Why We Need 1000fps @ 1000Hz This Century (my post on AVSFORUM)
- Strobe Backlights are a light output challenge (persistence of strobed LCDs versus persistence of CRTs)
HeLLoWorld wrote:But we will need mips.
A lot.
Indeed. Tons of GPU power needed for ultra high framerates on ultra high refresh rate displays.
Strobing is a lot easier, I think a great sweet spot is 120fps@120Hz strobed -- you can achieve millisecond-league persistence on LCD's this way, and have less motion blur with LightBoost=10% than with some of the slower-persistence CRT's such as Sony GDM-W900. But as we know, there's some compromises with LightBoost (lack of brightness, colors, about half a frame extra lag), and manufacturers are working on improving this situation.

Ashun
Posts: 7
Joined: 06 Jan 2014, 21:12

Re: ezQuake: Just-in-Time VSYNC

Post by Ashun » 08 Jan 2014, 21:06

Mark, the initial post was primarily about VSYNC on, but your subsequent posts made me reconsider whether running at 1000 FPS VSYNC off was actually doing any good.

To make the math a little nicer, I capped the framerate at 960 (160 Hz refresh * 6) and compared it to framerate capped at 160 (163 actual to remove the slowly crawling tear). This was to see if the newer partial images made a perceptual difference in input latency.

At 960 FPS, the front and back buffers are swapped five times per display refresh, leading to six distinct pictures separated by tears, each newer by 1.04 ms. It looks like this:

[image: screenshot showing the six tear slices within one refresh]

This was a very quick flick, about 3 pixels per tear or 2880 pixels/sec. Earlier I said tearing was not an issue at this framerate, but it's actually fairly distracting.
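The slice arithmetic in this experiment reduces to a few divisions -- a restatement of the numbers above, with invented function names:

```c
/* VSYNC OFF with fps above refresh rate: how many frame slices
   appear per refresh, how much newer each slice is than the
   last, and how far the image shifts at each tearline. */
double slices_per_refresh(double fps, double hz)
{
    return fps / hz;
}

double slice_age_gap_ms(double fps)
{
    return 1000.0 / fps;
}

double px_per_tear(double speed_px_per_sec, double fps)
{
    return speed_px_per_sec / fps;
}
```

At 960fps on 160Hz: 6 slices per refresh, each newer by about 1.04ms, and a 2880 px/s flick shifts the image 3 pixels at each tear.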

Capped at 160 FPS, the tearing is hugely improved, and most interestingly, I can't feel any difference in perceptual lag. None. It seems like latency at even moderately high refresh rates (96+) is not an issue, but stutter and tearing are where improvements need to be made.

By the way, I've been using CRTs continuously since I've been using computers, so I had no idea sample-and-hold blur was even a "thing" for the past decade. Yikes. I feel like I missed the dark age of LCDs. But the G-SYNC monitors announced at CES are pretty intriguing, and I wonder if I could just deal with the TN panel issues.

User avatar
Chief Blur Buster
Site Admin
Posts: 6480
Joined: 05 Dec 2013, 15:44

Re: ezQuake: Just-in-Time VSYNC

Post by Chief Blur Buster » 08 Jan 2014, 23:40

Ashun wrote:To make the math a little nicer, I capped the framerate at 960 (160 Hz refresh * 6) and compared it to framerate capped at 160 (163 actual to remove the slowly crawling tear). This was to see if the newer partial images made a perceptual difference in input latency.
Very interesting! That you find tearing distracting at 960 frames per second is the most interesting part of your statement. Capping your framerate to match the refresh rate looks better.

Is ezQuake able to combine Just-in-Time VSYNC with Adaptive VSYNC? (Or can Just-In-Time VSYNC be combined with NVIDIA Adaptive VSYNC?) Technically, that is simply an algorithm that intelligently "steers" the crawling tearline off the edge of the screen automatically, while keeping rendertime very close to VSYNC for minimum input lag. Frame renders that take too long will simply cause tearing to show up at only the very top edge of the screen, which automatically disappears as frame render times fall back under one frame cycle. I don't see why ezQuake couldn't pull it off, so you could enjoy a stutter-free 160 framecap, low latency, and no tearing (except infrequently at the top edge) -- since using 163 would create a harmonic of about 3 stutters per second (the beat frequency between refresh rate and frame rate).
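That "3 stutters per second" figure is just the beat frequency between the frame cap and the refresh rate -- a sketch of the arithmetic, not anything from ezQuake:

```c
#include <math.h>

/* The crawling tearline (or stutter harmonic) cycles at the
   beat frequency between frame rate and refresh rate. */
double beat_hz(double fps, double hz)
{
    return fabs(fps - hz);
}
```

A 163fps cap on a 160Hz display beats at 3Hz; a 960fps cap on 160Hz beats at 0Hz per slice position, which is why its tearlines linger in place.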
Ashun wrote:By the way, I've been using CRTs continuously since I've been using computers, so I had no idea sample-and-hold blur was even a "thing" for the past decade. Yikes. I feel like I missed the dark age of LCDs. But the G-SYNC monitors announced at CES are pretty intriguing, and I wonder if I could just deal with the TN panel issues.
You certainly missed the Dark Ages of LCD.

GSYNC still has sample-and-hold blur, but you can use ULMB (LightBoost) instead, though you then have to live with a fixed refresh rate. On the other hand, a longtime CRT user like yourself might want to wait until GSYNC matures -- such as 240fps GSYNC, or a monitor that can do GSYNC and ULMB (LightBoost) simultaneously.

User avatar
nimbulan
Posts: 316
Joined: 29 Dec 2013, 23:32
Location: Oregon

Re: ezQuake: Just-in-Time VSYNC

Post by nimbulan » 10 Jan 2014, 03:21

Just-in-time rendering sounds like it would be a great technique to combine with G-sync. You would have G-sync as a safety net to maintain smooth motion when framerate < refresh rate and allow input lag to decrease when framerate > refresh rate without resorting to vsync off. It definitely sounds like consistent frame times are a must for it to work well though.
Chief Blur Buster wrote:VSYNC OFF microstutters have always annoyed me, even at ~200fps I can still see the VSYNC OFF microstutters.
I've been playing around with vsync since our conversation about vsync on vs. off input lag, and I have to agree with you here. Even though tearing is almost imperceptible to me at 120 Hz in most of the games I've tried so far (which really surprised me, since I've always found it very obnoxious on 60 Hz displays), the stutter always feels worse than vsync on.

Post Reply