Unity is one of the most popular game engines! However, not all developers realize what settings are needed for users to enjoy G-SYNC and FreeSync.
There is currently a Steam sale on Cloudpunk, so I bought it. Imagine an indie non-shooter RPG whose visual look is a mash-up of a "Minecraft" look and a "Cyberpunk 2077 Night City" look (in pixelated voxels!)
The game (as of June 2020) is so new that, before I helped the developer, it had not yet been optimized for high-Hz or VRR displays. I recently assisted the developer in fixing the game's stutters and improving its VRR fluidity, with just a few lines of code changed (hopefully the next version is the charm!).
It is easy to make a Unity-engined game VRR compatible with only a few lines of code changes. Here's the Steam Discussion thread where I am helping an indie developer improve VRR support of an indie game.
TL;DR Instructions to Add G-SYNC/FreeSync to your Unity App!
1. Use "Application.targetFrameRate = -1;"
2. Use "QualitySettings.vSyncCount = 0;"
3. Use "FullScreenMode.ExclusiveFullScreen"
That's it! Now your Unity game is variable refresh rate compatible, with G-SYNC and FreeSync! Everything VRR will work: VESA Adaptive-Sync, HDMI VRR, Xbox One FreeSync, NVIDIA G-SYNC, AMD FreeSync, etc. The user might still have to enable it via their drivers/monitor, but your game will be ready. Item #3 above (exclusive full screen) is semi-optional, as windowed fullscreen will work if you configure NVIDIA Control Panel to "Windowed" G-SYNC, although less experienced users will not know how to configure that. So item #3 improves the likelihood of VRR working successfully. A minimal code sketch of all three settings follows below.
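For clarity, here is a minimal sketch of all three settings in one place. This is not official sample code -- the class name is my own -- but the three API calls are exactly the ones listed above; attach it to any bootstrap object in your first scene.

using UnityEngine;

// Minimal VRR-ready setup (sketch): uncapped, VSYNC OFF, exclusive fullscreen.
public class VrrSetup : MonoBehaviour
{
    void Awake()
    {
        // 1. Uncap the frame rate (-1 = let the frame rate float)
        Application.targetFrameRate = -1;

        // 2. Disable Unity's VSYNC so frames are presented the moment they finish
        QualitySettings.vSyncCount = 0;

        // 3. Exclusive fullscreen maximizes the chance VRR engages without driver tweaks
        Screen.fullScreenMode = FullScreenMode.ExclusiveFullScreen;
    }
}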
Also, be careful not to accidentally lock your camera to the fixed physics clock. If you're using Unity's Cinemachine, do NOT sync the camera to FixedUpdate() or SmartUpdate(). Cameras during VSYNC OFF and VRR must sync to the floating framerate, not a fixed clock. Physics updates can continue to run at 60Hz (interpolation can help smooth things), but player movements (panning, turning, scrolling, running, etc.) must adapt to floating frame rates to achieve stutterless framerate changes. A small sketch follows below.
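Here is a tiny sketch of the principle (the script and speed value are illustrative, not from any particular game): camera motion driven per-frame by Time.deltaTime in Update(), never by the fixed physics step.

using UnityEngine;

// Sketch: drive camera motion from the per-frame delta, never the fixed physics step.
public class FloatingCameraPan : MonoBehaviour
{
    public float panSpeed = 5f;   // units per second (illustrative value)

    void Update()
    {
        // Time.deltaTime floats with the actual frame interval, so motion stays
        // time-correct at any frame rate -- exactly what VRR and VSYNC OFF need.
        float x = Input.GetAxis("Horizontal") * panSpeed * Time.deltaTime;
        transform.Translate(x, 0f, 0f);
    }

    // Deliberately NOT in FixedUpdate(): that would quantize camera motion to the
    // physics clock (e.g. 50/60Hz) and re-introduce stutter on VRR displays.
}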
In reality, you'll need to add more programming for configuration (Settings Screen, Graphics Options, Config File, etc.). For end users, the menus are usually labelled (1) "Frame Rate Limit", (2) "VSYNC ON/OFF", and (3) "Full Screen" respectively. You've seen these settings in many game menus. But fundamentally, that's all you need to do to add variable refresh rate support to your game. A sketch of wiring these menu settings to the Unity APIs follows below.
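As a rough sketch of how those menu items map onto the Unity APIs plus a simple config store -- the method name and PlayerPrefs keys below are my own illustrative choices, not a standard:

using UnityEngine;

// Sketch: apply the three menu settings and persist them (names/keys are illustrative).
public static class GraphicsOptions
{
    public static void Apply(int frameRateLimit, bool vsyncOn, FullScreenMode mode)
    {
        // "Frame Rate Limit" menu: -1 means Uncapped/Infinite
        Application.targetFrameRate = frameRateLimit;

        // "VSYNC ON/OFF" menu
        QualitySettings.vSyncCount = vsyncOn ? 1 : 0;

        // "Full Screen" menu
        Screen.fullScreenMode = mode;

        // Persist so the choices survive restarts (simple PlayerPrefs config file)
        PlayerPrefs.SetInt("frameRateLimit", frameRateLimit);
        PlayerPrefs.SetInt("vsync", vsyncOn ? 1 : 0);
        PlayerPrefs.SetInt("fullScreenMode", (int)mode);
        PlayerPrefs.Save();
    }
}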
Long Version Below
Here's the long version below (including debugging instructions):
Post #1 of 2
Post #2 of 2

Chief Blur Buster wrote:
Founder of Blur Busters / TestUFO here.
If you've seen our UFO, you know us -- www.testufo.com
I agree -- it's actually fairly simple to support FreeSync/G-SYNC
For software developers, it's simply a matter of supporting VSYNC OFF mode properly. When the drivers use VRR, it just automagically works.
Software Developer Instructions for adding FreeSync / G-SYNC Compatibility
Just make sure Direct3D Present() timestamps are time-relative to game times.
For FreeSync & G-SYNC, the monitor refreshes the instant the frame is presented by the game engine, so the frame should be time-accurate to the frame presentation time -- while not worrying about exact intervals between frame presentation times.
This erases framedrop stutters, creating seamless stutterless frame rate changes, as seen at https://www.testufo.com/vrr (software-based simulation of G-SYNC and FreeSync).
Some more detailed know-how:
https://blurbusters.com/gsync/how-does- ... -stutters/
viewtopic.php?f=2&t=6273
viewtopic.php?f=22&t=4710
But in reality, it's really easy. Just make sure frame presentation timing is in parallel with game time, preferably to within less than a millisecond (best-effort). Most game engines that support reliable "VSYNC OFF" operation (done the proper way) already work correctly when the driver enables VRR.
With VRR, the frame rate is the refresh rate, and the refresh rate is the frame rate -- as long as the frame intervals are within the refresh rate range of the variable refresh rate monitor. The hardware monitor is syncing to the software, instead of to a fixed refresh clock.
Basically, allow the gametime to float (without fixed intervals between gametimes), and present the frames immediately on the spot. So if 12ms elapses between gametimes, 12ms should elapse between Present() or glXSwapBuffers() timings (Direct3D or OpenGL). And if the next frame interval is 17ms, then the next Present() is 17ms later. Your Present() is controlling the timing of the monitor's refresh cycles in realtime, as long as the interval is within the monitor's VRR range. (But don't worry about that detail; the drivers handle it automatically.)
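To illustrate the principle outside any particular engine, here is a hedged C# sketch of a floating-gametime loop; Simulate() and RenderAndPresent() are placeholders for your engine's update and Present()/SwapBuffers() calls, not real APIs.

using System.Diagnostics;
using System.Threading;

// Sketch (engine-agnostic): let gametime advance by the measured wall-clock delta,
// then present immediately -- the Present() timestamp schedules the VRR refresh.
class FloatingGametimeLoop
{
    static void Main()
    {
        var clock = Stopwatch.StartNew();
        double previous = clock.Elapsed.TotalSeconds;
        double gametime = 0.0;

        while (true)
        {
            double now = clock.Elapsed.TotalSeconds;
            double delta = now - previous;      // real elapsed time, not a fixed step
            previous = now;

            gametime += delta;                  // gametime floats with the wall clock
            Simulate(delta);                    // move the world/camera by the real delta
            RenderAndPresent();                 // present right away, no fixed pacing
        }
    }

    static void Simulate(double delta) { /* advance animation/camera by delta */ }
    static void RenderAndPresent() { Thread.Sleep(5); /* stand-in for draw + Present() */ }
}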
This is a VSYNC OFF Best Practice
This is already a developer's "VSYNC OFF best practice" anyway, and if you've followed that already, then G-SYNC and FreeSync modes will automatically work very well.
If you're using an off-the-shelf engine, most of them now already support VSYNC OFF which is necessary for proper G-SYNC / FreeSync operation. Unity, Unreal, etc.
For G-SYNC/FreeSync, framerate is refreshrate, and refreshrate is framerate, no difference
When the monitor/drivers have that enabled, the tearing disappears, and transfers over to a "perma-perfect-smooth" look despite fluctuating frame rate, just like how https://www.testufo.com/vrr can modulate framerates randomly without stutters -- that's the magic of variable refresh rate. 47fps VRR looks like perfect 47fps 47Hz VSYNC ON. And when the frame rate changes, like 53fps, then it looks identical to 53fps 53Hz VSYNC ON. And the "stutter-seams" between framerate-changes are erased by VRR.
Thus, each separate frame can have its own unique refresh rate -- equal to the time interval between the current Present() and the previous Present() -- and there can be hundreds of random intervals without stutter (except during disk-freezes / system-freezes). Sometimes slight manual VRR optimization can be needed for extremely large rendertime changes between adjacent frames, though some of that can be fixed by strategically delaying Present() to re-sync time-relativity to gametimes if the next rendertime went much faster than expected and the gametime assumed a longer rendertime. But it looks like for this engine, it's simply a minor "VSYNC OFF support" modification without need for any other optimizations (initially).
In Visual Studio, I have seen some engines suddenly go smooth with only about 15-20 lines of code change (excluding the game menu changes needed to add a "VSYNC ON/OFF" option)
Please Increase The Frame Rate Limits
Also, please raise the 120fps cap. I'm seeing the game able to run >120fps on my high end system -- I'm often able to exceed 120fps with high end NVIDIA cards. Doubling the frame rate halves motion blur, and I'd like the frame rate to organically "float" from 50fps to 200fps, with the 240Hz 1ms IPS monitor floating its refresh rate to sync to the frame rate -- a VRR range of 48Hz to 240Hz in realtime (with over 200 seamless refresh rate changes per second). It's much smoother than both ordinary VSYNC ON and ordinary VSYNC OFF.
P.S. 240Hz and 360Hz monitors are not just used by esports, and you should futureproof the cap feature to support 1000fps 1000Hz -- see https://www.blurbusters.com/1000hz-journey -- since ultra-high frame rates are one way an LCD can accurately emulate a CRT tube without needing impulsing/phosphor.
P.S. I'd be happy to test G-SYNC / FreeSync behaviour changes between successive versions of the game.
Chief Blur Buster wrote:
Frame Rate Caps Are Still Useful
Yes, I am aware that high framerates can overheat some GPUs. That said, it's important to at least present the user with the option.
P.S. Caps are still useful for VRR, to prevent framerates from exceeding the VRR range, for various technical reasons explained at Blur Busters. However, let's keep things simple -- it is already known that the Unity framerate cap doesn't always interact well with VRR, and users can apply an external frame rate cap too (RTSS). So keeping things simple:
1. Enable a Full Screen Exclusive mode (to bypass Windows compositor)
2. Framerate cap setting (which should now include "Disabled" or "Uncapped" or "Infinite")
3. Some engines work best with a separate VSYNC ON/OFF setting. However, users can force this externally through NVIDIA Control Panel, especially if the game is running in full screen exclusive mode.
For users who want to enable VRR: configure full screen exclusive, uncapped, and VSYNC OFF in the game menus. Then, in NVIDIA Control Panel, enable G-SYNC. For AMD, enable FreeSync in both Catalyst Control Center and the monitor's menus. Then VRR works fine in Unity-engine games that have been programmed with (1)+(2)+(3).
There is a problem with using borderless full screen mode in Windows, since Windows often fights against VRR. However, if (1) is too much development effort, there is a workaround for G-SYNC users (not FreeSync users): NVIDIA Control Panel -> Set up G-SYNC -> Enable for windowed and full screen mode.
Then you only need to implement (2) and (3) as a software developer, which should be easy-peasy, since they're just Unity API calls pretty much, and Unity has improved their FreeSync/G-SYNC support over the years anyway.
When using Infinite, please use "-1" for the Unity "targetFrameRate" API; do not use 1000. The -1 is the uncapped, infinite frame rate setting, and is the one that triggers Unity's G-SYNC/FreeSync behaviors.
The reason you should not use 1000 is that even 1ms sometimes becomes human-visible: at a 4000 pixel/sec mouseturn, a 1ms gametime:photontime error margin translates to a 4 pixel stutterjump, and that becomes human visible at stratospheric refresh rates where display motion blur is smaller than the stutterjump. It is amazing how milliseconds become human visible: viewtopic.php?f=7&t=6808
So never use a 1000fps cap as a stand-in for uncapped -- and if you're using any busywait techniques for framepacing, use QueryPerformanceCounter(), RDTSC, or another microsecond-quality counter, not millisecond-granularity synchronization. A sketch follows below.
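For developers writing their own cap, here is a sketch of microsecond-quality frame pacing in C#; Stopwatch wraps the high-resolution counter (QueryPerformanceCounter on Windows). The class and method names are mine, and this is generic .NET code rather than a Unity API.

using System.Diagnostics;
using System.Threading;

// Sketch: a frame cap paced with a high-resolution counter. A coarse Sleep() covers
// most of the wait; a short busy-spin handles the final sub-millisecond precisely.
static class PreciseFramePacer
{
    static readonly Stopwatch clock = Stopwatch.StartNew();
    static double nextFrameTime;   // seconds on the Stopwatch timeline

    public static void WaitForNextFrame(double targetFps)
    {
        double frameInterval = 1.0 / targetFps;
        nextFrameTime += frameInterval;

        // Coarse wait: sleep while we are more than ~2ms early
        while (nextFrameTime - clock.Elapsed.TotalSeconds > 0.002)
            Thread.Sleep(1);

        // Fine wait: busy-spin the last fraction of a millisecond
        while (clock.Elapsed.TotalSeconds < nextFrameTime)
            Thread.SpinWait(50);

        // Resynchronize if we fell badly behind (e.g. a disk-load hitch)
        if (clock.Elapsed.TotalSeconds - nextFrameTime > frameInterval)
            nextFrameTime = clock.Elapsed.TotalSeconds;
    }
}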
However, you're probably just letting "Unity do its thing", as an indie developer, so for simplicity, this is your absolute minimum modification.
Minimum Modifications
1. Add a "Infinite" setting to the frame rate cap. When this is selected, set Unity API ".targetFrameRate = -1;" combined with "QualitySettings.vSyncCount = 0;"
Full Screen Mode
VRR works better with full screen exclusive. Use the Unity API "FullScreenMode.ExclusiveFullScreen" setting. You might want to add a three-way setting, "Windowed / Borderless Fullscreen / Exclusive Fullscreen", to give users the choice (a sketch follows below).
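A sketch of that three-way option; the dropdown-index mapping and class name are my own, while the FullScreenMode values are Unity's:

using UnityEngine;

// Sketch: apply a "Windowed / Borderless Fullscreen / Exclusive Fullscreen" choice.
public static class DisplayModeOption
{
    public static void Apply(int menuIndex)
    {
        FullScreenMode mode;
        switch (menuIndex)
        {
            case 0:  mode = FullScreenMode.Windowed; break;
            case 1:  mode = FullScreenMode.FullScreenWindow; break;      // borderless
            default: mode = FullScreenMode.ExclusiveFullScreen; break;   // best for VRR
        }
        // Keep the current resolution; only the windowing mode changes here
        Screen.SetResolution(Screen.currentResolution.width,
                             Screen.currentResolution.height, mode);
    }
}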
Optional Reading Below
Everything else below the line is optional, but highly educational to a software developer who wants to optimize better for less stutters & better VRR.
Optional #1: Match Hz Setting
Ideally, I would like to see a "Match Hz" setting alongside the "Infinite" setting, which is simply Unity's "QualitySettings.vSyncCount = 1;" (this ignores the frame rate cap and uses the current monitor's Hz as the cap). This automatically smooths things out to framerate=Hz with a little bit of input lag, but some people love the fluidity, especially when using powerful GPUs that keep framerate near Hz, for non-VRR use. A small sketch follows below.
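In code, the "Match Hz" option is essentially one call (sketch; the class name is illustrative):

using UnityEngine;

// Sketch: "Match Hz" = plain VSYNC ON, locking framerate to the monitor's refresh rate.
public static class MatchHzOption
{
    public static void Enable()
    {
        QualitySettings.vSyncCount = 1;   // sync to every refresh of the current monitor
        // Note: while vSyncCount > 0, Unity ignores Application.targetFrameRate.
    }
}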
Optional #2: Debug your stutters
(HINT: It will help non-VRR too!)
1. Add a gametime:photontime divergence calculator. It's mathematically simple.
It's simply comparing the gametimes versus the real system clock, as in the following calculation:
A. Difference between two gametime timestamps (current frame and previous frame), you can use Unity API "Time.deltaTime"
B. Difference between two QueryPerformanceCounter() of the device.Present call (current frame and previous frame), basically the moment the frame is presented. Just do a QueryPerformanceCounter() (or read other microsecond counter) IMMEDIATELY before your Unity API "device.Present" call (if that's your workflow).
C. Now, calculate the difference between A and B and log it every frame. This is your gametime:photontime divergence for VRR displays (i.e. stutters not fixed by VRR, as well as stutters that become worse than they should be for VSYNC ON).
D. Render the graph every game frame, onto your game screen, as an overlay -- a rolling window of the last 500 previous instances of (C) for the last ~500 frames. As you play the game, the graph (in debug mode) will be a nice stutter-visualization.
E. This gives you realtime visual debugging of stutters. If the graph is smoothly modulating, GOOD JOB! You've done a great job, and VRR will be absolutely beautiful. If the graph has hundreds of peaks and valleys, especially sudden 10ms adjacent-frame frametime changes, then something is wrong and there's some optimizing work you need to do -- a simple turn (turning left and right) shouldn't have 10ms-timescale gametime:photontime divergences.
That will be gametime:photontime divergence for VRR -- basically VRR-unfixable stutter, as well as stutter that becomes bigger for ordinary VSYNC ON too. Optimizing this debug graph will reduce stutter for all sync technologies, while allowing VRR to do its job.
This can be simple debug logging of adjacent-frame rendering time volatility (the render time differential between two adjacent frames). Just grab QueryPerformanceCounter() timestamps and take the time-difference between them. If hugely volatile (rather than a smoothly modulating graph), spread out the processing to keep rendertimes reasonably consistent for same-view rendering (to de-stutter simple turning, etc.) -- which may include multithreaded processing, distributing pop-in rendering and disk loading in smaller increments, or using a separate Unity job system thread, etc. A sketch of the overlay follows below.
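Here is a hedged sketch of steps A-E as a Unity overlay script. Since Unity scripts cannot hook the actual Present() call, the end-of-frame timestamp (via WaitForEndOfFrame) is used as an approximation of the presentation time; the class name and the crude OnGUI bar graph are my own illustrative choices.

using System.Collections;
using System.Diagnostics;
using UnityEngine;

// Sketch of the gametime:photontime divergence logger (steps A-E above).
public class StutterDebugOverlay : MonoBehaviour
{
    const int WindowSize = 500;                      // rolling window of ~500 frames (D)
    readonly float[] divergenceMs = new float[WindowSize];
    int head;
    readonly Stopwatch clock = Stopwatch.StartNew(); // high-resolution wall clock
    double previousPresentTime;

    IEnumerator Start()
    {
        previousPresentTime = clock.Elapsed.TotalSeconds;
        while (true)
        {
            yield return new WaitForEndOfFrame();    // approximately when the frame is presented

            // B: wall-clock interval between "present" approximations
            double now = clock.Elapsed.TotalSeconds;
            float presentDelta = (float)(now - previousPresentTime);
            previousPresentTime = now;

            // A: gametime interval for the frame that was just rendered
            float gameDelta = Time.deltaTime;

            // C: divergence, logged every frame into the rolling window
            divergenceMs[head] = (presentDelta - gameDelta) * 1000f;
            head = (head + 1) % WindowSize;
        }
    }

    // E: crude realtime visualization -- one bar per recent frame
    void OnGUI()
    {
        for (int i = 0; i < WindowSize; i++)
        {
            float v = Mathf.Abs(divergenceMs[(head + i) % WindowSize]);
            float barHeight = Mathf.Min(v * 10f, 100f);   // 10 px per ms, clamped
            GUI.DrawTexture(new Rect(i * 2, 120f - barHeight, 1f, barHeight),
                            Texture2D.whiteTexture);
        }
    }
}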
Normally, simple changes are all you need; but if you want to push the limits of VRR quality, understanding this helps a bit. Demanding framerate changes can still go stutter-free, with VRR acting as the shock-absorber for stutters. But this can fail if gametime-vs-Present()-time randomly diverges because of hugely volatile rendertimes in between.
Future Proofing Note
1ms granularity still has human-visible effects
Now, do not assume 1ms granularity is good enough. We've since determined that a 1ms frametime error can still produce human-visible stutter, since 1ms = 1 pixel of misalignment per 1000 pixels/sec of motion. On an ultrahigh-Hz display at ultra-high resolutions, when motion blur is so tiny that small stutters are no longer hidden by display motion blur, 1ms translates to a 4 pixel stutterjump at 4000 pixels/second. So milliseconds matter here. Always use microsecond timestamps, clocks, timers, etc. throughout your software; don't use legacy millisecond-granularity timers, if you use any. (Fortunately, Unity no longer uses millisecond-granularity timers, but don't add your own.)
How Much Render-Time Volatility Is OK?
It's OK to have rendertime modulations like 10ms, 11ms, 10ms, 9ms, 10ms, and smooth turning will usually produce smooth modulations in rendertimes. But peaky adjacent rendertimes like 7ms, 20ms, 27ms, 9ms, 17ms will all have stutters that human-visibly show through VRR, which may indicate the need for a slight scenery-processing rearchitecture to shock-absorb adjacent-frame rendertimes. That volatility amplifies VSYNC ON stutter and VSYNC OFF stutter, and creates stutter sometimes unfixable via VRR. That said, stutters at startup and when switching between scenery are often unavoidable.
Smooth Render Time Volatility
So, just make sure your rendertimes modulate reasonably smoothly, so that gametime:photontime stays in relative sync (to sub-millisecond levels, ideally). Sudden adjacent-frame rendertime changes (e.g. scenery pops, disk loads, garbage collects) can inject gametime-versus-Present()-time non-relativities. Most engines will naturally have smooth rendertime increases/decreases as you turn around, from less complex views to highly complex views -- it's when rendertimes change suddenly (e.g. a 1ms rendertime suddenly becomes a 20ms rendertime, because of a disk-loading stutter or a very large garbage-collect event) that stutters will still show through VRR. But as long as gametime:photontime stays in relative sync for adjacent frames, to sub-millisecond changes, stutters are generally human-invisible, below the human detection noisefloor.
Final Note
There may be some other API flags you have to change in order to get things to go suddenly smooth on VRR, but since you are using Unity, it should be quite straightforward to support G-SYNC and FreeSync -- minimal programming modifications.