Unity Developers: Easy G-SYNC & FreeSync Support in only 3 lines of programming code!

Talk to software developers and aspiring geeks. Programming tips. Improve motion fluidity. Reduce input lag. Come Present() yourself!
User avatar
Chief Blur Buster
Site Admin
Posts: 7699
Joined: 05 Dec 2013, 15:44
Location: Toronto, Ontario, Canada

Unity Developers: Easy G-SYNC & FreeSync Support in only 3 lines of programming code!

Post by Chief Blur Buster » 27 Jun 2020, 13:43

How To Make Your Unity Game VRR Compatible

Unity is one of the most popular game engines! However, not all developers realize what settings are needed for users to enjoy G-SYNC and FreeSync.

Cloudpunk is currently on sale on Steam, so I bought it. Imagine an indie non-shooter RPG whose visuals are a mash-up of the "Minecraft" look and the "Cyberpunk 2077 Night City" look (in pixelated voxels!)

The game (as of June 2020) is so new that, before I helped the developer, it had not yet been optimized for high-Hz or VRR displays. I recently assisted the developer in fixing the game's stutters and VRR fluidity with just a few lines of code changes (hopefully the next version is the charm!).

It is easy to make a Unity-engine game VRR compatible with only a few lines of code changes. Here's the Steam discussion thread where I am helping an indie developer improve the VRR support of an indie game.

TL;DR Instructions to Add G-SYNC/FreeSync to your Unity App!

1. Use "Application.targetFrameRate = -1;"
2. Use "QualitySettings.vSyncCount = 0;"
3. Use "FullScreenMode.ExclusiveFullScreen"
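Put together, the three lines look like this in a Unity script. This is a minimal sketch: the MonoBehaviour wrapper and class name are just illustrative, but the three API calls (Application.targetFrameRate, QualitySettings.vSyncCount, Screen.fullScreenMode) are real Unity APIs.

```csharp
using UnityEngine;

public class EnableVrr : MonoBehaviour
{
    void Start()
    {
        Application.targetFrameRate = -1;   // 1. uncapped frame rate
        QualitySettings.vSyncCount = 0;     // 2. VSYNC OFF
        // 3. exclusive fullscreen (semi-optional, improves VRR reliability)
        Screen.fullScreenMode = FullScreenMode.ExclusiveFullScreen;
    }
}
```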

That's it! Now your Unity game is variable refresh rate compatible, with G-SYNC and FreeSync! Everything VRR will work: VESA Adaptive-Sync, HDMI VRR, Xbox One FreeSync, NVIDIA G-SYNC, AMD FreeSync, etc. The user might still have to enable it via their drivers/monitor, but your game will be ready. Item #3 above (exclusive full screen) is semi-optional, as windowed fullscreen will work if you configure NVIDIA Control Panel to "Windowed" G-SYNC, although less experienced users will not know how to configure that. So item #3 improves the likelihood of VRR working successfully.

Also, be careful not to accidentally lock your camera to fixed physics clocks. If you're using Unity's Cinemachine, do NOT sync the camera to FixedUpdate() or SmartUpdate(). During VSYNC OFF and VRR, cameras must sync to the floating framerate, not a fixed clock. Physics updates can continue to run at 60Hz (interpolation can help smooth things), but player movements (panning, turning, scrolling, running, etc.) must adapt to floating frame rates to achieve stutterless framerate changes.
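A hedged sketch of forcing the Cinemachine brain onto the floating frame rate. The field names below are from the Cinemachine 2.x package and may differ in other versions; the component and class name are illustrative.

```csharp
using UnityEngine;
using Cinemachine;

// Assumes the Cinemachine package is installed. Forces camera updates to
// LateUpdate (once per rendered frame) instead of FixedUpdate/SmartUpdate,
// so camera motion follows the floating frame rate that VRR needs.
public class VrrCameraSetup : MonoBehaviour
{
    void Awake()
    {
        var brain = GetComponent<CinemachineBrain>();
        if (brain != null)
        {
            brain.m_UpdateMethod = CinemachineBrain.UpdateMethod.LateUpdate;
            brain.m_BlendUpdateMethod = CinemachineBrain.BrainUpdateMethod.LateUpdate;
        }
    }
}
```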

In reality, you'll need to add more programming for configuration (Settings Screen, Graphics Options, Config File, etc). For end users, the menus are usually labelled (1) "Frame Rate Limit" setting, (2) "VSYNC ON/OFF" setting, and (3) "Full Screen" setting respectively. You've seen these game settings in many game menus. But fundamentally, that's all you need to do in order to add variable refresh rate support to your game.

Long Version Below

Here's the long version below (including debugging instructions):

Post #1 of 2
Chief Blur Buster wrote: Founder of Blur Busters / TestUFO here.
If you've seen our UFO, you know us -- www.testufo.com

I agree -- it's actually fairly simple to support FreeSync/G-SYNC

For software developers, it's simply a matter of supporting VSYNC OFF mode properly. When the drivers use VRR, it just automagically works.

Software Developer Instructions for adding FreeSync / G-SYNC Compatibility

Just make sure Direct3D Present() timestamps are time-relative to game times.

For FreeSync & G-SYNC, the monitor refreshes the instant the frame is presented by the game engine, so the frame should be time-accurate to the frame presentation time -- while not worrying about exact intervals between frame presentation times.

This erases framedrop stutters, creating seamless stutterless frame rate changes, as seen at https://www.testufo.com/vrr (software-based simulation of G-SYNC and FreeSync).

Some more detailed know-how:

https://blurbusters.com/gsync/how-does- ... -stutters/

But in reality, it's really easy. Just make sure frame presentation timing is kept in parallel with game time, preferably to within less than a millisecond (best-effort). Most game engines that support reliable "VSYNC OFF" operation (implemented properly) already work correctly when the drivers enable VRR.

With VRR, the frame rate is the refresh rate, and the refresh rate is the frame rate -- as long as the frame intervals are within the refresh rate range of the variable refresh rate monitor. The monitor hardware is syncing to the software, instead of to a fixed refresh clock.

Basically, allow the gametime to float (without fixed intervals between gametimes), and present the frames immediately on the spot. So if 12ms elapses between gametimes, 12ms should elapse between Present() or glXSwapBuffers() timings (Direct3D or OpenGL). And if the next frame interval is 17ms, then the next Present() is 17ms later. Your Present() is controlling the timing of the monitor's refresh cycles in realtime, as long as the interval is within the monitor's VRR range. (But don't worry about that detail -- the drivers handle it automatically.)
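In Unity terms, "let gametime float" mostly means driving all visible motion by the per-frame delta time, so the world state presented in each frame matches the moment the frame is handed off. A minimal sketch (class, speed value, and axis names are illustrative; Update() and Time.deltaTime are real Unity APIs):

```csharp
using UnityEngine;

// Frame-rate-independent movement: Unity calls Update() once per rendered
// frame, and scaling motion by Time.deltaTime keeps gametime in lockstep
// with frame presentation time -- whether the frame took 12ms or 17ms.
public class FloatingFramerateMover : MonoBehaviour
{
    public float speed = 5f; // units per second (illustrative value)

    void Update()
    {
        float dx = Input.GetAxis("Horizontal") * speed * Time.deltaTime;
        float dz = Input.GetAxis("Vertical") * speed * Time.deltaTime;
        transform.Translate(dx, 0f, dz);
    }
}
```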

This is a VSYNC OFF Best Practice

This is already a developer's "VSYNC OFF best practice" anyway, and if you've followed that already, then G-SYNC and FreeSync modes will automatically work very well.

If you're using an off-the-shelf engine, most of them already support VSYNC OFF, which is necessary for proper G-SYNC / FreeSync operation -- Unity, Unreal, etc.

For G-SYNC/FreeSync, framerate is refreshrate, and refreshrate is framerate, no difference

When the monitor/drivers have that enabled, the tearing disappears, and transfers over to a "perma-perfect-smooth" look despite fluctuating frame rate, just like how https://www.testufo.com/vrr can modulate framerates randomly without stutters -- that's the magic of variable refresh rate. 47fps VRR looks like perfect 47fps 47Hz VSYNC ON. And when the frame rate changes, like 53fps, then it looks identical to 53fps 53Hz VSYNC ON. And the "stutter-seams" between framerate-changes are erased by VRR.

Thus, each separate frame can have its own unique refresh rate -- equal to the time interval between the current Present() and the previous Present() -- and there can be hundreds of random intervals without stutter (except during disk-freezes / system-freezes). Sometimes slight manual VRR optimization can be needed for extreme rendertime changes between adjacent frames. Some of that can be fixed by strategically delaying Present() to re-sync time-relativity to gametimes, if the next frame rendered much faster than expected while the gametime assumed a longer rendertime. But for this engine, it looks like it's simply a minor "VSYNC OFF support" modification without need for any other optimizations (initially).

In Visual Studio, I have seen some engines suddenly go smooth with only about 15-20 lines of code change (excluding the game menu changes needed to add a "VSYNC ON/OFF" option).

Please Increase The Frame Rate Limits

Also, please increase the 120fps cap. I'm seeing the game able to run >120fps on my high-end system -- I'm often able to exceed 120fps with high-end NVIDIA cards. Doubling the frame rate halves motion blur, and I'd like the frame rate to organically "float" from 50fps to 200fps, with the 240Hz 1ms IPS monitor "floating its refresh rate" to sync to the frame rate -- a VRR range of 48Hz to 240Hz in realtime (with over 200 seamless refresh rate changes per second). It's much smoother than both ordinary VSYNC ON and ordinary VSYNC OFF.

P.S. 240Hz and 360Hz monitors are not just used by esports players, and you should futureproof the cap feature to support 1000fps 1000Hz -- see https://www.blurbusters.com/1000hz-journey -- since ultra-high frame rates are one way an LCD can accurately emulate a CRT tube without needing impulsing/phosphor.

P.S. I'd be happy to test G-SYNC / FreeSync behaviour changes between successive versions of the game.
Post #2 of 2
Chief Blur Buster wrote: Frame Rate Caps Are Still Useful
Yes, I am aware that high framerates can overheat some GPUs. That said, it's important to at least present the user with the option.

P.S. Caps are still useful for VRR, to prevent framerates from exceeding the VRR range, for various technical reasons explained at Blur Busters. However, let's keep things simple -- it is already known that the Unity framerate cap doesn't always interact well with VRR, and users can use an external frame rate cap (RTSS) too. So, keeping things simple:
1. Enable a Full Screen Exclusive mode (to bypass Windows compositor)
2. Framerate cap setting (which should now include "Disabled" or "Uncapped" or "Infinite")
3. Some engines work best with a separate VSYNC ON/OFF setting. However, users can force this externally through NVIDIA Control Panel especially if the game is running in full screen exclusive mode.

For users who want to enable VRR: configure full screen exclusive, uncapped, VSYNC OFF in the game menus. Then in NVIDIA Control Panel, enable G-SYNC. For AMD, enable FreeSync in both Catalyst Control Center and the monitor's menus. Then VRR works fine in Unity-engine games that have been programmed with (1)+(2)+(3).

There is a problem with using borderless full screen mode in Windows, since Windows often tries to fight against VRR. However, if it is too much development effort to do (1), there is a workaround for G-SYNC users (not FreeSync users): NVIDIA Control Panel -> Set up G-SYNC -> Enable for windowed and full screen mode.

Then you only need to implement (2) and (3) as a software developer, which should be easy-peasy, since they're just Unity API calls pretty much, and Unity has improved their FreeSync/G-SYNC support over the years anyway.

When using Infinite, please use "-1" for the Unity "targetFrameRate" API; do not use 1000. The -1 is the uncapped infinite frame rate setting, and is the one that triggers Unity's G-SYNC/FreeSync behaviors.

The reason you should not use 1000 is that even 1ms sometimes becomes human-visible: at a 4000 pixel/sec mouseturn, a 1ms gametime:photontime error margin translates to a 4-pixel stutterjump, and that becomes human-visible at stratospheric refresh rates when display motion blur is smaller than the stutterjump. It is amazing how milliseconds become human-visible: viewtopic.php?f=7&t=6808

So never use a 1000 cap -- and if you're using any busywait techniques for framepacing, use QueryPerformanceCounter(), RDTSC, or another microsecond-quality counter, not millisecond-granularity synchronization.
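In C#, a hedged way to get a microsecond-quality timestamp without calling QueryPerformanceCounter() directly is System.Diagnostics.Stopwatch, which wraps QueryPerformanceCounter on Windows (the class name below is illustrative):

```csharp
using System.Diagnostics;

// Microsecond-quality timestamps for framepacing measurements.
// Stopwatch.GetTimestamp() returns raw high-resolution ticks;
// Stopwatch.Frequency is the tick rate in ticks per second.
public static class MicroTimer
{
    public static double NowMicroseconds()
    {
        return Stopwatch.GetTimestamp() * 1_000_000.0 / Stopwatch.Frequency;
    }
}
```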

However, as an indie developer you're probably just letting "Unity do its thing", so for simplicity, this is your absolute minimum modification.

Minimum Modifications

1. Add an "Infinite" setting to the frame rate cap. When this is selected, set the Unity API ".targetFrameRate = -1;" combined with "QualitySettings.vSyncCount = 0;"

Full Screen Mode

VRR works better with full screen exclusive. Use the Unity API's "FullScreenMode.ExclusiveFullScreen" setting. You might want to add a three-way setting, "Windowed / Borderless Fullscreen / Exclusive Fullscreen", to give users the choice.
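A hedged sketch of wiring both minimum modifications into a settings menu. The enum names and Apply() method are illustrative (hypothetical), but the calls to Application.targetFrameRate, QualitySettings.vSyncCount, and Screen.SetResolution with a FullScreenMode are real Unity APIs:

```csharp
using UnityEngine;

public enum FrameCapSetting { Fps60, Fps120, Infinite }
public enum DisplayModeSetting { Windowed, BorderlessFullscreen, ExclusiveFullscreen }

public static class VrrSettings
{
    public static void Apply(FrameCapSetting cap, DisplayModeSetting mode)
    {
        QualitySettings.vSyncCount = 0; // VSYNC OFF (required for VRR)

        // "Infinite" must map to -1 (uncapped) -- never 1000.
        switch (cap)
        {
            case FrameCapSetting.Fps60:  Application.targetFrameRate = 60;  break;
            case FrameCapSetting.Fps120: Application.targetFrameRate = 120; break;
            default:                     Application.targetFrameRate = -1;  break;
        }

        FullScreenMode fsMode;
        switch (mode)
        {
            case DisplayModeSetting.Windowed:
                fsMode = FullScreenMode.Windowed; break;
            case DisplayModeSetting.BorderlessFullscreen:
                fsMode = FullScreenMode.FullScreenWindow; break;
            default:
                fsMode = FullScreenMode.ExclusiveFullScreen; break;
        }
        Screen.SetResolution(Screen.width, Screen.height, fsMode);
    }
}
```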

Optional Reading Below

Everything else below the line is optional, but highly educational to a software developer who wants to optimize for fewer stutters & better VRR.

Optional #1: Match Hz Setting
Ideally, I would like to see a "Match Hz" setting alongside the "Infinite" setting, which is simply Unity's "QualitySettings.vSyncCount = 1;" (this ignores the frame rate cap and uses the current monitor's Hz as the cap). This automatically smooths things out to framerate=Hz with a little bit of input lag, but some people love the fluidity, especially when using powerful GPUs that keep the framerate near Hz, for non-VRR use.

Optional #2: Debug your stutters
(HINT: It will help non-VRR too!)

1. Add a gametime:photontime divergence calculator. It's mathematically simple.
It's simply comparing the gametimes versus the real system clock, as in the following calculation:
A. Difference between two gametime timestamps (current frame and previous frame), you can use Unity API "Time.deltaTime"
B. Difference between two QueryPerformanceCounter() of the device.Present call (current frame and previous frame), basically the moment the frame is presented. Just do a QueryPerformanceCounter() (or read other microsecond counter) IMMEDIATELY before your Unity API "device.Present" call (if that's your workflow).
C. Now, calculate difference between A and B and log this every frame. This is your gametime:photontime divergence for VRR displays (i.e. stutters not fixed by VRR, as well as stutters that become worse than it should be for VSYNC ON).
D. Render the graph every game frame, onto your game screen, as an overlay -- a rolling window of the last 500 previous instances of (C) for the last ~500 frames. As you play the game, the graph (in debug mode) will be a nice stutter-visualization.
E. This gives you realtime visual debugging of stutters. If the graph is smoothly modulating, GOOD JOB! You've done a great job, and VRR will be absolutely beautiful. If the graph has hundreds of peaks and valleys, especially sudden 10ms adjacent-frame frametime changes, then something is wrong and there's some optimizing work you need to do -- a simple turn (turning left and right) shouldn't have 10ms-timescale gametime:photontime divergences.
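Steps A-E above can be sketched in a single Unity component. This is a hedged sketch: the class and field names are illustrative, LateUpdate() is used as the latest per-frame hook Unity exposes (a stand-in for timestamping immediately before Present()), and System.Diagnostics.Stopwatch stands in for QueryPerformanceCounter, which it wraps on Windows.

```csharp
using System.Collections.Generic;
using System.Diagnostics;
using UnityEngine;

public class StutterDebugOverlay : MonoBehaviour
{
    const int Window = 500;                    // D. rolling window of frames
    readonly Queue<float> divergences = new Queue<float>();
    readonly Stopwatch clock = Stopwatch.StartNew();
    double lastRealSeconds;

    void LateUpdate()
    {
        // B. wall-clock interval between rendered frames (microsecond-quality)
        double now = (double)clock.ElapsedTicks / Stopwatch.Frequency;
        float realDelta = (float)(now - lastRealSeconds);
        lastRealSeconds = now;

        // A. gametime interval; C. divergence between A and B, in milliseconds
        float divergenceMs = (Time.deltaTime - realDelta) * 1000f;
        divergences.Enqueue(divergenceMs);
        if (divergences.Count > Window) divergences.Dequeue();
    }

    // D/E. crude realtime bar-graph overlay: flat = good, spiky = stutter
    void OnGUI()
    {
        int x = 0;
        foreach (float d in divergences)
        {
            float h = Mathf.Min(Mathf.Abs(d) * 10f, 100f); // 10 px per ms
            GUI.Box(new Rect(x++, Screen.height - h, 1f, h), GUIContent.none);
        }
    }
}
```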

That will be gametime:photontime divergence for VRR -- basically VRR-unfixable stutter, as well as stutter that becomes bigger for ordinary VSYNC ON too. Optimizing this debug graph will reduce stutter for all sync technologies, while allowing VRR to do its job.

This can be simple debug logging of adjacent-frame rendering-time volatility (the render time differential between two adjacent frames). Just QueryPerformanceCounter() or grab timestamps, and take the time-difference between them. If it's hugely volatile (rather than a smoothly modulating graph), spread out the processing to keep rendertimes reasonably consistent for same-view rendering (to de-stutter simple turning, etc.) -- which may include multithreaded processing, or distributing pop-in rendering and disk loading in smaller increments or in a separate Unity job system thread.

Normally, simple changes are all you need; but if you want to push the limits of VRR quality, understanding this helps a bit. Demanding framerate changes can still be stutter-free, with VRR becoming the shock absorber for stutters. But this can fail if gametime-vs-Present()-time randomly diverges because of hugely volatile rendertimes in between.

Future Proofing Note

1ms granularity still has human-visible effects
Do not assume 1000fps granularity is good enough. We've since determined that a 1ms frametime error can still produce human-visible stutter, since 1ms = 1 pixel of misalignment per 1000 pixels/sec of motion. On an ultrahigh-Hz display at ultra-high resolutions, motion blur becomes so tiny that tiny stutters are no longer hidden by it -- 1ms translates to a 4-pixel stutterjump at 4000 pixels/second. So milliseconds matter here. Always use microsecond timestamps, clocks, and timers throughout your software; don't use legacy millisecond-granularity timers. (Fortunately Unity no longer uses millisecond-granularity timers, but don't add your own.)

How Much Render-Time Volatility Is OK?

It's OK to have rendertime modulations like 10ms, 11ms, 10ms, 9ms, 10ms, and smooth turning will usually produce smooth modulations in rendertimes. But peaky adjacent rendertimes like 7ms, 20ms, 27ms, 9ms, 17ms will all have stutters that human-visibly show through VRR, which may indicate a need for a slight scenery-processing rearchitecture to shock-absorb adjacent-frame rendertimes. Such volatility amplifies VSYNC ON stutter and VSYNC OFF stutter, and creates stutter sometimes unfixable via VRR. That said, stutters at startup and when switching between scenery are often unavoidable.

Smooth Render Time Volatility

So, just make sure your rendertimes modulate reasonably smoothly, so that gametime:photontime stays in relative sync (to sub-millisecond levels, ideally). Sudden adjacent-frame rendertime changes (e.g. scenery pops, disk loads, garbage collects) can inject gametime-versus-Present()-time non-relativities. Most engines will naturally have smooth rendertime increases/decreases as you turn around, from less complex views to highly complex views -- it's when sudden rendertime changes occur (e.g. a 1ms rendertime suddenly becomes a 20ms rendertime, because of a disk-loading stutter or a very large garbage-collect event) that stutters will still show through VRR. But as long as gametime:photontime stays in relative sync for adjacent frames to within sub-millisecond changes, stutters are generally human-invisible, below the human detection noisefloor.

Final Note

There may be some other API flags you have to change in order to get things to go suddenly smooth on VRR, but since you are using Unity, it should be quite straightforward to support G-SYNC and FreeSync -- minimal programming modifications.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter

       To support Blur Busters:
       • Official List of Best Gaming Monitors
       • List of G-SYNC Monitors
       • List of FreeSync Monitors
       • List of Ultrawide Monitors


Re: Unity Developers: Easy G-SYNC & FreeSync Support in only 3 lines of programming code!

Post by Chief Blur Buster » 27 Jun 2020, 18:56

UPDATE: Successful Aid of Indie Developer

Permalink to success report to developer on Steam
The indie developer on Steam just added the necessary in-game menu modifications to support VRR!

So these instructions successfully got an indie developer to add G-SYNC / FreeSync support in less than 24 hours! With these instructions relayed to the developer of the Cloudpunk game on Steam, VRR now works.

For many Unity engine games that don't support Full Screen Exclusive mode, it's sometimes possible to add it via a command line option:

"-window-mode exclusive"

Ideally, developers need to add it into the game menus. However, as long as the game has a configurable VSYNC OFF option with a high framerate limit or uncapped setting, an indie Unity game can gain variable refresh rate support!

Moral of the story:
- The game needs to implement uncapped-framerate VSYNC OFF support (or fully configurable high framerate caps)
- The game should support full screen exclusive, to make VRR easier and more reliable for end users, but at least there's a command line option available for some Unity-engine games.

Now, I still had to use a combo of a game-settings change AND a command line option, when it should only require game-settings changes, so I followed up with a third post to the developer of Cloudpunk:
Chief Blur Buster wrote: Recommendations For Future

Further improvement for next version (no rush):

Add Full Screen Exclusive Option To Game Menus

(A) Other modifications recommended: please add "-window-mode exclusive" behavior to the in-game menus if possible. This will make it easier for future users to successfully enable VRR. Ideally, have three settings for a DISPLAY MODE option in the game menus:

Windowed / Borderless Fullscreen / Exclusive Fullscreen

You can use alternate terminology (some games do).

There are pros/cons of exclusive versus borderless in many contexts (streaming behaviours, latency behaviours, VRR compatibility, ability to move the mouse to a 2nd monitor, changes to Alt+Tab behaviour, etc.), which necessitates providing both options.

Don't Assume Millisecond Accuracy Is Enough

(B) Internally in your game engine, please make sure that the 1000 setting represents ".targetFrameRate = -1" (uncapped) rather than ".targetFrameRate = 1000"; this may reduce stutters even further.

See my explanation of how 1ms sometimes mathematically becomes human-visible (previous post) on some ultra-high-resolution displays that have unusually low motion blur, where blurwidth (thickness of motion blur) can be smaller than stutterwidth (stutter jump distance). This futureproofs your game against a well-known Blur Busters "Vicious Cycle Effect" (bigger displays, higher resolutions, faster GPUs, less motion blur, faster pixel response, higher frame rates, wider FOVs) that collectively amplifies the visibility of formerly-invisible stutters.
For those developers surprised that a single millisecond matters, please read The Amazing Human Visible Feats Of The Millisecond. Scientifically, do not make assumptions -- you'd be surprised. Assuming a millisecond is meaningless is often a myth, like "Humans Can't See 30fps vs 60fps" -- that millisecond is a butterfly in Chaos Theory that can cascade into human visibility!

Fortunately -- (Insert Farnsworth "Good News, Everyone") -- it's just a single-line code change in many cases to remove the millisecond assumptions. Make sure your programming best-practices don't make millisecond assumptions when you can use microsecond timestamps throughout your game code. Easy-peasy futureproofing best-practice! :D


Re: Unity Developers: Easy G-SYNC & FreeSync Support in only 3 lines of programming code!

Post by Chief Blur Buster » 30 Jun 2020, 01:01

I just helped the developer fix even more stutters.

Camera movements must not be fixed updates

1. Do NOT configure Unity's Cinemachine to SmartUpdate; it needs to sync to Update instead of FixedUpdate. The camera can't be FixedUpdate or SmartUpdate for VRR.

2. Use a temporary 55fps cap for stutter-debugging VRR. Any framerate not divisible by the physics rate (60) is a great visual VRR-debugging framerate. If stutters disappear during mouse/keyboard movements, good job! (A 55fps cap should look like perfect 55Hz on a VRR display.)
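The temporary debug cap can be a tiny development-only component. A hedged sketch (class name is illustrative; the API calls and the UNITY_EDITOR / DEVELOPMENT_BUILD defines are standard Unity):

```csharp
using UnityEngine;

// Temporary 55fps debug cap: 55 is not divisible by the 60Hz physics rate,
// so FixedUpdate-vs-Update camera stutter becomes plainly visible.
public class StutterDebugCap : MonoBehaviour
{
    void Start()
    {
#if UNITY_EDITOR || DEVELOPMENT_BUILD
        QualitySettings.vSyncCount = 0;
        Application.targetFrameRate = 55; // debug only; remove for release
#endif
    }
}
```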

Most cheap gaming monitors support at least FreeSync 48Hz-75Hz, so 55fps is a good cheap office debugging / QA department standard, because many testers are issued only garden-variety 60Hz screens, and some of those already support FreeSync 48Hz-75Hz. If you absolutely cannot afford a high-Hz monitor for your testers or yourself, the 55fps test is a good stand-in for high-Hz compatibility.

If you cannot afford a gaming monitor, do the 50 Hz or 55 Hz Test As a Good Stand-In

The moral of the story: if 55fps looks smooth despite 60Hz physics, it will probably look smooth at all frame rates & VRR framerate fluctuations, so the "55fps test" is a good Unity-engine predictor of high-Hz compatibility & VRR compatibility. This is important to make sure you don't have stutter bugs resulting from the camera frame rate being mismatched with the physics update rate.

If you don't support FreeSync/G-SYNC, your QA gamers can even create this as a custom refresh rate on most 60 Hz LCDs:

This will work on most common 60 Hz computer monitors
1. Go to NVIDIA Control Panel or AMD Catalyst Control Center
2. Create a 50 Hz refresh rate or a 55 Hz refresh rate
3. Playtest your game VSYNC ON at 50 Hz or 55 Hz

Test Cases To Save Your Gaming Company Money & Raise User Reviews To More Stars

You can do most of your tests at a specific odd-Hz such as 55Hz or 144Hz. However, you should spot-check other frame rates, since physics can begin malfunctioning at excessively low or excessively high frame rates:
  • Standard 60fps
    The common denominator, goes without saying. :)
  • 50 or 55Hz
    Most 60Hz monitors will do 50Hz and 55Hz by creating a Custom Resolution in NVIDIA Control Panel or AMD Catalyst Control Center. This is an easy spot check, workable on most generic (60Hz) monitors, to make sure you don't have stutters from fixed physics clocks (FixedUpdate-vs-Update stutter). No need to purchase a gaming monitor; a fixed-Hz 55Hz test is a reasonable predictor of VRR compatibility.
  • 75Hz
    Most newer office monitors have a hidden 75 Hz setting. Many 60fps-optimized games can show unexpected stutters at 75Hz. You'd be shocked at how 75Hz looks worse than 30Hz, if you've never stutter-debugged anything other than 60Hz...
  • Emulate low-end users via RTSS caps of 5fps, 10fps, 20fps
    This helps you emulate an underpowered GPU on a powerful one. Your purchasers on Steam or your favourite app store can also come from countries with low GDPs, using computers with integrated GPUs (like cheap laptops and PCs). Make sure low frame rates don't bug out your game, so please spot-check them. There are often 1-line code mistakes that can create low-framerate bugs.
  • VSYNC OFF Uncapped frame rates
    You can use VSYNC OFF to enable insane frame rates on 60Hz monitors. There are already advantages to frame rates higher than the refresh rate, so please test VSYNC OFF to get high frame rates on any monitor -- this will help you reveal bugs. VSYNC OFF is a good poor-developer's helper for making sure you don't have variable refresh rate problems (FreeSync, G-SYNC). Yes, it has nasty tearing, but it is a good debugger for VRR behaviour on non-VRR displays!
  • Variable refresh rate
    Your generic white-label office monitor might already support it. For only ~$25-$40 more than an office monitor, a monitor can add FreeSync (48Hz-75Hz range) support, if you have to purchase a monitor (or a few) for your playtesting people. Give different playtesters different settings, such as full range, or capped.
  • Variable refresh rate with fluctuating framerates
    Enable variable refresh rate, and enable RTSS. Watch the framerate fluctuate without stutter. Framerate changes can be stutterless on VRR, as demo'd at www.testufo.com/vrr.
  • High refresh rate
    If you can afford a high-Hz monitor for your indie game, PLEASE get the highest Hz you can afford. 240Hz monitors also double as 144Hz monitors, so you can playtest both 144Hz and 240Hz on a single 240Hz monitor. Occasionally, 240Hz monitors can be found for just barely above $200 during big Amazon sales. 240Hz can do fixed-Hz tests and VRR tests, at multiple refresh rates. 240Hz is not just for esports anymore; it can look good for casual gaming too, with 1/4 the motion blur at 240fps versus 60fps.
  • Test all camera views
    If your game has multiple camera views, test all of them. Driving view, first person view, third person view, etc. Sometimes stutter only shows up with one camera view. For example at 50Hz, 75Hz and 144Hz, Cloudpunk didn't stutter during car-driving view (HOVA vehicles) but stuttered during walking on foot.
  • Big Game Company? Don't Skimp!
    If you seriously don't have these test cases, your departments (and managers) may be making flawed assumptions that create inertia against purchasing monitor upgrades for improved developer testing & play testing. You MUST give at least some of your developers a high-Hz monitor; a 144Hz monitor actually improves 60fps debugging more than forcing the developer to develop at 60Hz (the classic "force developers to develop on underpowered displays to make sure the game performs well" move is actually counterproductive for refresh-rate debugging, because of unexpected consequences).
  • Beyond ~30fps, most stutters can be generated by harmonic frequencies / beat frequencies, NOT from GPU performance
    Such as physics clocks versus frame rate: 60Hz hides stutter issues that appear at 50Hz and 75Hz. Also, stutters are not always caused by GPU performance issues -- if you are a big game company that has not yet purchased high-Hz monitors for your main developers, that is a big mistake even for casual games. Many of us play casual solo games even on 240Hz monitors because of the beautiful visuals. Do you have a 120Hz phone or iPad? You notice how beautiful scrolling is, with half the motion blur? Did you know that 240Hz has one-quarter the motion blur of 60Hz, and looks even better than your iPad? The new 2020 models of 165Hz and 240Hz IPS are visual treats. Tell your purchasing department to order 240Hz monitors, stat! Give different developers a random mix of monitor models at different refresh rates and sync technologies (FreeSync vs G-SYNC), and spread your refresh-rate stutter debugging across your team. Your users will add another star to your game by not complaining about stutter issues. I know some indie developers may not be able to afford gaming monitors -- and might be stuck with their only 60Hz monitor -- so just do the 55Hz test and the capped tests near the top of this list. But bottom line: if you can afford high-Hz, definitely buy high-Hz for your development machine.

    Third Person View Stutter Complications

    You usually won't need to worry about this if your Unity game doesn't have a 3rd person view. This complication sometimes only becomes visible when you play at framerates not divisible by physics clocks (e.g. 55fps test).
    -- World scrolling stutter while player avatar (3rd person) moves smoothly
    -- World scrolling smooth while player avatar (3rd person) stutters

    This can be caused by one of the two ticking on Update() and the other ticking on FixedUpdate(), or a similar asynchronous-ticking situation. You will have to work to make sure player movements AND camera movements are in sync with each other in an organic varying-frametime way. While FixedUpdate is fine for external physics and NPCs if you standardize on a fixed physics clock -- it is hugely problematic for 3rd person view for the player character.

    Watch any collision handling, especially if player character is ticking on floating frametime, and NPCs are ticking on a fixed-Hz internal clock. Solving issues with this is left as an exercise for the game developer.

    Thank you for Hz-futureproofing your game.


Re: Unity Developers: Easy G-SYNC & FreeSync Support in only 3 lines of programming code!

Post by Chief Blur Buster » 01 Jul 2020, 11:24

The new version of Cloudpunk is now out:
Cloudpunk Update wrote: Performance & Smoothness

We're calling this update the Butter Update. Butter?!
Yes, because it was our intention to make everything smooth like butter and remove all remaining micro-stutters happening in either low or high frames per second.

What has changed:
  • Futureproofed the game to all display technologies (high-Hz, G-SYNC, FreeSync, odd-Hz)
  • Better experience for VRR (variable refresh rate), FreeSync, G-SYNC, and high-Hz
  • Better experience for monitors with refresh rates not divisible by 60 (e.g. 75Hz, 144Hz)
  • Far less stuttering/jittering while walking
  • The game behaves better with fluctuating framerates, and framerates not divisible by 60, even on 60Hz monitors.
  • More consistent mousefeel experience, no more "lagfeel yo-yo" effect from the sync issues
In the announcement, the developer has credited my assistance:
Cloudpunk Update wrote: We would also like to thank community member Mark Rejhon of Blur Busters for his assistance and comprehensive testing of the stuttering issues.
