FreeSync Seems Not Working only in Apex Legends

BazzerBiller
Posts: 2
Joined: 02 Feb 2020, 17:42

FreeSync Seems Not Working only in Apex Legends

Post by BazzerBiller » 02 Feb 2020, 18:01

Hello everyone. About 2 weeks ago I bought an RX 580 because my old GPU (GTX 760) died, and I already had an XG2401 (144Hz with FreeSync), which I originally bought only for CS:GO. Since getting the RX 580, I wanted to test FreeSync and FPS performance in two games: PUBG and Apex Legends.

First, PUBG: I enabled FreeSync on my monitor, in the Radeon display settings, and in the per-game display settings for PUBG. I capped my FPS at 141 and started to play. My framerate was not constant and went between 70-141 depending on the situation (fights, inside buildings, etc.), but I didn't notice any stutter (maybe once or twice in 2 hours of gameplay), lag, or unsmooth gameplay at all. Everything about FreeSync and my monitor was pleasant.

But... Apex Legends was horrible in terms of gameplay. In Apex Legends my FPS pattern is almost the same as in PUBG (around 70 minimum in cities and capped at 141 in buildings or more optimized locations). However, these frametime fluctuations caused a lot of stutter and sometimes screen tearing every 2-5 seconds. I looked at my overlay and saw that my GPU usage was quite low whenever the FPS dropped and the stutter happened.

So for about a week I researched everything that could cause these stutter problems. I did the usual Windows optimizations (disabled FSO, Game Bar, etc.), and I even undervolted/overclocked my GPU, which I had never done before, to avoid throttling and gain performance (I ended up with lower temps and a 5-7% performance gain in most games), but the stutter is still there. By the end I had genuinely lost track of what FreeSync is, why capping FPS matters, and what VSYNC even does.

But I noticed something and fixed the problem temporarily: when I cap my FPS between 80-90, I get stutter-free gameplay, and stutters only happen when my FPS drops below the cap (say the cap is 88 and the FPS falls under 88). What I don't understand is that this problem doesn't happen in PUBG, even though I sometimes get 70 FPS and sometimes 141 FPS there too.

So, why is FreeSync working this inconsistently? Why does Apex Legends stutter while PUBG doesn't at the same FPS? Am I so lost that I just don't understand what FreeSync is? And finally, how should I properly use FreeSync in games where I get less than 144 FPS, and in Apex Legends in particular?
- I tested my MSI ARMOR RX 580 in the SuperPosition benchmark and GPU usage was fine; I scored 2590-2650 on 1080p Extreme across several runs, so I assume the GPU itself is fine. (I mention this because I suspected a faulty GPU, since the stutters happen exactly when GPU usage drops from 99.9% to something really low.)
Sorry if this kind of thread has already been opened and answered before; I just discovered this forum, loved the responses here, and came for help. Thank you for any replies, suggestions and advice!

Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada

FreeSync Seems Not Working only in Apex Legends

Post by Chief Blur Buster » 12 Feb 2020, 21:02

Simple Short Answer
For end users
BazzerBiller wrote:
02 Feb 2020, 18:01
My framerate was not constant and went between 70-141 depending on the situation (fights, inside buildings, etc.), but I didn't notice any stutter
FreeSync and G-SYNC are quite miraculous at erasing stutters in PUBG.
BazzerBiller wrote:
02 Feb 2020, 18:01
But... Apex Legends was horrible in terms of gameplay. In Apex Legends my FPS pattern is almost the same as in PUBG (around 70 minimum in cities and capped at 141 in buildings or more optimized locations).
...But Apex Legends has been very VRR-unfriendly.
Sometimes it works well, but often it doesn't want to play nice. You're not the only person who has experienced this.

Some games are less friendly to G-SYNC/FreeSync than others. Some games are also optimized for specific sync technologies (e.g. console games are optimized for fixed-Hz VSYNC ON), making it difficult to port them to good VRR compatibility.

Long Complex Answer
For software developers

(Included because many game developers lurk Blur Busters to read this material -- we see many view counts from game studios.)

The below is textbook reading for software developers who want to make their game engine more compatible with VRR. Stop reading here if you're not a software developer; the rest is mainly for the more technical readers who visit these forums.

<Technical>

VRR does amazing work keeping gametime:photontime consistent. When game time is in perfect sync with the moment frames become visible to human eyes -- even with erratic delivery of frames -- things look smooth despite nearly random frametimes, as seen in the demo at www.testufo.com/vrr

However, some games skew the frametimes around in ways that VRR cannot fix: erratic frame rendering times, erratic input-read times, or erratic game clocks. Even Google autocompletes "Apex doesn" to "Apex doesn't feel smooth" -- so plenty of people have the same problem with Apex.

The best thing game developers can do is to make sure that frame presentation is in sync with gametime, and that rendertimes are consistent.
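Here is a minimal sketch of that idea (my own illustrative C++, not any particular engine's code; simulateAndRender() and presentFrame() are hypothetical stand-ins): derive gametime from one monotonic clock, sampled once per frame as close to Present() as practical, instead of accumulating a fixed per-frame step that drifts away from real time.

Code:

#include <chrono>
#include <cstdio>

using Clock = std::chrono::steady_clock;

// Hypothetical stand-ins for engine work; a real game would simulate, render,
// and call its swapchain Present() here.
static void simulateAndRender(double gametimeSeconds) { std::printf("t=%.6f s\n", gametimeSeconds); }
static void presentFrame() { /* e.g. swapchain Present() with VSYNC OFF / VRR */ }

int main() {
    const Clock::time_point start = Clock::now();
    for (int frame = 0; frame < 5; ++frame) {
        // One clock sample per frame drives the camera/animation; because it is
        // taken right next to Present(), gametime stays in step with presenttime.
        double gametime = std::chrono::duration<double>(Clock::now() - start).count();
        simulateAndRender(gametime);
        presentFrame();
    }
}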

Even 1-millisecond errors in gametime generate visible stutter. 4000 pixels per second of turning x a 1ms error = a 4-pixel stutter jump! Not all game developers realize how much milliseconds matter for stutter. Game clocks need to be microsecond-accurate wherever possible --

Preliminary virtual reality headset tests show that 0.2ms gametime errors can still create human-visible stutter during low-persistence 8K (e.g. strobed ultra-high-resolution virtual reality during fast head turns -- 8K VR head-turning at 2 screenwidths per second = 16,000 pixels per second, so one 0.2ms stutter is a 3.2-pixel stutter jump, which can be visible if the strobed MPRT persistence is sufficiently low, such as 0.1ms-0.2ms -- the Valve Index doesn't quite go that low, but it does 0.33ms persistence). The higher the resolution, the lower the persistence, and the faster the eye-trackable motion speeds (with more eye-tracking room on a wide ~180-degree FOV headset), the more visible tiny gametime errors become. So if strobed persistence is really low (e.g. 0.33ms persistence in the Valve Index) and resolution is really high (e.g. nearly 4K), sub-millisecond sudden stutter begins to become human-visible.

The point being: most software developers do not realize how important the millisecond is in this refresh rate race to retina refresh rates.

Now, variable refresh rate cannot fix certain kinds of stutters (e.g. game clocks greatly diverging from the frame presentation clock, as well as ultra-volatile rendering times).

Stutters UNFIXABLE by G-SYNC and FreeSync
Stutters caused by frame presentation (i.e. frame visibility to eyes) diverging away from the game clock (i.e. gametime):
- Sometimes it diverges because of a programming bug
- Sometimes it diverges because the engine prioritizes netcode (prone to network quality)
- Sometimes it diverges because of power management
- Sometimes it diverges because of wildly fluctuating frame rendering times
- Sometimes it diverges because the game engine is optimized for VSYNC ON
- Sometimes it diverges because of a buggy frame rate cap (switch to an RTSS cap instead of the in-game cap; see the sketch after this list)
- Etc.
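For the frame-cap item above, here is a hedged sketch (illustrative only, not RTSS internals) of a cap that schedules each Present() against an absolute deadline on a monotonic clock, so the cap itself does not inject gametime-vs-presenttime divergence:

Code:

#include <chrono>
#include <thread>

using Clock = std::chrono::steady_clock;

int main() {
    // Example cap of 141 fps (a common choice just under a 144 Hz VRR ceiling).
    const auto frameInterval = std::chrono::duration_cast<Clock::duration>(
        std::chrono::duration<double>(1.0 / 141.0));
    auto nextPresent = Clock::now();

    for (int frame = 0; frame < 10; ++frame) {
        // ... simulate using a timestamp sampled from Clock, then render ...
        std::this_thread::sleep_until(nextPresent);  // absolute deadline, not a relative sleep
        // ... Present() here ...
        // Advancing by a fixed interval keeps small oversleeps from accumulating into drift.
        nextPresent += frameInterval;
    }
}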

Games that have a built-in "VSYNC OFF" feature are generally more compatible with VRR, since VSYNC OFF has behaviours that are inherently compatible with variable refresh rate technology.

I don't know if Apex netcode has any effect on panning fluidity -- it varies by game. The original DOOM (1993) often slaved local strafing and lateral movements to netcode, which is why modem errors sometimes created weird strafing-speed effects, like your movements suddenly lurching. Today, CS:GO prioritizes local movements to gametime:presenttime sync, which is why it still feels relatively smooth (mouse turns, strafing, etc.) independently of network volatility. Your fragging might be erratic (hitreg varies with network volatility, argh), but your panning/turning/strafing never gets erratic. The Battle(non)sense YouTube channel covers a ton of netcode material -- great textbook study. But local gametimes for local movements (camera view) should NOT be slaved to the network; CS:GO doesn't make that mistake, and CS:GO is still popular today.

Most of the time, game times are naturally in sync with frame presentation times. That's why variable refresh works retroactively with many games made long before G-SYNC/FreeSync. But unfortunately, not all game engines play well with variable refresh.

Public Service Advisory To Game Developers / Game Studios
Best-practice guideline for game developers and game studios: do your best to keep your gametime clocks & frame presentation times as microsecond-accurate as possible. Please -- in order to futureproof your game in the refresh rate race to retina resolutions. Today's 1ms-visible microstutter may be tomorrow's 0.4ms-visible microstutter (at increased resolutions & lowered persistence).

Stutter Math:

2000 pixels/second screen panning
+
2ms sudden divergence in game time away from frame Present() time
=
4 pixel stutter-jump even in G-SYNC and FreeSync


Regardless of Hz! Regardless of framerate! Though higher Hz & higher frame rates will amplify the visibility of momentary gametime divergences.
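As a sanity check, the arithmetic above is just panning speed multiplied by the timing error (a tiny illustrative helper of mine, nothing engine-specific):

Code:

#include <cstdio>

// Stutter jump in pixels = panning speed (px/s) * gametime-vs-presenttime error (s).
static double stutterJumpPixels(double panningPxPerSec, double timingErrorMs) {
    return panningPxPerSec * (timingErrorMs / 1000.0);
}

int main() {
    std::printf("%.1f px\n", stutterJumpPixels(2000.0, 2.0));  // 2000 px/s, 2 ms -> 4.0 px
    std::printf("%.1f px\n", stutterJumpPixels(4000.0, 1.0));  // 4000 px/s, 1 ms -> 4.0 px
}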

Human-Visible Millisecond Errors

Let's say your code inserts timing errors ("gametime vs frame-present-time" divergence, for any reason) at frequent or regular intervals:
--> A sudden 10ms gametime-vs-presenttime error at 33ms frametimes (30fps) is no big deal in the Classic Olden Days. I didn't see it.
--> A sudden 10ms gametime-vs-presenttime error at 3ms frametimes (300fps) is a holy GIANT microstutter; why is my game unsmooth?
I want my money back!

Especially if:
--> A 10ms sudden timing error at 30fps on the 640x480 of yesteryear = 1/100th of 640 = only a 6.4-pixel stutter jump at one screenwidth per second of turning/panning (totally hidden in the middle of ~21 pixels of LCD motion blurring, because of high persistence combined with slow GtG back in those olden days).
Here, the stutter jump is only ~30% of the motion blur size. I couldn't see it.
--> A 10ms sudden timing error at 300fps on the 2560x1440 ULMB of the future = 1/100th of 2560 = a big 25.6-pixel stutter jump at one screenwidth per second of turning/panning (BLATANTLY visible stutter during 1ms-MPRT strobed motion blur reduction = about 2.5 pixels of motion blurring).
Here, the stutter jump is ~1,000% of the motion blur size. Owie, painful stutter.
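The same arithmetic, extended to compare the stutter jump against the motion blur trail (MPRT persistence x panning speed) for the two eras above -- again just an illustrative sketch with the example numbers plugged in:

Code:

#include <cstdio>

static double jumpPx(double pxPerSec, double errorMs)     { return pxPerSec * errorMs / 1000.0; }
static double blurTrailPx(double pxPerSec, double mprtMs) { return pxPerSec * mprtMs / 1000.0; }

int main() {
    // Yesteryear: 640x480, one screenwidth/sec panning, ~21 px of LCD motion blur.
    double oldJump = jumpPx(640.0, 10.0);                  // 6.4 px
    double oldBlur = 21.0;                                 // high persistence + slow GtG
    std::printf("olden days: %.1f px jump vs %.0f px blur\n", oldJump, oldBlur);

    // Future: 2560x1440 ULMB, one screenwidth/sec panning, 1 ms MPRT strobing.
    double newJump = jumpPx(2560.0, 10.0);                 // 25.6 px
    double newBlur = blurTrailPx(2560.0, 1.0);             // ~2.6 px
    std::printf("strobed future: %.1f px jump vs %.1f px blur\n", newJump, newBlur);
}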

The amplified visibility effect is an important textbook lesson in the refresh rate race.

Here, the humble millisecond matters -- it is one of those "the millisecond is human-visible" situations. The accidental "surely that millisecond doesn't matter" assumption can bite a game developer's ass when de-stuttering a game engine on G-SYNC and FreeSync!

That ends up causing a game studio to waste money fixing its engine to behave better for G-SYNC and FreeSync. Companies can save money and learn from Blur Busters. For software developers and game programmers who want to learn more: understand the black box between Present() and photons. That's what Blur Busters does -- we understand what happens between the game software & the photons hitting human eyeballs.

TL;DR: Software Developers Should Aim For Microsecond-Accurate Sync Between Game Clocks -> Frame Presentation Time

Yes, I know it's aspirational. You will never be truly microsecond-accurate. But there are already many microsecond clocks, and many ways to avoid adding accidental delays between game times & frame presentation times. Skip the accidental shoot-yourself-in-the-foot mistakes and just try to make sure the microsecond you Present() is in relative sync with the microsecond of your gametime clock.

Presto. That's it. That's really mostly all a game needs to do to avoid the stutters that G-SYNC and FreeSync are unable to fix. Future, faster computers become more and more microsecond-accurate, benefiting your game naturally.

Some games, like CS:GO, already do this naturally. This is why many old games just work fine: they naturally stay smooth far beyond the frame rates they were ever tested at back then. It's amazing how many game engines insert frametime/gametime errors, and for so many unexpected reasons. Many studios hire developers who are not high-Hz experienced, and they make natural "that millisecond doesn't matter" mistakes peppered throughout a lot of otherwise perfectly good code. This is probably not exactly what happened to Apex, but it has happened to a few games. Sometimes it's even worse (e.g. games not working properly above 60 frames per second).

Local gametime (for local movement purposes) does NOT mean network time (for hitreg purposes); they can be different, as they are for CS:GO and many other games that still feel delicious in this refresh rate race far beyond the original Hz they were designed for. And even PUBG, while the engine is stuttery (and allegedly unoptimized), still keeps sufficient gametime:presenttime sync for G-SYNC/FreeSync to work miracles on it, creating beautiful smoothness out of a stuttery engine.

Bosses/CEOs/CFOs/PMs Of Game Studios, It Matters To Your Company's Bottom Line.

So, you game studios -- want your game to last as long in esports as CS:GO? Want your game to keep up with the refresh rate race, in both fixed-Hz and variable-Hz situations? Want more profits for longer, even at future frame rates and refresh rates not yet tested? Good, you're listening. Many engines (Source, Unreal, Unity), in their default programming workflows, automatically work to keep game-time-vs-presentation-time sync as close to the microsecond as possible. But if you're building an in-house engine, or using an unconventional workflow with an existing engine, then please futureproof your game engine by following the above simple rule! ;)

And hey, look at how everybody is milking their games on app stores 10 years later. It's no longer bricks and mortar. Pay attention to future proofing, and you'll have revenues for longer (= you get to pay more employees & happier employees too). Just please don't mis-design your netcode/game engine to make your game non-futureproof in this refresh rate race.

Don't be the game studio with a one-line bug making your game VRR-incompatible. Your influencer gamer is going to switch to CS:GO or other smoother games -- and not bother being an influencer for your game -- if it's not fun for them to play because your game is unsmooth, because you didn't follow a microsecond "gametime:presenttime" rule. Millisecond errors are literally 100x+ more human-visible under the new "refresh rate race" variables explained in the paragraphs above.

Besides, it is not an expensive Apollo Space Mission to make a game VRR-compatible. Who knows, maybe your game company gets more free sponsored equipment (like free GPUs) if you're on their good side by being compatible with all of the world's sync technologies -- and your game is smoother than your competitor's. Keeping up with the refresh rate race is the bottom line for your video game business, isn't it? And don't employees and stockholders deserve to be rewarded with better game profits and less time spent fixing stutter bugs?

Simply following an inexpensive, microsecond "gametime:presenttime" rule automatically makes your game work better with all sync tech anyway -- Fast Sync, Enhanced Sync, VSYNC OFF -- and whatever future sync technologies have not yet been invented. It automatically futureproofs your game for all future sync tech.

So, whenever the game is in VSYNC OFF mode in its menus, make the game engine synchronize gametime to the timing of Present().
(A) It makes VSYNC OFF behave better.
(B) It makes the game work better with G-SYNC and FreeSync (a perpetual VSYNC ON look even at fluctuating framerates -- like the www.testufo.com/vrr animation!)
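A quick way to see whether an engine follows this rule is to log the divergence directly. This is a hypothetical diagnostic of mine (not from any shipping engine): compare a fixed-step game clock against a wall-clock timestamp taken where Present() happens; an engine that keeps gametime slaved to presenttime keeps this number near zero.

Code:

#include <chrono>
#include <cstdio>

using Clock = std::chrono::steady_clock;

int main() {
    const auto start = Clock::now();
    double gametime = 0.0;
    const double fixedStep = 1.0 / 144.0;   // how the engine *thinks* time advances

    for (int frame = 0; frame < 10; ++frame) {
        gametime += fixedStep;
        // ... render + Present() would happen here ...
        double presenttime = std::chrono::duration<double>(Clock::now() - start).count();
        double divergenceMs = (presenttime - gametime) * 1000.0;
        // Unpaced fixed-step clocks drift away from presenttime; that drift is the
        // stutter that G-SYNC/FreeSync cannot hide.
        std::printf("frame %d: divergence %.3f ms\n", frame, divergenceMs);
    }
}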

</Technical>

(I know a few employees of big-name game companies already lurk these forums. If that's you, please forward this message -- LINK: posting.php?mode=edit&f=2&p=47076 -- whenever somebody at your game company says stutters on G-SYNC and FreeSync don't matter, or dismisses those "stutters that only become visible at refresh rates above 60Hz" situations. That's a warning signal that your game is not properly fluidity-futureproofed to be compatible with all Hz / all sync tech.)
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada

Re: FreeSync Seems Not Working only in Apex Legends

Post by Chief Blur Buster » 23 Mar 2020, 17:36

Adding image for completeness' sake.

Ever seen high speed videos of a monitor refreshing? See www.blurbusters.com/scanout

Now... A variable refresh monitor begins refreshing the moment you Present().

Image

That's WHY you must maintain gametime:presenttime sync in order to achieve gametime:photontime sync.

That way, game clocks stay correctly timed relative to the photons hitting human eyeballs.

You don't need to understand the above diagram. Just make sure you Present() as microsecond-close as you can, to keep your gametime clock in relative sync with frame presentation time. The stutter elimination just happens automagically when you do.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


BazzerBiller
Posts: 2
Joined: 02 Feb 2020, 17:42

Re: FreeSync Seems Not Working only in Apex Legends

Post by BazzerBiller » 25 Mar 2020, 12:02

Firstly, thanks for both the short and the long answer. I will definitely keep this amazing textbook post in mind, since I am a Computer Science student at university and planning to start with Unity.

So, this problem is basically related to the games themselves and how they implement these technologies (FreeSync, G-SYNC). That's actually quite annoying for me, because so far I always get an FPS pattern like 60-141 in any battle royale game, so stuttering seems inevitable too. In the summer I'm definitely considering building a new PC setup.

Also, I noticed that in CoD: Warzone FreeSync is quite broken too. When I launch the game a second time, FreeSync stops working at all. In the monitor's information area I can see that the VRR rate no longer matches the game. For example, normally, assuming my FPS is 90, I see 80-100 Hz on the monitor. However, when I launch the game a second time or more (without rebooting the computer), I see 90 Hz for one second and then it goes back to 143.9 Hz in the monitor's information hub, and stuttering is all over the place. I had this problem in the Crysis games too, but managed to fix it by switching the game's display mode between Windowed and Fullscreen until it worked; in CoD: Warzone that doesn't work. Again, I'm assuming it's a problem with the games, not with my setup or something I did...

In addition, I had a problem in Apex that also happens in CoD: Warzone, related to packet loss. I don't have any problem with my network in terms of a bad connection or bad router placement, but in those games my packet loss sometimes jumps to 2-10%, stays like that for about a minute, and then drops back to 0. I don't know if it's related to FreeSync, but I watched a couple of netcode videos from Battle(non)sense that cover both games and their poor netcode and server Hz problems.

I have been researching this problem and this topic for more than 2 months now. While these technologies are amazing in theory, I also think most AAA game companies cannot, or don't want to, fix the frametime inconsistency in their games, so you end up with no benefit from FreeSync at all. I mean, in battle royale games frametimes are always inconsistent unless you have a high-end computer, and not everyone has a high-end computer, so the frametime fluctuation makes FreeSync kind of useless. FreeSync only works effectively for me when I cap my FPS at my lowest FPS, and then I lose many of the benefits that come with higher Hz = better gameplay experience. Or maybe I'm just too poor and trying to find excuses for my wallet. Again, thanks for the really detailed answers! Have a healthy day!

diakou
Posts: 83
Joined: 09 Aug 2020, 11:28

Re: FreeSync Seems Not Working only in Apex Legends

Post by diakou » 09 Aug 2020, 11:59

Chief Blur Buster wrote:
12 Feb 2020, 21:02
[full post quoted above]
I know I already wrote a post in another category on a somewhat similar topic, but here's a question out of curiosity:

Regarding developers of fighting games that typically lock to 60fps: it is very clear that higher refresh rate monitors make these games perform better, whether that's because of V-SYNC improvements in "fake fullscreen" mode under DWM, or because higher refresh rates cut those input lag figures to a half or a third.

And obviously it's really important to try to sync your gametime/presenttime in a fighting game, because you have even fewer frames to play with: an inaccuracy, or something being pushed forward to the next frame, has dramatic input lag consequences in a 60fps game compared to 300fps.

But what would the argument be for fighting games running at a 60fps game simulation? Street Fighter V is notorious for a ton of problems in this regard, and also for the opposite (where G-SYNC at high refresh rates heavily improves the game).

Sorry if I couldn't get the point across properly. I've educated myself a ton on these topics over the past two years, but I still struggle to talk about them and present my information. A bit like knowing how to solve 2+2 but not being able to teach it :P

Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada

Re: FreeSync Seems Not Working only in Apex Legends

Post by Chief Blur Buster » 10 Aug 2020, 15:11

diakou wrote:
09 Aug 2020, 11:59
But what would the argument be for fighting games running at a 60fps game simulation?
More frames are indeed better for less display motion blur & less stroboscopic effect.
However, a high-Hz VRR monitor indeed has huge latency-lowering benefits for 60fps-locked games.

Fixed-framerate output from games usually benefits hugely from the lag reductions of FreeSync/G-SYNC, as long as the framepacing is good (e.g. the game presents frames 1/60sec apart without frametime fluctuations).

This is because there's no such thing as a "frame pushed forward to the next refresh". If a frame is a microsecond too late, the monitor simply waits an extra microsecond for the game. Or even 10 microseconds. Or even 1 millisecond.

1/60sec is 16.7 milliseconds.

So 60fps can be 16.7ms-16.7ms-16.7ms-18ms-16.7ms-16.7ms-16.7ms. That 18ms frame stays 18ms, instead of becoming 33ms. With G-SYNC and FreeSync, the monitor syncs to the game instead of vice versa. Stutter thus becomes much smaller, since there's no such thing as a missed refresh cycle with FreeSync or G-SYNC: refresh cycles are asynchronous, with a variable amount of time between individual refresh cycles.
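To make the framepacing point concrete, here is a tiny illustrative calculation (my own sketch, with made-up frametimes): on fixed-Hz VSYNC ON a late frame rounds up to the next whole refresh slot, while on VRR the display simply waits, so the presentation interval equals the frametime.

Code:

#include <cmath>
#include <cstdio>

int main() {
    const double refreshMs = 1000.0 / 60.0;                       // 16.7 ms fixed-Hz slots
    const double frametimesMs[] = { 1000.0 / 60.0, 18.0, 1000.0 / 60.0 };

    for (double ft : frametimesMs) {
        double fixedHz = std::ceil(ft / refreshMs) * refreshMs;   // 18 ms -> snaps to 33.3 ms
        double vrr     = ft;                                      // 18 ms stays 18 ms
        std::printf("frametime %.1f ms -> fixed 60 Hz %.1f ms, VRR %.1f ms\n", ft, fixedHz, vrr);
    }
}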

Now, independent of frametime variances, there is scanout latency, as seen at www.blurbusters.com/scanout -- the higher the maximum refresh rate of a VRR display, the faster it refreshes individual refresh cycles. A 240Hz gaming monitor will scan out 60fps frames in 1/240sec each.

This is because not all pixels refresh at the same time: the higher the max Hz, the less time it takes from the first pixel to the last pixel to get refreshed in one sweep -- typically from top edge to bottom edge. This is independent of the current refresh rate on a VRR monitor. Even though the time interval between refresh cycles can vary, the time spent refreshing each individual refresh cycle stays constant at the max Hz (so framerates of 30fps-240fps become 30Hz-240Hz, but every refresh cycle is scanned out in 1/240sec on a 240Hz G-SYNC or FreeSync monitor).
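The scanout arithmetic is equally simple (illustrative numbers only): the sweep time of one refresh cycle is 1 / maxHz, whatever the current frame rate is.

Code:

#include <cstdio>

int main() {
    const double maxHz = 240.0;
    const double vrrScanoutMs = 1000.0 / maxHz;   // ~4.2 ms per refresh cycle on a 240 Hz VRR panel
    const double native60HzMs = 1000.0 / 60.0;    // ~16.7 ms per refresh cycle on a plain 60 Hz panel
    std::printf("60 fps frame: scanned out in %.1f ms (240 Hz VRR) vs %.1f ms (60 Hz)\n",
                vrrScanoutMs, native60HzMs);
}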

This is why well-framepaced fixed-framerate material -- whether emulators or 60fps-locked games -- works excellently, with such low latency, on variable refresh rate displays. You don't have 16.7 milliseconds suddenly doubling to 33.3ms: a frame that takes slightly longer than 16.7ms simply stays slightly longer, and anything shorter (like 15ms) is displayed as-is too. Likewise, the individual refresh cycles are scanned out faster, resulting in lower latency despite the same frame rate. So that's two major pros:
(1) Reduction of stutter from VSYNC-missed events (typical of fixed Hz displays)
(2) Latency reductions from the faster refreshing of individual refresh cycles

A good demonstration of stutter elimination with random frametimes is the TestUFO variable refresh rate simulation, "Struggle at Max" -- the VRR half stays smooth, looking as if it's already at maximum framerate, while the non-VRR half shows visible stutter.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter

