Blur Buster's G-SYNC 101 Series Discussion

Talk about NVIDIA G-SYNC, a variable refresh rate (VRR) technology. G-SYNC eliminates stutter and tearing, and reduces input lag. List of G-SYNC Monitors.
User avatar
Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by Chief Blur Buster » 05 Jun 2018, 10:30

<TECHNICAL REPLY>
Advanced Reading for Advanced Users
jorimt wrote:Beyond the conceptual basics that I've already covered in my series, short of reverse engineering the G-SYNC module and G-SYNC driver component, I doubt we'll ever have the exact answer to such specific questions regarding G-SYNC's inner workings, which I'm sure Nvidia prefers; industry secrets and all that.
This is true. G-SYNC is proprietary, so my answer will be based on FreeSync / Adaptive-Sync / HDMI VRR behavior, which all behave the same at the blanking-interval level (Timings & Resolution).

Portions will definitely also apply to G-SYNC because the panel is still the same (in both G-SYNC and FreeSync) even if the monitor motherboard driving it is different. For example, the ViewSonic XG2530 (FreeSync) and XG2560 (G-SYNC) use the same panel, just different electronics driving it. At the end of the day, the panel is accepting synchronization signals, so the terminology still largely applies at the TCON level. Today, G-SYNC is often (but not always) built into a custom TCON programmed by NVIDIA, so the answer may or may not apply at the LCD pixel level...

However, this will cover fundamental concepts of VRR since all transmissions over video cables still have synchronization signals (Porches, Sync, etc).
KKNDT wrote:I have questions about the VBI:
Before I answer, I need to explain to readers the purpose of the VBI. (Ever wonder where "VSYNC" comes from?)

Even a 2018 DisplayPort cable transmits the same signal topology as a 1930s analog TV signal: a calendar-style sequence of imagery, top-to-bottom, left-to-right, involving 1930s-era Porch and Sync signals that are still used to this day regardless of cable format, and still apply to the signal between an NVIDIA card and a G-SYNC monitor. How the card and monitor handle them is open to question, but the method of serializing two dimensions into one dimension has remained unchanged for almost a century.

Whether the year is 1930s or 2020s, it's all the same:
  • From left to right: Horizontal Sync -> Horizontal Back Porch -> Horizontal Active -> Horizontal Front Porch
  • From top to bottom: Vertical Sync -> Vertical Back Porch -> Vertical Active -> Vertical Front Porch
  • Sync signals in the analog era control a CRT gun, moving it to a new position (e.g. top edge, or left edge, or both)
  • Sync signals in the digital era are simply logic telling the electronics to "Prepare for a new row of pixels" or "Prepare for a new refresh cycle".
  • Porches are simply overscan padding (safety padding between the sync signal & active image).
  • VRR simply commandeers the bottom-edge overscan (Front Porch), varying it to vary the timing of individual refresh cycles (see the arithmetic sketch below).
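To make the arithmetic behind that structure concrete, here's a minimal sketch (Python, using the well-known 1080p60 timing as example numbers, purely for illustration; not any particular G-SYNC or FreeSync monitor) that tallies the horizontal and vertical segments into a scanrate and refresh rate:

```python
# Minimal sketch of classic video-timing arithmetic.
# Example numbers are the standard 1080p60 timing (2200 x 1125 total, 148.5 MHz),
# purely for illustration -- not any specific G-SYNC/FreeSync monitor.

h_active, h_front, h_sync, h_back = 1920, 88, 44, 148   # pixels per scanline
v_active, v_front, v_sync, v_back = 1080, 4, 5, 36      # scanlines per refresh

h_total = h_active + h_front + h_sync + h_back          # 2200 pixels incl. blanking
v_total = v_active + v_front + v_sync + v_back          # 1125 lines incl. blanking

pixel_clock_hz = 148_500_000
scanrate_hz = pixel_clock_hz / h_total                  # horizontal scanrate (lines/sec)
refresh_hz = scanrate_hz / v_total                      # refresh rate (refreshes/sec)

print(f"scanrate = {scanrate_hz/1000:.1f} kHz, refresh = {refresh_hz:.2f} Hz")
# -> scanrate = 67.5 kHz, refresh = 60.00 Hz
```

These are the same sums that tools like ToastyX CRU expose when you edit a custom resolution.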

This is how a 2D image has always been serialized over a one-dimensional wire or radio signal, and the same structure carried over into digital and packetized standards, even if the sync intervals are now greatly reduced.

So, because the fundamental "serialize a 2D image onto a 1D wire" concept has remained unchanged for the better part of 100 years -- we can easily generalize how VRR was piggybacked onto it. While G-SYNC is proprietary, it still has to adhere to that "cram 2D into 1D" concept and stick to a reasonable amount of cable-signal standardization (to be compatible with things like amplifiers, switches, etc), so a significant percentage of this answer will still apply to G-SYNC even if extra proprietary data is embedded in the signal.

Here's my answer:
KKNDT wrote:1. Default VBI consists of VBPD, VFPD, VSPW. Which one is to be changed during VRR operation?
Disambiguation for readers:
VBPD = "Vertical Back Porch" seen in ToastyX CRU
VFPD = "Vertical Front Porch" seen in ToastyX CRU
VSPW = "Vertical Sync" interval seen in ToastyX CRU
KKNDT wrote:2. When the buffer swap occurs, the game will trigger a new refresh cycle. Is it the VBPD that leads the new cycle?
Correct. During VRR operation, the monitor is held in continual transmission of Back Porch scanlines (VBPD) -- metaphorically, dummy rows of blank pixels hidden above the top edge of the screen. The Back Porch is variable, while Sync and Front Porch remain fixed.

(For those who program Custom Resolutions and do FreeSync range-overriding techniques: what gets entered in ToastyX CRU simply becomes the minimum-size Back Porch, and the FreeSync driver will automatically vary it to vary the time between refresh cycles, down to the minimum Hz of the VRR range.)
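As a rough illustration of that CRU remark (assumed example numbers only, reusing the illustrative 67.5 kHz / 1125-line timing from the sketch above; real VRR ranges differ per monitor), padding the porch with extra scanlines is what stretches the time between refresh cycles:

```python
# Sketch: how padding the vertical porch with extra scanlines stretches the time
# between refresh cycles.  Illustrative numbers only, not a real monitor's range.

scanrate_hz = 67_500                 # horizontal scanrate (scanlines per second)
v_active, v_front, v_sync = 1080, 4, 5
v_back_min = 36                      # the value entered in CRU = minimum-size porch

def refresh_interval_ms(v_back):
    """Duration of one refresh cycle when the padding porch is v_back lines tall."""
    v_total = v_active + v_front + v_sync + v_back
    return v_total / scanrate_hz * 1000.0

print(refresh_interval_ms(v_back_min))          # ~16.7 ms -> ~60 Hz (minimum-size porch)
print(refresh_interval_ms(v_back_min + 1125))   # ~33.3 ms -> ~30 Hz (driver kept padding)
```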

Interestingly, on this topic:
This adds an ultra-minor, microseconds-league granularity behavior that few people know about:
During VRR, the timing of the next vertical refresh cycle is quantized to the granularity of the horizontal scanrate!

Once the buffer swap occurs (e.g. the API call, Present() or glutSwapBuffers(), happens), the next scanline transmitted out of the graphics output will be the first row of pixels of the frame buffer (top edge of Active), right after the current still-scanning-out Back Porch scanline is complete. This cleverness allows adherence to the classic video signal structure, while adding a VRR upgrade to it that allows display refresh cycles to be software-triggered.

NOTE1: If too much time passes without a buffer swap, the graphics card (at least on open VRR standards) will begin re-transmitting a duplicate of the previous refresh cycle. This is because pixels on a panel will go stale (e.g. fade away in a glitchy way) or go blank (e.g. go black or white, due to electronics on the TCON) if too much time passes without a refresh cycle. That's why a VRR range has a minimum Hz.

NOTE2: For framerates below minimum Hz, there's a trick available. Low Framerate Compensation logic simply times the repeat-refresh cycles intelligently so they don't collide with the timing of API-triggered refresh cycles. So 24fps may actually cause the drivers to time the repeat refresh cycles exactly between API calls, creating 48Hz out of 24fps (24 API calls per second). Done well, there's no stutter. However, random framerates (highly variable frametimes) can defeat this guessing logic of Low Framerate Compensation, and the timings of auto-repeat refreshes will begin colliding with the timings of software-API-triggered refresh cycles -- creating microstutter for framerates below the minimum Hz of the VRR range.
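A crude sketch of that guessing logic (my own simplification for illustration; not NVIDIA's or AMD's actual driver code, and the function name is made up): estimate the frametime, and if it's longer than the slowest refresh the panel tolerates, split it into evenly spaced repeat refreshes so they land between the expected Present() calls.

```python
import math

# Simplified illustration of Low Framerate Compensation (LFC) style repeat-refresh
# timing.  Conceptual only -- not the actual driver logic.

def plan_refreshes(frametime_ms, min_hz):
    """Refresh intervals (ms) used to display one frame, given the panel's minimum Hz."""
    max_interval_ms = 1000.0 / min_hz            # slowest refresh the panel tolerates
    if frametime_ms <= max_interval_ms:
        return [frametime_ms]                     # inside the VRR range: one refresh per frame
    # Repeat the frame enough times that every interval fits inside the VRR range,
    # spaced evenly so repeats fall between the expected software-triggered refreshes.
    repeats = math.ceil(frametime_ms / max_interval_ms)
    return [frametime_ms / repeats] * repeats

print(plan_refreshes(1000 / 24, min_hz=30))   # 24 fps -> two ~20.8 ms refreshes (~48 Hz)
print(plan_refreshes(1000 / 50, min_hz=30))   # 50 fps -> one 20 ms refresh (in range)
```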

If the display is currently scanning at ~160KHz horizontal scanrate (160,000 pixel rows per second including the blanking interval), that means the new refresh cycle may be delayed by something like 1/160,000sec before it finally "hits the wire". That's because it waits for the current still-scanning-out Back Porch scanline to finish before beginning pixel row #1 of the visible refresh cycle. There may be other driver and codec granularities introduced (e.g. DisplayPort packetization) that add a few scanlines of delay, but practically, it's immediate.

Granularity in VRR operation that reaches the millisecond level can become human-visible as ultra-minor microstutter, so it's critical that refresh timing remains in the <100us range. At a 160KHz scanrate, that's less than 10 microseconds of granularity between the API call and the pixels hitting the cable -- far below a human's ability to detect.
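Putting rough numbers on that (a back-of-the-envelope only, assuming the ~160 kHz scanrate quoted above):

```python
# Back-of-the-envelope for the scanline-granularity claim above (assumed 160 kHz scanrate).

scanrate_hz = 160_000
scanline_us = 1_000_000 / scanrate_hz       # one scanline, incl. horizontal blanking

print(f"worst-case wait for the in-flight scanline: {scanline_us:.2f} us")   # 6.25 us
print(f"average wait (uniform arrival):            {scanline_us/2:.2f} us")  # ~3.1 us
# Both are far below the <100 us budget mentioned above, and orders of magnitude
# below the millisecond territory where microstutter starts becoming visible.
```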

VRR is piggybacked in such a clever way -- to the point that FreeSync also unexpectedly works on some analog MultiSync CRTs (at least the ones without oversensitive refresh-rate-change blanking circuits, especially when the refresh rate isn't slewed too quickly).

</TECHNICAL REPLY>
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


KKNDT
Posts: 51
Joined: 01 Jan 2018, 08:56

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by KKNDT » 09 Jun 2018, 03:05

That's very TECHNICAL!

So I guess the story of pixels travelling from the graphics card to the monitor via the cable goes like this:

[Attached diagram: 1.png]

If you use a resolution lower than native without scaling, is the black area surrounding the active area just the porches?

User avatar
Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by Chief Blur Buster » 10 Jun 2018, 03:54

Pretty close, topologically.

It's an infinite repeating sequence (Sync-FrontPorch-Active-BackPorch-Sync-FrontPorch-Active-BackPorch, etc.) ... I usually think of it in the order Sync-FrontPorch-Active-BackPorch, even if from a VRR perspective it is sometimes more easily diagrammed as Active-BackPorch-Sync-FrontPorch like your diagram. But the infinite repeating sequence is still the same: [...]-Active-BackPorch-Sync-FrontPorch-Active-BackPorch-Sync-FrontPorch-[...] looping in that order every refresh cycle, with the Vertical Back Porch being used to time-pad between refresh cycles.

The horizontals are full height. Note that the vertical porches will often still have the horizontals embedded in them; only the vertical sync is completely devoid of the horizontal sync signal (in the analog era). Porches can essentially behave as extra virtual resolution, and data has sometimes been hidden in them (e.g. the closed-captioning signal on offscreen scan line #21, just above the top edge).

So your black left edge will extend the full height (except it'll be absent from the vertical sync). Because the overscan areas (vertical porches) still have horizontals embedded in them too, they are still stuck at horizontal-scanrate granularity.
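A small sketch of what that looks like line by line (illustrative line counts only, following the Sync -> Back Porch -> Active -> Front Porch ordering listed earlier in the thread). Every scanline, including the vertical-porch lines, is still a full scanline with its own horizontal sync and porches:

```python
# Sketch: classify each scanline of a refresh cycle.  Every line -- including the
# vertical porch lines -- is a full scanline with its own horizontal sync/porches.
# Ordering follows the Sync -> Back Porch -> Active -> Front Porch sequence listed
# earlier in this thread.  Illustrative line counts only.

v_sync, v_back, v_active, v_front = 5, 36, 1080, 4     # 1125 lines total

def classify(line):
    """Which vertical region a 0-based scanline index falls into."""
    if line < v_sync:
        return "vertical sync"
    if line < v_sync + v_back:
        return "vertical back porch (hidden rows above the top edge)"
    if line < v_sync + v_back + v_active:
        return "active picture"
    return "vertical front porch (hidden rows below the bottom edge)"

for n in (0, 20, 40, 600, 1124):
    print(n, "->", classify(n))
```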

From a signal-structure standpoint, it's the same for higher and lower resolutions. Most of the time, the monitor is responsible for deciding what to do with the resolution (centered or scaled). That is, if the monitor is doing the windowboxing, of course.

But yes, in the analog era at least, porches are black-colored. In NTSC signals, the porches are at 7.5 IRE (7.5/100ths of the voltage between complete black and complete white) and Sync is at 0 IRE (0/100ths of that voltage). That's my understanding. Digitally, the meanings are often quite different, with the confusion of HDTV sets having a toggle for 0-to-100 IRE versus 7.5-to-100 IRE, creating a black-level difference. In the digital era that's wasted dynamic range, so we often use full-range signals.
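For the digital side of that black-level confusion, here's a hedged sketch of the usual 8-bit "limited range" (16-235) versus "full range" (0-255) mapping, which is the modern counterpart of the 0 vs 7.5 IRE toggle (standard mapping; the helper names are mine):

```python
# Digital-era counterpart of the analog black-level toggle: "limited range" video
# maps black to code 16 and white to 235 (8-bit), "full range" uses 0..255.
# A mismatch between source and display raises or crushes blacks, much like the
# analog 0 vs 7.5 IRE confusion.

def full_to_limited(code):
    """Map an 8-bit full-range luma code (0..255) to limited range (16..235)."""
    return round(16 + 219 * code / 255)

def limited_to_full(code):
    """Inverse mapping, clipped to 0..255."""
    return round(min(255, max(0, 255 * (code - 16) / 219)))

print(full_to_limited(0), full_to_limited(255))    # 16 235
print(limited_to_full(16), limited_to_full(235))   # 0 255
```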

Things can get confusing between the porches and active when a signal is being underscanned... but on analog displays you could see the image data of the porches. And you often saw the structure with ghosted signals (e.g. two NTSC signals overlapping each other, one distant station showing a ghost image) -- with that, you sometimes saw the actual structure of the porches and sync.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


MT_
Posts: 113
Joined: 17 Jan 2017, 15:39

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by MT_ » 15 Jul 2018, 08:05

jorimt wrote:I actually used a -2 fps limit across the board in all those tests. I simply came to the conclusion that the absolute safe minimum would be -3, but this could vary with certain in-game limiters, that can fluctuate like crazy. -2 in most instances is perfectly safe regardless.

As for G-SYNC borderless/windowed and DWM lag, I wasn't originally going to even test for that scenario until someone suggested it, and when I did, I was surprised at the results. As I say in the article, "Further testing may be required, but it appears on the latest public build of Windows 10 with out-of-the-box settings (with or without “Game Mode”), G-SYNC somehow bypasses the 1 frame of delay added by the DWM."

It could vary by system, but on my setup in the article (which I've now upgraded from), I couldn't get it to tear in borderless or windowed without G-SYNC, where others can, so it's possible that may have been a factor, but then that means I was testing the worst case scenario, and there was still no more lag in G-SYNC borderless/windowed than in exclusive fullscreen.

The slight increase for the windowed mode in some instances could be because the frame update starts right at the top of the screen with G-SYNC, and with windowed, there was obviously the window title bar at the very top, which made detection of the start of the frame impossible in that area. Otherwise, yeah, everything was well within my margin of error.

Battlenonsense also got a notable reduction in input lag with Game Mode, while my undocumented results in this thread showed no improvement whatsoever.

System differences make direct comparison of some testing scenarios difficult.
Does this basically mean that, for all G-sync users, this new 'Fullscreen Optimization' DWM-bypass technique is not relevant anymore?

Because if there is theoretically the lowest possible amount of input lag achievable now in (full) windowed mode, performing basically equally versus fullscreen exclusive mode + G-sync / V-sync on... I don't see the point. (Setting aside for a sec the general performance implications of using anything but fullscreen exclusive.)

Edit:
I pulled the trigger on Windows 10 (1803) to observe this new hybrid mode, and I have to say I’m impressed.

All three games I run play nicely with this new feature; I observe no noticeable fps drops, and in fact they all seem to show a minor increase.

To make sure the mode worked, I checked a few criteria:

- Near-instant alt+tabbing (<100ms rough guess); no windows get put in front of the game when doing so.
- With NVCP V-sync forced off I get tearing.
- Running G-sync in fullscreen-only mode with the G-sync indicator enabled shows it as active.

One game out of the box was really laggy, and I'm not sure if the fullscreen optimization actually applied initially, as alt+tabbing was slower and flashed my screen twice. Further investigation showed me that the game kept putting my monitor back from 120 to 60Hz after the first alt+tab occurred. Strange, as my global setting for refresh rate in NVCP was set to 'Prefer highest available' and this game in particular doesn't have a manual refresh rate option.

I fixed this issue with CRU and ripped out every resolution/refresh rate other than 120Hz, resulting in global (system-wide) enforcement of the 120Hz mode.

Another thing I noticed: when the Game Bar is deactivated in Windows settings, fullscreen optimization will not apply at all.

I was really sceptical at first, as 99% of the posts on Google were negative ('Disable FSE to fix Windows 10 game fps issues'). But in all honesty it appears to be a very welcome new technology that probably isn't suited for people who have no clue how to make it work properly, let alone know what it's supposed to do. Perhaps it's partly true that people these days are so used to full-window + triple buffering and smooth-but-laggy gameplay that they switch to fullscreen just to observe tearing?

Anyway, a new era of low-latency gaming is at hand! Without the quirks :P
LTSC 21H2 Post-install Script
https://github.com/Marctraider/LiveScript-LTSC-21H2

System: MSI Z390 MEG Ace - 2080 Super (300W mod) - 9900K 5GHz Fixed Core (De-lid) - 32GB DDR3-3733-CL18 - Xonar Essence STX II

knypol
Posts: 76
Joined: 18 Aug 2016, 03:40

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by knypol » 25 Jul 2018, 03:41

Hi,
I read the whole article about GSYNC, and at the conclusion there is a statement that the best settings are GSYNC + VSYNC ON with an fps limit. I'd like to ask why I should even bother enabling VSYNC when it will never engage if the limit is below the max refresh rate? Am I missing something?

MT_
Posts: 113
Joined: 17 Jan 2017, 15:39

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by MT_ » 25 Jul 2018, 04:09

knypol wrote:Hi,
I read the whole article about GSYNC, and at the conclusion there is a statement that the best settings are GSYNC + VSYNC ON with an fps limit. I'd like to ask why I should even bother enabling VSYNC when it will never engage if the limit is below the max refresh rate? Am I missing something?
G-sync and V-sync still work together below the refresh rate limit to do frametime stabilization. How much you'll notice most likely depends on your (un)optimized system, the game in particular, etcetera.

I would use G-sync without V-sync for a more 'raw' experience, and with V-sync for less serious games.

User avatar
RealNC
Site Admin
Posts: 3740
Joined: 24 Dec 2013, 18:32
Contact:

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by RealNC » 25 Jul 2018, 09:19

knypol wrote:Hi,
I read the whole article about GSYNC, and at the conclusion there is a statement that the best settings are GSYNC + VSYNC ON with an fps limit. I'd like to ask why I should even bother enabling VSYNC when it will never engage if the limit is below the max refresh rate? Am I missing something?
Gsync can have tearing too. Enabling vsync will fix that at no additional latency cost (as long as you cap your FPS correctly). It's basically "for free."

This was originally how gsync was shipped by nvidia. The ability to disable vsync was added later on, since some people wanted it. But disabling it can result in tearing even if you cap your FPS.
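For reference, a tiny sketch of the capping rule discussed earlier in this thread and in the G-SYNC 101 article (roughly: stay a few fps below the maximum refresh so the cap, not V-SYNC, is what limits the framerate). The helper name and the -3 default are just the rule of thumb quoted above, not an official formula:

```python
# Rule-of-thumb fps cap discussed in this thread: stay a few fps below the maximum
# refresh rate so G-SYNC stays engaged and V-SYNC never has to buffer a frame.
# Hypothetical helper; the -3 margin is the rule of thumb quoted earlier.

def safe_fps_cap(max_refresh_hz, margin_fps=3):
    """Suggested framerate limit for G-SYNC + V-SYNC ON."""
    return max_refresh_hz - margin_fps

for hz in (60, 144, 240):
    print(f"{hz} Hz panel -> cap at {safe_fps_cap(hz)} fps")
```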
Steam | GitHub | Stack Overflow
The views and opinions expressed in my posts are my own and do not necessarily reflect the official policy or position of Blur Busters.

TTT
Posts: 253
Joined: 28 Jul 2018, 14:17

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by TTT » 22 Sep 2018, 13:21

In the article it says in-game limiters shouldn't introduce any latency.

I've been messing around with my Destiny 2 settings, and having the limiter on introduces very noticeable input lag, basically like Vsync ON with a floaty mouse. I turned it off and instantly it's super snappy.

Is this just the game?

Sparky
Posts: 682
Joined: 15 Jan 2014, 02:29

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by Sparky » 22 Sep 2018, 16:54

TTT wrote:In the article it says in-game limiters shouldn't introduce any latency.

I've been messing around with my Destiny 2 settings, and having the limiter on introduces very noticeable input lag, basically like Vsync ON with a floaty mouse. I turned it off and instantly it's super snappy.

Is this just the game?
Make sure the game isn't reducing your refresh rate outside of g-sync, and that the framerate cap is set below the maximum refresh rate of your monitor.

https://www.kotaku.com.au/2017/10/how-t ... destiny-2/

If you still have problems, post a copy of that cvars.xml file, or try RTSS.

TTT
Posts: 253
Joined: 28 Jul 2018, 14:17

Re: Blur Buster's G-SYNC 101 Series Discussion

Post by TTT » 23 Sep 2018, 04:03

Sparky wrote:
TTT wrote:In the article it says in-game limiters shouldn't introduce any latency.

I've been messing around with my Destiny 2 settings, and having the limiter on introduces very noticeable input lag, basically like Vsync ON with a floaty mouse. I turned it off and instantly it's super snappy.

Is this just the game?
Make sure the game isn't reducing your refresh rate outside of g-sync, and that the framerate cap is set below the maximum refresh rate of your monitor.

https://www.kotaku.com.au/2017/10/how-t ... destiny-2/

If you still have problems, post a copy of that cvars.xml file, or try RTSS.
The game must have been updated since that article, because you can choose any number you want now. I have a 240Hz Gsync monitor, so leaving it unlimited on max settings I don't go over the max refresh anyway.

I turned Gsync off completely and it feels a lot better than even having Gsync on with uncapped frames. I think I'm just sensitive to input lag; apparently I can tell the supposedly imperceptible input lag Gsync causes like night and day. It makes the mouse feel heavier, with a sort of drag, and quick precision aiming in FPS games becomes very difficult.
