GSYNC vs CRT input lag?

Everything about latency. Tips, testing methods, mouse lag, display lag, game engine lag, network lag, whole input lag chain, VSYNC OFF vs VSYNC ON, and more! Input Lag Articles on Blur Busters.
User avatar
Chief Blur Buster
Site Admin
Posts: 11670
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: GSYNC vs CRT input lag?

Post by Chief Blur Buster » 11 Mar 2014, 10:41

pZombie wrote:Does the GPU use the maximum data rate to send data through a DVI/DP/VGA/HDMI cable?
If not, why?
Generally, no.
The data is usually sent at the same rate as the scanout.
That allows zero-buffered scanout (refreshing pixels as they arrive over the cable).
You may have seen the high-speed videos of LCD refreshing, which show that LCDs refresh top-to-bottom in a similar way to CRTs, except that LCDs do not flicker.
pZombie wrote:Again, if not, is there a way to manually increase the data rate to the maximum possible, because as I see it, this would benefit a gamer by reducing the lag by some milliseconds maybe?
With digital displays, accelerating the scanout is more viable nowadays, and DisplayPort can present an opportunity to transmit a refresh cycle faster. Currently, I am not 100% sure whether there have been any situations where the maximum data rate of a single DisplayPort channel is used to transmit a refresh for a slower scanout.

One can also accelerate the scanout somewhat on some displays by using a Custom Resolution Utility to set a larger Vertical Total, i.e. an extremely large Front Porch (offscreen scanlines below the bottom edge of the screen). This increases the number of scanlines per refresh cycle and forces the active scanlines to be transmitted in less time. However, the input lag savings mainly apply to VSYNC ON situations, and only by up to ~1ms on average (~0ms at the top, ~1ms at the center, and ~2ms at the bottom of the screen when using Vertical Total 1350 on a BenQ XL2720Z). VSYNC OFF does not benefit in the same way, because it splices the new refresh immediately into the middle of the current scanout (the splice appearing as a tearline artifact for that refresh cycle).
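To put rough numbers on this, here is a quick Python sketch of the arithmetic. The 1080 active lines, stock Vertical Total of 1125, and 120Hz refresh are illustrative assumptions, so the figures will not exactly match the measurements quoted above, but they show why the savings grow toward the bottom of the screen:

```python
# Rough sketch of the large Vertical Total trick, assuming 1080 active
# lines, a stock Vertical Total of 1125, and 120 Hz. These numbers are
# illustrative; the exact figures depend on the monitor's real timings.

def active_scanout_ms(active_lines, vertical_total, refresh_hz):
    # Time spent transmitting only the visible scanlines of one refresh.
    refresh_period_ms = 1000.0 / refresh_hz
    return refresh_period_ms * active_lines / vertical_total

stock = active_scanout_ms(1080, 1125, 120)
vt1350 = active_scanout_ms(1080, 1350, 120)

print(f"Stock timings      : visible frame scans out in {stock:.2f} ms")
print(f"Vertical Total 1350: visible frame scans out in {vt1350:.2f} ms")
print(f"Lag saving at bottom edge  : {stock - vt1350:.2f} ms")
print(f"Lag saving at screen center: {(stock - vt1350) / 2:.2f} ms")
```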
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter

Forum Rules wrote:  1. Rule #1: Be Nice. This is published forum rule #1. Even To Newbies & People You Disagree With!
  2. Please report rule violations If you see a post that violates forum rules, then report the post.
  3. ALWAYS respect indie testers here. See how indies are bootstrapping Blur Busters research!

pZombie
Posts: 4
Joined: 11 Mar 2014, 09:43

Re: GSYNC vs CRT input lag?

Post by pZombie » 12 Mar 2014, 02:27

Chief Blur Buster wrote: However, the input lag savings mainly apply to VSYNC ON situations, and only by up to ~1ms on average (~0ms at the top, ~1ms at the center, and ~2ms at the bottom of the screen when using Vertical Total 1350 on a BenQ XL2720Z). VSYNC OFF does not benefit in the same way, because it splices the new refresh immediately into the middle of the current scanout (the splice appearing as a tearline artifact for that refresh cycle).
I fail to grasp this particular part.

How is it that VSYNC OFF would not benefit from a higher-bandwidth cable?

While I can see that sending a single pixel through the cable when there is no other data running through it would happen within nanoseconds, I assume that in a VSYNC OFF situation the cable is constantly filled with data being sent to the monitor. As you said, it keeps sending pixels, not waiting for the scanout to finish a whole frame.

So the most recent pixel sent from the GPU has to line up behind the other pixels in the cable, since the bandwidth is limited: limited by the scanout speed, which I assume is a function of the monitor's refresh rate, and also by the cable's maximum bandwidth, should the scanout be able to operate at that maximum speed.

I can see that this whole situation of how the pixels travel through the cable might be less trivial than I described above, so I would rather stop guessing and wait to see if someone else can shed some more light on this.

User avatar
RealNC
Site Admin
Posts: 3768
Joined: 24 Dec 2013, 18:32
Contact:

Re: GSYNC vs CRT input lag?

Post by RealNC » 12 Mar 2014, 06:13

pZombie wrote:I assume that in a VSYNC OFF situation the cable is constantly filled with data being sent to the monitor. As you said, it keeps sending pixels, not waiting for the scanout to finish a whole frame.
The act of sending pixels through the cable *is* the scanout :) At least traditionally, since the VGA signal actually drove the monitor's electron gun. But even with digital monitors, the signal is set up in such a way that it emulates the behavior of an analog display. A digital monitor signal is not the equivalent of transmitting a bitmap. Monitor timings, even digital ones, still include things like the "front porch" and "back porch", which would make no sense for a purely digital image transmission.

The graphics card sends data through the cable at the same rate regardless of whether VSYNC is on or off. The rate at which data is sent depends on the pixel clock, which in turn determines the refresh rate. The GPU reads data sequentially and cyclically from a framebuffer and sends it to the monitor as it reads it. It does not wait to read the whole thing before sending it; the only point I'm not sure about is whether it does this pixel by pixel or line by line.

At 120Hz, the GPU reads out the framebuffer and sends it through the cable 120 times per second, regardless of vsync. And it doesn't matter whether new data is available or not; the framebuffer is sent over the cable 120 times per second even if the data is the same in each cycle. The only difference between vsync on and off is that with vsync on, the framebuffer does not change until the current cycle is completed. With vsync off, the framebuffer can change while data from it is still being read (this is the cause of tearing, since reading from the framebuffer continues as normal, but with new data.)
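To illustrate with some example numbers (mine, not the exact timings of any particular monitor), here is how the pixel clock ties the refresh rate to the constant rate at which pixels cross the cable, assuming a 1920x1080 @ 120Hz mode with example blanking totals:

```python
# Illustrative only: how the pixel clock follows from the full timing,
# assuming a 1920x1080 @ 120 Hz mode with example blanking values.

h_active, v_active = 1920, 1080
h_total,  v_total  = 2080, 1125     # active + blanking (porches and sync)
refresh_hz = 120

pixel_clock_mhz = h_total * v_total * refresh_hz / 1e6
scanline_time_us = 1e6 / (v_total * refresh_hz)

print(f"Pixel clock: {pixel_clock_mhz:.1f} MHz")
print(f"One scanline leaves the GPU every {scanline_time_us:.2f} microseconds")
print(f"The framebuffer is scanned out {refresh_hz} times per second, "
      "whether or not its contents changed")
```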
Steam | GitHub | Stack Overflow
The views and opinions expressed in my posts are my own and do not necessarily reflect the official policy or position of Blur Busters.

User avatar
Chief Blur Buster
Site Admin
Posts: 11670
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: GSYNC vs CRT input lag?

Post by Chief Blur Buster » 12 Mar 2014, 08:43

pZombie wrote:How is it that vsync off would not benefit from a higher bandwidth cable?
The transmission over the cable is traditionally the scanout, so a faster scanout with longer pauses between refreshes simply means more pageflips will occur during the blanking interval. That may lead to less tearing, but only marginally; e.g. a 2x faster scanout with extremely long blanking intervals (pauses between refreshes) would only cut tearing artifacts by half. When a pageflip occurs mid-scanout, the tear artifact appears at the exact location in the scanout where the flip from the back buffer to the front buffer occurred.
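Here is a toy model of that halving effect (illustrative numbers, not measurements): with VSYNC OFF, a page flip only tears if it lands during the active scanout, so the chance of tearing is roughly the fraction of the refresh period spent scanning out:

```python
# Toy model of why a faster scanout only reduces, not eliminates, tearing
# with VSYNC OFF. Numbers are illustrative, not measurements.

def tearing_chance(active_scanout_ms, refresh_hz):
    # Fraction of randomly-timed page flips that land mid-scanout.
    refresh_period_ms = 1000.0 / refresh_hz
    return active_scanout_ms / refresh_period_ms

hz = 120
normal = tearing_chance(8.0, hz)   # scanout fills nearly the whole period
fast   = tearing_chance(4.0, hz)   # 2x faster scanout, long blanking pause

print(f"Normal scanout   : ~{normal:.0%} of page flips land mid-scanout (tear)")
print(f"2x faster scanout: ~{fast:.0%} of page flips land mid-scanout (tear)")
```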

Keep in mind there are drawbacks to a faster scanout: severe color degradation, much like how 144Hz has poorer color quality than 60Hz. LightBoost at 120Hz has a poorer color gamut (even excluding the brightness impact), in part due to the faster scanout needed to create longer blanking intervals between refreshes (used to let GtG transitions finish before flashing the backlight on a fully refreshed frame). Some monitors can do a good fast scanout with no noticeable color degradation, however.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


pZombie
Posts: 4
Joined: 11 Mar 2014, 09:43

Re: GSYNC vs CRT input lag?

Post by pZombie » 13 Mar 2014, 13:28

I agree with the above, but here is how I imagine it.

In a VSYNC ON situation:

The GPU computes the next frame into the back buffer and stops computing further frames (in the case of double buffering).
The GPU waits until the last pixel of the front buffer has been sent to the monitor, and then flips (the back buffer becomes the front buffer).

With triple buffering this becomes more complicated, especially since there are many ways to use three buffers. For games, the best way would be for the GPU to keep computing frames into the two remaining buffers until the front buffer has been read completely, and then turn whichever of the two holds the latest complete frame into the front buffer. This would ensure the most up-to-date information is sent with VSYNC ON.
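To check my own understanding, here is roughly how I picture the two loops, just pseudocode of my mental model rather than any real driver or API:

```python
# Toy pseudocode of the two schemes described above (my own sketch, not
# real driver code). wait_for_scanout_end() stands in for the vertical
# blanking event, render() for drawing one game frame.

def double_buffer_vsync_on(render, wait_for_scanout_end, flip):
    while True:
        back = render()             # draw exactly one frame ahead, then stall
        wait_for_scanout_end()      # wait until the front buffer is fully sent
        flip(back)                  # back buffer becomes the front buffer

def triple_buffer_vsync_on(render, scanout_ended, flip):
    newest_complete = None
    while True:
        newest_complete = render()  # keep rendering into the two spare buffers
        if scanout_ended():         # at the refresh boundary...
            flip(newest_complete)   # ...show the most recently finished frame;
                                    # older completed frames are simply dropped
```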


As for VSYNC OFF,

imagine two situations:


1) The GPU is connected to the monitor via a normal DVI cable.

2) The graphics card is placed right inside the monitor, connected directly to the monitor's input. Let's assume almost infinite bandwidth here.

In the first case, whenever the GPU manages to compute the next frame it will flip, and the back buffer becomes the front buffer, regardless of whether the monitor has finished the previous frame. Let's say the monitor was done with pixel number 1000.
However, the monitor won't _instantly_ display pixel 1001 from the new information contained in the front buffer, because there are still a lot of pixels from the "old" front buffer in the cable.
For the sake of argument, let's say there are another 1000 pixels in the cable, so the monitor will get to pixel 2000 before it starts picking up the new information.
Pixel 2001 will be from the "new" front buffer.


In the second scenario however, because the bandwidth is so high and the GPU is so close to the monitor, it is imaginable that once the "new" front buffer is available, the monitor would get a signal from the GPU to drop the 1000 old pixels and start accepting the newest information directly, resulting in the gamer being fed more recent information about the game world.

At least this is what I imagine would be possible. Maybe I am wrong...

pZombie
Posts: 4
Joined: 11 Mar 2014, 09:43

Re: GSYNC vs CRT input lag?

Post by pZombie » 13 Mar 2014, 13:38

Finally, I would like to ask whether G-SYNC (NVIDIA) and FreeSync (AMD) are equivalent, or whether NVIDIA is offering something extra, given that they chose to go to the lengths of offering a complete hardware solution when, as far as I understand it, FreeSync is basically already available in the VESA specs.

Does the G-SYNC hardware offer an additional buffer that mirrors the GPU's back buffer as the newest frame is computed, which would result in the frame already being close to the monitor just when it's done, and possibly also allow the scanout to be accelerated so it takes much less time than usual to output the whole frame?

In such a case, G-SYNC would be superior to FreeSync.

candyman9
Posts: 1
Joined: 29 Mar 2014, 12:13

Re: GSYNC vs CRT input lag?

Post by candyman9 » 29 Mar 2014, 12:19

Do any models of CRTs have input lag at all, or do they all always have zero lag?

User avatar
Chief Blur Buster
Site Admin
Posts: 11670
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: GSYNC vs CRT input lag?

Post by Chief Blur Buster » 29 Mar 2014, 12:20

None of the CRTs have input lag, as far as I know. That said, if you're using a video processor or line doubler (e.g. SDTV -> HDTV conversion), there is input lag there. Surround sound receivers can add some minor input lag too.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


spacediver
Posts: 505
Joined: 18 Dec 2013, 23:51

Re: GSYNC vs CRT input lag?

Post by spacediver » 29 Mar 2014, 15:59

candyman9 wrote:Do any models of CRTs have input lag at all, or do they all always have zero lag?
Technically it's not zero, but it might as well be. The video signal is sent through a cable as an analog waveform, and then is fed through video amplifiers in the display, and the resulting voltages cause electrons to fly across the tube at close to the speed of light. The rise time of typical phosphors, as I understand it, is on the order of a few nanoseconds.
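For a rough sense of scale (my own back-of-the-envelope assumptions: roughly 30% of light speed for the electrons at typical anode voltages, and a ~0.4 m tube):

```python
# Back-of-the-envelope only; speeds and distances are assumptions, not
# measurements. Electrons at typical CRT anode voltages move at roughly
# 30% of the speed of light.

tube_length_m = 0.4
electron_speed_m_s = 0.3 * 3.0e8      # ~30% of c
phosphor_rise_s = 5e-9                # a few nanoseconds, per the post above

transit_s = tube_length_m / electron_speed_m_s
total_s = transit_s + phosphor_rise_s

print(f"Electron transit time  : {transit_s * 1e9:.1f} ns")
print(f"Transit + phosphor rise: {total_s * 1e9:.1f} ns  (effectively zero)")
```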

baskinghobo
Posts: 9
Joined: 28 Feb 2014, 05:38

Re: GSYNC vs CRT input lag?

Post by baskinghobo » 26 Jul 2014, 01:13

I changed my Vertical Total to 1502 @ 120Hz using CRU and the ToastyX pixel clock patcher on my XL2411T, just for fun. I didn't change any other setting because I wasn't sure if I was supposed to. It made my screen more grainy. Is it supposed to look like this?

Post Reply