Why does G-Sync require memory?

Talk about NVIDIA G-SYNC, a variable refresh rate (VRR) technology. G-SYNC eliminates stutter and tearing and reduces input lag. List of G-SYNC Monitors.
TheExodu5
Posts: 26
Joined: 18 Dec 2013, 14:35

Why does G-Sync require memory?

Post by TheExodu5 » 07 Jan 2014, 11:27

I'm just trying to think of the reasons why the G-Sync module has 768 MB of onboard memory, and what the implications are, especially with regard to buffering and input lag.

My current hypothesis and hope is that a buffer is only employed when the framerate surpasses the refresh rate of the monitor (as I assume a frame would have to be buffered at this point to avoid being dropped entirely).
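
To make that concrete, here's a minimal sketch of the logic I'm imagining (purely hypothetical on my part, not anything NVIDIA has described):

```python
# Hypothetical sketch of the hypothesis above (not NVIDIA's documented behavior):
# a frame is only held in the module's memory if it arrives while the panel is
# still scanning out the previous frame; otherwise it is displayed right away.

class PanelModel:
    def __init__(self, scanout_ms=6.9):          # ~144 Hz scanout time, assumed
        self.scanout_ms = scanout_ms
        self.busy_until = 0.0                    # time (ms) current scanout ends
        self.held_frame = None                   # at most one buffered frame

    def frame_arrives(self, now_ms, frame):
        if now_ms < self.busy_until:             # panel still drawing -> buffer
            self.held_frame = frame              # newest frame wins
            return "buffered"
        self.busy_until = now_ms + self.scanout_ms
        return "scanned out immediately"

panel = PanelModel()
print(panel.frame_arrives(0.0, "frame A"))       # scanned out immediately
print(panel.frame_arrives(3.0, "frame B"))       # arrives mid-scanout -> buffered
print(panel.frame_arrives(8.0, "frame C"))       # panel idle again -> immediate
```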

User avatar
nimbulan
Posts: 323
Joined: 29 Dec 2013, 23:32
Location: Oregon

Re: Why does G-Sync require memory?

Post by nimbulan » 07 Jan 2014, 21:54

I can't find the link at the moment, but I do remember seeing a quote from nVidia that G-Sync does not perform any buffering. It is designed for low-latency gaming, after all.

User avatar
Chief Blur Buster
Site Admin
Posts: 11653
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: Why does G-Sync require memory?

Post by Chief Blur Buster » 07 Jan 2014, 22:16

TheExodu5 wrote:I'm just trying to think of the reasons why the G-Sync module has 768 MB of onboard memory
Bandwidth. They needed three chips in parallel for triple-channel bandwidth. I imagine that sufficiently fast chips smaller than 256 MB weren't any cheaper, so they went with that.

The memory is needed for processing, from what I understand.
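
For a rough sense of scale, here's a back-of-envelope sketch (every figure below is an assumption, not a published NVIDIA spec): just writing each incoming frame to memory and reading it back out for scanout already adds up to a few GB/s at 1440p 144Hz, before any extra processing passes, which is the sort of number that pushes you toward several DRAM chips in parallel.

```python
# Back-of-envelope sketch; every figure here is an assumption, not an NVIDIA spec.
bytes_per_pixel = 4                    # assume 32-bit pixels in the framebuffer
width, height, refresh = 2560, 1440, 144

frame_bytes = width * height * bytes_per_pixel
write_bw = frame_bytes * refresh       # incoming frames written to memory
read_bw  = frame_bytes * refresh       # same data read back out for scanout
total_bw = write_bw + read_bw          # overdrive/multipass would add more traffic

per_chip_bw = 3.2e9                    # assume ~3.2 GB/s for one 16-bit DDR3 chip
print(f"required: {total_bw / 1e9:.1f} GB/s, "
      f"chips needed: {total_bw / per_chip_bw:.1f}")
# -> roughly 4.2 GB/s, already more than one chip's worth before any extra passes
```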
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


User avatar
RealNC
Site Admin
Posts: 3757
Joined: 24 Dec 2013, 18:32
Contact:

Re: Why does G-Sync require memory?

Post by RealNC » 08 Jan 2014, 03:03

So the G-Sync module runs an actual OS inside? It's like an embedded OS and thus needs RAM?
Steam | GitHub | Stack Overflow
The views and opinions expressed in my posts are my own and do not necessarily reflect the official policy or position of Blur Busters.

HeLLoWorld
Posts: 33
Joined: 07 Jan 2014, 18:44

Re: Why does G-Sync require memory?

Post by HeLLoWorld » 08 Jan 2014, 09:25

Here's my take on it:
The memory may not be there to redraw the frame when the time since the last one exceeds 33 ms. After all, the card can simply resend the frame after that time if it has no new one, and quite possibly does.
(They say the card polls the module (the source of the current 1 ms overhead) to know whether it is still drawing a frame before trying to upload a new one, even though you'd think the card should already know this: either the screen is drawing because the card is uploading, or a tiny delay later; or 33 ms have passed and the screen is refreshing itself from onboard memory without the card knowing. In that case the card could use a parallel timer to guess it.)

Anyway, my take on the memory:
The max rate is 144 Hz, so about 7 ms per frame, and the link upload time is also about 7 ms (even when not reaching the full rate, by the way, which is good for lag).
The thing is, they may have chosen to refresh the panel (top-down) much faster than the speed at which the data arrives, after receiving the image.
That's why you would need memory to store what's coming over the link and spit it out later. Hence the buffering.
This faster update would be good, because it means the whole screen changes more nearly at once (unlike a CRT, where the image is redrawn again and again without pausing in between, so different parts of the picture have different lag).
It also probably means pixels decay more simultaneously, and thus more homogeneously, so maybe less complex y-dependent overdrive algorithms are needed, if any. Oh, and think about multipass for those algorithms: you'd need a buffer to read the frame twice. Or plain overdrive: you'd need a buffer holding the last frame.

This faster refresh, finally, is simply a requirement for crosstalk-free 3D / LightBoost / ULMB.
So the time to refresh all the pixels must be well under the refresh period, let's say 1 or 2 ms out of 7.
That means you need to spit the pixel data out much faster than it arrived.
Faster than 144 Hz 1080p (and I think the module is 1440p or 4K ready): 3 to 7x faster. I did the math: about 4.18 GB/s for mere FHD scanned out in 2 ms.
See, you need a bit of bandwidth. Oh, and that's for a single pass.
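
Here is that arithmetic written out (assuming 32-bit pixels; with 24-bit pixels the figure drops to roughly 3.1 GB/s):

```python
# The arithmetic above, written out (assuming 32-bit pixels; with 24-bit pixels
# the figure drops to roughly 3.1 GB/s).
bytes_per_pixel = 4
frame_bytes = 1920 * 1080 * bytes_per_pixel   # ~8.3 MB per frame
scanout_s = 0.002                             # accelerated 2 ms scanout
print(f"{frame_bytes / scanout_s / 1e9:.2f} GB/s")   # ~4.15 GB/s
```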

Also, if they're smart, they can start painting the top pixels before the bottom data has arrived, timed so that both things finish simultaneously, like a buffered YouTube video that starts playing once the beginning has arrived and catches up with the download right at the end. This saves lag (Mark here suggested this some time ago), although with multipass you only gain on the last pass.
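
A little timing sketch of that race-the-link idea (illustrative only; the 2 ms accelerated scanout is my assumption from above):

```python
# Timing sketch of the race-the-link idea above (illustrative only; the 2 ms
# accelerated scanout is my assumption): delay the scanout just enough that it
# finishes exactly when the last pixel arrives over the link.
link_ms = 6.9        # time for the full frame to arrive over the cable (~144 Hz)
scanout_ms = 2.0     # assumed accelerated panel scanout time

start_delay = link_ms - scanout_ms   # start ~4.9 ms after the frame begins arriving
print(f"start scanout {start_delay:.1f} ms in; both finish at t = {link_ms:.1f} ms")

# Sanity check: the scanout position must never get ahead of the data received.
for t in [0.0, 0.5, 1.0, 1.5, 2.0]:
    received = (start_delay + t) / link_ms   # fraction of frame already arrived
    drawn = t / scanout_ms                   # fraction of frame already drawn
    assert drawn <= received + 1e-9
```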

As a side note, Carmack has talked about non-top-down refreshes, maybe random or grid-like. This had also crossed my mind; visual results could be smoother. That's another case where buffering the frame would be needed (though you could not start in advance here, since you may want the last-arriving pixel early on :) ). Then again, if the pattern is algorithmic, the card could send the frame in the order the pixels are refreshed. Z-order and Morton codes come to mind; after all, that's precisely how memory is laid out in modern GPU framebuffers! So, in fact, it might actually avoid the linearization that probably takes place before the data goes on the wire.
[On second thought, I'm not sure it would be better that way, for this reason: granted, you lose the anisotropic curtain effect of top-down refresh, but on the other hand you lose locality coherence, i.e. locally similar pixel decay times, and that seems bad.]
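
For anyone curious what that Z-order / Morton indexing looks like in practice, here's a tiny sketch (just the standard bit-interleaving trick, nothing G-SYNC specific):

```python
# The standard bit-interleaving trick behind Z-order / Morton indexing;
# nothing G-SYNC specific, just to show what "pixel order = Morton order" means.

def part1by1(n: int) -> int:
    """Spread the low 16 bits of n so there is a zero bit between each bit."""
    n &= 0x0000FFFF
    n = (n | (n << 8)) & 0x00FF00FF
    n = (n | (n << 4)) & 0x0F0F0F0F
    n = (n | (n << 2)) & 0x33333333
    n = (n | (n << 1)) & 0x55555555
    return n

def morton2d(x: int, y: int) -> int:
    """Z-order index of pixel (x, y)."""
    return part1by1(x) | (part1by1(y) << 1)

# Refreshing a 4x4 tile in Morton order instead of row by row:
order = sorted(((x, y) for y in range(4) for x in range(4)),
               key=lambda p: morton2d(*p))
print(order)   # (0,0), (1,0), (0,1), (1,1), (2,0), (3,0), (2,1), (3,1), ...
```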

HeLLoWorld
Posts: 33
Joined: 07 Jan 2014, 18:44

Re: Why does G-Sync require memory?

Post by HeLLoWorld » 08 Jan 2014, 09:49

Quick answer: no need to gather pitchforks and torches _every_ time someone says the word 'buffer' ("To arms, brothers! We shall put an end to Those Who Introduce Lag In Our Games!" :) ).
Buffers for the current frame, or for the current plus the last frame, do not necessarily mean a delay of one or more frames.
There may be cases where a buffer is needed, and, as I tried to show above, there may be cases where it causes no lag at all.

On an OS in RAM: no, not an OS as in Linux. The thing that drives the refresh probably uses a micro-firmware for a DSP, or a bunch of instructions for a custom ASIC, or something like that, but I wouldn't grant that the title of OS.

User avatar
Chief Blur Buster
Site Admin
Posts: 11653
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: Why does G-Sync require memory?

Post by Chief Blur Buster » 08 Jan 2014, 10:53

HeLLoWorld wrote:The thing is, they may have chosen to refresh the panel (top-down) much faster than the speed at which the data arrives, after receiving the image.
How? That's true for LightBoost (delay, then partially buffer, then accelerated scanout, with a very long blanking period to let the LCD pixels settle before flashing the strobe backlight). But this does not seem to be true for regular 120Hz/144Hz operation. My high-speed camera tests and photodiode-oscilloscope tests show that the scanout is in real time. There's only about 2-3 ms between a Direct3D Present() of a blank solid-color buffer and the first pixels at the top edge of the screen hitting my eyes, based on my tests.

All current ASUS/BenQ 120Hz-144Hz monitors use zero-buffered real-time scanout. Perhaps they buffer a few scan lines, but the latency from Present() to photons is less than one frame cycle.
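
To put numbers on what real-time scanout means for lag across the screen (a sketch using the ~2-3 ms top-edge figure and a ~6.9 ms full scanout at 144Hz, not new measurements):

```python
# Sketch of what zero-buffered real-time scanout implies for lag across the
# screen, using the figures above (assumed numbers, not new measurements).
base_lag_ms = 2.5      # ~2-3 ms from Present() to first photons at the top edge
scanout_ms = 6.9       # full-height scanout time at 144 Hz

for fraction, label in [(0.0, "top"), (0.5, "center"), (1.0, "bottom")]:
    print(f"{label:>6}: ~{base_lag_ms + fraction * scanout_ms:.1f} ms")
# A buffer-then-blast (accelerated) scanout would instead add a wait before the
# top edge lights up -- the top-edge lag penalty mentioned below.
```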

I did hear about the 1 ms poll time, but inquiring minds would like to know how you got the information that G-SYNC does an accelerated scanout. That would be bad for top-edge lag, because you'd no longer be doing real-time scanout.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


HeLLoWorld
Posts: 33
Joined: 07 Jan 2014, 18:44

Re: Why does G-Sync require memory?

Post by HeLLoWorld » 08 Jan 2014, 11:51

Hi, if you did measure that the top of the frame is updated before the bottom has arrived, then it's clear there's just no accelerated refresh!
It was speculation; I don't have more information than anyone else :) . I explicitly said 'possibly'.
Anyway, the thing is designed to still be able to do 3D and ULMB, so maybe the RAM is only there for that.
Or, as I said, even with no accelerated scanout, if there's overdrive, or just an optimized refresh that needs the last state of the current pixel to customize the voltage it sends to reach the desired color, you need a buffer holding the last frame.
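
As a toy illustration of why overdrive alone already implies keeping the previous frame around (the formula below is made up; real panels use calibrated lookup tables):

```python
# Toy illustration of why overdrive alone implies keeping the previous frame:
# the drive value for a pixel depends on its previous level and its target level
# (real panels use calibrated lookup tables; this formula is made up).

def overdrive(prev_level: int, target_level: int, boost: float = 0.3) -> int:
    """Overshoot in the direction of the transition, clamped to 8-bit range."""
    drive = target_level + boost * (target_level - prev_level)
    return max(0, min(255, round(drive)))

previous_frame = [10, 200, 128]        # last frame's pixel levels (toy data)
incoming_frame = [200, 10, 128]        # new frame arriving over the link

driven = [overdrive(p, t) for p, t in zip(previous_frame, incoming_frame)]
print(driven)   # [255, 0, 128] -- overshoot on transitions, unchanged when static
```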

TheExodu5
Posts: 26
Joined: 18 Dec 2013, 14:35

Re: Why does G-Sync require memory?

Post by TheExodu5 » 08 Jan 2014, 13:54

Alright, so it seems that the buffer is likely needed in case the framerate ever surpasses the refresh rate of the monitor, and high bandwidth would be needed in this case.

This raises the question: can the framerate ever be too high? Can the bandwidth ever be insufficient? Frames will have to be dropped at some point if the framerate is consistently above the refresh rate of the monitor, so I wonder exactly how this is handled.

User avatar
nimbulan
Posts: 323
Joined: 29 Dec 2013, 23:32
Location: Oregon

Re: Why does G-Sync require memory?

Post by nimbulan » 08 Jan 2014, 14:22

TheExodu5 wrote:Alright, so it seems that the buffer is likely needed in case the framerate ever surpasses the refresh rate of the monitor, and high bandwidth would be needed in this case.

This raises the question: can the framerate ever be too high? Can the bandwidth ever be insufficient? Frames will have to be dropped at some point if the framerate is consistently above the refresh rate of the monitor, so I wonder exactly how this is handled.
G-Sync functions in conjunction with nVidia's adaptive vsync, so your framerate can never exceed the refresh rate. In order to maintain a framerate higher than the refresh rate with G-Sync on, the monitor would either have to discard every other frame and cut the refresh rate in half, which would increase input lag and decrease the displayed framerate, or reintroduce stutter by behaving like triple buffering. I can't see any situation where the first case would be preferable to capping the framerate, and in the second case you're better off just turning vsync off completely.
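
As a rough sketch of that ceiling behavior (an assumed model for illustration, not NVIDIA's actual implementation), here is what "the framerate can never exceed the refresh rate" looks like in practice:

```python
# Assumed model for illustration (not NVIDIA's implementation): once frames are
# produced faster than the panel's minimum refresh interval, each new frame just
# waits for the panel, which is effectively vsync at the maximum refresh rate.
min_refresh_interval_ms = 1000.0 / 144   # ~6.94 ms at 144 Hz
render_time_ms = 4.0                     # GPU finishes a frame every 4 ms

t = 0.0                 # simulation clock
panel_free_at = 0.0     # earliest time the panel can accept the next frame
for frame in range(5):
    t += render_time_ms                  # frame finishes rendering
    display_at = max(t, panel_free_at)   # may have to wait for the panel
    panel_free_at = display_at + min_refresh_interval_ms
    print(f"frame {frame}: rendered at {t:.1f} ms, displayed at {display_at:.2f} ms")
# Displayed timestamps settle ~6.94 ms apart even though rendering is faster.
```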

Post Reply