Re: GSYNC vs CRT input lag?
Posted: 08 Mar 2014, 19:43
by Chief Blur Buster
RealNC wrote:Why is DP faster? Higher pixel clock? Does that mean that we can lower input lag over DVI by tweaking the timings? (Reduced vs standard CVT, for example.)
There are multiple causes of latency in DVI or DP. There's the transceiver/packetization latency, which is independent of the dotclock latency. Reducing input lag via a faster dotclock is one technique, but the monitor needs to support it. It is theoretically possible to deliver a 1080p refresh cycle in only about 5 milliseconds over DisplayPort, and if using both channels, you can push that down to as little as approximately 2.5 milliseconds -- theoretically allowing 1080p @ 400Hz or 4K @ 100Hz. Using reduced chroma (e.g. 4:2:0), you could probably squeeze 4K @ 120Hz out of existing DisplayPort bandwidth.
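For anyone who wants to check the arithmetic, here is a rough sketch (it assumes 24 bits per pixel, treats 17.28 Gbit/s as the effective DP 1.2 bandwidth across all lanes, and ignores blanking/protocol overhead, so treat the outputs as ballpark figures):

```python
# Back-of-envelope frame transmit times over DisplayPort.
# Assumptions: 24 bits per pixel (4:4:4), no blanking or packet overhead,
# 17.28 Gbit/s effective DP 1.2 bandwidth across all lanes.

def frame_bits(width, height, bpp=24):
    return width * height * bpp

def transmit_ms(width, height, link_gbps, bpp=24):
    """Time to push one uncompressed frame down the link, in milliseconds."""
    return frame_bits(width, height, bpp) / (link_gbps * 1e9) * 1e3

DP12_FULL = 17.28          # Gbit/s, all lanes
DP12_HALF = DP12_FULL / 2  # roughly "one channel" in the terms used above

print(transmit_ms(1920, 1080, DP12_HALF))  # ~5.8 ms ("about 5 ms" above)
print(transmit_ms(1920, 1080, DP12_FULL))  # ~2.9 ms ("approximately 2.5 ms")

# Maximum refresh rate = link bandwidth / bits per frame:
print(DP12_FULL * 1e9 / frame_bits(3840, 2160))      # ~87 Hz at 4K 4:4:4
print(DP12_FULL * 1e9 / frame_bits(3840, 2160, 12))  # ~174 Hz at 4K 4:2:0
```

With 4:2:0 chroma averaging out to 12 bits per pixel, 4K @ 120Hz fits with room to spare, which is why the reduced-chroma trick works.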
There is a lot of opportunity for future displays to reduce latency further via DisplayPort, through faster frame refresh delivery. GSYNC is a part of this, since frame delivery occurs at the full GSYNC maximum refresh rate regardless of the current GSYNC frame rate/refresh rate -- e.g. 30Hz with the low latency of 144Hz.
Re: GSYNC vs CRT input lag?
Posted: 08 Mar 2014, 19:48
by Chief Blur Buster
ENiKS wrote:DP limit is 4K@60Hz, which means it should be able to do 1920x1080@240Hz, therefore taking 4.2ms to upload a FullHD frame. Dual Link DVI limit is 1920x1080@144Hz, taking 7.9ms.
I'm not sure if the transmit speed is fixed or changing according to monitor refresh rate. I will try to measure actual values.
Actually, using both channels, the math indicates you can theoretically do 4K at nearly 100Hz over DisplayPort, and if you decimate chroma a bit (e.g. 4:2:0), you may be able to do 4K 120Hz.
DisplayPort is 17.28 gigabits per second total over both channels.
4K 60Hz is only 11.9 gigabits per second. Even with error correction overhead, there's still room for a higher 4K refresh rate.
My math shows you can deliver a 1080p frame in about 2.5ms rather than 4.2ms, if pushing the max limit of DisplayPort 1.2. The future DisplayPort 1.3 will allow 1080p in a mere 1.2-1.5ms, almost hitting the theoretical 1000fps @ 1080p -- maybe possible with decimated chroma.
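A quick sanity check of the figures in this exchange (a sketch; 24 bits per pixel assumed, blanking/protocol overhead ignored):

```python
# Sanity-checking the bandwidth figures in this exchange.
# Assumptions: 24 bits per pixel, no blanking or protocol overhead.

def gbps(width, height, hz, bpp=24):
    """Uncompressed video bandwidth in Gbit/s."""
    return width * height * bpp * hz / 1e9

print(gbps(3840, 2160, 60))   # ~11.9 Gbit/s for 4K 60Hz, as stated above

# ENiKS's 4.2ms figure assumes the link runs at a 240Hz cadence:
print(1 / 240 * 1e3)          # ~4.17 ms per frame at 240Hz

# Pushing the full 17.28 Gbit/s of DisplayPort 1.2 instead:
frame = 1920 * 1080 * 24      # bits in one uncompressed 1080p frame
print(frame / 17.28e9 * 1e3)  # ~2.9 ms, near the "about 2.5ms" figure above
```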

Re: GSYNC vs CRT input lag?
Posted: 08 Mar 2014, 22:48
by Q83Ia7ta
baskinghobo wrote:Chief Blur Buster wrote:Also, I forgot to mention that there's a 1-2ms differential between VGA and DVI.
Ok thanks. My monitor also supports displayport. Would that have even lower input lag than dvi or the same?
144Hz, VSYNC OFF, GSYNC OFF:
ASUS VG248QE with gsync module (displayport): 14 ms
ASUS VG248QE (dvi): 24 ms
http://forums.blurbusters.com/viewtopic.php?f=5&t=389
Re: GSYNC vs CRT input lag?
Posted: 08 Mar 2014, 23:45
by baskinghobo
Wait!? Does that mean that with GSYNC installed (but not turned on) the VG248QE has 14ms less input lag than the BenQ XL2411T? Shieeeet. Should have read that thread before buying an XL2411T. Did they test the XL2411T's instant mode? That's the whole reason I bought it; I thought it was faster.
Re: GSYNC vs CRT input lag?
Posted: 09 Mar 2014, 00:48
by Chief Blur Buster
baskinghobo wrote:Wait!? Does that mean that with GSYNC installed (but not turned on) the VG248QE has 14ms less input lag than the BenQ XL2411T? Shieeeet. Should have read that thread before buying an XL2411T. Did they test the XL2411T's instant mode? That's the whole reason I bought it; I thought it was faster.
Actually, you won't be able to save input lag; I have both and have tested them against each other. My XL2411T and VG248QE (GSYNC) have nearly identical input lag (less than 1ms difference) in non-GSYNC mode, according to my tests.
I can't remember all the precise details, but IIRC there were certain modes of the unmodded VG248QE that add a full frame of input lag; it is otherwise capable of bufferless scanout.
Re: GSYNC vs CRT input lag?
Posted: 09 Mar 2014, 01:00
by Q83Ia7ta
baskinghobo wrote:Wait!? Does that mean that with GSYNC installed (but not turned on) the VG248QE has 14ms less input lag than the BenQ XL2411T? Shieeeet. Should have read that thread before buying an XL2411T. Did they test the XL2411T's instant mode? That's the whole reason I bought it; I thought it was faster.
Instant mode on the BenQ XL2411T is ON by default, and I'm sure the tester left the defaults on every monitor. I have had a BenQ XL2411T for more than a year and play fast-paced FPS games. I also play on a CRT at 130Hz from time to time, and I have to say the difference in input lag is tiny. I feel much more input lag (2-4x) when LightBoost is on.
Re: GSYNC vs CRT input lag?
Posted: 09 Mar 2014, 02:10
by baskinghobo
Oh, I think I got you now. You meant that both of these monitors have the same input lag over DVI, right? So I'm guessing the massive 14ms drop in latency in that test was from using DisplayPort? But I'm a tad confused, because you said it was only theoretically possible. Does that mean it's not confirmed yet, and that this test suggests DisplayPort lowers input lag but isn't proof?
Re: GSYNC vs CRT input lag?
Posted: 09 Mar 2014, 11:27
by Chief Blur Buster
baskinghobo wrote:Oh, I think I got you now. You meant that both of these monitors have the same input lag over DVI, right? So I'm guessing the massive 14ms drop in latency in that test was from using DisplayPort? But I'm a tad confused, because you said it was only theoretically possible. Does that mean it's not confirmed yet, and that this test suggests DisplayPort lowers input lag but isn't proof?
There are tons (literally thousands) of subtle causes of input lag. DisplayPort can increase or lower input lag, depending on the variables. E.g. there's always more overhead than VGA (more frame-beginning lag), but that can be compensated for via faster signal transmission and faster scan-out (less frame-ending lag).
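To illustrate that trade-off with a toy model (the 0.5ms packetization overhead and the 2.9ms scan-out below are illustrative assumptions, not measurements):

```python
# Toy model: a pixel's lag = fixed transport overhead (frame-beginning lag)
# + the wait until that pixel's scanline arrives (frame-ending lag).
# All numbers are illustrative assumptions, not measurements.

def pixel_lag_ms(overhead_ms, scanout_ms, screen_fraction):
    """Lag from 'GPU starts sending' until the pixel at screen_fraction updates."""
    return overhead_ms + scanout_ms * screen_fraction

# VGA at 60Hz: essentially zero overhead, but scanout takes ~16.7 ms.
# A hypothetical fast DP mode: 0.5 ms packetization overhead, ~2.9 ms scanout.
for name, overhead, scanout in [("VGA 60Hz", 0.0, 16.7),
                                ("fast DP ", 0.5, 2.9)]:
    lags = [round(pixel_lag_ms(overhead, scanout, f), 1)
            for f in (0.0, 0.5, 1.0)]  # top, middle, bottom of screen
    print(name, lags)
# VGA wins only at the very top of the screen; the faster link wins
# everywhere else, despite its extra frame-beginning overhead.
```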
The big input lag difference is likely not caused by DisplayPort itself, but by other issues that trigger the high lag (e.g. Instant Mode being turned off on the DisplayPort input -- i.e. firmware behavior differences between inputs). ASUS doesn't have an adjustable instant mode setting, but that doesn't mean it doesn't have internal settings that behave differently on each input. On one monitor, the lag difference between DVI and DP could be nearly zero, while another monitor mysteriously has a large lag difference between DVI and DP.
It would be nice to get an unmodified VG248QE and test all inputs/combinations.
Re: GSYNC vs CRT input lag?
Posted: 09 Mar 2014, 13:45
by spacediver
Interesting discussion of DisplayPort vs HDMI 2.0 bandwidth here.
http://www.youtube.com/watch?v=kKJhX6eT5OE
(watch from 26:25 to 31:00)
Re: GSYNC vs CRT input lag?
Posted: 11 Mar 2014, 10:02
by pZombie
As an input lag fetishist myself, I was interested to read more about how and WHEN exactly the data of a fully computed frame is sent to the monitor via a VGA/DVI/HDMI/DP cable.
For the sake of argument, let's assume it would take 5 milliseconds to send a fully computed frame to the monitor via the cable on average.
Now my question is: does the monitor/GPU wait until the frame is fully computed and THEN start to pull the data from the GPU, which would mean that the monitor would keep the "old" frame displayed for at least 5ms, until the new data it just requested (or was signaled by the GPU to receive) comes in,
OR
does the GPU send the data to the monitor while computing the frame, filling up some kind of buffer inside the monitor, which then gets read whenever a complete frame is available and the monitor is ready to refresh again?
In the former scenario, even if the monitor and GPU were capable of higher refresh rates, there would always be at least the 5ms lag just from the data having to go through the cable.
In the second scenario, with the monitor already receiving data into a second buffer, there would be almost zero lag between the data arriving and the crystals switching (see the sketch at the end of this post).
I searched for this info, but unfortunately my google fu failed me, or it is not available in a form I understand.
Related to the above, as an additional question:
If scenario one applies, does the GPU use the maximum data rate to send data through a DVI/DP/VGA/HDMI cable?
If not, why?
Again, if not, is there a way to manually increase the data rate to the maximum possible? As I see it, this could benefit a gamer by reducing the lag by a few milliseconds.
edit: In addition to the above, suppose scenario 1 is the case. Does the NVIDIA GSYNC device add such a buffer as I described in scenario 2? It would only make sense if the intent was to minimize input lag and stuttering, unless I am getting something wrong here...
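To make the question concrete, here is a toy sketch of the lag I'd expect in each scenario (the 5ms cable time is just the figure assumed above; everything else is illustrative):

```python
# Sketch of the two scenarios above, using the assumed 5 ms cable time.
# Scenario 1 (store-and-forward): the monitor buffers the whole frame
# before showing any of it, so the cable time is added on top.
# Scenario 2 (streamed/bufferless): pixels are shown as they arrive,
# so each pixel only waits for its own scanline's data.

CABLE_MS = 5.0  # assumed time to send one full frame over the cable

def scenario1_lag(screen_fraction):
    # Full frame must arrive first (CABLE_MS), then the panel refreshes;
    # assume the refresh pass itself also takes one frame time.
    return CABLE_MS + CABLE_MS * screen_fraction

def scenario2_lag(screen_fraction):
    # Each pixel updates as soon as its own scanline arrives.
    return CABLE_MS * screen_fraction

for f in (0.0, 0.5, 1.0):  # top, middle, bottom of screen
    print(f, scenario1_lag(f), scenario2_lag(f))
# Scenario 2 saves a full frame's transmit time at every point on screen;
# this matches the "bufferless scanout" mentioned earlier in the thread.
```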