Gsync review [from board won in competition]

cirthix
Posts: 27
Joined: 20 Dec 2013, 15:59

Gsync review [from board won in competition]

Post by cirthix » 05 Feb 2014, 02:59

Sorry for being slow on this.

Installation:
Took me about 5 minutes of work, plus about an hour inspecting things :P.

Overall rating: low difficulty, but I REALLY don't like how NVIDIA just bends the board, especially around the DP connector. Having a latching surface-mount connector (it does have through-hole lugs for the mounting posts) is bad enough, though common, but having it actually bend the board to mount flush is just poor design and practically begging for damage. And seriously, breaking off a standoff? The board is huge; they should have moved the module over and put a hole in the board instead. :/

https://picasaweb.google.com/1129081983 ... 9395640866

There are lots of other pics of installation, no sense beating a dead horse there.

The assembled monitor has 3 modes of operation.
1) Normal scanout with constant backlight brightness.
2) ULMB (Ultra Low Motion Blur).
What is ULMB? Here, the GSYNC kit reads the incoming video data into RAM while simultaneously writing it out to the screen, flashes the backlight, and repeats the process for every frame.

ULMB uses a fast scanout (I measured ~1/175th of a second) so the added latency should be around 2ms at the top of the frame, 0 at the bottom. The dead time at the beginning of the input frame is used to strobe the backlight for a brief duration (I measured 1.8ms). Extended dead-times between frames are needed to give the pixels at the bottom of the panel time to complete their transitions before the backlight is turned on.
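
To make the arithmetic behind those numbers explicit, here is a small Python sketch. The 1/175 s scanout and 1.8 ms strobe are the values measured above; the 120 Hz input rate and the assumption that the accelerated scanout finishes together with the incoming frame are illustrative assumptions, not anything NVIDIA has documented.

```python
# Rough ULMB timing sketch using the numbers measured in the review.
# Assumptions (not from NVIDIA): the input signal arrives at a constant
# 120 Hz, and the panel's accelerated scanout ends roughly when the
# incoming frame ends, so the bottom line adds no latency.

input_hz = 120.0                        # assumed signal refresh rate
input_frame_ms = 1000.0 / input_hz      # ~8.33 ms to transmit one frame
scanout_ms = 1000.0 / 175.0             # measured accelerated scanout (~5.71 ms)
strobe_ms = 1.8                         # measured backlight pulse width

# If scanout ends together with the incoming frame, the top line was
# buffered for the difference between transmission time and scanout time:
added_latency_top_ms = input_frame_ms - scanout_ms   # ~2.6 ms ("around 2 ms")
added_latency_bottom_ms = 0.0

# Dead time between scanouts, used for pixel settling plus the strobe:
dead_time_ms = input_frame_ms - scanout_ms           # ~2.6 ms
settle_before_strobe_ms = dead_time_ms - strobe_ms   # ~0.8 ms of pure settling

print(f"top-of-frame added latency: {added_latency_top_ms:.2f} ms")
print(f"dead time per frame: {dead_time_ms:.2f} ms (strobe {strobe_ms} ms)")
```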

An oscilloscope capture of this process is shown here:
This picture shows the active data region in blue and the LED− (low) side of the backlight drive in red. The backlight is on when the red line is LOW. The display is updating when the blue line appears noisy.
https://picasaweb.google.com/1129081983 ... 1192396818

This mode is great for reducing motion blur, but it is not perfect. It is essentially LightBoost that can be enabled and disabled at the press of a button. Images in the lower third of the screen tend to have after-shadows from the previous frame. Compare the non-ULMB picture with the two ULMB pictures: motion clarity is GREATLY enhanced. For example, the moving-image tests at TestUFO, such as
http://www.testufo.com/#test=photo&phot ... 0&height=0
are perfectly clear when moving. Street names are readable!
no ulmb:
https://picasaweb.google.com/1129081983 ... 2828371810
ulmb at the top of the screen:
https://picasaweb.google.com/1129081983 ... 0287927938
ulmb at the bottom of the screen:
https://picasaweb.google.com/1129081983 ... 6967487746


There are three downsides to ULMB.
A) Overdrive is tuned so that the time-averaged light output converges to the desired pixel value faster, but with a strobing backlight we perceive only a single sample of that transition. I suspect this mismatch is what turns the normally minor overdrive artifacts into the multiple afterimages that can be clearly seen in the pictures.
B) Many people, myself included, find flickering light sources irritating and uncomfortable to use. If cheap, improperly ballasted fluorescent lights or low-frequency PWM dimming strain your eyes, then ULMB is not going to be for you.
C) GSYNC is incompatible with ULMB in its current incarnation. Each pulse of the LED is at a set current level, controlled by the brightness setting, and the duration of each pulse is constant (1.8ms). This means the brightness of the screen would drop with framerate, and with GSYNC it would be jumping all over the place (a rough duty-cycle calculation follows this list). If NVIDIA wanted to get really fancy, they could pulse the LEDs harder at lower framerates to obtain constant perceived brightness, but that would require LEDs that can handle it. I suspect they cannot, so the choice was either to disable ULMB under GSYNC mode or to drop to ~20% of the already-dim light output.
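
To put rough numbers on point C, here is a quick duty-cycle sketch. The 1.8 ms pulse width is the value measured above; the frame rates and the simple "perceived brightness scales with duty cycle" model are illustrative assumptions.

```python
# Why a fixed-width strobe clashes with variable refresh: at a fixed LED
# current, perceived brightness of a strobed backlight scales with duty
# cycle (pulse width / frame period). The 1.8 ms pulse is the measured
# value; the frame rates below are just examples.

PULSE_MS = 1.8  # measured strobe width, constant regardless of frame rate

def duty_cycle(frame_rate_hz: float) -> float:
    """Fraction of each frame period during which the backlight is lit."""
    frame_period_ms = 1000.0 / frame_rate_hz
    return PULSE_MS / frame_period_ms

for fps in (144, 120, 85, 60, 45, 30):
    print(f"{fps:>3} fps: backlight duty cycle = {duty_cycle(fps) * 100:5.1f}%")

# With GSYNC the frame rate (and hence the duty cycle) changes from frame
# to frame, so perceived brightness would pump up and down unless the LED
# current were boosted at low frame rates to compensate.
print(f"30 fps vs 120 fps brightness ratio: {duty_cycle(30) / duty_cycle(120):.0%}")
```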

3) GSYNC mode. In a sentence: pretty frigging sweet. Here, the display is driven at a variable refresh rate to match whatever the GPU has finished rendering, and the resulting video looks significantly smoother. To best explain it, load up a game and configure it to run at ~40-50 fps (assuming a 60Hz monitor). Now fly around the map and look at the movement of objects in front of each other. Even though your motion in the game follows a constant path, objects do not move smoothly; they judder around as if the camera were riding on a slightly bumpy road (the bumpy-road analogy describes positional rather than temporal noise, but if you watch a single object the effect is pretty close). GSYNC paves this road: motion is smoothed out. Having tearing removed is a nice benefit too ;).

For games running at framerates significantly higher than what the monitor can do, tears are generally pretty small and motion is smooth enough that it is hard to notice whether GSYNC is on or off. Framerates around or below what the monitor can do are where GSYNC shines: the distracting, jumpy motion is eliminated, so the downside of running at those speeds is mainly responsiveness rather than fluidity. This can make some newer games (Battlefield) significantly better :)
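
To illustrate the judder, here is a toy Python simulation comparing a fixed 60Hz display (each finished frame waits for the next refresh) with a variable-refresh display, for a hypothetical game rendering at a steady 45 fps. The render timing is an assumption chosen only to show the effect.

```python
# Toy judder simulation: frames rendered at ~45 fps shown on a fixed 60 Hz
# display versus a variable-refresh display.
import math

RENDER_MS = 1000.0 / 45.0     # assumed steady 45 fps render time
VSYNC_MS = 1000.0 / 60.0      # fixed 60 Hz refresh interval

def display_times_fixed(n_frames: int) -> list[float]:
    """Each finished frame is held until the next 60 Hz vblank."""
    times = []
    for i in range(1, n_frames + 1):
        finished = i * RENDER_MS
        times.append(math.ceil(finished / VSYNC_MS) * VSYNC_MS)
    return times

def display_times_vrr(n_frames: int) -> list[float]:
    """With variable refresh, the display updates the moment a frame is done."""
    return [i * RENDER_MS for i in range(1, n_frames + 1)]

def intervals(times: list[float]) -> list[float]:
    return [round(b - a, 1) for a, b in zip(times, times[1:])]

print("fixed 60 Hz frame-to-frame intervals (ms):", intervals(display_times_fixed(8)))
print("variable refresh intervals (ms):          ", intervals(display_times_vrr(8)))
# Fixed refresh alternates between ~16.7 ms and ~33.3 ms holds (judder);
# variable refresh shows an even ~22.2 ms cadence.
```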

Overall, would I recommend the GSYNC kit? Not really. It is an expensive solution for what is still just a 24" 1080p TN panel, even though it can do some pretty cool stuff. Would I recommend GSYNC for your next monitor? Absolutely; variable refresh rate displays are awesome!

As a side note, I was unable to test my just-in-time rendering shim layer, for two reasons:
https://picasaweb.google.com/1129081983 ... 8580186674
https://picasaweb.google.com/1129081983 ... 6250361698
1) GSYNC does not currently work with windowed applications
2) my hooks aren't implemented very well and break on resizing, including going full-screen :/
Last edited by cirthix on 05 Feb 2014, 04:44, edited 1 time in total.

PoWn3d_0704
Posts: 111
Joined: 31 Dec 2013, 15:20

Re: Gsync review from competition.

Post by PoWn3d_0704 » 05 Feb 2014, 04:10

Awesome write up and completely consistent with what I see, down to the ghosting issue at the bottom of the screen.

However, my GSync kit isn't bent inside the monitor. It lies perfectly flat. Did you forget to break off the metal support in the center of the board? Because that is what it sounds like. Time to open her back up!
Asus VG248QE with GSync. Blur Busters GSync Contest Winner.

nimbulan
Posts: 323
Joined: 29 Dec 2013, 23:32
Location: Oregon

Re: Gsync review from competition.

Post by nimbulan » 05 Feb 2014, 04:15

I have to agree with you about bending the board. That really surprised me when I installed it, even for a prototype.

Also, some very interesting information about the variable ghosting across the screen with ULMB active. I haven't seen anyone else note that, and I hadn't noticed it myself before now, but then I've been using G-sync almost exclusively.
PoWn3d_0704 wrote:Awesome write up and completely consistent with what I see, down to the ghosting issue at the bottom of the screen.

However, my GSync kit isn't bent inside the monitor. It lies perfectly flat. Did you forget to break off the metal support in the center of the board? Because that is what it sounds like. Time to open her back up!
The mounting brackets for the main board and the power supply board do not line up, so you are forced to bend the board to secure it on both sides. Unless your monitor was manufactured differently, you must just not have noticed it bending.

Ptolmy
Posts: 8
Joined: 29 Jan 2014, 18:32

Re: Gsync review from competition.

Post by Ptolmy » 05 Feb 2014, 07:58

Very accurate review!
I was wondering if I was doing something wrong when I saw the board bending during installation...
I was just thinking how great it would be to have ULMB and GSYNC running at the same time, and was hoping they would offer it in a new driver down the road. But after your explanation of how the display brightness would fluctuate with the framerate, it doesn't seem like this will be possible... oh well, it's pretty cool as is!

spacediver
Posts: 505
Joined: 18 Dec 2013, 23:51

Re: Gsync review from competition.

Post by spacediver » 05 Feb 2014, 16:27

Thanks for this fantastic and honest review :)

Black Octagon
Posts: 216
Joined: 18 Dec 2013, 03:41

Re: Gsync review from competition.

Post by Black Octagon » 06 Feb 2014, 02:30

Great write up!

(You OWN a friggen oscilloscope?! I feel thoroughly out-geeked)

Sent from dumbphone (pls excuse typos and dumbness)

cirthix
Posts: 27
Joined: 20 Dec 2013, 15:59

Re: Gsync review from competition.

Post by cirthix » 06 Feb 2014, 04:39

Black Octagon wrote:Great write up!

(You OWN a friggen oscilloscope?! I feel thoroughly out-geeked)

Sent from dumbphone (pls excuse typos and dumbness)
Of course I do!

http://www.picotech.com/picoscope5200.html

ManuelG_NVIDIA
Posts: 33
Joined: 27 Jan 2014, 14:46

Re: Gsync review from competition.

Post by ManuelG_NVIDIA » 13 Feb 2014, 15:49

I must say I am very jealous of your oscilloscope (and G-Sync monitor).
Please send me a PM if I fail to keep up on replying in any specific thread.

Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada

Re: Gsync review [from board won in competition]

Post by Chief Blur Buster » 13 Feb 2014, 16:38

cirthix wrote:ULMB uses a fast scanout (I measured ~1/175th of a second) so the added latency should be around 2ms at the top of the frame, 0 at the bottom. The dead time at the beginning of the input frame is used to strobe the backlight for a brief duration (I measured 1.8ms). Extended dead-times between frames are needed to give the pixels at the bottom of the panel time to complete their transitions before the backlight is turned on.
That's pretty correct. Large blanking intervals (longer pauses between refreshes) are a staple of good strobe-backlight monitors.

There are two ways to create longer pauses between refresh cycles (accelerated scanout + long pause):
-- Do it at the signal level (e.g. Vertical Total 1350 for a 1080p signal, using large Sync/Porch timings in a Custom Resolution Utility), as I'm able to do on the BENQ Z-Series monitors (newer firmware) with ToastyX CRU: this reduces strobe crosstalk by pushing more of the GtG transitions into a long (~2ms) blanking interval between scanouts. A quick timing calculation follows this list.
-- Create it by partial FIFO buffering of the input. In NVIDIA's implementations (LightBoost and ULMB), the signal is output from the computer to the monitor at a constant speed (e.g. 1/120 second per frame for a 120Hz refresh cycle), so the monitor has to partially buffer the signal before doing an accelerated scan-out. This approach therefore cannot reduce input lag.
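
As a rough illustration of the signal-level approach, here is a sketch of the timing math. The Vertical Total 1350 figure is the one mentioned above; the 1080 active lines and 120Hz refresh are standard-signal assumptions.

```python
# Timing math for the "large vertical total" trick: with Vertical Total
# 1350 on a 1080p/120 Hz signal, only 1080 of the 1350 line periods carry
# picture; the rest form a long blanking pause in which GtG transitions
# can settle before the strobe.

refresh_hz = 120.0
active_lines = 1080
vertical_total = 1350                      # lines per refresh, incl. blanking

frame_period_ms = 1000.0 / refresh_hz      # ~8.33 ms
line_time_ms = frame_period_ms / vertical_total
scanout_ms = active_lines * line_time_ms   # time spent drawing the picture
blanking_ms = (vertical_total - active_lines) * line_time_ms

print(f"scanout: {scanout_ms:.2f} ms, blanking pause: {blanking_ms:.2f} ms per refresh")
# ~6.67 ms scanout + ~1.67 ms pause, roughly in line with the ~2 ms figure
# above; a larger vertical total lengthens the pause further (at the cost
# of a higher dotclock for the same refresh rate).
```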

DisplayPort (single channel), however, can theoretically transmit a 1080p frame in 1/175th of a second, so this is an opportunity to reduce input lag. I do not think it is currently being done: the display dotclock on NVIDIA's GPUs is unaffected, which suggests the computer-to-monitor transmission cycle is also unaffected, since the transmission from GPU to monitor is driven by the dotclock, and you would need a higher dotclock to transmit a 1080p frame in 1/175 sec, unless the drivers/chips are doing something novel "behind the scenes" (which none of us knows about).
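
As a back-of-envelope sanity check of that 1/175-second figure: the sketch below assumes 24-bit colour and the ~17.28 Gbit/s payload of a 4-lane DP 1.2 link (reading "single channel" as one full link); exact link overheads will shift the numbers somewhat.

```python
# Back-of-envelope check: can DisplayPort move a 1080p frame in ~1/175 s?
# 24-bit colour and the 4-lane DP 1.2 payload figure are assumptions.

width, height, bits_per_pixel = 1920, 1080, 24
frame_time_s = 1.0 / 175.0

bits_per_frame = width * height * bits_per_pixel           # ~49.8 Mbit
required_gbit_s = bits_per_frame / frame_time_s / 1e9       # ~8.7 Gbit/s

dp12_payload_gbit_s = 17.28                                  # assumed 4-lane HBR2 payload
print(f"needed: {required_gbit_s:.1f} Gbit/s, DP 1.2 payload: {dp12_payload_gbit_s} Gbit/s")
# Plenty of headroom, so a 1/175 s transfer looks plausible at the link
# level, even before accounting for blanking and protocol overhead.
```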
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter

