drmcninja wrote:
Is the Blur Reduction on the BenQ XL2411 basically ULMB under another name, or is it more optimized? (i.e. brighter, less ghosting/blurriness in the middle of the screen, less input lag)

Yes, it is BenQ's brand name for blur reduction, while NVIDIA uses "ULMB". See the Motion Blur Reduction FAQ. They use the same technique (backlight strobing) to reduce motion blur.
Brand names for blur reduction include "ULMB", "LightBoost", "Turbo240", "MOTION240", "DyAc", "Dynamic Acceleration", "Blur Reduction" -- all of them use the same technology of backlight-strobing techniques.
To reduce the strobe crosstalk (double-image effects) with compatible BenQ/Zowie monitors, see Blur Busters Strobe Utility.
cskippy wrote:
I find ULMB distracting during competitive gameplay, and I actually do better with it off. It's easier to focus on crosshair placement without seeing everything as you move it into position. My real issue is that although it's 144Hz/144fps, it's still a fixed amount, and you can see a judder if you don't focus on it but view it in your peripheral vision.

We agree Blur Reduction isn't useful for everything, especially when you're trying to focus on stationary crosshairs.
It's a good option to have, but not perfect for every single game or tactic -- a perspective of "I'm glad it's a setting that's available to me," even if it's not used for all games.
It's well demonstrated via the TestUFO Eye Tracking Demo that Blur Reduction mainly helps when your eyes are tracking motion (ULMB helps), rather than when you're focusing on stationary crosshairs (ULMB doesn't help).
....Which means someone might use Blur Reduction for low-altitude high-speed helicopter flybys (tracking fast-scrolling enemies), smooth RTS scrolling (tracking fast-scrolling objects), etc -- but they might not want Blur Reduction during stare-only-at-crosshairs tactics in competitive/eSports play (unless they are using specific gameplay tactics that need fluid eye-tracking advantages -- although this was more common during CRT-tube days).
cskippy wrote:
VSync is a no-go for competitive gaming, as you're too far behind the other players regardless of setup (1 frame prerendered, double/triple buffering, etc).

That said, VSYNC ON can be ultra-low-lag if done properly. Unfortunately, ultra-low-lag VSYNC ON (an average lag of half a frame for screen centre) is extremely finicky to set up, configure, and verify (without a high-speed camera), and requires precise sync between the game engine and the GPU.
It requires delayed rendering techniques that very few games and frame-capping utilities are able to do.
Some implementations of ultra-low-lag VSYNC ON are simply VSYNC OFF with intelligent tearline steering (timing the tearline to occur just below the bottom edge of the screen). One way to think about it: tearline position equals scanout position equals time into the refresh cycle. So the figurative tearline (at 1080p@120Hz) moves downwards by one pixel row for every 1/135,000th of a second of delay (135KHz horizontal scanrate = number of pixel rows per second = the number you see in ToastyX or NVIDIA Custom Resolution for a 1080p@120Hz mode).

Tearlines at the centre of the screen (crosshairs level) need only half a frame of lag to move offscreen: the display's scanout sweeps from top edge to bottom edge at a finite speed, so the centre of the screen is only half a frame away from either edge. The time difference between a tearline offscreen (bottom edge) and a tearline 10 pixels from the bottom edge (waiting for VSYNC ON) can be as little as 10/135,000th of a second (1080p@120Hz) or 10/67,500th of a second (1080p@60Hz). The higher the tearline you're trying to eliminate, the more input lag you spend waiting for VSYNC. For simplicity, engines/drivers just buffer up and be done with it, rather than actively aiming for the lowest possible VSYNC ON lag. And VSYNC OFF is just the raw way to achieve the same thing (adding tearing and microstutter mechanics while at it, in exchange for zeroing out the framebuffer-flip lag).
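The tearline arithmetic above can be sketched in a few lines. This is just back-of-envelope math, assuming the ~135KHz horizontal scanrate figure quoted above for a typical 1080p@120Hz timing:

```python
# Tearline position as a function of how late the frame flip happens.
# Assumes 1080p@120Hz with a ~135 kHz horizontal scanrate
# (135,000 pixel rows scanned out per second), per the post above.
SCANRATE_HZ = 135_000   # pixel rows scanned out per second

def tearline_rows(delay_seconds: float) -> float:
    """How many rows the scanout advances during a flip delay."""
    return delay_seconds * SCANRATE_HZ

def vsync_wait_lag_ms(rows_above_bottom: float) -> float:
    """Extra lag from waiting for VSYNC instead of tearing at
    a tearline sitting N rows above the bottom edge."""
    return rows_above_bottom / SCANRATE_HZ * 1000

# A flip delayed by 10/135,000 s moves the tearline down 10 rows:
print(round(tearline_rows(10 / SCANRATE_HZ), 6))  # 10.0

# Hiding a mid-screen tearline (540 rows up) costs half a scanout:
print(round(vsync_wait_lag_ms(540), 3))  # 4.0 (ms)
```

The mid-screen number is the "half a frame of lag" figure: 540 rows out of 1080 is half the scanout, about 4ms at 120Hz.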
Today, GSYNC + an in-game frame cap is the easiest way to get the "ultra-low-lag VSYNC ON" look. The monitor is flexible about waiting for slightly delayed frames, so the framerate can jitter easily (say, 130-140fps) while having far less lag than VSYNC ON. The only responsibility the game engine has is a good, precise, accurate in-game framerate cap (low-jitter, so frametime jitter doesn't become shorter than display scanouts) with a quick, low-lag "(1) input read, (2) render, (3) flip" cycle. The monitor'll do the rest, and you gain the "VSYNC ON" look with less lag.
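A minimal sketch of that "(1) input read, (2) render, (3) flip" loop with a low-jitter cap. The read_input/render/flip functions are hypothetical stand-ins (real engines do this inside their frame-pacing code); the point is capping against absolute deadlines rather than sleeping a fixed amount each frame:

```python
import time

TARGET_FPS = 140            # cap slightly below a 144Hz ceiling
FRAME_TIME = 1.0 / TARGET_FPS

def read_input():  pass     # hypothetical: poll mouse/keyboard
def render():      pass     # hypothetical: simulate + draw the frame
def flip():        pass     # hypothetical: present the frame

start = time.perf_counter()
next_deadline = start
for _ in range(3):          # a few iterations, for illustration
    # Wait out the *start* of the frame, so the input read happens
    # as late as possible (freshest input = lowest perceived lag).
    while time.perf_counter() < next_deadline:
        pass
    read_input()
    render()
    flip()                  # with G-SYNC, the display syncs to this flip
    next_deadline += FRAME_TIME   # absolute deadlines -> low jitter
```

Accumulating absolute deadlines (rather than sleeping FRAME_TIME after each flip) keeps frametime jitter from drifting, which is what the "low-jitter" requirement above is about.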
However, it's theoretically possible to do this ultra-low-lag VSYNC ON without a variable refresh rate monitor, by finishing rendering (from the input read) just before VSYNC. This risks missing refresh cycles on a fixed-refresh monitor, unless you use some form of adaptive VSYNC OFF algorithm (predictive waiting in order to reduce lag) to create an ultra-low-lag VSYNC ON look.
So if you look at that mathematically: at 240Hz, half a frame of lag is only about 2 milliseconds. As future 240Hz or 480Hz monitors come out over the years, and more of them provide the "VSYNC ON look and feel" (e.g. GSYNC + fps cap) while adding much less lag, it's probably going to become a bit more tolerable competitively. It's most useful with game engines that can't exceed the monitor's refresh rate (e.g. playing Battlefield 4 on a 240Hz monitor while only achieving 100-200fps). Ultra-low-lag VSYNC ON on a future theoretical 480Hz monitor may add only +1ms of lag (half a frame). And you wouldn't always need strobing at 480Hz, since that's already ~2ms persistence (1/480sec = ~2ms = ~2 pixels of motion blurring at 1000 pixels/sec) without needing backlight strobing ala ULMB. And framerates would rarely exceed refresh rate anyway -- the monitor ceases to be the limiting factor in frame delivery.
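Those refresh-rate numbers fall out of two tiny formulas -- half a refresh cycle of lag, and one refresh cycle of sample-and-hold persistence:

```python
# Back-of-envelope numbers from the paragraph above.
def half_frame_lag_ms(hz: float) -> float:
    """Best-case average VSYNC ON lag at screen centre: half a refresh."""
    return 1000 / hz / 2

def blur_px(hz: float, speed_px_per_s: float = 1000) -> float:
    """Sample-and-hold motion blur: one full refresh of hold time,
    expressed in pixels at a given eye-tracking speed."""
    return speed_px_per_s / hz

print(round(half_frame_lag_ms(240), 2))  # ~2.08 ms at 240 Hz
print(round(half_frame_lag_ms(480), 2))  # ~1.04 ms at 480 Hz
print(round(blur_px(480), 2))            # ~2.08 px at 1000 px/s
```

So a non-strobed 480Hz panel already sits at roughly the ~2-pixel blur level, which is why strobing becomes less necessary at those refresh rates.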
But for now, VSYNC OFF is king for competitive gameplay on the high-framerate games -- especially CS:GO and Quake -- since VSYNC OFF at framerates far exceeding refresh rate is extremely silky-low-lag. That, we definitely agree with!
Yet for others who love Blur Reduction and microstutter-free motion, some of us wish we had lagless VSYNC ON. Technologically, we can get much closer to it. Eventually.
...Alas, it is /extremely/ hard to get ultra-low-lag "VSYNC ON", often requiring co-operation between the game engine, the GPU, and the display... It's EASY to have 2 or 3 frames of lag, when it theoretically only needs to be HALF a frame of lag (screen centre).