Does Overdrive and ULMB add Input Lag?

Ask about motion blur reduction in gaming monitors. Includes ULMB (Ultra Low Motion Blur), NVIDIA LightBoost, ASUS ELMB, BenQ/Zowie DyAc, Turbo240, ToastyX Strobelight, etc.
User avatar
RealNC
Site Admin
Posts: 3737
Joined: 24 Dec 2013, 18:32
Contact:

Re: Does Overdrive and ULMB add Input Lag?

Post by RealNC » 28 May 2017, 21:53

drmcninja wrote:
Haste wrote:- Overdrive does not add input lag.
So this is an outdated article?

https://www.bit-tech.net/hardware/monit ... verdrive/1 (From 2009)
I've just read that article, and there doesn't seem to be any actual correlation between overdrive and input lag. It just happens that the monitors with higher input lag also had overdrive. In other words, they don't actually know whether overdrive was the cause of the input lag or not.

And their own conclusion was:

"it seems clear that excessive input lag appears to only affect PVA panels with overdrive technology (with the exception of the closely related MVA panel type)."

Also, it's from 2009 :-/
Steam | GitHub | Stack Overflow
The views and opinions expressed in my posts are my own and do not necessarily reflect the official policy or position of Blur Busters.

User avatar
Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: Does Overdrive and ULMB add Input Lag?

Post by Chief Blur Buster » 29 May 2017, 10:01

Short answer: Not anymore. Overdrive generally no longer adds input lag.

Long answer: "yes" and "no".

In most modern gaming monitors, overdrive no longer adds lag. But it used to, as recently as a few years ago. Older overdrive algorithms simply buffered the whole incoming frame and computed the overdrive before starting the LCD scanout, which adds another +16.7ms of input lag at 60Hz.
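
Purely as a back-of-the-envelope sketch (Python, not monitor firmware), that full-frame-buffer penalty at a few refresh rates:

```python
# Worst-case added lag of a frame-buffered overdrive stage:
# the panel cannot start scanning out until the whole frame has arrived.
def frame_buffer_lag_ms(refresh_hz: float) -> float:
    return 1000.0 / refresh_hz

print(frame_buffer_lag_ms(60))   # ~16.7 ms at 60 Hz
print(frame_buffer_lag_ms(120))  # ~8.3 ms at 120 Hz
print(frame_buffer_lag_ms(240))  # ~4.2 ms at 240 Hz
```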

However, modern LCD displays can do realtime overdrive (based on a buffer of the PREVIOUS refresh cycle only) while real-time scanning out the current refresh cycle. There's probably one or two pixel rows of buffering to allow enough time for the math (and for "previous-scanline" realtime scaling algorithms) -- a monitor can do almost everything (e.g. picture processing, scaling processing, AND overdrive processing) in realtime using just "previous-scanline" buffering. But a single scan line -- one single row of pixels -- at the ~135 kHz horizontal scan rate of 1080p@120Hz is only 1/135,000th of a second (less than 0.01 millisecond of input lag). Single-scanline-buffer processing has pretty much unmeasurable input lag in the light of 1ms GtG. Even a 1-degree-cooler temperature (LCDs respond slower in the cold) actually adds more input lag than a single scanline!
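
The scanline timing is easy to sanity-check. A rough sketch, assuming the usual ~1125 total scanlines per refresh for a 1080p signal (the exact count depends on the video timing used):

```python
# Rough per-scanline latency estimate for 1080p @ 120 Hz.
# Assumes ~1125 total scanlines per refresh (1080 visible + blanking),
# a typical figure; the exact count depends on the video timing used.
refresh_hz = 120
total_lines = 1125

h_scan_rate_hz = refresh_hz * total_lines     # ~135,000 lines per second
line_time_us = 1_000_000 / h_scan_rate_hz     # ~7.4 microseconds per line

print(f"{h_scan_rate_hz / 1000:.0f} kHz horizontal scan rate")
print(f"{line_time_us:.1f} us of buffering per scanline")
```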

There's more than one way to do overdrive. The old way was to do it with a full framebuffer -- sometimes simpler electronically and programming-wise if you have enough RAM -- but it's mathematically unnecessary.
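
To see why only the previous refresh cycle is mathematically needed: classic overdrive is just a function of (previous pixel value, new target value), normally implemented as a factory-calibrated lookup table. A toy sketch of the idea -- the "boost" rule below is made up purely for illustration; real panels use calibrated LUTs in the scaler/TCON:

```python
# Toy overdrive: the drive value sent to the panel depends only on the
# pixel's previous value and its new target value. No future frames needed.
def overdrive(prev: int, target: int, boost: float = 0.35) -> int:
    # Real monitors use a calibrated 2D lookup table (interpolated);
    # this linear "overshoot the difference" rule is only a stand-in
    # to show the data dependencies.
    driven = target + boost * (target - prev)
    return max(0, min(255, round(driven)))

print(overdrive(prev=0, target=200))    # rising transition: drive above 200
print(overdrive(prev=255, target=50))   # falling transition: drive below 50
print(overdrive(prev=128, target=128))  # no change: no overdrive applied
```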

Today, HDR is the new engineering challenge for input lag. HDR is easier to do on a framebuffer basis, but you can do it in realtime -- if you have the proper chips (e.g. FPGA) and minimize processing dependencies (e.g. compression algorithms that depend on future data). It takes a lot of time to engineer new monitor capabilities using precision clock-cycle-perfect realtime programming (DSPs, ASICs, FPGAs, etc), especially when we hit dotclocks of 600 million pixels per second (1080p@240Hz or 4K@60Hz). That's almost 2 billion subpixels per second when you split into R/G/B, which all have to be processed separately. On top of high-quality real-time scaling algorithms and overdrive algorithms, that's a lot of computing power if you need floating-point HDR thrown into the mix too... So some HDR displays have added input lag due to HDR, but not all of them. Early releases of a new monitor technology such as HDR often have more input lag than later releases, until it's refined into a more realtime implementation (one that doesn't require full frame buffering).
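
A rough sanity check of those throughput figures (visible pixels only; once blanking overhead is included, the dotclock climbs toward the ~600 Mpixel/s mark and the subpixel rate toward roughly 1.8 billion per second):

```python
# Rough throughput check for the demanding modes mentioned above.
# Visible pixels only; actual dotclocks are higher due to blanking intervals.
def visible_throughput(width, height, refresh_hz):
    pixels = width * height * refresh_hz
    return pixels, pixels * 3  # one value per R/G/B subpixel

for name, mode in [("1080p@240Hz", (1920, 1080, 240)),
                   ("4K@60Hz",     (3840, 2160, 60))]:
    px, sub = visible_throughput(*mode)
    print(f"{name}: ~{px/1e6:.0f} Mpixels/s visible, ~{sub/1e9:.1f}B subpixels/s")
```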

Input lag from turning on overdrive in a gaming monitor still happened a few years ago. One early 120Hz example was the Samsung SA series -- the S23A700D, for instance. Its overdrive added 1 frame of input lag. This was a catch-22, because keeping overdrive turned off made the "Samsung 3D mode" (which doubled as an amazingly good blur reduction strobe backlight) look really bad unless you enabled an overdrive setting ("Response Time" = "Normal"), which then suddenly added 1 extra frame of input lag. Now you had strobing + at least 2 frames of input lag. You could feel the lag. Ouch!
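
In round numbers (assuming, purely for illustration, that the strobe/3D path costs roughly another frame on its own in the worst case; exact figures varied by firmware):

```python
# Rough lag arithmetic for that Samsung SA scenario at 120 Hz.
refresh_hz = 120
frame_ms = 1000 / refresh_hz

overdrive_buffer_frames = 1     # the 1-frame overdrive buffer
strobe_path_frames = 1          # assumed worst-case for the strobe/3D mode

total_ms = (overdrive_buffer_frames + strobe_path_frames) * frame_ms
print(f"~{total_ms:.1f} ms of added lag (~2 frames at {refresh_hz} Hz)")
```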

Today, BenQ/Zowie monitors use essentially lagless overdrive, so you're not going to be able to measure input lag differences from turning overdrive on/off. If you hook a microsecond-accurate photodiode up to a lab oscilloscope, synchronized to a VSYNC wire (e.g. VGA/DVI), you might actually see a few tens or hundreds of microseconds less input lag with overdrive enabled -- because GtG completes earlier thanks to overdrive.
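
That "overdrive finishes GtG earlier" effect is easy to see with a toy first-order pixel-response model. This is only an illustrative sketch -- the time constant and drive levels below are made up, and real GtG curves are messier:

```python
import math

# Toy first-order pixel response: value(t) = drive + (start - drive) * exp(-t / tau).
# Overdrive sets `drive` past the target so the pixel crosses the target sooner.
def time_to_reach(start, target, drive, tau_ms, fraction=0.9):
    goal = start + fraction * (target - start)   # e.g. 90% of the transition
    return -tau_ms * math.log((goal - drive) / (start - drive))

tau_ms = 4.0  # assumed pixel time constant, purely illustrative

print(time_to_reach(0, 200, drive=200, tau_ms=tau_ms))  # no overdrive: ~9.2 ms
print(time_to_reach(0, 200, drive=255, tau_ms=tau_ms))  # overdriven:   ~4.9 ms
```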

Now, some panel technologies may be extremely tricky and require forward-AND-backward overdrive (lookahead algorithms), and those still need to add input lag during overdrive. But technically, lookahead is unnecessary.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter

Forum Rules wrote:  1. Rule #1: Be Nice. This is published forum rule #1. Even To Newbies & People You Disagree With!
  2. Please report rule violations If you see a post that violates forum rules, then report the post.
  3. ALWAYS respect indie testers here. See how indies are bootstrapping Blur Busters research!

StrobeMaster
Posts: 48
Joined: 25 Apr 2014, 01:31

Re: Does Overdrive and ULMB add Input Lag?

Post by StrobeMaster » 29 May 2017, 11:06

RealNC wrote:It is a native 1440p monitor.
Evidence?

User avatar
Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: Does Overdrive and ULMB add Input Lag?

Post by Chief Blur Buster » 29 May 2017, 14:41

StrobeMaster wrote:
RealNC wrote:It is a native 1440p monitor.
Evidence?
Already posted link:
http://techreport.com/news/31954/agon-a ... 0-or-240hz
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter

Forum Rules wrote:  1. Rule #1: Be Nice. This is published forum rule #1. Even To Newbies & People You Disagree With!
  2. Please report rule violations If you see a post that violates forum rules, then report the post.
  3. ALWAYS respect indie testers here. See how indies are bootstrapping Blur Busters research!

User avatar
RealNC
Site Admin
Posts: 3737
Joined: 24 Dec 2013, 18:32
Contact:

Re: Does Overdrive and ULMB add Input Lag?

Post by RealNC » 29 May 2017, 16:26

StrobeMaster wrote:
RealNC wrote:It is a native 1440p monitor.
Evidence?
This is a 1440p monitor. It's sold as such. If you claim it's not, YOU need to provide evidence.
Steam | GitHub | Stack Overflow
The views and opinions expressed in my posts are my own and do not necessarily reflect the official policy or position of Blur Busters.

drmcninja
Posts: 112
Joined: 09 May 2017, 10:26

Re: Does Overdrive and ULMB add Input Lag?

Post by drmcninja » 29 May 2017, 17:28

Chief Blur Buster wrote:Short answer: Not anymore. Overdrive generally no longer adds input lag.
[...]
This is fascinating, thanks for the write up!

Did all the old 120Hz NVIDIA 3D monitors have overdrive like that? I have an old Hanns-G HS233 from back in 2013 (probably made even earlier) which had an 'Overdrive' option in the OSD that appeared to reduce ghosting.

User avatar
Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: Does Overdrive and ULMB add Input Lag?

Post by Chief Blur Buster » 29 May 2017, 22:05

It's hard to say which monitors had full-frame-lag overdrive -- it truly varied quite a lot across the board especially in the days when half of the monitors began to have lagless overdrive capability.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter

Forum Rules wrote:  1. Rule #1: Be Nice. This is published forum rule #1. Even To Newbies & People You Disagree With!
  2. Please report rule violations If you see a post that violates forum rules, then report the post.
  3. ALWAYS respect indie testers here. See how indies are bootstrapping Blur Busters research!

User avatar
Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: Does Overdrive and ULMB add Input Lag?

Post by Chief Blur Buster » 29 May 2017, 22:07

RealNC wrote:
StrobeMaster wrote:
RealNC wrote:It is a native 1440p monitor.
Evidence?
This is a 1440p monitor. It's sold as such. If you claim it's not, YOU need to provide evidence.
Easy, RealNC. I already replied to him first (in a nicer, terser way than yours). It might be time to re-review the Blur Busters Happy Forum Guidelines, okay?

I like StrobeMaster's reverse engineering work and I don't want to upset him.

StrobeMaster is probably legitimately asking about the AG251FG due to the lingering confusion of 240Hz working only in 1080p on this 1440p monitor. The monitor is quite an odd beast, giving you either 1440p@144Hz or 1080p@240Hz, and it's totally understandable that he's asking this question because more than 50% of the articles on the Internet about this monitor don't even clarify or acknowledge this oddity (yet). This is probably caused by an internal bandwidth limitation -- there are lots more complexities than simply using two DisplayPort cables.
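
For a sense of scale, here's a rough bandwidth sketch (visible pixels at 24bpp only, ignoring blanking; the DP 1.2 payload figure is the usual ~17.28 Gbit/s after 8b/10b encoding -- and the real bottleneck inside this particular monitor may well be the scaler/TCON rather than the cable):

```python
# Rough link-bandwidth sanity check (visible pixels only, 24 bits per pixel).
# DP 1.2 (HBR2 x4) carries ~17.28 Gbit/s of payload after 8b/10b encoding.
DP12_PAYLOAD_GBPS = 17.28

def gbps(width, height, refresh_hz, bpp=24):
    return width * height * refresh_hz * bpp / 1e9

for name, mode in [("1440p@240Hz", (2560, 1440, 240)),
                   ("1440p@144Hz", (2560, 1440, 144)),
                   ("1080p@240Hz", (1920, 1080, 240))]:
    rate = gbps(*mode)
    fits = "fits" if rate < DP12_PAYLOAD_GBPS else "exceeds"
    print(f"{name}: ~{rate:.1f} Gbit/s -> {fits} DP 1.2 payload")
```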

I already mentioned this AG251FG quirk, so I've reminded him of the already-posted link.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter

Forum Rules wrote:  1. Rule #1: Be Nice. This is published forum rule #1. Even To Newbies & People You Disagree With!
  2. Please report rule violations If you see a post that violates forum rules, then report the post.
  3. ALWAYS respect indie testers here. See how indies are bootstrapping Blur Busters research!

StrobeMaster
Posts: 48
Joined: 25 Apr 2014, 01:31

Re: Does Overdrive and ULMB add Input Lag?

Post by StrobeMaster » 30 May 2017, 08:43

RealNC wrote:
StrobeMaster wrote:
RealNC wrote:It is a native 1440p monitor.
Evidence?
This is a 1440p monitor. It's sold as such. If you claim it's not, YOU need to provide evidence.
Yes, I can see that it is sold as such. And I thought I had already provided evidence that it is not native 1440p (viewtopic.php?f=4&t=3335&start=20#p26440). The numbers I mentioned in my post came from the PDF datasheet, and they point to a native horizontal resolution of 1920 pixels, which I took as rather strong evidence of a native 1080p panel. Note that I am not questioning that this monitor can display a 1440p signal, and I am not surprised by the "1440p@144Hz or 1080p@240Hz" quirk, but I would be surprised if it was using a native 1440p panel.

In the TechReport article (http://techreport.com/news/31954/agon-a ... 0-or-240hz) it just says "according to AOC's specs" and there is a link to the specs which, today, contain only very sparse information, and so does the PDF datasheet that can be downloaded from there - no numbers anymore.

I guess AOC is currently "updating" the specs to make them less misleading about what is actually in the box. Let's wait until AOC has finished updating.

User avatar
Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: Does Overdrive and ULMB add Input Lag?

Post by Chief Blur Buster » 30 May 2017, 10:01

Yes, spec-updating shenanigans are causing great confusion at this time.

We will have to wait and see what actually ships at launch. The story is probably more complex:
-- They had 1440p but switched to 1080p; or
-- They have 1080p with ability to downscale 1440p->1080p; or
-- They have 1440p hardware and there's "left hand vs right hand" confusion because 240Hz only works at 1080p.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter

Forum Rules wrote:  1. Rule #1: Be Nice. This is published forum rule #1. Even To Newbies & People You Disagree With!
  2. Please report rule violations If you see a post that violates forum rules, then report the post.
  3. ALWAYS respect indie testers here. See how indies are bootstrapping Blur Busters research!

Post Reply