Blur Busters Forums

Who you gonna call? The Blur Busters! For Everything Better Than 60Hz™

ULMB (LightBoost) & G-SYNC? [What are technical challenges?]

Advanced display talk, display hackers, advanced game programmers, scientists, display researchers, display manufacturers, vision researchers. The masters on Blur Busters.

Re: ULMB (LightBoost) & G-SYNC? [What are technical challenges]

Postby Trip » 26 Apr 2014, 05:30

flood wrote: Is it possible to force ULMB + G-SYNC, so that the strobe length is constant?

This would not cause visual problems if the game's frame rate is exactly consistent. For example, CS:GO can easily be capped to exactly 120 fps.

No, this is exactly the problem: G-SYNC's functionality would be lost. Bluntly put, G-SYNC fixes the problems that occur when fps ≠ refresh rate by adjusting the monitor's refresh rate to the frame rate. But the strobe interval is a bit like regular VSYNC: it is fixed, so it would not work. What you are suggesting could be achieved by just running VSYNC/adaptive VSYNC together with strobing. If G-SYNC and ULMB ran at the same time, you would get sync problems: strobes would happen while the display is still scanning out the image. The only way to fix this is to adjust the strobe rate to follow the refresh rate. But that creates brightness issues, since the strobes are still the same length yet happen less often. These last two parts are what people are now trying to fix, and especially the latter (the brightness problem) is really tough to implement right now.
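The brightness problem described above can be put in numbers with a toy duty-cycle model (the 2 ms pulse length and the rates below are illustrative assumptions, not measured monitor values): with a fixed strobe length, average brightness is proportional to how many strobes fire per second.

```python
def relative_brightness(pulse_ms: float, strobe_hz: float) -> float:
    """Fraction of the time the backlight is lit (duty cycle)."""
    return pulse_ms / 1000.0 * strobe_hz

# Same 2 ms pulse, strobed once per refresh (illustrative numbers):
at_120hz = relative_brightness(2.0, 120)  # duty cycle 0.24
at_90hz = relative_brightness(2.0, 90)    # duty cycle 0.18
print(at_90hz / at_120hz)  # ~0.75: a quarter of the brightness is lost
```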
Trip
 
Posts: 152
Joined: 23 Apr 2014, 15:44

Re: ULMB (LightBoost) & G-SYNC? [What are technical challenges]

Postby flood » 26 Apr 2014, 15:00

I thought the main difficulty is in strobe length rather than strobe timing...

Surely it can't be too difficult to strobe for a fixed amount of time after each refresh.
flood
 
Posts: 881
Joined: 21 Dec 2013, 01:25

Re: ULMB (LightBoost) & G-SYNC? [What are technical challenges]

Postby Trip » 26 Apr 2014, 18:36

flood wrote: I thought the main difficulty is in strobe length rather than strobe timing...

Correct. Timing the strobe is not really difficult, since the VSYNC signal is the trigger. After that, the display is refreshed; the time it takes to address all the pixels is also fixed (scanout runs at the maximum refresh rate). Next you want to know when the last pixel has finished transitioning to its new color; ideally that happens within the vblank time. Then the strobe fires, lighting up all those pixels that were sitting in the dark. But here the biggest problem starts, since you need to know how long to strobe. This needs to be done extremely precisely (microsecond-accurate), as Mark mentioned earlier in the thread.
There is currently no implementation in the monitor to account for this. I suggested controlling the length in software, since I thought this would be possible, but alas it is not.
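The fixed part of that timing can be sketched as follows; the refresh rate and pixel settling time here are assumed numbers for illustration, not specs of any particular panel.

```python
MAX_REFRESH_HZ = 144               # assumed panel maximum
SCANOUT_S = 1.0 / MAX_REFRESH_HZ   # scanout always runs at the max rate
PIXEL_SETTLE_S = 0.001             # assumed worst-case pixel transition time

def strobe_delay_after_vsync() -> float:
    """Seconds after the VSYNC trigger at which every pixel has been
    addressed and settled, so the backlight can safely flash."""
    return SCANOUT_S + PIXEL_SETTLE_S

print(f"{strobe_delay_after_vsync() * 1000:.2f} ms")  # ~7.94 ms
```

Because both terms are fixed, the *when* of the strobe is easy; only the *how long* (and how bright) varies with the frame rate.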
What Mark suggested, at least if I interpreted it right, is to gradually let the backlight come on, following some kind of sine-wave-like trajectory, during the vblank. At every VSYNC the light intensity of the strobe is recalculated: with PWM, a longer strobe; PWM-free, a gradually increasing light intensity (sine wave). So at fast frame rates you get short but really intense strobes, and as frame rates drop you get softer sine waves (in the PWM-free case), up to some limit, past which it stops strobing altogether and the backlight stays constant.
Think of it as throwing a rock into a pool of water: first you get violent, fast waves, which turn into ripples, which finally turn into a smooth surface. Here the rock is the VSYNC signal and the violent wave/ripple is the strobe.
The slope after the sine wave should mirror the slope before it, since this smoothly decreases the brightness while keeping the average brightness intact. Maybe some Paint will help; I hope this is an accurate way to describe it, since Mark uses a flat line in his examples rather than different frequencies in a single line. The red line is the average brightness that needs to be maintained; note that the straight slope before it will flatline, which is the brightness increase while it is waiting in vblank. In reality this line will probably have a lot more curve to it, but drawing that is rather painful to do with this mouse.
[Attachment: Strobes.png, 3.67 KiB]
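One way to put that idea in formula form (my own illustrative model, not Mark's or NVIDIA's actual algorithm): keep the light emitted per second constant by widening the strobe pulse as the refresh interval grows, until the pulse fills the whole interval and strobing degenerates into a steady sample-and-hold backlight.

```python
def pulse_width_s(interval_s: float, target_avg: float = 0.2,
                  peak: float = 1.0) -> float:
    """Width of a raised-cosine strobe that keeps average brightness constant.

    Each refresh must deliver target_avg * interval_s of light. A
    raised-cosine pulse of width w and peak p delivers p * w / 2,
    so w = 2 * target_avg * interval_s / peak, capped at the full
    interval: at the cap, strobing fades into sample-and-hold.
    (target_avg and peak are assumed, normalized luminance values.)
    """
    w = 2.0 * target_avg * interval_s / peak
    return min(w, interval_s)

print(pulse_width_s(1 / 120) * 1000)  # ~3.33 ms at 120 Hz
print(pulse_width_s(1 / 60) * 1000)   # ~6.67 ms at 60 Hz: wider, softer
```

Short, intense pulses at high refresh rates, progressively wider and gentler ones at low rates, exactly the rock-in-water picture above.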

flood wrote: Surely it can't be too difficult to strobe for a fixed amount of time after each refresh.

So you see, it is not a fixed amount of time for which it needs to strobe, which is exactly the problem. The strobe should follow some kind of smoothly increasing curve to attain the average brightness at lower frame rates.
Trip wrote: But that creates brightness issues, since the strobes are still the same length yet happen less often. These last two parts are what people are now trying to fix, and especially the latter (the brightness problem) is really tough to implement right now.

Maybe I wasn't clear, but the part in italics mostly has to do with strobe length, not with the timing (when to strobe).
Trip
 
Posts: 152
Joined: 23 Apr 2014, 15:44

Re: ULMB (LightBoost) & G-SYNC? [What are technical challenges]

Postby flood » 26 Apr 2014, 18:47

Trip wrote: So you see, it is not a fixed amount of time for which it needs to strobe, which is exactly the problem. The strobe should follow some kind of smoothly increasing curve to attain the average brightness at lower frame rates.

I know, I understand that.
But what I'm asking is whether it's possible, using existing hardware, to have strobed G-SYNC with a fixed strobe length.

I know this causes problems when there's frame-time jitter or fps drops, but I want to know if it's even possible right now.
flood
 
Posts: 881
Joined: 21 Dec 2013, 01:25

Re: ULMB (LightBoost) & G-SYNC? [What are technical challenges]

Postby Trip » 26 Apr 2014, 18:57

Technically yes, it is, but this defeats the purpose of both G-SYNC and strobed backlights whenever the fps is not equal to the maximum display refresh rate (or whatever you set it to). You will end up with strange artefacts while this is the case, since the strobe rate is out of sync with the refresh rate. What G-SYNC basically does is extend the blanking time until a frame is ready. But if the frame isn't ready, the strobe will still fire while the display is scanning out, which creates colors that are off, double images, etc. Even when the refresh rate is back up to the maximum attainable refresh rate, it will still be out of sync, since it never corrected for it.
I don't know if NVIDIA will even let you do this; I don't own the monitor, so I can't test it.
Trip
 
Posts: 152
Joined: 23 Apr 2014, 15:44

Re: ULMB (LightBoost) & G-SYNC? [What are technical challenges]

Postby Trip » 26 Apr 2014, 19:04

By the way, this will not cause fps drops at all, since G-SYNC will still operate correctly; you just get a flawed presentation on the screen itself. See it like the tearing problem: those tears are pretty similar to what is happening with these strobes. Instead of getting a torn-apart image, you now get a frame that may or may not be ready to be displayed.
Trip
 
Posts: 152
Joined: 23 Apr 2014, 15:44

Re: ULMB (LightBoost) & G-SYNC? [What are technical challenges]

Postby Trip » 28 Apr 2014, 05:14

Ah flood, I think I finally understand your question. You mean to just use G-SYNC in conjunction with a frame-rate limit of, say, 120 fps, and then assume it stays at 120 fps all the time, so you can eliminate the lag created by VSYNC, right? In that case I see no reason why this is not possible; maybe NVIDIA put a lock somewhere that does not allow it, but technically I see no reason why it would not work. Be warned, though: whenever the fps dips you get terrible brightness issues, going from normal at 120 fps to really dim at 90 fps, since the strobes happen less often. But if you want to use it for that, then I can see some potential for games like Quake Live, CS, TF2, and other games with really steady frame rates. Quake at 125 fps, G-SYNCed with LightBoost, would be amazing.
But you will have to consult someone with a G-SYNC monitor to see if NVIDIA allows it, as I don't know.
Trip
 
Posts: 152
Joined: 23 Apr 2014, 15:44

Re: ULMB (LightBoost) & G-SYNC? [What are technical challenges]

Postby Chief Blur Buster » 08 May 2014, 23:10

Trip wrote: Ah flood, I think I finally understand your question. You mean to just use G-SYNC in conjunction with a frame-rate limit of, say, 120 fps, and then assume it stays at 120 fps all the time, so you can eliminate the lag created by VSYNC, right?

Only if this framerate cap is done by the game engine (and not NVIDIA).

Trip wrote: In that case I see no reason why this is not possible; maybe NVIDIA put a lock somewhere that does not allow it, but technically I see no reason why it would not work.

This discussion is silly. This can't be reliably done at the driver/hardware level.

Even if you do this, you still need to modify the game engine to use ultra-low-latency VSYNC ON algorithms, such as just-in-time VSYNC. At that stage, capped GSYNC and VSYNC ON would have the same input lag, and at that point, what is the point of GSYNC?

External throttling of a game's frame rate adds latency by forcing the game to wait before displaying the frame. You need to let the game engine decide when to display its frame, so that it minimizes the gap between input reads and the display of the frame.

Remember: 8-bit games and Nintendo games (including Super Mario Brothers) were VSYNC ON and had no input lag. A couple of decades ago we played Street Fighter and similar games; they didn't have noticeable lag. VSYNC ON only became evil because of 3D graphics accelerators and their framebuffered architectures. VSYNC ON wasn't high-lag before 3D framebuffers; it's how it is used nowadays.

TL;DR: Game engines need to do in-engine frame-rate capping for low-latency fixed frame rates (including for VSYNC ON, and including for capped-frame-rate GSYNC). External frame rate limiters (e.g. NVIDIA drivers, or a monitor limit such as the GSYNC max rate) always add input lag. Always limit the frame rate at the source (write your game engine accordingly), not at the destination; that way, the source never waits on (adds lag for) the destination.
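A minimal sketch of in-engine capping in that spirit (an illustrative loop, not code from any real engine): the limiter waits *before* sampling input and rendering, so the freshest input goes into the frame about to be shown, instead of a finished frame sitting and waiting at an external limiter.

```python
import time

def run_capped(render_frame, frames: int, fps: float = 120.0) -> None:
    """In-engine frame limiter: wait first, then read input and render
    as late as possible before each frame deadline (illustrative sketch)."""
    frame_s = 1.0 / fps
    deadline = time.perf_counter()
    for _ in range(frames):
        # Coarse sleep, then a short spin for sub-millisecond accuracy.
        while (remaining := deadline - time.perf_counter()) > 0:
            time.sleep(remaining - 0.001 if remaining > 0.002 else 0)
        render_frame()        # sample input + draw here, just in time
        deadline += frame_s
```

The key design choice is that the delay sits before `render_frame()`, not after it: an external limiter can only delay an already-rendered (already-stale) frame.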
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter!
Chief Blur Buster
Site Admin
 
Posts: 3438
Joined: 05 Dec 2013, 15:44

Re: ULMB (LightBoost) & G-SYNC? [What are technical challenges]

Postby StrobeMaster » 31 Oct 2016, 10:22

I have written down a pretty simple idea of how to combine variable refresh rate and a strobed backlight, implementation-wise also pretty straightforward. It may well be that this idea is not exactly new, but at least I have never seen it clearly spelled out like this. And if the idea is not new, I wonder what the manufacturers are waiting for; it is too simple not to be implemented.

NVIDIA has a pending patent on the topic (US 2015/0109286 A1, "System, method, and computer program product for combining low motion blur and variable refresh rate in a display"), but I don't think it includes my idea. The patent does cover at least part of Chief Blur Buster's ideas ("Strobing on variable refresh displays"), which he posted just one day after NVIDIA filed the patent.
StrobeMaster
 
Posts: 42
Joined: 25 Apr 2014, 01:31

Re: ULMB (LightBoost) & G-SYNC? [What are technical challenges]

Postby Sparky » 31 Oct 2016, 16:02

StrobeMaster wrote: I have written down a pretty simple idea of how to combine variable refresh rate and a strobed backlight, implementation-wise also pretty straightforward. It may well be that this idea is not exactly new, but at least I have never seen it clearly spelled out like this. And if the idea is not new, I wonder what the manufacturers are waiting for; it is too simple not to be implemented.

NVIDIA has a pending patent on the topic (US 2015/0109286 A1, "System, method, and computer program product for combining low motion blur and variable refresh rate in a display"), but I don't think it includes my idea. The patent does cover at least part of Chief Blur Buster's ideas ("Strobing on variable refresh displays"), which he posted just one day after NVIDIA filed the patent.
There are a bunch of techniques that might work, but the hard part is the human one. It's going to take a lot of testing to figure out which artifacts people notice, and how to tune everything to avoid noticeable flicker (which is nasty, because different people have different sensitivities to it).

As for the artifact caused by your suggested method (for the moment, let's say you can tune it to completely eliminate visible flicker): say your eyes track something across the screen. With pure low persistence it's a sharp image; with pure sample-and-hold it's a continuous blur; with your low-frame-rate mode you'll get a sharp image with a trail that looks like ghosting. Maybe that's a benign artifact, maybe people will find it more annoying than sample-and-hold; the only way to find out is to try it.
Sparky
 
Posts: 514
Joined: 15 Jan 2014, 02:29
