Why aren't gamers worshipping at the altar of strobing?

Everything about displays and monitors. 120Hz, 144Hz, 240Hz, 4K, 1440p, input lag, display shopping, monitor purchase decisions, compare, versus, debate, and more. Questions? Just ask!
hasham12
Posts: 1
Joined: 03 Jun 2015, 09:51

Re: Why aren't gamers worshipping at the altar of strobing?

Post by hasham12 » 03 Jun 2015, 10:02

Great job, sir, you are really amazing, and going from 50 Hz to 60 Hz is awesome.

RLBURNSIDE
Posts: 104
Joined: 06 Apr 2015, 16:09

Re: Why aren't gamers worshipping at the altar of strobing?

Post by RLBURNSIDE » 07 Jun 2015, 18:58

After owning a 144 Hz G-Sync LCD for the past week, I will never go back.

However, I am curious about ways to improve sample-and-hold motion blur in the context of variable refresh rates with a wide range of frame rates, 30-144 Hz. Say we wanted to add some black time between frames to reduce sample-and-hold blur. We could guarantee an upper limit on the motion blur by intercepting through SweetFX or some DirectX wrapper and inserting a black frame X milliseconds after the last non-black frame. If a new frame arrives in the meantime, use that; if not, send the black frame.

That should do it, no? A G-Sync display, in theory, at least above 30 fps, holds the last frame sent by the GPU, so it is "holding" that frame for anywhere between 1/30th and 1/144th of a second. If you instead wanted to guarantee that a frame is held for at most 1/90th of a second, you would send a black frame to the display after each frame in the 30-90 fps range (or after the equivalent X ms have elapsed since the last swap or Present call).
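
A minimal sketch of that presentation logic, assuming a hypothetical wrapper where try_get_new_frame and present stand in for the real swap-chain hooks (Python, just to illustrate the timing, not an actual SweetFX/DirectX hook):

```python
import time

BLACK_FRAME = object()       # stand-in for an all-black buffer
MAX_HOLD_S = 1.0 / 90.0      # no rendered frame should stay on screen longer than this

def vrr_bfi_loop(try_get_new_frame, present):
    """Toy VRR + black-frame-insertion loop.
    try_get_new_frame(timeout) returns a finished frame if the renderer produced
    one within `timeout` seconds, otherwise None; present() sends it to the display."""
    last_presented_at = time.perf_counter()
    while True:
        remaining = MAX_HOLD_S - (time.perf_counter() - last_presented_at)
        frame = try_get_new_frame(max(remaining, 0.0))
        if frame is not None:
            present(frame)        # a new real frame arrived before the cap expired
        else:
            present(BLACK_FRAME)  # cap reached: blank the screen to limit persistence
        last_presented_at = time.perf_counter()
```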

This would mean, of course, that above 90 fps your brightness would be at 100%, and below that it would fall somewhere in the 30-100% range. Such a large range of brightness would be hugely distracting, since the human eye is very sensitive to changes in luminance, so you would need to boost the gamma of frames in the 30-90 fps range to perceptually compensate for the dimness. Perceived luminance is an accumulated value in the human eye (i.e. it is integrated over time within some small interval window), so the instantaneous brightness you see at time T is effectively an average over the recent past. This would, in theory, let you compensate for the loss of brightness when trying to combine a variable refresh / frame rate cadence with ULMB.
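
Roughly, the numbers work out like this (a back-of-envelope sketch; the 2.2 gamma exponent is an assumption, and the 1/90 s cap is carried over from above):

```python
MAX_HOLD_S = 1.0 / 90.0   # hold cap from the black-frame scheme above
GAMMA = 2.2               # assumed display gamma

def lit_fraction(frame_period_s):
    """Fraction of the frame period the real image is on screen; the rest is black."""
    return min(MAX_HOLD_S, frame_period_s) / frame_period_s

def shader_gain(frame_period_s):
    """Multiplier on gamma-encoded pixel values that restores the original
    time-averaged linear luminance: (1 / lit_fraction) ^ (1 / GAMMA).
    Any pixel the gain pushes past 1.0 clips, so bright content cannot be
    fully compensated."""
    return (1.0 / lit_fraction(frame_period_s)) ** (1.0 / GAMMA)

for fps in (144, 90, 60, 30):
    p = 1.0 / fps
    print(f"{fps:>3} fps  lit {lit_fraction(p):.0%}  shader gain x{shader_gain(p):.2f}")
```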

I wonder if it's possible to reduce the artifacts enough to make it usable, but given that humans are very sensitive to luma changes, I'd say they have their work cut out for them. It's easy if, say, you have a 120 Hz display and a 60 Hz input signal: you just display one black frame after every signal frame, and your brightness drops by 50%, but so does your motion blur. In general, this is going to make super-bright displays more and more necessary for strobing.

It should be possible to simulate some of these G-Sync + ULMB ideas with post-processing shaders in a game where your FPS varies wildly but stays in the 60-90 fps range, on a 30-144 Hz G-Sync or FreeSync monitor.

Personally, I don't care too much about ULMB; with my new G-Sync monitor I don't see much blur at all, and I doubt you will either with 120 Hz OLEDs. You could probably get away with a 50% duty-cycle ULMB given that OLEDs on smartphones reach 700-800 nits, which would bring them down to the order of a typical LCD monitor after all is said and done. Although the way Carmack and the Morpheus guys are reducing input lag now is by guaranteeing a constant 120 Hz output, reprojecting the last frame in screen space with new head and object positions: basically "faking" 120 Hz. The PS4 needs to do this because, hey, good luck getting more than 60 real frames per second out of a pathetic console GPU, am I right?

Edmond

Re: Why aren't gamers worshipping at the altar of strobing?

Post by Edmond » 08 Jun 2015, 06:38

RLBURNSIDE wrote:snip
I liked what Michael Abrash said in a recent post somewhere; I can't remember where now.

He was talking about VR and how to get low persistence with a variable refresh rate.
He said one way would be to do a real 100 Hz with variable refresh rate, but have each refreshed frame repeated a bunch of times, the way TVs do it.
I don't remember all the details, but if I remember correctly, the result was a 0-100 Hz variable-refresh OLED that is flicker-free and low-persistence, because the repeated frames push it up to 500 or 1000 FAKE Hz.

If this is true and I remember it correctly, I'd definitely want a monitor like that.

One thing is certain though: with this new VR craze, top people are working on how to combine variable refresh rate with low persistence without mad flicker and WITHOUT needing 500 fps @ 500 Hz.
So I put my faith in those nerds.

Edmond

Re: Why aren't gamers worshipping at the altar of strobing?

Post by Edmond » 08 Jun 2015, 06:41

flood wrote:well at least we're past the days when a "gaming" monitor was a piece-of-crap 60 Hz TN, overdriven so they can advertise 1 ms
And this. A hundred times - this.

Glide
Posts: 280
Joined: 24 Mar 2015, 20:33

Re: Why aren't gamers worshipping at the altar of strobing?

Post by Glide » 08 Jun 2015, 07:07

RLBURNSIDE wrote:Such a large range of brightness would be hugely distracting, since the human eye is very sensitive to changes in luminance, so you would need to boost the gamma of frames in the 30-90 fps range to perceptually compensate for the dimness.
You can't really account for that loss of brightness with a gamma shift. It just doesn't work.
Another area where people have tried this is when emulating scanlines for old 2D games.

"True" scanlines would be to blank out every other line, and make up for the loss of brightness by making the display itself brighter.
There have been many attempts at emulating scanlines in a way which blacks out those lines and then tries to restore the original brightness of the image with gamma correction, but it just does not work at all. You get weird color shifts and highlight compression.
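
A quick numeric illustration of that highlight compression, assuming a 2.2-gamma display and a 50% black duty cycle as in the scanline case:

```python
GAMMA = 2.2
DUTY = 0.5                              # half the lines (or half the time) are black
GAIN = (1.0 / DUTY) ** (1.0 / GAMMA)    # ~1.37x on encoded values to restore average luminance

def compensated(v):
    """Apply the gain to a gamma-encoded value in [0, 1]; anything above 1.0 clips."""
    return min(v * GAIN, 1.0)

for v in (0.10, 0.25, 0.50, 0.75, 0.90, 1.00):
    out = compensated(v)
    restored = DUTY * (out ** GAMMA) / (v ** GAMMA)   # average luminance vs. original
    print(f"in {v:.2f} -> out {out:.2f}, luminance restored: {restored:.0%}")
```

Everything below roughly 0.73 compensates cleanly, but brighter values clip, which is exactly the highlight compression (and, once the three channels clip at different points, the color shifts) described above.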

I'm not really sure that there's a good solution for VRR+ULMB until we actually have fine control over not only the strobe duration, but also the strobe brightness.

As long as the backlight is bright enough, and your maximum brightness is limited by the lowest framerate rather than the highest, you should be able to keep brightness equal. But then you have to deal with flicker below 50 Hz, and for most people that is absolutely unacceptable.
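
To make that concrete (a rough sketch; the 300-nit strobe luminance, 2 ms pulse, and 50 fps floor are made-up numbers): if the brightness target is set by the slowest frame you allow, the pulse just has to shrink proportionally as the frame rate rises.

```python
STROBE_NITS = 300.0      # assumed panel luminance while the strobe is lit
MIN_FPS = 50.0           # assumed lowest frame rate we want to support
PULSE_AT_MIN_MS = 2.0    # assumed pulse length at the lowest frame rate

# Perceived brightness of one strobe per refresh = STROBE_NITS * pulse / period.
TARGET_NITS = STROBE_NITS * PULSE_AT_MIN_MS / (1000.0 / MIN_FPS)

def pulse_for_constant_brightness(fps):
    """Pulse length (ms) that keeps perceived brightness at TARGET_NITS as the
    VRR frame period changes; shorter frames need shorter pulses."""
    return TARGET_NITS * (1000.0 / fps) / STROBE_NITS

for fps in (50, 60, 100, 144):
    print(f"{fps:>3} fps -> pulse {pulse_for_constant_brightness(fps):.2f} ms")
```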

If you start to repeat frames to reduce this flicker, you undo the benefits of VRR and introduce repeated-strobe artifacts (double images).
If you interpolate (reprojection) to reduce this flicker, you give up the clarity benefit that strobing was supposed to provide.

And I think that a variable flicker frequency would be a huge problem for most people. Even with a 60 Hz CRT, after a short amount of time you get used to the flicker. But if you change it to a different frequency, suddenly that flicker is very noticeable again. If the frequency is constantly changing, flicker might always be a distraction.

I suppose that as you approach lower framerates you could back off the strobing (lengthen the strobe duration), increasing the amount of motion blur that we see, but I'm not sure whether that's the right approach either.


As much as I'd like to see it, I'm not actually sure that there is a good solution which lets you combine the two technologies.
I'd much rather see ULMB offered at refresh rates lower than 85 Hz, since keeping newer games above 85 FPS at 1440p is not a realistic goal on today's hardware.
RLBURNSIDE wrote:Personally, I don't care too much about ULMB; with my new G-Sync monitor I don't see much blur at all, and I doubt you will either with 120 Hz OLEDs.
I've seen a few people say similar things, and it has me wondering whether people are just used to how bad 60Hz full-persistence displays are after a decade of using them, or if they actually can't see the difference.

There's a significant difference in motion clarity to my eyes between strobed displays and full-persistence displays when tracking objects in motion.
Edmond wrote:I liked what Michael Abrash said in a recent post somewhere; I can't remember where now.

He was talking about VR and how to get low persistence with a variable refresh rate.
He said one way would be to do a real 100 Hz with variable refresh rate, but have each refreshed frame repeated a bunch of times, the way TVs do it.
I don't remember all the details, but if I remember correctly, the result was a 0-100 Hz variable-refresh OLED that is flicker-free and low-persistence, because the repeated frames push it up to 500 or 1000 FAKE Hz.

If this is true and I remember it correctly, I'd definitely want a monitor like that.
With VR, updating the display at a fixed rate is essential. If you don't update at a fixed rate, people are very likely to get motion sickness.

And this is where reprojection comes in. They take the previous frame and distort it to "fake" a new frame until the GPU has actually rendered a new image.

With this, you could render at 60 FPS but update the headset at 500 FPS when turning your head.
It's essentially the same thing as motion interpolation except you're calculating the new frame with positional data from the head tracker rather than estimating the motion.
And because you're using head tracking, the differences between each frame are much smaller than if you were using a mouse for control, since the movement speed is going to be far slower.
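
A toy, yaw-only illustration of the idea (purely rotational shift of the last frame, with made-up numbers; real reprojection warps in screen space using depth and the full head pose):

```python
def reproject_yaw(rendered_yaw_deg, current_yaw_deg, horizontal_fov_deg, width_px):
    """Horizontal pixel shift that re-centres the last rendered frame for the
    head pose at display time (yaw-only, small-angle approximation)."""
    pixels_per_degree = width_px / horizontal_fov_deg
    return (current_yaw_deg - rendered_yaw_deg) * pixels_per_degree

# Render at 60 fps, but refresh the headset far more often using the latest head pose:
last_rendered_yaw = 10.0                       # degrees, pose the frame was rendered for
for head_yaw in (10.0, 10.2, 10.4, 10.6):      # head pose sampled at each fast refresh
    shift = reproject_yaw(last_rendered_yaw, head_yaw, horizontal_fov_deg=100.0, width_px=1080)
    print(f"head at {head_yaw:.1f} deg -> shift old frame by {shift:.1f} px")
```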

So it's possible that this could work for VR, since it is more important to be updating the screen at a fixed rate than having the absolute best image quality, and if you were updating at 500Hz, that could be similar to a 2ms low-persistence display without the flicker.

But the technique is not really going to be suitable for desktop monitors when you're controlling a game with a keyboard and mouse.

Gryz
Posts: 13
Joined: 27 Apr 2015, 12:36

Re: Why aren't gamers worshipping at the altar of strobing?

Post by Gryz » 10 Jun 2015, 07:12

I think the best solution to deal with both stutter (G-Sync) and anti-blur (ULMB) would be to have 1000 Hz monitors.

The problem with stutter is that if the GPU isn't capable of rendering a frame every 16.6 ms, you will see some frames shown for 33.3 ms. And if the GPU is really fast, you will see some frames skipped. That difference between 16.6 and 33.3 is too big. G-Sync makes it so a frame can be shown for anywhere between roughly 7 ms and 33 ms (on a 30-144 Hz panel), so the displayed times are closer to the time that was needed for rendering.

What if a screen could do 500 Hz?
Then the screen could show a frame multiple times. Suppose it took the GPU 21 ms to render a frame. The monitor can then show it 11 times for 2 ms each, so the human eye sees it for a time that's pretty close to the time it was supposed to be shown. The error margin is under 2 ms. A lot better than the 16.6 ms error margin of a 60 Hz screen.
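
A small sketch of that repeat-until-done logic at 500 Hz (numbers just to illustrate the quantization error):

```python
import math

REFRESH_HZ = 500
REFRESH_MS = 1000.0 / REFRESH_HZ   # 2 ms per refresh

def displayed_ms(render_time_ms):
    """Repeat the frame for whole refreshes until its intended duration has passed;
    the shown time can only overshoot the render time by less than one refresh."""
    repeats = math.ceil(render_time_ms / REFRESH_MS)
    return repeats * REFRESH_MS

for t in (21.0, 13.5, 7.2):
    shown = displayed_ms(t)
    print(f"rendered in {t} ms -> shown for {shown} ms (error {shown - t:.1f} ms)")
```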

Now what if that screen could do 1000 Hz?
You could then do the same trick the Eizo FG2421 does: insert a black frame in between every real frame. That should get rid of motion blur. With a 1000 Hz monitor you can minimize stutter to the point that it's not noticeable anymore, and you can do anti-blur as well.

All we need is a panel that can do 1000 Hz and has a true pixel response time of under 1 ms.
Then there would be no need for variable backlight pulses. Much easier, I'd guess.

Black Octagon
Posts: 216
Joined: 18 Dec 2013, 03:41

Re: Why aren't gamers worshipping at the altar of strobing?

Post by Black Octagon » 11 Jun 2015, 11:27

Gryz wrote:I think the best solution to deal with both stutter (G-Sync) and anti-blur (ULMB) would be to have 1000 Hz monitors.

The problem with stutter is that if the GPU isn't capable of rendering a frame every 16.6 ms, you will see some frames shown for 33.3 ms. And if the GPU is really fast, you will see some frames skipped. That difference between 16.6 and 33.3 is too big. G-Sync makes it so a frame can be shown for anywhere between roughly 7 ms and 33 ms (on a 30-144 Hz panel), so the displayed times are closer to the time that was needed for rendering.

What if a screen could do 500 Hz?
Then the screen could show a frame multiple times. Suppose it took the GPU 21 ms to render a frame. The monitor can then show it 11 times for 2 ms each, so the human eye sees it for a time that's pretty close to the time it was supposed to be shown. The error margin is under 2 ms. A lot better than the 16.6 ms error margin of a 60 Hz screen.

Now what if that screen could do 1000 Hz?
You could then do the same trick the Eizo FG2421 does: insert a black frame in between every real frame. That should get rid of motion blur. With a 1000 Hz monitor you can minimize stutter to the point that it's not noticeable anymore, and you can do anti-blur as well.

All we need is a panel that can do 1000 Hz and has a true pixel response time of under 1 ms.
Then there would be no need for variable backlight pulses. Much easier, I'd guess.
And of course, we've been hearing for years that OLED is capable of both those response times and those refresh rates. Not that any of the OLEDs hitting the market have anywhere near those specs, of course :(

Falkentyne
Posts: 2795
Joined: 26 Mar 2014, 07:23

Re: Why aren't gamers worshipping at the altar of strobing?

Post by Falkentyne » 11 Jun 2015, 15:47

Glide wrote:Wait, what? I thought the whole point of ULMB was that the strobe was synchronized with the LCD panel, and the backlight would only switch on when all transitions were completed.

That sounds awful.
That's LCD technology for you.
You can't control when all transitions are completed on an LCD screen, because the transition time depends on which colors the pixels are changing between.

For example, the transition is the FASTEST when a grey pixel shade changes to another grey pixel shade (1-2 ms).
The transition is the slowest when a pixel changes from white to black and back to white (>12 ms).

So how are you supposed to control a backlight strobe to compensate for this? You can't. Then there's the top-to-bottom refresh scanout, vertical blanking times, and so on...

The only time you would not have this type of artifact on an LCD is if the HIGHEST (slowest) transition time for white-to-black-to-white were 1 ms (this would match CRT phosphor decay times).

A CRT or a fast-persistence OLED screen would not have this type of issue. LCDs will, and do.

Glide
Posts: 280
Joined: 24 Mar 2015, 20:33

Re: Why aren't gamers worshipping at the altar of strobing?

Post by Glide » 11 Jun 2015, 18:56

Falkentyne wrote:So how are you supposed to control a backlight strobe to compensate for this? You can't. Then there's the top-to-bottom refresh scanout, vertical blanking times, and so on...

The only time you would not have this type of artifact on an LCD is if the HIGHEST (slowest) transition time for white-to-black-to-white were 1 ms (this would match CRT phosphor decay times).
60Hz = 16.67ms

For a strobe duration of 1ms:
  • Send image to display
  • Wait 15.67ms for panel to update
  • Switch backlight on for 1ms
  • Repeat
Higher refresh rates require faster panels, but reviews at sites like TFT Central indicate that the better gaming-optimized panels should be able to complete all transitions in under 7.33 ms (120 Hz = 8.33 ms).

If the panel can complete all transitions faster than 15.67 ms, reduce the wait time to minimize latency; see the timing sketch below.
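
A minimal timing sketch of that wait-then-flash idea (the numbers and the strobe_schedule helper are illustrative only, not how any real scaler is programmed):

```python
def strobe_schedule(refresh_hz, strobe_ms, worst_case_transition_ms):
    """Return (wait_ms, strobe_ms): how long to wait after sending the image
    before firing the strobe, so the backlight only lights a fully settled panel.
    Returns None if the panel cannot settle within one refresh (crosstalk)."""
    period_ms = 1000.0 / refresh_hz
    max_wait_ms = period_ms - strobe_ms          # e.g. 16.67 - 1 = 15.67 ms at 60 Hz
    if worst_case_transition_ms > max_wait_ms:
        return None                              # transitions bleed into the strobe
    # Fire the strobe as soon as the panel has settled, to cut latency.
    return max(worst_case_transition_ms, 0.0), strobe_ms

for hz, trans in ((60, 12.0), (120, 7.0), (120, 12.0)):
    print(hz, "Hz, worst transition", trans, "ms ->", strobe_schedule(hz, 1.0, trans))
```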

You don't need response times of 1ms. That's why it's a benefit that the backlight and the LCD panel are not "coupled together" like this.
On OLED displays, the pixels themselves are both the light source and the image, so you can't "hide" artifacts like this.

And I'm not convinced that OLEDs are as instantaneous across all transitions as manufacturers would have you believe.
But maybe that will be true by the time we have RGB OLEDs suitable for use as computer displays.

Falkentyne
Posts: 2795
Joined: 26 Mar 2014, 07:23

Re: Why aren't gamers worshipping at the altar of strobing?

Post by Falkentyne » 11 Jun 2015, 20:17

This is EXTREMELY complicated. What you're describing is far simpler than what is really happening here.

You are right about the response time.
The reason 60 Hz has LESS STROBE CROSSTALK (crosstalk, not overdrive ghosting) is that the panel has more time to complete transitions than at something like 120 Hz. You can test it with BenQ Blur Reduction with the single-strobe "ON" setting at 60 Hz, compared to 120 Hz (WITHOUT a VT tweak). You will notice that at 120 Hz, the crosstalk in the fullscreen test on this page:

http://www.testufo.com/#test=photo&phot ... &height=-1

takes up literally 33% of the bottom of the screen.

At 60 Hz, meanwhile (single strobe must be enabled in the service menu), the crosstalk only covers about 10% of the bottom.

That's because at 60hz, the response time and refresh rate are low enough that the panel has enough time (within the vertical blanking period) to complete the pixel transitions.

You can see the same thing in LightBoost mode (which uses per-line overdrive, which BenQ Blur Reduction does not, and accelerated LCD panel scanout, which BenQ Blur Reduction needs VT tweaks to accomplish) when comparing 100 Hz vs 120 Hz LightBoost.

I'm probably using grossly incorrect terms, but I'm sure the Chief (where is the Chief Blur Buster?) or masterotaku can explain far better than I can.
