Why aren't gamers worshipping at the altar of strobing?

Everything about displays and monitors. 120Hz, 144Hz, 240Hz, 4K, 1440p, input lag, display shopping, monitor purchase decisions, compare, versus, debate, and more. Questions? Just ask!
SeeNoWeevil
Posts: 22
Joined: 21 Apr 2014, 16:40

Why aren't gamers worshipping at the altar of strobing?

Post by SeeNoWeevil » 23 May 2015, 04:48

Strobing is the best thing to happen to gaming displays since LCD screens became mainstream, imo. And yet, all gamers seem to want to do is push for higher and higher static resolutions at 60Hz. The concept of motion resolution just doesn't appear to exist. If you explain to someone how much of that detail is lost as soon as anything begins to move (i.e. always), you'll be met with a blank stare. A 1440p monitor is a 1440p monitor. A 1440p monitor is more detailed than a 1080p monitor. One number is bigger than the other. G-Sync appears to have made this even worse: not only does it preclude the use of strobing (no one chooses strobing over variable sync), but it makes even lower refresh rates 'smooth', further increasing motion blur. People with powerful machines and G-Sync displays are dropping to 30fps to maintain 2160p. All that GPU power burnt to throw away so many pixels in display blur :cry:

I guess you really need to see it first hand to appreciate it. I can't imagine anyone could watch the UFO test while turning strobing off and on and not be shocked at the results. Seeing 720p strobed resolve far more detail than 1080p non-strobed would have me marching my 60Hz LCD out to the tip.
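For anyone new to the "motion resolution" idea above, it comes down to persistence: how long each frame stays lit while your eye tracks across it. A rough sketch of the standard rule of thumb (the 960 px/s panning speed and 1 ms pulse are illustrative assumptions, not figures from the post):

```python
# Rule of thumb: perceived motion blur in pixels is roughly
# eye-tracking speed (px/s) times pixel persistence (s).
def blur_px(speed_px_per_s, persistence_ms):
    return speed_px_per_s * persistence_ms / 1000.0

speed = 960  # px/s, a typical UFO-test panning speed

# Full-persistence sample-and-hold at 60Hz holds each frame ~16.7 ms:
print(blur_px(speed, 1000 / 60))  # -> ~16 px of smear

# A hypothetical 1 ms strobe pulse:
print(blur_px(speed, 1.0))        # -> ~1 px, CRT-like
```

That ~16 px of smear is why a strobed 720p image can out-resolve a sample-and-hold 1080p one in motion.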

User avatar
lexlazootin
Posts: 1251
Joined: 16 Dec 2014, 02:57

Re: Why aren't gamers worshipping at the altar of strobing?

Post by lexlazootin » 23 May 2015, 06:13

Have you tried "high speed" gaming on 144Hz G-Sync?

I think strobing is an amazing technology, but I would argue that because of frame tearing and the image crosstalk that appears at the top or bottom of the screen, you are getting a very clear picture, but it's a very clear picture of rubbish. I also don't like the mouse ghosting effect you get when your eyes move.

G-Sync, on the other hand, does an amazing job of altering the overdrive to get a much clearer image compared to most monitors; nowhere near as good as strobing, but pretty freaking good. I think in total you get much more consistently good 'information' compared to strobing, but I do wish I could mix the two technologies. Maybe in the future. :)

Glide
Posts: 280
Joined: 24 Mar 2015, 20:33

Re: Why aren't gamers worshipping at the altar of strobing?

Post by Glide » 23 May 2015, 12:40

There are a lot of issues with strobing tech right now.
Though BenQ seem to have their own strobe tech which is more universal, the only other real solution is NVIDIA's ULMB, and that is limited to 85/100/120Hz.
So it's already useless for all the games which are (stupidly) locked to 60 FPS.

Strobing also significantly reduces your brightness.
To achieve CRT-like motion, the strobe duration has to be so short that you lose more than 90% of your original brightness.
When your monitor is only using a 300-400 cd/m² backlight, that's far too much.
We need to see 800-1000 cd/m² backlights before that becomes viable for most people, who want something in the 100-200 cd/m² brightness range.
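The brightness numbers above are just duty-cycle arithmetic; a sketch (the 1 ms pulse and 350 cd/m² backlight are illustrative assumptions):

```python
# A strobed backlight's effective brightness scales with its duty cycle:
# duty = pulse width (s) * strobe rate (Hz).
def effective_brightness(peak_cd_m2, pulse_ms, strobe_hz):
    duty = (pulse_ms / 1000.0) * strobe_hz
    return peak_cd_m2 * duty

# A 1 ms pulse at 120Hz is a 12% duty cycle, so a 350 cd/m² backlight
# delivers only ~42 cd/m², well under the 100-200 cd/m² most people run.
print(effective_brightness(350, 1.0, 120))  # -> ~42
```

Shortening the pulse toward CRT-like clarity shrinks the duty cycle further, which is exactly the >90% brightness loss described above.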

Contrast, at least subjectively, seems to take a hit as well from having to crank up the backlight like this, and uniformity certainly appears worse.

Another issue is that with strobing you absolutely cannot deviate from a 100% consistent framerate.
As soon as your framerate deviates from a locked 60 FPS at 60Hz, 85 FPS at 85Hz, etc., there is intolerable stuttering and double images.
That's a really serious issue, and something which was already a bad enough problem without strobing that G-Sync/Adaptive-Sync had to be developed as a solution.
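The double-image effect falls out of simple arithmetic: each unique frame gets flashed more than once, at different points along the eye's tracking path. A sketch (the rates and 960 px/s speed are illustrative assumptions):

```python
# When the framerate is an integer fraction of the strobe rate, each unique
# frame is flashed strobe_hz / fps times; a tracking eye sees that many
# offset copies of every edge.
def flashes_per_frame(strobe_hz, fps):
    return strobe_hz / fps

def copy_separation_px(speed_px_per_s, strobe_hz):
    # The copies sit one inter-flash eye movement apart.
    return speed_px_per_s / strobe_hz

print(flashes_per_frame(120, 60))    # -> 2.0: a clean double image
print(copy_separation_px(960, 120))  # -> 8.0 px between the copies
```

At non-integer ratios the flash count varies frame to frame, which is where the stutter on top of the ghosting comes from.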

And of course there is the issue of flicker which can range from generally uncomfortable to causing migraines.
It just isn't an option for some people, especially after being used to completely flicker-free displays.


I do agree that it can make a huge difference to motion clarity and I would like to see more widespread use, but I can also understand why it may not.
Hopefully we will see a G-Sync revision/update which combines variable refresh technology with strobing, but that would require even more brightness so that they can compensate for the variable brightness this would introduce. (brightness would increase or decrease with FPS if it is not compensated for)
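The variable-brightness problem described above is the duty cycle again: with a fixed pulse width, brightness is proportional to the strobe rate, so a VRR+strobe hybrid would need backlight headroom to compensate. A sketch with hypothetical numbers:

```python
# With a fixed strobe pulse width, duty cycle (and thus brightness) is
# proportional to the refresh rate, so the backlight drive must scale
# inversely with the strobe rate to hold perceived brightness constant.
def compensation_gain(reference_hz, current_hz):
    return reference_hz / current_hz

print(compensation_gain(120, 80))  # -> 1.5: 50% more drive needed at 80 FPS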

And another issue is price. I would really like to pick up a G-Sync/ULMB monitor, but the price is about twice what I actually think a panel of that size is worth, especially if it's using a TN panel.

User avatar
sharknice
Posts: 295
Joined: 23 Dec 2013, 17:16
Location: Minnesota
Contact:

Re: Why aren't gamers worshipping at the altar of strobing?

Post by sharknice » 23 May 2015, 14:38

lexlazootin wrote:Have you tried "high speed" gaming on 144Hz G-Sync?

I think strobing is an amazing technology, but I would argue that because of frame tearing and the image crosstalk that appears at the top or bottom of the screen, you are getting a very clear picture, but it's a very clear picture of rubbish. I also don't like the mouse ghosting effect you get when your eyes move.

G-Sync, on the other hand, does an amazing job of altering the overdrive to get a much clearer image compared to most monitors; nowhere near as good as strobing, but pretty freaking good. I think in total you get much more consistently good 'information' compared to strobing, but I do wish I could mix the two technologies. Maybe in the future. :)
I feel exactly the same. Strobing is great, but GSYNC is an even bigger deal.

Edmond

Re: Why aren't gamers worshipping at the altar of strobing?

Post by Edmond » 23 May 2015, 14:59

I'd rather worship some 144Hz, flicker-free, IPS, G-Sync screens... which I do.

Falkentyne
Posts: 2793
Joined: 26 Mar 2014, 07:23

Re: Why aren't gamers worshipping at the altar of strobing?

Post by Falkentyne » 23 May 2015, 15:39

SeeNoWeevil wrote:Strobing is the best thing to happen to gaming displays since LCD screens became mainstream, imo. And yet, all gamers seem to want to do is push for higher and higher static resolutions at 60Hz. The concept of motion resolution just doesn't appear to exist. If you explain to someone how much of that detail is lost as soon as anything begins to move (i.e. always), you'll be met with a blank stare. A 1440p monitor is a 1440p monitor. A 1440p monitor is more detailed than a 1080p monitor. One number is bigger than the other. G-Sync appears to have made this even worse: not only does it preclude the use of strobing (no one chooses strobing over variable sync), but it makes even lower refresh rates 'smooth', further increasing motion blur. People with powerful machines and G-Sync displays are dropping to 30fps to maintain 2160p. All that GPU power burnt to throw away so many pixels in display blur :cry:

I guess you really need to see it first hand to appreciate it. I can't imagine anyone could watch the UFO test while turning strobing off and on and not be shocked at the results. Seeing 720p strobed resolve far more detail than 1080p non-strobed would have me marching my 60Hz LCD out to the tip.
I agree with you.
That's why my BenQ XL2720Z is the best monitor I've purchased since I had a CRT that didn't have arcing or calibration issues (and I thought my VG248QE was good with the AMD Strobelight LightBoost workaround; yes, it was good, and the ghosting on that is still second to none, but still...).

People have simply forgotten just how good CRTs were when you had a capped 125fps/125Hz (Quake 3 etc.), 120fps/120Hz, 85fps/85Hz, or 100fps/100Hz and kept the framerate steady at all times.

BenQ hit a slam dunk with their first Z series (the XL2730Z is very buggy; until it gets fixed it's a huge step back) and people don't even realize it.

Yes, G-Sync and adaptive sync (FreeSync) are HUGE, but that's for the traditional sample-and-hold LCD crowd. And we're STILL far behind where we USED to be. Besides weight, reliability issues and calibration/degradation, LCDs were a HUGE step back from CRTs, and everyone knew it back then. If they had designed a scanning backlight before throwing the first non-laptop LCDs on the market, we wouldn't be in such a bronze age right now.

While G-Sync and FreeSync do away with the biggest issues we ever had with Vsync gaming (framerates locked to refresh rate ÷ X, stuttering with triple buffering when FPS drops below the refresh rate), they still haven't put the two technologies together: strobed backlight and adaptive sync.

I can't fairly say one is more important than the other. One takes us back to where we should have been all along (it shows just how far we stepped back), while the other is a really big change from what we've had for the last two decades. But you are right: it shows just how complacent we have become, when people think that 60Hz gigantic displays and blurry sample-and-hold gaming are the holy grail :(

All that really has to be done is to combine G-Sync and backlight strobing, and DISABLE the strobing when the refresh rate drops under 59.94Hz. That's all you have to do. Yes, 50Hz strobing can still give a smooth image compared to the absurd blurry mess that is 50Hz Vsync, but no one on the planet can handle 50Hz backlight flicker. 60Hz is already pushing the limits of what can be tolerated...
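That rule could be written as a one-line controller; a sketch (the function and constant names are hypothetical, the 59.94Hz figure is from the post):

```python
STROBE_CUTOFF_HZ = 59.94  # below this, flicker becomes intolerable for most people

def backlight_mode(current_refresh_hz):
    # Variable refresh keeps tracking the framerate; strobing rides along
    # only while the rate stays at or above the flicker-comfort cutoff.
    return "strobed" if current_refresh_hz >= STROBE_CUTOFF_HZ else "sample-and-hold"

print(backlight_mode(120.0))  # -> strobed
print(backlight_mode(45.0))   # -> sample-and-hold
```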

User avatar
masterotaku
Posts: 436
Joined: 20 Dec 2013, 04:01

Re: Why aren't gamers worshipping at the altar of strobing?

Post by masterotaku » 23 May 2015, 18:07

Falkentyne wrote:but no one on the planet can handle 50hz backlight flicker.
I can 8-) . I've played through Castlevania: Mirror of Fate and Shelter in 3D at 100Hz, which means 50Hz per eye, and I've played The Evil Within demo at 51Hz. Of course, dark games are the most tolerable, and my tolerance for flicker is high. As with many things, if variable refresh rate and strobing ever get combined, let the user decide at what point strobing is disabled, and whether we prefer variable motion blur or variable brightness.

I'd like to try strobing at 30Hz for many emulated console games :lol: .
CPU: Intel Core i7 7700K @ 4.9GHz
GPU: Gainward Phoenix 1080 GLH
RAM: GSkill Ripjaws Z 3866MHz CL19
Motherboard: Gigabyte Gaming M5 Z270
Monitor: Asus PG278QR

Falkentyne
Posts: 2793
Joined: 26 Mar 2014, 07:23

Re: Why aren't gamers worshipping at the altar of strobing?

Post by Falkentyne » 23 May 2015, 20:19

masterotaku wrote:
Falkentyne wrote:but no one on the planet can handle 50hz backlight flicker.
I can 8-) . I've played through Castlevania: Mirror of Fate and Shelter in 3D at 100Hz, which means 50Hz per eye, and I've played The Evil Within demo at 51Hz. Of course, dark games are the most tolerable, and my tolerance for flicker is high. As with many things, if variable refresh rate and strobing ever get combined, let the user decide at what point strobing is disabled, and whether we prefer variable motion blur or variable brightness.

I'd like to try strobing at 30Hz for many emulated console games :lol: .
~_____~
You, Sir, are a greater man than I am.
I mean, 60Hz I can deal with. I used to run that on CRTs. But 50Hz gets me. Oh man...

User avatar
masterotaku
Posts: 436
Joined: 20 Dec 2013, 04:01

Re: Why aren't gamers worshipping at the altar of strobing?

Post by masterotaku » 24 May 2015, 03:55

Falkentyne wrote: ~_____~
You, Sir, are a greater man than I am.
I mean, 60Hz I can deal with. I used to run that on CRTs. But 50Hz gets me. Oh man...
Once you try 24fps videos with black frame insertion (so it looks like a 24Hz CRT), everything >=50Hz looks tolerable. Or maybe I can stand it because I'm European and we've had 50Hz CRT TVs since we were children.

In games, unless I see a big portion of the screen covered in white, the flickering doesn't bother me.

PS: wow, switching to 60Hz after being at 50Hz for a while makes it look like it's flicker-free :lol: .
CPU: Intel Core i7 7700K @ 4.9GHz
GPU: Gainward Phoenix 1080 GLH
RAM: GSkill Ripjaws Z 3866MHz CL19
Motherboard: Gigabyte Gaming M5 Z270
Monitor: Asus PG278QR

SeeNoWeevil
Posts: 22
Joined: 21 Apr 2014, 16:40

Re: Why aren't gamers worshipping at the altar of strobing?

Post by SeeNoWeevil » 24 May 2015, 06:15

I use a Sony TV with Motionflow Impulse at 60Hz, at a distance of around 7ft, with some LED ambient lighting. Flicker is not a problem at all. 60Hz on a monitor right in front of your nose would probably be a different matter, though. I only really play at night and reduce the backlight from full. It's not really bright enough for playing in a daylit room, though. It's like playing on a big CRT. If Sony's OLED/4K sets abandon Motionflow Impulse, I am going to be a very sad gamer indeed.

I guess I'm just bothered by brute-forcing away the problems of sample-and-hold with more pixels. Can you even 'get back' the detail lost on a sample-and-hold set? Does a 4K screen at 60Hz resolve more detail than a strobed 1080p panel at the sort of motion you see playing a game? We have people bickering over 900p vs 1080p on the next-gen consoles, all while playing on 60Hz LCDs.

Post Reply