Blur Busters Forums

Who you gonna call? The Blur Busters! For Everything Better Than 60Hz™

Ensuring a flicker free gaming experience

Everything about displays and monitors. 120Hz, 144Hz, 240Hz, 4K, 1440p, input lag, display shopping, monitor purchase decisions, compare, versus, debate, and more. Questions? Just ask!

Re: Ensuring a flicker free gaming experience

Postby Chief Blur Buster » 22 Oct 2017, 17:37

Slow transitions: Yes
Flicker from slow transitions: No

Slow transitions do not reliably correlate with flicker. If anything, the faster the transition, the worse flicker can become (e.g. inversion artifacts, etc). When a pixel is slow, it's harder for that pixel to flicker harshly, which is why faster TN monitors have more inversion flicker than slower IPS/VA monitors. In short: flicker and slow transitions may exist at the same time, but they are almost never the cause of each other.

You are currently asking deeper questions that are mainly asked by people who get eye pain from most displays (ouch to CRT, ouch to plasma, ouch to DLP). But you've already mentioned you're okay with many screens, so I think you're overanalyzing at this stage. Find a PWM-free display with modern ergonomic features (e.g. low blue light, etc) and be done with it. You can even get one from Amazon -- they have good money-back guarantees.

Regardless, I guarantee you -- relatively speaking -- a PWM-free backlit LCD is the most flicker-free screen technology of any kind currently on the market (LCD, OLED, CRT, DLP, plasma, whatever). Of all of those, LCD has the fewest flicker issues, and the issues it does have only affect the ultra-sensitive people who get instant, immediate eye pain from staring at CRT/plasma/DLP/etc. (usually less than 1% of the population). And since you have not indicated a problem looking at plasma, your eye issues may have nothing to do with flicker.

Usually, flicker is bad. But that's not the case for 100% of humans worldwide; some of us get motion blur eyestrain instead. NVIDIA ULMB (Ultra Low Motion Blur) uses intentional, precisely synchronized backlight flashing at 120 Hertz (like a 120Hz CRT) to fix LCD motion blur, as seen in 60Hz vs 120Hz vs ULMB. You can see things like the old LightBoost testimonials from year 2013 of people absolutely loving a 120Hz strobed monitor flashing in their faces (LightBoost intentionally strobe-flashes at 120Hz). Obviously, you're probably wanting to avoid all that anyway, but it goes to show that what bothers person X doesn't always apply to person Y.

Also, you're not one of those unlucky people who get eye pain their entire lifetimes with all displays? The kind who get sudden stabbing eye pain the moment they stare at a CRT/plasma/DLP/etc.? [rhetorical question, because you've already answered that you've been fine with lots of screens]

And I've already given you advice (go for a PWM-free IPS 60Hz LCD) which solves the vast majority (often >99%) of all the flicker-related issues of an LCD display. And now you're still asking these questions to try to eliminate the final 1% -- the faintest, gentlest modulations of an already most-flicker-free-possible display technology category (PWM-free IPS 60Hz LCD)? [rhetorical question]
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter

       To support Blur Busters:
       • Official List of Best Gaming Monitors
       • List of G-SYNC Monitors
       • List of FreeSync Monitors
       • List of Ultrawide Monitors
Chief Blur Buster
Site Admin
 
Posts: 6312
Joined: 05 Dec 2013, 15:44

Re: Ensuring a flicker free gaming experience

Postby link » 22 Oct 2017, 18:17

@chiefblurbuster thanks for the help, and yeah, I'm definitely overthinking it. @realnc already addressed this, but if you don't mind, please share your thoughts on this article's claims: https://www.eetimes.com/document.asp?doc_id=1272249

Can anyone recommend a true 8-bit 60Hz 1920x1080 IPS monitor, if one exists? I doubt IPS exists in those specs -- maybe I should look into higher resolutions and refresh rates, right?
link
 
Posts: 33
Joined: 07 Oct 2017, 23:45

Re: Ensuring a flicker free gaming experience

Postby Chief Blur Buster » 22 Oct 2017, 18:35

link wrote:@chiefblurbuster thanks for the help and yea I'm definitely overthinking it. @realnc already addressed this but if you don't mind please share your thoughts on this articles claims https://www.eetimes.com/document.asp?doc_id=1272249

Year 2005? Enough said.

The art of inversion/FRC/PWM/etc has greatly advanced over the years. Even back then in year 2005, LCDs flickered a lot less than CRTs.

I have no new recommendations for you other than the ones I've already given. Apologies.

Re: Ensuring a flicker free gaming experience

Postby link » 24 Oct 2017, 17:31

Having a hard time finding an IPS/PLS panel that is true 8-bit and either 4K or 1080p. Most are either 2K resolutions or 10-bit, and that won't work well with a PS4 Pro. Does anyone know of a true 8-bit 1080p or 4K PWM-free gaming monitor?

Re: Ensuring a flicker free gaming experience

Postby Chief Blur Buster » 24 Oct 2017, 17:49

Why avoid 10-bit or 12-bit (except for cost)? That's nonsensical. Extra bit depth actually benefits the PS4 Pro due to fewer rounding errors. There will be fewer FRC issues with 8-into-10 than with 8-remapped-to-8 (e.g. a digital gamma-curve adjustment causing color-rounding errors).
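To make the rounding-error point concrete, here's an illustrative Python sketch (not any scaler's actual pipeline): take an 8-bit source value, run it through a gamma adjustment, and quantize the result to an 8-bit versus a 10-bit target.

```python
# Illustrative only: average round-trip error when an 8-bit source value
# goes through a gamma remap and is quantized to an 8-bit vs 10-bit panel.

def roundtrip_error(value_8bit, panel_bits, gamma=2.2):
    corrected = (value_8bit / 255.0) ** (1.0 / gamma)  # gamma-encode
    levels = (1 << panel_bits) - 1
    code = round(corrected * levels)                   # quantize to panel depth
    recovered = (code / levels) ** gamma * 255.0       # decode back
    return abs(recovered - value_8bit)                 # rounding error

err_8bit = sum(roundtrip_error(v, 8) for v in range(256)) / 256
err_10bit = sum(roundtrip_error(v, 10) for v in range(256)) / 256
# The 10-bit target leaves far less rounding error behind,
# so there is less for FRC/dithering to paper over.
assert err_10bit < err_8bit
```

The gamma value and the round-trip model are assumptions for illustration; the point is only that the finer 10-bit grid absorbs remapping errors that an 8-to-8 remap cannot.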

Re: Ensuring a flicker free gaming experience

Postby link » 24 Oct 2017, 19:36

Chief Blur Buster wrote:Why avoid 10bit or 12bit (Except for cost)? That's nonsensical. That actually benefits PS4 Pro due to less rounding errors. There will be less FRC issues with 8-into-10 than 8-remapped-to-8 (e.g. digital gamma curve situation causing color-rounding errors).


Wouldn't 8-bit content (which I understand is most content anyway, aside from HDR) being fed to a true 10-bit display get dithered up? As in, would the panel use temporal dithering to fill in the gaps in the source? And would this technique be the same as 6-bit+FRC panels displaying 8-bit content? Or are 10-bit panels backwards compatible in a sense, and just display 8-bit as-is?

Maybe I should go for an 8-bit+FRC panel instead, since 8-bit content would match up with the panel and I doubt I'd ever view 10-bit content.

I understand you're saying there are rounding errors that the panel/GPU might dither, but I don't think that presents itself as flicker (flashing two different shades of a color to simulate a missing color), does it, however faint it may be? The FRC flicker that people have a problem with is the type from 6-bit+FRC, where the panel literally can't produce the colors it's being fed. I'm thinking that's probably different from a true 8-bit panel being fed 8-bit content with some rounding errors?
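(To be concrete about what I mean by FRC, here's an illustrative sketch -- not any real panel's algorithm -- of how a 6-bit panel fakes an 8-bit shade by alternating the two nearest codes it can actually show:)

```python
# Illustrative sketch of FRC/temporal dithering -- not any real panel's
# algorithm. The panel approximates a shade it can't show by alternating
# the two nearest codes it *can* show across successive frames.

def frc_frame(target_8bit, frame_index):
    """Show an 8-bit target value on a 6-bit panel."""
    lo = target_8bit >> 2              # nearest 6-bit code at or below
    frac = target_8bit & 0b11          # the missing low 2 bits (0..3)
    hi = min(lo + 1, 63)               # clamp at the panel's top code
    # Flash the higher code on `frac` out of every 4 frames:
    return hi if (frame_index % 4) < frac else lo

frames = [frc_frame(130, i) for i in range(4)]   # -> [33, 33, 32, 32]
# Averaged over 4 frames the eye sees 32.5, i.e. 130 on the 8-bit scale,
# but instant-to-instant the pixel toggles between two shades -- that
# toggling is the flicker people complain about.
```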

Also, when it comes specifically to the PS4 and Xbox One, I'm starting to think it doesn't matter what bit depth your monitor is, as from what I'm hearing the Xbox and PS4 pretty much ignore what type of display you have and enable temporal dithering anyway. Check out this quote and let me know if it makes sense to you: "EDID is the "advertised" resolution settings that a monitor outputs to a graphics card upon connection, in theory it lists color depth as an option, 6/8/10 bits, but the problem is graphics drivers are being too "conservative" and forcing dithering all the time even when it is not needed (like a native 8 or 10 bit monitor)"

I wonder if things will be different with the newer Xbox One X.

Not sure what this means, but it's pretty interesting. I was able to dig up a quote from Stacey Spears, a well-respected individual in the TV industry -- I'm sure you've heard of him from Spears & Munsil. He worked on the Xbox One and had this to say about the Xbox One's bit depth: "On the Xbox One, 8-bit RGB is delivered to the HDMI transmitter. If you are outputting 36-bit, it is padded with zeros just before output. No actual processing is done at >8-bit. Xbox One's Blu-ray player is a matter of convenience only. A standalone Blu-ray player is better. I worked on the Xbox One, so I know the internals of the entire video pipeline."

Re: Ensuring a flicker free gaming experience

Postby RealNC » 25 Oct 2017, 06:04

link wrote:Wouldn't 8bit content (which I understand is most content anyway aside from HDR) being fed to a true 10bit display get dithered up. As in the panel would use temporal dithering to fill in the gaps in the source. And would this technique be the same as 6bit+frc panels that display 8bit content. Or are 10bit panels backwards compatible in a sense and just display 8bit as is.

10-bit includes all 8-bit colors. There are no gaps.

8-bit color means 2^8 colors per channel, which is 256 colors. 10-bit color is 2^10 colors per channel, which is 1024 colors. You can perfectly represent 256 colors using 1024 colors. The other way around is not possible; you can't represent 1024 colors using 256 colors perfectly, and you'd need dithering to make it less ugly.

The only negative thing that happens when feeding a 10-bit display an 8-bit signal is that out of the 1024 possible colors per channel, only 256 are used. The other 768 colors are going to remain unused.
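A quick illustrative sketch (Python, not any driver's actual code) of why the mapping is lossless in one direction only:

```python
# Illustrative only: expanding 8-bit codes to 10-bit is exact, while
# squeezing 10-bit codes into 8-bit must collapse values.

def to_10bit(c8):
    # Standard bit-replication: maps 0..255 onto 0..1023 exactly,
    # keeping both endpoints (0 -> 0, 255 -> 1023). No dithering needed.
    return (c8 << 2) | (c8 >> 6)

def to_8bit(c10):
    return c10 >> 2   # lossy: four neighboring 10-bit codes collapse to one

# All 256 8-bit values land on distinct 10-bit codes -- no gaps to fill:
assert len({to_10bit(c) for c in range(256)}) == 256
assert to_10bit(0) == 0 and to_10bit(255) == 1023

# Going the other way loses information, e.g. codes 512..515 all become 128:
assert {to_8bit(c) for c in range(512, 516)} == {128}
```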
The views and opinions expressed in my posts are my own and do not necessarily reflect the official policy or position of Blur Busters.
RealNC
 
Posts: 2731
Joined: 24 Dec 2013, 18:32

Re: Ensuring a flicker free gaming experience

Postby link » 25 Oct 2017, 14:15

RealNC wrote:10-bit includes all 8-bit colors. There are no gaps.

8-bit color means 2^8 colors per channel, which is 256 colors. 10-bit color is 2^10 colors per channel, which is 1024 colors. You can perfectly represent 256 colors using 1024 colors. The other way around is not possible; you can't represent 1024 colors using 256 colors perfectly, and you'd need dithering to make it less ugly.

The only negative thing that happens when feeding a 10-bit display an 8-bit signal is that out of the 1024 possible colors per channel, only 256 are used. The other 768 colors are going to remain unused.


Ohhhh, OK, so a true 10-bit display would essentially display 8-bit content the same way a true 8-bit display would, right?

Thing is, can't something in the chain say, "well, we have a 10-bit display here, so even though this is 8-bit content, let's do some trick to get it to 10-bit"? For example, PS4 and Xbox One games are likely 8-bit (except for HDR games), but their GPUs might say, "hey, let's dither this up to 10-bit since a 10-bit display is available." But then again, I don't think this sort of dithering would be the same as FRC, right? Because I don't think a GPU ever has control over display FRC.

Or would it not matter, and everything is dependent on content bit depth? Couldn't temporal dithering be completely avoided by simply using, say, HDMI 1.2 on the PS4 and Xbox One, even if they force dithering? Because the HDMI cable doesn't have the ability to send that signal from the GPU?

Re: Ensuring a flicker free gaming experience

Postby link » 26 Oct 2017, 03:57

Would an IPS HDTV be more prone to visible inversion/FRC flicker because of its bigger size? Say, a 50".

Re: Ensuring a flicker free gaming experience

Postby Chief Blur Buster » 26 Oct 2017, 09:21

Yes, 10-bit and 12-bit are supersets of 8-bit. It's simply a much larger color palette: per channel, 10-bit has 4x the number of shades of 8-bit, and 12-bit has 16x.

I've never, ever heard of anybody having vision problems from inversion/FRC in an HDTV. There are many causes of eyestrain, and while I am no doctor, I would bet a whole house mortgage (at this point, with the huge number of clues you've given, including plasma) that your eye problems are unrelated to inversion/FRC. Thus, at this point, I feel the discussion is moot for a person who hasn't had problems with plasma.

For the same resolution (1080p), a 50" HDTV viewed from a sofa ten feet away actually subtends a smaller angle per pixel than a 24" monitor two feet in front of a computer chair, so pixel-level patterns are harder to see, not easier. And if you put a 50" HDTV (even the world's most flicker-free one) on a computer desk only two feet in front of your face, you will get far more eyestrain from the sheer size than from even a flickerier 24" monitor. Fixing one problem creates bigger problems.
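A rough sanity check of that viewing geometry (idealized 16:9 math, illustrative numbers only):

```python
import math

# Illustrative viewing-geometry check: pixels per degree of vision for a
# 1080p screen of a given diagonal at a given viewing distance (inches).
def pixels_per_degree(diag_in, distance_in, h_pixels=1920):
    width = diag_in * 16 / math.hypot(16, 9)                  # 16:9 width
    angle = 2 * math.degrees(math.atan(width / 2 / distance_in))
    return h_pixels / angle

tv  = pixels_per_degree(50, 120)   # 50" HDTV, ten feet away on a sofa
mon = pixels_per_degree(24, 24)    # 24" monitor, two feet away at a desk
# The sofa-distance TV packs far more pixels into each degree of vision,
# so pixel-level patterns like inversion/FRC are proportionally harder
# for the eye to resolve.
assert tv > mon
```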

Yes, I have personally met Stacey Spears; I used to work in the home theater industry, creating line doublers/video processors/3:2 pulldown deinterlacers/etc ( http://www.blurbusters.com/about/mark ). His quote has nothing to do with flicker. Padding with zeros on today's LCDs simply increases quantization issues (banding, blockier gradients, etc), rather than flicker issues.
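To illustrate the zero-padding point (a sketch, not the Xbox One's actual pipeline):

```python
# Illustrative only: dropping 8-bit codes into a 12-bit container by
# zero-padding keeps 8-bit banding -- the codes just get relabeled.

def pad_zeros(c8):
    return c8 << 4          # 8-bit code in the 12-bit MSBs, zeros below

# Adjacent source shades still sit 16 twelve-bit codes apart, so a
# gradient has exactly as many distinct steps as plain 8-bit:
steps = {pad_zeros(c + 1) - pad_zeros(c) for c in range(255)}
assert steps == {16}

# And the signal never reaches the true 12-bit maximum of 4095:
assert pad_zeros(255) == 4080
```

Same 256 shades per channel, wider codes: that's banding, not flicker.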
