How Is OLED Technology Considered "Good"?

High refresh rates on OLED produce excellent strobeless motion blur reduction thanks to fast GtG pixel response. It is easier to tell apart 60Hz vs 120Hz vs 240Hz on OLED than on LCD, and the differences are more visible to mainstream viewers. Includes WOLED and QD-OLED displays.

Post by thatoneguy » 24 Mar 2023, 09:47

jorimt wrote:
01 Mar 2023, 21:24

As for black frame insertion, no, it doesn't really mitigate this particular form of judder on low framerate content.
BFI DOES mitigate the stutter, though at the cost of a double-image effect, and you need an exact multiple of the refresh rate (71.928Hz, 72Hz, 96Hz, etc.). This is just like CRTs; they never had the stutter that 24fps on a sample-and-hold OLED does.
Low framerate judder, on the other hand, is present even on slow LCDs and can only be solved by interpolation.
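As a quick sketch of that exact-multiple requirement (a hypothetical helper in Python, just for illustration), a refresh rate only yields a clean BFI cadence when it divides evenly by the content framerate:

```python
# Hypothetical helper (illustration only): a refresh rate gives a clean,
# judder-free BFI cadence only when it is an exact integer multiple of
# the content framerate (e.g. 71.928Hz, 72Hz, 96Hz for 23.976/24fps film).

def bfi_cadence(refresh_hz: float, content_fps: float, tol: float = 1e-6):
    """Return the per-frame refresh count if refresh_hz is an integer
    multiple of content_fps, else None (uneven cadence -> judder)."""
    ratio = refresh_hz / content_fps
    return int(round(ratio)) if abs(ratio - round(ratio)) < tol else None

for hz in (60.0, 71.928, 72.0, 96.0, 120.0):
    print(f"{hz:7.3f}Hz -> 23.976fps: {bfi_cadence(hz, 23.976)}, "
          f"24fps: {bfi_cadence(hz, 24.0)}")
```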


Post by jorimt » 24 Mar 2023, 10:34

thatoneguy wrote:
24 Mar 2023, 09:47
jorimt wrote:
01 Mar 2023, 21:24

As for black frame insertion, no, it doesn't really mitigate this particular form of judder on low framerate content.
BFI DOES mitigate the stutter, though at the cost of a double-image effect, and you need an exact multiple of the refresh rate (71.928Hz, 72Hz, 96Hz, etc.). This is just like CRTs; they never had the stutter that 24fps on a sample-and-hold OLED does.
I don't consider that to be "mitigation," just another artifact with an ultimately similar effect (but with the addition of phosphor trails on CRT/plasma).

For context, I grew up watching movies on CRTs and owned a plasma with a 96Hz mode years back. For those who even notice this issue (most don't), BFI doesn't really "solve" anything vs. sample-and-hold judder for low framerate (< 30 FPS) content; it kind of sucks on both where certain panning speeds are concerned (these vary by movie; some filmmakers know how to mitigate this with specific panning and shutter speeds better than others).

Low framerate content is low framerate content, unfortunately.
thatoneguy wrote:
24 Mar 2023, 09:47
Low framerate judder, on the other hand, is present even on slow LCDs and can only be solved by interpolation.
It is, but it is partially masked by the less responsive GtG times of slower/older LCDs with lower refresh rates (i.e., older VA or IPS at 60Hz).

Also, I mentioned interpolation as a method of mitigation in the sentence following the one you quoted, though few TV interpolation techniques are without artifacts of their own (or an unnatural look).
(jorimt: /jor-uhm-tee/)
Author: Blur Busters "G-SYNC 101" Series

Displays: ASUS PG27AQN, LG 48CX VR: Beyond, Quest 3, Reverb G2, Index OS: Windows 11 Pro Case: Fractal Design Torrent PSU: Seasonic PRIME TX-1000 MB: ASUS Z790 Hero CPU: Intel i9-13900k w/Noctua NH-U12A GPU: GIGABYTE RTX 4090 GAMING OC RAM: 32GB G.SKILL Trident Z5 DDR5 6400MHz CL32 SSDs: 2TB WD_BLACK SN850 (OS), 4TB WD_BLACK SN850X (Games) Keyboards: Wooting 60HE, Logitech G915 TKL Mice: Razer Viper Mini SE, Razer Viper 8kHz Sound: Creative Sound Blaster Katana V2 (speakers/amp/DAC), AFUL Performer 8 (IEMs)


Post by thatoneguy » 24 Mar 2023, 22:33

It's definitely a less stuttery experience on a CRT compared to a sample-and-hold OLED.
The double-image effect masks it a little bit. Also, if the CRT has a longer-persistence phosphor, as a lot of consumer TVs did, that can help a bit too. Of course, smaller screens can help as well.

Technically, if you run 24fps@24Hz on a CRT (via a software BFI method) you don't have these issues, but it's way too flickery at 24Hz.

Also, there might be one way other than interpolation to solve the 24fps judder issue.
It would involve pre-emptively doubling/tripling every frame in the source video file beforehand (so a 24fps video with each frame copied 3 times would become a 72fps video, for example, and so on), but that would require a bloated file size and might not work well with compression.
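If someone wanted to try that experiment, a rough sketch (filenames are placeholders; assumes ffmpeg is installed) could lean on ffmpeg's fps filter, which duplicates frames to reach the target rate. For what it's worth, inter-frame codecs tend to encode exact duplicate frames very cheaply, so the file-size penalty may be smaller than feared:

```python
# Rough sketch of the frame-duplication experiment described above:
# re-time a 24fps file to 72fps by letting ffmpeg's fps filter duplicate
# every frame 3x. Filenames are placeholders; assumes ffmpeg is on PATH.
import subprocess

def duplicate_frames(src: str, dst: str, base_fps: int = 24, factor: int = 3):
    target = base_fps * factor          # 24fps x3 = 72fps, each frame repeated
    subprocess.run([
        "ffmpeg", "-i", src,
        "-vf", f"fps={target}",         # fps filter duplicates frames upward
        "-c:a", "copy",                 # pass audio through untouched
        dst,
    ], check=True)

duplicate_frames("movie_24fps.mkv", "movie_72fps.mkv")
```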

That said, I do hope classic interpolation improves someday and we get interpolation without fake frames and artifacts. It would be great for old games stuck at 30 or 60fps too, if it were lagless.
But all I hear about nowadays is AI interpolation, which requires access to motion vectors and all that, making it incompatible with old content.


Post by jorimt » 24 Mar 2023, 23:02

thatoneguy wrote:
24 Mar 2023, 22:33
It's definitely a less stuttery experience on a CRT compared to a sample-and-hold OLED.
The double-image effect masks it a little bit. Also, if the CRT has a longer-persistence phosphor, as a lot of consumer TVs did, that can help a bit too. Of course, smaller screens can help as well.
It's certainly different from sample-and-hold judder, but the stroboscopic effect remains due to the sheer lack of frames per second.
thatoneguy wrote:
24 Mar 2023, 22:33
Also, there might be one way other than interpolation to solve the 24fps judder issue.
It would involve pre-emptively doubling/tripling every frame in the source video file beforehand (so a 24fps video with each frame copied 3 times would become a 72fps video, for example, and so on), but that would require a bloated file size and might not work well with compression.
LG OLED "Real Cinema" mode already does something similar for 24 FPS content by repeating each frame 5 times for 5:5 "pulldown" at 120Hz, but it just prevents stutter (uneven frame delivery), not sample-and-hold judder.

There are options to apply varying amounts of interpolation along with the 5:5 pulldown on LG OLED TVs to reduce the "soap opera" appearance while also mitigating sample-and-hold judder, but I personally prefer 5:5 pulldown by itself when watching 24 FPS content.
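For reference, here's what 5:5 pulldown amounts to, as a small sketch (the scheduling function is illustrative, not LG's actual firmware): each 24fps frame occupies five consecutive 120Hz refresh slots, so delivery is perfectly even, but each frame still persists ~41.7ms:

```python
# Illustration (not LG firmware): 5:5 pulldown maps each 24fps source
# frame to five consecutive 120Hz refresh slots. Delivery is perfectly
# even (no 3:2-style stutter), but each frame still persists for
# 5 x 8.33ms = ~41.7ms -- the sample-and-hold judder stays.

REFRESH_HZ, SOURCE_FPS = 120, 24
REPEAT = REFRESH_HZ // SOURCE_FPS       # 5 refreshes per source frame

def pulldown_schedule(num_frames: int) -> dict[int, list[int]]:
    """Map source frame index -> the refresh slots it occupies."""
    return {f: list(range(f * REPEAT, (f + 1) * REPEAT))
            for f in range(num_frames)}

print(pulldown_schedule(3))
# {0: [0, 1, 2, 3, 4], 1: [5, 6, 7, 8, 9], 2: [10, 11, 12, 13, 14]}
```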
thatoneguy wrote:
24 Mar 2023, 22:33
But all I hear about nowadays is AI interpolation, which requires access to motion vectors and all that, making it incompatible with old content.
Interpolation improvements for very low framerate content are less likely to progress significantly on the TV side in the near term, and more likely to arrive first through techniques like DLSS 3, but yeah, those aren't directly applicable to video (legacy or otherwise) in all cases.


Post by thatoneguy » 24 Mar 2023, 23:18

jorimt wrote:
24 Mar 2023, 23:02
LG OLED "Real Cinema" mode already does something similar for 24 FPS content by repeating each frame 5 times for 5:5 "pulldown" at 120Hz, but it just prevents stutter (uneven frame delivery), not sample-and-hold judder.
No, what I was talking about is different.
Look at most video games, which, for example, run at 60fps+ while the animations themselves run at 30fps or less. Yet you don't see a double-image effect. You don't see it in NES Super Mario Bros., you don't see it in Sonic, etc., despite them featuring many animations that run as low as 3fps.
That's because it's all being actively run in real time by the game engine.

Pulldown takes time, so it introduces its own judder. Pre-emptively duplicating the frames in the source file itself would most likely mimic the real-time video game effect I'm talking about.
Though, again, this probably would only work with raw uncompressed video, but I'd like to see an experiment to see whether it delivers the desired result.
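To make the game comparison concrete, here's a toy loop (hypothetical numbers, no real engine): the sprite's animation frame updates at a low rate, but its position updates every rendered frame, which is why eye-tracked motion shows no double image:

```python
# Toy game loop (hypothetical numbers, no real engine): the sprite's
# ANIMATION frame advances at only 3fps, but its POSITION advances every
# rendered 60fps tick -- so tracked motion stays smooth with no double
# image, unlike a video frame repeated verbatim on screen.

RENDER_FPS, ANIM_FPS, SPEED_PX_PER_SEC = 60, 3, 120

for tick in range(10):                      # ten rendered frames
    t = tick / RENDER_FPS
    anim_frame = int(t * ANIM_FPS)          # changes only every 20 ticks
    x = SPEED_PX_PER_SEC * t                # changes EVERY tick
    print(f"t={t:.3f}s  anim_frame={anim_frame}  x={x:4.1f}px")
```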


Post by jorimt » 25 Mar 2023, 09:53

thatoneguy wrote:
24 Mar 2023, 23:18
No, what I was talking about is different.
Look at most video games, which, for example, run at 60fps+ while the animations themselves run at 30fps or less. Yet you don't see a double-image effect. You don't see it in NES Super Mario Bros., you don't see it in Sonic, etc., despite them featuring many animations that run as low as 3fps.
That's because it's all being actively run in real time by the game engine.
That's engine-level tweening to save on animation complexity for higher framerate games. Apples and oranges where low framerate video content is concerned.
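A bare-bones example of what such tweening can look like (simplified and hypothetical, not any particular engine): the engine interpolates the transform between sparse keyframes on every rendered frame:

```python
# Simplified, hypothetical tweening (not any particular engine): the
# engine linearly interpolates an object's position between sparse 3fps
# keyframes on every 60fps rendered frame, so on-screen motion is smooth
# even though the authored animation data is low-rate.

def lerp(a: float, b: float, t: float) -> float:
    return a + (b - a) * t

KEYFRAME_FPS, RENDER_FPS = 3, 60
keyframes = [0.0, 40.0, 80.0]               # x positions at keyframe times

for tick in range(0, 40, 8):                # sample a few render ticks
    t_sec = tick / RENDER_FPS
    k = min(int(t_sec * KEYFRAME_FPS), len(keyframes) - 2)
    frac = t_sec * KEYFRAME_FPS - k         # progress toward next keyframe
    x = lerp(keyframes[k], keyframes[k + 1], frac)
    print(f"tick {tick:2d}: x = {x:.1f}")
```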
thatoneguy wrote:
24 Mar 2023, 23:18
Pulldown takes time, so it introduces its own judder. Pre-emptively duplicating the frames in the source file itself would most likely mimic the real-time video game effect I'm talking about.
Though, again, this probably would only work with raw uncompressed video, but I'd like to see an experiment to see whether it delivers the desired result.
I don't see a practical difference between that and 5:5 pulldown for 24 FPS content at 120Hz on a sample-and-hold display; what's causing judder is that the same frame information is displayed too long before the next (at 24 FPS, ~41.7ms per frame), breaking the illusion of continuous movement at certain panning speeds.

So unless the video file method you're proposing adds some sort of new information between the original frames, so that there is essentially new frame information every 8.3ms (tweening, etc.), there would be no effective difference or improvement over 5:5 pulldown.
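A trivial sanity check of that point (toy frames, sample-and-hold assumed): a 72fps file built from tripled 24fps frames and the same 24fps file shown with 3:3 pulldown put the identical image on screen at every refresh:

```python
# Toy sanity check: on a sample-and-hold display, a 72fps file made of
# tripled 24fps frames and the same 24fps file shown with 3:3 pulldown
# produce the identical per-refresh image sequence -- photons-to-eyeballs,
# there is no difference, so no judder improvement either.

frames_24 = ["A", "B", "C"]                               # unique source frames
tripled_file = [f for f in frames_24 for _ in range(3)]   # duplicated in file
pulldown_3_3 = [f for f in frames_24 for _ in range(3)]   # repeated by the TV
assert tripled_file == pulldown_3_3
print(tripled_file)  # ['A','A','A','B','B','B','C','C','C'] either way
```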


Post by Chief Blur Buster » 25 Mar 2023, 16:21

As a matter of fact, on OLED there's no difference between 24fps 1:1 pulldown, 2:2 pulldown, 3:3 pulldown, 4:4 pulldown, and 5:5 pulldown from a photons-to-eyeballs perspective. Since the pixel is shown unchanged for 1/24sec regardless of pulldown, the pulldowns are no-operations: an image being replaced by an unchanged image, with no visible effect.

Now, if you add BFI to the pulldowns, it's no longer sample-and-hold, and you can have visible differences -- 2:2 will have a double image, 3:3 a triple image, etc.
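A small sketch of that distinction (simplified model; each repeat becomes one strobe flash): without BFI, every n:n pulldown of a frame is one unbroken hold, while with BFI each repeat is a separate flash of the same frame:

```python
# Simplified model of the point above: with pure sample-and-hold, an n:n
# pulldown of one 24fps frame is a single unbroken hold (a no-op), but
# with BFI each repeat becomes a separate strobe flash of the SAME frame
# at a displaced eye-tracking position -> double/triple images.

def emission(pulldown: int, bfi: bool) -> list[str]:
    """Per-refresh emission for ONE source frame."""
    if not bfi:
        return ["show"] * pulldown           # one continuous hold
    return ["show", "black"] * pulldown      # each repeat = one flash

print(emission(2, bfi=False))  # ['show', 'show'] -> single hold, no-op
print(emission(2, bfi=True))   # 2 flashes -> double image
print(emission(3, bfi=True))   # 3 flashes -> triple image
```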

Hate to restate the obvious, but I'll repost it here:

EXPLAINER: Why Does OLED Stutter More At Low Frame Rates? (Fast GtG Pixel Response)

Now, understanding that, here are links to a software-emulated double-image effect slowed down to 15Hz. This is the same double-impulsing technique as a hardware double-strobe, except the double-impulsing is done via software-based BFI. And 15Hz is a low enough frequency to produce more obvious flicker/stutter effects.

Some people perceive the double-image effect as a form of stutter (or a similar feel, such as judder/vibration/etc.), because to some eyes it vibrates between the two positions -- whether it reads as a pure doubling or a vibrating doubling is simply a function of the specific human's flicker fusion threshold. So, to catch more humans, I intentionally lower the Hz to make the double-image effect stutter more, as a demonstration that it can happen. It's much easier to see at 15Hz or less, for example:

https://www.testufo.com/blackframes#eas ... =2&pps=960
(View at 240Hz)

or

https://www.testufo.com/blackframes#eas ... =2&pps=960
(View at 120Hz)

or

https://www.testufo.com/blackframes#eas ... =2&pps=960
(View at 60Hz)

So the double image effect can be human-perceived as stutter to some but not all eyes.

Different humans see differently, with different flicker fusion thresholds that affect the stutter-to-blur continuum -- see www.testufo.com/eyetracking#speed=-1 (stare at the 2nd UFO for 20 seconds), where fast stutter blends/vibrates so quickly that it is just blur (like a fast-vibrating music string).

On OLED, the stutterfeel of the double-image effect is stronger than on LCD, so you need a higher framerate to remove the stutterfeel from the double-image effect -- but the threshold varies from human to human. For the same human, the threshold is higher on OLED than on LCD because of OLED's faster GtG; the extra blur of slow LCD GtG softens the transitions between frames (and thus softens the stutters/judders).

So yes, it can blur, or just blend into a non-stuttering double-image effect...
The stutter-framerate threshold varies from human to human...
And yes, some people prefer the double-image effect over stutter...
But this does not help people who see the double-image effect as stutter too.

So, it is best not to visionsplain other people, given "semantics-shemantics".

Cheers,
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter



Post by jorimt » 25 Mar 2023, 16:54

Chief Blur Buster wrote:
25 Mar 2023, 16:21
As a matter of fact, on OLED there's no difference between 24fps 1:1 pulldown, 2:2 pulldown, 3:3 pulldown, 4:4 pulldown, and 5:5 pulldown from a photons-to-eyeballs perspective. The pulldowns are no-operations: an image being replaced by an unchanged image, with no visible effect.
Chief Blur Buster wrote:
25 Mar 2023, 16:21
the double image effect can be human-perceived as stutter to some but not all eyes.
Yup and yup.


Post by NeonPizza » 29 Mar 2023, 20:11

How long exactly will it take for future QD-OLED or Micro-OLED (whichever tech comes next) TVs to achieve 1ms persistence and 1080p motion clarity à la BFI when, let's say, streaming movies/TV or gaming at 60fps?


Post by jorimt » 30 Mar 2023, 09:15

NeonPizza wrote:
29 Mar 2023, 20:11
How long exactly will it take for future QD-OLED or Micro-OLED (whichever tech comes next) TVs to achieve 1ms persistence and 1080p motion clarity à la BFI when, let's say, streaming movies/TV or gaming at 60fps?
Longer than you'd like, I assume.

As far as I'm aware (though there may be other factors), for any significant MPRT reduction with BFI on OLED over current levels (3-4ms on the CX at "High," for instance), sustained full-field brightness would have to increase multiple times over on WOLED (currently ~120-200 nits) and QD-OLED (currently ~200-250 nits), which is the most difficult thing to achieve with said panel technology due to things like heat, premature wear, and energy cost.

And even if they could push current-gen panels further, it would probably require much larger heatsinks and fans, which I've seen some buyers already complaining about in models that feature them (for both OLED and LCD).
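As a rough back-of-envelope on that brightness point (assuming MPRT ≈ duty cycle × refresh period, with the nit figures above), hitting 1ms persistence at 120Hz means lighting the pixel only ~12% of the time, so the panel would need roughly 8x the instantaneous brightness just to hold today's perceived level:

```python
# Back-of-envelope for the brightness problem above, assuming
# MPRT ~= duty cycle x refresh period and perceived brightness scaling
# with duty cycle. The nit figure is the ~200-nit WOLED full-field
# number mentioned above.

def bfi_tradeoff(refresh_hz: float, target_mprt_ms: float, panel_nits: float):
    period_ms = 1000 / refresh_hz
    duty = target_mprt_ms / period_ms       # fraction of time pixel is lit
    perceived = panel_nits * duty           # time-averaged brightness
    return duty, perceived, 1 / duty        # boost needed to compensate

duty, perceived, boost = bfi_tradeoff(120, 1.0, 200)
print(f"duty {duty:.0%}, perceived {perceived:.0f} nits, "
      f"needs {boost:.1f}x instantaneous brightness")
# duty 12%, perceived 24 nits, needs 8.3x brightness to stay at 200 nits
```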
