New term: "Frame Rate Amplification" (1000fps in cheap GPUs)

Advanced display talk, display hackers, advanced game programmers, scientists, display researchers, display manufacturers, vision researchers & Advanced Display Articles on Blur Busters. The masters on Blur Busters.
John2
Posts: 8
Joined: 14 Jun 2020, 17:56

Re: New term: "Frame Rate Amplification" (1000fps in cheap G

Post by John2 » 14 Jun 2020, 19:22

Chief Blur Buster wrote:
14 Jun 2020, 18:25
John2 wrote:
14 Jun 2020, 18:16
Hello, layman here who needs things explained in layman's terms so I can understand.

"Mathematically:
120fps at 120Hz non-strobed LCD = minimum possible MPRT/persistence is 8.33ms
240fps at 240Hz non-strobed LCD = minimum possible MPRT/persistence is 4.16ms
480fps at 480Hz non-strobed LCD = minimum possible MPRT/persistence is 2.1ms
1000fps at 1000Hz non-strobed LCD = minimum possible MPRT/persistence is 1ms"

Those numbers are based on a 1080p LCD display, correct? And those numbers change if the resolution changes, correct? For instance, the motion blur is even worse on a 4K 30Hz display versus a 1080p 30Hz display, right? The basic theory I'm trying to understand is this: even though two displays can have the same frames per second, the one with the higher resolution will have worse motion blur, is that right?
The millisecond numbers do not change; that's the beauty.

You do get more pixels per inch of motion blur at higher resolutions, but the milliseconds are unchanged.

1920 pixels becomes 3840 pixels over the same physical distance when resolution doubles. But one screenwidth per second is a constant amount of time, even with more pixels. The blur is the same physical distance, spread over increased dpi.
https://blurbusters.com/blur-busters-la ... -and-hold/
From your article.
"Display persistence is more noticeable for bigger FOV (bigger displays or virtual reality) and for higher resolutions (retina resolutions) due to bigger clarity differences between stationary & moving images."

So when you say display persistence here, you basically mean motion blur, right? So in layman's terms, you're saying here that motion blur is more noticeable to the human eye at higher resolutions, correct?

So if I had an 8K 60Hz LCD display sitting right next to a 1080p 60Hz LCD display, and they were both playing a 30FPS video game, the former would have noticeably worse motion blur, right?

User avatar
Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: New term: "Frame Rate Amplification" (1000fps in cheap G

Post by Chief Blur Buster » 15 Jun 2020, 01:10

John2 wrote:
14 Jun 2020, 19:22
Chief Blur Buster wrote:
14 Jun 2020, 18:25
John2 wrote:
14 Jun 2020, 18:16
Hello, layman here who needs things explained in layman's terms so I can understand.

"Mathematically:
120fps at 120Hz non-strobed LCD = minimum possible MPRT/persistence is 8.33ms
240fps at 240Hz non-strobed LCD = minimum possible MPRT/persistence is 4.16ms
480fps at 480Hz non-strobed LCD = minimum possible MPRT/persistence is 2.1ms
1000fps at 1000Hz non-strobed LCD = minimum possible MPRT/persistence is 1ms"

Those numbers are based on a 1080p LCD display, correct? And those numbers change if the resolution changes, correct? For instance, the motion blur is even worse on a 4K 30Hz display versus a 1080p 30Hz display, right? The basic theory I'm trying to understand is this: even though two displays can have the same frames per second, the one with the higher resolution will have worse motion blur, is that right?
The millisecond numbers do not change; that's the beauty.

You do get more pixels per inch of motion blur at higher resolutions, but the milliseconds are unchanged.

1920 pixels becomes 3840 pixels over the same physical distance when resolution doubles. But one screenwidth per second is a constant amount of time, even with more pixels. The blur is the same physical distance, spread over increased dpi.
https://blurbusters.com/blur-busters-la ... -and-hold/
From your article.
"Display persistence is more noticeable for bigger FOV (bigger displays or virtual reality) and for higher resolutions (retina resolutions) due to bigger clarity differences between stationary & moving images."

So when you say display persistence here, you basically mean motion blur, right? So in layman's terms, you're saying here that motion blur is more noticeable to the human eye at higher resolutions, correct?

So if I had an 8K 60Hz LCD display sitting right next to a 1080p 60Hz LCD display, and they were both playing a 30FPS video game, the former would have noticeably worse motion blur, right?
That is correct too.

It is not mutually exclusive. The perceived difference between stationary images and moving images is bigger for a higher resolution display with the SAME MPRT number. The math is the same. You are asking an apples question and an oranges question, and conflating the two without realizing it.

Milliseconds are not pixels without a speed to relate them — you assumed those MPRT millisecond numbers change, which is incorrect. The MPRT number can stay the same while looking blurrier because of the higher resolution.

The bottom line is: "The same MPRT number will look blurrier at the same physical motion speed on a higher resolution display of the same size."

This is because higher resolution means (A) a sharper static image versus (B) the same milliseconds of physical MPRT motion blur spread over a larger number of pixels for the same number of millimeters. The same physical motion speed (inches per second) covers more pixels over the same distance for a higher dots-per-inch (dpi) display.

1ms at one screenwidth/sec on 1920x1080p = about 1.9 pixels of motion blur
1ms at one screenwidth/sec on 3840x2160p = about 3.8 pixels of motion blur

If you can follow that statement, you've got it figured out!
Your earlier question pertained to "do the numbers change" for:
Mathematically:
120fps at 120Hz non-strobed LCD = minimum possible MPRT/persistence is 8.33ms
240fps at 240Hz non-strobed LCD = minimum possible MPRT/persistence is 4.16ms
480fps at 480Hz non-strobed LCD = minimum possible MPRT/persistence is 2.1ms
1000fps at 1000Hz non-strobed LCD = minimum possible MPRT/persistence is 1ms
The answer is no, those millisecond numbers do not change.
The above statement is true for all resolutions, not just 1080p.
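
If it helps to see that arithmetic spelled out, here is a quick sketch in Python (assuming ideal sample-and-hold: instant pixel response, framerate equal to refresh rate, no strobing):

Code: Select all

# Minimum possible MPRT for an ideal non-strobed (sample-and-hold) display.
def min_mprt_ms(refresh_hz):
    # Each frame is held for a full refresh cycle while the eye keeps moving,
    # so the minimum persistence is simply the frame time.
    return 1000.0 / refresh_hz

for hz in (120, 240, 480, 1000):
    print(f"{hz}fps at {hz}Hz -> {min_mprt_ms(hz):.2f}ms MPRT")
# 120Hz -> 8.33ms, 240Hz -> 4.17ms, 480Hz -> 2.08ms, 1000Hz -> 1.00ms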

Think about this mathematically:

(Rounded to nearest decimal digit)
1ms at one screenwidth/sec on 1920x1080p = about 1.9 pixels of display motion blur
1ms at one screenwidth/sec on 3840x2160p = about 3.8 pixels of display motion blur
1ms at one screenwidth/sec on 7680x4320p = about 7.7 pixels of display motion blur

See: the same number of milliseconds creates more pixels of motion blur, because the higher DPI (dots per inch) of a higher resolution display means more pixels of motion blur over the same inch, at the same physical motion speed.

A higher resolution display of the same size means the pixels are smaller. But the milliseconds are unchanged, you see? The DPI went up, yet the millisecond number is unchanged. And since the DPI went up, there are more pixels of motion blur for the same physical motion speed (inches per second).

And it's being compared against a sharper stationary image (because the display is higher resolution at the same size). So perceived motion blur for the same milliseconds is bigger to the human eye on a higher resolution display, when comparing the same physical motion speed (inches per second).
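
Here is the same pixels-of-blur arithmetic as a quick sketch (illustrative numbers only, assuming 1ms MPRT and one screenwidth per second of motion):

Code: Select all

# Pixels of motion blur for the same persistence on panels of different width.
MPRT_MS = 1.0                 # same milliseconds every time
SCREENWIDTHS_PER_SEC = 1.0    # one screenwidth per second of motion

for width_px in (1920, 3840, 7680):
    speed_px_per_sec = width_px * SCREENWIDTHS_PER_SEC
    blur_px = speed_px_per_sec * (MPRT_MS / 1000.0)
    print(f"{width_px}-wide panel: ~{blur_px:.1f} pixels of motion blur")
# 1920 -> ~1.9 px, 3840 -> ~3.8 px, 7680 -> ~7.7 px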

You asked a question specifically about the milliseconds, not about the pixels.

So both of the things I said are true — just be careful not to confuse pixels with milliseconds without the correct variables and the correct math.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


John2
Posts: 8
Joined: 14 Jun 2020, 17:56

Re: New term: "Frame Rate Amplification" (1000fps in cheap GPUs)

Post by John2 » 15 Jun 2020, 13:55

So when you say that we need 1000Hz to get CRT clarity, I had always assumed you were by default talking about 1080p LCD displays, because I had always thought that you would need more than 1000Hz on a higher resolution LCD display to get CRT clarity. Motion blur is not the same between a 1000Hz 1080p LCD and a 1000Hz 8K LCD; the latter will have noticeably worse motion blur to the human eye because of the higher resolution, so wouldn't the latter need a higher refresh rate to achieve the same motion clarity as a CRT?

User avatar
Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: New term: "Frame Rate Amplification" (1000fps in cheap GPUs)

Post by Chief Blur Buster » 15 Jun 2020, 15:08

Assumptions are dangerous in science ;)
John2 wrote:
15 Jun 2020, 13:55
So when you say that we need 1000Hz to get CRT clarity, I had always assumed you were by default talking about 1080p LCD displays, because I had always thought that you would need more than 1000Hz on a higher resolution LCD display to get CRT clarity.
CRT phosphor varies a lot in persistence. Short-persistence phosphor versus medium-persistence versus long-persistence.

CRTs actually had a bit of motion blur in the form of “ghosting”, if motionspeeds were fast enough, high resolution enough, and phosphor slow enough.

Now, imagine the most extreme possible situation: Radar CRTs are intentionally designed to be slow, in order to ghost (long persistence) their previous images, since Radar CRTs essentially operated at low refresh rates (e.g. less than 1 Hertz) to be in sync with a mechanically rotating antenna.

But a television tube doesn't need to preserve a refresh cycle for that long, so the phosphor is short-persistence. However, short-persistence phosphor is often dimmer, so manufacturers sometimes used medium-persistence CRT phosphor to make the picture brighter, at the cost of some extremely slight ghosting. Early color CRT tubes like the 1954 RCA first color TV were really slow (phosphor fade far slower than the pixel response of today's non-strobed LCDs), and improved over the years towards the 2000s.

Some models, such as the GDM-W900, had a medium-persistence phosphor, and a default of 1ms is used as the time for the CRT phosphor decay to fade 90%+. However, there are CRTs whose phosphor decay is much quicker (territory of ~0.1ms).

Because of this inconsistency, 1ms was adopted as the standardized CRT-phosphor gold standard — anything that meets 1ms MPRT can be pretty legitimately called "CRT motion clarity", regardless of resolution, since CRTs can still motion-blur (given sufficient resolution + motion speeds).

Few people sat very close to large CRTs (e.g. 30 inch) running at high resolutions (e.g. 1080p). Most desktop monitors were 15 inch to 21 inch maximum, and TVs were viewed across a living room. So there was very little opportunity to witness the Vicious Cycle Effect as explained halfway down at www.blurbusters.com/1000hz-journey

But work around CRTs long enough and you'll become familiar with the fact that some of them ghosted more than others. And if you looked closely at many old tubes and compared them to many strobed LCDs, it's pretty clear that today the Venn diagram of motion clarity capability already overlaps, thanks to the modern emergence of Motion Blur Reduction strobe modes.

Some of them can now flash a strobe backlight for only ~0.25ms, with shorter strobe pulse length adjustments. Matching that without strobing will require 4000fps at 4000Hz, mind you (1000/0.25 = 4000). So 1000Hz isn't the final frontier (you are correct about that). That's why I mentioned quintuple-digit refresh rates near the end of www.blurbusters.com/1000hz-journey in the wide-FOV, retina-resolution situation (e.g. virtual reality).
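
For reference, a quick sketch of that conversion (assuming framerate equal to refresh rate and ideal pixel response):

Code: Select all

# Frame rate needed (at framerate = Hz) to match a given strobe pulse length.
def equivalent_hz(strobe_pulse_ms):
    return 1000.0 / strobe_pulse_ms

for pulse_ms in (1.0, 0.5, 0.25):
    hz = equivalent_hz(pulse_ms)
    print(f"{pulse_ms}ms strobe pulse ~= {hz:.0f}fps at {hz:.0f}Hz non-strobed")
# 1.0ms -> 1000Hz, 0.5ms -> 2000Hz, 0.25ms -> 4000Hz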

Refresh rate limitations will show equally on CRT and LCD for similar millisecond numbers (give or take, allowing an error margin for differences in curve shapes) — it's just that CRTs never reached retina resolutions.
John2 wrote:
15 Jun 2020, 13:55
Motion blur [in pixels of motion blur for same physical distance] is not the same between a 1000Hz 1080p LCD and a 1000Hz 8K LCD; the latter will have noticeably worse motion blur to the human eye because of the higher resolution, so wouldn't the latter need a higher refresh rate to achieve the same motion clarity as a CRT?
Fixed it for you.

No 8K 60Hz progressive-scan CRT has ever existed, and electron beam spot sizes were never reliably tight enough for that, so it's not possible to make an apples-vs-apples comparison.

The same effect applies to CRTs too — higher resolutions would have made phosphor limitations more visible as well. The phosphor illumination is darn near instantaneous (microseconds, just like a strobe backlight turning on), but the phosphor takes longer to fade, typically hundreds of microseconds on the most common CRTs.

On a strobe-backlight-driven LCD panel (a way to reduce LCD motion blur without the need for insane refresh rates), the backlight pulse is more of a square wave, while a CRT phosphor fade is more like a curved sawtooth (shark's tooth) when seen on a photodiode oscilloscope. So the rise-fall asymmetry shows up to human eyes as asymmetric blur (e.g. leading-edge or trailing-edge artifacts — aka "ghosting").

Remember, CRTs were often low resolution, so they seemed perpetually motion-clear to most people, but in reality, really high resolution material, on a sufficiently large tube viewed from a close distance, at really fast motion speeds, could also blur/ghost on a CRT tube. I've seen it happen.

Mind you, not nearly as badly as a ghostly radar scope — but enough to allow the CRT clarity Venn diagram to easily overlap with strobe-backlight LCDs of similar strobe pulse lengths.

In other words, it will not happen with 256x224 Super Mario material, mind you — that is always perpetually, perfectly clear at the fastest Nintendo panning speeds. There was never a retina-resolution-textures Super Mario Bros. at 7680x4320 on a retina CRT — that never existed.

But if one had, the motion blur/ghosting would be visible, given a sufficiently high-resolution CRT with a sufficiently fine shadow mask or aperture grille, and a sufficiently slow phosphor.

In reality, it only takes a medium-persistence phosphor on a sufficiently large tube (widescreen 24" Sony W900 / FW900 tubes), with current video game motion speeds, to create an extremely slight amount of real-world motion blur on a CRT, such as when testing the TestUFO Panning Map at about 3000 pixels/sec. That map ceases to be as clear as a stationary image, because the CRT phosphor is slow enough to generate motion blur on the tube surface that is slightly worse than a 0.25ms-MPRT strobe backlight (reduced pulse width).
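
To put a rough number on that (illustrative values, not a measurement):

Code: Select all

# Blur trail length = eye-tracking speed multiplied by persistence.
speed_px_per_sec = 3000    # TestUFO Panning Map speed mentioned above
mprt_ms = 0.25             # short strobe pulse length, for comparison
blur_px = speed_px_per_sec * (mprt_ms / 1000.0)
print(blur_px)             # 0.75 pixels of blur; a medium-persistence
                           # CRT phosphor lands slightly above this figure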

Then again, very few people have used sufficiently high-resolution CRT tubes at sufficiently close viewing distances to witness motion blur/ghosting caused by phosphor...

A simpler ghosting test for a CRT is bright objects on black backgrounds — there is a faint green ghost trail. In some slower tubes, that trail is about 10x worse. And in some early tubes with older phosphors, it was far worse than that (1000x+), like those early color CRTs such as the 1954 RCA Model CT-100 color TV, whose phosphors seemed to have more in common with radar scopes than with a standard monochrome TV.

As I’ve said in the first sentence of this post, assumptions are dangerous in science. ;)

Many of Blur Busters' writings on the main website are intentionally simplified for a popular-science audience rather than a research-journal audience — since display advocacy is one of the prime goals of Blur Busters.

Blur Busters — our namesake.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


John2
Posts: 8
Joined: 14 Jun 2020, 17:56

Re: New term: "Frame Rate Amplification" (1000fps in cheap GPUs)

Post by John2 » 15 Jun 2020, 15:58

Listen I'm a layman here so please ELI5 (explain like I'm five years old).

Here's really the gist of what I'm trying to understand. Is this basic theory correct for LCDs: the higher the resolution, the more noticeable the motion blur will be to the human eye? For instance, refresh rates on standard TVs have remained stable at 60Hz, but resolutions are now going up to 8K, and I'm assuming 12K is coming in a matter of years. The motion blur playing a 30FPS game on a 1080p 60Hz TV was already atrocious enough, but if we're all going to be forced to buy 8K 60Hz TVs, won't the motion blur be even worse? I can't answer that question because I don't own an 8K TV.

p.s. Please answer that basic theory question, and remember I'm just a layman here who's concerned about the rising resolutions but the stagnating refresh rates on standard cheap consumer LCDs (it looks like they won't stop 'til they hit 16K! Could you imagine if they push out 16K but keep the refresh rate at 60Hz?)

User avatar
Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: New term: "Frame Rate Amplification" (1000fps in cheap GPUs)

Post by Chief Blur Buster » 15 Jun 2020, 16:57

Keep tuned!

While certainly within the sphere of Blur Busters -- writing an ELI5 article is a huge undertaking that requires diagramming, etc. -- there are a few articles planned this year in this topic area -- one planned article is going to be inspired by the "Vicious Cycle Effect" section of the 1000Hz-Journey article.

Keep an eye on the Blur Busters Area51 section, www.blurbusters.com/category/area51-display-research -- this is where I write my flagship articles about the refresh rate race to retina refresh rates.

NHK Japan has long been an advocate of 8K 120Hz partly for this reason.

On a related topic, check out Ultra HFR as one of the possible long-term solutions.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


John2
Posts: 8
Joined: 14 Jun 2020, 17:56

Re: New term: "Frame Rate Amplification" (1000fps in cheap GPUs)

Post by John2 » 15 Jun 2020, 17:36

Chief Blur Buster wrote:
15 Jun 2020, 16:57
Keep tuned!

While certainly within the sphere of Blur Busters -- writing an ELI5 article is a huge undertaking that requires diagramming, etc. -- there are a few articles planned this year in this topic area -- one planned article is going to be inspired by the "Vicious Cycle Effect" section of the 1000Hz-Journey article.

Keep an eye on the Blur Busters Area51 section, www.blurbusters.com/category/area51-display-research -- this is where I write my flagship articles about the refresh rate race to retina refresh rates.

NHK Japan has long been an advocate of 8K 120Hz partly for this reason.

On a related topic, check out Ultra HFR as one of the possible long-term solutions.
I appreciate your feedback.

But I need a yes or no answer here. An 8K 60Hz LCD will have noticeably worse motion blur than a 1080p 60Hz LCD, and when I say noticeable, I mean someone standing there looking at it would definitely be able to tell the difference, yes or no?

User avatar
Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: New term: "Frame Rate Amplification" (1000fps in cheap GPUs)

Post by Chief Blur Buster » 15 Jun 2020, 21:40

John2 wrote:
15 Jun 2020, 17:36
I appreciate your feedback.

But I need a yes or no answer here. An 8K 60Hz LCD will have noticeably worse motion blur than a 1080p 60Hz LCD, and when I say noticeable, I mean someone standing there looking at it would definitely be able to tell the difference, yes or no?
I must define variables here.
Assuming the same size screen, same frame rate, same refresh rate, full sample-and-hold persistence (non-strobed), same pixel response speed, and the same video material (except for different resolutions):

For 1080p video (on 1080p LCD) versus 1080p video (stretched to fill 8K LCD), no, the motionblur looks the same.
For 1080p video (on 1080p LCD) versus 8K video (on 8K LCD), yes, the motionblur will be easier to notice at 8K.

In other words, you won't see a downgrade for your existing 1080p videos, if that is your concern, and if that is what you are asking.
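
A quick way to see why, with assumed example numbers (a sketch, not a measurement): express the blur trail in pixels of the source video rather than pixels of the panel.

Code: Select all

# Express the blur trail in pixels of the *source video*, since that is the
# level of detail the eye compares the moving image against.
def blur_in_content_pixels(content_width_px, screenwidths_per_sec, mprt_ms):
    return content_width_px * screenwidths_per_sec * (mprt_ms / 1000.0)

# 1080p video panning at one screenwidth/sec on a 60Hz sample-and-hold display
# (MPRT ~16.7ms): the trail is ~32 content pixels whether the panel is native
# 1080p or the video is stretched to fill an 8K panel, because stretching
# scales the trail and the detail by the same factor.
print(blur_in_content_pixels(1920, 1.0, 1000 / 60))   # ~32.0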
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


John2
Posts: 8
Joined: 14 Jun 2020, 17:56

Re: New term: "Frame Rate Amplification" (1000fps in cheap GPUs)

Post by John2 » 17 Jun 2020, 19:38

Chief Blur Buster wrote:
15 Jun 2020, 21:40
John2 wrote:
15 Jun 2020, 17:36
I appreciate your feedback.

But I need a yes or no answer here. An 8K 60Hz LCD will have noticeably worse motion blur than a 1080p 60Hz LCD, and when I say noticeable, I mean someone standing there looking at it would definitely be able to tell the difference, yes or no?
I must define variables here.
Assuming the same size screen, same frame rate, same refresh rate, full sample-and-hold persistence (non-strobed), same pixel response speed, and the same video material (except for different resolutions):

For 1080p video (on 1080p LCD) versus 1080p video (stretched to fill 8K LCD), no, the motionblur looks the same.
For 1080p video (on 1080p LCD) versus 8K video (on 8K LCD), yes, the motionblur will be easier to notice at 8K.

In other words, you won't see a downgrade for your existing 1080p videos, if that is your concern, and if that is what you are asking.
So do the executives at companies such as Sony and Samsung know that when you increase the resolution but keep the refresh rate the same, the motion blur gets worse? I'm very curious: have you talked to any of the top executives at these top companies? Could you share some stories with me?

Do they realize they're essentially pushing out displays with worse and worse motion blur? Because if you increase the resolution, you must also increase the refresh rate to get the same motion blur you had before, right?

thatoneguy
Posts: 181
Joined: 06 Aug 2015, 17:16

Re: New term: "Frame Rate Amplification" (1000fps in cheap GPUs)

Post by thatoneguy » 21 Jun 2020, 04:39

Chief Blur Buster wrote:
15 Jun 2020, 21:40

For 1080p video (on 1080p LCD) versus 1080p video (stretched to fill 8K LCD), no, the motionblur looks the same.
This doesn't make sense to me from a physical perspective. Isn't upscaled 8K (from 1080p) still technically 8K (even if not native)?
I mean, you still have 16 times the pixels.
