BlurZapper wrote:I am a frequent visitor of this site and understand the importance of eliminating monitor motion blur so that we can finally attain CRT-like clarity. But I just had an epiphany: why has there been no push for this in other industries, such as sports?
Take GT or Le Mans racing, for example. Fast cars, and the recording cameras sometimes sit still while the car pans across the frame. A quick analysis of the video shows that each frame, on pause, actually has motion blur. You can see an example by going to YouTube and watching any race; the Nismo TV channel uploads a few races, for example. Back to the point: a paused frame shows that the recording itself has motion blur baked in, bleeding the previous frame into the current one. Why don't their cameras address this issue? If they had motion-clarity high-speed cameras, there wouldn't be blur.
If you jump to televised sports like American football, you can easily see motion blur in the recordings when you pause the gameplay. Maybe I haven't checked recently, but I'm sure the issue is still there.
Here's the real issue at hand, though. If their recording/encoding has motion blur, and you then watch the video on a blur-prone TV, that's effectively double the persistence of blur. We can clearly see the market pushing 4K TVs, yet broadcasters don't even use 4K, presumably because the bandwidth costs would be enormous. Sure, higher resolution might improve still-image clarity, but motion blur becomes even more apparent at higher resolutions. Maybe the sports industries (or, with your leverage/suggestions, Blur Busters) should actually push for 120Hz+ TV channels / car-racing streams to showcase true motion clarity. It would make watching sports much more enjoyable. Your thoughts, guys? I would imagine the first to test the waters would be a trailblazer (or, in our perception, old grandpas slow to pick up new improvements) in this arena.
First, welcome to Blur Busters!
You bring up excellent points.
Both the source (camera shutter) and the destination (display) must be blur-free in order to produce blur-free video. If either one is a weak link, you will have motion blur (either built into the video, or from eye-tracking persistence on the display).
BlurZapper wrote:Here's the real issue at hand, though. If their recording/encoding has motion blur, and you then watch the video on a blur-prone TV, that's effectively double the persistence of blur.
Yup, it's additive.
16.7ms of blurring in the source video frames (1/60sec shutter) plus 16.7ms of blurring on the display (60Hz sample-and-hold) translates to roughly 33ms of motion blurring (33 pixels of motion blur at 1000 pixels/second motion).
The Blur Busters math formula actually ends up the same on both the camera-shutter side and the display side:
1ms of persistence translates to 1 pixel of motion blur for every 1000 pixels/second motion
And if your frame rate is low, that increases persistence even further. Even if video frames are captured at 1/60sec, playing 30fps video on a 60Hz LCD doubles the display persistence. So you're getting (16.7ms shutter + 33.3ms display persistence) = 50ms of motion blurring! Half-frame-rate video is bad for motion blur, especially on sample-and-hold displays such as LCDs running in their default modes.
From what we now call the "Blur Busters Law" above, 50ms of motion blur =
50 pixels of motion blur at 1000 pixels/sec panning
100 pixels of motion blur at 2000 pixels/sec panning
150 pixels of motion blur at 3000 pixels/sec panning
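As a quick sanity check, the Blur Busters Law arithmetic above can be sketched in a few lines of Python (the function name is mine, purely illustrative):

```python
def motion_blur_px(persistence_ms: float, speed_px_per_sec: float) -> float:
    """Blur Busters Law: 1ms of persistence translates to 1 pixel of
    motion blur for every 1000 pixels/second of motion."""
    return persistence_ms * speed_px_per_sec / 1000.0

# Shutter blur and display persistence are additive:
# 1/60sec shutter (~16.7ms) + 30fps on a 60Hz sample-and-hold LCD (~33.3ms)
total_persistence_ms = 50.0

for speed in (1000, 2000, 3000):
    print(f"{speed} px/sec -> {motion_blur_px(total_persistence_ms, speed):.0f} px of blur")
# 1000 px/sec -> 50 px, 2000 px/sec -> 100 px, 3000 px/sec -> 150 px
```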
You gotta fix both ends (short shutter at the source, short persistence at the destination) and use full frame rate, in order to have zero motion blur during fast motion.
Also, increasing resolution makes decreasing persistence even more difficult.
As you go to 4K and 8K, ever-faster shutter speeds are needed if you want fast motion at full 4K/8K clarity, and shorter strobe lengths at the display end, too. At 8K, a one-screen-width-per-second football pan is approximately 8000 pixels/sec. Even with a sports-fast shutter speed (1/1000sec), that's still 8 pixels of motion blurring during that 8000 pixels/second, one-screen-width-per-second pan on a future 8K display. Even if we strobe the display with 1ms flashes, that adds another 8 pixels of motion blurring at the same panning speed. So 1/1000sec camera shutter per frame + 1ms persistence display strobing = 16 pixels of motion blurring during a one-screen-width-per-second pan. 16 pixel widths at 8K is the angular equivalent of four pixels on a 1920x1080 display (1/4th the horizontal resolution on a same-size display). Obviously, this amount of motion blur, while much smaller than today's, is still human-noticeable at typical TV viewing distances, even with a fast shutter!
Thusly, we'll someday need to migrate to very fast 1/5000sec or 1/10,000sec shutter speeds (0.1ms-0.2ms shutter) and ultra-short strobe flashes during motion blur reduction on the display (e.g. 0.1-0.2ms persistence), in order to get full retina-quality motion clarity during super-fast sports pans. Ouch, eh?
No 8K consumer CRT has ever existed at full resolvable resolution, and no 8K cameras existed during the CRT era, so such ultrashort shutters and ultrashort persistences were never before needed to preserve resolution during motion. With retina resolutions, the "motion blur calculus" (pun...) is indeed very challenging. Likely, we'll have to live with at least a little bit of motion blur during video. (That's a problem video games and computer-generated animations can avoid -- no source-based motion blurring! Then it just boils down to eliminating display-based motion blurring.)
That said, assuming more realistic 1ms camera shutters and 1ms display persistence, there will still be very slight blurring during super-fast pans -- but moderately brisk 8K pans would then only blur-degrade to 1080p quality, rather than blur-degrading nearly all the way to VHS quality like many of today's sports broadcasts on a cheap HDTV.
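Plugging the 8K numbers above into the same formula (a sketch using the post's rounded ~8000 pixels/second figure; variable names are mine):

```python
def blur_px(persistence_ms: float, speed_px_per_sec: float) -> float:
    # Blur Busters Law: persistence (ms) * speed / 1000
    return persistence_ms * speed_px_per_sec / 1000.0

pan_speed = 8000  # ~one screen width per second on an 8K display

shutter_blur = blur_px(1.0, pan_speed)  # 1/1000sec camera shutter -> 8 px
strobe_blur = blur_px(1.0, pan_speed)   # 1ms display strobe -> 8 px
total_8k_blur = shutter_blur + strobe_blur  # 16 px of blur at 8K

# 8K has 4x the horizontal pixels of 1080p on a same-size screen,
# so the angular 1080p-equivalent blur is a quarter of that:
equivalent_1080p_blur = total_8k_blur / 4  # 4 px at 1080p
```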
BlurZapper wrote:We can clearly see the market pushing 4K TVs, yet broadcasters don't even use 4K, presumably because the bandwidth costs would be enormous. Sure, higher resolution might improve still-image clarity, but motion blur becomes even more apparent at higher resolutions. Maybe the sports industries (or, with your leverage/suggestions, Blur Busters) should actually push for 120Hz+ TV channels / car-racing streams to showcase true motion clarity. It would make watching sports much more enjoyable. Your thoughts, guys? I would imagine the first to test the waters would be a trailblazer (or, in our perception, old grandpas slow to pick up new improvements) in this arena.
I foresee a bigger marketing push in the coming years, especially once 120Hz becomes more standardized -- first things first:
1. Standardize 120fps HFR as the television broadcast standard.
2. Then add configurable interpolation-free blur-reduction modes (120Hz strobing).
Flicker-based blur reduction at 120Hz is much more comfortable than at 60Hz.
Ultimately, 120Hz should not be humankind's final frontier this century, either. On sample-and-hold displays, 120Hz only reduces motion blur by 50% compared to 60Hz. It's the equivalent of a 1/60sec photograph versus a 1/120sec photograph -- that 1/1000sec sports photograph is still better. To achieve "1ms" CRT-like persistence on a fully flicker-free sample-and-hold display, every 1ms timeslot must be a unique frame. That requires 1000fps at 1000Hz in realtime (we recently tested a 480Hz LCD, and we currently predict true-1000Hz consumer gaming displays will be possible by roughly 2025).
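The refresh-rate side of that argument is simple division: on a flicker-free sample-and-hold display showing unique frames, each frame is held for a full refresh cycle, so persistence equals the refresh period (a minimal sketch; function name is mine):

```python
def sample_and_hold_persistence_ms(refresh_hz: float) -> float:
    """Persistence of a flicker-free sample-and-hold display that
    shows a unique frame every refresh cycle."""
    return 1000.0 / refresh_hz

# 60Hz   -> ~16.7ms persistence
# 120Hz  -> ~8.3ms (only a 50% blur reduction versus 60Hz)
# 1000Hz -> 1.0ms (CRT-like persistence, no flicker needed)
for hz in (60, 120, 1000):
    print(f"{hz}Hz -> {sample_and_hold_persistence_ms(hz):.1f}ms")
```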
However, 120Hz is a great compromise because it makes 120Hz MBR modes (strobe backlights) easy without resorting to motion interpolation -- 120Hz flicker is not objectionable to most people. At minimum, 120Hz produces less blur regardless, and users can then optionally choose to reduce display motion blur even further via strobed/scanning modes, much like the strobed/scanning modes already available in higher-end HDTVs and gaming monitors. Those often interpolate first (e.g. 60fps -> 120fps) and then add strobing, to simulate an artificial motion-clarity equivalence of higher refresh rates.
But real life doesn't flicker, and the only way to get CRT clarity without flicker/strobing/impulsing/phosphor decay/etc. -- completely blur-free sample-and-hold -- is to fill all the blackness with unique short-persistence frames, a millisecond each (or less). Unfortunately, that technique currently requires unobtainium refresh rates and frame rates. Barring the invention of analog-motion framerateless cameras and framerateless video displays, this century's progress will probably march slowly toward higher refresh rates and improved interpolation technologies as the flicker-free methods of blur elimination. For now, we're stuck with bringing back CRT-era flicker as a way of getting CRT clarity, and convincing the market that "flicker is good", at least until we're able to fill the whole black cycle with unique sub-millisecond point-sampled frames. Once 120Hz is a standardized television broadcast rate (I expect this to become popular in the 2020s-2030s), it becomes much easier to bring back (optional) flicker without customer complaints.
So, for realistic 2020s sports ultimateness:
8K 120Hz + fast-shutter video + strobing = probably the ultimate sports TV broadcast watching in the attainable future!