Cinematography of 2030s: Ultra HFR (1000fps at 1000Hz!)

Discussion about 120fps HFR as well as future Ultra HFR (240fps, 480fps and 1000fps) playing back in real time on high refresh rate displays. See Ultra HFR HOWTO for bleeding edge experimentation.

Cinematography of 2030s: Ultra HFR (1000fps at 1000Hz!)

Post by Chief Blur Buster » 02 Feb 2018, 14:27

Crossposted from reduser.net:
Cinematography of 2030s: Ultra HFR! I have witnessed realtime 1000fps on real 1000Hz.

Interesting reading on the future of Ultra HFR.
Mark Rejhon wrote:Hello,

I'm the founder of Blur Busters -- and author of the TestUFO Motion Tests. I consider myself an expert in displays and low persistence. I'm the co-author (along with NIST.gov, NOKIA, Keltek) of a peer-reviewed conference paper on a display motion-blur testing technique, the world's first mainstream reviewer to test a true-480Hz display, and have now also seen 1000Hz+ displays in the laboratory. I have consequently written Blur Busters Law: The Amazing Journey To Future 1000 Hz Displays. I am better known by gaming monitor manufacturers and eSports players, but Blur Busters is slowly expanding its news coverage into HFR (120fps, and even true-240fps+) content. I'm also an invited guest moderator of /r/hfr on Reddit. I have had a few contracts with gaming monitor manufacturers to reduce display motion blur.

I'm one of the few people in the world able to do a good "Plain English" job of explaining display persistence (the cause of display motion blur), which is covered in the 1000 Hz article. I'm the inventor of several optical illusions that are intentionally generated by display motion blur: TestUFO Eye Tracking and TestUFO Persistence Of Vision.

Also, the resulting motion blur (seen by the human eye) is the sum of the source persistence (camera shutter length) and the destination display persistence (pixel visibility time).

Meaning, a camera with a 1/300sec shutter speed, displayed on a sample-and-hold display with 1/60sec frame visibility time, creates (1/300sec + 1/60sec) = (6/300sec) = 1/50sec of motion blurring for on-screen moving objects. A film shot with a 1/120sec shutter, displayed on a 24Hz flickerfree DLP (E-Cinema), creates (1/120sec + 1/24sec) = (1/120sec + 5/120sec) = 1/20sec of motion blurring. (The assumption is frame rate matching refresh rate, of course -- there are many variables -- but once you've got a match, the mathematics is simple.)
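
To make that arithmetic concrete, here is a minimal Python sketch of the persistence sum described above (the function name is my own, purely for illustration):

Code: Select all

# Perceived motion blur = source persistence (camera shutter)
#                       + destination persistence (pixel visibility time)
def perceived_blur_ms(shutter_s, persistence_s):
    """Sum of camera shutter time and display pixel visibility time, in ms."""
    return (shutter_s + persistence_s) * 1000.0

print(perceived_blur_ms(1/300, 1/60))  # 20.0 ms = 1/50 sec (first example above)
print(perceived_blur_ms(1/120, 1/24))  # 50.0 ms = 1/20 sec (second example above)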

Display persistence is the destination-side equivalent of the camera shutter -- it is also known as MPRT (Moving Picture Response Time) in scientific papers -- and reducing persistence is usually done via flicker (strobing, BFI, CRT impulsing, etc) but can also be done via a higher frame rate. At the end of the day, display motion blur is linearly proportional to pixel visibility time; it's easily demonstrated in the motion test links above, and already well documented in papers well outside the filming industry. Virtual reality needed to reduce motion blur because motion blur causes nausea (extra blur above and beyond natural human vision limits) in a Holodeck-like environment, so intense research is occurring to eliminate blur (both source-side and destination-side). Today, I am increasingly crossing over between the source-side equation and the destination-side equation. And HFR cinematography sometimes reaches into virtual reality (360 degrees), so my writing is slowly expanding into HFR too.

Intentional motion blurring is often added to frames to prevent stroboscopic and wagonwheel effects. Unfortunately, adding camera blur or artificial blur is also a cause of nausea for some in virtual reality (VR). So you need to eliminate both motion blur and stroboscopic effects. The only way to do both simultaneously (can't add blur, can't have stroboscopic effects) is to keep raising frame rates. That means things like true 1000fps at true 1000Hz. VR scientists now agree that going to at least quadruple-digit frame rates is necessary to pass a Holodeck Turing Test (being unable to tell apart VR from real life). We still get stroboscopic effects even on my true-480Hz display. While resolutions have gone Retina, true genuine frame rates + refresh rates are more than an order of magnitude away from reaching Retina (i.e. achieving high enough frame rates & Hz that it's not necessary to add motion blur to individual frames).

From my true-480Hz tests:
[Image: one of the several artifacts forced by refresh rate granularity]

Early canary tests show promise in ultra-HFR video (e.g. true realtime 1000fps video!) that is successfully simultaneously blurless & stroboscopic-free. This is a distant vision of what is to come (~2050), especially if 1000fps / 1000Hz becomes cheap. Today, in the lab with actual displays, it's much easier to tell apart 240fps@240Hz and 1000fps@1000Hz than it is to tell apart 4K and 8K. I love 8K as much as all of you, but the lack of focus on providing cinematographers these additional options is surprising. It can be increasingly important (e.g. blur-free + strobe-free 360-degree videos for VR) to avoid viewer nausea for a larger percentage of the human population. Currently, VR headsets use 90Hz pulsing, but future VR is expected to go even further in frame rates in the decades to come, reducing the need for strobing/flicker as the blur-reduction technique and migrating to sheer frame rates instead. Research is also being done on frame rate amplification technologies for video games (Oculus can convert 45fps to 90fps laglessly, in a 3D-aware manner, without rerendering the intermediate frames), and tomorrow's tech may convert 100fps to 1000fps with less need for full re-renders between frames -- because anything different from real life (e.g. flicker, blur, stroboscopic effects) causes nausea in virtual reality environments. But this is human-visible on a stationary display too, including experimental displays, so the VR innovations may spin off into future ultra-HFR.

Nobody talks about it yet, but what we see in the lab means 120fps HFR is not the final frontier. Predicting outwards to the 2030s-2040s, we see ultra-HFR becoming a mainstream topic within our lifetimes, given the incredible rapidity of VR innovation (and having actually feasted my eyes on real 480Hz+ and 1000Hz+ laboratory displays myself -- essentially flickerless and strobeless, with uncannily CRT-quality motion and zero stepping artifacts). After resolutions max out to Retina leagues, we believe frame rates and refresh rates will be one of the new races that continues to the end of the 21st century, especially with the mother-of-necessity VR pushes.

Motion blur is lovely when we want it. It is very cinematic. But sometimes, in certain situations, we want to understand how to control and eliminate it. Virtual reality often requires the elimination of motion blur to prevent motion sickness (extra blur forced on the eyes, above and beyond human vision) -- the well-known VR sickness. Now technology is arriving that makes this increasingly easier, for those specific situations where the Director's Intent is minimum-possible motion blur, or at least a fuller understanding of controlling motion blur via understanding the source side and destination side in a deeper way.

I'm about to write a simplified HFR article on how source persistence and destination persistence combine into a final amount of perceived display motion blur. This article is of great interest to 120fps HFR videographers who are trying to figure out how to predict the resulting motion blur that viewers will see, in order to best present their HFR material (especially when they are allowed to choose a specific display for their HFR presentation). I'm the display-side expert, but with a good understanding of the interactions with the source side (camera blur), so this subject is a crossover for me. I'm now reaching out to a few HFR experts.

My questions to readers are as follows:

(1) Have there been any recent innovations in Ultra HFR? Video at frame rates above 120fps outside the laboratory, played realtime, not slo-mo?
e.g. presentation of true-240fps video (non-slomo) on currently shipping true-240Hz eSports gaming monitors (more than one dozen models are now on the market).

(2) Are there any pre-existing articles in the HFR community that accurately explain, in a simple way,
how source persistence & destination persistence interact to create the final human-perceived display motion blur?


I am researching this before I write about it.
I'd be happy to team up with a known source-side expert (e.g. HFR expert) if you want to help me coauthor a flagship motion blur article for the HFR community worldwide.

Cheers,
Mark Rejhon
Founder, BlurBusters.com
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter



Re: Cinematography of 2030s: Ultra HFR (1000fps at 1000Hz!)

Post by Chief Blur Buster » 02 Feb 2018, 14:27

--- Copy of reply to David Mullen on reduser.net ---

Thank you!
I am trying to research the state of HFR in the "ultra high framerate" sphere; there is little on realtime HFR above 120fps.
David Mullen wrote:All I recall are the old tests that Douglas Trumbull did for his Showscan process. He felt that at really high frame rates, over 100, the ability for the audience to perceive a difference was too small to offset the practical issues of shooting and projecting at those rates.
From what I am (faintly) aware of Douglas's old tests, he was unable to push frame rates up the exponential curve necessary to make it worthwhile.

What we discovered is you have to double in order to have a really noticeable effect.
e.g. 120fps -> 240fps -> 480fps -> 960fps

Basically, these steps halve motion blur. The diminishing returns continue, but don't quite stop, because each step is equivalent to a display motion blur of:
1/120sec blur -> 1/240sec blur -> 1/480sec blur -> 1/960sec blur

With experimental ultra-high-refresh-rate headroom now available to scientists, researchers and experimenters, it has been observed that display persistence (refresh cycle length, for sample-and-hold displays) behaves the same as camera shutter speed. 480fps@480Hz on a flickerless display (like a 360-degree shutter) has exactly the same display motion blur as a photograph taken with a 1/480sec shutter speed, for the same physical angular motion speed of the display plane relative to the human eye.

Which is indeed noticeable. The jump from 120fps HFR to 1000fps HFR is roughly as noticeable as the jump from 60fps HFR to 120fps HFR. Yes, it's a much, much bigger jump that becomes mandatory to punch through the diminishing-returns curve -- but we're comparing 1/60 versus 1/120 versus 1/1000, which is 16.67ms versus 8.33ms versus 1.0ms -- nearly the same difference apart (8.3ms delta versus 7.3ms delta). It clearly outlines that overlooked diminishing-returns headroom: you need a massively ginormous jump up in frame rates to be really noticeable. But it's quite useful for VR and Holodecks to try to achieve near-zero persistence -- something 120fps HFR can't do without strobing or blur side effects, for the aforementioned reasons.

Today, it's more practical to (just about) experiment -- current off-the-shelf DLP chips can be driven by custom firmware to display a monochrome image at a true-1000Hz refresh rate, thanks to the DMD's ultra-rapid pixel switching speed. Several scientific suppliers now sell 500Hz and 1440Hz DLP projectors (albeit often at 5-figure prices) -- e.g. ViewPixx. My prediction is that ultra-high refresh rates will eventually become a minor cost-add to future displays within 20 years. But right now I'm researching the source side of the equation.

4K is cheap now; tomorrow, 1000Hz may be too.
David Mullen wrote:Keep in mind that 1000 fps is over a 5-stop exposure loss compared to 24 fps, though you can gain one-stop by not using a shutter at those rates. But even if it's just a 4-stop loss, your 800 ASA camera just became a 50 ASA camera.
I'm certainly aware! Past high-speed cameras needed to utilize a ton of light.

Several brainstorms have been privately discussed amongst us about what will be needed 20 years from now to film for a VR or Holodeck environment in a much truer manner:
-- One theoretical camera being imagined is a photon camera. After you record the video (essentially timecoded photons), you can play back the video file at any frame rate you want, whether 37fps, 24fps or 1000fps. Post-processing would allow full brightness at low frame rates to be multiplexed with the temporal resolution of ultra-high frame rates.
-- One can convert a 1000fps video into a much brighter 25fps video simply by stacking every 40 frames into 1 (sketched below). It's still the same number of photons, so with a proper high-efficiency sensor and proper tech, this can in theory work perfectly in brightening 1000fps videos to the same brightness as a 25fps video. But when objects move, denoising algorithms are REALLY bad. However, neural learning has shown some shocking improvements -- artificial-intelligence denoising algorithms can in theory re-brighten 1000fps videos while keeping the temporal artifacts at bay.
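
As a back-of-envelope illustration of the frame-stacking idea, here is a hypothetical Python/NumPy sketch (illustrative only -- a real pipeline would stack in linear sensor space, before gamma encoding):

Code: Select all

import numpy as np

def stack_frames(frames, factor=40):
    """Stack every `factor` consecutive frames into one brighter frame.
    frames: (N, H, W) array of linear-light frames -> (N // factor, H, W).
    Summing preserves the total photon count per 1/25sec of scene time."""
    n = (frames.shape[0] // factor) * factor
    return frames[:n].reshape(-1, factor, *frames.shape[1:]).sum(axis=1)

dim_1000fps = np.random.poisson(0.5, size=(1000, 4, 4)).astype(float)  # dim, noisy
bright_25fps = stack_frames(dim_1000fps)  # 25 frames, ~40x the signal in each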

I saw some really good artificial-intelligence scalers at CES 2018 that actually converted 1080p into 4K and 8K material in a really fantastic way -- it was as if it knew that the blurry house and windows and tree leaves were the real thing, and artistically (in realtime!) put in sharp wood, sharp leaves, etc, creating detail where there was none before. In 20 years, AI algorithms can probably realtime-convert a VHS videotape into a retina-sharp Holodeck environment, by automatically recognizing objects and using their own library of 3D objects and materials to replace the blurry stuff with the real thing. Pretty much a supercomputer's worth of realtime processing.

Some of us have commented that the exact same kind of process could be used to apply AI-based denoising/deblurring to framestacked-brightened 960fps videos (which could theoretically also output 480fps, 240fps, 120fps, 60fps, and 24fps as divisors of 960), feeding these multiple source files of different brightnesses (lower frame rates being brighter and lower-noise) to the AI algorithm -- to successfully ultra-brighten 960fps video without the darkness and noise.

Back in the 1930s and 1940s, you really needed bright light to expose color film even at just a mere 24 frames per second (e.g. Wizard Of Oz) -- far more light back then than today's 1000fps needs. Tomorrow's breakthroughs, in a couple of decades, could make much brighter 1000fps filming practical, or some other framerateless method necessary for VR realism without the side effects of using a frame rate (i.e. static imagery to represent moving images).
David Mullen wrote:At those high rates, the screen becomes essentially flicker less and seems like a clear window, assuming there is sufficient resolution.
At ultrahigh framerates, flicker doesn't need to be involved to eliminate motion blur.

Blurless sample-and-hold. Blurless 360-degree.

The ultra-high frame rates make low persistence possible without the need for black periods.

Which is better (for VR / Holodecks) because real life doesn't strobe and doesn't enforce motion blur above and beyond human vision.

Real life has no frame rate either, and a frame rate has many side effects (stroboscopic effects, wagonwheel effects, stepping effects, persistence motion blur from eye-tracking across static frames) which affect passing a theoretical Holodeck Turing Test (being unable to tell apart VR and real life). But since we can't go framerateless or analog-motion, ultra HFR is the next best thing.
David Mullen wrote:But I also think you run into a variant of the uncanny valley phenomenon when shooting narrative fiction -- the more "real" the process becomes, the more artificial the techniques of moviemaking appear -- costumes, lighting, make-up, etc. Which is one reason why this sort of approach works better in nature documentaries, for example, because everything in front of the camera IS real.
Yes, this is a consideration.

That said, virtual reality literally requires the realism factor or you get nausea/headaches. In a "Holodeck" situation, this is a very different phenomenon from just staring at a flat screen: you're unable to do things like depth of field, defocus, motion blur, etc -- the human eye needs to do that instead if you're in a virtual environment. Even motion blur (beyond human vision) creates major nausea in VR. This can become important when designing a virtual reality movie, or a movie that you can eventually walk in.

Most of the time, this is computer-generated graphics, but it can also include 360-degree movies (which currently often look like a flat plane, and have lots of problems) -- and before 360 VR movies become true 6dof stereoscopic 3D in one way or another (intense research is occurring as we speak), there is a lot of technological progress needed.

Some of this is futurist stuff, obviously; however, I'm fascinated by the topic of HFR beyond 120fps. Humankind will certainly need to plan a path to eventual ultra-HFR for virtual reality purposes -- e.g. 360-degree 6dof films that actually look like being immersed in a Holodeck, like the real thing, instead of looking at the inside of a hollow projection sphere.

All fascinating areas of study, and I'd be happy to collaborate with a few bleeding-edge HFR people on the "push beyond 120fps HFR". As a display-side guy, I have to reach across the universe to the source-side people, and work on long-term solutions that solve the "Holodeck Turing Test" problem of successfully matching real life even for fast motions like downhill skiing, racing, speedboating, etc -- where all motion blur needs to come naturally from the human eye rather than existing in the source/display.

VR is growing leaps and bounds, but the ultimate test is when VR stops looking fake and starts looking identical to real life. I believe the Holodeck Turing Test experiment is achievable within 1 to 2 human generations (ultra-high-Hz retina-resolution VR headset + ultra-high-framerate video): successfully tricking a person into believing the headset is a pair of transparent ski goggles and that they're staring at real life (even for fast motions), when it is actually virtual reality -- a Holodeck per se. But it is going to require a lot of future co-operation between the display-side experts and the source-side experts. Still, this is real "within our kids' lifetime" stuff!

--- Copy of reply to David Mullen on reduser.net ---
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter



Re: Cinematography of 2030s: Ultra HFR (1000fps at 1000Hz!)

Post by Chief Blur Buster » 02 Feb 2018, 14:30

--- Copy of one of my replies on reduser.net ---
PatrickFaith wrote:I have to do a lot of testing with having an 8 to 30 fps acquisition embedded in a 60 to 240 fps distribution; the 24 fps at 180 shutter holds up amazingly well in most cases. There's another thread where Michael Cioni of Panavision is talking; if you read between the lines on that, there's a lot of theory going into shooting at 24 fps acquisition but distributing at much higher rates.
Interpolation?

Yes, clever interpolation techniques can be done to increase frame rates. That will probably be a necessary piece of the jigsaw puzzle, too.

In the past, interpolation was a really evil thing with lots of artifacts (soap opera effects). But the art of interpolation is rapidly improving, especially if you give it "all the information" possible. Parallax backgrounds, positional information, other missing info to avoid artifacts from guessing logic, etc.

One subcategory of SOE artifact (of many) is ultra-smoothness without reduced blur: the camera exposure length should be really short for low-framerate video intended to be converted to ultra-HFR. 24fps video with 1/48sec motion blur doesn't always "look right" when converted to a 120fps HFR video, unless there's a method of de-blurring (research on which now exists).

Source-based awareness also helps avoid a lot of interpolation problems. Video compression standards are heavily interpolation-based (H.264 and H.265/HEVC), so even Netflix, 4K Blu-rays and E-Cinema are practically >99% based on the predictive math found in interpolation -- the I-frames, P-frames, B-frames, and similar equivalents. It avoids problems because the codec 'knows' the original uncompressed video. But an intermediate interpolator (e.g. Sony Motionflow) is missing this, along with temporal/positional info, so it has to do a ton of guessing. And the problem is compounded when you're doing 3 dimensions (holographically, with true voxels or polygons, not stereoscopically).

Likewise, Oculus Asynchronous Spacewarp (virtual reality) converts 45fps into 90fps virtually laglessly, without rerenders, by virtue of high-resolution positional knowledge (mouse trackers, head trackers, etc are available at ultra-high Hz today -- many gaming mice already operate at 1000Hz).
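
For intuition only, here is a toy Python sketch of the reprojection idea (NOT Oculus's actual algorithm -- real Asynchronous Spacewarp is 3D-aware; this merely shifts the last rendered frame by the head-yaw delta):

Code: Select all

import numpy as np

def reproject(last_frame, yaw_delta_deg, px_per_degree=20.0):
    """Crude 2D approximation: synthesize an in-between frame by shifting
    the last rendered frame horizontally instead of re-rendering the scene."""
    shift_px = int(round(yaw_delta_deg * px_per_degree))
    return np.roll(last_frame, -shift_px, axis=1)

frame_45fps = np.zeros((1080, 1200, 3), dtype=np.uint8)  # last rendered frame
frame_90fps = reproject(frame_45fps, yaw_delta_deg=0.5)  # inserted tween frame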

Some future HFR video may be recorded with ultra-precise gyroscopic/accelerometer/positional information, to aid a computer in converting the video to a 3D environment with less guessing. Smartphones have all the sensors needed, and are already being used to "scan" real-world objects into 3D (and things like ARKit etc), so naturally all this is happening already. It also extrapolates into the theoretical future cinematography techniques needed for ultra-HFR and Holodeck-like environments.

To reduce the mandatory need for the tracking points (e.g. pingpong balls) of traditional motion capture -- impractical for things like recording documentaries and real life -- cameras (e.g. RED) might end up recording ultra-precise telemetry too (e.g. gyroscope, accelerometer, AR data, synchronized with one, two, three, or more recording angles from different cameras, etc). I imagine this is already being done by some of you for certain kinds of cinematography (whether for greenscreen help or help with CGI overlays) -- but it looks like future HFR framerate-upconversion assistance will also eventually benefit greatly from the improving accuracy of this telemetry data in the coming years (decades).

The more telemetry data recorded alongside the video, the more accurately and perfectly future algorithms will be able to convert films into framerate-independence, or into true-3D environments, with fewer man-hours of human help.

It's a massive amount of processing work and manpower, but rapidly falling. Retina 3D environments (Holodecks) capable of tricking a human, generated from a video, are currently still a figurative Olympus Mons on a faraway planet, compared to the Mount Everest challenges on this planet we're only barely scaling today... but there is light at the end of the tunnel within a couple of human generations (or maybe even within one, given an Apollo Mission push -- but there isn't one occurring -- so I give it two human generations to successfully pass the Holodeck Turing Test on an 'any-material' basis).

--- Copy of one of my replies on reduser.net ---
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter



Re: Cinematography of 2030s: Ultra HFR (1000fps at 1000Hz!)

Post by Chief Blur Buster » 02 Feb 2018, 14:31

--- Copy of one of my replies on reduser.net ---
Alex Lubensky wrote:Personally, I don't think you'll find a lot of support for HFR among cinematographers. Nowadays a lot of TV panels push for HFR, online players such as MEGOGO push HFR into their player by default and have no possible way to cut it off in the app. Sports channels also push for HFR.

The thing is - any film or movie should be projected at the speed it was shot, otherwise you're changing the way you perceive the basics of the motion in the film. There is absolutely no need for a "BLURLESS" image; more to say - I personally find a blurless or HFR image distracting and repulsive. Speaking of the things I've tested - 60 fps/hz looks awful, 120 fps/hz looks even more distracting. It literally throws you away from the story into the rushing image.

The only film I've watched in HFR was The Hobbit, and there was an experimental reason to go for HFR - because of the way 3D was and is projected in the theaters.

There's a lot of reasons to push for HFR in gaming, but I don't see any to push for it in the film industry.
I pretty much agree: Today.

Being of a certain age, I prefer the good old-fashioned 24fps look when staring at a big screen.

But what about some of the kids who want to be cinematographers for virtual reality? A lot of the young cinematographers I talk to (millennials) have increased interest in things like VR and the blending between games and cinematography, and also don't mind HFR nearly as much as the older generation.

The hard question may be asked: what are the best ways of doing cinematography for VR, when the goal is full immersion?

VR can straddle the game + cinematography universe. Essentially, a game you can walk in. There are early Wright Brothers experiments that convert a video into an explorable 3D environment (try it now: true 3D, not stereoscopic) that you can walk in -- many glitches now, and rough, but it will eventually become Retina someday, with the help of AI. So tomorrow (one or two decades from now) -- YOUR film might be given to a computer that then converts it into a walkable, explorable 3D world. "...Please create a Holodeck environment matching the one in this video..." -- this is already being done in a very, very rough Wright Brothers way, and it will take a decade or two, but it's something within our kids' lifetime now: making movies/documentaries/films/videos/etc designed to be explored by tomorrow's audiences.

This also, ironically, may be one of the ways to avoid adding extra frame rate -- simply tell a computer to convert the film to a 3D environment, and let the computer render it at a higher frame rate than the original material. Problem solved: cinematographers wouldn't need ultra HFR to supply material for that. This may be what happens instead of direct ultra-HFR -- basically the intermediate step. But there may also be best practices, like using the highest "good" frame rate available to your camera, etc -- the extra temporal resolution helps the conversion process.

Today this is for games, but as it becomes comfortable (e.g. imagine theoretical Apple-Oakley sunshades of the 2040s, rather than bulky, ugly VR headsets), there will be a bigger market for virtual movies and such. And as it all expands beyond this -- if you're trying to create a Holodeck-like environment (e.g. film something that you can eventually walk in) -- the whole ball of wax changes.

Also, there are many cases where this is still useful on a flat display: e.g. fast-action sports look really good in ultra HFR. 4K60 decoders are capable of realtime 1080p 240fps, and 4K120 video decoders are capable of realtime 1080p 480fps -- sub-$1000 graphics cards can decode 480fps HFR today, if only such a display existed.
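
Those decoder equivalences come straight from pixel throughput -- the raw pixel rates match exactly. A quick Python check (a first approximation; real decoder limits also depend on codec level and bitrate):

Code: Select all

def pixel_rate(width, height, fps):
    return width * height * fps  # pixels decoded per second

assert pixel_rate(3840, 2160, 60) == pixel_rate(1920, 1080, 240)   # 4K60  = 1080p240
assert pixel_rate(3840, 2160, 120) == pixel_rate(1920, 1080, 480)  # 4K120 = 1080p480
assert pixel_rate(7680, 4320, 60) == pixel_rate(1920, 1080, 960)   # 8K60  = 1080p960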

There is now 1000fps in some smartphones (e.g. one Sony Xperia phone), although only in 5-second slo-mo bursts due to lack of bandwidth -- but tomorrow, 1000fps may be continuous recording on a smartphone. Due to its ultra-sensitive sensor, its 1000fps is much brighter than even a large 5-year-old 1000fps camera -- I was impressed that you could get that 1000fps high-speed slo-mo quality from a smartphone-size camera sensor! If progress goes well, the darkness/noise problem will become a nonissue within ten or twenty years, especially with additional AI algorithms to allow blur-free exposures effectively longer than one frame (e.g. 1/1000sec frames with the brightness and low noise of 1/25sec).

Now, if ultra-HFR becomes cheap in displays too within ten years, then the whole chain is available: cheap 1000fps, cheap 1000Hz, etc. When that happens, there will be temptations.

60fps to 120fps HFR is NOT always a big enough jump (only 50% less motion blur on sample-and-hold displays) and not tempting enough for many cinematographers, but 60fps to 960fps (94% less motion blur on sample-and-hold displays such as LCD or OLED) is a much larger jump that may make it "worth the dazzle" for some special cases.
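
Those percentages fall straight out of the frame-time math (sample-and-hold blur scales with frame visibility time), as this quick sketch shows:

Code: Select all

def blur_reduction(from_fps, to_fps):
    """Fraction of sample-and-hold motion blur removed by raising frame rate."""
    return 1.0 - from_fps / to_fps

print(blur_reduction(60, 120))  # 0.50   -> 50% less motion blur
print(blur_reduction(60, 960))  # 0.9375 -> ~94% less motion blur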

e.g. special features, amusement rides, Holodeck experiences, as well as, say, special editions of IMAX "vacation" films where immersion is a bigger goal.

The 24fps jump to 48fps or 60fps or 120fps doesn't jump over the uncanny valley enough, but the 24fps jump to 1000fps does (with proper cinematography technique). Some of the ultra-HFR tech is already here today in bits and pieces -- sometimes in your pocket too -- but is not yet glued together in a complete realtime chain.

The uncanny-valley effect is leapable, and only a few people in the world (like me) have seen this happen, refresh-rate-wise; it is extraordinarily, extraordinarily tough, requiring quadruple-digit refresh rates. The effect is broken by other flaws (e.g. camera blur, weird motions), but it's quite useful in many applications. 3D was a fad, but this is a lurking even-bigger thing: real retina-quality Holodeck experiences, or a true "window effect with no uncanny valley" -- never achieved before.

Thus, why my topic is "Cinematography of 2030s" -- including future career choices, etc.

Traditional cinematography will still exist. But it'll not be the only cinematography applicable. Some cinematographers will want to expand their career choices, or be prepared to cater to additional markets, like working with game companies, or filming with settings intended for Holodeck-like environments, etc.

That said, experimentation continues. What's being seen now is absolutely incredible, and should be shared with more HFR cinematographers. As the tech becomes feasible, and the extra brute force (in Hz, frame rates, reality, immersion solutions, etc) slowly leaps over the uncanny-valley problems, many questions are being asked, especially in the what-ifs department.

--- Copy of one of my replies on reduser.net ---
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter



Re: Cinematography of 2030s: Ultra HFR (1000fps at 1000Hz!)

Post by Chief Blur Buster » 02 Feb 2018, 14:35

--- Copy of one of my replies on reduser.net ---
Alex Lubensky wrote:And after all, you'll still have a whole century of beautiful cinematography done in 24p and established to be viewed in 24p. That alone could sustain the world of 24p for another half a century.
I love 24p cinematography, so don't get me wrong! :)

Yes, 3D is a fad.
Some things stay a fad, and some things come back over and over.
Video games were originally thought to be a fad; then came the 1982-1983 Video Game Crash.
Games recovered and are now bigger than Hollywood, while using cinematographic techniques too (e.g. motion capture).
3D has crashed many times too. This won't be the last time. Holodecks are a totally different thing from stereoscopic 3D, too.
In all likelihood, it will also be a parallel, blended industry, like movies versus games.
Alex Lubensky wrote:Yes for VR, yes for 3D, yes for viewfinders, on-board monitors etc. But it doesn't relate to actual world of today's cinematography overall.
...That said, I've also seen various arguments over the decades:

- Motion capture and pingpong balls is not cinematography
- CGI is not cinematography
- Digital is not cinematography
Etc.

What was your grandfather's cinematography isn't necessarily the same exact definition as today's cinematography. Simultaneously, there are attributes that overlap universes, e.g. the overlap between games and movies -- from motion capture done for games, to the tie-ins frequently done between games and movies, whether completely generated, motion-captured, or insertions of various FMV into the game. So while games/VR are not necessarily cinematography, many cinematography techniques are being utilized, so this is still certainly a legitimate (albeit, for today, somewhat outlier) topic thread when presented from this point of view.
Alex Lubensky wrote:The technology itself is a beautiful thing, but sometimes we're overhyped about it. Personally, I don't see almost any benefit of shooting HFR for viewing HFR for today's perception of the word "cinema".
There is still much overlap with future cinematography techniques. Concurrently, there will be times when 1000fps needs to be displayed on a flat rectangle, too -- there are still benefits there (once it's cheap enough to do so). Instead of 24p->120p->24p, you might have 24p->1000p->24p, if the capability is already in the cameras and the displays. Or it might be utilized differently. Who knows -- many unthought-of overlaps. There'll be many spinoffs between both worlds. Imagine this: the Holodeck-like technologies (turning a movie into a flawless 3D environment) will make possible virtual camera angles (filming angles for cameras that weren't even there in the original place) -- and that footage might actually be presented at 24fps with traditional projection technique and traditional storytelling (even as the virtual world continues to be used for a game). Prevent a reshoot because of a missing camera angle or missing drone location, etc. Just like CGI, digital, motion capture, and all the past innovations, ultra-HFR will have many spinoffs into cinematography that we haven't imagined yet. Considering cinema 100 years ago, versus cinema today, versus what cinema will be 100 years later, pretty much brings this discussion into the scope of the word "cinema".
Alex Lubensky wrote:I wouldn't call VR movie a movie at all - it's a whole another story
Agree. Just to point out, I intentionally focus on the word "cinematography" and phrase "cinematography technique", and the broad-ranging spinoffs, as explained above.
Alex Lubensky wrote:I don't state it's wrong, it's just too different - from the way you create it, to the way you perceive it.
Today, it's foreign.
Tomorrow, it's not.
CGI. Digital. Pingpong balls and motion capture.
All foreign stuff to the people who filmed the 1939 Wizard Of Oz with 3-layer Technicolor.

IMHO, arguably, some may even dare to say: that era is potentially a vastly more different universe to us today than this thread is to today's cinematographers. Yes, this thread is very foreign today -- but less foreign than 1939 versus 2018 in many elements.
Alex Lubensky wrote:It's more like theatre than cinema - because of the ways you as a viewer relate to the story.
Yes, maybe you're right.

Still -- it's the art of presenting moving images to a viewer (interactive or not) -- and most of those (even games!) often borrow from the universe of cinematography technique.

--- Copy of one of my replies on reduser.net ---
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter



Re: Cinematography of 2030s: Ultra HFR (1000fps at 1000Hz!)

Post by Chief Blur Buster » 02 Feb 2018, 15:47

Haven't seen anyone presenting 240fps non-slomo footage though, but I'm not an HFR expert myself, so maybe it's happening and I'm just not aware of it.
It's already happening in the laboratory.

You can do it too! It's easy and costs under $2000 now.

Since 2017, it has been possible to begin experimenting with 240fps HFR for only $1500-$2000. The instructions are relatively simple and produce interesting results.

Instructions: How to create & view 240fps HFR for under $2000

  1. ~$500+ - Get your favourite slo-mo camera. Begin with a cheap 1080p 240fps camera.
  2. ~$500 - Get a GPU capable of 4K60 playback = enough GPU power for 1080p 240fps.
    For 480fps 1080p HFR, get a GPU capable of 8K 30fps
    For 960fps 1080p HFR, get a GPU capable of 8K 60fps
  3. ~$500 - Get a true-240Hz gaming monitor (e.g. Viewsonic XG2530, ASUS ROG PG258Q, Acer XB252Q). There are now a dozen models that came out in the last 12 months. Google "list of 240Hz monitors".
  4. Film something at 240fps, for one-eighth speed playback at 30fps. Save as .mp4 file.
  5. Install open-source ffmpeg. Run this ffmpeg command line to speed up slo-mo to realtime:

    Code: Select all

    ffmpeg -i slowmo240.mp4 -r 240 -vf "setpts=(1/8)*PTS" -an realtime240.mp4
  6. Play back on your monitor. 240fps HFR!
For future cheap 480Hz and 960Hz displays:
For 480fps and 960fps HFR, the command lines are (for speeding up 30fps slo-mo to realtime):

Code: Select all

ffmpeg -i slowmo480.mp4 -r 480 -vf "setpts=(1/16)*PTS" -an realtime480.mp4
ffmpeg -i slowmo960.mp4 -r 960 -vf "setpts=(1/32)*PTS" -an realtime960.mp4
The "setpts" is your speedup factor.
30fps to 240fps = 8x speedup = "setpts=(1/8)*PTS"
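
If you script many of these conversions, the factor can be computed rather than hard-coded. A small Python sketch (the helper and file names are my own, purely illustrative) that assembles the same ffmpeg command line shown above:

Code: Select all

import subprocess

def speedup_cmd(src, dst, slomo_fps, target_fps):
    """Build the ffmpeg command that retimes slo-mo footage to realtime."""
    factor = target_fps // slomo_fps            # e.g. 240 / 30 = 8x speedup
    return ["ffmpeg", "-i", src,
            "-r", str(target_fps),              # output frame rate
            "-vf", f"setpts=(1/{factor})*PTS",  # retime presentation timestamps
            "-an", dst]                         # drop the useless slo-mo audio

subprocess.run(speedup_cmd("slowmo240.mp4", "realtime240.mp4", 30, 240))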

At this stage, you need to purchase 5-figure scientific projectors to play these files in high resolution at realtime true-480Hz (color) or true-1000Hz (usually monochrome, unless you do the 3-projector + color-filter trick). However, tomorrow's displays, capable of higher refresh rates, will be able to pull it off. The ultra-HFR cost bar is falling relatively quickly in the computer world.

Workflow Issues

No movie editing software supports 240fps HFR today. And you need a separate microphone to record high-quality audio, because most slo-mo cameras won't properly record audio.

The best laboratory experimentation workflow (as of 2018):
1. Use your favourite movie software to edit your film at its original slo-mo speed.
2. Once done, output your master to an .mp4 file.
3. Use command line software to convert the video file into a higher frame rate.
4. Finally, use command line software to attach (separately recorded) audio to the video.

This is easiest for single-clip experiments (no scene changes). But if you are going to do scene changes, you can note the time offset or frame offset of each scene change, and use those as your reference points to dub the separately recorded audio (by command line) into different parts of the .mp4 file. This can be automated with a script or batch file.
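
For step 4, ffmpeg can also do the audio dub from the command line. A hedged Python sketch (the filenames and offset value are placeholders; -itsoffset delays the audio input so it lines up with the sped-up video):

Code: Select all

import subprocess

def mux_audio(video, audio, out, offset_s=0.0):
    """Attach separately recorded audio; the video stream is copied untouched."""
    subprocess.run(["ffmpeg", "-i", video,
                    "-itsoffset", str(offset_s), "-i", audio,  # nudge audio start
                    "-map", "0:v", "-map", "1:a",  # video from input 0, audio from input 1
                    "-c:v", "copy", "-shortest", out])

mux_audio("realtime240.mp4", "mic_recording.wav", "final240.mp4", offset_s=1.25)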

Phantom Flex cameras will work for realtime ultra-HFR too. However, since film software cannot edit realtime ultra-HFR video, you have to master the film at an intentionally low frame rate (e.g. 30fps, 60fps, 120fps), then use a command-line utility to speed up the frame rate as the final video-related step after all the mastering. Afterwards, merge your separately recorded audio into the file (via command-line utility), as no cinema software supports ultra-HFR today. This allows you to use existing cinema software workflows and do the framerate speedup only as a final step.

I can help assist with a world's first public demo

There are no known public exhibitions of ultra-HFR, but as of 2017, 240fps HFR works successfully on just a hobbyist budget. 1000fps or 1440fps HFR is now achievable on a small-business budget (5 figures or low 6 figures) for experimentation.

As Blur Busters, I will be happy to work with anybody who wants to figure out how to set up a public exhibition (e.g. at a convention such as NAB) of 1000fps HFR.

My calculations say it is currently doable with presently available pieces of equipment that have never been married together (e.g. three ViewPixx quadruple-digit-Hz projectors running in color-filtered mode, overlapping 3 monochrome images into true 960fps, 1000fps, or 1440fps full-color HFR) -- combined with intentionally sped-up Phantom Flex video footage -- merged with concurrently recorded audio from a separate high-quality microphone.

A demo should happen. It may be a gimmick and publicity thing, but it would pretty much stun the world that such a thing is already possible today, and begin a lot of discussion of "what is it good for?" -- also witnessing CRT clarity with zero need for camera blur and zero stroboscopic effect -- a magical effect only a few human eyes have seen, but that many eyes could easily feast on in a public demonstration! For one example, I think big-budget amusement park rides (and certain big customers) would pay a pretty penny for this 'display effect' that feels extremely unique, seen by so few eyes, so there might be a business case for some of the outfits whose employees/contractors regularly read these forums.

I also am developing experimental software that can realtime-split color channels (R/G/B) into three separate, fully synchronized video outputs -- 3 video outputs carrying 1000fps monochrome video of each color channel. This is perfect for commandeering three monochrome 1000Hz scientific projectors, putting a color filter on each of them, stacking them (or pointing the three of them at a dichroic mirror), and displaying an ultra-HFR color video! I am a computer programmer myself, so I can assist in ultra-HFR experiments, like a "world's first" exhibition (NAB, CEDIA, etc). Lots of starter advice and ultra-HFR due diligence offered free of charge in exchange only for Blur Busters credit. mark[at]blurbusters.com
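
The channel-split concept itself is simple -- the hard part is keeping the three outputs genlocked at 1000Hz. A toy Python sketch of just the split (illustrative only, not the actual experimental software):

Code: Select all

import numpy as np

def split_channels(rgb_frame):
    """(H, W, 3) RGB frame -> three (H, W) monochrome frames, one per projector.
    Each projector gets a color filter, so the stacked image recombines to color."""
    return rgb_frame[..., 0], rgb_frame[..., 1], rgb_frame[..., 2]

frame = np.zeros((1080, 1920, 3), dtype=np.uint8)
red_out, green_out, blue_out = split_channels(frame)  # feed three 1000Hz outputs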

Beyond that, if anyone so desires, I can assist in pulling off a demo at one of the upcoming conventions. Let's start with a 1000fps realtime ultra-HFR video demo. AFAIK, this would be a world's first. With my skills combined with your skills, this is achievable and exhibitable this year (2018) or next (2019). This would be the "Farnsworth Experiment" showing a display feel that no member of the public has ever seen.

Player Software Requirements

Player software quality is extremely variable for ultra-HFR.

Radeon versus NVIDIA shows major differences in playback quality with different video player software, for example. On some graphics cards, VLC plays better; on others, Windows Media Player works better. I've even seen a web browser <VIDEO> element play full HFR properly (this works at 120fps).

Fiddling with settings in VLC helps -- e.g. using different video decode settings. A different brand of player may play smoother than others.

Overkill is your friend, too. Extra GPU horsepower headroom helps a lot -- a GPU capable of 8K 60fps often plays 1080p 240fps HFR really smoothly, whereas a GPU capable of 4K 60fps is the minimum you need for 1080p 240fps HFR. Headroom is your friend for de-stuttering ultra-HFR video.

OS Recommendations

Windows: Microsoft Windows 10 works fine at 240Hz, 360Hz and 480Hz in my lab (I have all three refresh rates in-house). Intel i7 and GeForce GTX 1080 Ti recommended.

NOTE: I have only field-visited 1000Hz+ displays, but given my 480Hz experience, and today's 8K60-capable graphics cards being able to decode 1080p960 video in realtime, I don't anticipate blocking factors.
At the moment, I'd suggest a GeForce Titan Xp card for 1000fps HFR playback. Its video outputs can reach 1000Hz and even 1440Hz with the help of custom resolution utilities (ToastyX CRU, etc).


For this interim (2018), I do not recommend Linux for ultra-HFR yet, due to VSYNC issues. But if you know of a Linux workflow that successfully does 120fps HFR, let me know -- I'll test it for 240fps HFR -- since I want to know the specifics (graphics drivers, window manager, kernel version, modeline settings, VSYNC settings, etc).

Not sure about Mac, but if you try one, SwitchResX is your best friend for forcing a Mac to ultra-high Hz. I have heard of 120Hz and 240Hz successes.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter



Re: Cinematography of 2030s: Ultra HFR (1000fps at 1000Hz!)

Post by Chief Blur Buster » 12 Jun 2020, 01:45

I've made some important updates to the UltraHFR article:

Ultra HFR: 240fps Real Time Video Now Possible Today. 1000fps Tomorrow. [UPDATED]

Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter

