New term: "Frame Rate Amplification" (1000fps on cheap GPUs)
Posted: 14 Aug 2017, 17:55
Frame rate amplification technologies
aka 1000fps on cheap GPUs without reduced detail level
Now that we've tested a 480Hz display, GPUs need time to catch up.
Our new term: "Frame rate amplifiers" or "Frame rate accelerators":
-- Oculus/GearVR just calls it "timewarp" or "reprojection".
-- Sony calls it "interpolation".
-- Other terms include "translation"/"transforms" or other lingo
Good news: Thanks to adding geometry/positional awareness, some new implementations are now virtually lagless. And NO soap opera effects & artifacts for some of the newer technologies, at least when playing games.
Active research is being done. Many aren't talking for competitive reasons, but new scientific papers keep hitting ResearchGate (I have a peer-reviewed conference paper there too) and the US Patent Office, covering the ultra-high-Hz universe as well as various kinds of frame rate amplification technologies under various trade names (or very specific terms such as "interpolation"/"reprojection"). Chief scientist at Oculus, Michael Abrash, said "... the sweet spot for 1080p at 90 degrees FOV is probably somewhere between 300 and 1000 Hz, although higher frame rates would be required to hit the sweet spot at higher resolutions..."
At Blur Busters, one prediction I have: Frame rate amplification may be important for strobeless low persistence within 10 years, such as on future 480 Hz and higher displays. Maybe sooner.
There comes a point where frame rate amplification artifacts are smaller than the artifacts caused by other band-aids such as GPU blur effects or strobing, and frame rate amplification may become the default preferred method of low persistence for displays sometime between 2020 and 2029 -- once frame rate amplification circuits are built directly into GPUs (thanks to VR, but it also benefits ultra-high-Hz displays).
Many modern researchers & VR scientists now agree that the equivalent persistence of 2ms flicker (black frame insertion, strobing, LightBoost, ULMB, Oculus OLED rolling scan) is the baseline -- 2ms or better to even remotely approach CRT-league motion clarity. But how do we achieve this without strobing? The answer is very simple, but often jawdropping to some. To do this without strobing -- no black frame insertion, no pulsing, no CRT scanning, no OLED pulsed rolling scan -- you need every consecutive frame and refresh cycle to be visible for your target persistence. If you want 2ms persistence without black periods, you need (1000ms / 2ms) = 500fps @ 500Hz to get the same motion clarity without adding any blackness between refresh cycles (avoiding common Blur Reduction techniques).
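That arithmetic is simply persistence = 1000ms / framerate on a strobeless sample-and-hold display. A minimal sketch of the math (Python; the helper name is mine, not any real API):

```python
def required_fps_for_persistence(persistence_ms: float) -> float:
    """Frame rate (and matching refresh rate) needed so each frame is
    visible for exactly persistence_ms on a strobeless sample-and-hold
    display: persistence = 1000 / fps, so fps = 1000 / persistence."""
    return 1000.0 / persistence_ms

print(required_fps_for_persistence(2.0))  # 2ms persistence -> 500fps @ 500Hz
print(required_fps_for_persistence(1.0))  # 1ms persistence -> 1000fps @ 1000Hz
```

Halving persistence doubles the required framerate, which is why CRT-league clarity without strobing points straight at the 500fps-1000fps range.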
Achieving high frame rate at high Hz is a problem for today's GPU technology.
The good news is that interpolation/reprojection/timewarping is gradually becoming more artifact-free, and eventually may be less objectionable than strobing's disadvantages. In yesterday's HDTVs, these interpolators were quite abysmal, and even Oculus' timewarping is not perfect. However, progress in frame rate amplification technologies is currently very rapid, and with the newer virtually lagless implementations (picture this: interpolating 240fps->960fps while keeping the input lag of 240fps@240Hz) -- and with progressively reduced artifacts (thanks to increasingly geometry-aware frame rate amplification) -- eventually the benefits far outweigh the disadvantages even for mainstream use.
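Why can amplification keep the input lag of the source framerate? Classic TV-style interpolation blends toward the next rendered frame, so it must buffer one full source frame; lookbehind extrapolation (reprojection-style) uses only past frames plus fresh pose data. A toy model of the lag accounting (my own simplification, hypothetical helper name):

```python
def added_lag_ms(source_fps: float, needs_future_frame: bool) -> float:
    """Extra input lag introduced by a frame rate amplifier.
    Interpolation waits for the next rendered frame (one source frame
    of buffering); lookbehind extrapolation adds essentially none."""
    return 1000.0 / source_fps if needs_future_frame else 0.0

print(added_lag_ms(240, needs_future_frame=True))   # ~4.17ms buffering penalty
print(added_lag_ms(240, needs_future_frame=False))  # 0.0 -> keeps 240fps-league lag
```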
Today, frame rate amplification is already being done by Oculus at a 2x factor, under the names "reprojection" and "timewarp". The GPU is definitely the limiting factor, but 25 years ago, GPUs didn't even exist; now they've come up with esoteric stuff like shaders, transforms, stream processors, etc., and it's possible to have virtually-lagless, darn-near-artifact-free, geometry-aware interpolators/translators/etc. GPUs have been constantly gaining added goodies in their processing pipeline and, eventually, there may be dedicated silicon for frame rate amplification. Whatever is being done today has quite a lot of artifacts at 45fps->90fps, but they vastly diminish when amplifying 240fps->480fps; adding enhanced geometry awareness to the frame interpolator/translation/reprojection/timewarping (which I'll now call the "framerate amplifier stage of the GPU") helps reduce artifacts even further. Scientifically, it's a furious research subject nowadays -- unbeknownst to the mainstream.
There's no true Google search result yet for "frame rate amplifiers" or "frame rate amplification", so if you hear that term after the date of this post -- then you know it was a term invented by us. It covers the whole universe of interpolation/reprojection/timewarping and anything that increases framerate without needing to rerender from scratch.
There is no standard term. So, we at Blur Busters, hereby generally call this "frame rate amplification technologies". (Manufacturers: There's no trademark, no copyright, this is a public domain term -- feel free to use it for your product).
Some of these technologies are starting to look fairly artifact-free once extra knowledge is added (e.g. positional awareness, geometry awareness); giving that extra knowledge to the interpolator erases a lot of artifacts found in common garden-variety HDTV interpolators of yesteryear. "Soap Opera Artifacts" are gradually disappearing from modern frame rate amplification technologies -- at least when it comes to realtime rendered content (games). Not all of them, but there comes a point where they can potentially become less objectionable than the phantom array effects of staying at a low Hz. (And flicker discomfort: people still have problems with 864Hz backlight PWM -- and some get headaches even with 240Hz strobing.)
We're not quite there yet, but Oculus' reprojection actually looks much better than 45fps stutter. And users familiar with strobing already know 80fps-110fps stutter during LightBoost/ULMB is quite terrible (in fact, sometimes I wish I could enable reprojection during 100Hz ULMB mode, to make 50fps games look a lot better). Reprojection is not artifact-free, but stutter is often even worse -- for many. And if you quadruple the refresh rate, reprojection artifacts become that much fainter too.
Variable-rate strobing will be a big help (e.g. GSYNC+ULMB), but it's only an interim step in the progress of displaykind, with disadvantages including variable-step-distance phantom array effects (even when running at flicker rates above flicker fusion thresholds and using adaptive flicker-shaping to eliminate flicker-rate transitions).
A crazy prediction I know, but -- I think that frame rate amplification technologies will be the magic component making 480Hz and 1000Hz displays practical within our lifetimes.
Frame rate amplification can also be made to work without added lag (lookbehind only) at a higher multiplication factor (e.g. 4x), thanks to added data such as low-resolution geometry and positional data (e.g. head movements for VR, input from controllers, etc.), which can be used to create better-quality intermediate frames without a full GPU rerender.
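As a crude illustration of lookbehind-only amplification, here is a toy 4x amplifier that extrapolates intermediate frames by shifting pixels along the latest motion estimate (Python/NumPy; real reprojection warps per-pixel using geometry and pose data, this flat 2D shift is only a stand-in):

```python
import numpy as np

def amplify_lookbehind(frame, velocity_px_per_ms, factor, frame_time_ms):
    """Toy 'frame rate amplifier': from one rendered frame plus the
    latest motion estimate (e.g. head/controller velocity), extrapolate
    (factor - 1) extra frames by shifting pixels forward along the
    motion vector. Lookbehind only: no future frame is buffered,
    so no input lag is added."""
    vy, vx = velocity_px_per_ms
    step_ms = frame_time_ms / factor
    out = [frame]
    for i in range(1, factor):
        dy = int(round(vy * step_ms * i))
        dx = int(round(vx * step_ms * i))
        out.append(np.roll(frame, shift=(dy, dx), axis=(0, 1)))
    return out

# 240fps rendered -> 960fps displayed (4x amplification)
rendered = np.zeros((8, 8))
rendered[4, 4] = 1.0  # one bright "object" pixel
frames = amplify_lookbehind(rendered, (0.0, 0.5), 4, 1000.0 / 240)
print(len(frames))  # 4 output frames per rendered frame
```

The object pixel slides rightward across the extrapolated frames; a geometry-aware amplifier would do the same per-surface instead of per-screen, which is what kills the soap-opera artifacts.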
Voilà.
Done -- by the 2020s decade (or so) -- one would no longer need a super-powerful GPU to do 1000fps! Just a GPU with a sufficiently powerful frame rate amplifier technology (F.R.A.T.? Not sure if that's a good acronym) built into its silicon as an additional pipeline stage alongside shaders, stream processors, etc.
Just a GPU with a very good frame rate amplification technology (tomorrow's distant, greatly improved version of reprojection). On that day in the future, you can have the motion clarity of "ULMB/LightBoost without strobing" -- LightBoost=10%-league clarity, but very bright (1000nits if you want), full HDR, full persistence, with no strobing, on a single mid-to-high-end GPU, if you wish. Huge elimination of stroboscopic stepping effects + simultaneous low motion blur.
Certainly, most people don't care, but 144Hz was more dismissed in 2012 ("who cares, nobody can see above 30fps") than even 480Hz is today. Nowadays you see 144Hz monitors at mainstream stores such as Best Buy and Staples, not just at niche stores. Still niche -- but far easier and more accessible than getting an HDTV in the late 1990s. Don't dismiss progress too quickly -- it marches on, indeed.
Even today, people who say "I can't tell apart 120Hz, 144Hz and 165Hz" don't quite realize the bigger jumps needed (120Hz->240Hz->480Hz->960Hz) to easily notice the steps of motion clarity improvement, especially in the light of increasing resolutions and FOVs, and the evolving types of games currently being played. And due to GtG limits, the worst/unoptimized 240Hz LCDs can have more motion blur than the best, fastest 144Hz LCDs -- and often, people don't run framerates high enough to milk the low motion blur of those new displays. A gamer who fluctuates 100-to-200fps, upgrading from 144Hz to 240Hz, won't see as dramatic a clarity improvement as gamers running nearly permanently at 240fps+ (e.g. CS:GO).
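The "worst 240Hz can blur more than the best 144Hz" point follows from persistence arithmetic once GtG smearing is stacked on top of the refresh cycle. A back-of-envelope model (my own crude additive approximation, not a measured formula):

```python
def motion_blur_px(speed_px_per_s, hz, gtg_ms=0.0):
    """Approximate eye-tracked motion blur on a sample-and-hold display:
    blur width ~= tracking speed * (refresh cycle + GtG smearing).
    Real GtG curves are messier; this additive model is just a sketch."""
    persistence_ms = 1000.0 / hz + gtg_ms
    return speed_px_per_s * persistence_ms / 1000.0

print(motion_blur_px(960, 144))            # ~6.7px blur at 144Hz (instant GtG)
print(motion_blur_px(960, 240))            # 4.0px at 240Hz (instant GtG)
print(motion_blur_px(960, 240, gtg_ms=5))  # ~8.8px: slow GtG erases the 240Hz win
```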
This may not be a topic people care about right now, but I think frame rate amplification technologies will be critical for ultra-high-Hz displays of the 2020s -- even outside VR.
I've read enough already to know this is a research area that the smarter monitor manufacturers, GPU vendors (NVIDIA, Radeon) and the major VR vendors (Oculus, with timewarp/reprojection) are already early-canary thinking about. Even NVIDIA's 16,000Hz AR experiment had to use a frame rate amplification technology too. And scientists are experimenting with Viewpixx's 1440Hz DLP projector as we speak.
Researchers and engineers (not yet aware of this need) should pay attention -- if tomorrow's 480Hz or 1000Hz displays are to perform properly on mid-range GPUs in the next decade or two. Milking that Hz properly requires unique frames for each refresh cycle.
Tomorrow's mainstream gamers will not necessarily care about the Hz -- they'll just see "That looks clearer and more real life than this one". Much like how they love the new 120Hz iPad Pros, even without realizing the 120Hz.