New term: "Frame Rate Amplification" (1000fps in cheap GPUs)


Post by Chief Blur Buster » 14 Aug 2017, 17:55

Frame rate amplification technologies

a.k.a. 1000fps on cheap GPUs without reducing detail level

Now that we've tested a 480Hz display, GPUs need time to catch up.

Our new term: "Frame rate amplifiers" or "Frame rate accelerators":
-- Oculus/GearVR just calls it "timewarp" or "reprojection".
-- Sony calls it "interpolation".
-- Other terms include "translation"/"transforms" or other lingo

Good news: Thanks to adding geometry/positional awareness, some new implementations are now virtually lagless. And NO soap opera effects & artifacts for some of the newer technologies, at least when playing games.

Active research is being done. Many aren't talking, for competitive reasons, but new scientific papers keep hitting ResearchGate (I have a peer-reviewed conference paper there too) and the US Patent Office, covering the umbrella of the ultra-high-Hz universe as well as various kinds of frame rate amplification technologies under various trade names (or very specific terms such as "interpolation"/"reprojection"). Chief scientist at Oculus, Michael Abrash, said "... the sweet spot for 1080p at 90 degrees FOV is probably somewhere between 300 and 1000 Hz, although higher frame rates would be required to hit the sweet spot at higher resolutions..."

At Blur Busters, one prediction I have: frame rate amplification may be important for strobeless low persistence within 10 years, such as on future 480Hz-and-higher displays. Maybe sooner.

There comes a point where frame rate amplification artifacts are smaller than the artifacts caused by other band-aids such as GPU blur effects or strobing -- it may become the default preferred method of low persistence for displays sometime between 2020 and 2029, once frame rate amplification circuits are built directly into GPUs (thanks to VR, but benefiting ultra-high-Hz displays too).

Many modern researchers & VR scientists now agree that the equivalent persistence of 2ms flicker (black frame insertion, strobing, LightBoost, ULMB, Oculus OLED rolling scan) is the baseline -- 2ms or better to even remotely approach CRT-league motion clarity. But how do we achieve this without strobing? The answer is very simple, yet often jawdropping to some. To do this without strobing -- no black frame insertion, no pulsing, no CRT scanning, no OLED pulsed rolling scan -- you need all consecutive frames and refresh cycles to each be visible for only your target persistence. If you want 2ms persistence without black periods, you need (1000ms / 2ms) = 500fps @ 500Hz to get the same motion clarity without adding any blackness between refresh cycles (avoiding common Blur Reduction techniques).
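To put numbers on that reasoning, here's a quick back-of-envelope sketch (Python, assuming a pure sample-and-hold display with instant GtG):

Code: Select all
# Sample-and-hold persistence equals the frame visibility time, so hitting
# a target persistence without strobing means a unique frame every N ms.

def framerate_for_persistence(persistence_ms):
    """Framerate (and matching Hz) needed for a given sample-and-hold
    persistence, with no black periods between refresh cycles."""
    return 1000.0 / persistence_ms

for target_ms in (16.7, 8.3, 4.0, 2.0, 1.0):
    fps = framerate_for_persistence(target_ms)
    print("%4.1f ms persistence -> %.0f fps @ %.0f Hz" % (target_ms, fps, fps))

2ms comes out to 500fps @ 500Hz, matching the figure above -- and 1ms persistence would need 1000fps @ 1000Hz.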

Achieving high frame rates at high Hz is a problem for today's GPU technology.

The good news is that interpolation/reprojection/timewarping is gradually becoming more free of artifacts, and may eventually be less objectionable than strobing's disadvantages. In yesterday's HDTVs, these interpolators were abysmal, and even Oculus' timewarping is not perfect. However, progress in frame rate amplification technologies is currently very rapid, and with the newer virtually-lagless implementations (picture this: interpolating 240fps->960fps while keeping the input lag of 240fps@240Hz) -- and with progressively reduced artifacts (thanks to increasingly geometry-aware frame rate amplification) -- eventually, the benefits far outweigh the disadvantages even for mainstream use.

Today, frame rate amplification is already being done by Oculus at a 2x factor (they call it "reprojection" or "timewarp"). The GPU is definitely the limiting factor, but 25 years ago, GPUs didn't even exist; now they've come up with esoteric stuff like shaders, transforms, stream processors, etc., and it's possible to have virtually-lagless, darn-near-artifact-free, geometry-aware interpolators/translators. GPUs have been constantly gaining added goodies in their processing pipeline and, eventually, there may be dedicated silicon for framerate amplification. Whatever is being done today has quite a lot of artifacts at 45fps->90fps, but they vastly diminish when amplifying 240fps->480fps, and adding enhanced geometry awareness to the frame interpolator/translation/reprojection/timewarping (which I'll now call the "framerate amplifier stage of the GPU") helps reduce artifacts even further. Scientifically, it's a furious research subject nowadays -- unbeknownst to the mainstream.
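For the technically curious, here's a toy Python/numpy sketch of the core idea behind a geometry-aware reprojection step. To be clear: this is NOT Oculus' actual algorithm, just an illustration of how a depth buffer plus old/new camera matrices lets pixels be moved to a new viewpoint without re-rendering (real implementations also have to fill disocclusion holes):

Code: Select all
import numpy as np

def reproject(frame, depth, inv_old_viewproj, new_viewproj):
    """Warp a rendered RGB frame (h,w,3) toward a new camera pose, using
    its NDC depth buffer (h,w) and 4x4 view-projection matrices."""
    h, w = depth.shape
    out = np.zeros_like(frame)
    ys, xs = np.mgrid[0:h, 0:w]
    # Unproject every screen pixel to world space via the depth buffer.
    ndc = np.stack([2.0 * xs / w - 1.0,
                    1.0 - 2.0 * ys / h,
                    depth,
                    np.ones_like(depth)], axis=-1)
    world = ndc @ inv_old_viewproj.T
    world /= world[..., 3:4]
    # Re-project the world-space points under the new camera pose.
    clip = world @ new_viewproj.T
    clip /= clip[..., 3:4]
    nx = ((clip[..., 0] + 1.0) * 0.5 * w).astype(int).clip(0, w - 1)
    ny = ((1.0 - clip[..., 1]) * 0.5 * h).astype(int).clip(0, h - 1)
    out[ny, nx] = frame[ys, xs]  # forward splat; empty pixels = disocclusions
    return out

A warp like this costs a tiny fraction of a full re-render, which is exactly why it's attractive as a dedicated pipeline stage.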

There's no true Google search result yet for "frame rate amplifiers" or "frame rate amplification", so if you hear that term after the date of this post -- then you know it was a term invented by us. It covers the whole universe of interpolation/reprojection/timewarping and anything that increases framerate without needing to rerender from scratch.

There is no standard term. So we at Blur Busters hereby generally call this "frame rate amplification technologies". (Manufacturers: there's no trademark, no copyright; this is a public-domain term -- feel free to use it for your products.)

Some of these technologies are starting to look fairly artifact-free once extra knowledge is added (e.g. positional awareness, geometry awareness) -- feeding that extra knowledge to the interpolator erases a lot of the artifacts found on the common garden-variety HDTV interpolators of yesteryear. "Soap opera artifacts" are gradually disappearing from modern frame rate amplification technologies -- at least when it comes to realtime rendered content (games). Not all of them, but there comes a point where they can potentially become less objectionable than the phantom array effects of staying at a low Hz. (And flicker discomfort: people still have problems with 864Hz backlight PWM -- and some get headaches even with 240Hz strobing.)

We're not quite there yet, but Oculus' reprojection already looks much better than 45fps stutter. And users familiar with strobing already know 80fps-110fps stutter during LightBoost/ULMB is quite terrible (in fact, sometimes I wish I could enable reprojection during 100Hz ULMB mode, to make 50fps games look a lot better). Reprojection is not artifact-free, but stutter is often even worse -- for many. And if you quadruple the refresh rate, reprojection artifacts become that much fainter too.

Variable-rate strobing will be a big help (e.g. GSYNC+ULMB), but it's only an interim step in the progress of displaykind, with disadvantages including variable-step-distance phantom array effects (even when running at flicker rates above flicker fusion thresholds and using adaptive flicker-shaping to eliminate flicker-rate transitions).

A crazy prediction, I know -- but I think frame rate amplification technologies will be the magic component making 480Hz and 1000Hz displays practical within our lifetimes.

Frame rate amplification can also be made to work without added lag (lookbehind only) at a higher multiple factor (e.g. 4x), thanks to added data such as low-resolution geometry and positional data (e.g. head movements for VR, motion data from controllers, etc.), which can be used to create better-quality intermediate frames without a full GPU rerender.
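As a sketch of why lookbehind-only amplification adds no lag (illustrative Python -- every function name here is hypothetical): each heavy rendered frame is followed by several cheap reprojections, each using the freshest tracking sample instead of waiting for any future frame:

Code: Select all
def amplified_stream(render_frame, sample_pose, reproject, factor=4):
    """Yield `factor` displayed frames per rendered frame -- lookbehind
    only, so nothing ever waits on future data (no added input lag)."""
    while True:
        frame, depth, pose = render_frame()  # heavy: real render (e.g. 240fps)
        for _ in range(factor):
            fresh_pose = sample_pose()       # cheap: latest head/controller data
            # cheap: warp the last render toward the fresh pose (e.g. 960fps out)
            yield reproject(frame, depth, pose, fresh_pose)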

Voilà.

Done -- by the 2020s decade (or so) -- one would no longer need a super-powerful GPU to do 1000fps! Just a GPU with a sufficiently powerful frame rate amplifier technology (F.R.A.T.? Not sure if that's a good acronym) built into its silicon as an additional pipeline stage, alongside shaders, stream processors, etc.

Just a GPU with a very good frame rate amplification technology (tomorrow's distant, greatly improved version of reprojection). On that day in the future, you can have the motion clarity of "ULMB/LightBoost without strobing" -- LightBoost=10% motion clarity, but very bright (1000 nits if you want), full HDR -- with no strobing, on a single mid-to-high-end GPU, if you wish. Huge elimination of stroboscopic stepping effects plus simultaneous low motion blur.

Certainly, most people don't care, but 144Hz was more dismissed in 2012 ("who cares, nobody can see above 30fps") than even 480Hz is today. Nowadays you see 144Hz monitors at mainstream stores such as Best Buy and Staples, not just at niche stores. Still niche -- but far easier and more accessible than getting an HDTV was in the late 1990s. Don't dismiss progress too quickly -- it marches on, indeed.

Even today, people who say "I can't tell 120Hz, 144Hz and 165Hz apart" don't quite realize the bigger jumps needed (120Hz->240Hz->480Hz->960Hz) to easily notice the steps of motion clarity improvement, especially in light of increasing resolutions and FOV, and the evolving types of games currently being played. And due to GtG limits, the worst/unoptimized 240Hz LCDs can have more motion blur than the best, fastest 144Hz LCDs -- and often, people don't run framerates high enough to milk the low motion blur of those new displays. A gamer whose framerate fluctuates 100-to-200fps, upgrading from 144Hz to 240Hz, won't see as dramatic a clarity improvement as gamers running nearly permanently at 240fps (e.g. CS:GO).

This may not be a topic people care about right now, but I think frame rate amplification technologies will be critical for ultra-high-Hz displays of the 2020s -- even outside VR.

I've read enough already to know this is a research area that the smarter monitor manufacturers, the GPU vendors (NVIDIA, Radeon) and the major VR vendors (Oculus: timewarp, reprojection) are already early-canary thinking about. Even NVIDIA's 16,000Hz AR experiment had to use a frame rate amplification technology. And scientists are experimenting with Viewpixx's 1440Hz DLP projector as we speak.

Researchers and engineers not yet aware of this need should be excited to pay attention -- if tomorrow's 480Hz or 1000Hz displays are to perform properly on mid-range GPUs in the next decade or two. Milking that Hz properly requires unique frames for each refresh cycle.

Tomorrow's mainstream gamers will not necessarily care about the Hz -- they'll just see "that looks clearer and more real-life than this one". Much like how they love the new 120Hz iPad Pros, even without realizing it's 120Hz.


Post by RealNC » 14 Aug 2017, 18:43

Horsepower is the thing that's lacking currently. No GPU is capable of doing interpolation from 240->480 using GPGPU. AFAIK, the complexity of interpolation increases quadratically. Interpolation from 60 to 120FPS is four times as demanding as interpolation from 30 to 60. So for 240->480, you'd need a GPU much stronger than the 1080 Ti. That problem could be solved in a couple GPU generations. However, at that point, there would be no power left to actually produce those initial 240FPS to interpolate from...
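Taking that quadratic assumption at face value (it's an AFAIK, not a measurement), the implied scaling looks like this:

Code: Select all
# Assumed cost model (unverified): interpolation cost grows with the
# square of the base framerate, normalized so 30->60 costs 1 unit.

def interp_cost(base_fps, ref_fps=30):
    return (base_fps / ref_fps) ** 2

for base in (30, 60, 120, 240):
    print("%d->%d FPS: %gx the cost of 30->60" % (base, base * 2, interp_cost(base)))

Under that model, 240->480 is 64x the cost of 30->60.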

There's also the issue of interpolation needing more GPU power than it would take to produce those frame rates natively. For example, 240->480 interpolation might be more demanding than just rendering at 480FPS to begin with.

So this has to be done on dedicated silicon. I don't think it's the GPU's task to do interpolation.


Post by Chief Blur Buster » 14 Aug 2017, 18:55

RealNC wrote:Horsepower is the thing that's lacking currently. No GPU is capable of doing interpolation from 240->480 using GPGPU. AFAIK, the complexity of interpolation increases quadratically. Interpolation from 60 to 120FPS is four times as demanding as interpolation from 30 to 60. So for 240->480, you'd need a GPU much stronger than the 1080 Ti. That problem could be solved in a couple GPU generations. However, at that point, there would be no power left to actually produce those initial 240FPS to interpolate from...
This is old-style thinking ;) ... this is thinking of traditional interpolation.
RealNC wrote:So this has to be done on dedicated silicon.
Correct.
RealNC wrote:I don't think it's current GPU's task to do interpolation.
Fixed that for you.

Why do GPUs need transform & lighting? Why did GPUs get shaders? Why do GPUs now have stream processors?

Why do we need GPUs in smartphones? All that 3D horsepower just to bit-blit a scrolling window? Overkill. Insane.

But now, GPUs are standard even in a $75 Android device in a developing market -- that runs circles around a 3dfx Voodoo2 SLI.

In tomorrow's GPUs, frame rate amplification silicon will be standard. It will just be another GPU "pipeline" thing, like "T&L" and "shaders".

Also, frame rate amplification can be done using a completely different technique than traditional GPU processing, and there's logic that is optimizable specifically for that purpose that has no equal in current GPUs. Much like an "ASIC-boost" -- doing more with the same number of transistors -- by attacking the problem more efficiently from a different angle.

It might, in theory, be like a very-different-designed mini-GPU that runs in tandem with the main GPU -- just like the special cloud-computing GPUs that don't have the ability to generate 3D graphics efficiently. But we don't EVEN need that many transistors to do certain kinds of 1000fps reprojection.

Heck, maybe it'll use low-resolution voxel cubes as 3D reprojection geometry awareness, or maybe it'll use bounding boxes (from collision detection) as the low-resolution geometry source for reduced-artifact, lookbehind-only (lagless) reprojection algorithms that run at low overhead at high multiplier factors (e.g. 3x, 4x, maybe even 8x). Who knows? It already looks like the big GPU engineers are at work on this type of creativity -- and if they weren't aware until recently, they now are, thanks to VR research, which has spinoff applications for ultrahigh-Hz displays. And who knows, it's also possible that Z-buffers become multilayered too, enabling low-overhead reprojection algorithms. The world's best algorithm optimizers are at some of the big companies; in tomorrow's GPUs, tricks like these could make reprojection a much more trivial task. Tricks are increasingly reducing the overhead of artifact-free frame rate amplification.
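Purely to illustrate the bounding-box speculation above (hypothetical Python -- nothing any vendor has announced), a coarse depth proxy built from collision boxes might look like this:

Code: Select all
import numpy as np

def coarse_depth_from_boxes(boxes, w, h):
    """boxes: iterable of (x0, y0, x1, y1, z) in screen space.
    Returns a crude depth proxy where the nearest box wins per pixel."""
    depth = np.full((h, w), np.inf)
    for x0, y0, x1, y1, z in boxes:
        region = depth[y0:y1, x0:x1]
        np.minimum(region, z, out=region)  # keep the closest surface
    return depth

A reprojection stage could warp pixels against this proxy instead of a full Z-buffer, trading some artifact quality for very low silicon and bandwidth cost.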

Entry-level versions of this built-in GPU silicon would just serve phone VR, while midrange and high-end versions could target mainstream desktop gaming monitors (which by then could be 480Hz+ in the 2020s) in addition to VR headsets.

The applications of the technology get broader the more lagless/artifact-free the frame rate amplification becomes, especially if it becomes a "free minor addition" standard in most GPUs ten years from now.


Post by RealNC » 14 Aug 2017, 19:08

The industry also has to be brave enough to push for this. It's very rare for the industry to give people something they didn't ask for. Reprojection in VR wasn't done because people asked for it; it was done because it was needed.

VRR is one of the few things that was offered without anyone asking for it, and I think that was a very brave move by NVidia (not a fanboy, never was, but credit where it's due). We need more of that. Just because people aren't asking for a move towards 1000Hz/FPS doesn't mean the industry shouldn't attempt to go there.


Post by Chief Blur Buster » 14 Aug 2017, 19:15

RealNC wrote:The industry has to also be brave enough to push for this. It's very rare to give people something they didn't ask for. Reprojection in VR wasn't done because people asked for it; it was done because it was needed.
You're damn right!

Count on it. :D

We do change industry to an extent, by what we do.

As we rapidly expand Blur Busters' focus over the coming months and years, we are going to be part of this hard push in the industry to make ultra-high-Hz economically practical.

Little known -- but Oculus went low-persistence earlier than they otherwise would have, thanks to me (Blur Busters). They eventually reached out to the smart ones (Michael Abrash, etc.) long after being fully convinced by me to go low-persistence back in Blur Busters' early "scanningbacklight.com" days. [There's a very long Kickstarter backstory behind this -- and they got access to an early beta of TestUFO six months before TestUFO launched!] This was one year before Facebook bought 'em out. Yes, that was certainly a jawdrop; it happened long after I last talked to them. Remember, this was about a year before any controversies/Facebook/etc. -- I'm talking about ancient Kickstarter history, where I was actively involved in their research to lower the persistence of their VR.

About my behind-the-scenes role in the Kickstarter version of Oculus Rift -- this isn't a topic I talk about much. Yes, I probably missed an opportunity (that I don't regret, thankfully: Blur Busters remains independent) at potential Facebook riches -- and, you know, the news controversies involving Oculus' original founder. I had stopped active involvement around the Kickstarter time (you know, the hobby days when everyone was naive and focussed on tech). Originally, I thought it was really longshot tech. But, certainly, you can credit Blur Busters' existence for the 1-to-2-year-earlier arrival of a rolling-scan OLED -- long before they hired Michael Abrash and John Carmack, they jumped onto a relentless path to lower persistence only because of Blur Busters (hiring the big names, important science, spending huge amounts on rolling scan research, etc.). It truly surprised me how continuously and single-mindedly Oculus chased the low-persistence golden pot after my convincing them so early on. It's wonderful that they did, even if I missed out a lot after the Kickstarter.

I have no doubt that low-persistence OLED VR arrived about 1-2 years earlier for humankind, thanks to the existence of Blur Busters -- I pushed the people behind the Kickstarter to experiment with LightBoost displays. They used a beta version of TestUFO as the internal motion tester for the Kickstarter version of the Rift and their subsequent experimentation with strobed LCDs. Apparently, those probably weren't good enough in mobile format -- so they went the OLED rolling-scan route. But I triggered their low-persistence path. Even today, I'm sure 99% of Oculus employees don't even know Blur Busters' behind-the-scenes role in the Rift Kickstarter (their early access to an early beta TestUFO, their obtaining a LightBoost monitor, etc.).

We all know that Blur Busters' popularization of LightBoost ultimately convinced NVIDIA to add ULMB to their monitors. And now strobing is a feature found in more than half of high-Hz gaming monitors (see 120Hz monitors) -- all thanks to the LightBoost promotion that Blur Busters did a few years back. Various reps at NVIDIA confirmed in 2013 that they added ULMB because of all the LightBoost demand.

And many testers are adopting my pursuit camera invention (free for use). Even a hardware device manufacturer has adopted the pursuit camera principle in a modified form.

So... Blur Busters does change the industry -- to an extent! Despite whatever bravado Blur Busters sometimes shows, it's serious technology -- we've been serious about pushing the frontiers of eliminating motion blur, and we will press extremely hard for new strobe-free methods of low persistence.

I'd rather see true-960Hz displays (with frame rate amplification technologies) arrive in 2021 than in 2025, for example -- and Blur Busters might have some pull on that, given we educate even engineers (who know their FPGAs & TCONs inside out, but don't know their MPRTs/persistence/strobing, or how to make GtG less of a factor in motion blur). So, who knows?

Originally, a few years ago, I thought it wouldn't happen easily this century, but what I've learned since has fully convinced me that "strobeless LightBoost" is probably practical at reasonable prices in less than 10 years -- even on troubled/poor Moore's Law curves. The domino that just fell was reprojection for VR -- and that's only a start.

You can count on continuous advocacy of high-Hz from us, which can educate even OLED/LCD engineers who have mistaken scientific beliefs about the causes of motion blur. (Even half of them, many in China, don't know about strobing.)

Even the 480Hz prototype display uses an off-the-shelf LCD [with a custom TCON board]. A kit will be available from zisworks if you want to modify a common garden-variety cheap 28" TN 4K display (after de-bezelling it -- the custom electronics won't fit inside most monitor cases). So we're not even milking current panels to their limits yet -- and that's without adding custom overdrive electronics. 480Hz isn't expensive rocket science; nobody is doing it because nobody thinks it's important.

One is surprised how far even current panels can be pushed. LCD makers pushed GtG uselessly for a long time until they realized strobing is simultaneously needed -- and then, all of a sudden, 1ms TN (okay, not perfect 1ms, but 80% perfect actually exists -- 1ms GtG10%-90%). You'd be surprised that the weak links lie elsewhere (GPUs) rather than in the panels themselves. Custom, expensively-calibrated overdrive lookup tables will make 480Hz quite a sight to behold a few years from now, especially if the resolution can be increased to full panel resolution. After that happens, the weak link is the GPU generating the frame rates needed for strobeless low persistence. Obviously. And that's the subject of this thread, too...

Anyway, back on topic: I'm very confident that various forms of frame rate amplification technologies will eventually happen -- and probably a bunch of proprietary brands are coming (e.g. I can easily imagine a sequel to GSYNC specially designed for eSports/1000Hz).

Also, cell phones didn't need GPUs just to dial a number. If you had told any engineer ten or fifteen years ago that phones would need GPUs, they'd have scoffed! But eventually, GPUs found their way into virtually everybody's pocket. Now everyone can pan a 3D GPS map and enjoy finding the nearest restaurant.

As the best VR headsets demand frame rate amplification technology, it gets built into GPUs, and eventually filters down to midrange GPUs -- which tomorrow's high-Hz monitors (480Hz and up) will also take advantage of. I give it only a decade before that happens -- apparently ultra-high-detail 1000fps isn't as silicon-unobtainium as I originally thought -- and I now believe it's actually a "next generation" thing.

It's an early canary topic. Yes. But I'm now convinced.


Post by RealNC » 14 Aug 2017, 20:03

I just took a trip through Wikipedia on Oculus and all things VR-related. It's certainly not mentioned anywhere that, if it wasn't for TestUFO, VR might have been a blurry experience :D

They should make a documentary :mrgreen:


Post by Chief Blur Buster » 14 Aug 2017, 20:08

RealNC wrote:I just took a trip through Wikipedia on Oculus and all things VR-related.

They should make a documentary :mrgreen:
Perhaps. There's a book being made at the moment, at least.
Who knows, some moviemaker might turn THAT into a documentary.

I think this is the first time I'm writing about my involvement in the Oculus Rift Kickstarter. A book author has already interviewed me about my role in triggering their chase (and, eventually, successful catching) of the low-persistence golden goose -- confirmed by Palmer Luckey directly to the book author for an upcoming VR-related book. [I hope I did the right thing, revealing my involvement. Gah.]

-- Please note: I'm in Canada, Blur Busters is a Canadian site, and I have no comment on political controversies. I knew totally nothing about the founder's politics back in the Kickstarter days. It was a 100% tech focus, trying to solve VR motion blur. It was pure 100% tech fun... and it lit the fuse for a very important VR advancement! --

John Carmack and Michael Abrash are much bigger names than I am, and they have far bigger pull than Blur Busters does. They also independently came up with many blur reduction principles and have a similar understanding of motion blur science as I do. So they deserve credit. But I was involved first, long before then -- somebody (me) had to convince the early Oculus Kickstarter team to start working on low persistence. Oculus appears to have chased them down because of my convincing them (back in the Kickstarter days) of the huge importance of the low-persistence path in VR. Without me, they wouldn't have chased the big names (John Carmack, Michael Abrash, etc.) down as damn quickly as they did, and thus wouldn't have accelerated Oculus' literally "blank-cheque" spending on trying to find a low-persistence breakthrough for VR.

In fact, Blur Busters was born because of a tweet reply to me from John Carmack (September 21, 2012). Back in Blur Busters' hobbyist days, we were originally http://www.scanningbacklight.com (that URL redirects to Blur Busters now).

Irony: zisworks' 480Hz mod has an open-source strobe backlight (best at lower Hz) with source code installed via Arduino 1.8.2. It's 4-segment, so it can also double as a rudimentary scanning backlight. Full-range persistence/pulse-width/voltage-boost adjustments. Funny how Blur Busters circles back to its roots, with an "Arduino" (compatible) scanning backlight!

So, Blur Busters (ScanningBacklight.com back then) aggressively convinced the whole Oculus Kickstarter team to chase the low-persistence path -- and this is why, very, very early on, they hired the very important industry people, John Carmack and Michael Abrash, so darn quickly. After they got the LightBoost display from my nagging, it was as if low persistence had "Universe Big Bang" importance to Oculus -- it was truly "blank-cheque this feature" at Oculus. And long after I was no longer involved, many dominoes rapidly fell at Oculus. Even to this day, I don't think John Carmack and Michael Abrash realize how their jobs are a result of me convincing Oculus' founder to chase them down. ;)

The important takeaway: We're Blur Busters -- and have influenced the display industry more than many people think.


Post by thatoneguy » 18 Aug 2017, 23:56

Woah this is neat :o
If it actually works properly, we've hit gold :mrgreen:

Too bad we're stuck with fixed pixels and no native support for legacy interlaced content, else this would be the real CRT successor.

I wonder if light guns would finally work with this thing.

EDIT: Could this technology be implemented in a typical TV?
Because this would be one of the biggest revolutions in TV history. Good lord... just imagine finally watching sports without motion blur on a 100+ inch TV.


Post by Glide » 19 Aug 2017, 12:41

Great post, and I agree that it's probably the best option we have for low-persistence in the future.
The only issue is that, as I understand it, these techniques work for camera tracking, but your game is still animating at whatever framerate it natively runs at.
Unfortunately, that can look really bad. BioShock 1 & 2 (original releases) are examples of games which only update their physics/animations at 30Hz, and look really bad running at 60 FPS+ (though there is a mod for the original which lets you set it to anything you want).
Other games update at 50/60Hz when running at 100 FPS+. While I can't think of any examples off the top of my head, it's also quite jarring to see.
Now, if you're talking about rendering at 100 FPS+ and updating the display at 1000Hz+, maybe it's less of an issue -- especially if you were to use per-object motion blur on everything animating.

But you don't get that issue with low-persistence strobing.
With strobing, animations end up looking smoother rather than making their low framerate stand out.

I'm not trying to say that it won't work. If anything, NVIDIA's demonstration of a 16kHz AR display validates the idea.
Only that there may be additional challenges to making it look good.
Chief Blur Buster wrote:Good news: Thanks to adding geometry/positional awareness, some new implementations are now virtually lagless. And NO soap opera effects & artifacts for some of the newer technologies, at least when playing games.
"Soap Opera Effect" is just a disparaging term used by dinosaurs to describe smooth or high framerate video playback.
It doesn't matter to them if it's interpolated or native HFR. They are enemies of progress.


Post by thatoneguy » 19 Aug 2017, 18:22

Glide wrote:
But you don't get that issue with low-persistence strobing.
With strobing, animations end up looking smoother rather than making their low framerate stand out.
Technically, it should be the same thing here: with 1000fps@1000Hz or 960fps@960Hz, you're getting into 1ms persistence territory, which is the same as strobing at 1ms -- except there's no brightness loss/flicker.
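The persistence arithmetic is indeed the same; the difference is duty cycle (illustrative Python):

Code: Select all
# 1 ms of visible persistence two ways: a 1 ms strobe flash per refresh
# (screen dark the rest of the time) vs. 1000 unique frames shown full-time.

def duty_cycle(pulse_ms, hz):
    return pulse_ms / (1000.0 / hz)

print("1 ms strobe @ 100 Hz : %3.0f%% duty -> big brightness loss" % (100 * duty_cycle(1, 100)))
print("1000 fps @ 1000 Hz   : %3.0f%% duty -> full brightness, no flicker" % (100 * duty_cycle(1, 1000)))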
