Blur Busters Forums


New term: "Frame Rate Amplification" (1000fps in cheap GPUs)

Advanced display talk, display hackers, advanced game programmers, scientists, display researchers, display manufacturers, vision researchers. The masters on Blur Busters.

Re: New term: "Frame Rate Amplification" (1000fps in cheap G

Postby Chief Blur Buster » 15 Dec 2017, 00:08

Possibly, but I think there will still be strong economic pressures in the direction of FRAT. It will just take up to a decade to bear out.

Really good FRAT will still need lots of horsepower and silicon. You may still need to consume, say, 50% of a powerful future Titan to convert 100fps to 1,000fps. It will initially be a high-end-only feature, possibly.

It may even be accomplished by game engines instead, given sufficient GPU capability - increasingly improved versions of Oculus timewarping - until AMD or NVIDIA one-ups the other by introducing GPU/driver-level FRAT.

There may be unconventional algorithms in play, too. Heck, maybe even realtime framerateless beamtracing algorithms that can be denoised to any frame rates, instead.

Re: New term: "Frame Rate Amplification" (1000fps in cheap G

Postby open » 16 Dec 2017, 12:04

BattleAxeVR wrote:The cynic in me thinks it's unlikely that big GPU manufacturers would back this type of shortcut to improving framerates, as it would cannibalize their sales.

For example, why pay for a 1080 ti that can hit an average of 50 fps on a 4K game, which then gets interpolated to 60 fps by some additional chip, when you can just buy a 1050 which can only manage 30 fps but has the same type of chip?

See the conflict of interest here? It's an incentive to not upgrade, or when you do upgrade to a new gen GPU, you pick a much cheaper model.

Sure, some people will pay more and turn off interpolation because they prefer "real frames" (if you work in games, you realize how not-real most frames already are; e.g. temporal antialiasing and reprojection already synthesize "fake frames" from prior frames that you see all the time).

But the smart bet is on them not being interested in providing this service, as it will ultimately hurt their bottom line. Heck, there are PS4 VR games which only try to hit 60 fps and rely on reprojection to hit 120 fps for display in the headset. If you can do that for 60 -> 120 you can do it for 30 -> 60 or 45 -> 60 as well, and if the quality is decent enough, who cares if the frames are real or fake? It's academic and a pedantic exercise. Modulo interpolation artifacts, which are being reduced all the time.

The latest Pixelworks interpolation engine on recent projectors like the Optoma UHD65 is apparently nearly 100% artifact free. And latency is of course something people are also working on. But if you can hit, say, 120Hz, then one or two frames of added latency is only 8-16ms, which is not very much. Currently these engines add something like 100ms or more of latency, but that figure is likely to drop over time.

What I write here is not appropriate for virtual reality, of course, which demands a constant max framerate, at least in terms of reprojecting to the current head orientation (and eventually eye focus), if not actually showing updated animations and scene content too. And that will likely go into custom hardware instead of additional compositor passes as they are now. So perhaps they will open this up for non-VR games too eventually, but I actually think they have a financial incentive not to do that.

Or they may hold off allowing it for 2D games, because "what FPS can it do" is what sells GPUs, in the gamer space at least.

One thing I do hope for is that G-Sync will perish thanks to HDMI 2.1 supporting Adaptive Sync standard VRR.

Xbox One X having variable refresh rate support could make the case for TV manufacturers to add this feature to their "gamer" friendly lines, and eventually to all TVs since it'll be embedded in commodity controllers which get passed around / reused between manufacturers.

1050 at 30 fps vs 1080 Ti at 50 seems a bit off. This is a necessary tech for high refresh displays. I hope G-SYNC doesn't go away. It's better in many ways, which you can read about on this forum. Personally, it's better because of the lower input lag hit alone.

We're talking about techs that have huge performance requirements. Being able to use those refresh rates and techs is going to be a big incentive for buyers. It's doubtful that people on the high end of NVIDIA's price profiling scheme (now paying up to $3000 for increased gaming GPU performance) are going to want a low-tier card where they will inevitably have to sacrifice quality at some level.

Re: New term: "Frame Rate Amplification" (1000fps in cheap G

Postby BattleAxeVR » 18 Dec 2017, 18:57

I own a G-SYNC monitor, and yes, it's pretty great. But its raison d'être is that game engines, having a variable workload per frame, have variable frame rendering times, and it's better to slave the display to the GPU than the other way around and deal with stuttering and all the other problems with V-sync we all know and love.

The reason I wrote what I did comes from what I learned as a VR developer, namely that a constant, perfectly synced, max framerate is necessary to avoid motion sickness. This fact makes VRR tech inappropriate for VR. Then I started to think to myself: well, why is it acceptable for 2D games to have variable smoothness? It's unnatural. It's certainly (much) better than V-sync, but both are a hack to compensate for the engine not being able to sustain max FPS at all times.

This is the problem that FRAT is the ideal tech to solve.

It's not about the specific numbers for the 1050 vs 1080 Ti or whatever; that was just an example. Variable refresh rate is better than constant refresh rate for variable frames per second, but variable frames per second is itself inferior to constant frames per second. And it's very hard to reach super-high, constant frames per second; the higher the resolution and refresh rate, the harder it gets. It's a war that cannot be won. Not only that, but it's inefficient to render that many unique frames, since the higher the native framerate, the better interpolation works (less magnitude of error = less obvious artifacts). And there are interpolation engines now which are very good, almost artifact-free.
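To put rough numbers on that "less magnitude of error" point, here is an illustrative Python calculation (the on-screen object speed is a made-up assumption, not data from any real interpolation engine):

    # Illustrative only: the faster the native framerate, the smaller the
    # per-frame motion an interpolator has to guess across.
    object_speed_px_per_sec = 1000.0   # assumed on-screen motion speed
    for native_fps in (30, 60, 120, 240):
        motion_per_frame = object_speed_px_per_sec / native_fps
        print(f"{native_fps:3d} fps native -> {motion_per_frame:5.1f} px to interpolate across")
    # 30 fps leaves ~33 px of motion per generated frame to guess; 240 fps only ~4 px,
    # so interpolation errors (and their visible artifacts) shrink as native fps rises.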

It's approaching a "solved problem", at least in the 120Hz TV range. We're just talking about displays with much higher refresh rates.

That said, current interpolation chips can barely even do 4K24 to 4K60, at best. Videogames, especially on consoles, have often sacrificed peak FPS for a constant but much lower FPS, like 60 or 30fps, in order to get better-looking pixels. But we can have both.

Re: New term: "Frame Rate Amplification" (1000fps in cheap G

Postby Chief Blur Buster » 19 Dec 2017, 09:49

Yes, variable refresh rate is a great fix for variable frame rates.

Yes, constant framerate is really ideal for VR.

In theory, VRR tech isn't necessarily 100% inappropriate for VR if it is low-persistence VRR with constant lag (not variable lag). Ideally this is done via overkill framerates, e.g. VRR between 250Hz and 1000Hz for 1/250sec-to-1/1000sec persistence. However, you could also use strobed VRR, letting you use lower refresh rates instead to achieve the same persistence. Indeed, it would be far less motion-sicky than stutter.

(Those who understand persistence, know you either need strobing or overkill Hz to achieve ultra-low persistence.)

But this may be rendered mostly moot when Frame Rate Amplification Tech (FRAT) starts successfully achieving 1000fps laglessly & artifactlessly. 1000Hz almost behaves like VRR, because persistence varies only in 1ms increments (the refresh cycle granularity is so fine that you can more easily vary the persistence). Games fluctuating between 30fps and 200fps (say, when FRAT is turned off) on a 1000Hz display would look darn near exactly like a G-SYNC display, since the refresh cycle granularity is so fine that it behaves essentially like VRR.
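To illustrate the refresh-cycle-granularity point, a quick back-of-envelope Python calculation (assumed numbers, not measurements of any particular display):

    # Illustrative only: how coarsely frame presentation gets quantized to
    # refresh slots on a fixed-Hz display, versus an ultra-high-Hz display.
    def slot_granularity_ms(display_hz: float) -> float:
        # worst case: a finished frame just misses a slot and waits one full cycle
        return 1000.0 / display_hz

    for hz in (60, 144, 240, 1000):
        print(f"{hz:4d} Hz display: presentation quantized to {slot_granularity_ms(hz):.2f} ms steps")
    # At 1000 Hz the quantization error is at most 1 ms, so a game fluctuating
    # between 30fps and 200fps paces almost as smoothly as true VRR.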

EDIT

___________

A Blur Busters Holiday 2017 Special Feature

Coming sooner than expected within our lifetimes in the 2020s:

Read more:
Blur Busters Law: The Amazing Journey To Future 1000Hz+ Displays

Re: New term: "Frame Rate Amplification" (1000fps in cheap G

Postby Chief Blur Buster » 23 Feb 2019, 21:14

New Frame Rate Amplification Tech:
NVIDIA's new Deep Learning Super Sampling (DLSS)

There are many terminologies for various kinds of frame rate amplification technologies.
  • Asynchronous Space Warp
  • Deep Learning Super Sampling (DLSS)
  • Interpolation (especially AI interpolation)
  • Etc.
NVIDIA's new "Deep Learning Super Sampling" (DLSS) on the RTX 2080 Ti is a new form of frame rate amplification technology!
It is not really based on classic interpolation.

And the Oculus VR feature that converts 45fps to 90fps doesn't add any lag for my VR.

Frame-rate-increasing techniques (ones that shortcut doing a full-resolution GPU render for every frame) can be non-interpolation-based and can be nearly lagless.

"Frame rate amplification technologies" (F.R.A.T.) is the genericized umbrella term I've invented for everything that increases frame rates without doing a full GPU render for every frame, e.g. rendering shortcuts that reuse previous frames intelligently.

They also work best when they're well-integrated with the rendering workflow (e.g. access to the Z-buffer, access to geometry data, access to 1000Hz controller data). That's what makes newer F.R.A.T. systems like NVIDIA DLSS smarter, less laggy, less artifact-prone, and less black-box than classic interpolation of yesteryear.
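As a very rough illustration of the reprojection-style shortcuts mentioned above (a toy Python sketch only, not Oculus's or NVIDIA's actual implementation; the function and parameter names are invented for this example):

    import numpy as np

    # Toy depth-based reprojection: synthesize an in-between frame by shifting
    # pixels of the last full render according to camera motion and the Z-buffer.
    # Occlusions/holes are ignored here; real FRAT systems must fill them in.
    def reproject(color, depth, parallax_px_at_unit_depth):
        h, w, _ = color.shape
        out = np.zeros_like(color)
        xs = np.arange(w)
        for y in range(h):
            shift = (parallax_px_at_unit_depth / np.maximum(depth[y], 1e-3)).astype(int)
            new_x = np.clip(xs + shift, 0, w - 1)
            out[y, new_x] = color[y, xs]   # nearer pixels (small depth) move further
        return out

    # Example: one full render plus one synthesized frame = 2x amplification.
    color = np.random.rand(90, 160, 3)               # stand-in for a rendered frame
    depth = np.random.uniform(1.0, 10.0, (90, 160))  # stand-in for its Z-buffer
    in_between = reproject(color, depth, parallax_px_at_unit_depth=4.0)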

Blur Busters is a huge believer in FRAT (Frame Rate Amplification Technologies) in the current slow Refresh Rate Race to future Retina Refresh Rates, even if the tech may take a decade or two to achieve universally cheap quad-digit frame rates.

They use extremely different methods to create results.
  • Some use a lower-resolution base and upscale (thanks to internal AI knowledge of what a high-resolution frame looks like)
  • Some fill in missing frames using AI algorithms to prevent lag
  • Some fill in missing frames using reprojection shortcuts, to prevent lookforward lag
  • Some require both lookforward and lookbehind with a rudimentary math formula (classic interpolation)
  • Weird combinations of two or more of the above
  • Other future techniques not yet invented
There are some emerging algorithms that can later scale to 10:1 amplification ratios (e.g. 1000fps generated from what normally requires only a 100fps-capable GPU), possibly via a co-CPU or co-silicon dedicated to frame rate amplification. So this will be an important enabler of ultra-Hz in the next human generation.
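Some quick budget arithmetic shows why dedicated co-silicon is attractive for such ratios (illustrative numbers only, not benchmarks of any real hardware):

    # Illustrative 10:1 budget math -- assumptions, not measurements.
    full_render_fps = 100                            # conventional full-quality GPU renders
    target_fps = 1000                                # frames actually delivered to the display
    ratio = target_fps // full_render_fps            # 10 displayed frames per full render
    display_slot_ms = 1000 / target_fps              # 1.0 ms per displayed frame
    full_render_budget_ms = 1000 / full_render_fps   # 10 ms of GPU time per full render
    print(ratio, display_slot_ms, full_render_budget_ms)
    # Every generated in-between frame must be produced in well under 1 ms,
    # which is far easier for dedicated frame-amplification silicon than for
    # the same GPU that is already busy doing the 10 ms full renders.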

Since witnessing the results of all of these, I've become a huge believer in F.R.A.T. becoming important for "cheap 1000fps on cheap 1000Hz" ten or twenty years from now. It's necessary for achieving cheap ultra-Hz in a very slow, gradual refresh rate race to future retina refresh rates, now that we've almost maxed out resolution and dynamic range. With maxed-out resolutions comes increased incentive to boldly milk where nobody has milked before -- like continuing to work towards cheaply maxing out refresh rates.

4K displays were laboratory-only curiosities in year 2000, before the five-figure-priced IBM T-221 desktop monitor came out. Now, less than twenty years later, 4K can be purchased at Walmart for less than you paid for your 27" RCA CRT tube in 1985.

The refresh rate race will be slower than the resolution race, but it will be a continuous pressure for the rest of the century. Almost every reputable scientist, display researcher, and vision researcher now confirms the journey is long, given all the indirect effects like stroboscopics and persistence blur, and given that we're nowhere remotely near retina refresh rates.

Re: New term: "Frame Rate Amplification" (1000fps in cheap G

Postby Chief Blur Buster » 23 Mar 2019, 15:01

New Frame Rate Amplification Tech:
Render Some Frames Full Resolution, And Render In-Between Frames At Lower Resolution

New paper about a frame rate amplification technology (FRAT):
https://www.cl.cam.ac.uk/research/rainbow/projects/trm/

Basically render some frames as high resolution, but other frames as low resolution.

Use the high resolution frames to help enhance the low resolution frames.

This is a very clever technique that will be useful along the long term Refresh Rate Race to Retina Refresh Rates (1000Hz Journey!)

"Our technique renders every second frame at a lower resolution to save on rendering time and data transmission bandwidth. Before the frames are displayed, the low resolution frames are upsampled (improved with intelligence of knowing the earlier high-resolution frames) and high resolution frames are compensated for the lost information. When such a sequence is viewed at a high frame rate, the frames are perceived as though they were rendered at full resolution."

The future of frame rate amplification technologies is really exciting and spans a huge universe of research that I have been paying attention to for several years.

Many Frame Rate Amplification Tech (FRAT) tricks roughly resemble the "do some fully, do others partially" workflow of video macroblock codecs (I-frames, P-frames) -- the classic codecs such as MPEG1, MPEG2, MPEG4, H.264, H.265, etc. In your smartphone videos, your Netflix and your YouTube streams, only a few video frames are actually fully compressed frames.

With macroblock video, the rest of the frames in between are predicted and "filled in" based on those full frames. It's such a refined art that humans don't even notice at good compression ratios, despite there being approximately 1 second between those fully-coded frames -- at least when the video is not overcompressed.

This will even apply to real-time ray tracing (GeForce RTX, DirectX Raytracing DXR, etc.), which needs real-time denoising. Having more rays in some frames will make it easier to denoise the less-raytraced frames, and saving rays in others means more framerate is achievable with real-time raytracing! So the "do some fully, do some partially" workflow is potentially very universal, regardless of traditional rendering, real-time raytracing, or future rendering workflows.
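A minimal Python sketch of that uneven-ray-budget idea, using a plain exponential moving average as a crude stand-in for a real denoiser (the sample counts and blend weight are assumptions):

    import numpy as np

    # Spend many rays on some frames, few on others, and let temporal
    # accumulation smooth out the sparsely-sampled frames.
    def noisy_trace(truth, samples_per_pixel, rng):
        noise = rng.normal(0.0, 1.0 / np.sqrt(samples_per_pixel), truth.shape)
        return np.clip(truth + noise, 0.0, 1.0)

    rng = np.random.default_rng(0)
    truth = np.full((64, 64), 0.5)            # stand-in for the converged image
    history = None
    for frame in range(16):
        spp = 16 if frame % 4 == 0 else 2     # uneven ray budget across frames
        sample = noisy_trace(truth, spp, rng)
        history = sample if history is None else 0.9 * history + 0.1 * sample
    print(float(np.abs(history - truth).mean()))  # error shrinks as frames accumulate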

This is very different from the way Oculus Rift converts 45fps to 90fps via the Asynchronous Space Warp algorithm. But the concept is very roughly similar in the sense that a frame rate is multiplied via "filled-in" in-between frames that are not a full-resolution GPU render -- even if via a very different technique!

The great news is that these algorithms are not necessarily mutually exclusive! It's possible to stack the techniques (2x -> 2x = 4x) or combine them into one algorithm. The techniques could potentially be combined to keep increasing frame rate amplification ratios from today's 2x framerate to 10x framerate, and even more artifactlessly! Especially if the intervals between full-resolution frames remain very brief, few humans will notice the issues.

I now fully anticipate that many FRAT tricks will eventually combine -- utilizing such principles -- to achieve the 10:1 amplification ratios needed (100 frames per second of full GPU renders generating 1000fps for 1000Hz displays) sometime in the 2020s-2030s.

If humans often cannot notice the predicted frames in macroblock video despite roughly 1 second between full frames, then I fully anticipate that humans will easily live with well-predicted frames lasting only 0.01 seconds (at 100 frames per second of full-resolution GPU renders) -- once the various frame rate amplification technologies are optimized to an even more visually lossless and lagless state, and also put directly into dedicated fast silicon (FRAT-specific silicon).

Then the GPU is not a limiting factor for 1000Hz monitors!

Re: New term: "Frame Rate Amplification" (1000fps in cheap G

Postby Chief Blur Buster » 04 Apr 2019, 20:50

New Frame Rate Amplification Tech:
Asynchronous Space Warp 2.0 by Oculus Rift

Today, Oculus released the new Asynchronous Space Warp 2.0 algorithm, which now uses the depth buffer for improved parallax correction, greatly reducing the artifacts of frame rate amplification.

Also, it seems it is now capable of frame rate amplification ratios beyond 2:1, as it reportedly will work with frame rates below 45fps on the 90Hz VR headset. Some tests will need to be done in the near future!

Re: New term: "Frame Rate Amplification" (1000fps in cheap G

Postby Chief Blur Buster » 05 Apr 2019, 00:31

Today I am presenting a new article derived from this thread.


Frame Rate Amplification Technologies (FRAT)
More Frame Rate With Better Graphics For Cheaper



Hope you enjoy reading this article!
