New term: "Frame Rate Amplification" (1000fps in cheap GPUs)

Advanced display talk, display hackers, advanced game programmers, scientists, display researchers, display manufacturers, vision researchers & Advanced Display Articles on Blur Busters. The masters on Blur Busters.
Chief Blur Buster
Site Admin
Posts: 11648
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: New term: "Frame Rate Amplification" (1000fps in cheap GPUs)

Post by Chief Blur Buster » 28 Jun 2020, 21:20

thatoneguy wrote:
21 Jun 2020, 04:39
Chief Blur Buster wrote:
15 Jun 2020, 21:40

For 1080p video (on a 1080p LCD) versus 1080p video (stretched to fill an 8K LCD), no, the motion blur looks the same.
This doesn't make sense to me from a physical perspective. Isn't upscaled 8K (from 1080p) still technically 8K (even if not native)?
I mean you still have 16 times the pixels.
Scaling differences aside, it's the same original pixel widths, so the blur difference is the same.

It's the same reason why, when you use www.testufo.com and use Chrome's browser zoom at 200% or 50%, the amount of motion blur remains the same relative to the objects' original sizes (on a relative blur-size basis, i.e. relative to image size). Because it's relative to the original pixel sizes. Same concept.

Now, if -- instead of simple scaling -- you had a sharpness-enhancing scaler / video processor (upconverting 1080p or 4K to look close to 8K), then it'd look different: a bigger delta between the moving image and the stationary image.
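To see why the relative blur stays constant, here's a minimal back-of-envelope sketch (my own illustration, not a Blur Busters tool): sample-and-hold persistence blur is roughly pixels-per-second of motion divided by refresh rate, and naive upscaling multiplies both the object size and the blur width by the same factor, so their ratio never changes.

```python
# Toy model (my own simplification): sample-and-hold motion blur width on an
# LCD is approximately (pixels per second of motion) / (refresh rate in Hz).
# Naive upscaling multiplies object size AND blur width by the same factor,
# so blur relative to object size is unchanged.

def blur_width_px(speed_px_per_sec: float, refresh_hz: float) -> float:
    """Approximate persistence blur width for a sample-and-hold display."""
    return speed_px_per_sec / refresh_hz

def relative_blur(speed_px_per_sec, refresh_hz, object_width_px, scale=1.0):
    """Blur width relative to object size, after upscaling by `scale`."""
    blur = blur_width_px(speed_px_per_sec * scale, refresh_hz)
    return blur / (object_width_px * scale)

# 960 px/sec at 60 Hz over a 100 px wide object: upscaling 4x
# (1080p toward 8K territory) leaves the relative blur identical.
assert relative_blur(960, 60, 100, scale=1.0) == relative_blur(960, 60, 100, scale=4.0)
```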
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter

Forum Rules wrote:  1. Rule #1: Be Nice. This is published forum rule #1. Even To Newbies & People You Disagree With!
  2. Please report rule violations If you see a post that violates forum rules, then report the post.
  3. ALWAYS respect indie testers here. See how indies are bootstrapping Blur Busters research!

Chief Blur Buster
Site Admin
Posts: 11648
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: New term: "Frame Rate Amplification" (1000fps in cheap GPUs)

Post by Chief Blur Buster » 29 Jul 2020, 20:55

I happened to come across a paper using "Frame Rate Amplification" terminology:
https://dl.acm.org/doi/abs/10.1145/3383812.3383836

It seems that the terminology Blur Busters coined ("Frame Rate Amplification") is now being used by other researchers elsewhere, to generally describe algorithms that multiply the frame rate.

Cool, we trailblaze! (again)


blurfreeCRTGimp
Posts: 42
Joined: 28 May 2020, 20:36

Re: New term: "Frame Rate Amplification" (1000fps in cheap GPUs)

Post by blurfreeCRTGimp » 28 Aug 2020, 16:54

Why can't we use asynchronous spacewarp technology on standard displays, as is done in VR displays? I was just looking at the new Asus 360 Hz display, but there's no way I could drive that on my 1060 3GB. I'm running a 144 Hz 1080p monitor right now, and I'm wondering why we can't just use, say, 3 sliced timewarp bands on our LCDs so that we can benefit from high-Hz motion without needing an insane GPU?

Chief Blur Buster
Site Admin
Posts: 11648
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: New term: "Frame Rate Amplification" (1000fps in cheap GPUs)

Post by Chief Blur Buster » 28 Aug 2020, 16:58

blurfreeCRTGimp wrote:
28 Aug 2020, 16:54
Why can't we use asynchronous spacewarp technology on standard displays, as is done in VR displays? I was just looking at the new Asus 360 Hz display, but there's no way I could drive that on my 1060 3GB. I'm running a 144 Hz 1080p monitor right now, and I'm wondering why we can't just use, say, 3 sliced timewarp bands on our LCDs so that we can benefit from high-Hz motion without needing an insane GPU?
Stutters can cause headaches/nausea/barf in virtual reality, so stutter-elimination is much more serious in VR than non-VR.

So there's a bigger incentive for spacewarp-type stuff to arrive to VR first before non-VR. However, it's going to be useful to do these frame rate amplification technologies (FRAT) for future games on ultra-high-Hz monitors.

I suspect a future variant will eventually simultaneously combine spatial FRAT (like DLSS 2.0) with temporal FRAT (like ASW 2.0). I'm hoping one future GPU might announce a surprise.

Speculation here, but I wouldn't be surprised if a theoretical DLSS 3.0 or DLSS 4.0 supports a spacewarp-type technology, albeit it might be tied to a specific future GPU such as an RTX 3080 or such. Crank high detail at high frame rates for the win!
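For the curious, here's a toy sketch of the rotation-only timewarp idea (my own simplification; real ASW also uses depth buffers and motion vectors): re-display the last rendered frame, shifted by however far the camera has turned since it was rendered, synthesizing an extra frame without rendering one.

```python
# Toy illustration (not Oculus' actual algorithm): rotation-only "timewarp"
# re-displays the last rendered frame shifted by the camera's displacement
# since render time, doubling apparent frame rate without a new render.

def timewarp_row(row, camera_shift_px, fill=0):
    """Shift one scanline by the camera's pixel displacement since render."""
    n = len(row)
    out = [fill] * n
    for x, px in enumerate(row):
        nx = x - camera_shift_px      # camera pans right -> scene shifts left
        if 0 <= nx < n:
            out[nx] = px
    return out

def amplify(rendered_frames, camera_shifts):
    """Interleave each real frame with one warped frame -> ~2x frame rate."""
    out = []
    for frame, shift in zip(rendered_frames, camera_shifts):
        out.append(frame)                                    # real rendered frame
        out.append([timewarp_row(r, shift) for r in frame])  # synthesized frame
    return out
```

The uncovered edge (filled with `fill`) is exactly the disocclusion artifact that smarter reprojection has to inpaint.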


blurfreeCRTGimp
Posts: 42
Joined: 28 May 2020, 20:36

Re: New term: "Frame Rate Amplification" (1000fps in cheap GPUs)

Post by blurfreeCRTGimp » 28 Aug 2020, 17:03

LOL Here's hoping that is indeed one of their surprises. I have been searching high and low for a high-end CRT because I'm getting tired of having to run my software at 144 Hz native just so it doesn't look like crap. Do you think they will get DLSS to be software-agnostic?

1000WATT
Posts: 391
Joined: 22 Jul 2018, 05:44

Re: New term: "Frame Rate Amplification" (1000fps in cheap GPUs)

Post by 1000WATT » 28 Aug 2020, 18:46

I would be very surprised if green or red ship it as a simple on/off toggle in NVCP. Once users have such technologies, it would be difficult to sell video cards the next year.
But it's so nice to dream.
I often do not state my thoughts clearly; Google Translate is far from perfect, and beyond the translator, I make mistakes myself. Do not take me too seriously.

blurfreeCRTGimp
Posts: 42
Joined: 28 May 2020, 20:36

Re: New term: "Frame Rate Amplification" (1000fps in cheap GPUs)

Post by blurfreeCRTGimp » 31 Aug 2020, 17:03

I was thinking earlier today: couldn't a software developer make an app, sort of like the Blur Busters strobe utility, that could enable Oculus ASW on the desktop?

We already have apps like Bigscreen, Virtual Desktop, etc. Couldn't we trick a PC into thinking the main monitor is a Rift, and thereby hack in ASW without waiting on manufacturers?

Chief Blur Buster
Site Admin
Posts: 11648
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: New term: "Frame Rate Amplification" (1000fps in cheap GPUs)

Post by Chief Blur Buster » 02 Sep 2020, 10:26

blurfreeCRTGimp wrote:
31 Aug 2020, 17:03
I was thinking earlier today: couldn't a software developer make an app, sort of like the Blur Busters strobe utility, that could enable Oculus ASW on the desktop?
Theoretically you can do this, but there can be more artifacts (like interpolation artifacts).

Proper frame rate amplification technologies require a bit closer links to game engines, like controller movements and game engine intentions, so that motion vectors are genuine rather than faked.

For example, gaming mice are 1000Hz, and it's quite useful for a frame rate amplification technology to have access to as many movement vectors as possible, temporally, to produce additional frames indistinguishable from original, rendered-from-scratch frames.

Game engines need to integrate NVIDIA DLSS before it can be enabled. Likewise for Oculus' ASW 2.0: not all VR games support ASW. All the good emerging FRAT technologies integrate more with the source.

Game engine integration is hugely beneficial to artifact-free frame rate amplification. Eventually FRAT can get so good -- say, instead of just 50-75% of humans not noticing the artifacts, we could get >99% of humans (even videophiles) not noticing the artifacts -- that FRAT frames are just as good as original, completely-rendered frames.

Non-black-box FRAT (FRAT that integrates more into the engine) provides massive quality leaps -- it can be like the difference between an over-compressed YouTube video or old VGA cellphone video, and a lightly-compressed digital cinema file intended for theaters.

Standalone FRAT is possible -- it already exists for 2D-based media, as video interpolation -- but it has more disadvantages in latency and artifacts.
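As a hypothetical illustration of why genuine engine-supplied motion vectors matter (my own toy numbers, not any shipping FRAT): with the true per-object velocity, a generated in-between frame lands exactly where a rendered one would, while a vector estimated by frame differencing lags whenever the motion changes.

```python
# Hypothetical sketch: engine-supplied motion vectors vs. vectors estimated
# from frame differencing. Differencing assumes motion hasn't changed, so it
# overshoots on any direction change -- a classic interpolation artifact.

def inbetween_true(pos, velocity, dt=0.5):
    """In-between position using the engine's genuine motion vector."""
    return pos + velocity * dt

def inbetween_estimated(prev_pos, pos, dt=0.5):
    """In-between position using a vector guessed from the last two frames."""
    estimated_velocity = pos - prev_pos   # assumes motion hasn't changed
    return pos + estimated_velocity * dt

# Object was moving +10 px/frame, then the player flicked: now -10 px/frame.
pos, prev_pos, true_velocity = 100.0, 90.0, -10.0
print(inbetween_true(pos, true_velocity))   # 95.0 -- correct
print(inbetween_estimated(prev_pos, pos))   # 105.0 -- overshoots (artifact)
```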


elexor
Posts: 169
Joined: 07 Jan 2020, 04:53

Re: New term: "Frame Rate Amplification" (1000fps in cheap GPUs)

Post by elexor » 06 Sep 2020, 20:46

I'm surprised we don't see some kind of simple timewarping of camera movement in next-gen console games. Camera movement speeds are not that fast on a console; it can't be that hard to do. You could still have a game frame rate of 30fps with motion blur, and the camera movement could be timewarped to 120fps. That would make those games significantly more enjoyable. Right now, for those 30fps games, you have to move the camera around incredibly slowly for it not to turn the whole screen into a blurry, stair-steppy mess.

EeK
Posts: 28
Joined: 19 Jun 2019, 02:52

Re: New term: "Frame Rate Amplification" (1000fps in cheap GPUs)

Post by EeK » 22 Sep 2022, 22:02

I'm sure by now you all have heard about the new 40-series GPUs by Nvidia and, most importantly to the subject of this thread, the latest DLSS 3, which adds Optical Multi Frame Generation.

I'd love to hear the Chief's opinion on it, as he predicted this tech a long time ago, and it seems to be happening much sooner than he anticipated here.

Relevant quote:
Chief Blur Buster wrote:
12 Sep 2022, 00:06
We’re actually seeing elements of frame rate amplifiers born out of research as we speak — in a left-field area. DLSS is using neural-network AI, as you already know — and AI is rapidly improving.

You’ve seen those AI art-engines (DALL-E, Midjourney, et al). DLSS 5.0 on steroids will probably be realtime inpainting those in-between frames. I heard a rumor that the top end of the RTX 4090 is capable of generating AI art in just 1-2 seconds, and DLSS is a much simpler operation than AI art. Soon, we can generate “AI-art-movies” at realtime framerates in a couple or more GPU generations later — I would predict.

Virtual-reality reprojection algorithms convert 45fps feedstock into a 90fps view, eliminating stutters without adding interpolation latency. So the technology is here today, if you own a recent VR headset.

Now in the future …

GPU vendors can redirect some of the new ‘AI skills’ from AI-art-processing capability called “inpainting” (and similar algorithms) as a turbocharged DLSS with a 5:1 or 10:1 frame rate amplification ratio, possibly combined with temporally-dense raytracing algorithms too (NVIDIA cited my TestUFO in their research paper). These are all exciting developments.


Essentially, a merger between NVIDIA DLSS + Oculus ASW + subvariants of AI-art techniques like AI-based inpainting algorithms will make the 1000fps frame rate amplifier a reality within a decade or so — on a single commodity GPU.

That said, the silicon manufacturing shortage and geopoliticals (TSMC…) may set frame rate amplification back a few years. But UE5-quality 1000fps should become practical eventually.

A frame rate amplifying co-processor (or dedicated silicon) may be required to keep the GPU workload down. Given the power requirements of GPUs, it may need to be done as a chiplet approach (bandwidth…), but it should work as long as the AI engine has access to 1000Hz controller inputs as well as the Z-buffer, to generate parallax-correct (artifact-free) AI-based frame rate amplification.
