LFC full range as HighFrameRate Blur Reduction?

Advanced display talk, display hackers, advanced game programmers, scientists, display researchers, display manufacturers, vision researchers & Advanced Display Articles on Blur Busters. The masters on Blur Busters.
elvn
Posts: 4
Joined: 18 Oct 2022, 08:21

LFC full range as HighFrameRate Blur Reduction?

Post by elvn » 18 Oct 2022, 08:54

I was wondering whether operating on a base fps of, say, 120fps to 125fps, or 200fps, with LFC treating that whole range as "low" frame rate, would be viable (at least someday, tech-wise) in order to get very high fps on an up-to-1000Hz OLED.

That is, taking ~125fps (at 4k, 10bit HDR, slight dsc perhaps over hdmi 2.1 if necessary) x 8, or 200fps x 5 to 250fps x 4 over dp 2.0, to achieve 1000fpsHz on a 1000Hz OLED.

I realize that LFC (in this scenario, 200fps and below is considered "low") is not interpolating or inserting frames, but is instead repeating them or their refresh. But I'd like to know whether this would be theoretically viable as a way to take a healthy base frame rate (motion articulation/definition wise) and then increase motion clarity on the display end, in order to bypass the bottleneck of ports and cables and "fatten up" the signal on the display side.
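For what it's worth, the multiplication the idea depends on is easy to enumerate; a quick Python sketch (a hypothetical helper, nothing vendor-specific):

```python
def repeat_factors(target_hz=1000, min_fps=100, max_fps=500):
    """Base frame rates in [min_fps, max_fps] that reach target_hz
    with whole-frame repeats, LFC-style (repeats, not interpolation)."""
    return {fps: target_hz // fps
            for fps in range((min_fps), max_fps + 1)
            if target_hz % fps == 0}

print(repeat_factors())
# 125fps x 8, 200fps x 5, 250fps x 4 (and 100fps x 10, 500fps x 2)
# all land exactly on 1000Hz with whole-frame repeats
```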

Additionally perhaps AI upscaling hardware on the display end as well once we get 8k screens in order to upscale 4k to 8k, again bypassing the port+cable bandwidth bottleneck of sending 10bit 8k 1000fps.

. .

4k 10bit 200fpsHz = ~ 59.72 Gbps.

4k 10bit 250fpsHz = 74.65 Gbps

. .

DisplayPort 1.3–1.4 = 25.92 Gbit/s

HDMI 2.1 = 41.92 Gbit/s

DisplayPort 2.0 = 77.37 Gbit/s

. . . . . . . . . . . . . . . . . .

3840 x 2160 (4K) at 500fpsHz = 4,147,200,000 pixels/second. 12-bit: 174.18 Gbps, 10-bit: 149.30 Gbps

3840 x 2160 (4K) at 1000fpsHz = 8,294,400,000 pixels/second. 12-bit: 348.36 Gbps, 10-bit: 298.60 Gbps

7680 x 4320 (8K) at 500fpsHz = 16,588,800,000 pixels/second. 12-bit: 696.73 Gbps, 10-bit: 597.20 Gbps

7680 x 4320 (8K) at 1000fpsHz = 33,177,600,000 pixels/second. 12-bit: 1,393.46 Gbps, 10-bit: 1,194.39 Gbps
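As a sanity check on the arithmetic above, here is a tiny Python sketch (a hypothetical helper, not from any spec) that computes the raw active-pixel payload. The figures quoted above run about 20% higher than this raw payload, so they appear to include blanking/transport overhead:

```python
def raw_bandwidth_gbps(width, height, fps, bits_per_channel, channels=3):
    """Uncompressed active-pixel payload in Gbit/s. Excludes blanking
    and link-encoding overhead, so real signal rates run higher."""
    return width * height * fps * bits_per_channel * channels / 1e9

# 4K 10-bit at 1000 fps, raw payload only:
print(round(raw_bandwidth_gbps(3840, 2160, 1000, 10), 1))  # 248.8
```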

We could use DSC 2:1 rather than 3:1 and get some reductions, but it again wouldn't be a pure native result anymore, so let's put that aside for the moment.

. . . . . . . . . . . . . . . . . .

VR is also going to need some serious frame duplication, beyond what it is doing today, in the longer-term outlook, whenever VR/MR/AR display resolution gets high enough per eye to actually achieve decent PPD. Some of the best VR headsets now are only around 30 to 32 PPD, and that is only in the very center. They also have to run two separate screens, so once they get to very high resolution, say 60 to 80 PPD, the combined resolutions' bandwidth is going to be crazy.

. . . . . . . . .

=============================================================

The things in quotes are from blurbusters.com
LFC simply tries to predict a repeat-refresh to occur between two frames. And as you see in high speed video of LCD refreshing, www.blurbusters.com/scanout -- it takes 1/240sec to refresh all pixels (in a top-to-bottom fade sweep) = a monitor busy for 4.2 milliseconds refreshing a single 240Hz refresh cycle. This will remain constant at lower refresh rates on a VRR monitor, so even at 50 frames per second, the monitor still only needs 1/240sec to refresh.

LFC algorithms are very reliable with steady low frame rates, because it's easy to predict a repeat-refresh right in between. A repeat-refresh in an ideal situation is a no-operation (you see no visible effect on the screen because an image is being replaced by a duplicate image, so you can't tell LFC from non-LFC)

But LFC fails when frametimes vary a lot, so sometimes the repeat refresh starts, then the game finishes rendering a frame, and then suddenly the game is waiting for the monitor to finish repeat-refreshing (an old frame) before it can display the new frame. Thus, stutter. The good news is that this becomes less at higher Hz.

The LFC collision window is always one max-Hz refresh cycle (a frame-finish-rendering being forced to wait for a monitor still busy repeat-refreshing). So the higher the VRR Hz, the smaller the LFC collision window is. On a 48Hz-240Hz VRR monitor, the LFC frame-vs-rerefresh collision window creates a maximum of 4.2ms (1/240sec) stutter in the worst-case scenario. The average LFC collision will be the halfpoint of that since the stutter error will be between [0...4.2ms]. Now if you got a lower maximum Hz such as 144Hz, your LFC collision window would be 6.9ms (1/144sec), so LFC stutters are worse on a 144Hz monitor than on a 240Hz monitor. So, if you're worried about LFC stutter, make sure your max-Hz is higher to compensate.

Now if you buy that new 360 Hz monitor (future model, not sure of VRR range), and if it uses LFC algorithms (both NVIDIA and AMD use similar algorithms now), stutters from LFC algorithms on 360Hz will be at most a 2.8ms stutter (2.8 pixel stutterjump at 1000 pixels/sec motion) at worst case, but a random number between [0...2.8ms] would be only 1.4ms stutter average (1.4 pixel stutterjump at 1000 pixels/sec motion). At this point, without a strobe backlight, this begins to become hidden in low-framerate stutter, since 48fps at 1000 pixels/sec creates (1000/48) = 20.8333 pixels of motion blurring, or 20.8333 pixels of objectjump.

"On a 48Hz-240Hz VRR monitor, the LFC frame-vs-rerefresh collision window creates a maximum of 4.2ms (1/240sec) stutter in the worst-case scenario."

" The average LFC collision will be the halfpoint of that since the stutter error will be between [0...4.2ms]."

" Now if you got a lower maximum Hz such as 144Hz, your LFC collision window would be 6.9ms (1/144sec), so LFC stutters are worse on a 144Hz monitor than 240Hz monitor."
1000Hz display: 1/1000sec = MAXIMUM of 1ms error, 1px at 1000 pixels/sec, which is negligible (just like 1px of sample-and-hold blur is on a CRT). Average would be half of that, ~0.5ms <----- ????
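The collision-window arithmetic from the quotes (and the 1000Hz extrapolation above) fits in a few lines of Python (hypothetical helper name, for illustration only):

```python
def lfc_collision(max_hz, motion_pps=1000):
    """Worst-case and average LFC collision stutter, per the quotes:
    the collision window is one max-Hz refresh time, and the average
    error is half of that. Also returns the worst-case pixel jump
    at the given eye-tracked motion speed (pixels per second)."""
    worst_ms = 1000.0 / max_hz
    avg_ms = worst_ms / 2.0
    worst_px = motion_pps * worst_ms / 1000.0
    return worst_ms, avg_ms, worst_px

print(lfc_collision(240))   # ~4.2 ms worst, ~2.1 ms average, ~4.2 px
print(lfc_collision(1000))  # 1.0 ms worst, 0.5 ms average, 1.0 px
```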

. . .
So with a HUMONGOUS variable refresh rate range, e.g. 48...360, then the LFC stutters completely fall into the noisefloor of low-framerate stutter! Y'know (with proper drivers & proper LFC algorithm) even 1.4ms stutter error being completely lost in 20.8 pixel stutter at 48 frames per second. Big whooooop-deeee-do. The virtue of a massive VRR range works in the favour of LFC!

To play it safe, please stick to high-rated VRR. There are artifacts of cheap uncertified VRR (generic adaptive sync with no AMD or NVIDIA certifications) that can look worse than LFC artifacts.
LFC becomes unnoticeable with wide VRR ranges like "48Hz-360Hz" instead of "48Hz-120Hz"

  • TL;DR:

    - LFC doesn't add any stutter if you have consistent low framerate (like a perfect 30fps movie)

    - LFC can worsen stutter for volatile low framerates (frametimes varying frequently across refreshtime of min-Hz).

    - LFC stutter error scales with the max-Hz refresh time: LFC stutter error (in milliseconds) averages out to half the duration of a max-Hz refresh cycle (e.g. 2.1ms for 240Hz).

    - Thusly, LFC becomes unnoticeable with wide VRR ranges like "48Hz-360Hz" instead of "48Hz-120Hz".

    - Thusly, if worried about LFC stutter: framepace your low framerates well to help LFC work better, and get the biggest VRR range you can afford.

    - Premium VRR (G-SYNC certification and higher-end FreeSync) is worth it for other reasons than LFC too, but depends on goals.


. . .

I would bet smarter LFC algorithms will "watch" the framerate range rather than instantaneous framerate, and simply go into permanent-LFC even at higher Hz, e.g. 31fps or 32fps becomes LFC on a GSYNC "30-144" monitor if it is seeing framerates becoming volatile across the LFC boundary (i.e. 28-35fps with a 31fps or 32fps average). Basically, smart LFC algorithms watch the framerate range, and if the framerate valley falls into LFC range, then the whole range becomes temporarily perma-LFC to become flawless LFC because you cannot have a partially-LFC framerate range without LFC collisions. Then when the framerate range finally stays above the LFC floor, then LFC can deactivate completely seamlessly. That's how smarter LFC algorithms can avoid stutter for minor framerate volatility that "fuzzes across" the LFC boundary.
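The hysteresis idea described above could be sketched like this in Python (purely hypothetical pseudologic for illustration, not NVIDIA's or AMD's actual LFC algorithm): watch a window of recent frame rates, latch perma-LFC when the framerate valley dips below the VRR floor, and release only when the whole observed range sits safely above it.

```python
from collections import deque

class SmartLFC:
    """Hypothetical range-watching LFC: latch LFC on when any recent
    frame dips below the VRR floor; release only when the whole
    recent range sits comfortably above it (seamless exit)."""
    def __init__(self, vrr_floor_fps=48, window=60, margin=1.1):
        self.floor = vrr_floor_fps
        self.margin = margin              # release threshold above floor
        self.recent = deque(maxlen=window)
        self.lfc_active = False

    def observe(self, fps):
        self.recent.append(fps)
        if min(self.recent) < self.floor:
            self.lfc_active = True        # valley crossed floor: perma-LFC
        elif min(self.recent) > self.floor * self.margin:
            self.lfc_active = False       # whole range clear: deactivate
        return self.lfc_active            # in between: keep current state

# Volatile 28-35fps around a 30Hz floor keeps LFC latched on:
lfc = SmartLFC(vrr_floor_fps=30, window=10)
for fps in (35, 31, 28, 33):
    lfc.observe(fps)
print(lfc.lfc_active)  # True
```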


. . .

I was suggesting, or at least pondering, not only doing LFC (low framerate compensation), but also doing it on the high end with refresh/frame repeating, perhaps even with hardware on the displays themselves. Basically, they are all "low" frame rates by comparison to 1000fpsHz, so they could be compensated. Maybe a little frame insertion on the low end (even by one frame, for 2x the framerate) if you need it to get to ~200fps of motion definition in the first place, to buff up the motion definition aspect. The motion definition aspect of high fpsHz has diminishing returns after a point, though. I'm more focused (pun intended) on the blur reduction of much higher Hz ranges.

https://i.imgur.com/KlIRG0B.png

.... At least as I understand it, the blur reduction is more about the raw refreshes/redraws in order to "wipe" our retinas with a refresh cycle rather than the difference or uniqueness of the individual frames (kind of like if BFI operated on the same unchanged unique frame position of a scene more than once to maintain a consistent "shutter speed", or a crt redrawing at a fixed rate independent of the game frame rate).

https://blurbusters.com/high-speed-vide ... at-960fps/

User avatar
Chief Blur Buster
Site Admin
Posts: 11653
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: LFC full range as HighFrameRate Blur Reduction?

Post by Chief Blur Buster » 18 Oct 2022, 20:20

Great question worthy of being moved to the Area 51 Laboratory!

And in fact, one of our forum members tried something similar out... So here's my reply.

Based on what I read of your posts in other forums, you seem to already know most of this stuff, so I write this post for multiple readers. So forgive me, if I write researcher-vetted walls of text here...

Changing LFC-trigger thresholds can be done on an AMD card with ToastyX CRU FreeSync-range edits, so I can create 200Hz LFC on a 500Hz FreeSync panel!

The TL;DR Version

Display-Side VRR-LFC "Interpolation" As Attempted Motion Blur Reduction
This could work, but has some major black-box problems

(This answer is specifically for elvn, not other readers who often ask questions about LFC for blur reduction)
Basically, interpolation kicking in when going below a high LFC threshold. The problem is that for interpolation to become "perfect", it needs to know the geometry of what is shown on the screen (Z-buffer, 3D geometry). For frame rate amplification technologies to become more artifactless, they need access to some knowledge data (Z-buffers, 1000Hz controller data) to reduce artifacts. Oculus Rift ASW is superior to interpolation and less laggy because it "knows" the Z-buffer. But if you only transmit refresh cycles to the monitor, you don't have the extra metadata needed to make the frame rate amplification perfect. Frame rate amplification can include interpolation / extrapolation / reprojection / AI upsampling -- often an increasingly simultaneous combination of the above. I've already written an article about how frame rate amplification needs extra metadata such as Z-buffers (not transmitted over a DisplayPort cable), at www.blurbusters.com/frame-rate-amplification-tech ....

I even mention that a co-GPU can be added to the display theoretically, but you also would have to invent a new display signal that added lots of metadata to avoid artifacts caused by black-boxness of interpolation. You need to transmit a LOT of metadata (controller data for reprojection, Z-buffer data for eliminating parallax artifacts, etc) from the computer to the display, if you're wanting a more flawless frame rate amplification on the display side. This could still be a lot less bandwidth than 8K 1000fps 1000Hz, but optical fiber video cables can pull that off uncompressed, by the time we're ready. The problem is we're not yet sure how powerful a co-GPU needs to be built into the display for display-side frame rate amplification (which I mentioned as a theoretical "G-SYNC 2").

This is a fun thought exercise. All of this has not yet been tested out, but it is theoretically viable -- the question is whether it will be cheaper than a simple optical-fiber video cable. 8K 1000fps 1000Hz will require ginormous amounts of internal processing bandwidth, as the last vestiges of multicore processing are exhausted, so parallelization of the frame rate amplification pipeline becomes critical, e.g. shingled rendering on different cores (on-die SLI type techniques). But where should the engineering/expense go? We already know that an optical video cable can be enough for 8K 1000fps 1000Hz, so is it cheaper to do the fully computer-side GPU approach, or a GPU+coGPU approach?

We already have coGPU behaviors when doing AirLink streaming on Quest 2. For example, the PCVR might fall to 45fps, but streamed over AirLink, and the Oculus Quest 2 GPU does 3dof reprojection to 90fps successfully to eliminate nausea during headturns. So basically the Quest 2 is already doing on-VR frame rate amplification of an externally GPU-generated signal. So there's already precedent (yay) -- at least for low fixed-Hz frame rates!

In theory, the Z-buffer could be streamed too over the AirLink signal, in order to do 6dof reprojection, not just 3dof reprojection. In addition, this would be excellent for VR streaming (from the cloud), to feel lagless in body movements, regardless of streaming delay. Basically cloud GPU + on-headset reprojecting co-GPU.

The bottom line: To apply this to Ultra HFR on existing computer monitors, we need to invent a new display signal that includes a lot of the metadata required for more flawless and lagless frame rate amplification technologies (whether interpolation / extrapolation / reprojection). We would also need to invent new Vulkan APIs that put reprojection close to the metal, so games can communicate directly with the display (whether VR or non-VR) to allow FPS mouseturns at 1000fps even if the game is running at only 100fps. Reduce the combined transistor count of (GPU + coGPU + video cable transceivers).

That would be a fun napkin exercise... The local GPU can do part of the frame rate amplification and the remote (simpler) GPU can do the rest of frame rate amplification. Now that being said, you kind of want good AI autocomplete algorithms for filling-in parallax data, so it's also advantageous to make it the same GPU, but it then becomes a giant GPU (ala RTX 4090). So creative approaches will be needed for large-ratio frame rate amplification technologies.

Sample-And-Hold LFC As Attempted Motion Blur Reduction
Tested and confirmed useless, whether low frame rates or high frame rates (HFR)

LFC on sample-and-hold has the same motion blur as non-LFC, so we can skip that. We've actually tested that 100fps LFC and 100fps non-LFC look the same on current VRR displays, assuming there are no major overdrive differences (LFC can have different overdrive artifacts, but assume there's no overdrive appearance difference between different frame rates and refresh rates). So LFC has absolutely zero motion blur reduction benefit. You need to use frame rate amplification instead if you need to reduce the motion blur of low frame rates (that are too flickery for framerate=Hz strobing). There can be benefit to HFR-LFC if it eliminates overdrive artifacts, e.g. your monitor has fewer overdrive artifacts with LFC enabled at 100fps+ than without. But that doesn't eliminate motion blur, since MPRT100% can never become less than frametime on a sample-and-hold display, even if GtG=0.

Strobed LFC As Attempted Motion Blur Reduction
Tested and confirmed duplicate image artifacts, whether low frame rates or high frame rates (HFR).

Now, that leaves explaining the artifacts of LFC during strobing, which is akin to CRT 30fps at 60Hz, but applies to any frame rate below strobe rate. LFC creates multi-strobe artifacts. LFC is repeat refresh cycles, which all have their own unique strobes if you combine strobing+LFC.

Now for the long version, if you want to go beyond TL;DR and dive in the famous Blur Busters wall of texts.

The Long Version

(This is mostly for other readers, because I often get questions, so I'm centralizing my reply here, because people sometimes keep asking me if there's a way to add motion blur reduction to LFC!)

Using LFC at same time as VRR Strobing Creates Duplicate Image Effects

I can tell you that one of our forum members hacked a BenQ XL2546 to enable "VRR-DyAc" via a modified strobe backlight.

The problem is that LFC creates multi-strobe effects which creates duplicate images:

Image

This is even software-simulatable in a TestUFO animation (custom hacked TestUFO link). First, set a frame rate compatible with software-simulation of multi-strobing, to create the CRT 30fps-at-60Hz effect (it requires 4 sample-and-hold refresh cycles to do the double-image effect, and 6 sample-and-hold refresh cycles to do the triple-image effect). So use the highest refresh rate, turn off strobing, and click one of these two links:
If your display is only 60-165Hz, you will easily see a double image effect at 960pps:


Unfortunately LFC (software-based via driver for FreeSync) starts creating annoying duplicate image effects as LFC enables during strobed hacked DyAc-FreeSync-VRR

Here's the forum member's hacked BenQ XL2546:
[Monitor Electronics Hack] World first zowie XL2540 240hz 60hz singlestrobe Experiment with VRR-DyAc!

He became so annoyed by the duplicate image effects because of LFC sticky behaviors. His hacked XL2540 could single-strobe at 60Hz DyAc-VRR, but sometimes, even though the VRR range ended at a 48Hz minimum, LFC stayed activated until frame rates returned well above 60fps (sticky behavior). So this user programmed a utility to compensate for the annoying strobed-LFC duplicate-image effect:
nvLFCreset an experimental nvapi trayapp to prevent LFC sticking for gsync compatible monitors

Note: On this hacked XL2540 with user-hacked "DyAc-VRR", the Blur Reduction was disabled in the monitor menus, and he added a hardware-based strobing hack by modifying the backlight electronics. What he found was that repeat refresh cycles (from LFC) created duplicate images from multi-strobing.

Duplicate Images Effect is Confirmed Universal For All Kinds Of Multi-Strobing

During eye tracking, all kinds of multi-strobing of the same frame create duplicate image effects:
- Low frame rates (e.g. CRT 30fps at 60Hz)
- Strobed LFC
- bad strobe crosstalk (GtG too slow to erase refresh cycle, so it's defacto repeat refresh)
- PWM dimming
- Even software-based BFI!

All of them.
All tested.
All create duplicate images.
It's a law of physics issue.

Your eyes are analog trackers -- your eyeballs are in different position whenever the unchanged frame is repeat-flashed -- stamping a duplicate copy (by persistence of vision) as you track eyes on a multi-strobed object.

Conclusion

Good Blur Reduction Requires
1. Stroberate=Framerate=Hz.
2. If not strobing, keep frametime=refreshtime short.

4ms strobe flashes (framerate=Hz, any Hz) has same motion blur as 250fps 250Hz sample-hold
2ms strobe flashes (framerate=Hz, any Hz) has same motion blur as 500fps 500Hz sample-hold
1ms strobe flashes (framerate=Hz, any Hz) has same motion blur as 1000fps 1000Hz sample-hold
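That persistence equivalence is just one division; here is a quick Python sketch of it (hypothetical helper name):

```python
def equivalent_sample_hold_hz(pulse_ms):
    """A single strobe flash of pulse_ms (framerate=Hz, any Hz) gives
    the same persistence blur as sample-and-hold at 1000/pulse_ms."""
    return 1000.0 / pulse_ms

for pulse in (4.0, 2.0, 1.0):
    hz = equivalent_sample_hold_hz(pulse)
    print(f"{pulse} ms strobe == {hz:.0f} fps {hz:.0f} Hz sample-and-hold")
```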

Strobing is a band-aid, but you need to strobe only once per frame to avoid duplicate image effects. So low frame rates (e.g. 25fps) become super-flickery.

That's why for strobeless blur reduction (sample-and-hold) you need giant geometric upgrades, e.g. 60Hz -> 144Hz -> 360Hz -> 1000Hz (at GtG=0), to feel major upgrades. Much like it's hard to tell apart a 4K-vs-5K display, but easy to tell apart a 1080p-vs-4K display spatially. The same problem is also true temporally for Hz.

Even a 480Hz strobed display amplifies phantom array effects during LFC strobing:
The Stroboscopic Effect of Finite Frame Rates.

Don't forget that display motion blur behaves very differently if your eyes are stationary versus moving. Here's a TestUFO Custom Variable Speed Eye Tracking Demo -- stare at the 2nd UFO on any sample-and-hold display for at least 15 seconds while it bounces from stationary slowly to full speed and decelerates to stationary. That's the motion blur continuum, where stutters-blends-to-blur and blur-blends-to-stutter.

Persistence motion blur (of high sample-and-hold frame rate) and sample-and-hold stutter (of low sample-and-hold framerate) is the same thing. The stutter-to-blur continuum is also visible in VRR ramping animations too, www.testufo.com/vrr ...

Doubling Hz+fps halves motion blur without needing strobing. (This assumes GtG=0, then this automatically means MPRT100%=frametime=motionblur .... simple math! Motion blur is THE frametime, and frametime is THE motion blur, when it comes to sample-and-hold physics, when removing the LCD GtG error margin). Nonzero GtG is part of why 240Hz-vs-360Hz is only a 1.1x difference instead of a 1.5x difference (only occurs at GtG=0). That's why I am a huge fan of high-Hz OLEDs and MicroLEDs, you can double Hz and frame rate to exactly perfectly halve motion blur, and keep doing it.

To match LightBoost (2ms MPRT) without strobing, you need 500fps 500Hz OLED or MicroLED, framerate=Hz, GtG=0

To match Oculus Quest 2 (0.3ms MPRT) without strobing, you need 3333fps 3333Hz OLED or MicroLED, framerate=Hz, GtG=0

Strobing eliminates the need for ultra high frame rates, but strobing even an ultra high frame rate still causes problem. 1000Hz strobing still produces a phantom array effect if your resolution and motion speed is fast enough. 8000 pixels/sec at 1000Hz strobe on a 1000Hz 8K display creates duplicate images spaced every 8 pixels apart.

We actually have access to experimental 1000Hz displays, and we were able to reliably extrapolate perfectly. Enough experiments have confirmed that 1000Hz is definitely not retina refresh rate once you've got enough resolution and enough FOV -- this is caused by the Vicious Cycle Effect, where extra resolution and FOV amplifies Hz limitations.

In VR, you could even use an eye-tracker and add a GPU motion blur effect to blur the difference between the eye-motion vector and moving-object vectors; that way you can solve all 4 combos (stationary/moving gaze/objects) concurrently, assuming you kept pulse widths smaller than 1/(maximum pps of eye-trackability) MPRT. Then you've retina'd everything except the flicker of strobing, while still using lower frame rates and strobing. But that is DOA for multiple-viewer external displays.

While 1000Hz might be a retina resolution for a 24" 1080p display viewed from 4-8 feet, the retina resolution goes up when angular resolution goes up (up to spatial retina resolution), especially if the display is wide enough in pixels to allow you to track eyes long enough on the object -- 8000 pixels/sec on an 8K display takes 1 second to scroll horizontally. That's easier than 8000 pixels/sec on a 1080p display, the object moves too fast to eyetrack! So that's why higher resolutions raises the retina refresh rate.

There are two different pixel response measurements, GtG versus MPRT. Motion blur is caused by both GtG and MPRT. You can zero-out GtG, but you still have MPRT motion blur left over. At GtG=0, MPRT100% can never be less than frametime on sample-and-hold display. And MPRT100% is pulse width time on strobed displays assuming single strobe; shorter strobes reduce motion blur. That's why NVIDIA ULMB still has motion blur at www.testufo.com/map at 3000 pixels/sec unless you reduce pulse width to 0.5ms MPRT instead of default 1ms MPRT. (Good self-proof of human-visible 0.5ms-vs-1.0ms MPRT by seeing for yourself in TestUFO). And if you multi-strobe (even by LFC), you've got duplicate images to worry about.

Remember you have 4 situations to try to eliminate motion blur or duplicate image effects:
8K 1000fps 1000Hz doing 8000 pixels/sec on 0ms GtG SAMPLE-AND-HOLD
- 8 pixels stroboscopic-stepping during stationary-gaze
- 8 pixels motion blur during MOVING-gaze (example: www.testufo.com)

8K 1000fps 1000Hz doing 8000 pixels/sec on STROBING
- 8 pixels stroboscopic-stepping during stationary-gaze
- No motion blur, no stroboscopic stepping during MOVING-gaze*

8K 250fps LFC 1000Hz doing 8000 pixels/sec on STROBING
- 32 pixels stroboscopic-stepping during stationary-gaze
- 32 pixels stroboscopic-stepping during MOVING-gaze

Ooops. LFC made strobing worse. Ouch.

Confirmed math, guaranteed extrapolatable:
(8000/1000) = 8
(8000/250) = 32

*IMPORTANT: Motion blur stops being perceptible when MPRT is noticeably less than 1/pps. This applies to both strobed displays AND sample-and-hold displays. So for fast motion speeds like 8000 pixels/sec, which is more common on 4K and 8K displays, you need 1/8000sec MPRT = 0.125ms MPRT. Yup, the difference of 0.25ms MPRT versus 0.125ms MPRT now becomes human-noticeable when we're talking about future strobed 8K virtual-reality displays! That's why VR uses heavily sub-millisecond MPRTs today. Thanks to the Vicious Cycle Effect, where more resolution amplifies MPRT limitations and Hz limitations, again.
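The same MPRT arithmetic can be sketched in Python (hypothetical helper name): persistence blur width in pixels is simply motion speed times persistence time.

```python
def blur_px(mprt_ms, motion_pps):
    """Persistence blur width in pixels: motion speed times MPRT."""
    return motion_pps * mprt_ms / 1000.0

# 8000 px/s motion: a 1.0 ms pulse leaves 8 px of blur, while
# 0.125 ms (= 1/8000 sec) gets it down to the 1-pixel threshold.
print(blur_px(1.0, 8000))    # 8.0
print(blur_px(0.125, 8000))  # 1.0
```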

And realize that motion blur physics (and duplicate-image physics) behave differently with stationary gaze, versus moving gaze, largely because of a displays' finiteness of a refresh rate.

And multi-strobing a frame rate (even by LFC) simply chops-up the same motion blur, so you've created a spread of duplicate images in the same space as the original motionblur.

Fixing problems for one situation (e.g. stationary gaze) can create new problems for the other (e.g. moving gaze). It's very whac-a-mole, and the only universal whac-all-mole solution is ultrahigh frame rates at ultrahigh refresh rates, to fix ALL stroboscopics simultaneously with ALL motion blur, in a 100% perfectly flicker-free manner and perfectly duplicate-image-free manner.

For 240fps (Even w/LFC) at 480Hz, 2000 pixels/sec will have 2 duplicate images spaced every 4 pixels
For 120fps (Even w/LFC) at 480Hz, 2000 pixels/sec will have 4 duplicate images spaced every 4 pixels
For 60fps (Even w/LFC) at 480Hz, 2000 pixels/sec will have 8 duplicate images spaced every 4 pixels

For 240fps (Even w/LFC) at 480Hz, 4000 pixels/sec will have 2 duplicate images spaced every 8 pixels
For 120fps (Even w/LFC) at 480Hz, 4000 pixels/sec will have 4 duplicate images spaced every 8 pixels
For 60fps (Even w/LFC) at 480Hz, 4000 pixels/sec will have 8 duplicate images spaced every 8 pixels
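A quick Python sketch of the arithmetic behind these tables (hypothetical helper name; spacing comes out fractional and the tables above round it):

```python
def strobed_lfc_artifacts(fps, strobe_hz, motion_pps):
    """Duplicate-image count and spacing when a frame is multi-strobed
    (e.g. LFC repeats under strobing), matching the tables above."""
    duplicates = strobe_hz // fps        # strobes per unique frame
    spacing_px = motion_pps / strobe_hz  # pixel gap between copies
    return duplicates, spacing_px

print(strobed_lfc_artifacts(240, 480, 2000))  # 2 duplicates, ~4 px apart
print(strobed_lfc_artifacts(60, 480, 4000))   # 8 duplicates, ~8 px apart
```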

So, alas, we're SOL on reducing the motion blur of low frame rates without flicker. I even explained this five years ago in Blur Busters Area 51; scroll down to the 1000Hz Journey article.

I love making big replies to these sorts of questions because we've actually tested them. Blur Busters has helped researchers/manufacturers/vendors research this sort of stuff over the last 5 years, and I'm now in 25 peer-reviewed research papers. For more of that fun, see www.blurbusters.com/area51 as the Coles Notes.

I love these kinds of questions.

Does my reply or any Area51 articles answer your question?
If not, let me know and I'll be happy to expand.

Does my reply create new questions?
If yes, I'll be happy to explain them, and give custom TestUFO links where available.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter

Image
Forum Rules wrote:  1. Rule #1: Be Nice. This is published forum rule #1. Even To Newbies & People You Disagree With!
  2. Please report rule violations If you see a post that violates forum rules, then report the post.
  3. ALWAYS respect indie testers here. See how indies are bootstrapping Blur Busters research!

User avatar
Chief Blur Buster
Site Admin
Posts: 11653
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: LFC full range as HighFrameRate Blur Reduction?

Post by Chief Blur Buster » 18 Oct 2022, 21:57

Edited above post to cover more bases including the Co-GPU interpolation approaches.

Since the subject line is generic LFC blur reduction (unspecified), I wanted to cover all bases for my Area51 readers...

elvn
Posts: 4
Joined: 18 Oct 2022, 08:21

Re: LFC full range as HighFrameRate Blur Reduction?

Post by elvn » 19 Oct 2022, 09:37

Thanks for the reply.

I'm all for research vetted walls of text and refreshing (pun intended) my understanding of things or outright teaching me things. In fact I'm going to re-read that and the linked articles a few times. :geek:

. . . . . . . .

I know many have a strong interest in strobing/BFI, but in the type of scenarios I was outlining there would be no backlight strobing or BFI in play, just additional redraws per frame. I realize that, as you outlined, this would ordinarily give the same image persistence when compensating frames as when not compensating frames, unless frame amplification/frame insertion or time warp is used.


The scenario I am interested in is a raw GPU-driven 250fps, x4 via +3 "LFC" redraws per frame, on a theoretical 1000Hz OLED (with OLED response time allowing 1000Hz someday).

[In that hypothetical, perhaps the LFC, or "FC", could be a hardware chip on the display itself in order to bypass the cable and port bottleneck, as I mentioned and as you noted in your replies has already been examined some in a referenced link. While fiber cables can handle the bandwidth as you said, the progression of port bandwidth is pretty slow imo. OLED response time is already fast enough for 1000Hz, at least on the response time facet, even if in the near future it would have to use manufactured frames/frame amplification tech to reach those heights (esp. to enable things like ray tracing) rather than waiting much longer on raw GPU power advancement.]

. . . . . . .

....Could some threshold of difference between redrawn refreshes enable LFC to lower persistence blur? If the next redraw were a near duplicate rather than an exact repeat, what % difference between redraws or scene "cells" would be required at 4k 10bit 250fps x4 (via FC +3 per frame) at 1000Hz?

...Could some sort of pixel-shift type technology (perhaps even on a smaller % of the screen's pixels) be enough to make our eyes see "High" Frame Compensation redraws operating on '250fps-to-1000fpsHz' as unique, wiping the previous frame's persistence, without having to resort to full AI hardware-interpolated frame insertion mapped more logically between frame states (based on vectors), or VR's own vector/head-movement (and even eye-tracking) time warp directional prediction? Or would it result in bad artifacts, even at a small shift and at very high speed?


That is, 250fps on a 1000Hz OLED gaming display, operated on x4 via +3 LFC-style redraws per frame, adding just enough pixel shifting that the 3 extra draws per frame trick our eyes into seeing them as unique enough to wipe the previous frame's persistence, hopefully reducing blur at 1000Hz as a tradeoff. Even if the positioning didn't flow perfectly or the vectors didn't match up exactly anymore: "imperfect frame amplification" that doesn't match up, for testing's sake. If that pixel shift were a very small displacement in pixels, it might still be worth the tradeoff overall at the resulting in-effect 1000fpsHz. If that would work to get some more blur reduction, even if not congruent with the movement, I wonder how much shift would be necessary with those parameters?

If it would work in the first place: how small a percentage of the screen's pixels would have to be shifted, and by how much?
. . 100%?
. . or a lower-resolution version of the frame, pixel-shift-mapping wise
. . or just a best-tested pixel grid pattern
. . or maybe a set of different cycling pixel-shift grid patterns per redraw

I wonder how far off, in 4k pixels or as a %, the imperfect +3 redrawn pixel-shifted/pixel-grid-shifted refreshes would be from the game's rendered "frame 1 and frame 5 per cycle":
- compared to +3 interpolated/AI-inserted frames at the current level of the tech (to our eyeballs when viewed at the in-effect 1000fpsHz speed)
- or compared to the sample-and-hold persistence in pixels we'd otherwise see at a flat 250fpsHz (~4 ms of persistence)

And could a lower-resolution map or grid of pixels be shifted, as suggested above, complicating or diluting the comparison?

Doubling the frame rate cuts image persistence in half, so quadrupling it the normal way would cut persistence from 4 ms to 1 ms at 1000fpsHz. But then you'd have to factor in the pixel-shift percentage (100% of pixels shifted per redraw by some amount, or different pixel-shift grid patterns with various percentages shifted by some amount).
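That arithmetic can be sketched in a few lines (a hedged illustration of sample-and-hold persistence math only; the 1000 px/sec tracking speed is my own example number, not from this thread):

```python
# Sample-and-hold persistence: each frame is held on screen for one full
# frametime, so eye-tracked motion smears across (speed x frametime) pixels.

def persistence_ms(fps: float) -> float:
    """Frame hold time in milliseconds for a full-persistence display."""
    return 1000.0 / fps

def blur_px(speed_px_per_sec: float, fps: float) -> float:
    """Approximate motion blur trail width, in pixels, for tracked motion."""
    return speed_px_per_sec * persistence_ms(fps) / 1000.0

for fps in (250, 500, 1000):
    # At 1000 px/sec: 250fps -> 4 px of blur, 500fps -> 2 px, 1000fps -> 1 px
    print(fps, persistence_ms(fps), blur_px(1000, fps))
```

This is why quadrupling 250fps to an in-effect 1000fpsHz would cut the blur trail from ~4 px to ~1 px at that tracking speed, assuming each redraw is perceived as a unique frame.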

Or is a "directionless" pixel shift (or grid map of pixel shifts) of the subsequent redraws/frames/cells not enough to wipe the previous image's persistence and reduce blur by itself?

Would it look like artifacting, blinking, or stutter, while still not smearing/blurring as much at the resulting 1000fpsHz of draws + redraws? Could you read text while moving the viewport at speed at the in-effect 1000fpsHz (even if the motion stuttered or otherwise artifacted rather than blurred)? Would there be any way to reduce potential artifacts?

. . . . . . .

TLDR: If this is even theoretically possible in the first place, would any potential gains from this type of idea be too small overall compared to just running 250fpsHz (~4 ms of persistence blur), due to the pixel displacement (or grid of displacements) itself and any potential artifacting, even if the redraws achieve an in-effect 1000fpsHz?

E.g. you potentially reduce your persistence distance in pixels by comparison, but to do it you have to displace a map of pixels 3 times per cycle by some number of pixels, while in effect transmitting everything much faster to your eyeballs. Theoretically, the car-makeover guy says "I put some fast pixel-displacement grids in your pixel displacement to reduce your pixel-persistence displacement." Fight fire with fire?

. . . . . .

I could be grasping at straws, theoretically, for marginal or inconsequential (non-BFI) blur-reduction gains, if any. Or worse: a badly stuttering, smearing, multi-image-artifacting or jerky end result. But I find it interesting, so any questions you could answer are greatly appreciated.

I do have a basic understanding of AI frame insertion, VR time warping, etc., at least conceptually, and I realize how useful they are now and will be in the future. But I am curious about ways things might have been done a little sooner and maybe simpler, perhaps with less heavy hardware dependence, while bypassing the port-bandwidth bottleneck. And generally just curious for curiosity's sake.


Note there would be no typical low 30-to-48fps LFC range being operated on in these scenarios. The whole range is considered "low" here compared to 1000fpsHz. So in this scenario:
250fps solid, frame-compensated x4 (+3 redraws per frame) to fill 1000Hz.

User avatar
Chief Blur Buster
Site Admin
Posts: 11653
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: LFC full range as HighFrameRate Blur Reduction?

Post by Chief Blur Buster » 21 Oct 2022, 18:02

elvn wrote:
19 Oct 2022, 09:37
....Could some threshold for difference between redrawn refreshes enable LFC to lower persistence blur? If the next next - now near duplicate rather than exact frame redraw happens-, what % difference between redraws or scene "cells" would be required at 4k 10 bit 250fps x4 (via FC +3 per frame) at 1000Hz?
Can you rephrase this question? Maybe split into multiple shorter grammatically correct sentences, with clearly defined variables. I'm unable to parse this one.
elvn wrote:
19 Oct 2022, 09:37
...Could some sort of pixel-shift type technology (perhaps applied to only a smaller % of the screen's pixels) be enough to make our eyes see "High" Frame Compensation redraws operating at '250fps-to-1000fpsHz' as unique, wiping the previous frame's persistence, without having to resort to full AI hardware-interpolated frame insertion mapped more logically between frame states (based on vectors), or VR's own vector/head-movement (and even eye-tracking) time-warp directional prediction? Or would it result in bad artifacts, even at a small shift and at very high speed?
Pixel-shift techniques are called reprojection. They're already used by most VR headsets, and they work wonderfully.

ASW 2.0 added Z-buffer awareness, to pixel-shift foreground objects at different steps than background objects.

I wish reprojection/warping (pixel-shifting) was more common in non-VR use cases, to allow pans/scrolls/turns to stay smooth even during major framerate drops. I'd love to see 125fps "reprojected" to 500fps on a 500Hz display.

I'm sure the RTX 4090 has enough bandwidth for it, if the game will co-operate with the reprojector (e.g. a high-pollrate mouse synchronized to the reprojection logic during panning / turning / scrolling); this requires native APIs in the operating system or drivers. The DLSS 3.0 method could also be used for the parallax-reveal pixels, to reduce reprojection artifacts along the edges of objects. So reprojection would be used for most pixels, and DLSS 3.0 only for the parallax-reveal pixels -- in theory.

But that may be a hugely variable GPU load, due to the varying amounts of parallax-reveal pixels.
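For illustration, here is a toy 2D translational reprojection (my own minimal numpy sketch, not Oculus/ASW code; real warps use depth buffers, rotation, and parallax infill rather than a flat shift, and the clamped edges below are exactly where the parallax-reveal problem shows up):

```python
import numpy as np

def reproject_translate(frame: np.ndarray, dx: int, dy: int) -> np.ndarray:
    """Shift the last rendered frame by the camera pan delta (dx, dy) pixels.
    Revealed edges are clamp-filled (edge smear) -- the pixels a DLSS-style
    infill pass would have to fill in a real implementation."""
    h, w = frame.shape[:2]
    ys = np.clip(np.arange(h) - dy, 0, h - 1)  # source row for each output row
    xs = np.clip(np.arange(w) - dx, 0, w - 1)  # source column for each output column
    return frame[np.ix_(ys, xs)]

# Amplify 1 rendered frame into 4 displayed frames along a constant rightward pan:
rendered = np.arange(16).reshape(4, 4)
displayed = [reproject_translate(rendered, dx=k, dy=0) for k in range(4)]
```

Each displayed frame is the same rendered content nudged along the pan vector, which is the core of how reprojection amplifies frame rate without re-rendering.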

Now, also, it's best not to suddenly enable/disable reprojection during LFC, because that means sudden changes to display motion blur -- like motion blur suddenly halving every time the reprojector decides to add new frames. It's best to use the reprojector to keep the framerate constant, e.g. converting variable frame rates into a permanently consistent frame rate such as 1000fps at 1000Hz.

TL;DR: Pixel shifting is called "reprojection" and is already being done in virtual reality.

Reprojection is done to keep framerate=Hz strobing in VR.

Have you ever used a modern VR headset? The newest ones now have less motion blur than the best strobed LCDs I've seen, because of the superlative big-money optimization of display motion blur, at bigger budgets than for an average gaming monitor. It's shocking how much clearer VR displays are than even a 500Hz esports monitor.

Both Index and Quest 2 have 0.3ms MPRT, which would require 1000/0.3 = 3333fps at 3333Hz flickerfree sample-and-hold to match their motion clarity without strobing/flicker-based motion blur reduction methods.
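That conversion is just persistence arithmetic; a one-liner to sanity-check it (the function name is my own, not a Blur Busters formula name):

```python
def equivalent_sample_and_hold_hz(mprt_ms: float) -> float:
    """Flicker-free sample-and-hold refresh rate whose one-frametime
    persistence equals the given MPRT (persistence_ms = 1000 / fps)."""
    return 1000.0 / mprt_ms

print(round(equivalent_sample_and_hold_hz(0.3)))  # -> 3333
```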

Their motion clarity is even sharper than a CRT now. Some of us even use a VR headset to put a virtual computer monitor on our desks that is superior to gaming monitors in motion clarity; it actually works (motion is clearer on the virtual monitor sitting on a virtual desk while wearing the VR headset). And additional tricks like VorpX can add stereoscopic support like NVIDIA 3D Vision, so you can have a 3D image inside the virtual monitor frame.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter

Forum Rules wrote:  1. Rule #1: Be Nice. This is published forum rule #1. Even To Newbies & People You Disagree With!
  2. Please report rule violations If you see a post that violates forum rules, then report the post.
  3. ALWAYS respect indie testers here. See how indies are bootstrapping Blur Busters research!


Post by elvn » 23 Oct 2022, 07:27

Great informative answer as always. Thanks. It's too bad reprojection isn't being exploited on gaming monitors and TVs.

VR has some great things, but it has such a long way to go PPD-wise. It will get there someday: svelte MR glasses with screen-within-screen high PPD.


Post by Chief Blur Buster » 26 Oct 2022, 20:14

One interesting innovation in reprojection that has not yet been done is making sure the pre-reprojection framerate is above the flicker fusion threshold. That makes stutters in reprojection disappear!

For example, with Oculus Rift 45fps-to-90fps reprojection, certain things sometimes stutter (hand tracking at 45fps) while the background scrolls smoothly (at 90fps) during head turns.

But if we had 100fps reprojected to 500fps, then even physics objects like enemy movements would still look smooth, just more motion-blurred (due to frametime persistence) than turns (which benefit from reprojection-based frame rate amplification).

Not everything in the game world *needs* to run at the same framerate, if motion blur is acceptable for such movements.

Different things running at different frame rates on the same screen is very common with reprojection (Oculus Rift), but if all framerates could be guaranteed triple-digit, then no stuttering is visible -- just different amounts of persistence motion blur (if using reprojection on a non-strobed display). This is something I will write about in my sequel to the Frame Rate Amplification Article.


Post by Chief Blur Buster » 26 Oct 2022, 20:19

elvn wrote:
23 Oct 2022, 07:27
Great informative answer as always. Thanks. It's too bad reprojection isn't being exploited on gaming monitors and TVs.

VR has some great things, but it has such a long way to go PPD-wise. It will get there someday: svelte MR glasses with screen-within-screen high PPD.
One interesting innovation in reprojection that has not yet been done is making sure the pre-reprojection framerate is above the flicker fusion threshold. That makes stutters in reprojection disappear!

For VR, We Already Have Hybrid Frame Rates In The Same Scene

For example, with Oculus Rift 45fps-to-90fps reprojection, certain things sometimes stutter (hand tracking at 45fps) while the background scrolls smoothly (at 90fps) during head turns.

But if we had 100fps reprojected to 500fps, then even physics objects like enemy movements would still look smooth, just more motion-blurred (due to frametime persistence) than turns (which benefit from reprojection-based frame rate amplification).

Not everything in the game world *needs* to run at the same framerate, if motion blur is acceptable for such movements.

Different things running at different frame rates on the same screen is very common with reprojection (Oculus Rift), which is ugly when some of the framerates are below the stutter/flicker detection threshold.

But if all framerates could be guaranteed triple-digit and perfectly framepaced, then no stuttering is visible at all! Just different amounts of persistence motion blur (if using reprojection on a non-strobed display). This is something I will write about in my sequel to the Frame Rate Amplification Article.

Hybrid Frame Rates Stop Being Ugly if 100fps Minimum + Well Framepaced + Sample And Hold

Hybrid frame rates will probably be common in future frame rate amplification technologies, and should no longer be verboten, as long as best practices are followed:

(A) Low frame rates are acceptable for slow enemy movements, but keep it triple-digit to prevent stutter
(B) High frame rates are mandatory for fast movements (flick turns, pans, scrolls, fast flying objects, etc)
(C) If that's not possible, then add a GPU motion blur effect selectively (e.g. for a fast flying rocket running at only 100 frames per second, it's acceptable to motion-blur its trajectory to prevent stroboscopic stepping)

The frame rates of things like explosions could continue at 100fps to keep GPU load manageable, while things like turns (left/right) would use reprojection technology. The RTX 4090 should easily be capable of >500fps reprojection in Cyberpunk 2077 at its current 1 terabyte/sec memory bandwidth -- and this is low-hanging fruit just waiting to be picked by game developers!

In other words, don't use 45fps. Instead of 45fps-reproject-90fps, use min-100fps-reproject-anything-higher. Then frame rate amplification can be hybridized at different frame rates for different objects on the screen -- that becomes visually comfortable once every single object is running at triple-digit frame rates!
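The "keep the base rate triple-digit, reproject the rest of the way" policy could be sketched like this (names, threshold, and error handling are my own assumptions for illustration, not a published API):

```python
DISPLAY_HZ = 1000
RENDER_FLOOR_FPS = 100  # keep the pre-reprojection rate above stutter visibility

def amplification_factor(render_fps: float) -> int:
    """Whole number of displayed frames per rendered frame (1 real + N-1 reprojected)."""
    if render_fps < RENDER_FLOOR_FPS:
        # Per the post above: don't reproject from sub-triple-digit base rates;
        # reduce render quality to raise the base framerate instead.
        raise ValueError("base framerate must stay triple-digit")
    return max(1, int(DISPLAY_HZ // render_fps))

print(amplification_factor(250))  # -> 4 (+3 reprojected frames per rendered frame)
print(amplification_factor(125))  # -> 8
```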

Technically, reprojection could in theory be implemented as a parallel-layer API running between the game and the graphics drivers, much like how VR APIs do it -- except overlaid on top of non-VR games.

One major problem occurs when doing this on strobed displays -- sudden double/multi-image effects -- which requires a GPU motion blur effect (mandatory) to fix the image duplication (akin to CRT 30fps at 60Hz). However, this isn't as big a problem for sample-and-hold displays.
