
New term: "Frame Rate Amplification" (1000fps in cheap GPUs)


Re: New term: "Frame Rate Amplification" (1000fps in cheap GPUs)

Postby Chief Blur Buster » 15 Dec 2017, 00:08

Possibly, but I think there will still be strong economic pressures in the direction of FRAT. It will just take up to a decade to bear itself out.

Really good FRAT will still need lots of horsepower and silicon. You may still need to consume, say, 50% of a powerful future Titan to convert 100fps to 1,000fps. It may initially be a high-end-only feature.

Given sufficient GPU capability, it may even be accomplished by game engines instead - increasingly improved versions of Oculus timewarping - until AMD or NVIDIA one-ups the other by introducing GPU/driver-level FRAT.

There may be unconventional algorithms in play, too. Heck, maybe even realtime framerateless beamtracing algorithms that can be denoised to any frame rate instead.
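
To make the idea a bit more concrete, here is a minimal Python/NumPy sketch of one simple flavour of frame rate amplification - warping a single rendered frame along its per-pixel motion vectors into extra in-between frames. It is purely illustrative (not Oculus', AMD's, or NVIDIA's actual algorithm), and the array shapes, the amplify() helper, and the 10x factor are all assumptions:

```python
# Minimal sketch of motion-vector-based frame rate amplification (illustrative
# only). One rendered frame plus per-pixel motion vectors is warped into
# several synthetic in-between frames.
import numpy as np

def amplify(frame, motion, factor=10):
    """Warp one rendered frame into `factor` synthetic frames.

    frame:  (H, W, 3) rendered image
    motion: (H, W, 2) per-pixel motion in pixels per rendered frame (x, y)
    """
    h, w, _ = frame.shape
    ys, xs = np.mgrid[0:h, 0:w]
    synthetic = []
    for i in range(factor):
        t = i / factor  # fraction of the way to the next rendered frame
        # Backward warp: look up where each output pixel came from, clamping
        # at the image border (a real implementation would also handle
        # disocclusions, depth, and artifacts far more carefully).
        src_x = np.clip(xs - motion[..., 0] * t, 0, w - 1).astype(int)
        src_y = np.clip(ys - motion[..., 1] * t, 0, h - 1).astype(int)
        synthetic.append(frame[src_y, src_x])
    return synthetic

# 100 rendered fps x 10 = 1000 displayed fps
rendered = np.random.rand(120, 160, 3)   # stand-in for a real rendered frame
motion = np.full((120, 160, 2), 4.0)     # every pixel moving 4 px per frame
frames = amplify(rendered, motion)
print(len(frames), frames[0].shape)      # 10 (120, 160, 3)
```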

Re: New term: "Frame Rate Amplification" (1000fps in cheap GPUs)

Postby open » 16 Dec 2017, 12:04

BattleAxeVR wrote:The cynic in me thinks it's unlikely that big GPU manufacturers would back this type of shortcut to improving framerates, as it would cannibalize their sales.

For example, why pay for a 1080 ti that can hit an average of 50 fps on a 4K game, which then gets interpolated to 60 fps by some additional chip, when you can just buy a 1050 which can only manage 30 fps but has the same type of chip?

See the conflict of interest here? It's an incentive to not upgrade, or, when you do upgrade to a new-gen GPU, to pick a much cheaper model.

Sure, some people will pay more and turn off interpolation because they prefer "real frames" (if you work in games, you realize how "not real" most frames already are - e.g. temporal antialiasing and reprojection already synthesize "fake frames" from prior frames, which you see all the time).

But the smart bet is on them not being interested in providing this service, as it will ultimately hurt their bottom line. Heck, there are PS4 VR games which only try to hit 60 fps and rely on reprojection to reach 120 fps for display in the headset. If you can do that for 60 -> 120, you can do it for 30 -> 60 or 45 -> 60 as well, and if the quality is decent enough, who cares if the frames are real or fake? It's an academic and pedantic exercise. Modulo interpolation artifacts, which are being reduced all the time.

The latest Pixelworks interpolation engine on recent projectors like the Optoma UHD65 is apparently nearly 100% artifact-free. And latency is of course something people are working on as well. But if you can hit, say, 120Hz, then one or two frames of added latency is only 8-16ms, which is not very much. Currently these engines add 100ms or more of latency, but that figure is likely to drop over time.
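
A quick back-of-envelope sketch of those latency numbers; the refresh rates and buffer depths below are example assumptions only:

```python
# Added latency from buffering N frames for interpolation at a given refresh
# rate: latency_ms = N * 1000 / Hz. The 6-frame row is just one way the
# ~100ms figure mentioned above could arise.
def added_latency_ms(refresh_hz, frames_buffered):
    return frames_buffered * 1000.0 / refresh_hz

for hz, n in [(120, 1), (120, 2), (60, 6)]:
    print(f"{n} frame(s) buffered at {hz}Hz -> {added_latency_ms(hz, n):.1f} ms")
# 1 frame(s) buffered at 120Hz -> 8.3 ms
# 2 frame(s) buffered at 120Hz -> 16.7 ms
# 6 frame(s) buffered at 60Hz -> 100.0 ms
```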

What I write here is not appropriate for virtual reality, of course, which demands a constant max framerate, at least in terms of reprojecting to the current head orientation and eventually eye focus, if not actually showing updated animations and scene content too. That will likely go into custom hardware instead of the additional compositor passes used now, so perhaps they will open this up for non-VR games too eventually, but I actually think they have a financial incentive not to do that.

Or they may hold off allowing it for 2D games, because "what FPS can it do" is what sells GPUs, in the gamer space at least.

One thing I do hope for is that G-Sync will perish thanks to HDMI 2.1 supporting Adaptive Sync standard VRR.

Xbox One X having variable refresh rate support could make the case for TV manufacturers to add this feature to their "gamer" friendly lines, and eventually to all TVs since it'll be embedded in commodity controllers which get passed around / reused between manufacturers.

A 1050 at 30 fps versus a 1080 Ti at 50 seems a bit off. This is a necessary tech for high-refresh displays. I hope G-SYNC doesn't go away. It's better in many ways, which you can read about on this forum. Personally, it's better because of the lower input lag hit alone.

We're talking about techs that have huge performance requirements. Being able to use those refresh rates and techs is going to be a big incentive for buyers. It's doubtful that people on the high end of NVIDIA's price profiling scheme (now paying up to $3000 for increased gaming GPU performance) are going to want a low-tier card where they will inevitably have to sacrifice quality at some level.

Re: New term: "Frame Rate Amplification" (1000fps in cheap GPUs)

Postby BattleAxeVR » 18 Dec 2017, 18:57

I own a G-SYNC monitor, and yes, it's pretty great, but its raison d'être is that game engines, having a variable workload per frame, have variable frame rendering times, and it's better to slave the display to the GPU than the other way around and deal with stuttering and all the other problems with V-sync we all know and love.

The reason I wrote what I did comes from what I learned as a VR developer, namely that a constant, perfectly synced, max framerate is necessary to avoid motion sickness. This fact makes VRR tech inappropriate for VR. Then I started to think to myself: well, why is it acceptable for 2D games to have variable smoothness? It's unnatural. It's certainly (much) better than V-sync, but both are a hack to compensate for the engine not being able to sustain max FPS at all times.

This is the problem that FRAT is the ideal tech to solve.

It's not about the specific numbers between a 1050 vs a 1080 Ti or whatever; that was just an example. Variable refresh rate is better than constant refresh rate for variable frames per second, but variable frames per second is itself inferior to constant frames per second. And the higher the resolution and refresh rate you are talking about, the harder it is to reach super-high, constant frames per second. It's a war that cannot be won. Not only that, but it's inefficient to render that many unique frames, since the higher the native framerate, the better interpolation works (less magnitude of error = less obvious artifacts). And there are interpolation engines now which are very good, almost artifact-free.
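
To put a hedged number on that "less magnitude of error" point - the 2000 px/s pan speed below is an arbitrary assumption:

```python
# Why interpolation gets easier at higher native framerates: the per-frame
# displacement the interpolator has to guess across shrinks, so any
# mis-estimated motion vector can only be off by a smaller amount.
pan_speed_px_per_s = 2000
for native_fps in (30, 60, 120, 240):
    gap_px = pan_speed_px_per_s / native_fps   # motion between real frames
    print(f"{native_fps:>3} fps native -> {gap_px:5.1f} px between real frames")
# 30 fps -> 66.7 px, 60 fps -> 33.3 px, 120 fps -> 16.7 px, 240 fps -> 8.3 px
```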

It's approaching a "solved problem", at least in the 120Hz TV range. We're just talking about displays with much better refresh rates.

And then there's the fact that current interpolation chips can barely even do 4K24 to 4K60, at best. Videogames, especially on consoles, have often sacrificed peak FPS for a constant but much lower FPS, like 60 or 30fps, in order to get better-looking pixels. But we can have both.

Re: New term: "Frame Rate Amplification" (1000fps in cheap GPUs)

Postby Chief Blur Buster » 19 Dec 2017, 09:49

Yes, variable refresh rate is a great fix for variable frame rates.

Yes, constant framerate is really ideal for VR.

In theory, VRR tech isn't necessarily 100% inappropriate for VR if it is low-persistence VRR with constant lag (not variable lag). Ideally via overkill framerates, e.g. VRR between 250Hz and 1000Hz for 1/250sec-to-1/1000sec persistence. However, you could also use strobed VRR, so you can simply use lower refresh rates instead to achieve the same persistence. Indeed, it would be far less motion-sicky than stutter.

(Those who understand persistence know you either need strobing or overkill Hz to achieve ultra-low persistence.)
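
A small numeric sketch of that trade-off, using the rule of thumb from the Blur Busters Law article linked below (1ms of persistence is roughly 1 pixel of motion blur per 1000 px/s of eye-tracked motion); the 2000 px/s tracking speed is just an example:

```python
# Persistence -> motion blur, per the "Blur Busters Law" rule of thumb:
# blur_px = persistence_ms * speed_px_per_s / 1000.
def blur_px(persistence_ms, speed_px_per_s):
    return persistence_ms * speed_px_per_s / 1000.0

speed = 2000  # px/s of eye-tracked motion (example value)
for label, persistence_ms in [("60Hz sample-and-hold", 1000 / 60),
                              ("1000Hz sample-and-hold", 1000 / 1000),
                              ("1ms strobe at 100Hz", 1.0)]:
    print(f"{label}: {blur_px(persistence_ms, speed):.1f} px of blur")
# 60Hz sample-and-hold: 33.3 px, 1000Hz sample-and-hold: 2.0 px,
# 1ms strobe at 100Hz: 2.0 px -> the same low persistence is reachable
# either via strobing or via overkill Hz.
```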

But this may be rendered mostly moot when Frame Rate Amplification Tech (FRAT) starts successfully achieving 1000fps laglessly & artifactlessly. And 1000Hz almost behaves like VRR, because persistence varies only in 1ms increments (refresh cycle granularity is so fine, you can more easily vary the persistence). Games modulating 30fps through 200fps (say, when FRAT is turned off) on a 1000Hz display would look darn near exactly like a G-SYNC display, since the refresh cycle granularity is so fine that it behaves essentially like VRR.
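
A tiny sketch of why refresh-cycle granularity matters here; the frame-completion times are arbitrary example values:

```python
# On a fixed-Hz display, a finished frame waits for the next refresh
# boundary. At 1000Hz that wait is at most ~1ms, so varying framerates
# land almost as precisely as they would on a true VRR display.
import math

def wait_for_refresh_ms(frame_ready_ms, refresh_hz):
    period = 1000.0 / refresh_hz
    next_boundary = math.ceil(frame_ready_ms / period) * period
    return next_boundary - frame_ready_ms

frame_ready_times = [7.3, 12.9, 21.4, 33.3]   # ms, example values
for hz in (60, 1000):
    worst = max(wait_for_refresh_ms(t, hz) for t in frame_ready_times)
    print(f"fixed {hz}Hz: worst-case extra wait ~{worst:.1f} ms")
# fixed 60Hz: worst-case extra wait ~11.9 ms
# fixed 1000Hz: worst-case extra wait ~0.7 ms (true VRR would be ~0 ms)
```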

EDIT

___________

A Blur Busters Holiday 2017 Special Feature

Coming sooner than expected within our lifetimes in the 2020s:

Read more:
Blur Busters Law: The Amazing Journey To Future 1000Hz+ Displays
