Blur Busters Forums

Who you gonna call? The Blur Busters! For Everything Better Than 60Hz™

Highest perceivable framerate? [good discussion]

Advanced display talk, display hackers, advanced game programmers, scientists, display researchers, display manufacturers, vision researchers. The masters on Blur Busters.

Re: Highest perceivable framerate?

Postby Chief Blur Buster » 01 Jun 2014, 20:30

Back to the original statement:
Alcazar wrote:Silky smooth animation that resembled the look of REAL LIFE. Ahh those were the days.
If you repeat the test again and pay close attention to detecting stroboscopic effects, you WILL see them (e.g. move the mouse cursor in circles, or stare at a stationary point while game motion scrolls past, or other things that make you easily notice the stroboscopic effect).
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter

       To support Blur Busters:
       • Official List of Best Gaming Monitors
       • List of G-SYNC Monitors
       • List of FreeSync Monitors
       • List of Ultrawide Monitors
Chief Blur Buster
Site Admin
 
Posts: 6405
Joined: 05 Dec 2013, 15:44

Re: Highest perceivable framerate?

Postby RealNC » 02 Jun 2014, 02:25

Chief Blur Buster wrote:Mouse Hz can be a limiting factor in seeing smoothness improvements on your monitor
a.k.a. 125Hz mice can prevent seeing microstutter improvements beyond ~125fps. You need 500/1000Hz to see fewer microstutters beyond 125fps.

And some games try to work around this by implementing "mouse smoothing." This interpolates between mouse ticks. It seems that many implementations of mouse smoothing add a delay though, so it's not always wanted.
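To illustrate, here is a minimal sketch (hypothetical helper name, not taken from any particular game) of the kind of interpolation a "mouse smoothing" option performs, and why it inherently adds delay:

```python
def smoothed_delta(prev_tick_delta: float, curr_tick_delta: float,
                   blend: float) -> float:
    """Sketch of one common 'mouse smoothing' scheme: blend the newest
    mouse-poll delta with the previous one. The displayed view then
    tracks a value that is partly one poll interval old, so a 125Hz
    (8ms) mouse can contribute up to ~8ms of extra input delay."""
    return prev_tick_delta * blend + curr_tick_delta * (1.0 - blend)

# A sudden 10-unit movement after standing still only half-shows on
# the first frame when blending 50/50 with the stale previous tick:
print(smoothed_delta(0.0, 10.0, 0.5))  # 5.0
```

This is why the smoothing trades microstutter for latency: the interpolated position always lags the raw sensor data by a fraction of the polling interval.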
The views and opinions expressed in my posts are my own and do not necessarily reflect the official policy or position of Blur Busters.
RealNC
 
Posts: 2793
Joined: 24 Dec 2013, 18:32

Re: Highest perceivable framerate?

Postby Alcazar » 02 Jun 2014, 14:30

spacediver wrote:
Alcazar wrote:The results:
    60 FPS = Obvious stutter
    90 FPS = Some stutter
    100 FPS = Very little stutter
    110 FPS = No perceivable stutter
    120 FPS = No change
    140 FPS = No change


There is a potential confound here. The quake engine, as I understand it, can only render at 1000/n fps where n is an integer. So setting com_maxfps to 120 will result in 125 fps or 111.11 fps (you probably noticed this when you used cg_drawfps 1).

What this means is that if you had your refresh at 120hz, but your framerate at 125 fps, then there are some temporal artifacts due to the discrepancy.
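That quantization is easy to sketch (hypothetical helper name; this assumes the engine rounds the frame period up to whole milliseconds, as classic id Tech engines do):

```python
import math

def quake_actual_fps(com_maxfps: int) -> float:
    """Quake-style engines time frames in whole milliseconds, so the
    only reachable framerates are 1000/n for integer n. The requested
    cap gets quantized to the nearest reachable rate at or below it."""
    period_ms = math.ceil(1000 / com_maxfps)  # whole-ms frame period
    return 1000 / period_ms

# com_maxfps 120 -> 9ms frames -> ~111.11 fps (never exactly 120)
# com_maxfps 125 -> 8ms frames -> exactly 125 fps
```

So a 120fps cap on a 120Hz refresh actually runs at ~111fps, guaranteeing a slow beat-frequency stutter against the refresh rate.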

btw, what name did you use - do you still play quakelive? If so, add me - I use the name Julios (from [xeno]).


Alcazar was/is the name. My clan was called K9 so I went by K9-Alcazar. Haven't played much quakelive but will look you up next time :)
Alcazar
 
Posts: 9
Joined: 31 May 2014, 10:10
Location: Silicon Valley, CA

Re: Highest perceivable framerate?

Postby Alcazar » 02 Jun 2014, 14:45

Chief Blur Buster wrote:
Alcazar wrote:So, there you have it. Curious, I wonder if we will find some brains that perceive higher framerates though.

That's silly. Science papers have already disproven the notion of the human brain functioning on frame rates. The human brain doesn't function on frame rates. "Brains that perceive higher framerates" is a non sequitur, since there are many other variables that shift these numbers around. There are multiple thresholds (e.g. flicker threshold, motion blur detection, stroboscopic detection) that function independently (see the bottom of this post), but first, let's cover some bases.


I agree that there are probably (and apparently, existing) ways to create tests that exercise a human brain's ability to discern ultra-high framerates and various degrees of motion blur well beyond 100FPS. And holy wow, you have a tremendous wealth of knowledge on this subject, so I have to confess I probably over-simplified things for this forum.

But there is an important lesson learned from the test I performed back in the day, which was simply that anything over 100 FPS was unnecessary (at least, for my purposes). I'd argue that even if you could tell the difference between the higher framerates through well-crafted tests, you aren't gaining any benefit from the ultra-high rates. My tests showed me that at 100 FPS, the motion of gaming became completely fluid (to my brain, at least) and anything beyond that was wasted electricity flowing through my GPU. There was no real benefit from going higher than that; even when I spun the viewport around at high velocity, 100 FPS still felt perfectly "fluid".

I am curious if you believe there are any real-world applications of a computer display where going beyond this framerate perception limit could be beneficial? Like, while doing *what* on a computer would you wish you had 1000 FPS instead of only 100?

Re: Highest perceivable framerate? [good discussion]

Postby Edmond » 02 Jun 2014, 15:16

This is all very scientific and nice. But I just want to say that realistically none of us will see 1000Hz or unlimited-Hz displays in our lifetime (not for actual everyday use), not unless quantum computing suddenly takes off or whatever.
Which I doubt, cuz by the looks of it our species doesn't want technological progress in big leaps. Small, regular increments are way more profitable and keep everyone employed, so you can't say it's a bad thing.

Let's think about what is realistic in our lifetime, if no fundamentally different technologies arise.

Kids these days argue about what the fps cap of vision is... 30fps vs 60 or whatever (yes, it's bullshit). And when it comes to rendering, most still struggle to get that magic 60fps, since almost every display is still 60Hz.

Realistically, if more and more people support and popularize 120Hz displays today, we can hope that 120Hz becomes the new standard for everyone. I would like that, as my parents wouldn't have to torture their eyes looking at 60Hz monitors. Same goes for office workers and so on.
Obviously variable refresh rates would be a must for such high refresh displays anyway; luckily we are starting to get practice in that area already.

And when 120Hz is the new standard for everyone, the computer hardware enthusiasts will use 240Hz displays. Such a framerate is realistically achievable for gaming, and it would give very low motion blur without introducing flicker. In such a world everyone would enjoy a totally brain-fooling fluid experience when using a computer display, and the hardware enthusiasts would get to enjoy even better motion clarity.

The next step would be 500hz, but the fps requirement for that is too much and the blur reduction too little compared to 240hz.

And, obviously, variable refresh rates would be a must for such high refresh displays... We are incredibly lucky that we get to use G-SYNC. Imagine how many games were "run" rather than "enjoyed" for decades before, because of the endless bullshit struggle with lag/stutter/tearing.

Personally, I am incredibly sensitive to any lag/stutter/whatever, so I am buying a G-SYNC monitor and never using the ULMB mode, because it disables G-SYNC and introduces flicker. I don't want to worry about another trade-off situation, fuck that. Variable 120Hz still has way better motion clarity than 60Hz anyway.

Sorry if it seemed off topic. I was trying to say the highest perceivable framerate might be too high to matter and therefore irrelevant. But that doesn't mean we can't all have super smooth, artifact-free gameplay very soon, and add some flicker-free motion clarity on top of that if you have a decade or two.
Edmond
 

Re: Highest perceivable framerate? [good discussion]

Postby Chief Blur Buster » 02 Jun 2014, 15:31

240fps@240Hz rolling-scan OLED should be doable. The panel tech can theoretically handle it given appropriate electronics, and 240Hz refreshing is already being done today on consumer displays via interpolation; it is a matter of letting the display motherboard pass it through from the video signal.

NHK 8K 120Hz, already done in the laboratory, has enough dotclock bandwidth to do 4K at true 480Hz, if a future panel supports it.
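As a quick sanity check on that bandwidth claim (ignoring blanking intervals), the raw pixel rates of the two modes really do match:

```python
def pixel_rate(width: int, height: int, hz: int) -> int:
    """Raw pixels per second pushed over the link (blanking ignored)."""
    return width * height * hz

# 8K at 120Hz and 4K at 480Hz need the same raw throughput,
# roughly 3.98 billion pixels per second:
assert pixel_rate(7680, 4320, 120) == pixel_rate(3840, 2160, 480)
```

Real video links need extra headroom for blanking, but as an order-of-magnitude check, the dotclock genuinely carries over.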

In professional circles, full-color (24-bit) 1000fps@1000Hz at 1080p is actually coming to the laboratory by the end of this decade. Monochrome 1000fps@1000Hz is already in the lab. So it is technologically possible. However, it is not certain whether this tech will filter down to consumers.

Experiments are occurring, documented by the Society for Information Display (SID): quantum camera sensors (time-coding the arrival of each photon) would decouple the concept of framerate from video recording, theoretically allowing the video to be reproduced at any frame rate. Consumers would be free to choose a display of their preferred refresh rate, in theory.

Realistic? Maybe not in the hands of consumers yet, but 1000Hz displays already exist in the lab at lower color depths (and soon, full color) for the purposes of science. Today, a 4K camera can be found in some upcoming smartphones and sub-$500 action cameras, the new iPhone supports 120fps recording, and my cheap $200 camera has a 1000fps low-resolution recording mode. Most cheap HDTVs now have 120fps interpolation and increasingly 240fps interpolation, even if the signal accepted isn't yet above 60Hz (skimping a few dollars). Citrix hacked an LCD to accept 240Hz directly. The pieces for higher framerates are already technologically doable in ever cheaper components, so at some point it becomes a small cost-add to bump up standardized refresh rates. It will take many years, however...

For now, realistically, 240Hz gaming displays will arrive by the end of this decade. (you can bank on this specific prediction).

Re: Highest perceivable framerate? [good discussion]

Postby masterotaku » 02 Jun 2014, 15:36

About the stroboscopic effect: how much time does the eye retain an image? Let's say 0.2s (a random but maybe approximate number, based on the number of mouse cursors I see). That means that when you move the mouse quickly in a game running at 120fps, with your eyes fixed on the center of the screen, you can see 24 different frames at the same time all over the place at any given moment (120 × 0.2). At 60Hz, it would be 12 frames.
That retention must be the reason why we see things travelling through all the pixels of the screen while they're actually "teleporting".
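That arithmetic can be written out directly (hypothetical helper name; the ~0.2s retention figure is masterotaku's own estimate, not a hard vision-science constant):

```python
def visible_cursor_images(fps: float, retention_s: float) -> float:
    """Rough count of discrete frame positions simultaneously visible
    along a fast sweep, given roughly retention_s of visual
    persistence: each new frame paints one image, and images stay
    perceptible for about retention_s before fading."""
    return fps * retention_s

# 120 fps with ~0.2s retention -> ~24 cursor images at once
# 60 fps with the same retention -> ~12 images
```

The model is deliberately crude (retention varies between individuals and with brightness/contrast), but it shows why higher framerates pack the stroboscopic trail more densely.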

More or less, at 80Hz and above I can't see flickering, so for example 90Hz and 125Hz look indistinguishable to me (color quality aside) in still images. Motion-blur-wise they look the same (at the same persistence), but once I test the stroboscopic effect, it's very easy to tell which refresh rate is which. Besides, at higher Hz I feel it's easier to track movement (it's like it's easier to hook the eyes onto the moving object).

The solution doesn't seem as "easy" as solving motion blur. Perhaps the only way of getting rid of the stroboscopic effect at reasonable refresh rates is in fact adding motion blur :lol: .
CPU: Intel Core i7 7700K @ 4.9GHz
GPU: Gainward Phoenix 1080 GLH
RAM: GSkill Ripjaws Z 3866MHz CL19
Motherboard: Gigabyte Gaming M5 Z270
Monitor: Asus PG278QR
masterotaku
 
Posts: 436
Joined: 20 Dec 2013, 04:01

Re: Highest perceivable framerate? [good discussion]

Postby Chief Blur Buster » 02 Jun 2014, 17:41

masterotaku wrote:About the stroboscopic effect: How much time does the eye retain an image? Let's say 0.2ms (random but maybe approximate number, based on the number of mouse cursors I see).

It depends on the variables, such as proximity to the center of vision, the brightness and contrast of the individual images, the individual's ability to count images quickly, etc. There is also no exact threshold; it's a fuzzy threshold.

In vision science this is often called "integration". The timespan tends to be on the scale of 1/10sec, give or take (I've seen numbers such as 1/30sec and 1/5sec). It varies a lot between individuals too, much like flicker thresholds can vary. Your number is roughly in the same ballpark, but it isn't a hard-and-fast number.

masterotaku wrote:More or less, at 80Hz and above I can't see flickering, so for example 90Hz and 125Hz look indistinguishable for me (color quality aside) in still images. Motion blur-wise, they look the same (using same persistence), but once I test the stroboscopic effect, it's very easy to know which refresh rate is which. Besides, at higher Hz, I feel it's easier to track movement (it's like it's easier to hook the eyes to the moving object).
Yep. The stroboscopic effect remains detectable for a long time, even into the thousand-Hz range. It's no longer annoying beyond certain rates such as ~120Hz, but it's still there, preventing some people from completely immersing (Holodeck-style), while others don't pay attention to it at all except when really focusing on it (like doing the mouse-arrow-on-black-background test).

masterotaku wrote:The solution doesn't seem as "easy" as solving motion blur. Perhaps the only way of getting rid of the stroboscoping efect at reasonable refresh rates is in fact adding motion blur :lol: .
This is correct, and many people prefer turning on motion blur because of this. That said, that should be a user preference.

Obviously, at Blur Busters we do not believe displays should force a guaranteed minimum of motion blur upon your eyes either(!). Motion blur elimination should be a user-accessible choice (by eliminating motion blur both at the display side and at the game side). But we aren't anti-motion-blur either, as such options should still be user-accessible.

You only need to add (1/Hz)th of a second of GPU-based motion blurring to eliminate the stroboscopic effect. So at 120Hz, you only need to blur the motion by 1/120sec (much like taking a photo while waving the camera with a 1/120sec shutter). But at 480fps@480Hz, you only need to add a tiny 1/480sec of motion blur (much like a camera at a 1/480sec shutter). At some point, when the framerate and refresh rate get high enough, adding GPU-based motion blur effects no longer becomes objectionable for desktop monitors.
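That 1/Hz rule is easy to put in numbers (hypothetical helper name; the pixel speeds are illustrative):

```python
def blur_trail_px(motion_px_per_s: float, refresh_hz: float) -> float:
    """Length of the GPU blur trail needed to hide stroboscopic
    stepping: exactly one frame's worth of motion, equivalent to a
    camera shutter of 1/Hz seconds."""
    return motion_px_per_s / refresh_hz

# A 2000 px/s pan: 120Hz needs ~16.7px of added blur per frame,
# while 1000Hz needs only 2px, small enough to go unnoticed.
```

This is why the added blur becomes invisible at ultra-high refresh rates: the trail shrinks in direct proportion to the Hz.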

Imagine a theoretical 1000Hz monitor (or VR headset) requiring only 1ms of added GPU motion blur to also eliminate stroboscopic effects and mouse-dropping effects, without the user noticing that the GPU is forcing additional blur above and beyond natural human limitations. Zero noticeable blur at most motion speeds, zero stroboscopic effects, and zero wagon-wheel effects. Vision research will be needed to test these out (very few true 1000Hz displays exist, just a few in laboratories, and true 500Hz displays are very expensive; VPixx units for these ultra-high refresh rates cost well over $10K).

Re: Highest perceivable framerate?

Postby Chief Blur Buster » 02 Jun 2014, 18:00

Alcazar wrote:I am curious if you believe there are any real-world applications of a computer display where going beyond this framerate perception limit could be beneficial? Like, while doing *what* on a computer would you wish you had 1000 FPS instead of only 100?
It is true there are points of diminishing returns. But it becomes more important on bigger displays at higher resolutions during faster motion speeds.

If you double the size of the display at double the resolution, the stroboscopic limitations start to become more visible. Display limitations become even worse when your full field of vision is covered (e.g. virtual reality). When you've got a literal OMNIMAX dome covering your field of vision in a headset, 4K is not enough resolution to be Retina quality. (Even IMAX isn't retina quality when you sit close enough to the IMAX screen.) In this situation, stroboscopic imperfections start to clearly show up.

Yesterday's 17" and even 21" CRTs did not cover much FOV at sufficient resolutions, and you often played dungeon games (Quake Live) which were often dark, so they did not show strobing effects as clearly as bright, high-contrast games. Things look crystal clear when you're tracking motion with your eyes; it looks like perfect motion, which is what CRTs do. But if you stare at a stationary point while things scroll past, the stroboscopic effect shows up.

Different people's vision has different sensitivities at different thresholds. Some people are more sensitive to tearing/stutters; others are not. Some people are more sensitive to color; others are less sensitive (color blindness of varying extents). Some people are more sensitive to motion blur (CRT vs LCD); others are not as sensitive or bothered by it. The stroboscopic effect affects different people very differently.

Also, due to diminishing returns, you need to go up in bigger steps to see a difference. For example, after going 60Hz->120Hz, you then need to essentially go from 120fps@120Hz -> 960fps@960Hz to see "wow, I do notice a difference". 120Hz vs 144Hz is subtle, but 120Hz vs 960Hz is not subtle. In that situation, the stroboscopic effect would go down so dramatically that most mouse movements become a simple continuous motion blur on a black background, and likewise for game motion.

Not everyone would care, but consider:
- People who are used to 60Hz, who upgraded to 120Hz strobed (LightBoost) and can't comfortably go back
- People who are used to TN, who upgraded to IPS and can't comfortably go back
- People who never cared about 30fps-vs-60fps, but began playing lots of 60fps games and can't comfortably go back to 30fps
- People who thought 125Hz mouse was smooth, but then upgraded to a 1000Hz mouse and swear by it now
- Etc.

Such high Hz may never happen in our lifetimes, but it is worth considering that we often do not notice certain kinds of display effects until a new technology comes along, and then we find we can't live without it! This will ensure long-term progress in improving persistence issues on displays (one way or another, even if not via ultra-high framerates).

Re: Highest perceivable framerate? [good discussion]

Postby ScepticMatt » 12 Jun 2014, 16:51

In agreement with flood, I feel that eye-tracking-based motion blur is the only real way to solve motion perception short of impractical refresh rates. And while the eye tracking needs to be fast, it doesn't need to be sub-ms, as perception during saccades is limited and contrast sensitivity drops.


A study done by the European Broadcasting Union last year found that 700Hz is enough to avoid stroboscopic issues for UHDTV content viewed at 1.5 picture heights distance.
http://www.bbc.co.uk/rd/blog/2013/12/hi ... s-workshop

A while ago, I posted a thread about temporal aliasing on blurbusters.
Time-discrete rendering causes temporal aliasing. As resolutions increase, impractical frame rates become necessary to avoid it. Spatio-temporal anti-aliasing can converge to an ideal solution, eliminating the need for higher frame rates (beyond flicker fusion), but eye movements need to be accounted for.

viewtopic.php?f=7&t=370&start=0
ScepticMatt
 
Posts: 36
Joined: 16 Feb 2014, 14:42
