Alcazar wrote: I am curious if you believe there are any real-world applications of a computer display where going beyond this framerate perception limit could be beneficial? Like, while doing *what* on a computer would you wish you had 1000 FPS instead of only 100?
It is true there are points of diminishing returns. But higher framerates become more important on bigger displays, at higher resolutions, and during faster motion.
If you double the size of the display at double the resolution, the stroboscopic limitations start to become more visible. Display limitations become even worse when your full field of vision is covered (e.g. virtual reality). When you've got a literal OMNIMAX dome covering your field of vision in a headset, 4K is not enough resolution to be Retina quality. (Even IMAX isn't Retina quality when you sit close enough to the IMAX screen.) In this situation, stroboscopic imperfections clearly start to show up.
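A quick back-of-envelope calculation shows why. A common rule of thumb for "Retina" quality is roughly 60 pixels per degree of vision; the ~100-degree field of view and the 60 ppd threshold below are illustrative assumptions, not figures from this post:

```python
# Hypothetical sketch: average angular resolution (pixels per degree)
# when a display is stretched across a wide field of view.

def pixels_per_degree(horizontal_pixels, fov_degrees):
    """Average angular pixel density across the field of view."""
    return horizontal_pixels / fov_degrees

RETINA_PPD = 60  # common rule-of-thumb threshold for 20/20 acuity (assumption)

# A "4K" panel (3840 px wide) spread across a ~100-degree VR field of view:
ppd = pixels_per_degree(3840, 100)
print(f"{ppd:.1f} ppd")        # ~38.4 ppd
print(ppd >= RETINA_PPD)       # False -- well short of "Retina"
```

By the same math, a 3840-pixel-wide image is only "Retina" if it spans about 64 degrees or less, which is why the same 4K panel can look flawless on a desk yet visibly pixelated inside a headset.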
Yesterday's 17" and even 21" CRTs did not cover much FOV at sufficient resolutions, and people often played dungeon-style games (e.g. Quake Live) that were dark enough not to show strobing effects as clearly as bright, high-contrast games do. When your eyes are tracking the motion, things look crystal clear, like perfect motion, which is what CRTs do well. But if you keep your gaze stationary while things scroll past, the stroboscopic effect shows up.
Different people's vision has different sensitivities and different thresholds. Some people are more sensitive to tearing/stutters; others are not. Some people are more sensitive to color; others are less sensitive (color blindness of varying extents). Some people are more sensitive to motion blur (CRT vs LCD); others are not as sensitive or bothered by it. The stroboscopic effect affects different people very differently.
Also, due to the points of diminishing returns, you need to take bigger steps to see a difference. For example, if you have already gone 60Hz->120Hz, then you essentially need to go from 120fps@120Hz -> 960fps@960Hz to get another "wow, I do notice a difference". 120Hz vs 144Hz is subtle, but 120Hz vs 960Hz is not subtle. At that point, the stroboscopic effect goes down so dramatically that most mouse movements become a simple continuous motion blur on a black background, and likewise for game motion.
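The arithmetic behind this is simple: with your gaze stationary, a moving object jumps in discrete steps of (motion speed / refresh rate) pixels per frame, and those gaps are what your eye sees as the stroboscopic effect. The 1000 pixels/second motion speed below is an illustrative assumption:

```python
# Sketch of why higher Hz shrinks the stroboscopic stepping: the gap
# between successive frame positions is speed divided by refresh rate.

def step_size_px(speed_px_per_sec, refresh_hz):
    """Pixel gap between consecutive frames during fixed-gaze viewing."""
    return speed_px_per_sec / refresh_hz

# Illustrative motion speed: 1000 pixels per second (a brisk mouse flick).
for hz in (60, 120, 144, 960):
    print(f"{hz:4d} Hz -> {step_size_px(1000, hz):5.2f} px steps")
```

At 1000 px/s, 120Hz still leaves ~8-pixel gaps between frame positions, and 144Hz only trims that to ~7 pixels (hence "subtle"), while 960Hz narrows the gaps to about 1 pixel, which reads to the eye as a near-continuous motion blur.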
Not everyone would care, but consider:
- People who are used to 60Hz, who upgraded to 120Hz strobed (LightBoost) and can't comfortably go back
- People who are used to TN, who upgraded to IPS and can't comfortably go back
- People who never cared about 30fps-vs-60fps, but began playing lots of 60fps games and can't comfortably go back to 30fps
- People who thought 125Hz mouse was smooth, but then upgraded to a 1000Hz mouse and swear by it now
- Etc.
Such high Hz may never happen in our lifetimes, but it is worth considering that we often do not notice certain kinds of display limitations until a new technology comes along, and then we find we can't live without it! That will ensure long-term progress in fixing persistence issues on displays (one way or another, even if not via ultrahigh framerates).