SS4 wrote:Also, since the human eye perceives around 72 fps and ULMB eliminates blur, is there even a need to run refresh rates higher than, let's say, 75 Hz once this becomes more mainstream? I guess the only benefit would be reduced input lag, but nothing much on the visual front?
The human eye does not see in frames: that is misinformation that has been circulating on the internet for years. What we should be discussing is the flicker fusion threshold - the point at which individual frames blend together and are perceived as smooth motion. Flicker fusion can be achieved at 72 fps in many situations, but not all, and even once it is achieved, a higher framerate will still improve the perceived smoothness as well as reduce input lag. At one time 24 fps was considered enough and was set as the standard for filming movies. Modern action movies use much faster motion and frequently break the illusion; in fact, doing so has become part of the movie "feel", which is why there is so much backlash against high-framerate filming.
The framerate required to achieve flicker fusion depends on the amount of motion blur, so when using blur-reduction techniques such as ULMB we need a much higher framerate to achieve the same effect. Even without blur reduction, fast motion at 120 fps can easily break flicker fusion. Ideally we want a maximum of one pixel of movement per frame, but in some situations that would require a ludicrous framerate. Consider a flight simulator where the plane performs a full 360-degree roll in 1 second: at 120 fps that is 3 degrees of rotation per frame, and a pixel near the edge of a 1080p display sweeps more than 1000 pixels over 30 frames (90 degrees of roll) - tens of pixels per frame rather than one. This may be an extreme case, but you can see that even a significantly slower roll would still benefit from more than 120 fps.
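To make the arithmetic concrete, here is a rough back-of-the-envelope sketch (my own addition, not from the original post) that computes how far a point on screen travels per frame during a roll. It assumes the view rotates around the screen centre and takes a corner pixel of a 1920x1080 display as the worst case:

```python
import math

def pixels_per_frame(roll_deg_per_sec, fps, radius_px):
    """Arc length (px) swept per frame by a point radius_px from the roll centre."""
    return radius_px * math.radians(roll_deg_per_sec / fps)

def fps_for_one_pixel(roll_deg_per_sec, radius_px):
    """Framerate needed so that the same point moves at most 1 px per frame."""
    return radius_px * math.radians(roll_deg_per_sec)

# Worst case on a 1920x1080 display: a corner pixel, ~1101 px from the centre.
corner_radius = math.hypot(1920 / 2, 1080 / 2)

per_frame  = pixels_per_frame(360, 120, corner_radius)   # ~58 px per frame
per_90_deg = per_frame * 30                               # ~1730 px over 30 frames (90 degrees)
needed_fps = fps_for_one_pixel(360, corner_radius)        # framerate for 1 px/frame

print(f"corner travel per frame  : {per_frame:.0f} px")
print(f"corner travel over 90 deg: {per_90_deg:.0f} px")
print(f"fps needed for 1 px/frame: {needed_fps:.0f}")
```

Under those assumptions the corner pixel moves roughly 58 px per frame at 120 fps, and holding it to one pixel per frame would need on the order of 7000 fps, which is the "ludicrous framerate" point above.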