Edmond wrote:Ye, i dont think we are gonna ever combine VRR with strobing and achieve artifact free motion.
The only and best way to get that is to just up the refresh rate on a flicker less OLED with VRR.
A VRR that scales from 0hz (meaning it can hold an image at loading screens or something) up to like 300 or 400 hz. Such FPS is achievable in 99% of games out there and will deliver low enough motion blur for almost everyone.
With displayport 1.3 you can make a 1080p 400hz panel already. With super MHL you can go nuts.
At 400Hz you are still going to have visible motion blur.
You need about 1000Hz to match the better CRTs, and CRTs are not entirely blur-free.
The goal should be better than CRT, not something worse than the displays we had 15-20 years ago.
400 FPS is certainly not achievable in every game, when new titles like The Witcher 3 struggle to stay above 60 FPS even on a Titan X.
Strobing removes framerate from the equation.
With strobing you can have less motion blur at 30 FPS than a flicker-free display shows at 1000 FPS.
Of course you will have horrible flicker and judder at such a low framerate, but there won't be any motion blur.
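The numbers behind those claims can be sketched with the usual persistence approximation: perceived blur width is roughly pixel speed times the time each frame stays lit. On a flicker-free (sample-and-hold) display that time is the full frame period; on a strobed display it is the strobe pulse width, regardless of framerate. The speed and pulse-width figures below are illustrative assumptions, not measurements:

```python
# Approximate perceived motion blur: blur trail width in pixels is
# roughly pixel speed multiplied by how long each frame is visible.

def blur_px(speed_px_per_s, persistence_s):
    """Blur trail width for an object moving at the given speed."""
    return speed_px_per_s * persistence_s

speed = 1000.0  # a fast pan: 1000 pixels per second (assumed)

# Sample-and-hold: persistence equals the full frame time.
blur_400hz  = blur_px(speed, 1 / 400)   # 2.5 px of blur
blur_1000hz = blur_px(speed, 1 / 1000)  # 1.0 px, roughly CRT-like

# Strobed at 30 FPS with an assumed 1 ms pulse: persistence equals the
# pulse width, independent of framerate (though 30 Hz flicker is severe).
blur_strobed_30fps = blur_px(speed, 0.001)  # 1.0 px

print(blur_400hz, blur_1000hz, blur_strobed_30fps)  # 2.5 1.0 1.0
```

This is why 400Hz sample-and-hold still blurs while a low-framerate strobed image does not: only the lit time matters for blur, while framerate determines flicker and judder.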
I don't think that combining strobing and VRR is an impossible task.
There are a lot of problems to solve, but none which seem insurmountable.
That really seems like the ideal for this type of display though: something like an OLED which supports variable refresh rates up to several hundred Hz with zero motion blur on the panel, combined with strobing which keeps perceived motion blur below that of the best CRTs.
In the meantime, there's really very little reason that the existing OLED televisions, which are all doing at least 120Hz internally, couldn't be updated to include a DisplayPort connection and support VRR up to at least 120Hz.
They could also offer a strobed mode at a fixed refresh rate for older games where we are able to keep the framerate locked to a certain value.
RLBURNSIDE wrote:I tried to poke AMD and MS to see if there was a way to modify the HDMI specification to add VRR, but I got nowhere. Being able to render a game NOT locked, in other words anywhere between 30fps and 60fps, would be a HUGE benefit to game developers, who could then be free from the tyranny of trying to render frames of vastly different complexity and composition at the same cadence, which is a logical impossibility.
I have heard that it may only be possible over DisplayPort, because it is packet-based, unlike HDMI.
RLBURNSIDE wrote:Xbox One devs suggested dynamically using the scaler to modify the expected frame render time to counter framerate slowdowns, in order to guarantee 60fps, but apparently the quality isn't there (no matter how good the scaler is). One improvement to the scaler that's been cited at a recent talk is using 1440x1080 (instead of the full 1920x1080 or something with both vertical and horizontal scaling like 900p) when you're running around 45fps to get back up to 60. 1440x1080 apparently scales up to 1920x1080 a lot better than 900p does and gives similar perf boost. (well, 75% gpu costs for 1440x1080 vs 68% for 900p).
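The cost percentages in that quote line up with straightforward pixel-count ratios, assuming GPU cost scales roughly linearly with rendered pixels (a simplification; shading cost is approximately per-pixel but not exactly):

```python
# Sanity-check the scaler cost figures from the quote by comparing
# rendered pixel counts against native 1080p.

full = 1920 * 1080  # native 1080p
tall = 1440 * 1080  # horizontal-only scaling
p900 = 1600 * 900   # both axes scaled ("900p")

print(f"1440x1080: {tall / full:.0%} of native pixels")  # 75%
print(f"1600x900:  {p900 / full:.0%} of native pixels")  # 69% (quote says 68%)
```

So 1440x1080 renders exactly 75% of the pixels, and 900p about 69%, close to the 68% cited; the quality difference comes from 1440x1080 only needing horizontal upscaling.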
There were some PS3 games which used a dynamic framebuffer like this. Wipeout HD and RAGE are two examples which come to mind.
It did not work very well in my opinion; it was ugly and did not manage to keep the games locked to 60 anyway.
In Wipeout I seem to recall some pretty severe framerate issues towards the end of the game, at the higher speed classes and with more going on (more aggressive opponents, so more weapon effects in use, etc.).
You're right that VRR is the ideal solution, since you can never guarantee game performance, but you can instead sync up the display to whatever your game's variable performance is.
I hope that there is some way of getting VRR out of these consoles, even if that means a new revision with a DisplayPort connector on the back (remember when Microsoft updated the 360 with HDMI ports?), because there are a few games I'd like to play (Bloodborne) but I absolutely cannot tolerate the low performance of these systems as they are. I'm not sure that I'll be able to tolerate anything less than 60 FPS anyway, but VRR should at least greatly improve the smoothness of those games.
Sony are in a unique position where they not only make the consoles, but they also make televisions. Even if I never got one, I'd love it if they were able to use the PS4 to push all the other display manufacturers into including DisplayPort connections with support for Adaptive-Sync.
Though we are starting to see some monitors with VRR support that are not TN panels, they're all still far too small, low contrast, and expensive for what you get in my opinion.
Though my TV is only 1080p60, it's 46" in size, has a 5000:1 native contrast VA panel with perfect color reproduction, and a 100+ zone local dimming system which turns that 5000:1 into "infinite" contrast. (though not perfect, it does work to significantly improve the contrast without noticeable blooming)
I'd really like to be upgrading to a larger 4K set in the 55-65" range which either has similar or better specs (perhaps OLED?) and VRR support.
Not a small 27" 1000:1 IPS panel at $800+.
I also wonder whether something like this PG27AQ display is actually going to be worthwhile, or if we should really be waiting on the first DisplayPort 1.3 video cards (Radeon 300 series?) and displays, since they should be capable of going above 60Hz at 4K.
Though playing many games at 4K above 60 FPS may seem unreasonable today, most people seem to keep their monitors for at least five years, and not everyone will be playing the latest releases.
And that makes me wonder whether it's such a smart idea buying a G-Sync display at all. I'd much rather see either NVIDIA add support for Adaptive-Sync (doesn't sound likely) or see manufacturers follow BenQ and release monitors which have both a G-Sync module in them and a regular monitor board, only with the second board including Adaptive-Sync support.
I really don't like the idea of spending $800+ on a monitor which is locked to one specific GPU vendor.