4K IPS FreeSync monitor by ASUS [ROG PG27AQ]

Breaking news in the gaming monitor industry! Press releases, new monitors, rumors. IPS, OLED, 144Hz, G-SYNC, Ultrawides, etc. Submit news you see online!
Edmond

Re: 4K IPS FreeSync monitor by ASUS [ROG PG27AQ]

Post by Edmond » 12 Jan 2015, 05:28

Yeah, I don't think we're ever going to combine VRR with strobing and achieve artifact-free motion.

The only and best way to get that is to just up the refresh rate on a flicker-free OLED with VRR.
A VRR range that scales from 0 Hz (meaning it can hold an image at loading screens or something) up to around 300 or 400 Hz. Frame rates like that are achievable in 99% of games out there and will deliver low enough motion blur for almost everyone.

With DisplayPort 1.3 you can make a 1080p 400 Hz panel already. With superMHL you can go nuts.
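A rough sanity check of that bandwidth claim, as a sketch: the HBR3 link rate (4 lanes x 8.1 Gbit/s), the 8b/10b encoding overhead, 24-bit color, and the ~5% blanking allowance are my assumptions, not figures from the post.

Code: Select all

#include <cstdio>

int main() {
    const double raw_gbps     = 4 * 8.1;            // DP 1.3 HBR3: 4 lanes at 8.1 Gbit/s each
    const double payload_gbps = raw_gbps * 0.8;     // 8b/10b encoding -> ~25.9 Gbit/s usable
    const double needed_gbps  = 1920.0 * 1080 * 24 * 400 / 1e9; // active pixels only, ~19.9 Gbit/s
    const double with_blanking = needed_gbps * 1.05; // rough allowance for reduced blanking
    printf("payload ~%.1f Gbit/s, needed ~%.1f Gbit/s -> %s\n",
           payload_gbps, with_blanking,
           with_blanking < payload_gbps ? "fits" : "doesn't fit");
    return 0;
}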

flood
Posts: 929
Joined: 21 Dec 2013, 01:25

Re: 4K IPS FreeSync monitor by ASUS [ROG PG27AQ]

Post by flood » 12 Jan 2015, 13:50

Yes we are, just not on shitty LCDs.
It wouldn't really be necessary if game devs targeted a fixed FPS.

Edmond

Re: 4K IPS FreeSync monitor by ASUS [ROG PG27AQ]

Post by Edmond » 13 Jan 2015, 05:19

flood wrote:Yes we are, just not on shitty LCDs.
It wouldn't really be necessary if game devs targeted a fixed FPS.
That's impossible. You would need to redesign everything to be able to target a fixed FPS, and by everything I mean the processor architecture and all the software layers on top of it.

And you aren't combining strobing with VRR on OLED either. Even if you somehow managed it (you wouldn't, though), it would still be a flickerfest. The only way to remove ALL defects and artifacts is to combine OLED with VRR and raise the Hz.

RLBURNSIDE
Posts: 104
Joined: 06 Apr 2015, 16:09

Re: 4K IPS FreeSync monitor by ASUS [ROG PG27AQ]

Post by RLBURNSIDE » 21 May 2015, 22:56

flood wrote:Yes we are, just not on shitty LCDs.
It wouldn't really be necessary if game devs targeted a fixed FPS.
We do target a fixed FPS: 60 Hz usually, or your monitor's refresh rate. But usually engines are frame-locked to 60 Hz because of consoles, and even then those consoles quite often end up shipping at 30 fps and showing the same frame twice. Which is, yes, quite laggy and slow, but it's necessary to avoid judder, which is even worse.

I tried to poke AMD and MS to see if there was a way to modify the HDMI specification to add VRR, but I got nowhere. Being able to render a game NOT locked, in other words anywhere between 30fps and 60fps, would be a HUGE benefit to game developers, who could then be free from the tyranny of trying to render frames of vastly different complexity and composition at the same cadence, which is a logical impossibility.

VRR makes the monitor the slave to the game engine, and trust me, forcing a fixed frame render time is part of the problem, not the solution. The solution is VRR and relaxing the requirements for a fixed FPS in the engine, while still keeping the smoothness and judder-free pans and animation that we all want.

Xbox One devs suggested dynamically using the scaler, adjusting render resolution on the fly to counter framerate slowdowns and guarantee 60 fps, but apparently the quality isn't there (no matter how good the scaler is). One improvement to the scaler that's been cited at a recent talk is using 1440x1080 (instead of the full 1920x1080, or something with both vertical and horizontal scaling like 900p) when you're running around 45 fps to get back up to 60. 1440x1080 apparently scales up to 1920x1080 a lot better than 900p does and gives a similar perf boost (well, 75% GPU cost for 1440x1080 vs 68% for 900p).
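For reference, those percentages line up roughly with plain pixel-count ratios; a quick sketch, assuming GPU cost is roughly proportional to pixels rendered (by straight pixel count the 900p figure comes out closer to 69%).

Code: Select all

#include <cstdio>

int main() {
    const double native = 1920.0 * 1080.0;
    const double narrow = 1440.0 * 1080.0;  // horizontal-only scaling
    const double p900   = 1600.0 *  900.0;  // scaled on both axes
    printf("1440x1080 renders %.0f%% of native pixels\n", 100.0 * narrow / native); // 75%
    printf("1600x900  renders %.0f%% of native pixels\n", 100.0 * p900 / native);   // ~69%
    return 0;
}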

To reiterate, most engines target a fixed FPS already. I've worked at many major companies and this is simply how engines work: they have an internal sync. Yes, it's inefficient and results in lots of duplicated and wasted work, and it sucks, big time, but it's all done because we are slaves to the monitor instead of the monitor being a slave to us. Getting VRR onto the consoles would be a huge industry-wide win, and I'd very much like to see the PS4 and Xbox One use their FreeSync-capable GPUs one day to send VRR over their HDMI ports, if that's possible with a firmware update, and only if and when the HDMI base spec can be amended, as it just was to support HDR through the HDMI 2.0a revision.
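To make the "show the same frame twice" behaviour concrete, here's a toy simulation of v-sync frame pacing; the per-frame render costs and the 60 Hz refresh interval are made-up illustrative numbers, not measurements from any engine.

Code: Select all

#include <cstdio>
#include <cmath>

int main() {
    const double vblank = 1000.0 / 60.0;                  // 16.67 ms refresh interval
    const double render_ms[] = {14.0, 15.0, 22.0, 14.0};  // made-up per-frame render costs
    double t = 0.0;                                       // time at which a frame becomes ready
    int prev = 0;                                         // last vblank we presented on
    for (double cost : render_ms) {
        t += cost;
        int slot = (int)std::ceil(t / vblank);            // first vblank at or after readiness
        for (int r = prev + 1; r < slot; ++r)
            printf("vblank #%d: previous frame shown again (duplicate)\n", r);
        printf("vblank #%d: new frame (ready at %.1f ms)\n", slot, t);
        t = slot * vblank;                                // next frame starts after the swap
        prev = slot;
    }
    return 0;
}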

Black Octagon
Posts: 216
Joined: 18 Dec 2013, 03:41

Re: 4K IPS FreeSync monitor by ASUS [ROG PG27AQ]

Post by Black Octagon » 22 May 2015, 14:38

RLBURNSIDE wrote:
We do target a fixed FPS: 60 Hz usually, or your monitor's refresh rate. [...]
60Hz is a refresh rate, not a frame rate

Glide
Posts: 280
Joined: 24 Mar 2015, 20:33

Re: 4K IPS FreeSync monitor by ASUS [ROG PG27AQ]

Post by Glide » 22 May 2015, 18:39

Edmond wrote:Yeah, I don't think we're ever going to combine VRR with strobing and achieve artifact-free motion.
The only and best way to get that is to just up the refresh rate on a flicker-free OLED with VRR.
A VRR range that scales from 0 Hz (meaning it can hold an image at loading screens or something) up to around 300 or 400 Hz. Frame rates like that are achievable in 99% of games out there and will deliver low enough motion blur for almost everyone.
With DisplayPort 1.3 you can make a 1080p 400 Hz panel already. With superMHL you can go nuts.
At 400Hz you are still going to have visible motion blur.
You need about 1000Hz to match the better CRTs, and CRTs are not entirely blur-free.
The goal should be better than CRT, not something worse than the displays we had 15-20 years ago.

400 FPS is certainly not achievable in every game, when new titles like The Witcher 3 are struggling to stay above 60 FPS on even a Titan X.

Strobing removes framerate from the equation.
With strobing you can have less motion blur at 30 FPS than on a flicker-free display at 1000 FPS.
Of course you will have horrible flicker and judder at such a low framerate, but there won't be any motion blur.
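For a rough sense of the numbers: perceived tracking blur is roughly how long each frame stays lit multiplied by how fast the eye is panning. A small sketch, where the 0.5 ms strobe flash and the 1000 px/s pan speed are assumed values, not figures from the post.

Code: Select all

#include <cstdio>

int main() {
    const double pan_speed = 1000.0;          // eye-tracked pan speed, pixels per second (assumed)
    const double hold_1000hz = 1.0 / 1000;    // each frame lit ~1 ms on a 1000 Hz sample-and-hold display
    const double strobe_30fps = 0.0005;       // assumed 0.5 ms strobe flash per frame at 30 fps
    printf("1000 Hz flicker-free: ~%.1f px of tracking blur\n", pan_speed * hold_1000hz);
    printf("30 fps strobed:       ~%.1f px of tracking blur\n", pan_speed * strobe_30fps);
    return 0;
}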

I don't think that combining strobing and VRR is an impossible task.
There are a lot of problems to solve, but none which seem insurmountable.

That really seems like the ideal for this type of display though: something like an OLED which supports variable refresh rates up to several hundred Hz with essentially no blur added by the panel itself, combined with strobing which keeps perceived motion blur below that of the best CRTs.

In the meantime, there's really very little reason that the existing OLED televisions, which are all doing at least 120Hz internally, couldn't be updated to include a DisplayPort connection and support VRR up to at least 120Hz.
They could also offer a strobed mode at a fixed refresh rate for older games where we are able to keep the framerate locked to a certain value.
RLBURNSIDE wrote:I tried to poke AMD and MS to see if there was a way to modify the HDMI specification to add VRR, but I got nowhere. Being able to render a game NOT locked, in other words anywhere between 30fps and 60fps, would be a HUGE benefit to game developers, who could then be free from the tyranny of trying to render frames of vastly different complexity and composition at the same cadence, which is a logical impossibility.
I have heard that it may only be possible over DisplayPort because it is packet-based unlike HDMI.
RLBURNSIDE wrote:Xbox One devs suggested dynamically using the scaler, adjusting render resolution on the fly to counter framerate slowdowns and guarantee 60 fps, but apparently the quality isn't there (no matter how good the scaler is). One improvement to the scaler that's been cited at a recent talk is using 1440x1080 (instead of the full 1920x1080, or something with both vertical and horizontal scaling like 900p) when you're running around 45 fps to get back up to 60. 1440x1080 apparently scales up to 1920x1080 a lot better than 900p does and gives a similar perf boost (well, 75% GPU cost for 1440x1080 vs 68% for 900p).
There were some PS3 games which used a dynamic framebuffer like this. Wipeout HD and RAGE are two examples which come to mind.
It did not work very well in my opinion; it was ugly and did not manage to keep the games locked to 60 anyway.
In Wipeout I seem to recall some pretty severe framerate issues towards the end of the game, at higher speed classes and with more going on (more aggressive opponents, so more weapon effects in use, etc.).


You're right that VRR is the ideal solution, since you can never guarantee game performance, but you can instead sync up the display to whatever your game's variable performance is.
I hope that there is some way of getting VRR out of these consoles, even if that means a new revision with a DisplayPort connector on the back (remember when Microsoft updated the 360 with HDMI ports?), because there are a few games I'd like to play (Bloodborne) but I absolutely cannot tolerate the low performance of these systems as it is. I'm not sure that I'll be able to tolerate anything less than 60 FPS anyway, but VRR should at least greatly improve the smoothness of those games.

Sony are in a unique position where they not only make the consoles, but they also make televisions. Even if I never got one, I'd love it if they were able to use the PS4 to push all the other display manufacturers into including DisplayPort connections with support for Adaptive-Sync.

Though we are starting to see some monitors with VRR support that are not TN panels, they're all still far too small, low contrast, and expensive for what you get in my opinion.
Though my TV is only 1080p60, it's 46" in size, has a 5000:1 native contrast VA panel with perfect color reproduction, and a 100+ zone local dimming system which turns that 5000:1 into "infinite" contrast (not perfect, but it does significantly improve the contrast without noticeable blooming).

I'd really like to be upgrading to a larger 4K set in the 55-65" range which has similar or better specs (perhaps OLED?) and VRR support.
Not a small 27" 1000:1 IPS panel at $800+

I also wonder whether something like this PG27AQ display is actually going to be worthwhile, or if we should really be waiting on the first DisplayPort 1.3 video cards (Radeon 300 series?) and displays, since they should be capable of going above 60Hz at 4K.
Though playing many games at 4K above 60 FPS may seem unreasonable today, most people seem to keep their monitors for at least five years, and not everyone will be playing the latest releases.
And that makes me wonder whether it's such a smart idea buying a G-Sync display at all. I'd much rather see either NVIDIA add support for Adaptive-Sync (doesn't sound likely) or see manufacturers follow BenQ and release monitors which have both a G-Sync module in them and a regular monitor board, only with the second board including Adaptive-Sync support.
I really don't like the idea of spending $800+ on a monitor which is locked to one specific GPU vendor.

RLBURNSIDE
Posts: 104
Joined: 06 Apr 2015, 16:09

Re: 4K IPS FreeSync monitor by ASUS [ROG PG27AQ]

Post by RLBURNSIDE » 23 May 2015, 01:02

Black Octagon wrote:60Hz is a refresh rate, not a frame rate
Says who? 1 Hz = one X per second; it doesn't specify what X is.

While you're being pedantic, you might consider whether you're actually correct. You absolutely can say a 60 Hz game engine is one which is updated 60 times per second, for example Skyrim with its physics engine. If you try to overclock the game engine, weird stuff starts happening with the physics loop.

It's pure semantics, nothing more. The term hertz is topic-agnostic. It specifies a frequency, not what the frequency is about.
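That Skyrim example is essentially the fixed-timestep problem: a simulation written around one tick length stops giving the same answers when the tick rate changes. A minimal sketch of the effect, using explicit Euler free fall with illustrative numbers only (this is not Skyrim's actual physics code).

Code: Select all

#include <cstdio>

int main() {
    // Integrate one second of free fall with explicit Euler at two different tick rates.
    // The end position differs, which is why physics tuned for a 60 Hz tick misbehaves
    // when the loop is run at a different rate.
    const double g = 9.81;
    for (double dt : {1.0 / 60.0, 1.0 / 120.0}) {
        double x = 0.0, v = 0.0, t = 0.0;
        while (t < 1.0 - 1e-9) {       // simulate exactly one second
            x += v * dt;               // explicit Euler: position uses last step's velocity
            v += g * dt;
            t += dt;
        }
        printf("dt = 1/%.0f s -> fell %.4f m in 1 s\n", 1.0 / dt, x);
    }
    return 0;
}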

Black Octagon
Posts: 216
Joined: 18 Dec 2013, 03:41

Re: 4K IPS FreeSync monitor by ASUS [ROG PG27AQ]

Post by Black Octagon » 23 May 2015, 13:52

RLBURNSIDE wrote:
Black Octagon wrote:60Hz is a refresh rate, not a frame rate
Says who? 1 Hz = one X per second; it doesn't specify what X is.

While you're being pedantic, you might consider whether you're actually correct. You absolutely can say a 60 Hz game engine is one which is updated 60 times per second, for example Skyrim with its physics engine. If you try to overclock the game engine, weird stuff starts happening with the physics loop.

It's pure semantics, nothing more. The term hertz is topic-agnostic. It specifies a frequency, not what the frequency is about.
I think Edmond interpreted the phrase 'fixed fps' in the same (literal) way I did.

I.e., a situation in which the game adjusts the image quality in real time in order to maintain a constant frame rate of 60 fps (for instance). Not just a game physics engine based on an assumed monitor refresh rate of 60 Hz (à la Skyrim), but a genuinely 'fixed' rate at which the GPU renders frames and sends them to the display.

RLBURNSIDE
Posts: 104
Joined: 06 Apr 2015, 16:09

Re: 4K IPS FreeSync monitor by ASUS [ROG PG27AQ]

Post by RLBURNSIDE » 18 Jun 2015, 01:15

Black Octagon wrote:
I think Edmond interpreted the phrase 'fixed fps' in the same (literal) way I did.

I.e., a situation in which the game adjusts the image quality in real time in order to maintain a constant frame rate of 60 fps (for instance). Not just a game physics engine based on an assumed monitor refresh rate of 60 Hz (à la Skyrim), but a genuinely 'fixed' rate at which the GPU renders frames and sends them to the display.
Well, lots of game engines are frame-locked to refresh at either 30 or 60 Hz, regardless of the GPU. If the frame isn't ready, the engine simply sends the last frame again. On consoles V-sync is always on, so 60 Hz is identical to 60 fps. Each one of those frames, however, isn't necessarily unique, because, once again, you cannot guarantee that you will always draw your frames within 16 ms in a dynamic environment, no matter what.

That's kind of the whole point of slaving the display to the GPU (G-Sync or FreeSync) instead of the other way around (V-sync). Variable refresh rates are predicated on the truism that there is no general way to guarantee a fixed frame rate with unique frames for arbitrary inputs (what I mean by that: variable geometry and rasterization load implies variable frame time, which in turn motivates variable refresh rates rather than stuttering under V-sync). On consoles you cannot ship a game without V-sync on, but you often do display the same frame twice. There is no real 60 fps game that never drops below 60. Even in the best-case estimates, where they try to fit everything into 14 milliseconds of frame budget, there are still always going to be outliers, meaning your framerate isn't going to be constant.
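A tiny numeric illustration of that point: a frame that misses the 16.7 ms budget under 60 Hz V-sync is held until the next vblank, while VRR would scan it out as soon as it's ready (within the panel's supported range). The 20 ms render time below is just an example value.

Code: Select all

#include <cstdio>
#include <cmath>

int main() {
    const double vblank_ms = 1000.0 / 60.0;   // 16.67 ms between refreshes at 60 Hz
    const double render_ms = 20.0;            // example frame that misses the 16.7 ms budget
    const double vsync_ms  = std::ceil(render_ms / vblank_ms) * vblank_ms; // held to next vblank
    printf("V-sync: frame shown after %.1f ms (~%.0f fps)\n", vsync_ms, 1000.0 / vsync_ms);
    printf("VRR   : frame shown after %.1f ms (~%.0f fps)\n", render_ms, 1000.0 / render_ms);
    return 0;
}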

The best thing for PC titles is not to frame lock, but to have a variable frame duration.

The fact that I used 60 Hz and 60 fps interchangeably is just a byproduct of the fact that sending out frames at a rate of 60 Hz IS 60 fps. It's pure semantics. I absolutely can use the word hertz to discuss frame rate: it's a rate, a frequency, and thus it's applicable. People in gaming would do best to use actual scientific terminology instead of learning their maths through forums, which often give them a very narrow understanding of these very general terms.
