Blur Busters Forums

Who you gonna call? The Blur Busters! For Everything Better Than 60Hz™

4K IPS FreeSync monitor by ASUS [ROG PG27AQ]

Breaking news in the gaming monitor industry! Press releases, new monitors, rumors. IPS, OLED, 144Hz, G-SYNC, Ultrawides, etc. Submit news you see on the Net!

Postby Chief Blur Buster » 05 Jan 2015, 17:06

ASUS PG27AQ
4K IPS FreeSync monitor. 60Hz.
Republic Of Gamers - ROG branded

Article:
http://www.pcper.com/news/Displays/CES- ... NC-and-IPS

Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter!
Chief Blur Buster
Site Admin
 
Posts: 3321
Joined: 05 Dec 2013, 15:44

Re: 4K IPS FreeSync monitor by ASUS [ROG PG27AQ]

Postby RadonGOG » 05 Jan 2015, 18:02

Chief Blur Buster wrote:ASUS PG27AQ
4K IPS FreeSync monitor. 60Hz.
Republic Of Gamers - ROG branded

Article:
http://www.pcper.com/news/Displays/CES- ... NC-and-IPS


The linked article names this one a G-SYNC monitor, not a FreeSync monitor...
RadonGOG
 
Posts: 2
Joined: 05 Jan 2015, 17:49

Re: 4K IPS FreeSync monitor by ASUS [ROG PG27AQ]

Postby jts888 » 05 Jan 2015, 21:44

RadonGOG wrote:The linked article names this one a G-SYNC monitor, not a FreeSync monitor...


G-sync is exclusive of everything else, including any form of secondary or non-DisplayPort input.
Until Nvidia openly adopts VESA Adaptive-sync, they won't let the refresh-rate pre-negotiation through their FPGA controller to the display's scaler logic, even if this monitor happened to be using a new Adaptive-sync-capable scaler behind the Nvidia link.

Since next-generation Adaptive-sync scalers are already finished and hitting the streets in new monitors starting next month, there's virtually no reason for ASUS not to offer an Adaptive-sync version of this monitor soon, other than possible binding agreements with Nvidia not to do so.

In any case, if this panel is any good, it or ones like it will soon be available in monitors from other manufacturers anyway, I think.
Assuming no nasty Adaptive-sync/FreeSync surprises, the sooner G-sync dies the better IMO.
jts888
 
Posts: 28
Joined: 05 Jan 2015, 21:36

Re: 4K IPS FreeSync monitor by ASUS [ROG PG27AQ]

Postby Sparky » 05 Jan 2015, 23:59

jts888 wrote:Assuming no nasty Adaptive-sync/FreeSync surprises, the sooner G-sync dies the better IMO.

Wouldn't it be better to have two competing solutions constantly trying to outperform each other? There is still a lot of room for improvement.
Sparky
 
Posts: 506
Joined: 15 Jan 2014, 02:29

Re: 4K IPS FreeSync monitor by ASUS [ROG PG27AQ]

Postby jts888 » 06 Jan 2015, 01:07

Sparky wrote:Wouldn't it be better to have two competing solutions constantly trying to outperform each other? There is still a lot of room for improvement.


No.

Adaptive-sync is the simplest possible (and now the standard) communications protocol to achieve the end result of variable refresh rates. FreeSync is just the hardware/driver on the GPU side feeding any arbitrary, freely licensed monitors.

G-sync is both a proprietary, needlessly complicated bidirectional communications protocol and the proprietary hardware on both GPU and monitor ends. Its cost comes from its (~$100-$150) FPGA board, which is essentially a hardware emulator for a custom scaler, which allows firmware upgrades to markedly change the logic in the future.

While G-sync monitor logic allows the possibility of doing variable-refresh-aware pixel overdrive (why the buffer appears to be needed), etc., there is no innate technical reason that the custom driver logic needs to talk to GPUs over the G-sync protocol instead of the simpler, more effective, and now standard Adaptive-sync.
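As a rough illustration of how little protocol is actually needed, here is a vendor-neutral toy model in Python (my own sketch, not Nvidia's or VESA's actual state machine): the core variable-refresh contract is just that the display starts a scan-out when a frame arrives, clamped between the panel's minimum refresh interval and a self-refresh timeout.

```python
# Toy model of the variable-refresh contract (hypothetical, vendor-neutral).
# The display redraws when a frame arrives; if no frame shows up before the
# panel's maximum hold time, it re-scans the previous frame on its own.

def refresh_times(frame_arrivals, min_interval, max_interval):
    """Return the times (seconds) at which the panel starts a scan-out."""
    scans = []
    last_scan = 0.0
    for t in frame_arrivals:
        # Panel can't start a new scan before min_interval has elapsed.
        start = max(t, last_scan + min_interval)
        # Self-refresh (repeat the old frame) whenever the wait would
        # exceed the panel's maximum hold interval.
        while start - last_scan > max_interval:
            last_scan += max_interval
            scans.append(last_scan)
        scans.append(start)
        last_scan = start
    return scans

# 144 Hz panel (min ~6.9 ms) with a ~30 Hz self-refresh floor (33.3 ms):
print(refresh_times([0.010, 0.015, 0.060], 0.0069, 0.0333))
```

Note how the long 45 ms gap before the third frame forces one self-refresh; everything else is just one-way "here comes a frame" signalling, which is all Adaptive-sync requires.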

What G-sync really does is:
  • keep AMD/Intel GPUs from feeding Nvidia's fancy scalers
  • keep established scaler makers from directly competing against Nvidia's monitor hardware by not feeding them Adaptive-sync
  • establish an overall "walled garden" for the entire display chain, keeping gamers locked reciprocally into Nvidia GPUs/monitors, buying one to not lose an already sunk cost in the other
In all likelihood, G-sync and FreeSync chains will have identical latency characteristics; G-sync will have some faint pixel-transition quality advantage over at least the first generation of Adaptive-sync scalers, but at the cost of a $150-or-more premium. I expect Nvidia to pound on these differences once the first Adaptive-sync monitors are released, but it'll be tough to convince the market unless they are really substantial.

If G-sync pixel overdrive tweaks and special GPU frame pacing etc. were really worth $100 or more, Nvidia could easily unbundle the halves via a monitor firmware update, supporting Adaptive-sync and letting them each freely compete on the market.

But they realistically won't even consider unchaining the G-sync monitors and cards until they've definitively lost the market race.
jts888
 
Posts: 28
Joined: 05 Jan 2015, 21:36

Re: 4K IPS FreeSync monitor by ASUS [ROG PG27AQ]

Postby Sparky » 06 Jan 2015, 03:02

jts888 wrote:
Sparky wrote:Wouldn't it be better to have two competing solutions constantly trying to outperform each other? There is still a lot of room for improvement.


No.

Adaptive-sync is the simplest possible (and now the standard) communications protocol to achieve the end result of variable refresh rates. FreeSync is just the hardware/driver on the GPU side feeding any arbitrary, freely licensed monitors.

G-sync is both a proprietary, needlessly complicated bidirectional communications protocol and the proprietary hardware on both GPU and monitor ends. Its cost comes from its (~$100-$150) FPGA board, which is essentially a hardware emulator for a custom scaler, which allows firmware upgrades to markedly change the logic in the future.

While G-sync monitor logic allows the possibility of doing variable-refresh-aware pixel overdrive (why the buffer appears to be needed), etc., there is no innate technical reason that the custom driver logic needs to talk to GPUs over the G-sync protocol instead of the simpler, more effective, and now standard Adaptive-sync.

What G-sync really does is:
  • keep AMD/Intel GPUs from feeding Nvidia's fancy scalers
  • keep established scaler makers from directly competing against Nvidia's monitor hardware by not feeding them Adaptive-sync
  • establish an overall "walled garden" for the entire display chain, keeping gamers locked reciprocally into Nvidia GPUs/monitors, buying one to not lose an already sunk cost in the other
In all likelihood, G-sync and FreeSync chains will have identical latency characteristics; G-sync will have some faint pixel-transition quality advantage over at least the first generation of Adaptive-sync scalers, but at the cost of a $150-or-more premium. I expect Nvidia to pound on these differences once the first Adaptive-sync monitors are released, but it'll be tough to convince the market unless they are really substantial.

If G-sync pixel overdrive tweaks and special GPU frame pacing etc. were really worth $100 or more, Nvidia could easily unbundle the halves via a monitor firmware update, supporting Adaptive-sync and letting them each freely compete on the market.

But they realistically won't even consider unchaining the G-sync monitors and cards until they've definitively lost the market race.
I'm not talking about minor tweaks; I'm talking about huge improvements that will require a lot of R&D: combining variable refresh and low persistence. It's not going to happen if everybody settles on a single standard and tries to compete on price alone.
Sparky
 
Posts: 506
Joined: 15 Jan 2014, 02:29

Re: 4K IPS FreeSync monitor by ASUS [ROG PG27AQ]

Postby jts888 » 06 Jan 2015, 04:30

Sparky wrote:I'm not talking about minor tweaks; I'm talking about huge improvements that will require a lot of R&D: combining variable refresh and low persistence. It's not going to happen if everybody settles on a single standard and tries to compete on price alone.

But G-sync doesn't even do that, and nothing about the proprietary G-sync protocol is necessary to do that.

Scaler and/or monitor manufacturers just need to experiment and decide what to do when a frame update starts coming down the wire:
  • for how long (if at all) should the backlight be turned off early, before the start of the redraw?
  • following a full-intensity strobe (duration determined by brightness level), should the backlight brightness go immediately to the lower "holding" level, transition smoothly to it, or even be briefly shut off completely?
Those are literally the only factors to worry about, since GPUs can't pre-announce the transmission of a frame (they send frames out immediately upon completion anyway), so you're just trying to find an optimal tradeoff between muted motion-blur trails when following moving objects and reduced strobing artifacts when scanning across static content.
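A hypothetical sketch of that decision space in Python (the parameter names and hold modes are mine, not any shipping scaler's): per frame, the controller emits a backlight schedule answering exactly those two questions.

```python
# Hypothetical per-frame backlight schedule for a variable-refresh strobed
# display. The tunables mirror the two questions above: how long to blank
# before scan-out, and what the backlight does after the strobe pulse.

def backlight_schedule(pre_blank_ms, strobe_ms, hold_mode, hold_level):
    """Return (duration_ms, relative_brightness) segments for one frame.

    A duration of None means "until the next frame arrives"."""
    segments = [(pre_blank_ms, 0.0),   # blank before the redraw starts
                (strobe_ms, 1.0)]      # full-intensity strobe pulse
    if hold_mode == "off":
        segments.append((None, 0.0))          # stay dark until next frame
    elif hold_mode == "step":
        segments.append((None, hold_level))   # drop straight to hold level
    elif hold_mode == "ramp":
        # smooth transition, approximated here with three descending steps
        for i in range(1, 4):
            segments.append((0.5, 1.0 - (1.0 - hold_level) * i / 3))
        segments.append((None, hold_level))
    return segments
```

Tuning then amounts to picking `pre_blank_ms`, `strobe_ms`, and the hold behavior per refresh-rate/brightness bin, which is a calibration exercise rather than a protocol problem.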

Letting the different manufacturers accept standardized protocol input and come up with competing parameters to the preceding questions is certainly within their talents and is plainly the way to go forward. This sort of flexibility may well already be within the capabilities of existing scaler ASICs given that strobing and blur reduction solutions come from several different vendors now.

This certainly isn't rocket science, and you're drinking too much Kool-Aid if you think this has to be a multi-million-dollar research effort. G-sync's custom FPGA hardware is really only needed to restrict the 2-way link specifically to GeForce cards, and to let Nvidia sell prototype scaler chips to the public at lower cost/risk than possibly buggy fabricated ASICs.
jts888
 
Posts: 28
Joined: 05 Jan 2015, 21:36

Re: 4K IPS FreeSync monitor by ASUS [ROG PG27AQ]

Postby Sparky » 06 Jan 2015, 07:10

jts888 wrote:
Sparky wrote:I'm not talking about minor tweaks; I'm talking about huge improvements that will require a lot of R&D: combining variable refresh and low persistence. It's not going to happen if everybody settles on a single standard and tries to compete on price alone.

But G-sync doesn't even do that, and nothing about the proprietary G-sync protocol is necessary to do that.

Scaler and/or monitor manufacturers just need to experiment and decide what to do when a frame update starts coming down the wire:
  • for how long (if at all) should the backlight be turned off early, before the start of the redraw?
  • following a full-intensity strobe (duration determined by brightness level), should the backlight brightness go immediately to the lower "holding" level, transition smoothly to it, or even be briefly shut off completely?
Those are literally the only factors to worry about, since GPUs can't pre-announce the transmission of a frame (they send frames out immediately upon completion anyway), so you're just trying to find an optimal tradeoff between muted motion-blur trails when following moving objects and reduced strobing artifacts when scanning across static content.

Letting the different manufacturers accept standardized protocol input and come up with competing parameters to the preceding questions is certainly within their talents and is plainly the way to go forward. This sort of flexibility may well already be within the capabilities of existing scaler ASICs given that strobing and blur reduction solutions come from several different vendors now.

This certainly isn't rocket science, and you're drinking too much Kool-Aid if you think this has to be a multi-million-dollar research effort. G-sync's custom FPGA hardware is really only needed to restrict the 2-way link specifically to GeForce cards, and to let Nvidia sell prototype scaler chips to the public at lower cost/risk than possibly buggy fabricated ASICs.
I don't think you fully understand the problem. To start, you need high persistence at low framerates, to avoid flicker and PWM artifacts. You then need to smoothly transition to high framerate low persistence in order to minimize motion blur, without any perceptible change in brightness or colors. THEN you need to figure out how to do all of those things when you don't know how long it's going to take to get the next frame. I think it's solvable, but it's as much a biology problem as an electronics one.
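One hypothetical way to frame the transition Sparky describes (my own toy model, not any known product's behavior): pick a persistence fraction that shrinks as the frame rate rises, then scale pulse intensity so the time-averaged light output stays constant, which is what keeps the brightness change imperceptible.

```python
# Toy model: blend from full persistence (sample-and-hold, no flicker) at
# low frame rates to short-pulse low persistence at high frame rates, while
# holding time-averaged brightness constant. All thresholds are illustrative.

def pulse_for_rate(frame_hz, low_hz=40.0, high_hz=144.0,
                   target_brightness=0.2, max_intensity=1.0):
    """Return (pulse_fraction_of_frame, pulse_intensity) for one frame."""
    # Shortest usable pulse: the fraction that still delivers
    # target_brightness when driven at max_intensity.
    min_fraction = target_brightness / max_intensity
    if frame_hz <= low_hz:
        fraction = 1.0                      # full persistence: no flicker
    elif frame_hz >= high_hz:
        fraction = min_fraction             # lowest persistence available
    else:
        t = (frame_hz - low_hz) / (high_hz - low_hz)
        fraction = 1.0 + (min_fraction - 1.0) * t   # linear blend
    # Keep fraction * intensity == target_brightness at every rate.
    intensity = target_brightness / fraction
    return fraction, intensity
```

The genuinely hard part Sparky raises remains: the next frame interval is unknown in advance, so a real controller has to pick these values from a prediction of the frame rate rather than the true one.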

What's actually needed to do that research? Tens to hundreds of different panels, different panel technologies, engineers to interface prototype scalers to different panels and calibrate them, test subjects(young people, old people, different types of colorblindness, etc.), and a lot of time to spend working on the problem.

Think about why g-sync exists in the first place. Do you really think Nvidia would have gotten involved in a traditionally low margin industry if the major display manufacturers didn't need a kick in the pants?
Sparky
 
Posts: 506
Joined: 15 Jan 2014, 02:29

Re: 4K IPS FreeSync monitor by ASUS [ROG PG27AQ]

Postby jts888 » 06 Jan 2015, 16:16

Sparky wrote:I don't think you fully understand the problem. To start, you need high persistence at low framerates, to avoid flicker and PWM artifacts. You then need to smoothly transition to high framerate low persistence in order to minimize motion blur, without any perceptible change in brightness or colors. THEN you need to figure out how to do all of those things when you don't know how long it's going to take to get the next frame. I think it's solvable, but it's as much a biology problem as an electronics one.

I don't think any of us enthusiasts fully understand the problem, but as a practicing computer engineer I'm pretty confident I get at least the major relevant concerns, and I still don't think it's rocket science.

Given LED backlight driving circuitry that can produce sharply transitioning, varying-amplitude rectangular output, "PWM" (really, only the sub-case of multiple pulses per input frame) is not even relevant; the only issue is matching human-perceived brightness between the dimmed holding periods and the pulsed strobes within their larger unlit blanking windows. Finding the matching strobe-to-blank-width ratios for each holding dimness level along a backlight brightness curve is nowhere near as hard a topic in human vision as you seem to think it is, and, given this table, you can have arbitrarily long/short/nonexistent holding intervals that have zero effect on perceived net brightness. (Trying to dynamically adjust the holding-interval brightness instead is a fool's errand, requiring multi-frame buffering/lag to avoid flickering artifacts under varying frame rates. I suspect this line of thinking is why so many in this community believe this is a highly difficult problem.)

You then are just left with:
  • defining the strobe shape (quick/bright/sharp vs. broader/dim/blurry) at different Hz/brightness ranges, which is only a single variable whose domain/range attenuate as brightness rises anyway.
  • defining a function on how gradually the panel sharpens/blurs motion under changing frame intervals.
Determining the strobe/dim parity tables is best done by individual display manufacturers, and the configurable motion-clarity parameter space is ideally left open for end users themselves to twiddle via OSDs.
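The parity table described above could look something like this sketch (assuming perceived brightness tracks the time-averaged light integral, which ignores the nonlinear temporal effects a real per-panel tuning pass would measure):

```python
# Hypothetical strobe/dim parity table: for each hold brightness level,
# find the strobe width whose light integral over one frame interval
# matches continuous output at that hold level.

def strobe_width_table(hold_levels, frame_interval_ms, strobe_intensity=1.0):
    """Map hold level (0..1) -> matching strobe width in milliseconds.

    Matching condition: hold_level * frame_interval
                        == strobe_intensity * strobe_width."""
    return {level: level * frame_interval_ms / strobe_intensity
            for level in hold_levels}

# 144 Hz panel, one entry per candidate hold level:
table = strobe_width_table([0.1, 0.2, 0.3], frame_interval_ms=6.9)
```

Such a table is exactly the kind of firmware data a monitor maker could ship and expose, with no proprietary GPU-side protocol involved.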

Sparky wrote:What's actually needed to do that research? Tens to hundreds of different panels, different panel technologies, engineers to interface prototype scalers to different panels and calibrate them, test subjects(young people, old people, different types of colorblindness, etc.), and a lot of time to spend working on the problem.

I agree that each panel needs to be independently parameter tuned, but the problem isn't about research. It's just about scaler ASICs exposing backlight driver timing/level controls via firmware tables, and letting monitor manufacturers tweak those tables to their individual preferences. It escapes me why you think it's necessary for Nvidia to do anything more than sell a flexibly configurable component, unless you actually want them to subsume the monitor manufacturers into component vendors for their own monitors.

Hell, you could even let monitor firmware writers expose the sharpness/jumpiness vs. blurriness levels to end users via the OSD, etc., without needing a Detonator driver or whatever to play with things.

Sparky wrote:Think about why g-sync exists in the first place. Do you really think Nvidia would have gotten involved in a traditionally low margin industry if the major display manufacturers didn't need a kick in the pants?

Yes, the industry needed to be shaken up, but G-sync seems to be a dubiously over-engineered approach to the problem.

tl;dr: after matching a panel's perceived strobe/dimming levels, controller-created artifacts are easily avoided, and users are left only with determining and configuring their personally preferred motion clarity levels at different brightness x Hz ranges.

I get that it's a big overall undertaking bringing new technologies to market, but the underlying core engineering problem is vastly overstated IMO, mostly due to Nvidia being themselves and gamer "journalism".
Last edited by jts888 on 06 Jan 2015, 16:41, edited 1 time in total.
jts888
 
Posts: 28
Joined: 05 Jan 2015, 21:36

Re: 4K IPS FreeSync monitor by ASUS [ROG PG27AQ]

Postby jts888 » 06 Jan 2015, 16:30

For visual reference.
As panel brightness goes up, the grey hold levels rise and the strobe pulse area (usually width, since intensity will normally be maxed) gets bigger.

The blank+strobe+trailing transition interval is the panel's minimum frame interval, say 1/144 s ≈ 6.9 ms.
Perceived (average over narrow time) brightness for the strobe-in-dark-box must be made to match the continuous lower hold level brightness.

At high frame rates, the hold periods go away or are too short and dim to be noticed, and the exhibited behavior is perceptually indistinguishable from traditional fixed-Hz strobing displays.
At low frame rates, the strobe pulse can be made arbitrarily shorter/fatter until it fills its time window, becoming indistinguishable from a sample-and-hold display if de-accentuating choppy motion is desired.
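Under the same time-averaged-brightness assumption, the matching condition in this description reduces to a simple light-integral equality; a quick sketch with illustrative (not measured) numbers:

```python
# Timing budget for the diagram: blank + strobe + trailing transition must
# fit within the panel's minimum frame interval (1/144 s ~= 6.9 ms).
min_frame_ms = 1000.0 / 144.0

blank_ms, strobe_ms, trailing_ms = 4.0, 1.5, 1.4   # illustrative values
window_ms = blank_ms + strobe_ms + trailing_ms
assert window_ms <= min_frame_ms                    # fits the budget

# Perceived-brightness match: the strobe-in-dark-box must average out to
# the continuous hold level over the same window.
strobe_intensity = 1.0
hold_level = strobe_intensity * strobe_ms / window_ms
print(round(hold_level, 3))  # -> 0.217
```

So a 1.5 ms full-intensity strobe inside a 6.9 ms dark box matches a continuous hold at roughly 22% brightness, which is the parity the diagram illustrates.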
Attachments
lcd_dynamic_strobing.png
jts888
 
Posts: 28
Joined: 05 Jan 2015, 21:36
