World's First 360Hz G-Sync Monitor!

Breaking news in the gaming monitor industry! Press releases, new monitors, rumors. IPS, OLED, 144Hz, G-SYNC, Ultrawides, etc. Submit news you see online!
AddictFPS
Posts: 314
Joined: 15 Jan 2020, 14:13

Re: World's First 360Hz G-Sync Monitor!

Post by AddictFPS » 21 Jan 2020, 23:20

First of all, thanks to Asus and the panel manufacturer for raising the frequency of gaming monitors.


FHD 240Hz:
1920 × 1080 = 2,073,600 px; 2,073,600 px × 240 Hz = 497.664 MPx/s base pixel clock (scanout speed); + ~20% for extra sync/blanking tasks ≈ 600 MHz

https://www.blurbusters.com/faq/advance ... stalk-faq/

The next Asus FHD 360Hz monitor needs: 746.496 MPx/s
Current QHD 240Hz like the Lenovo Y27gq-25: 884.736 MPx/s
Current UHD 4K IPS 120Hz 8-bit RGB: 995.328 MPx/s through 1x DisplayPort v1.4 without overclock

So we already have electronics capable of running scanout at 995.328 MPx/s; with this speed, panel manufacturers could make FHD 480Hz (1920 × 1080 × 480 Hz = 995.328 MPx/s).
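
To make the arithmetic above easy to check, here's a minimal sketch of the same calculation (Python; the function names are just for illustration, and the ~20% blanking/sync overhead is the rough figure assumed above, not a real VESA timing):

```python
# Back-of-the-envelope scanout bandwidth; ignores real VESA blanking timings.
BLANKING_OVERHEAD = 0.20  # rough assumption, as estimated above

def base_pixel_clock_mpx(width, height, hz):
    """Active pixels per second, in MPx/s."""
    return width * height * hz / 1e6

def pixel_clock_mhz(width, height, hz):
    """Approximate pixel clock including blanking/sync overhead, in MHz."""
    return base_pixel_clock_mpx(width, height, hz) * (1 + BLANKING_OVERHEAD)

print(base_pixel_clock_mpx(1920, 1080, 240))  # 497.664 (FHD 240Hz)
print(base_pixel_clock_mpx(1920, 1080, 360))  # 746.496 (FHD 360Hz)
print(base_pixel_clock_mpx(2560, 1440, 240))  # 884.736 (QHD 240Hz)
print(base_pixel_clock_mpx(3840, 2160, 120))  # 995.328 (4K 120Hz)
print(base_pixel_clock_mpx(1920, 1080, 480))  # 995.328 (FHD 480Hz)
print(pixel_clock_mhz(1920, 1080, 240))       # ~597.2, i.e. the ~600 MHz above
```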


Are panel manufacturers rationing technological progress? Or are there other technical limits that currently prevent FHD 480Hz?

Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada

Re: World's First 360Hz G-Sync Monitor!

Post by Chief Blur Buster » 22 Jan 2020, 19:54

<Technical>
The bandwidth is no longer the primary limiting factor preventing 1000 Hz.

We already have 8K 120Hz electronics, which means 1080p ~2000 Hz is possible under the same bandwidth parameters (color depth setting, DSC setting, etc).
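
A quick sanity check on that claim, trading resolution against refresh rate at the same raw pixel throughput (a sketch that ignores blanking and DSC details):

```python
# Same raw pixel throughput, redistributed from 8K 120Hz down to 1080p.
px_per_frame_8k = 7680 * 4320      # 33,177,600 pixels
px_per_frame_1080p = 1920 * 1080   #  2,073,600 pixels

throughput = px_per_frame_8k * 120      # ~3.98 Gpx/s for 8K 120Hz
print(throughput / px_per_frame_1080p)  # 1920.0 -> roughly "1080p ~2000 Hz"
```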

The technical restriction is the number of microseconds spent injecting voltage into an LCD pixel. Not enough time means you can't inject enough voltage in that brief time limit to execute a fast GtG. Often, when a panel gets overclocked far beyond its limits, its pixel response becomes lethargic / slower (falling like a rock) because the scanout pulse is far too brief to execute a really effective GtG.

A display cannot refresh all pixels at once. If you see High Speed Videos of LCD Scanout, you will see that there's a wipe effect -- the LCD is refreshing one pixel row at a time, with a GtG lag (the fade zone).

The 360Hz LCD may theoretically be overclockable to 480Hz, but you may see the contrast ratio drop by ~50%, or see GtG suddenly slow down to 2ms, or other side effects -- like what happened to the 60Hz laptop display overclocked to 180Hz (the refresh cycles started badly smearing into each other like an old 33ms LCD). There's a huge yang to go with the yin.

Also, the velocity of pixel row addressing in panels isn't quite there yet in TCONs/scalers (and whatever panel-edge electronics are integrated into the glass). Zis had to take lots of shortcuts, going 1/4 vertical resolution in order to achieve an overclocked 4x refresh rate on what is essentially an off-the-shelf 4K 120Hz-capable panel, in what looks like simultaneous refreshing of 4 pixel rows at a time.

Many fast LCDs are now designed to refresh adjacent pixel rows in scanout sweeps. There are LCDs that will refresh two or four adjacent pixel rows at a time simultaneously, in their top-to-bottom scanout sweep. This gives more time to inject voltage into a pixel row, than if only one pixel row is voltaged at a time.

Adding more channels of simultaneous refreshing to an LCD panel or OLED panel will help immensely. A panel that can refresh 8 pixel rows at a time instead of 4 will be able to do double the refresh rate at the same scanout velocity.

The refreshing doesn't have to be of adjacent pixel rows, but could theoretically be subdivided into a de facto equivalent of multiple displays using concurrent scanouts, like the 960 Hz OLED scanout pattern that treats a panel like 8 separate slices of 120Hz displays (scroll down for it). It's probably very dependent on how the panel is wired -- whether to concurrent-refresh contiguous pixel rows (most panels that have multi-row refresh do it this way), or concurrent-refresh distant pixel rows.

Bottom line -- it boils down to how much time per pixel row you have for injecting voltage into pixels. Double the Hz means you have half the time. So you have to either spend less time voltaging a pixel row (to begin GtG transitions) or double the number of channels (to maintain active-refreshing-time-per-pixel-row).
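
Here's a rough model of that time budget (a sketch under simplifying assumptions: zero blanking interval, a 1080-row panel, and rows split evenly across refresh channels):

```python
# Active time available for voltaging each pixel row during one scanout sweep.
def row_voltage_time_us(hz, total_rows=1080, channels=1):
    refresh_period_s = 1.0 / hz
    sequential_groups = total_rows / channels  # row groups refreshed in sequence
    return refresh_period_s / sequential_groups * 1e6

print(row_voltage_time_us(360, channels=1))  # ~2.57 us per row
print(row_voltage_time_us(720, channels=1))  # ~1.29 us -> double Hz, half the time
print(row_voltage_time_us(720, channels=2))  # ~2.57 us -> doubled channels restore it
```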

Early panels like the IBM T221 (YouTube Teardown, the first 4K monitor) had ribbon cables entering the panel-glass at both the top and bottom edges, to allow its early 2-channel refresh capability. But nowadays, most LCDs only have ribbon cables into one edge, and can only multi-channel-refresh from that one edge. The engineering world discovered that you have fewer artifacts if you keep simultaneous-pixel-row refreshing to contiguous pixel rows. In many cases, monitors are subdivided into multiple vertical strips internally, and each strip has a synchronized scanout.
But there are simple/clever workarounds to eliminate sawtooth multiscan artifacts if you horizontally segment-multiscan (treat the screen as multiple subdivided screens) -- there might be more opportunity to subdivide a panel into multiple displays and use that as a refresh-rate-increase method. But there are engineering challenges.

Theoretically, future panels can double Hz by going back to the "ribbon-cables-entering-panel-at-both-top-and-bottom" approach, so a 720Hz refresh rate could be achieved with that 360Hz panel using just today's technology -- if it was fabbed slightly differently. But such a niche-manufactured LCD panel may be horrendously expensive (>$2000) as we no longer have monitor motherboard standards for dual-edge-fed panels for commoditized desktop monitors.

It's still a technological option, and it makes 1000Hz LCD capability visualizable within this human generation (though ideally we eventually need <~0.5ms GtG90% across the whole GtG heatmap to really make that shine). It's still unobtainium, but no longer pipe-dream unobtainium (like 4K in the 1990s). Sources now tell me there is a roadmap to 1000 Hz LCDs by the decade of the 2030s, but I don't know if it will be achieved that way.

Again, the bandwidth is not the problem. The GPU and the display are currently the weak links (but solvable).
</Technical>

...Hey, you asked for it. This is Blur Busters!
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


jasswolf
Posts: 68
Joined: 23 Jan 2020, 05:09

Re: World's First 360Hz G-Sync Monitor!

Post by jasswolf » 23 Jan 2020, 05:26

darzo wrote:
14 Jan 2020, 15:40
So how do these now very high refresh rates interact with the "server tick rates" of games? I've definitely noticed an improvement, supported by performance, going to 240Hz, but from what I've read, for Overwatch only the esports tick rates have been 144, let alone 240.
Tick rate is more about minimising the issue of total input lag in relation to the player's response to what they're seeing in game. In that context, the maximum added delay between a tick rate of 144 vs 240 on an action in game is about 2.78 ms. Assuming you're not aiming within a few pixels/triangles of the edge of a hitbox, that's not a huge deal breaker for online play, because it's rare for an object to be moving fast enough to be impacted by this.
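
For reference, that worst-case figure is just the difference between the two tick intervals (a one-liner, assuming the action lands immediately after a tick):

```python
# Worst-case extra delay from the slower server tick rate.
def tick_interval_ms(tick_rate_hz):
    return 1000.0 / tick_rate_hz

print(tick_interval_ms(144) - tick_interval_ms(240))  # ~2.78 ms
```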

In fact, this is why hitboxes are a thing: to be a little more forgiving of player aim under the constraints of an online connection to an economically run server environment. You'll probably see tick rates go up in competitive games (aside from CS:GO, which is the only game I can think of where you can customise tick rates) in the coming years as server tech scales up.

The benefit of the increased number of frames shown, the reduced motion blur, and the improved motion perception still assists the player despite this, because you're able to more cleanly and accurately track things on screen to make a better shot.

AddictFPS
Posts: 314
Joined: 15 Jan 2020, 14:13

Re: World's First 360Hz G-Sync Monitor!

Post by AddictFPS » 23 Jan 2020, 23:52

Chief Blur Buster wrote:
22 Jan 2020, 19:54
...Hey, you asked for it. This is Blur Busters!
We already have 8K 120Hz electronics, which means 1080p ~2000 Hz is possible under the same bandwidth parameters (color depth setting, DSC setting, etc).

The technical restriction is the number of microseconds spent injecting voltage into an LCD pixel. Not enough time means you can't inject enough voltage in that brief time limit to execute a fast GtG. Often, when a panel gets overclocked far beyond its limits, its pixel response becomes lethargic / slower (falling like a rock) because the scanout pulse is far too brief to execute a really effective GtG.

Adding more channels of simultaneous refreshing to an LCD panel or OLED panel will help immensely. A panel that can refresh 8 pixel rows at a time instead of 4 will be able to do double the refresh rate at the same scanout velocity.
WoW - This sounds cooooool!

1Ch - 60Hz
2Ch - 120Hz :)
4Ch - 240Hz :D
8Ch - 480Hz
16Ch - 960Hz
Bottom line -- it boils down to how much time per pixel row you have for injecting voltage into pixels. Double the Hz means you have half the time. So you have to either spend less time voltaging a pixel row (to begin GtG transitions) or double the number of channels (to maintain active-refreshing-time-per-pixel-row).
I think it's not a good idea to reduce the voltaging time; if the LCD cell doesn't evolve at the same rate, we need good overdrive control. Moreover, I suspect that the issue called pixel inversion could appear, since it's caused by bad voltage management.

http://www.lagom.nl/lcd-test/inversion.php
Theoretically, future panels can double Hz by going back to the "ribbon-cables-entering-panel-at-both-top-and-bottom" approach, so a 720Hz refresh rate could be achieved with that 360Hz panel using just today's technology -- if it was fabbed slightly differently. But such a niche-manufactured LCD panel may be horrendously expensive (>$2000) as we no longer have monitor motherboard standards for dual-edge-fed panels for commoditized desktop monitors.
Approximately how expensive do you think it would be to manufacture a 25" TN FHD 240Hz with the "ribbon-cables" approach @480Hz, going 4Ch→8Ch? Double the price, $500→$1000? It would be very useful data for eSports professional gamers!
The technical restriction is the number of microseconds spent injecting voltage into an LCD pixel.
I read that MicroLED pixels have nanosecond response times, but how much time is needed to inject voltage into a MicroLED pixel?
Last edited by AddictFPS on 26 Jan 2020, 18:03, edited 1 time in total.

life_at_1ms
Posts: 38
Joined: 31 Oct 2019, 03:20

Re: World's First 360Hz G-Sync Monitor!

Post by life_at_1ms » 26 Jan 2020, 06:55

AddictFPS wrote:
23 Jan 2020, 23:52
Approximately how expensive do you think it would be to manufacture a 25" TN FHD 240Hz with the "ribbon-cables" approach @480Hz, going 4Ch→8Ch? Double the price, $500→$1000?
Let's just start with OLED please :) I can't handle bright blacks :(
500Hz+ OLED and I'll stop missing my CRT like a girlfriend from last century!

Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada

Re: World's First 360Hz G-Sync Monitor!

Post by Chief Blur Buster » 26 Jan 2020, 12:06

AddictFPS wrote:
23 Jan 2020, 23:52
Approximately how expensive do you think it would be to manufacture a 25" TN FHD 240Hz with the "ribbon-cables" approach @480Hz, going 4Ch→8Ch? Double the price, $500→$1000? It would be very useful data for eSports professional gamers!
The technical restriction is the number of microseconds spent injecting voltage into an LCD pixel.
I read that MicroLED pixels have nanosecond response times, but how much time is needed to inject voltage into a MicroLED pixel?
Just to be clear, "injecting voltage into an LCD" is a roundabout way of saying sending voltage down the matrix wires (horizontal and vertical) to the specific active-matrix transistor that is controlling a specific subpixel of an LCD (there are three separate GtG actions per pixel). Transistors can operate very fast (millions or billions of times per second) if they are designed as such, and the switching voltage is designed as such. Unfortunately with displays, you've got tiny transistors being controlled long-distance by microwire grids criss-crossing the screen. So switching speed is slowed down by various laws of physics. And once a transistor is switched, its switched state will drift until the next refresh pass, which then resets the transistor again. This is the TL;DR version, please don't ask for the novel-sized version. Ha.

On that note, the speed of controlling an LCD pixel through an active-matrix transistor has different behaviors from controlling an LED pixel. LED pixels are solid state, with no momentum behaviours of molecules needing to move (like in an LCD), so switching speed AND switching stability is much, much, much higher on a MicroLED. In fact, many of them are refreshed at 600 Hz, although those are just repeat-refresh passes. Right now, for MicroLED, refresh speed is not nearly as much of a weak link. It should be much easier to engineer 1000Hz discrete refresh cycles with direct-view MicroLED (distinguishing from MicroLED-backlit LCDs).

However, true 1000Hz MicroLEDs probably will still be more expensive than the first 1000 Hz LCDs, until the cost curves overlap....
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


AddictFPS
Posts: 314
Joined: 15 Jan 2020, 14:13

Re: World's First 360Hz G-Sync Monitor!

Post by AddictFPS » 26 Jan 2020, 18:49

life_at_1ms wrote:
26 Jan 2020, 06:55
Let's just start with OLED please :)
Warning OLED


Unfortunately this technology is not recommended for desktop tasks or games, due to static images causing burn-in.

HUD, crosshair, etc... all these elements get burned into the screen quickly, within a matter of weeks.

Moreover, LG says about the OLED C9 (2019) that the burn-in process can't be avoided; it's cumulative.

This means: playing 1 hour + turning the screen off for 1 hour + playing 1 hour has the same burn effect as playing 2 hours straight.

Using low brightness and Black Frame Insertion reduces the speed of the burn process, but doesn't stop it.

I think this is the only reason holding LG back from selling OLED monitors. They don't want an avalanche of complaints.

TV is a different use case, pixels are always changing color, but if it's used as a PC monitor, or to watch TV channels with a fixed logo, burn-in is assured:

https://www.rtings.com/tv/learn/real-li ... rn-in-test

Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada

Re: World's First 360Hz G-Sync Monitor!

Post by Chief Blur Buster » 27 Jan 2020, 02:50

To be fair, that said:

Sometimes the bliss of OLED is worth the (now somewhat decreased) risk of burn-in. I've heard of quite a few people enjoying good LG C9 OLEDs for a year with no significant burn-in, which suggests that they'll get a decent few years' worth out of it.

There are some general best practices to make the purchase worth it, considering that some got LG OLEDs in the league of $999 or thereabouts during major/refurb sales -- not bad for a 120 Hz-capable OLED in a television-sized format.

Basically, generally:
- Back off the OLED brightness. Then burn in will take much longer
- Use Dark Mode
- Taskbar set to autohide
- Use auto-rotating wallpaper or video wallpaper
- Short screensaver times
- Orbit feature (only a few pixels, but at least things like crosshairs will create blurry burn-in rather than sharp burn-in)

An OLED with a faint 1% to 3% burn-in after one year is still an enjoyable OLED, especially since faint burn-in is just an artifact that might be like a splotch on an LCD, or an artifact like IPS glow -- lost in the noise of busy scenes and only easily noticed on solid backgrounds. It's when it reaches a sufficiently strong level, or you view lots of material with solid reds, that it starts to become a huge bother.

Annoyance at faint 1% burn-in is partially psychological, like a new scratch on a new car, as LCDs aren't perfect either. OLEDs just come 'more perfect' out of the box and become 'more imperfect' (burn-in) as time goes on, but don't look worse than an LCD until after a while. True degradation to the point of no longer enjoying playing may take yet longer -- faster if you drive the OLED hard, slower if you are cautious about settings. You might end up with a 5-year-long progression of 0.5% -> 1% -> 2% -> 3% -> 5% burn-in, or you might get 10% burn-in after the first year -- even with the same number of hours of use -- depending on how you've configured things. Even the dim gray field of a VA LCD is splotchier than the burn-in of a pampered OLED after the first year.

Also, some OLEDs try to slow burn-in by keeping an illumination counter for each pixel of the OLED, and can compensate for burn-in that way, but those algorithms are still imperfect.
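
As a toy illustration of that idea (not any vendor's actual algorithm -- the decay rate and clamp below are invented purely for the sketch):

```python
import numpy as np

# Toy model of burn-in compensation via per-pixel illumination counters.
# Invented assumption: emitter efficiency decays ~1% per 1000 lit hours,
# and drive level can be boosted to hide the wear, up to a limit.
lit_hours = np.zeros((2160, 3840), dtype=np.float32)  # cumulative per pixel

def compensated_drive(target_nits):
    efficiency = 1.0 - 0.01 * (lit_hours / 1000.0)
    efficiency = np.clip(efficiency, 0.5, 1.0)  # compensation has limits
    return target_nits / efficiency             # drive worn pixels harder

lit_hours[0, 0] = 5000.0               # one pixel lit for 5000 hours
print(compensated_drive(200.0)[0, 0])  # ~210.5 -> ~5% extra drive needed
```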

At some point it may fall below the annoyance noisefloor, but at this point it's a "manageable annoyance" (i.e. years before burn-in begins to interfere with enjoyment) if the OLED is set up properly, unless you're using it as your primary computer desktop screen. And LCDs have wear and tear too: a 10-year-old gaming monitor may have fallen to under 100 nits of brightness due to the non-replaceable LED backlight wearing out.

The LCD will remain a horse in the display race for decades to come -- there's a lot of quality room left to milk from an LCD (gamut-wise, blacks-wise, contrast-wise, response-wise, Hertz-wise) -- and in some cases it can be made to look better than DLP / OLED. However, OLED is not necessarily doomed. With slight improvements to how fast burn-in appears, its degradation speed to non-enjoyment may not necessarily be vastly worse (i.e. a full order of magnitude) than the degradation speed of various components of an LCD gaming monitor (such as backlight wear & tear).
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


darzo
Posts: 211
Joined: 12 Aug 2017, 12:26

Re: World's First 360Hz G-Sync Monitor!

Post by darzo » 31 Jan 2020, 00:57

Any idea on the quarter in which this monitor will be released?

rootsoft
Posts: 42
Joined: 01 Nov 2014, 13:07

Re: World's First 360Hz G-Sync Monitor!

Post by rootsoft » 12 Feb 2020, 11:39

darzo wrote:
31 Jan 2020, 00:57
Any idea on the quarter in which this monitor will be released?
I have been looking for the same information.
