NVIDIA Demonstrates Experimental 16,000Hz AR Display

Sparky
Posts: 682
Joined: 15 Jan 2014, 02:29

Re: NVIDIA Demonstrates Experimental 16,000Hz AR Display

Post by Sparky » 15 May 2017, 16:40

Chief Blur Buster wrote:
RealNC wrote:However, 1000Hz, or even 2000Hz is realistic for old games. 16000Hz... eh, not so much. A 1000Hz OLED display is (IMO) what we can realistically dream for.
Now we just need 8-channel 120Hz OLED panels.
Would that need a panel and column/row drivers specifically designed for it, or can existing ones be driven that way? If you're designing all new hardware anyway, you might just make the number of channels equal to the number of rows, and get yourself a ~100khz panel. If your interface to the GPU can't handle that, you can use the extra performance for increased color depth.
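
A rough back-of-envelope sketch of that scaling (the 15.4µs line time and the idealized one-scan-block-per-channel model are just my assumptions for illustration, not anything from a panel maker's spec):

```python
# Hypothetical back-of-envelope: refresh rate vs. number of parallel scan channels.
# Assumes an idealized 1080p panel where each channel scans its own block of rows
# at a fixed per-row line time; real drivers/TCONs impose many more limits.

ROWS = 1080
LINE_TIME_S = 15.4e-6   # ~15.4 us per row, roughly a 60 Hz scanout velocity (assumed)

def max_refresh_hz(channels: int) -> float:
    """Each channel only has to scan ROWS/channels rows per refresh."""
    rows_per_channel = ROWS / channels
    return 1.0 / (rows_per_channel * LINE_TIME_S)

for ch in (1, 2, 4, 8, 1080):
    print(f"{ch:>4} channel(s): ~{max_refresh_hz(ch):,.0f} Hz")
# 1 channel ~60 Hz, 8 channels ~480 Hz, one channel per row ~65,000 Hz;
# a faster ~10 us line time would land near the ~100 kHz figure mentioned above.
```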

User avatar
Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: NVIDIA Demonstrates Experimental 16,000Hz AR Display

Post by Chief Blur Buster » 15 May 2017, 20:30

Sparky wrote:
Chief Blur Buster wrote:Now we just need 8-channel 120Hz OLED panels.
Would that need a panel and column/row drivers specifically designed for it, or can existing ones be driven that way? If you're designing all new hardware anyway, you might just make the number of channels equal to the number of rows, and get yourself a ~100khz panel. If your interface to the GPU can't handle that, you can use the extra performance for increased color depth.
Complexity goes up massively the more channels you try to cram in. As far as I understand it, each channel requires more wires and computing power on the board. You need the pixel processing power of 8K @ 60Hz to drive 1080p @ 960Hz. You've got the scaler and TCON (timing controller) to contend with, and sometimes you can implement that with FPGAs on the monitor motherboard. I'm not sure how they do it with OLEDs, however.
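
For what it's worth, the 8K @ 60Hz vs 1080p @ 960Hz comparison checks out as raw pixel throughput (ignoring blanking intervals and per-pixel processing overhead); a quick sanity check:

```python
# Quick sanity check of the pixel-rate comparison above (pure arithmetic).

def pixel_rate(width, height, hz):
    return width * height * hz

rate_8k_60    = pixel_rate(7680, 4320, 60)    # 8K UHD at 60 Hz
rate_1080_960 = pixel_rate(1920, 1080, 960)   # 1080p at 960 Hz

print(f"8K @ 60Hz     : {rate_8k_60 / 1e9:.2f} Gpixels/s")
print(f"1080p @ 960Hz : {rate_1080_960 / 1e9:.2f} Gpixels/s")
# Both come out to ~1.99 Gpixels/s, so the raw processing load is comparable.
```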

Currently, OLEDs only have the equivalent of two channels (an ON pass and an OFF pass can be refreshed concurrently).


That said, today we have OLEDs that can handle an ON-scan pass and an OFF-scan pass, so 120Hz using 60Hz scanouts may be an easier place to start. Instead of an ON pass and an OFF pass for pulsed 60Hz, you'd do two ON passes concurrently for true 120Hz (but at 60Hz scanout velocity). This is discussed in the other thread -- see Custom OLED Rolling Scans. At this technological stage, I think the only type of concurrent scan that today's OLEDs might feasibly support would be this one, which only requires 2 channels.
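
A toy timing model of what that 2-channel concurrent scan could look like (my own illustration of the idea, not how any actual OLED TCON is implemented):

```python
# Toy model of a 2-channel rolling scan: two scan pointers sweep the panel
# concurrently at 60 Hz scanout velocity, offset by half a frame, so every
# row gets refreshed twice per 1/60 sec (i.e. an effective 120 Hz refresh).

ROWS = 1080
SCANOUT_HZ = 60          # velocity of each individual scan pass
CHANNELS = 2             # concurrent scan pointers

def active_rows(t: float) -> list[int]:
    """Rows being refreshed at time t (seconds), one per channel."""
    frame_pos = (t * SCANOUT_HZ) % 1.0          # position within a 1/60 sec sweep
    return [int((frame_pos + ch / CHANNELS) * ROWS) % ROWS
            for ch in range(CHANNELS)]

# At t = 0 the pointers sit at rows 0 and 540; effective refresh = 60 * 2 = 120 Hz.
print(active_rows(0.0), "effective", SCANOUT_HZ * CHANNELS, "Hz")
```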


Once they manufacture 4 channels in an OLED, a 240Hz refresh rate at 60Hz scan velocity could be done, as could dual low-persistence rolling scans, which also require 4 channels (a low-persistence 120Hz strobed refresh rate with 60Hz scanout velocity). That said, I'm making hefty assumptions on how flexible OLED electronics can be -- you may, for example, have to replace the TCON with your own custom FPGA running custom code, etc. (much like what happened with the first GSYNC monitors, where the monitor motherboard was completely replaced).
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


Sparky
Posts: 682
Joined: 15 Jan 2014, 02:29

Re: NVIDIA Demonstrates Experimental 16,000Hz AR Display

Post by Sparky » 15 May 2017, 21:38

I was thinking that instead of updating one pixel at a time in a serial fashion, you could use your row drivers to select a row, and the column drivers to set the intensity of the entire row in parallel. You'd need to design the parts to allow it, but I don't see it being much more expensive in terms of die area or in-panel circuitry. And while I expect the vast majority of use cases for something like this to be increasing color depth via temporal dithering, it would let people build displays with absurdly high refresh rates.
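
A minimal sketch of the temporal-dithering idea (my own toy example; real panels typically do this kind of FRC in the panel electronics with more sophisticated error diffusion):

```python
# Sketch of temporal dithering: trade surplus refresh rate for extra color depth
# by letting the eye average several lower-bit-depth subframes. Purely illustrative.

def dither_sequence(target_10bit: int, subframes: int = 4) -> list[int]:
    """Spread a 10-bit target value (0-1023) across 8-bit subframes (0-255)."""
    acc, frames = 0, []
    for _ in range(subframes):
        acc += target_10bit
        out = acc // 4          # 10-bit -> 8-bit, carrying the remainder forward
        acc -= out * 4
        frames.append(out)
    return frames

seq = dither_sequence(514)      # 514/4 = 128.5 -> alternates 128 and 129
print(seq, "average =", sum(seq) / len(seq) * 4)   # averages back to ~514
```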

User avatar
Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: NVIDIA Demonstrates Experimental 16,000Hz AR Display

Post by Chief Blur Buster » 15 May 2017, 23:45

Perhaps... Even if you can refresh a whole row at once, you still need to inject voltage into each pixel for long enough to yield an accurate color. There's apparently already a lot of parallelism occurring on today's LCD panels (modern LCDs already refresh more than 1 pixel in a row simultaneously), and I presume the same is done with OLEDs -- simply because it's necessary to spend enough time injecting voltages into pixels, even when running at high refresh rates.
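
To put rough numbers on that (the 3µs settle time is purely my assumption for illustration):

```python
# Rough illustration of why row-level parallelism is needed: as the refresh
# rate climbs, the serial time budget per row shrinks below the time a pixel
# needs to settle to an accurate voltage. All numbers are illustrative guesses.

ROWS = 1080
PIXEL_SETTLE_US = 3.0    # assumed minimum charge time for an accurate level

for hz in (60, 240, 960):
    row_budget_us = 1e6 / (hz * ROWS)                             # serial time per row
    parallel_rows = max(1, -(-PIXEL_SETTLE_US // row_budget_us))  # ceiling division
    print(f"{hz:>4} Hz: {row_budget_us:5.2f} us/row -> "
          f"need ~{int(parallel_rows)} row(s) charging in parallel")
```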

But there are a lot of complicated factors in panel engineering, I hear... Crosstalk between rows and columns could become a rather intense problem if you tried to address too many pixels simultaneously. I remember the old days of LCDs where you had vertical and horizontal voltage leakages (single-pixel-width horizontal and vertical lines that brighten/darken) depending on how much driving occurred in that area of the LCD. You still do to a certain extent even on today's LCDs when viewing certain kinds of test patterns (e.g. Lagom Pixel Walk can show row-column pixel crosstalk artifacts on certain gaming monitor LCDs).

There needs to be sufficient time to send a voltage to the pixel (active matrix transistors aren't instantaneous...), and that's what plays in favour of a slower scanout. So for a two-channel OLED, somehow you need to send two different voltages to two different pixels at the same time. That's very hard with row/column addressing when one row is trying to steal power away from your other pixel, causing GtG inconsistencies and creating vertical/horizontal leakage effects. Those nanowires of rows/columns aren't zero resistance, so you probably have to real-time voltage-compensate each pixel depending on how much voltage you're trying to inject into the OTHER pixel. (Ohm's Law can be a big [BLEEP].) Mere fractions of incorrect microvolts into those TFT transistors can lead to the wrong color shade (even 0.1% shade inaccuracy creates visible issues in certain parts of the color gamut), giving image quality flaws. So you've got to compensate voltages darn near perfectly. I suspect current LCDs already have to do such shenanigans to preserve horizontal/vertical uniformity -- increasing the amount of voltage for pixels more distant from the edge row-column addressors...
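
A grossly simplified illustration of that Ohm's-law compensation (all component values invented, and it ignores the current drawn by neighbouring pixels sharing the same trace):

```python
# Toy illustration of the Ohm's-law problem: the row/column traces have real
# resistance, so pixels far from the driver see a lower voltage and need
# compensation. Resistance and current values are invented for illustration.

TRACE_R_PER_PIXEL = 0.05    # ohms of trace resistance per pixel (assumed)
PIXEL_CURRENT_A   = 20e-6   # drive current drawn per pixel (assumed)
TARGET_V          = 3.000   # voltage the addressed pixel actually needs

def drive_voltage(pixel_index: int) -> float:
    """Voltage the driver must output so pixel `pixel_index` sees TARGET_V."""
    ir_drop = pixel_index * TRACE_R_PER_PIXEL * PIXEL_CURRENT_A
    return TARGET_V + ir_drop

for px in (0, 960, 1919):
    print(f"pixel {px:>4}: drive {drive_voltage(px):.6f} V "
          f"(+{(drive_voltage(px) - TARGET_V) * 1e3:.3f} mV compensation)")
```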

There's also a lot of voodoo hidden in the TCON that many lower-tier monitor manufacturers don't even bother to learn about -- they just source Chinese panels, create higher-level custom firmware, put it all into a fancy custom bezel, and ship. With GSYNC, even NVIDIA does far more work as a monitor manufacturer than several of the monitor manufacturers themselves...

Now, for 8-channel, you're going to have literally a supercomputer inside the display panel trying to voltage-compensate all 8 pixels against each other simultaneously. It's easy enough to refresh multiple pixels in a row OR a column without creating too much risk of horizontal-line / vertical-line crosstalk artifacts -- but here we may be refreshing multiple pixels in a row AND a column. Ouch, leakage-mania. You might even need dedicated GPU processing to drive a 1000Hz OLED, or some big FPGA working in parallel handling each row addressor.

We're talking about >2 gigahertz pixel clocks, with probably a lot of FLOPS of computing power per pixel (realtime voltage compensation, brightness nonlinearity compensation, picture adjustments, etc.)! Counting the subpixels, that's 6 billion subpixels per second. Depending on the processing needed per subpixel, it could potentially exceed 1 TFLOPS to drive a 1080p @ 960Hz OLED properly with consistent image uniformity. The panel engineering helps a lot (e.g. electronics that are crosstalk resistant), but we're going to be pushing limits rather hard even to do just true 960Hz or 1000Hz.

You could instead refresh a different row/column (e.g. different row and different column) for the two pixels to reduce voltage crosstalk even further, but that has its own complexities and disadvantages, and your bandwidth would be limited to a total equal to one full row's worth of pixels (no matter whether you address more than one row with separate columns to avoid the simultaneous row/column leakage issue) -- so you might as well fast-scan rather than slow-scan. Which means you now have to figure out how to concurrently refresh multiple pixels in the same rows AND the same columns, and that's far more complex to do reliably than refreshing one row's worth of pixels simultaneously.

Trust me, it gets horrendously complex beyond the apparently simple factors... Sometimes some elements are simpler than expected (e.g. the concept of doing slow-scans for higher Hz), but the implementation may yet require very creative workarounds... OLEDs also behave differently from LCDs and have their own engineering considerations.
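
The arithmetic behind those pixel-clock and TFLOPS figures, as a quick back-of-envelope sketch (the ops-per-subpixel count is purely my guess to show the scaling):

```python
# Back-of-envelope version of the processing estimate above.

WIDTH, HEIGHT, HZ = 1920, 1080, 960
SUBPIXELS = 3
OPS_PER_SUBPIXEL = 200        # assumed: voltage comp, gamma, uniformity, etc.

pixel_rate    = WIDTH * HEIGHT * HZ              # ~1.99e9 pixels/s
subpixel_rate = pixel_rate * SUBPIXELS           # ~6.0e9 subpixels/s
flops         = subpixel_rate * OPS_PER_SUBPIXEL

print(f"pixel clock   : {pixel_rate / 1e9:.2f} GHz equivalent")
print(f"subpixel rate : {subpixel_rate / 1e9:.2f} billion/s")
print(f"processing    : {flops / 1e12:.2f} TFLOPS at {OPS_PER_SUBPIXEL} ops/subpixel")
```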

I envy the real display engineers who have to get their hands dirty with such finicky panels.

That said, at least getting to 1000Hz doesn't require a 1/1000sec scanout -- that will still dramatically simplify the engineering regardless... Five years ago I thought a 1000Hz OLED might be unobtainium, but I think we may actually be only 5 years away (at least for laboratory tests), assuming the panel is sufficiently designed to accommodate a higher Hz at slower scanouts, with multiple simultaneous pixel-row-refresh channels (which is technologically doable).
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter


mindovic
Posts: 2
Joined: 20 Apr 2017, 06:07

Re: NVIDIA Demonstrates Experimental 16,000Hz AR Display

Post by mindovic » 16 May 2017, 03:35

Interesting read, thanks! :) What came to my mind after reading that is a GPU with one core dedicated to every pixel of the panel, running a raytracing (per-pixel photorealistic rendering) renderer. Sounds utopian, but it would be interesting to see for real :D This solution would also eliminate problems with crosstalk and pixel selection.

Post Reply