ophidon wrote: ↑16 Sep 2020, 13:45
As for 1000 Hz screens in the future, as long as they don't remove the ability to run at 'low' Hz like 180 or 240, the issue should remain resolved if it can be with current technology. While I would certainly like to see a 1000 Hz display that could be bright enough at 1/1000, or even 10/1000 frame pulses for amazing clarity, I remain doubtful that will -ever- exist.

Just as Blur Busters loves to microphone-drop a lot of outdated 30fps-vs-60fps myths: that is the wrong perspective (like Newtonian thinking versus proper Einsteinian thinking), for multiple reasons.
(A) We don't necessarily assume single-Hz granularity.
1000Hz can also mean configurable phosphor-fade persistence -- for example, 1ms, 1.5ms, 2ms, or 2.3ms phosphor fade -- simply by having a faded alpha-blend-to-black region chase behind the rolling scan. The more Hz we have, the more accurately we can emulate the progress of phosphor decay in a CRT electron gun emulator (which also makes things brighter).
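As a rough illustration of the idea (a hypothetical sketch, not any shipping shader -- the function name, parameters, and exponential decay model are my own assumptions), a high-Hz display can emulate phosphor decay by computing, at each fast output refresh, how long ago the emulated beam passed each scanline:

```python
import math

def phosphor_frame(out_frame_idx, out_hz=1000, crt_hz=60,
                   lines=8, decay_ms=1.5):
    """Brightness of each emulated scanline at one high-Hz output refresh.

    The emulated CRT beam sweeps all `lines` once per 1/crt_hz seconds;
    each line then fades exponentially with time constant decay_ms
    (the alpha-blend-to-black region chasing behind the rolling scan).
    """
    t = out_frame_idx / out_hz                    # wall-clock time of this refresh (s)
    crt_period = 1.0 / crt_hz
    out = []
    for line in range(lines):
        # When did the beam last excite this line?
        beam_t = (t // crt_period) * crt_period + (line / lines) * crt_period
        if beam_t > t:                            # not yet reached this sweep:
            beam_t -= crt_period                  # use the previous sweep's excitation
        age_ms = (t - beam_t) * 1000.0
        out.append(math.exp(-age_ms / decay_ms))  # phosphor fade toward black
    return out
```

The higher the output Hz, the more finely the fade curve is temporally sampled, which is exactly why more native Hz means a more faithful (and brighter) electron-gun emulation.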
(B) HDR support brings the necessary brightness.
I have seen the Sony 10,000-nit prototype display at CES. It was an LCD, and it was amazing -- not too bright overall, because the headroom is used to make highlights brighter (streetlamps at night, neon lights at night, sun glinting off a chromed car). Those looked whiter than white. The headroom of 10,000 nits still leaves roughly 600 nits for 60Hz with a 1ms phosphor decay (a 1/16.7 duty cycle). Since HDR is limited to only a few pixels per refresh cycle anyway, a rolling-scan bar can satisfy the "HDR budget" simply by brightly illuminating only a few pixels at a time. CRT phosphor is similarly bright (five-figure nits), so HDR is the knight in shining armour that rescues the CRT emulation problem by 2030+. LEDs capable of 10,000 nits are pretty cheap; the problem is turning that into a FALD backlight in a three-figure-priced gaming monitor. But FALD is no problem if you are willing to pay four figures for a display. Let's not exclude four-figure displays from the quoted "I remain doubtful that will ever exist".
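That brightness budget is just duty-cycle arithmetic -- here it is spelled out (a minimal sketch; the function name is mine):

```python
def strobed_brightness(peak_nits, pulse_ms, refresh_hz):
    """Average perceived brightness of a display pulsed for pulse_ms
    once per refresh: peak nits multiplied by the duty cycle."""
    period_ms = 1000.0 / refresh_hz   # one refresh cycle, in milliseconds
    duty = pulse_ms / period_ms       # fraction of the cycle the pixels are lit
    return peak_nits * duty

# A 10,000-nit HDR panel lit 1 ms per 60 Hz refresh averages ~600 nits --
# CRT-like 1 ms persistence while still brighter than typical SDR desktop LCDs.
```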
(C) FALD innovations and MicroLED innovations
I've seen FALD LCDs with contrast and colors superior to OLED; they actually exist. The magic is a sufficiently high-resolution FALD (full array local dimming) backlight with a very large number of LED elements. Plus, MicroLED displays are easily capable of simultaneous "super-bright HDR" + "super-high Hz" with the right driving electronics.
As a person who has seen multiple 1000Hz through 2880Hz displays in prototype laboratories, I also know that LED modules have become cheap on Alibaba: a 1500-nit 32x32 module now sells for under $10. It's a jumbotron module designed to be built into big stadium screens, but hobbyists use them as miniature low-resolution screens too. Some of them cheaply reach 10,000+ nits. A time is coming when low-resolution MicroLED sheets (e.g. 1000-pixel to 10,000-pixel screens) fall below $100. (Even JumboTron 32x32 modules are now only a few dollars off Alibaba.) And did you know that jumbotron module internally refreshes at 1920 hertz -- for only eight dollars? If we swapped the 60Hz frame buffer chip for a custom FPGA, the eight-dollar JumboTron module would become a low-resolution 1920-hertz screen!

Eventually, JumboTron-superbright monochrome sheets will be available that can be commandeered into an ultrabright FALD. A few modifications, slap one behind a commodity LCD, and you've got a relatively kick-ass FALD backlight. Right now FALD is the territory of four-figure and five-figure priced screens that match or beat OLED quality, so most people don't believe LCD can become as bright and as contrasty -- which is a shame, because there's an amazing amount of headroom left in LCD.
Now, did you know that direct-view MicroLED is simply a miniaturized JumboTron, shrunk to retina resolution for desktop-sized/wall-sized use? And that it can blow past 10,000 nits?
All the tech exists; it just has not been glued together yet into an integrated solution, and it will require a beefy GPU with frame rate amplification technology. But ultimately, ultrahigh-Hz combined with ultra-HDR is not unobtainiumly expensive. 4K was a $10,000 lab curiosity in 2000; now it's a $299 Walmart special. Ultrahigh-Hz and HDR don't need to be impossible, especially with the slow mainstreamification of 120Hz (new smartphones, tablets, etc). DLP chips are already refreshing at 1440Hz-2880Hz today, using 1-bit temporal dithering to generate low-Hz full color, and are relatively cheap, but aren't suitable for desktop monitors. If you haven't seen thousands of prototype/experimental displays on convention and corporate trips like I have, then judgements of "impossible" are likely premature.
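To illustrate the 1-bit temporal dithering trick mentioned above (a simplified sketch -- real DLP chips use weighted bit-plane sequences; the error-accumulation scheme and function name below are my own invention for illustration):

```python
def temporal_dither(level, bits=8, subframes=256):
    """Approximate a grayscale level with a 1-bit mirror that is only ever
    fully on or fully off, flashed across many fast subframes (DLP-style).

    The eye integrates the flashes; perceived level = fraction of 'on'
    subframes, so thousands of Hz of 1-bit output yields smooth shades.
    """
    target = level / (2**bits - 1)      # desired on-fraction in [0, 1]
    acc, pattern = 0.0, []
    for _ in range(subframes):
        acc += target                   # accumulate brightness "debt"
        if acc >= 1.0:                  # enough debt for one full 'on' pulse
            pattern.append(1)
            acc -= 1.0
        else:
            pattern.append(0)
    return pattern
```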
The tech filters down really slowly, but so did 4K (over 20 years). Now you see Apple/Samsung working to mainstream 120Hz this decade, and all the 120Hz initiatives going on (NHK's 8K 120Hz broadcasts of the Tokyo Olympics). 240Hz might even go mainstream by the 2040s, while 1000Hz remains an enthusiast curiosity -- as long as the humankind benefits are there. If 240Hz can be done for only $3 to $5 extra with no impact on battery life, for 4x clearer browser scrolling, Apple/Samsung will do it. It's a matter of the economics of the falling cost of high Hz. 2030? 2040? 2050? Who knows? But for mainstream 240Hz, it isn't "never". What this does is push the bleeding edge further up (like 1000Hz ultra-HDR displays) through the pressures of technology progress, which solves our CRT-electron-beam-emulator problem.
I'll leave that as a mic drop.
I see the canary in the coal mine of the emerging technology of the future -- so never say never.
It may be a while before HDR converges with ultra-Hz, but it will happen before 2100. Perhaps by 2025, or 2030 -- I'm guessing 2030-ish. But that is not equivalent to "never". There are quite a lot of engineering paths toward a cheap 1000Hz display that also includes HDR -- though it will probably take approximately a decade, give or take (could be 2025, could be 2040).
The "never will happen" stuff tends to be Newtonian thinking around Blur Busters -- winky wink.
This is the type of stuff Blur Busters mythbusts -- the Blur Busters mission also includes putting a stop to the 1000Hz laughter relatively quickly, with articles such as Blur Busters Law: The Amazing Journey To Future 1000 Hz Displays. Even the diminished 240Hz-vs-360Hz difference (from LinusTechTips) is partially due to GtG limitations, combined with LinusTechTips' testing methods -- GtG limitations lagging behind Hz innovations, much like how the first 240Hz monitor from 2017 was more ghostly and motion-blurry than today's more refined 240Hz monitors.
Most of the mainstream masses are spoiled by the mostly unchanged look of desktop LCDs from 2005-2020. From the first Samsung 2ms TN 1080p/1200p displays to today's 1ms 1080p 240Hz displays, they all have similar contrast ratios of about 1000:1, similar brightnesses of 200-to-400 nits, and similar color gamuts (~72% NTSC), even through the transition from CCFL to the first LED backlights. If you're one of the masses spoiled by this, you ain't seen anything yet -- have you ever seen a desktop monitor driven by a 10,000-LED backlight? Totally different ballpark. It looks like an OLED, sometimes better, and the brightest whites exceed 1000 nits. With tens of thousands of LEDs in a locally dimmed backlight, the blooming is gone -- smaller than the bloom around a CRT phosphor dot.
Eventually, monochrome FALD MicroLED sheets will be mass-manufactured -- since 10,000-LED monochrome MicroLED sheets are much cheaper than 1080p direct-view MicroLED displays/JumboTrons -- so ultra-high-quality FALD LCDs in the sub-$1000 price range will arrive before direct-view full-color RGB MicroLED displays (à la a miniaturized JumboTron, or a desktop-sized Samsung "The Wall" panel). So expect nit-rocket FALD desktop monitors before then. Just set up a factory to manufacture FALD sheets as cheaply as cheap JumboTron modules, and you can commoditize FALD just fine, with no Hz limitations. It's only a matter of time -- other tech may arrive first (OLED? Direct-view MicroLED? Blue-phase microsecond-GtG LCDs? Etc.), but the fact remains that there are multiple cheap tech paths to nits+Hz.
We never needed high Hz before because Hz didn't help CRT as much -- CRT was already zero-blur (due to the flicker-strobe effect: low persistence). But the only way to achieve low persistence flickerlessly is ultra-framerate + ultra-Hz, and VR's push to pass the reality test (real life equalling VR) is also putting upward pressure on refresh rates, now that strobeless low persistence has finally become technologically possible (at least in the laboratory). Since the humankind benefits are there, it's only a matter of time before such refresh rates become cheap enough to include in screen technology (much like how 4K and retina resolutions no longer cost much more than low resolution).
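The persistence math behind this is simple -- the Blur Busters Law rule of thumb is that 1ms of persistence translates to 1 pixel of motion blur per 1000 pixels/second of eye-tracked motion (the function name below is mine):

```python
def motion_blur_px(persistence_ms, speed_px_per_sec):
    """Eye-tracked display motion blur in pixels:
    motion speed multiplied by how long each frame stays lit."""
    return speed_px_per_sec * persistence_ms / 1000.0

# 60 Hz sample-and-hold (~16.7 ms persistence) at 1000 px/s: ~16.7 px of blur.
# A 1 ms strobe -- or 1000 fps at 1000 Hz, strobelessly -- gives 1 px of blur,
# which is why ultra-framerate + ultra-Hz is the flickerless path to CRT clarity.
```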
Such a display can in theory (but increasingly experimentally proven, bit by bit) temporally emulate the look and feel of the refreshing pattern of any legacy display (within human vision integration timescales). Want the display to emulate the look of a plasma screen (including Christmas-tree effects and contouring)? Want it to emulate the look of a CRT tube (including zero blur + phosphor-feel + shadowmask)? A retina-Hz, retina-rez, retina-HDR display becomes a Venn diagram big enough to emulate past displays. 1000Hz may not be enough for perfectly accurate emulation of every legacy display, but it will begin passing a lot of A/B blind tests for a lot of people (flat-tube CRT versus an electron-gun-emulated ultra-Hz+rez+HDR digital display).
We've got multiple humankind technology-benefit pressures that will eventually force ultrahigh-Hz, ultrahigh-HDR, and ultrahigh-rez to converge simultaneously -- the recipe necessary for an accurate CRT electron gun emulator.
I talk to many researchers who are the refresh-rate equivalent of 1990s 4K researchers or 1980s Japan HDTV researchers. Sure, the 1950s and 1960s predicted wall-hangable TVs by the 1970s, and it took a lot longer -- but it wasn't never. We see the same situation here. The Blur Busters prime directive is to move the needle as early as possible, as cheaply as possible.
For those unaware, Blur Busters is two parts --
...Blur Busters Media writes simplified stuff in Popular Science style to mythbust the refresh rate race (descendants of the 30fps-vs-60fps myths)
...Blur Busters Laboratory does the more advanced stuff, like contracts with manufacturers, and I have to visit a lot of conventions (pre-COVID, anyway), so I get to see the future of displays much earlier than most. Our Area 51 Display Research articles on BlurBusters.com contain a lot of content that is now textbook reading at NVIDIA and other places, since they are more accessible Popular Science-style treatments of dry advanced scientific papers, helping to push the needle of the refresh rate race.
Have I thrown down the microphone hard enough yet? I have more...
</Future Screen Technology Discussion>