Timeline of 8K 1000Hz CRT phosphor fade emulation

Advanced display talk, display hackers, advanced game programmers, scientists, display researchers, display manufacturers, vision researchers & Advanced Display Articles on Blur Busters. The masters on Blur Busters.
Post Reply
thatoneguy
Posts: 181
Joined: 06 Aug 2015, 17:16

Timeline of 8K 1000Hz CRT phosphor fade emulation

Post by thatoneguy » 13 Jun 2022, 09:17

So I was thinking about what Chief was talking about -- emulating a CRT via 8K 1000Hz phosphor emulation (technically you'd need more than 7680x4320, depending on the CRT you're emulating, to get total accuracy, but still) -- and I did some calculations.
8k1000hz.PNG (24.86 KiB)
OUCH!! That's a lot of bandwidth required -- close to a terabit.
For comparison, the highest bandwidth available so far is DP 2.0, which maxes out at about 77 Gbps over a four-lane cable.
Maybe it could do 8K 1000Hz with DSC compression, but that would be unacceptable imo.
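The arithmetic behind that screenshot can be sanity-checked in a few lines (a rough sketch assuming 30-bit color and ignoring blanking intervals and link overhead, which real cables would add):

```python
# Back-of-the-envelope bandwidth for uncompressed 8K 1000 Hz at 30-bit
# color, ignoring blanking intervals and link overhead (real links add more).
def uncompressed_gbps(width, height, hz, bits_per_pixel=30):
    return width * height * hz * bits_per_pixel / 1e9

print(f"{uncompressed_gbps(7680, 4320, 1000):.0f} Gbps")  # ~995 Gbps
```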

Who knows when we'll reach terabit-bandwidth cables... perhaps 20 years from now. So until then, the 8K CRT shader with 1000Hz granularity (or even 360Hz) remains a pipe dream.

User avatar
Chief Blur Buster
Site Admin
Posts: 11653
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Timeline of 8K 1000Hz CRT phosphor fade emulation

Post by Chief Blur Buster » 13 Jun 2022, 19:44

Not necessarily.
  • New versions of DSC for compression (no names yet, but I'll call them DSC 2.0 and DSC 3.0 as placeholders)
  • Multiple parallel cables (like with the first 4K displays almost two decades ago)
  • Display-side processing
  • Future optical cables
Remember that the Zisworks 4K120 display required 2 parallel DisplayPort cables to work, and the IBM T221 could use 4 concurrent DVI links. Same principle. With today's technology, DSC and/or roughly ~8 to ~16 parallel cables can already do it.

8K ~1000Hz could eventually be reduced to 4 parallel video cables with a DSC 2.0 implementation, and 2 parallel video cables with a DSC 3.0 implementation (with a version bump to HDMI 3.0 or DisplayPort 3.0), without needing to go optical fiber yet.
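That cable math can be sketched as follows (77.37 Gbps is DP 2.0's four-lane effective payload rate; the "DSC 2.0" 4:1 and "DSC 3.0" 8:1 ratios are the placeholder assumptions from the bullet list above, not announced specs):

```python
import math

# Sketch: parallel cables needed for raw 8K 1000 Hz at a given compression
# ratio. 77.37 Gbps = DP 2.0 four-lane effective rate; the 4:1 and 8:1
# ratios are the hypothetical "DSC 2.0"/"DSC 3.0" placeholders above.
def cables_needed(raw_gbps, link_gbps=77.37, compression=1.0):
    return math.ceil(raw_gbps / (link_gbps * compression))

RAW_8K_1000HZ = 7680 * 4320 * 1000 * 30 / 1e9   # ~995 Gbps at 30-bit color

print(cables_needed(RAW_8K_1000HZ))                  # uncompressed: 13 cables
print(cables_needed(RAW_8K_1000HZ, compression=4.0)) # "DSC 2.0" 4:1: 4 cables
print(cables_needed(RAW_8K_1000HZ, compression=8.0)) # "DSC 3.0" 8:1: 2 cables
```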

Internally you could do it display-side (CRT emulator in the scaler/TCON display-side, instead of computer-side). Himax already has an 8K 288 Hz TCON, and I visited DisplayWeek 2022 and saw BOE demonstrate a 576Hz-capable 4K TCON. With such high Hz, we're getting close to the point where we can write a CRT electron beam simulator in the display's scaler/TCON firmware (replacing the interpolator code -- in fact a CRT electron beam simulator has simpler logic than motion interpolation!).

Here's photographic proof of a display with a 4K 576Hz TCON -- from my own smartphone -- from my visit at DisplayWeek.

[Photo: BOE's 4K 576Hz-capable TCON demo at DisplayWeek 2022]

That said, 8K 288Hz is twice the bandwidth of 4K 576Hz, and this illustrates how close we are to 8K 1000Hz, at least at the TCON level. Compared to 320x200 VGA 70Hz (4.5 million pixels per second), we have already achieved 9.5 billion pixels per second (over 280 gigabits per second at 30-bit color depth) in the TCON already announced at DisplayWeek. Read that again.

So humankind is not that far from these numbers, even at the consumer endpoint. It might require custom firmware, though, because you're doing it display-side (executing a CRT electron beam simulator instead of an interpolator).

Also, if you spend a bit, it can still be done computer-side. Through creative tricks, 8K 1000Hz is already in the lab today. While this is not consumer technology but more enterprise/simulator technology, 8K ~1000Hz is achievable with today's technology if you use 16 different 8K 60Hz LCoS projectors, or 8 different 8K 120Hz projectors -- strobing one at a time -- onto the same projection screen (projector stacking), each connected to its own separate genlocked GPU to display the appropriate frame.

Another alternate method of quadruple-digit true color is 24 (or 30 or 36) separate 1-bit monochrome 1440-2880Hz DLP chips projecting onto the same screen, one chip per bit of the 24-bit colorspace (or 30-bit or 36-bit). Zero temporal dithering at all. Even projector-stacking difficulties are conveniently solved by the retina-resolution nature of 8K and by using GPU shaders to do generic projection mapping without needing perfect pixel-grid alignment (effective rez loss ~10%, but you get the 1000Hz at the end) -- like digital keystone/bow/linearity/convergence/etc on steroids. Then you've got all 24-36 projectors aligned well enough for computer image data.
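The bit-plane idea above can be illustrated with a toy round-trip (purely illustrative; a real driver would map each plane to a DLP chip's binary frame buffer):

```python
# Toy bit-plane split: a 24-bit RGB value becomes 24 one-bit planes, one
# per monochrome DLP chip; the stacked projected light sums back to true
# color with zero temporal dithering. Illustrative only.
def to_bitplanes(rgb24):
    # plane 0 = least significant bit ... plane 23 = most significant bit
    return [bool(rgb24 >> bit & 1) for bit in range(24)]

def from_bitplanes(planes):
    return sum(1 << bit for bit, lit in enumerate(planes) if lit)

pixel = 0xC05A10                                      # arbitrary 24-bit color
assert from_bitplanes(to_bitplanes(pixel)) == pixel   # lossless round-trip
```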

There are already multiple generic designs for 4K and 8K 1000fps 1000Hz pipelines using merely today's technology, already being worked on in the labs :D

There are already plans to extend DisplayPort/HDMI to even higher bitrates -- possibly with a path to an optical interconnect.

Retina refresh rate isn't as impossible as you think, assuming sufficient budget and thinking outside-the-box.

From a scientific laboratory viewpoint, 8K 1000Hz today is where 4K was in 1995, a few years before the IBM T221 in 2001.

While it may take a decade (or two) to hit consumer displays, the industry is already working on perfecting the 8K 1000Hz supply chain. Naysaying the way you do simply artificially/falsely slows down the refresh rate race too much, methinks. Wink-wink. ;)

I hope that the accelerated existence of 1000Hz demonstration displays by multiple parties will accelerate the development of small form-factor (consumer sized) 1000Hz+ displays in an ultrafast-pixel format (e.g. OLED, MicroLED), once the benefits of such displays are properly realized.

The engineers who aren't paying attention need to study www.blurbusters.com/area51 to understand humankind's need for 1000fps 1000Hz (motion blur reduction without flicker-based techniques) for many use cases such as virtual reality and simulators -- technology that is already in the laboratory.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter

Forum Rules wrote:  1. Rule #1: Be Nice. This is published forum rule #1. Even To Newbies & People You Disagree With!
  2. Please report rule violations If you see a post that violates forum rules, then report the post.
  3. ALWAYS respect indie testers here. See how indies are bootstrapping Blur Busters research!

thatoneguy
Posts: 181
Joined: 06 Aug 2015, 17:16

Re: Timeline of 8K 1000Hz CRT phosphor fade emulation

Post by thatoneguy » 14 Jun 2022, 06:38

Was it necessary to change the thread title? :roll:
Chief Blur Buster wrote:
13 Jun 2022, 19:44
Not necessarily.
  • New versions of DSC for compression (No name yet, but I'll call them DSC 2.0, and DSC 3.0 as placeholders)
Well I already addressed that when I said that DSC Compression is unacceptable imo.
I think in retro games and pixel art games especially it'd be that much more noticeable and it would hurt the look of the CRT Shader making it counterproductive to the goal.

DisplayPort first came out around 2006-2008 and could do 10.8 Gbit/s at the time. Fast forward 14-16 years and that transmission rate has only increased by about 7 times. If improvements continue at this rate, it will take a long time for us to get uncompressed 8K 1000Hz.
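That extrapolation can be made concrete (a rough compound-growth sketch; the dates and the 7x ratio are the approximations from the paragraph above, and assuming the rate holds is itself a big assumption):

```python
import math

# Compound-growth extrapolation: DP 1.0 (~10.8 Gbps, ~2006) to DP 2.0
# (~77 Gbps, ~2020) is roughly 7x in ~14 years. Assuming that rate holds,
# estimate years until a ~1 Tbps link exists.
def years_until(target_gbps, now_gbps=77.37, growth=7.0, period_years=14.0):
    annual = growth ** (1 / period_years)        # compound annual growth rate
    return math.log(target_gbps / now_gbps) / math.log(annual)

print(f"~{years_until(995.0):.0f} years")        # on the order of ~18 years
```

Which lands in the same ballpark as the "perhaps 20 years" guess in the opening post.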
As a sidenote: there's been much buzz about VR and having 8K resolution per eye at 1000Hz+, and I think in VR applications especially the compression would be even more noticeable, so I really wonder if the industry will step it up when it comes to bandwidth.

Bottom line is that enthusiasts are not going to accept spending a lot of money for a compressed inferior experience when they can get a CRT for much cheaper.
The CRT replacement dream seems a lot farther away than I thought.

Full-on digital squarewave strobe at 8K+ 60Hz seems a more attractive proposition to me. You get uncompressed video/color space at the trade-off of more flicker, but since for retro games you're sitting far away from, say, a 20-inch screen, the flicker wouldn't be as noticeable. And in an arcade-cab scenario where you'd be close to the screen, we could just settle for longer hold times like 3ms MPRT, even if that would probably be inferior to a CRT.

User avatar
Chief Blur Buster
Site Admin
Posts: 11653
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: Timeline of 8K 1000Hz CRT phosphor fade emulation

Post by Chief Blur Buster » 14 Jun 2022, 13:52

thatoneguy wrote:
14 Jun 2022, 06:38
DisplayPort first came out around 2006-2008 and could do 10.8 Gbit/s at the time. Fast forward 14-16 years and that transmission rate has only increased by about 7 times. If improvements continue at this rate, it will take a long time for us to get uncompressed 8K 1000Hz.
A useful data point on compression:

I have a Quest 2, and Quest 2 compression for Oculus Link (USB-C cabled connection, instead of AirLink) can be as high as 300 megabits per second in HEVC (H.265). It looks perceptually lossless for 4K at 0.3 Gbps, for something that would be roughly ~20 Gbps uncompressed (roughly ~4K-ish at 90Hz), although that fits the limited colorspace of the Quest 2 LCD.
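A rough cross-check of those Quest 2 numbers (panel resolution and bit depth here are approximations, so the uncompressed figure lands a bit under the ~20 Gbps ballpark):

```python
# Approximate compression ratio of Oculus Link: ~4K-ish at 90 Hz, 24-bit
# color, against a ~300 Mbps HEVC stream. All inputs are ballpark figures.
def compression_ratio(width, height, hz, bpp, compressed_mbps):
    raw_mbps = width * height * hz * bpp / 1e6
    return raw_mbps / compressed_mbps

# Quest 2 panel is 1832x1920 per eye -> 3664x1920 total (approximation):
print(f"~{compression_ratio(3664, 1920, 90, 24, 300):.0f}:1 compression")
```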

That's almost E-Cinema territory bitrates. Netflix is barely over 10 Mbps for the same number of pixels, while Blu-Ray 4K can go almost 10x that for the video portion.

Based on that, if they develop the right compression algorithms, I'm pretty sure that roughly 4:1 compression from a future theoretical DSC algorithm would be even more perceptually lossless than that, even for a much wider DCI-P3 or even Rec.2020 colorspace with perfect blacks, especially when pixels are much smaller -- the compression algorithm would be optimized for exactly that.
thatoneguy wrote:
14 Jun 2022, 06:38
Bottom line is that enthusiasts are not going to accept spending a lot of money for a compressed inferior experience when they can get a CRT for much cheaper.
The CRT replacement dream seems a lot farther away than I thought.
That's fair. 8K 1000Hz is not going to be cheap for a long time.
thatoneguy wrote:
14 Jun 2022, 06:38
Full-on digital squarewave strobe at 8K+ 60hz seems a more attractive proposition to me compared to that.
We only need ~240Hz for software-based rolling scan to be a superior option (to many people) if it's OLED instead of LCD. It won't have as low motion blur as short-pulsewidth squarewave strobing, but it would look superior (in CRT realism) to the LG C9 OLED 60Hz BFI.

Also, 4K still provides relatively decent HLSL simulation; 8K is just the cherry on top of the cake.

Although buying a CRT is cheaper today, the problem is that it's becoming harder to find good CRTs, as hundreds still get dumped at recycling depots every day (even in 2022 -- businesses clearing them out, and residents clearing their attics during spring cleaning), reducing the number of CRTs available on the used market.

In ten or fifteen years, it will be far easier to get a 4K 500Hz OLED display that does a much more accurate spatial & temporal job of emulating a Sony PVM CRT than a worn-out, semi-misconverged CRT that you had to drive 4 hours to pick up.

4K 240Hz has already arrived (LCD). But as we already know, OLED/MicroLED can really compensate, to the point where 240Hz OLED shows clearer motion than 360Hz LCD (the OLED advantage is about 1.5x to 2.0x refresh rate, thanks to the lack of visible GtG blurring added on top of the MPRT blurring). Also, Samsung demoed a 240Hz OLED at DisplayWeek, so 240Hz OLED panels are not far behind.

If you've seen MAME HLSL on a 4K 60Hz OLED screen, you already know that it looks fantastically kick-ass -- 4K is a massive improvement for CRT filter simulation. At that point, every phosphor dot of an NTSC CRT becomes visible.

To get "sufficiently superior to a used CRT on the 2030s resale market", we do not really need to go straight to retina 8K and 1000Hz. Superior-on-average-to-a-worn-CRT on all checklist items (color, gamut, blacks, texture simulation, temporal simulation, etc.) is likely doable at 4K 500Hz, especially on OLED/MicroLED.
thatoneguy wrote:
14 Jun 2022, 06:38
You get uncompressed video/color space at the trade-off of more flicker but since for retro games you're far away from say a 20 inch screen then the flicker wouldn't be as noticeable, and if we're talking an arcade cab scenario where you'd be close to the screen then we could just settle for longer hold times like 3ms MPRT or so even if it would probably be inferior to CRT in that case.
For a game like Pac Man or Duck Hunt, you wouldn't need an ultrafast phosphor, as you would for fast-pan games like Super Zaxxon or Super Mario. Even those specific panning speeds can be accommodated to less than 1 pixel of motion blur with a software BFI implementation on a ~500Hz OLED/MicroLED.

This is because of the low resolutions involved: at e.g. 256x240, panning one screenwidth per second is only ~4 pixels per 60Hz refresh cycle. The MPRT of 500Hz would be half a pixel of motion blur at that Super Mario Brothers / Super Zaxxon style panning speed -- still vastly superior to 4 pixels of motion blur when playing Super Mario on an LCD at full running-speed blast. Only a very few games, such as Sonic the Hedgehog, pan faster than those common arcade platformer/shooter panning speeds.
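The persistence arithmetic above works out like this (assuming sample-and-hold, with MPRT equal to one full refresh period):

```python
# Persistence math: motion blur (in pixels) ~= panning speed (px/sec)
# times MPRT (sec), assuming sample-and-hold with MPRT = refresh period.
def blur_px(pan_px_per_sec, mprt_ms):
    return pan_px_per_sec * mprt_ms / 1000.0

PAN = 256  # one 256px screenwidth per second

print(blur_px(PAN, 1000 / 60))   # 60 Hz hold: ~4.27 px of motion blur
print(blur_px(PAN, 1000 / 500))  # 500 Hz hold: ~0.51 px, about half a pixel
```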

Even adding simulated phosphor decay would only affect that slightly, since it's an exponential fade. The next refresh cycle could be at 75% fade, the one after at ~95% fade, and so on. So it insignificantly affects the MPRT of one refresh cycle. And the fade curve can be custom-tuned as a compromise between motion clarity and CRT simulation.
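A toy version of that tunable fade curve (the 25%-retained-per-cycle constant is an illustrative choice matching the 75%/~95% figures above, not a measured phosphor):

```python
# Tunable phosphor fade: each later refresh cycle shows an exponentially
# dimmer copy of the frame. 25% retained per cycle gives 75% fade after
# one cycle and ~94% after two, close to the figures described above.
def decay_levels(cycles, retained_per_cycle=0.25):
    return [retained_per_cycle ** n for n in range(cycles)]

print(decay_levels(4))  # [1.0, 0.25, 0.0625, 0.015625]
```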

For the majority of games people want to play, 500Hz could easily be sufficiently retina for 90% of enthusiasts. Also, let's consider that in ~2050 we'll literally have 50-year-old adults who have never seen games on a CRT tube, so the pool of CRT enthusiasts is sadly shrinking.

Regardless.

Early tests of CRT electron beam simulators have shown impressive promise -- it actually begins to look superior to 60Hz 50:50 BFI early on, with merely a 240Hz display, as long as your panning speeds are not too fast (seams are still visible -- like super-blurry tearing, where a tearline is vertically blurred by hundreds of pixels). But at slow panning speeds, it has the flicker feel and lowered motion blur -- hints of a temporal CRT feel you normally don't get with flat panels. So there's already some promise in early internal tests. I am going to release (by 2023) a very basic TestUFO CRT electron beam simulator for 240Hz OLEDs.

We do not need it to look CRT-perfect in order for it to be worth it -- refresh rates and resolutions have finally reached the point where basic CRT beam simulators can start to be superior to monolithic BFI for slow-to-medium pan speeds.
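As a sketch of what such a beam simulator's inner loop might look like (a hypothetical simplified model, not the TestUFO implementation: one emulated refresh split across several physical refreshes, a band of rows lit fresh per subframe, previously scanned rows fading exponentially; all parameters illustrative):

```python
# Hypothetical simplified CRT beam simulator: one emulated CRT refresh is
# split across `multiple` physical refreshes. On each physical refresh
# (subframe), the "beam" band of rows is shown at full brightness; rows
# scanned in earlier subframes fade exponentially; rows not yet scanned
# stay dark (within this single emulated refresh).
def beam_simulator_subframes(rows, multiple, decay=0.3):
    band = rows / multiple                  # rows the beam covers per subframe
    subframes = []
    for s in range(multiple):
        factors = []
        for r in range(rows):
            drawn_in = int(r // band)       # subframe in which row r is drawn
            if s < drawn_in:
                factors.append(0.0)         # beam hasn't reached this row yet
            else:
                factors.append(decay ** (s - drawn_in))  # fresh, then fading
        subframes.append(factors)
    return subframes

# 8-row toy frame on a display running at 4x the emulated refresh rate:
subs = beam_simulator_subframes(rows=8, multiple=4)
# subframe 0 lights only the top band; by subframe 3 the top rows have
# faded through decay**3 while the bottom band is freshly drawn.
```

More refresh rate means a narrower band per subframe and a finer fade staircase, which is exactly the "scales with Hz" property described above.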

We must program accurate open source electron beam simulators while people like me are alive -- simulators that automatically scale indefinitely beyond my lifetime (looking better than BFI beginning at 240Hz OLED, very accurate at 480Hz, almost perfect at 960Hz, and retina at 1920Hz).

Just like HLSL automatically scales (looking more realistic on higher-resolution displays it was never tested on), CRT beam simulators can automatically scale to more accuracy when more refresh rate is thrown at them. 240Hz OLED is coming, and 4K 240Hz LCD has arrived (Samsung). The bottom line is that the notch from 1080p to 2160p, the bump from 240Hz to 480Hz, and the jump from LCD to OLED are the only steps needed to turn a beam simulator into something that can begin to look really faithful for most retro content at non-superfast pan speeds.

I do not think anyone disagrees with beginning such an open source project -- I feel it must be done while people who grew up around CRTs and the golden era of 8-bit gaming are still alive.

thatoneguy
Posts: 181
Joined: 06 Aug 2015, 17:16

Re: Timeline of 8K 1000Hz CRT phosphor fade emulation

Post by thatoneguy » 17 Jun 2022, 12:24

Chief Blur Buster wrote:
14 Jun 2022, 13:52

Also, 4K still provides relatively decent HLSL simulation, 8K is just cherry on the top of the cake.
Well, I wouldn't exactly say that 8K is the cherry on top.
4K is good, but it's barely scratching the surface.
For example, a standard consumer Sony Trinitron CRT needs more than 5000px vertically to be fully replicated -- that means 7416x5405 -- so 16:9 8K still doesn't quite meet the demand, vertically speaking:
https://twitter.com/ruuupu1/status/1356 ... 39?lang=en

For higher-TVL screens / lower-dot-pitch CRTs like BVMs, PVMs, or computer monitors, you would need more than that (though personally I wouldn't bother going past emulating 600-700 TVL, since the max resolution I'd emulate with CRT shaders would probably be 640x480, for 6th-gen games).

For "good enough" we're already practically there, especially with the recent Sony Megatron shader (whose author, "MajorPainTheCactus", has posted here), which is imo even superior to CRT-Royale or MAME HLSL.
Chief Blur Buster wrote:
14 Jun 2022, 13:52
Although buying a CRT is cheaper today, the problem is that it's becoming harder to find good CRTs, as hundreds still get dumped at the recycle everyday (even in 2022, businesses clearing them out and residents clearing their attics in spring cleaning), reducing the number of CRTs available on the used market.
Eh... tbh, there's still plenty of CRTs out there (if you're not too picky about quality / the cutting edge). I think they'll last us another 30 years.
Chief Blur Buster wrote:
14 Jun 2022, 13:52
4K 240Hz has already arrived (LCD). But as we already know, OLED/MicroLED can really compensate to the point where 240Hz OLED looks clearer motion than 360Hz LCD (the OLED advantage is about 1.5x to 2.0x refresh rate, thanks to lack of visible GtG blurring added to the MPRT blurring). Also Samsung demoed a 240Hz OLED at DisplayWeek so 240Hz OLED panels are not far behind.
That's the thing, I'd say if anything panels are progressing faster than bandwidth improvements.
Chief Blur Buster wrote:
14 Jun 2022, 13:52
For a game like Pac Man or Duck Hunt, you wouldn't need an ultrafast phosphor, as you would with fast-pan games like Super Zaxxon or Super Mario. Even those specific panning speeds are accomodatable to less than 1-pixel motion blur with a software-BFI-implementation on a ~500Hz OLED/MicroLED.

This is because of the low resolutions involved, e.g. 256x240, panning one screenwidth per second is only 4 pixels per 240p refresh cycle. The MPRT of 500Hz would be half a pixel of motion blur at that Super Mario Brothers / Super Zaxxon style panning speed -- which is still vastly superior to 4 pixels of motion blur when playing Super Mario on an LCD at full running-speed blast. Only very few games such as Sonic Hedgehog pans faster than those common arcade platformer/shooter panning speeds.
Eh, I dunno about that Pac-Man/Mario comparison. Though I have to correct you on Pac-Man's resolution (it's 224x288, not 256x240), it is true that Pac-Man has a fixed screen with no scrolling -- but the character sprites do move at a fairly fast rate, so I don't know if I'd consider SMB to be faster than Pac-Man overall. Aside from that, there are also plenty of arcade games from the early 80s which are faster than Pac-Man.
Also, even games like Pac-Man do appear ever so subtly smoother at lower persistence. You can even observe that in RPGs like Final Fantasy, which don't move/scroll very fast at all.

I agree that 500Hz would be pretty damn good, but even 8K @ 500Hz uncompressed would still need a lot of bandwidth, so I'd say that is pretty far away too. And even with DSC it's a few years off, because DSC only does 3:1 right now.

The parallel-cable idea is one I missed from your initial post. You're right, 16 parallel DP 2.0 cables could actually do it (provided, of course, that the panel can do 8K 1000Hz), although it would cost a lot and it would be a lot of cabling.
So in theory, if VESA improves DisplayPort bandwidth by 4 times five years from now (wishful thinking, but you never know), you would only need 4 parallel cables -- still a lot of cabling, but less than right now.
DisplayPort could also increase its bandwidth by using more lanes, but I'm guessing the main reason they don't go over 4 lanes is compatibility.


BTW, what's the reason we haven't gone optical fiber yet? Is it cost, or some other limitation?
The technology has been around forever, but I never quite understood why it hasn't been used in video interfaces yet.

User avatar
Chief Blur Buster
Site Admin
Posts: 11653
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: Timeline of 8K 1000Hz CRT phosphor fade emulation

Post by Chief Blur Buster » 17 Jun 2022, 16:03

thatoneguy wrote:
17 Jun 2022, 12:24
Chief Blur Buster wrote:
14 Jun 2022, 13:52

Also, 4K still provides relatively decent HLSL simulation, 8K is just cherry on the top of the cake.
Well I wouldn't exactly say that 8K is the cherry on the top.
4K is good but it's barely scratching the surface.
For example a standard Consumer Sony Trinitron CRT needs more than 5000px vertical to be fully replicated, which means 7416x5405 which means that 16:9 8K still doesn't quite meet the demand vertically speaking
https://twitter.com/ruuupu1/status/1356 ... 39?lang=en
This can definitely be true (8K not being enough), but it also depends on the size of the CRT and viewing distance -- what threshold you need to meet. Small CRTs are already beyond retina when you're sitting 20 feet away at sofa viewing distance.

As you mentioned, emulating 240p, 480i and 480p is a decently low bar. The important thing is that the algorithm scales successfully to better quality the more resolution you throw at it (better spatials) and the more refresh you throw at it (better temporals). So you're not limited to 4K nor 8K permanently.
thatoneguy wrote:
17 Jun 2022, 12:24
For "good enough" we're already practically there, especially with the recent Sony Megatron Shader(whose author "MajorPainTheCactus" has posted here) which is imo even superior to CRT-Royale or MAME HLSL.
Yes, there are superior shaders to HLSL now.
thatoneguy wrote:
17 Jun 2022, 12:24
Chief Blur Buster wrote:
14 Jun 2022, 13:52
Although buying a CRT is cheaper today, the problem is that it's becoming harder to find good CRTs, as hundreds still get dumped at the recycle everyday (even in 2022, businesses clearing them out and residents clearing their attics in spring cleaning), reducing the number of CRTs available on the used market.
Eh.. tbh there's still plenty of CRTs out there(if you're not too picky about the quality/cutting edge). I think they'll last us another 30 years.
They will be around for a while, but the pain point (no longer available locally + shipping from increasingly far distances) keeps going up and up, while the pain point of alternatives keeps going down and down. The crossover point won't happen at the same time in all parts of the world, and everyone has their preferences. OLED and HDR options will be much more widespread.
thatoneguy wrote:
17 Jun 2022, 12:24
Eh I dunno about that Pacman/Mario comparison. Though I have to correct you on Pacman's resolution(it's 224x288 not 256x240) it is true that Pacman has a fixed screen with no scrolling but the character sprites do move at a fairly fast rate so I don't know if I'd consider SMB to be faster than Pacman overall. Aside from that there are also plenty of arcade games from the early 80s which are faster than Pacman.
Perhaps the ghosts scroll as fast as the Super Mario playfield, but...

The dominant area of highly noticeable motion blur occurs from large playfield scrolling more often than from the character scrolling. The blur from a single sprite takes more persistence in order to be casually noticed than a whole playfield background.

It's equal if you're intensely paying attention to specific pixels and know what you're looking for, but if you're focusing on playing the game, the blur/persistence limitations show up more quickly in casual play in full-screen-scrolling games than in stationary-playfield moving-character games. Different players play differently -- some people use peripheral vision to pay attention to the ghosts, others stare and eye-track them, etc. The way people use their eyes in games varies quite widely.

The thing with large scrolling playfields is that it's harder not to eye-track them, so smaller margins of motion blur are noticed sooner, as a blur delta between stationary and moving. As things suddenly stop and start scrolling, the clarity difference emerges more rapidly and draws attention to itself. While admittedly everyone sees/uses their eyes differently, my example was intended to convey some games where display motion blur is more immediately noticeable (on average, not necessarily in just your experience).
thatoneguy wrote:
17 Jun 2022, 12:24
BTW, what's the reason we haven't gone optical fiber yet? Is it costs or some kind of other limitation?
The technology has been around for forever but I never quite understood why it hasn't been used in audio/video interfaces yet.
I'm sure it will have to happen eventually.

But for now, cable technologies change infrequently, and there are still improvements to be had (both by wire and by compression standards) before the cost/effort of standardizing an optical video interconnect is justified. Who knows -- maybe USB5 or USB6 with an optical channel in the middle of the USB-C port, or a clean-sheet connector.

thatoneguy
Posts: 181
Joined: 06 Aug 2015, 17:16

Re: Timeline of 8K 1000Hz CRT phosphor fade emulation

Post by thatoneguy » 18 Jun 2022, 10:10

Chief Blur Buster wrote:
17 Jun 2022, 16:03
This can definitely be true (8K not being enough), but it also depends on the size of the CRT and viewing distance -- what threshold you need to meet. Small CRTs are already beyond retina when you're sitting 20 feet away at sofa viewing distance.
It will definitely be noticeable in an arcade-cab scenario, and more specifically for mid-to-late-90s 24kHz/31kHz+ arcade games.
For those games you need 600-700 TVL (ideally probably 640-650 lines).
Downsampled phosphors will be more noticeable up close.

For a decent consumer TV like the Sony Trinitron I linked to, 8K 16:9 is about ~83% of the way there.

I agree with the rest of the post.

User avatar
Chief Blur Buster
Site Admin
Posts: 11653
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: Timeline of 8K 1000Hz CRT phosphor fade emulation

Post by Chief Blur Buster » 18 Jun 2022, 19:00

thatoneguy wrote:
18 Jun 2022, 10:10
For a decent consumer TV like the Sony Trinitron I linked to 8K 16:9 is about ~83% of the way there.
That is true about arcade CRTs seen really up close -- you definitely see the phosphor pattern even with merely 20/20 vision.

With all those 5K, 6K and 10K screens that have come out, as well as 16:10 variants of 8K (7680x4800, analogous to 1920x1200 and 3840x2400 displays), there could be subtle intermediate steps without going all the way to 16K. Display manufacturers often release odd panel sizes for various markets. It's hard to predict what panel manufacturers will decide to do in the 2030s, but resolutions and refresh rates are still on an upward trajectory.

I still remember when 1080p was still a fairy tale in the 1990s, when we were all about 1080i (interlaced) and I was using my NEC XG135 CRT projector (8-inch tubes x 3).

At the time, I used to work for video processor companies (Runco scalers, TAW, Key Digital, and the Faroudja-equipped HOLO3DGRAPH, or dScaler).

Post Reply