Digital Foundry - CRT better then 4K OLED

Advanced display talk, display hackers, advanced game programmers, scientists, display researchers, display manufacturers, vision researchers & Advanced Display Articles on Blur Busters. The masters on Blur Busters.
JKJK
Posts: 8
Joined: 23 Oct 2019, 15:48

Re: Digital Foundry - CRT better then 4K OLED

Post by JKJK » 28 Oct 2019, 13:44

It's probably too small an amount to register on a Geiger counter. But since there is no truly safe dose of ionizing radiation, the dose and its effects are cumulative over a lifetime. All those years playing Quake in front of a CRT have increased our odds of getting brain/eye cancer. :lol:

Jason38
Posts: 83
Joined: 24 May 2019, 10:23

Re: Digital Foundry - CRT better then 4K OLED

Post by Jason38 » 28 Oct 2019, 23:38

JKJK wrote:It's probably too small an amount to register on a Geiger counter. But since there is no truly safe dose of ionizing radiation, the dose and its effects are cumulative over a lifetime. All those years playing Quake in front of a CRT have increased our odds of getting brain/eye cancer. :lol:
I'm the guy who owns over 1,000 incandescent bulbs! There is some risk from those as well, because of the glow/flicker over a lifetime. That's a risk I am fine with, and even CRT is a risk I am fine with, because most LED lighting feels like death to my eyes and body. I do prefer my plasma TV over my CRT, though, and find it easier on my eyes. Everyone's vision is different, and for some reason my brain/eyes don't do well with LED.

ELK
Posts: 94
Joined: 18 Dec 2015, 02:51

Re: Digital Foundry - CRT better then 4K OLED

Post by ELK » 25 Nov 2019, 10:32

All Trinitrons, like the one in the video, use aperture grille technology. It is the reason they look so good.
https://en.wikipedia.org/wiki/Aperture_grille

FINALLY the same concept is seen on a modern screen, but (I think) it is patented by Samsung and only used in their VR headset, the Odyssey+.

It should be featured in every monitor imo.

User avatar
Chief Blur Buster
Site Admin
Posts: 7699
Joined: 05 Dec 2013, 15:44
Location: Toronto, Ontario, Canada
Contact:

Re: Digital Foundry - CRT better then 4K OLED

Post by Chief Blur Buster » 28 Nov 2019, 15:10

Jason38 wrote:
28 Oct 2019, 23:38
I'm the guy who owns over 1,000 incandescent bulbs! There is some risk from those as well, because of the glow/flicker over a lifetime. That's a risk I am fine with, and even CRT is a risk I am fine with, because most LED lighting feels like death to my eyes and body. I do prefer my plasma TV over my CRT, though, and find it easier on my eyes. Everyone's vision is different, and for some reason my brain/eyes don't do well with LED.
<SideTopic>
While LED illumination is not monitor tech, I follow these developments, and LEDs are used in displays too...

I'm not bothered by LED as long as it is high-CRI (90+) and PWM-free. When those criteria are met, I'm happy: 2019-era IKEA light bulbs are my "low budget favourite" LED bulbs at the moment, and the Blur Busters office is illuminated by a bunch of LEDARE 1600-lumen bulbs on an electronic dim-to-0% dimmer, which shift to a warmer colour when dimmed. They are more flicker-free than an incandescent, and they feel full-spectrum to me, but if you're sensitive to blue-light LED they might not be good enough for you. That said, they are much superior to fluorescent and classic LED; they are improved versions of the classic blue-LED design with modern phosphors, which still might not be good enough for everyone.

For most people, once colour spectrum and PWM are solved, any remaining LED discomfort tends to come from excess blue light in the bulb's spectrum. Try the ultraviolet-phosphor bulbs with CRI 95: they're the bee's knees. They are used in beauty salons and museums that pay a lot for a properly spectrophotometer-measured LED spectrum. They're like the better-than-plasma equivalent of LED bulbs, using phosphors to generate blue. Once you try those, you can ditch your hoard. I'm not sensitive myself as long as CRI is at least 80 (preferably 90+), but I understand the blue-spectrum issue that disrupts the circadian rhythm. That's why gaming monitors and iPhones/Androids now have low-blue-light modes. (Imperfect as they are, they help somewhat.)

Since I've become an LED expert, this probably deserves a new thread in the OffTopic Lounge.

TL;DR: Seek (A) PWM-free, (B) high-CRI, (C) ultraviolet-chip LEDs if you're extremely picky about LED. Getting all of A+B+C in the same bulb is hard to find.
</SideTopic>
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter

       To support Blur Busters:
       • Official List of Best Gaming Monitors
       • List of G-SYNC Monitors
       • List of FreeSync Monitors
       • List of Ultrawide Monitors

User avatar
Chief Blur Buster
Site Admin
Posts: 7699
Joined: 05 Dec 2013, 15:44
Location: Toronto, Ontario, Canada
Contact:

Re: Digital Foundry - CRT better then 4K OLED

Post by Chief Blur Buster » 28 Nov 2019, 15:24

ELK wrote:
25 Nov 2019, 10:32
FINALLY the same concept is seen on a modern screen, but (I think) it is patented by Samsung and only used in their VR headset, the Odyssey+.
Nearly all LCD panels use vertical stripes of color filters much like an aperture grille, just much finer and pixel-aligned. It is true, though, that most small OLEDs have used PenTile, which has its pros and cons. Fundamentally, as long as PPI is raised very high and rendering is subpixel-aligned, the benefits can be quite similar. But in the real world, PenTile screens are advertised as higher resolution than they really are.

For a PenTile screen quoted at the same resolution as a traditional RGB-stripe screen, the way the specs work out is that there are on average 2 subpixels per pixel on the PenTile versus 3 subpixels per pixel on the striped grid. When comparing apples-to-apples (total subpixel count), the PenTile can actually be better than stripe; it's just a sleight of hand in how resolution is quoted for the different pixel layouts. (Metaphorically: imagine a retina-class PenTile next to a low-resolution striped screen; you'd end up preferring the PenTile.)

I still prefer traditional striped grids, though; striped OLED is a newer technology for mobile screens.
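The subpixel-count arithmetic behind that "sleight of hand" can be sketched in a few lines. This is illustrative math only; the 2560x1440 resolution and the 2-vs-3 subpixels-per-pixel figures are the averages discussed above, not specs of any particular panel.

```python
# Illustrative arithmetic: total subpixels for a hypothetical RGBG PenTile
# panel (~2 subpixels/pixel on average) versus an RGB-stripe panel
# (3 subpixels/pixel) quoted at the same nominal resolution.

def total_subpixels(width: int, height: int, subpixels_per_pixel: int) -> int:
    """Total addressable subpixels for a given layout."""
    return width * height * subpixels_per_pixel

# Same quoted resolution, different layouts:
pentile = total_subpixels(2560, 1440, 2)   # 7372800
stripe = total_subpixels(2560, 1440, 3)    # 11059200

# The stripe panel has 1.5x the subpixels at the "same" quoted resolution,
# which is why equal-resolution marketing flatters PenTile.
print(pentile, stripe, stripe / pentile)
```

Flip the comparison to equal subpixel budgets instead of equal quoted resolution, and the PenTile panel gets to spread its subpixels over more addressable pixels, which is the apples-to-apples case described above.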

Jason38
Posts: 83
Joined: 24 May 2019, 10:23

Re: Digital Foundry - CRT better then 4K OLED

Post by Jason38 » 04 Dec 2019, 20:32

Chief Blur Buster wrote:
28 Nov 2019, 15:24
ELK wrote:
25 Nov 2019, 10:32
FINALLY the same concept is seen on a modern screen, but (I think) it is patented by Samsung and only used in their VR headset, the Odyssey+.
Nearly all LCD panels use vertical stripes of color filters much like an aperture grille, just much finer and pixel-aligned. It is true, though, that most small OLEDs have used PenTile, which has its pros and cons. Fundamentally, as long as PPI is raised very high and rendering is subpixel-aligned, the benefits can be quite similar. But in the real world, PenTile screens are advertised as higher resolution than they really are.

For a PenTile screen quoted at the same resolution as a traditional RGB-stripe screen, the way the specs work out is that there are on average 2 subpixels per pixel on the PenTile versus 3 subpixels per pixel on the striped grid. When comparing apples-to-apples (total subpixel count), the PenTile can actually be better than stripe; it's just a sleight of hand in how resolution is quoted for the different pixel layouts. (Metaphorically: imagine a retina-class PenTile next to a low-resolution striped screen; you'd end up preferring the PenTile.)

I still prefer traditional striped grids, though; striped OLED is a newer technology for mobile screens.
I'm interested in the difference between PenTile OLED and striped OLED. I get strained by almost every LED screen, but I have had success with two OLED phones, the Yotaphone 2 and the Samsung S2. I want to buy an OLED TV from LG but will wait one more year, as I think they are still a bit too expensive. It seems LG uses RGBW, which I hope will be OK for me. I'm holding on to my Yotaphone 2 for one more year if I can, since I want at least a 120 Hz OLED with my next phone purchase. The Samsung S2 used an RGBG layout, which is kind of interesting; that phone never gave me issues either. I'm also really interested in a gaming OLED monitor; these can't come soon enough. I think the JOLED gaming monitors will be different from LG's, as they will be RGB OLED. I feel like anything OLED will solve most of my eye strain problems with screens. I love the idea of true blacks and fast pixel response times. I usually seem pretty fine with 60 Hz OLED; I can only imagine how awesome 120 Hz or higher would be.

User avatar
LagBuster
Posts: 71
Joined: 15 May 2014, 06:50

Re: Digital Foundry - CRT better then 4K OLED

Post by LagBuster » 06 Dec 2019, 14:01

New video:

[embedded video]

User avatar
Chief Blur Buster
Site Admin
Posts: 7699
Joined: 05 Dec 2013, 15:44
Location: Toronto, Ontario, Canada
Contact:

Re: Digital Foundry - CRT better then 4K OLED

Post by Chief Blur Buster » 06 Dec 2019, 16:10

They will need to go head-to-head against the ViewSonic XG270 :D
witega wrote:
05 Dec 2019, 23:45
...Re: ViewSonic XG270...
Yeah, 120 Hz strobed is incredible with this monitor. I am not seeing any crosstalk; just remarkable. With the color accuracy of an IPS, it's time to retire the Sony FW900 (well, I already did, but if I still had it).
The FW900 already meets its match if you cherrypick the right gaming monitor.
That's been a big goal of the new "Blur Busters Approved" program.

Crosstalk-free strobing with 99% gamut. No "bad LightBoost colors" stuff. You do have to run a 240Hz IPS monitor at 120Hz to get FW900-competing colorful CRT-clarity motion on an LCD without degraded colors, because 120Hz strobing is higher quality on a 240Hz panel than on a 144Hz panel: the faster panel finishes its refresh sooner, so you finally go crosstalk-free (no double images). And it's the fastest IPS on the market, and you now get a wider color gamut, just like with a Sony FW900.

You do have to cherrypick the right LCD in order to blow away the FW900.
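The reason 120 Hz strobing works better on a 240 Hz-capable panel can be sketched with simple timing arithmetic: the faster scanout leaves slack for pixel transitions to finish before the backlight flash. The numbers below (240 Hz and 144 Hz scanout speeds, a ~3 ms GtG transition) are illustrative assumptions, not measured XG270 specs.

```python
# Rough strobe-timing sketch (illustrative numbers, not monitor specs):
# strobing at 120 Hz on a panel that can scan out at 240 Hz leaves time
# for pixels to settle before the flash; a 144 Hz-class scanout does not.

def settle_slack_ms(strobe_hz: float, scanout_hz: float, gtg_ms: float) -> float:
    """Time left between end of scanout + pixel transition and the strobe flash."""
    frame_ms = 1000.0 / strobe_hz      # strobe period (one visible frame)
    scanout_ms = 1000.0 / scanout_hz   # time to paint the frame top-to-bottom
    return frame_ms - scanout_ms - gtg_ms

# 120 Hz strobe on a 240 Hz-capable panel, assuming ~3 ms GtG:
print(settle_slack_ms(120, 240, 3.0))  # positive slack -> crosstalk-free flash

# Same strobe rate with only 144 Hz-class scanout speed:
print(settle_slack_ms(120, 144, 3.0))  # negative slack -> double-image crosstalk
```

Positive slack means the whole frame is fully settled when the backlight flashes; negative slack means the flash illuminates pixels still mid-transition, which is visible as strobe crosstalk (double images).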

3dfan
Posts: 57
Joined: 26 Jan 2014, 06:09

Re: Digital Foundry - CRT better then 4K OLED

Post by 3dfan » 12 Dec 2019, 07:00

Nice to see the Blur Busters crew helping tune modern monitors for the best experience. But as much as I would like modern display technology to finally match the old CRT for gaming, after waiting eternally, and even though it is undoubtedly better than a TN monitor, I still have my doubts that the ViewSonic XG270 is the long-awaited successor, because:

- There is still no 60 Hz strobed mode. As you stated in another post, the monitor only strobes at 75 Hz minimum, and it is a full-panel strobe, not a rolling scan, so the strobe flicker has to be more aggressive than CRT flicker even at 75 Hz. A 60 Hz CRT is very useful, not just for emulators, retro games, or 60 fps-limited games, but also for modern games that struggle to run beyond 60 fps at nice-looking settings, and I notice no objectionable flicker at 60 Hz gaming on the CRT. Even though the FW900 has a bigger screen than other 4:3 CRT monitors, I don't feel its flicker is amplified at 60 Hz gaming by the larger size.

- You also stated in another post that the XG270 still has a brightness-versus-clarity tradeoff, which makes me believe the CRT-clarity mode on the XG270 reduces brightness notably. So if you want a bright, lifelike, colorful, clear-motion gaming picture on the XG270, the only way would be to play non-strobed (to avoid the brightness loss) at 240 Hz with a constant 240 fps to reduce motion blur, and not even an overpriced GPU such as the RTX 2080 Ti, or even a Titan, will be able to do that.

- Also, what about blacks, especially in dark-atmosphere games, on a monitor using IPS tech, which is known for poor blacks, ugly glow, and noticeable backlight bleed? I don't see how it can beat, or even match, the CRT on this.

- And there is still one of the strong CRT advantages mentioned in both Digital Foundry videos, which I, as an FW900 owner, completely agree with: the CRT's ability to display nice-looking, clear-moving games at lower resolutions and refresh rates than a modern monitor needs to approach CRT quality. While the ViewSonic XG270 being native 1080p instead of 4K or 1440p can be an advantage, it still needs a higher refresh rate than a CRT, even when strobing, to approach the same quality, and that is without forgetting the brightness-loss issue.

It would be interesting if modern VA monitors got Blur Busters' attention for future tuning (well, OLED too, but the wait has been so long that I am losing hope). I remember a while ago a CRT enthusiast claimed a VA panel was the closest-quality modern monitor to the FW900 he had seen; however, it could only strobe down to 85 Hz minimum and had the fatal flaw of not allowing the user to reduce the strobe brightness, being too bright in strobed mode. I am not an expert on this, but I find it weird that the monitor maker did not allow the user to adjust this. It would be understandable if it were a matter of raising the backlight brightness, which could potentially damage it, but in this case it would be the opposite: just lower the backlight voltage to reduce the over-brightness, wouldn't it? He also claimed latency on that monitor was very good, and since VA panels have much better blacks than IPS, that makes it a more interesting technology as a realistic rival candidate to the CRT.
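The brightness-versus-clarity tradeoff raised above follows directly from how strobe backlights work: motion blur scales with how long the backlight stays lit per frame (persistence), while brightness scales with the same number as a duty cycle. This sketch uses the well-known rule of thumb that 1 ms of persistence produces 1 pixel of motion blur per 1000 px/s of motion; the pulse widths and panning speed are assumed example values.

```python
# Persistence-vs-brightness tradeoff behind strobed backlights:
# shorter backlight pulses -> sharper motion but dimmer picture.

def motion_blur_px(persistence_ms: float, speed_px_per_s: float) -> float:
    """Perceived motion blur trail in pixels (1 ms ~ 1 px per 1000 px/s)."""
    return persistence_ms * speed_px_per_s / 1000.0

def relative_brightness(persistence_ms: float, frame_ms: float) -> float:
    """Brightness relative to an always-on backlight (duty cycle)."""
    return persistence_ms / frame_ms

frame = 1000.0 / 120.0  # 120 Hz strobe -> ~8.33 ms per frame
for pulse in (0.5, 1.0, 2.0):  # assumed strobe pulse widths in ms
    blur = motion_blur_px(pulse, 1000.0)  # at a 1000 px/s pan
    duty = relative_brightness(pulse, frame)
    print(f"{pulse} ms pulse: {blur:.1f} px blur, {duty:.0%} brightness")
```

This is also why a user-adjustable pulse width (as the VA panel above lacked) matters: lengthening the pulse trades clarity back for brightness, and a CRT's phosphor decay effectively gets a very short "pulse" at high brightness for free.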

thatoneguy
Posts: 46
Joined: 06 Aug 2015, 17:16

Re: Digital Foundry - CRT better then 4K OLED

Post by thatoneguy » 13 Dec 2019, 01:51

IMO, OLED is a stopgap technology, like plasma.
MicroLED is where it's at, and if you've been paying attention to the latest news, there have been some important breakthroughs lately, such as Plessey finally developing red InGaN LEDs on silicon, which allows them to produce full-RGB-color displays.

We're going to need crazy brightness for HDR (and for strobing as well), so OLED is fighting a losing battle already.
OLEDs simply can't get bright enough to meet the new high-end standards.
They will lose to quantum emissive displays and MicroLED displays in the long run.

Post Reply