
Re: Digital Foundry - CRT better than 4K OLED

Posted: 29 Sep 2019, 08:28
by rasmas
3dfan wrote:there is no need to read the entire thread there, use the search function; search words such as "dpu3000", "sunix", "delock", and also search for user "derupter", who has provided very useful info about DAC adapters.

Since you seem to be from Europe, there seems to be an adapter (Delock 62967), cheaper than the Sunix, for the European market; from my understanding, even though it has lower pixel clocks than the Sunix, it seems to be good enough for FW900 resolutions and refresh rates. For more info about it, search there for "62967".
Nice! I'll check these. The Delock 62967 costs 25€ here (VGA to HDMI), so, if it's similar to the DPU3000, it's 100€ cheaper :) .
1000WATT wrote:...
I have no hope that it will rise in price, especially in my country.
You can always ship it to another country (not sure about delivery costs, but if they pay $1000 for one as you said, I think it could be worth it, though maybe a bit time-consuming).
1000WATT wrote:for solo games there has long been an oled tv.
Do you have an OLED? Compared to your monitor and the Sony, is it like a good "mix" of both technologies (TFT and CRT)?
1000WATT wrote:...For FPS games there are 240Hz monitors, and soon we will get interpolation to 1000fps with minimal delays and artifacts, including for retro console players.
But we'll need powerful hardware for those framerates; that's why I like CRTs.
1000WATT wrote:...... my fw900 will definitely go to the trash.
Noo! :D
1000WATT wrote:Also, if you like to tinker with the settings and calibration of the FW900, this is a good choice; it will take away all your free time as it becomes more and more perfect. Like those people who like to sit in the garage repairing and cleaning the engine of their beloved old car. This is a kind of buzz.
...
In my case I would just get it to "good enough" and use it :D .

Thanks guys ;) .

Re: Digital Foundry - CRT better than 4K OLED

Posted: 30 Sep 2019, 16:19
by Chief Blur Buster
1000WATT wrote:(the only reason my CRT is standing in the closet is the transition from a 980 to a 1080 Ti. Three FW900s are available in my city for 105 and 140 dollars.)
I need a Sony GDM-W900 or a Sony FW900 or a NOKIA 445pro for the Blur Busters Lab (comparison purposes, etc.).

Anyone selling one within 2 hours' drive of the Toronto/Hamilton area? Send a PM or email mark[at]blurbusters.com

Note: If outside 2 hours' drive, I'm willing to accept a donation from afar if the party covers freight shipping to YHM. As a bonus, I'll commit to doing a CRT-FreeSync test on it, to see if the model tolerates an analog raster VRR signal via HDMI->VGA 1:1 scan conversion.

Re: Digital Foundry - CRT better than 4K OLED

Posted: 30 Sep 2019, 17:57
by 1000WATT
[image attachment: fhfhfh.jpg]

Re: Digital Foundry - CRT better than 4K OLED

Posted: 30 Sep 2019, 20:19
by Chief Blur Buster
Some people's boat anchors wasting closet space are "the precious" to you.

Yes, many of you don't want to let go of your precious.

My precious!

/LOTR

Re: Digital Foundry - CRT better than 4K OLED

Posted: 01 Oct 2019, 05:55
by 3dfan
Why don't you try asking in The CRT Collective group on Facebook? It seems there are people getting FW900s there from diverse locations.

Re: Digital Foundry - CRT better than 4K OLED

Posted: 01 Oct 2019, 19:00
by Chief Blur Buster
3dfan wrote:Why don't you try asking in The CRT Collective group on Facebook? It seems there are people getting FW900s there from diverse locations.
Thanks, I've requested a join.

Also, spacediver might loan me a CRT, but I'd like a permanent reference addition to the Blur Busters Lab. Comparison purposes, etc.

Re: Digital Foundry - CRT better than 4K OLED

Posted: 02 Oct 2019, 20:11
by flood
As a fan of CRTs and current owner of an FW900, I have to say the video in the OP is kind of underwhelming.

One thing: all the camera shots showing only the monitor have colors adjusted to crush black levels. For a room where the illumination is such that the bezels are as bright as in the video, the black levels of the FW900 will not be that great (due to diffuse reflections of ambient light from the phosphor layer).

0. Geometry/convergence: LCDs never have issues here.
1. Sharpness: LCD at native resolution or with integer scaling > CRT at any resolution > LCD upscaled.
2. Black levels: yes, CRTs are better, but with caveats:
a) The phosphor layer of a CRT reflects more ambient light than an LCD does. This means that in many (most?) lit rooms, the black levels of an LCD might be lower than those of a CRT.
b) On a CRT, the local contrast ratio isn't as good as the global one. It's kind of like FALD, except without the artifacts of individual rectangles going on and off.
c) Especially on old CRTs, you might need to mess around with calibration settings to get proper black levels.
d) Some CRTs require quite a bit of warming up before black levels settle down (the FW900 especially needs quite a while, >30 min).
But anyway, a CRT properly calibrated/adjusted in a dark room is just way better than any LCD (excluding FALD), and there's no IPS glow or whatever.
3. Persistence: CRTs have very low persistence, but there is also a soft trail. This is a mild annoyance for bright objects moving on a dark background. I don't know how the best strobed LCDs compare.
4. Viewing angles: CRTs are better than all LCDs. IPS displays might not have much color shift, but the luminance still drops off significantly with angle. The absence of this in CRTs results in a sort of subtle quality, where the displayed image looks like a uniform piece of paper floating in the monitor. It depends on how close you are to the display and its size, but with LCDs this simply is not the case; even with perfectly uniform backlighting, the center of the screen will be brighter than the corners.
5. Colors/gamut: ultimately it depends on where the content was mastered.
From spacediver's white point balance guide https://hardforum.com/threads/windas-wh ... s.1830788/ :
The phosphors used in high end GDM trinitrons are known as SMPTE-C phosphors, and their chromaticity is based on BT.601, which is a standard associated with SD (standard definition) content. Fortunately, the SD primaries are very close to the HD primaries, so this slight deviation is a non issue. Interestingly, with the advent of HD, many studios continued to use the Sony BVM CRTs, which had the SMPTE-C primaries, even when mastering HD content, which means that a fair chunk of HD content will be more accurately rendered on a GDM display than on a display that has true HD primaries.
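
For illustration, a quick numeric check of that "very close" claim, using the standard SMPTE-C and BT.709 primary chromaticities (values from the published specs, not from the post above); a minimal Python sketch:

# SMPTE-C (SD) vs BT.709 (HD) primary chromaticities (CIE xy).
smpte_c = {"R": (0.630, 0.340), "G": (0.310, 0.595), "B": (0.155, 0.070)}
bt709   = {"R": (0.640, 0.330), "G": (0.300, 0.600), "B": (0.150, 0.060)}

for p in "RGB":
    dx = abs(smpte_c[p][0] - bt709[p][0])
    dy = abs(smpte_c[p][1] - bt709[p][1])
    print(f"{p}: dx={dx:.3f}, dy={dy:.3f}")  # all deltas are ~0.01 or less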

Re: Digital Foundry - CRT better than 4K OLED

Posted: 03 Oct 2019, 12:17
by Chief Blur Buster
flood wrote:3. persistence: CRTs have very low persistence but then there is also a soft trail. this is a mild annoyance for bright objects moving on a dark background. i don't know how the best strobed LCDs compare.
The best strobed LCDs (the top 5%-10%) can easily beat many medium-persistence CRTs such as the Sony FW900.

The difference is:
-- Phosphor ghosting on CRT
-- Strobe crosstalk on LCD

Some CRTs have more phosphor ghosting, while some LCDs have less strobe crosstalk. The Venn diagram already overlaps.

It's hard to spend the money to buy enough monitors (CRT+LCD) to figure out where the Venn diagram overlaps. But I am Blur Busters; that means I've seen enough displays to witness the overlap.

<Advanced Technical Post>

As the Advanced Crosstalk FAQ indicates, the best LCDs can have literally ~1/100th the crosstalk of the worst LCDs. Unfortunately, most people can't afford to buy 10 monitors to find the one with the best strobing.

However, strobing often degrades color quality, due to the way many monitors need to slightly overclock their scanout (to cram GtG into the VBI) *and* stay away from overdrive-problematic full-dark and full-bright primaries (you need overdrive voltage undershoot headroom below black, and overdrive voltage overshoot headroom above white, to eliminate crosstalk for extreme bright/dark colors). Also, TN panels are prone to LCD inversion artifacts (checkerboard patterns), and strobing amplifies that.

For fans of strobing, what I absolutely love about the upcoming strobing on 240Hz 1ms IPS panels is:
(A) IPS is immune to inversion artifacts, for the most part.
(B) IPS is now approaching 1ms GtG, which finally "crams the GtG elephant into the tiny VBI drinking straw" -- most of the GtG time fits within the VBI time, which is needed to eliminate the majority of strobe crosstalk (see the sketch after this list).
(C) IPS is now 240Hz, which is plenty of hertzroom above comfortable strobing rates (~100-144Hz), hertzroom needed to eliminate the majority of strobe crosstalk.
(D) IPS backlights tend to be incredibly bright, which helps strobing.
(E) IPS colors are more immune to color degradation during strobing.
(F) IPS achieves "LightBoost 10%" motion clarity without nearly as many drawbacks.
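
A minimal numeric sketch of points (B) and (C), with assumed timings (the numbers are hypothetical, not a specific monitor): a 240Hz-capable panel strobed at 120Hz can keep its fast 240Hz-speed scanout, leaving the rest of each refresh as dark time in which GtG can finish before the backlight flash.

strobe_hz  = 120.0   # strobed refresh rate (assumed)
scanout_hz = 240.0   # panel still scans out at its max speed (assumed)
gtg_ms     = 1.0     # advertised GtG of a "1ms" IPS panel

refresh_ms = 1000.0 / strobe_hz       # 8.33 ms per strobed refresh
scanout_ms = 1000.0 / scanout_hz      # 4.17 ms to paint the frame
dark_ms    = refresh_ms - scanout_ms  # 4.17 ms of blanking/dark time

print(f"dark time per refresh: {dark_ms:.2f} ms")
print("GtG completes in the dark:", gtg_ms <= dark_ms)  # True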

Overvoltage-boosted strobing that milks the Talbot-Plateau Law to cram more photons into ever-shorter strobe flashes means LCDs can gain CRT motion clarity attributes as LED backlight technology improves. The Valve Index virtual reality LCD is able to achieve 0.33ms LCD persistence (less than one-third that of LightBoost 10%) while being extremely bright -- far brighter than LightBoost 10% or ULMB Pulse Width 10. That's why LCD strobe backlights can have less motion blur than low-persistence OLEDs, and it's part of why HTC/Oculus switched to LCD instead of OLED. It's easier to push engineering closer to the Talbot-Plateau law (double-brightness flash in half the time = equal perceived brightness) because outsourced light (a backlight) is easier to make brighter than direct pixel light. This may change when MicroLEDs come (they will be brighter than OLEDs); it's simply that "bright + low persistence" is extremely hard (strobe backlights darken the image), but you can simply use more powerful LEDs + heatsinked backlights. (Remember, LED is getting brighter and is now used in stadiums.) That makes brighter low persistence progressively easier to achieve, and it technologically also helps reduce strobe crosstalk, by keeping the flash length within the VBI too.
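
To make the Talbot-Plateau arithmetic concrete, a minimal sketch with illustrative numbers (not measurements of any particular backlight): perceived brightness of a strobed backlight is roughly pulse luminance times duty cycle, so doubling the flash brightness while halving the flash length keeps perceived brightness constant while halving persistence.

def perceived_nits(pulse_nits, pulse_ms, refresh_hz):
    # Talbot-Plateau: time-averaged luminance = pulse luminance x duty cycle
    duty = pulse_ms / (1000.0 / refresh_hz)
    return pulse_nits * duty

print(perceived_nits(1000, 2.0, 120))  # 1000-nit, 2.0 ms flash -> 240 nits
print(perceived_nits(2000, 1.0, 120))  # 2000-nit, 1.0 ms flash -> 240 nits, half the blur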

Strobe tuning and overdrive tuning will be critically important for IPS strobing, but I'm liking what I'm seeing on my desk (I have a 240Hz 1ms IPS already; not ready to mention the brand just yet). It's not as zero-crosstalk as LightBoost, but you're getting top-10% strobing *AND* the vaunted IPS quality. You still lose brightness, but at least it's only a drop from literally ~400 nits to roughly ~150 nits (ish) -- not a problem for many people who've been used to the poor ~50 nits of "LightBoost 10%".

You still have more strobing lag than a CRT, due to the requirement of scanout-in-dark and GtG-complete-in-dark before the flash -- but it's minimal and fine for motion-clarity-priority situations rather than latency-priority situations; and you can still turn off strobing to get the same IPS lag as an average 1ms TN panel.

Little known to users, there's also the asynchrony between scanout and VSYNC OFF (frameslices streaming into the scanout), so vertical latency gradient mechanics feel different on a CRT (sequential scan) versus strobing (global flash). But this can become irrelevant if you use low-latency VSYNC ON methods to minimize microstutter on CRT or LCD. (In fact, low-latency VSYNC ON feels better with strobed LCDs than with CRTs, because VSYNC ON is global and the strobe is global = zero latency gradient = no latency differentials between top/center/bottom for synchronized global frame delivery mechanisms.)

Most screens scan out top to bottom; see http://www.blurbusters.com/scanout for high speed videos of TN LCD, IPS LCD, OLED, etc.

Common latency gradient mechanics, Present()-to-Photons (a numeric sketch follows the notes below):
VSYNC ON + CRT = latency(top) < latency(bottom)
VSYNC OFF + CRT = latency(top) ~= latency(bottom)*
VSYNC ON + LCD strobe = latency(top) = latency(bottom)
VSYNC OFF + LCD strobe = latency(top) > latency(bottom)**

*Note1: Individual frameslices of VSYNC OFF are latency subgradients themselves: a global fixed gametime, streamed into the progressive cable scanout in realtime. Lag is always lowest immediately below a tearline and highest immediately above a tearline; frameslices are modifiers of [+0...+frametime] on top of pre-existing pixel latency, forming a linear latency gradient from top through bottom of each frameslice, and frameslices can overlap the VBI and/or wrap around to the next refresh cycle. Given sufficiently high framerates and random frameslice placements (random tearlines), average latency equalizes across the full surface of the CRT.
**Note2: You get Note1's equalization of latency on the cable, but you've got the mandatory scanout-in-dark followed by the global flash. As a result, the freshest pixels are near the bottom edge of the screen, so when measuring multi-pass average latency of a single pixel, the panel's latency gradient is inverted.
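
A minimal sketch of the table above, with assumed numbers (100Hz, 1200 scanlines; purely illustrative): under VSYNC ON, a CRT row lights up when the beam reaches it, while a strobed LCD scans out in the dark and lights every row at the same global flash.

refresh_hz = 100.0
refresh_ms = 1000.0 / refresh_hz  # 10 ms per refresh
rows = 1200                       # vertical resolution (assumed)

def crt_vsync_on_lag_ms(row):
    # beam reaches this row partway through the scanout: top < bottom
    return refresh_ms * row / rows

def strobed_lcd_vsync_on_lag_ms(row):
    # every row waits for the global backlight flash: top = bottom
    return refresh_ms

for row in (0, rows // 2, rows - 1):
    print(row, crt_vsync_on_lag_ms(row), strobed_lcd_vsync_on_lag_ms(row))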

As refresh rates go higher (or the refresh cycle gets eliminated altogether), the latency range in the latency gradient shrinks, since the gradient is usually capped to a range of one refreshtime (1/Hz). That said, assuming framebuffer backpressure is kept in check (e.g. VR-style shallow-queue framepacing techniques for an ultra-low-latency 90Hz or 120Hz VSYNC ON equivalent), perfect high-Hz strobing has a beautifully consistent latency feel. For a 100Hz CRT with VSYNC ON, most people don't care about the 10ms latency differential between top and bottom. On the other hand, some of us are simultaneously picky about latency AND latency consistency, in addition to zero stutter and zero blur. So understanding latency consistency along the screen surface plane is a key element of "Blur Busters Einstein" brain matter (I currently teach this stuff in the training/classroom sessions I sometimes do at manufacturers/vendors -- services.blurbusters.com -- from time to time, a vendor flies me in for a few days or a week).

Absolute lag is lowest with VSYNC OFF + CRT, though. But if latency consistency and a glass-floor Present()-to-Photons latency is desired for all 2 million pixels simultaneously, it currently favours strobed LCD + VSYNC ON. The problem is getting a really good strobed LCD without the artifact tradeoffs...

Either way, back on topic...

There are preferences among IPS (IPS glow) versus TN (TN viewing angles) versus VA (VA dark ghosting), but with the major speedup of IPS pixel response, it's easier to "have-cake-and-eat-it-too". Almost.

If you're buying the best strobed 120Hz display, cost-no-object, within 6 months, three words: "240Hz 1ms IPS". Yes, hertzroom; you need to buy hertzroom if you want good strobing at lower Hz.

We are going to have an exciting announcement soon. Stay tuned.

</Advanced Technical Post>

Re: Digital Foundry - CRT better than 4K OLED

Posted: 03 Oct 2019, 15:12
by flood
What is the strobe length of the best LCDs nowadays? I think, ignoring phosphor ghosting and such, CRTs are superior unless the strobe length is <0.5ms, in which case it doesn't really matter.
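
A minimal sketch of the arithmetic behind that 0.5ms figure (my illustrative numbers, using the usual persistence-to-blur rule of thumb, not from the post): perceived motion blur is roughly persistence times eye-tracking speed.

def blur_px(persistence_ms, speed_px_per_sec):
    # blur width ~= how far the eye tracks while the pixel stays lit
    return persistence_ms / 1000.0 * speed_px_per_sec

print(blur_px(0.5, 1000))  # 0.5 ms at 1000 px/s -> 0.5 px (sub-pixel, negligible)
print(blur_px(8.3, 1000))  # full-persistence 120Hz sample-and-hold -> 8.3 px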

There's still this annoying thing, however, if you're not in fullscreen exclusive mode:
viewtopic.php?f=10&t=1381&start=400#p26086

As long as you have a graphics driver installed, the cursor's position is set during vblank, so the cursor has less latency at the top than at the bottom.
I've asked people many times whether they can feel the difference in latency between the cursor at the top of the screen and at the bottom of the screen (which is like 15ms for 60Hz... quite a lot). I'm not sure I've gotten any positive responses, lol. Part of it is that you get used to it.

Re: Digital Foundry - CRT better than 4K OLED

Posted: 03 Oct 2019, 15:33
by RealNC
My CRT TVs had phosphor ghosting. My CRT monitors did not. At least I can't remember bright pixels leaving any visible trails behind.