NeonPizza wrote: ↑31 May 2024, 00:07
But how much latency does the HDR SDR brightness boost trick add exactly, and how many nits do you gain to compensate for the 50% brightness drop using 60hz 8.3ms BFI in SDR game mode?
Picture processing on the Retrotink 4K is done in real time on a streaming basis (raster/beam-raced) for things like brightness, contrast, etc. There is no lag penalty for using any of the Retrotink 4K's picture adjustments; the brightness boost is part of this.
Only the lag difference between SDR mode and HDR mode would be applicable here.
NeonPizza wrote: ↑31 May 2024, 00:07
I plan on getting a QD-OLED (last year's Samsung S90C), but its internal 60hz BFI has a whopping 29ms of latency. But that automatically becomes irrelevant, basically dropping down to 0 lag, since my S90C would be paired with the TINK4K using its
own 60hz BFI, which adds a grand total of 8.3ms of lag without any additional latency coming from the TV?
2ms lag - (TINK4K Transceiver)
8.3ms lag - (TINK4K 60hz BFI @1080p or 4K)
4ms lag, assuming - (HDR SDR Brightness boosting trick)
14.3ms of lag total, with no additional latency coming from the S90C? If that's so, that's pretty good with an 8.3ms persistence and all! ~5ms of lag total would be ideal to get games feeling nearly as good as a CRT, but that's where vanilla 120fps or 144fps comes into play.
You asked me a question with additional unknown/undefined/unconfirmed scientific variables, so I'm not able to answer this. Apologies.
Also, I do not understand the 4ms rabbit-out-of-a-hat number. There is no lag difference between HDR without boost and HDR with boost, only an HDR on/off lag difference. The total is just transceiver lag (which includes the rolling-scanline window used for realtime streaming picture processing) plus the half-frame BFI lag. There is no lag penalty for using picture adjustments like contrast/brightness/saturation boost.
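As a sanity check on the arithmetic, the only additive terms are the transceiver lag and the half-frame BFI lag. A minimal sketch (the 2 ms transceiver figure comes from the quoted post, not an official specification; there is no separate HDR-boost term):

```python
TRANSCEIVER_MS = 2.0  # figure from the quoted post; includes rolling-scanline window

def half_frame_ms(refresh_hz: float) -> float:
    """Half of one refresh period: the lag added by half-frame BFI."""
    return 1000.0 / refresh_hz / 2.0

def total_lag_ms(refresh_hz: float = 60.0) -> float:
    # Picture adjustments (brightness boost, contrast, etc.) add zero extra lag,
    # so there is no third term in this sum.
    return TRANSCEIVER_MS + half_frame_ms(refresh_hz)

print(round(total_lag_ms(60.0), 1))  # 2 + 8.3 = 10.3 ms, no HDR-boost penalty
```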
NeonPizza wrote: ↑31 May 2024, 00:07
Another thing I was wondering about is whether the TINK4K HDR SDR brightening trick has a negative impact on colour accuracy. I've noticed that turning the HDR Module 'ON' setting in my LG C1 OLED's service menu to gain more brightness when using the TV's internal 60hz BFI in SDR game mode ruins the colours, and doesn't really salvage enough brightness either. Does the exact same thing apply to what the TINK4K is doing, colour-wise, or is it a mere brightness upgrade without any negative impact to the picture?
You adjust the HDR boosts to stay within your TV's gamut. This will minimize color distortions.
Large HDR boosts will distort color but you can compensate using other picture adjustments on the Retrotink, like brightness/contrast/gamma/saturation/RGB gains/etc. You can save favourite profiles.
Remember, original CRT tubes also distorted colors when they became "bright" (e.g. overbright scenes can have color distortions versus dim scenes).
Some screens I tested distort HDR boosts less than others. For example, OLED screens tended to distort HDR boosts somewhat less than LCD screens. That said, you don't want to push beyond roughly ~2x brightness on current LG WOLEDs, though you may be able to reach ~3x+ on brighter (higher-end) OLED panels before really noticing any colorspace distortions.
So this doesn't necessarily detract from the retro experience, as long as kept under sufficient control...
...because CRT tube color also distorted when you overbrightened/oversaturated/etc them.
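To illustrate why large boosts distort color, here is a toy sketch (my own illustration, not RetroTink firmware logic): once any channel of a boosted pixel exceeds the display's limit, that channel clips while the others keep rising, which shifts hue and saturation.

```python
def boost_rgb(rgb, gain, limit=1.0):
    """Apply a brightness gain; channels exceeding the display limit clip,
    which is the hue/saturation distortion described above."""
    return tuple(min(c * gain, limit) for c in rgb)

def clips(rgb, gain, limit=1.0):
    """True if this gain pushes any channel past the display's limit."""
    return any(c * gain > limit for c in rgb)

warm = (0.9, 0.6, 0.3)       # an overbright warm highlight
print(boost_rgb(warm, 2.0))  # red and green clip to 1.0: the hue shifts
print(clips(warm, 1.1))      # a mild boost stays inside the displayable range
```

This is why keeping the boost modest, or compensating with the Retrotink's other picture adjustments, keeps colors under control.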
NeonPizza wrote: ↑31 May 2024, 00:07
The goal is to buy two RetroTINK4Ks. One for my main future 65" QD-OLED, with triple strobe BFI for streaming, Blu-rays & DVDs
For 24fps triple strobe (72Hz flicker via 144Hz visible+black frame pairs), make sure your TV supports 144Hz, or at least 144Hz within its VRR range.
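The refresh-rate requirement above can be derived with simple arithmetic (a sketch of the math only, not of any RetroTink setting):

```python
def strobe_refresh_hz(fps: float, flashes_per_frame: int, bfi: bool = True) -> float:
    """Refresh rate needed to flash each source frame `flashes_per_frame`
    times, pairing every visible refresh with one black refresh when bfi=True."""
    refreshes_per_flash = 2 if bfi else 1
    return fps * flashes_per_frame * refreshes_per_flash

print(strobe_refresh_hz(24, 3))             # 24 fps x 3 flashes x (visible+black) = 144 Hz
print(strobe_refresh_hz(24, 3, bfi=False))  # plain 72 Hz triple-flash, no BFI
```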
Also, the Retrotink 4K supports a VRR-flag trick for custom refresh rates. It can be used to get a VRR-compatible TV to do fixed-frequency BFI at any refresh rate within its VRR range. The refresh rate won't actually vary during Retrotink 4K VRR; it simply uses the VRR flag to make the TV accept custom fixed-Hz refresh rates that it doesn't support in non-VRR mode.
NeonPizza wrote: ↑31 May 2024, 00:07
End game would obviously be 5ms or less lag, with zero motion blur. And when that happens I'll definitely be there, because CRTs are aging and diminishing in picture quality as time goes on.
Long term, my dream is to help a next-generation retro video processor support a CRT electron beam simulator, using brute refresh rates (e.g. 480-960Hz) to do 8-16 digital refresh cycles of CRT beam simulation per analog refresh cycle. Brute Hz for the win!
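The idea can be illustrated with a toy rolling-scan sketch. Assumptions of mine for illustration: a 1-D "screen" of rows, a fixed number of digital subframes per analog refresh, exponential phosphor-style decay behind the beam, and brightness scaled up by the subframe count to partially offset the dark time (which is why real implementations need HDR headroom).

```python
def beam_subframes(frame_rows, subframes=8, decay=0.5):
    """Split one analog refresh into `subframes` digital refreshes.
    Each subframe lights the rows the simulated beam has scanned so far,
    with earlier-scanned rows fading like phosphor decay."""
    n = len(frame_rows)
    out = []
    for s in range(subframes):
        beam_row = (s + 1) * n // subframes       # beam position after this subframe
        sub = []
        for r in range(n):
            if r < beam_row:
                age = s - (r * subframes // n)     # subframes since this row was scanned
                sub.append(frame_rows[r] * (decay ** age) * subframes)
            else:
                sub.append(0.0)                    # beam hasn't reached this row yet
        out.append(sub)
    return out

# A uniform white frame: on the first subframe only the top band is lit.
subs = beam_subframes([1.0] * 8, subframes=4)
print(subs[0])  # top rows bright (with HDR-headroom scaling), rest still black
```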
NeonPizza wrote: ↑31 May 2024, 00:07
It's getting very difficult to even find something like a 27" Sony WEGA Trinitron(2005) with low hours in this day and age. I'd rather just settle for QD-OLED + TINK4K at this point.
I've given up trying to buy a Sony FW900 CRT at this stage now.
Eventually, ~1000Hz, ~3000+ nit OLEDs will be available by the end of the decade, which will be fantastic for software-based CRT beam simulators. Did you know that a high-speed video (dynamic-range equalized) of a CRT tube, played back in real time on an ultra-high-Hz OLED, looks rather like the original CRT tube? The aim is now to do it via software-based means, including box-in-the-middle techniques.