LG CX vs. RetroTINK-4K BFI - confusing...

High Hz on OLED produces excellent strobeless motion blur reduction with fast GtG pixel response. It is easier to tell apart 60Hz vs 120Hz vs 240Hz on OLED than on LCD, and more visible to the mainstream. Includes WOLED and QD-OLED displays.
Nintendude
Posts: 8
Joined: 25 Aug 2020, 05:34

LG CX vs. RetroTINK-4K BFI - confusing...

Post by Nintendude » 06 Dec 2023, 11:10

Hi all,

The BFI feature in the RT4K is great news and sounds like a big step forward but I find it confusing:

I understand that the RT4K needs a 240hz screen to minimise motion blur by 75% (approx. 4ms persistence).

But the LG CX can achieve 4ms persistence with only a 120hz screen.

How is the TV able to double the Tink's performance? What can the hardware do that the software can't?

Also, I read that you can combine the two forms of BFI but how does this work / is it any better? If you turn the CX's BFI to High, what setting do you then use on the RetroTink and what does it do to the BFI's performance?

Thanks a lot for any info.

User avatar
Chief Blur Buster
Site Admin
Posts: 11775
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: LG CX vs. RetroTINK-4K BFI - confusing...

Post by Chief Blur Buster » 11 Dec 2023, 13:01

Nintendude wrote:
06 Dec 2023, 11:10
I understand that the RT4K needs a 240hz screen to minimise motion blur by 75% (approx. 4ms persistence). But the LG CX can achieve 4ms persistence with only a 120hz screen. How is the TV able to double the Tink's performance? What can the hardware do that the software can't?
Hardware Has More Control

Hardware can refresh a pixel (or modulate its light) multiple times per signal Hz. That's why plasmas refreshed at 600Hz on their own despite receiving only a 60Hz signal. You can think of the LG CX as having 240Hz-like scanout behavior behind the scenes, even when it is fed 120Hz or less. In high speed video, the LG CX can modulate a pixel 240 times a second (modulate = turn the pixel on or off), which means it's capable of sub-refresh BFI at 120Hz.

The automatic pixel modulation may be digital (e.g. turning off a pixel mid-refresh) or analog (e.g. phosphor). But the science & physics end result is the same!
  • That's why CRT can do sub-refresh "BFI-like" behavior.
  • That's why OLED can do sub-refresh BFI.
  • That's why Plasma can do sub-refresh "BFI-like" behavior.
  • That's why LCD can do sub-refresh strobe.
CRT does de facto BFI briefer than a refresh cycle because its flicker is shorter. The same goes for special hardware methods that turn off pixels faster than a refresh cycle. That's what the LG CX does: it can turn pixels off again mid-refresh cycle.

Software and a box-in-the-middle can't.

Software can only modulate a pixel once per hardware refresh cycle. So you need more signal Hz if you want to modulate a pixel many times per simulated Hz. That's how you can virtualize 60Hz-in-240Hz (with a Retrotink 4K or BFI injection) and get 4 modulation opportunities per simulated 60Hz refresh cycle.

So the more Hz the better if you want to modulate a pixel multiple times per emulated Hz (e.g. 240Hz output for a 60Hz signal gives you multiple pixel modulation choices, a la TestUFO Variable Persistence BFI For 240Hz Monitors).
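The modulation-opportunity idea above can be sketched in a few lines of Python (an illustrative model, not RetroTINK firmware; the function names are made up for illustration):

```python
# A sketch (not RetroTINK firmware) of software BFI as a repeating
# visible/black pattern: 60fps content inside a 240Hz output signal
# gives 4 "modulation opportunities" per simulated 60Hz refresh.

def bfi_pattern(out_hz: int, sim_hz: int, visible_slots: int) -> list:
    """One simulated refresh, expressed as output refresh cycles."""
    slots = out_hz // sim_hz          # output refreshes per simulated frame
    assert 1 <= visible_slots <= slots
    return ["frame"] * visible_slots + ["black"] * (slots - visible_slots)

def persistence_ms(out_hz: int, visible_slots: int) -> float:
    """Image-visible time per simulated frame, in milliseconds."""
    return visible_slots * 1000.0 / out_hz

# 60-in-240 with 1 visible slot = 75% blur reduction, ~4.2ms persistence:
print(bfi_pattern(240, 60, 1))            # ['frame', 'black', 'black', 'black']
print(round(persistence_ms(240, 1), 2))   # 4.17
```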

Do you understand better now? ;)

However...

- Retrotink BFI has less latency than hardware OLED BFI (thanks to a fast FPGA that does minimal processing)
- Retrotink BFI is brighter than hardware OLED BFI (it converts SDR->HDR and then applies an HDR nits boost)
- Retrotink BFI is more flexible at high Hz (multi-strobe support and odd-Hz support)
- Retrotink BFI can add BFI at any Hz within the VRR range by putting a fixed-Hz signal inside a VRR signal transport (a VRR-flagged fixed-Hz signal that looks like VRR except it's hardware-Hz-capped). So you can do 96Hz BFI on any LG TV, for double-strobe 48Hz flicker on 24fps movies, like a retro 35mm film projector's shutter (which hid filmreel movement but added flicker in old movie theaters, and which you can now reproduce on the Retrotink, if you want).

Hardware OLED firmware BFI is lazy because it simply buffers the whole refresh cycle before beginning to output. The Retrotink 4K does it the absolute fastest the laws of physics allow: it waits only (input refresh time minus output refresh time) before beginning output. So when accepting a slowscan 60Hz input before starting a fastscan 120Hz output, it buffers only half of the slowscan 60Hz frame (pixels arrive one at a time over the cable); once half the buffer is full, it begins outputting the faster refresh. It's like a tank draining at 2x speed while only half full from an incoming 1x flow, knowing the outflow (output refresh cycle) won't be interrupted, because the 1x flow finishes the moment the 2x flow finishes (the scanouts meet). That's the minimum possible BFI latency, and it's why Retrotink BFI is less laggy than LG firmware BFI.
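The minimum-latency rule described above boils down to simple arithmetic. A sketch, assuming ideal scanout timing (not actual firmware behavior):

```python
# Timing sketch of the "scanouts meet at the end" rule: output may begin
# only after (input frame time - output frame time), so the faster
# output finishes exactly when the slower input finishes.

def min_bfi_wait_ms(in_hz: float, out_hz: float) -> float:
    """Minimum buffering delay before the output scanout can begin."""
    return 1000.0 / in_hz - 1000.0 / out_hz

def buffered_fraction(in_hz: float, out_hz: float) -> float:
    """Fraction of the input frame buffered when output starts."""
    return min_bfi_wait_ms(in_hz, out_hz) / (1000.0 / in_hz)

print(round(min_bfi_wait_ms(60, 120), 2))   # 8.33 ms: half the 60Hz frame
print(buffered_fraction(60, 120))           # 0.5
```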

So you've got pros/cons.

EDIT: To purchase, Blur Busters Affiliate Link for Retrotink 4K
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter

Forum Rules wrote:  1. Rule #1: Be Nice. This is published forum rule #1. Even To Newbies & People You Disagree With!
  2. Please report rule violations If you see a post that violates forum rules, then report the post.
  3. ALWAYS respect indie testers here. See how indies are bootstrapping Blur Busters research!

vinnyoflegend
Posts: 1
Joined: 20 Dec 2023, 12:55

Re: LG CX vs. RetroTINK-4K BFI - confusing...

Post by vinnyoflegend » 20 Dec 2023, 13:02

- Retrotink BFI can add BFI support to any Hz within VRR range by putting a fixed-Hz inside a VRR signal transport (VRR-flagged fixed Hz signal that looks like VRR except it's hardware-Hz-capped). So you can do 96Hz BFI to any LG TV, for double-strobe 48Hz flicker on 24fps movies like a retro 35mm film projector's shutter (to hide filmreel movements, but added flicker in old movie theaters, which you can now reproduce on Retrotink, if you want).
Maybe I'm misunderstanding this, but does this mean that I can take a Resolution X@60hz HDMI output from a PC source, have Retrotink convert it to 120hz BFI, and output to any monitor that supports Resolution X@120hz (or 4K @ 120hz if Resolution X was not originally 4k?), effectively netting me BFI support that is not dependent on my GPU or monitor having specific BFI or ULMB support?

User avatar
Chief Blur Buster
Site Admin
Posts: 11775
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: LG CX vs. RetroTINK-4K BFI - confusing...

Post by Chief Blur Buster » 27 Dec 2023, 22:05

vinnyoflegend wrote:
20 Dec 2023, 13:02
Maybe I'm misunderstanding this, but does this mean that I can take a Resolution X@60hz HDMI output from a PC source, have Retrotink convert it to 120hz BFI, and output to any monitor that supports Resolution X@120hz (or 4K @ 120hz if Resolution X was not originally 4k?), effectively netting me BFI support that is not dependent on my GPU or monitor having specific BFI or ULMB support?
Correct.

You can add BFI to any signal.

You can add 120Hz, 240Hz, 360Hz BFI to generic HDMI, S-Video, VGA, Composite, Component.

Video Input: HDMI, S-Video, VGA, Composite, Component
Video Output: HDMI, at any custom Hz, even with BFI

You don't even need a GPU, it even works with analog Betamax/VHS VCRs and LaserDisc players and any video sources you want to add BFI to.

The main limitation is the bandwidth budget. If you want BFI on PC-based sources, remember to keep your output within the bandwidth budget of 4K60, 1440p120, or 1080p240. So you can game at 60Hz input and 1080p240 output (successfully tested), as an example, or 1440p60 input to 1440p120 output (currently untested by me).

IMPORTANT: There are input signal and output signal bandwidth limitations due to memory bandwidth performance. For example, 1080p60 input with 1080p240 output is not possible without overclocking the Retrotink 4K.
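The bandwidth budget above can be sanity-checked as raw active-pixel rates (a back-of-envelope sketch; real signals also carry blanking overhead on top):

```python
# Back-of-envelope check of the bandwidth budget, as raw active pixel
# rates (active pixels only; blanking intervals are not counted here):

def pixel_rate(w: int, h: int, hz: int) -> int:
    return w * h * hz

modes = {
    "4K60":     pixel_rate(3840, 2160, 60),
    "1440p120": pixel_rate(2560, 1440, 120),
    "1080p240": pixel_rate(1920, 1080, 240),
}
for name, rate in modes.items():
    print(f"{name}: {rate / 1e6:.0f} Mpx/s")
# 4K60 and 1080p240 are both ~498 Mpx/s: spending the budget on Hz
# costs an equivalent amount of resolution, and vice versa.
```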

EDIT: To purchase, Blur Busters Affiliate Link for Retrotink 4K

User avatar
NeonPizza
Posts: 67
Joined: 20 Oct 2021, 03:01

Re: LG CX vs. RetroTINK-4K BFI - confusing...

Post by NeonPizza » 28 May 2024, 01:08

Chief Blur Buster wrote:
11 Dec 2023, 13:01
- Retrotink BFI has less latency than hardware OLED BFI (because of fast FPGA that does minimum processing)
- Retrotink BFI is brighter than hardware OLED BFI (because of convert SDR->HDR and then HDR nits booster)

[...]

And Retrotink BFI is less laggy than LG firmware BFI as a result.
Well that's great to hear! Regarding the HDR-boosting-SDR trick to regain the brightness lost from BFI or from using scanlines in retro titles: my question is, if something like a Samsung QD-OLED (S90C) has 29ms of input lag from its 60hz BFI (MotionClear's 8.3ms persistence) when gaming at 60fps, which is already too much latency for my liking, how much lag would there be in comparison when using the TINK4K's 60hz BFI? The goal is to get under 16ms. I've heard the TINK4K supposedly adds at least another 2.5ms as well.

Another thing I noticed with the internal 60hz BFI on my LG C1 OLED is that it causes severe shadow detail crushing. And when using 120hz's Low, Medium and High BFI when gaming at 120fps, the black crushing also gets worse the higher you go. It's a deal breaker. Is BFI shadow detail crushing also a problem with the TINK4K, or is that just the result of a lazy, half-baked BFI implementation from LG?

60hz 8.3ms persistence BFI at 60fps, with zero additional shadow detail crushing, combined with the HDR brightness boosting SDR trick, @4K with 16ms or less latency, would be a frikking godsend at this point. I can't tolerate 16ms/100% OLED motion blur. 8.3ms, I feel, gets you into plasma motion clarity territory, and it's good enough to make games look playable.

User avatar
Chief Blur Buster
Site Admin
Posts: 11775
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: LG CX vs. RetroTINK-4K BFI - confusing...

Post by Chief Blur Buster » 28 May 2024, 15:06

NeonPizza wrote:
28 May 2024, 01:08
The goal is to get under 16ms. TINK4K supposedly adds another 2.5ms i've heard as well, at least.
Retrotink 4K BFI is lower lag than LG internal BFI.

This is because Retrotink 2:1 BFI is fully raster beam-raced, which means it begins outputting the visible refresh cycle (fastscan 1/120sec) once it has buffered half of the slowscan 1/60sec signal; the incoming and outgoing scanouts meet at the end. It's the lowest-latency BFI. So 50%:50% BFI (outHz = 2 x inHz) only adds half an inHz of latency, e.g. +8.3ms for adding BFI to a 60Hz signal, which turns it into a 120Hz signal.

Metaphorically: it's like a water faucet flowing slowly (a single refresh cycle, aka 60Hz) into a tank (aka the video buffer in the Retrotink 4K), where the tank has another faucet at the bottom that releases 2x faster than the faucet refilling it (output starts at 2x flow rate, aka 120Hz) before the tank has finished filling. The output flow (aka output signal scanout) continues at full 2x speed (aka 120Hz scanout) even while the input flow (aka input signal scanout) is still pouring into the tank. That means one uninterrupted fast flow (120Hz scanout), even though it started flowing BEFORE the slow input flow (60Hz signal) finished buffering. Basically, the fastest/soonest/least-laggy possible BFI for a standard 60Hz input signal versus a standard 120Hz output signal (no nonstandard signals, such as large VTs or Quick Frame Transport). The goal is the soonest-possible uninterrupted double flow speed (double scanout speed), so the output scanout is timed to begin at the halfway point of the input scanout.
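The faucet/tank metaphor can be simulated in a few lines (illustrative only; the 1080-line frame is an example, not actual RetroTINK internals):

```python
# A tiny discrete version of the faucet/tank metaphor: input fills at
# 1x, output drains at 2x starting at half-tank, and the output raster
# never overtakes the input raster.

LINES = 1080
START = LINES // 2                 # output begins once half is buffered

underrun = False
for lines_in in range(1, LINES + 1):            # input scanlines received
    lines_out = max(0, (lines_in - START) * 2)  # output scans 2x faster
    if lines_out > lines_in:                    # output overtook input
        underrun = True
print("underrun:", underrun)       # False: the scanouts meet at the end
```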

However, if you use the HDR booster feature, remember HDR adds a bit of latency, so if you want only half a frame latency over LG 60Hz, you will need to use SDR (probably). However, you have the option to turn on/off HDR brightness boosting.
NeonPizza wrote:
28 May 2024, 01:08
Another thing I noticed with the internal 60hz BFI on my LG C1 OLED is that it causes severe shadow detail crushing. And when using 120hz's Low, Medium and High BFI when gaming at 120fps, the black crushing also gets worse the higher you go. It's a deal breaker. Is BFI shadow detail crushing also a problem with the TINK4K, or is that just the result of a lazy, half-baked BFI implementation from LG?
That's why Retrotink 4K BFI is superior. You can adjust the Retrotink 4K picture settings (contrast, brightness, gamma) and set the black/white levels manually to avoid the crushing effect. You can even lift the blacks off black (make blacks a dark grey) or dim the whites below white (make whites darker than full white). Adjust according to your room lighting conditions and you'll see way more detail with Retrotink 4K BFI.
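The kind of levels adjustment described above is generic video-levels math. A sketch (not RetroTINK's actual pipeline; the 16/235 limits are the usual limited-range video levels, used here for illustration):

```python
# Lift blacks and lower whites so that near-black steps a crushing
# display would flatten to black stay distinguishable.

def adjust_levels(v: int, black_lift: int = 16, white_level: int = 235,
                  gamma: float = 1.0) -> int:
    """Map a 0-255 input value into [black_lift, white_level]."""
    x = (v / 255.0) ** gamma
    return round(black_lift + x * (white_level - black_lift))

# Shadow steps 0..3 survive as distinct output values:
print([adjust_levels(v) for v in (0, 1, 2, 3)])   # [16, 17, 18, 19]
```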

Keep in mind, though, you'll only have 8bpc (24bit color); 30bit color will be decimated by the Retrotink 4K. That doesn't interfere with retro uses, but it may affect certain home theater uses (e.g. adding BFI to Blu-ray or Netflix). However, this is one of the cheapest video processors on the market capable of 48Hz, 72Hz, 96Hz, 120Hz dejudder, with optional BFI simulation of 35mm projector double strobe -- a feature that even many 5-figure-priced units lacked!
NeonPizza wrote:
28 May 2024, 01:08
60hz 8.3ms persistence BFI at 60fps, with zero additional shadow detail crushing
And you can get this with only +8.3ms lag (half a 60Hz frame) over Retrotink non-BFI.
So that's whatever the beam-raced transceiver adds (~2ms-ish), plus whatever pre-existing lag the LG has, plus the fastest-possible BFI overhead (60Hz BFI adding only 8.3ms of lag).
NeonPizza wrote:
28 May 2024, 01:08
combined with the HDR brightness boosting SDR trick
As long as you're okay with LG's native HDR lag, though adding BFI will only be +8.3ms above non-BFI.

In other words, it will be combined lag of:
SDR BFI -- will be roughly (~2ms transceiver lags + 8.3ms BFI lag) latency over whatever lag the normal LG SDR mode is
HDR BFI -- will be roughly (~2ms transceiver lags + 8.3ms BFI lag) latency over whatever lag the normal LG HDR mode is

*Assumes 60Hz input, 120Hz BFI'd output. Using different ratios and different refresh rates changes the arithmetic.
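The combined-lag arithmetic above as a sketch (the ~2ms transceiver figure is the approximate number from this thread; the display's own SDR/HDR mode lag is an unknown that must still be added on top):

```python
# Lag the box adds over the display's own mode lag, for 2:1 BFI.

def added_lag_ms(in_hz: float, out_hz: float,
                 transceiver_ms: float = 2.0) -> float:
    """Transceiver lag plus the half-frame BFI buffering delay."""
    bfi_ms = 1000.0 / in_hz - 1000.0 / out_hz   # half-frame for 2:1
    return transceiver_ms + bfi_ms

print(round(added_lag_ms(60, 120), 1))   # 10.3 (ms over the TV's base lag)
```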
NeonPizza wrote:
28 May 2024, 01:08
I can't tolerate 16ms/100% OLED motion blur. 8.3ms i feel gets you into plasma motion clarity territory and it's good enough to make games look playable.
It most certainly does.

I hope I've given you enough information to make a pros/cons decision.

If you want to give Blur Busters an affiliate commission, you're welcome to append ?ref=area51 to the end of the Retrotink URL:

User avatar
NeonPizza
Posts: 67
Joined: 20 Oct 2021, 03:01

Re: LG CX vs. RetroTINK-4K BFI - confusing...

Post by NeonPizza » 31 May 2024, 00:07

Chief Blur Buster wrote:
28 May 2024, 15:06
So 50%:50% BFI (outHz = 2 x inHz) only adds half an inHz latency, e.g. +8.3ms latency for adding BFI to a 60Hz signal, which turns it into a 120Hz signal.

[...]

So that's whatever beam raced transceiver lags (~2ms-ish) plus whatever pre-existing LG lag has, plus the fastest BFI overhead (60Hz BFI adding only 8.3ms lag).
Thanks again for the excellent response! :) Wasn't expecting to hear about that triple-strobe BFI setting that gets film judder down to CRT levels, presumably surpassing plasma as well?

So basically, using the TINK4K automatically adds 2ms or 2.5ms of latency, which is what you meant by transceiver lag? Combined with 8.3ms of 60hz BFI lag (and will it do so at 4K, or is its 60hz BFI's 8.3ms persistence limited to just 1080p when gaming in SDR at 60fps?), AND there's more latency added from the HDR-brightness-boosting-SDR trick when used with TINK4K 60hz BFI. But how much latency does the HDR SDR brightness boost trick add exactly, and how many nits do you gain to compensate for the 50% brightness drop using 60hz 8.3ms BFI in SDR game mode?

I plan on getting a QD-OLED (last year's Samsung S90C), but its internal 60hz BFI has a whopping 29ms of latency. But that becomes irrelevant, basically dropping down to 0 lag, since my S90C would be paired with the TINK4K using its own 60hz BFI, which adds a grand total of 8.3ms of lag without any additional latency coming from the TV?


2ms lag - (TINK4K Transceiver)
8.3ms lag - (TINK4K 60hz BFI @1080p or 4K)
4ms lag, assuming - (HDR SDR Brightness boosting trick)

14.3ms of lag total, with no additional latency coming from the S90C? If that's so, that's pretty good, with 8.3ms persistence and all! ~5ms of lag total would be ideal to get games feeling nearly as good as a CRT, but that's where vanilla 120fps or 144fps comes into play.


Another thing I was wondering about is whether the TINK4K HDR-SDR brightening trick has a negative impact on colour accuracy. I've noticed that turning the HDR Module 'ON' setting in my LG C1 OLED's service menu to gain more brightness when using the TV's internal 60hz BFI in SDR game mode ruins the colours, and doesn't really salvage enough brightness either. Does the same thing apply to what the TINK4K is doing, colour-wise, or is it just a brightness upgrade without any negative impact to the picture?

The goal is to buy two RetroTINK 4Ks. One for my main future 65" QD-OLED, with triple-strobe BFI for streaming, Blu-rays & DVDs, along with Switch & PS5. And then a second for my smaller OLED display that I plan on using for SD 8-16 bit retro consoles. Reducing 50% of total motion blur, with latency at, say, around 14ms total, plus the perks of QD-OLED technology, definitely leaves me feeling pretty content. Endgame would obviously be 5ms or less lag with zero motion blur. And when that happens I'll definitely be there, because CRTs are aging and diminishing in picture quality as time goes on.

It's getting very difficult to even find something like a 27" Sony WEGA Trinitron(2005) with low hours in this day and age. I'd rather just settle for QD-OLED + TINK4K at this point.

sodaboy581
Posts: 12
Joined: 01 Feb 2022, 10:37

Re: LG CX vs. RetroTINK-4K BFI - confusing...

Post by sodaboy581 » 31 May 2024, 10:55

Chief Blur Buster wrote:
27 Dec 2023, 22:05
IMPORTANT: There are input signal and output signal bandwidth limitations due to memory bandwidth performance. For example, 1080p60 input and 1080p240 output is not possible without overclocking the Retrotink 4K

I was actually considering it until I read this.

Overclocking what, the CPU of the RetroTink 4K? How hard is that to do?

I know HDMI 2.0 can definitely fit 1080p240 into its signal bandwidth, but why would the RetroTink 4K not have the memory bandwidth to output 1080p240 when it can do 4K60?

User avatar
Chief Blur Buster
Site Admin
Posts: 11775
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: LG CX vs. RetroTINK-4K BFI - confusing...

Post by Chief Blur Buster » 07 Jun 2024, 01:12

sodaboy581 wrote:
31 May 2024, 10:55
Chief Blur Buster wrote:
27 Dec 2023, 22:05
IMPORTANT: There are input signal and output signal bandwidth limitations due to memory bandwidth performance. For example, 1080p60 input and 1080p240 output is not possible without overclocking the Retrotink 4K

I was actually considering it until I read this.

Overclocking what, the CPU of the RetroTink 4K? How hard is that to do?

I know HDMI 2.0 can definitely fit 1080p240 into its signal bandwidth, but why would the RetroTink 4K not have the memory bandwidth to output 1080p240 when it can do 4K60?
There are discussions where 1080p240 hits a weak link; I'd have to ask Mike what it was. There were unofficial discussions of possible hacks to get it to work (overclock + 4:2:2), but I don't know if those hacks are officially published.

For the best info, join the RetroTink Discord:
https://discord.gg/Jwut5CQf
Within it is the Retrotink 4K channel.

Also, there are creative solutions possible, depending on your monitor's capabilities. If your monitor has a "stretch to full screen" feature (one that works for odd resolutions), you can try creating a custom ModeLine on the SD card for half vertical resolution (e.g. 2560x720) and use an aperture-grille mask filter. 240p games will generally look fine scaled to 720p, and then further vertically bilinear-interpolated to 1440p by the monitor's scaler.
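For the custom ModeLine idea, the pixel clock is just horizontal total x vertical total x refresh rate. A sketch (the blanking totals below are hypothetical placeholders, not validated timings; generate a real ModeLine with a CVT/GTF calculator):

```python
# Pixel-clock arithmetic for a custom video mode. Blanking totals here
# are illustrative guesses only, not a tested ModeLine.

def pixel_clock_mhz(h_total: int, v_total: int, refresh_hz: float) -> float:
    """Pixel clock = horizontal total x vertical total x refresh rate."""
    return h_total * v_total * refresh_hz / 1e6

# Hypothetical 2560x720 mode with ~6% horizontal / ~4% vertical blanking:
print(pixel_clock_mhz(2720, 750, 120))   # 244.8 (MHz)
```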

User avatar
Chief Blur Buster
Site Admin
Posts: 11775
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: LG CX vs. RetroTINK-4K BFI - confusing...

Post by Chief Blur Buster » 07 Jun 2024, 01:29

NeonPizza wrote:
31 May 2024, 00:07
But how much latency does the HDR SDR brightness boost trick add exactly, and how many nits do you gain to compensate for the 50% brightness drop using 60hz 8.3ms BFI in SDR game mode?
Picture processing on the Retrotink 4K is done in realtime on a streaming basis (raster/beam-raced) for things like brightness/contrast/etc. There is no lag penalty for using any of the Retrotink 4K's picture adjustments; the brightness boost is part of this.

Only the SDR-vs-HDR lag difference would be applicable here.
NeonPizza wrote:
31 May 2024, 00:07
I plan on getting a QD-OLED(Last years Samsung S90C), but it's internal 60hz BFI has a whopping 29ms of latency. But that automatically becomes irrelevant, basically dropping down to 0 lag since my S90C would be paired with the TINK4K using it's own 60hz BFI which adds a grand total of 8.3ms of lag without any additional latency coming from the TV?

2ms lag - (TINK4K Transceiver)
8.3ms lag - (TINK4K 60hz BFI @1080p or 4K)
4ms lag, assuming - (HDR SDR Brightness boosting trick)

14.3ms of lag total, with no additional latency coming from the S90C? If that's so, that's pretty good with an 8.3ms persistence and all! -5ms of lag total would be ideal to get games feeling nearly as good as a CRT, but that's where vanilla 120fps or 144fps comes into play.
You asked me a question with additional unknown/undefined/unconfirmed scientific variables, so I'm not able to answer this. Apologies.

Also, I do not understand the 4ms rabbit-out-of-a-hat number. There is no lag difference between HDR without boost and HDR with boost; it's simply HDR on/off lag: transceiver lag (which includes the rolling scanline window used for realtime streaming picture processing) plus half-frame lag. There is no lag penalty for using picture adjustments like contrast/brightness/saturation boost.
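Putting that bookkeeping into a quick sketch (illustrative numbers only; the ~2ms transceiver figure and the half-frame BFI lag are the components discussed in this thread, not measured values for any specific unit):

```python
# Rough lag bookkeeping under the assumptions above: total lag is the
# transceiver's rolling-window lag plus half-frame BFI lag; picture
# adjustments (brightness boost etc.) contribute nothing.

FRAME_MS_60HZ = 1000 / 60               # ~16.7 ms per 60Hz refresh

transceiver_ms = 2.0                    # rolling-window streaming processing
bfi_half_frame_ms = FRAME_MS_60HZ / 2   # ~8.3 ms half-frame lag for 60Hz BFI
hdr_boost_ms = 0.0                      # picture adjustments add no lag

total_ms = transceiver_ms + bfi_half_frame_ms + hdr_boost_ms
print(round(total_ms, 1))  # 10.3
```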
NeonPizza wrote:
31 May 2024, 00:07
Another thing I was wondering about is whether the TINK4K HDR SDR brightening trick has a negative impact on colour accuracy. I've noticed that turning the HDR Module 'ON' setting in my LG C1 OLED's service menu to gain more brightness when using the TV's internal 60Hz BFI in SDR game mode ruins the colours, and doesn't really salvage enough brightness either. Does the exact same thing apply to what the TINK4K is doing, colour wise, or is it just a mere brightness upgrade without any negative impact to the picture?
You adjust the HDR boosts to stay within your TV's gamut. This will minimize color distortions.
Large HDR boosts will distort color but you can compensate using other picture adjustments on the Retrotink, like brightness/contrast/gamma/saturation/RGB gains/etc. You can save favourite profiles.

Remember, original CRT tubes also distorted colors when they became "bright" (e.g. overbright scenes can have color distortions versus dim scenes).

Some screens I tested distort HDR boosts less than others. For example, OLED screens tended to distort HDR boosts somewhat less than LCD screens, although you don't want to push it to more than roughly ~2x brightness on current LG WOLEDs; you may be able to get to ~3x+ on the brighter (higher end) OLED panels before really noticing any colorspace distortions.

So this doesn't necessarily detract from the retro experience, as long as kept under sufficient control...
...because CRT tube color also distorted when you overbrightened/oversaturated/etc them.
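A toy Python sketch of why a large boost shifts color (hypothetical math, not the Retrotink's actual pipeline): once one RGB channel saturates at the panel's limit while the others keep scaling, the channel ratios change, which reads as a hue/saturation shift. Staying within the gamut keeps the ratios intact.

```python
# Hypothetical sketch of clipping-induced color distortion: once one RGB
# channel hits the panel limit, the channel ratios (hue/saturation) shift.

def boost(rgb, gain, limit=1.0):
    """Scale normalized linear RGB, clipping each channel to the panel limit."""
    return tuple(min(limit, c * gain) for c in rgb)

orange = (1.0, 0.5, 0.0)           # normalized linear RGB

mild = boost(orange, 1.0)          # within range: ratios preserved
hard = boost(orange, 2.0)          # red clips at 1.0 while green doubles
print(mild)   # (1.0, 0.5, 0.0)
print(hard)   # (1.0, 1.0, 0.0) -- hue shifted toward yellow
```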
NeonPizza wrote:
31 May 2024, 00:07
The goal is to buy two RetroTINK4k's. One for my main future 65" QD-OLED, with triple strobe BFI for Streaming, Blu-rays & DVD's
For 24fps triple strobe (72Hz flicker via 144Hz visible+black frame pairs), make sure your TV supports 144Hz, or at least 144Hz within its VRR range.

Also, the Retrotink 4K supports a VRR-flag trick for custom refresh rates: it can trick a VRR-compatible TV into doing fixed-frequency BFI at any refresh rate within its VRR range. The refresh rate won't actually vary during Retrotink 4K VRR; the VRR flag simply gets the TV to accept custom fixed-Hz refresh rates that it doesn't support in non-VRR mode.
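To make the triple-strobe cadence concrete, here's a small Python sketch (illustrative, not Retrotink code): at 144Hz there are 6 display refreshes per 24fps film frame, shown as three visible+black pairs, which is where the 72Hz flicker comes from.

```python
# Sketch of the 24fps triple-strobe cadence: 144Hz gives 6 refreshes per
# film frame, displayed as three visible+black pairs = 72 flashes/second.

def triple_strobe_cadence(fps=24, hz=144):
    refreshes_per_frame = hz // fps      # 6 refreshes per film frame at 144Hz
    pairs = refreshes_per_frame // 2     # 3 visible+black pairs
    return ["visible", "black"] * pairs

cadence = triple_strobe_cadence()
print(cadence)  # ['visible', 'black', 'visible', 'black', 'visible', 'black']
print(24 * cadence.count("visible"))  # 72 flashes per second = 72Hz flicker
```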
NeonPizza wrote:
31 May 2024, 00:07
End game would obviously be 5ms or less lag, with zero motion blur. And when that happens i'll definitely be there, because CRT's are aging and diminishing in picture quality as time goes on.
Long term, my dream is to help a next-generation retro video processor to support a CRT electron beam simulator. Using brute refresh rates (e.g. 480-960Hz) to do 8-16 digital refresh cycles of CRT beam simulation per 1 analog refresh cycle. Brute Hz for the win!
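As a rough illustration of the idea (a toy sketch, not any shipping simulator): one "analog" refresh is split into N digital subframes, and each subframe lights only the band of rows the electron beam would be crossing at that moment, with everything else dark (phosphor decay omitted for brevity).

```python
# Toy CRT beam simulator sketch: split one analog refresh into N digital
# subframes; each subframe lights only a rolling band of rows, simulating
# the electron beam sweeping top to bottom (phosphor decay omitted).

def beam_subframes(num_rows, subframes=8):
    """Yield per-subframe row-visibility masks for a rolling scan."""
    band = max(1, num_rows // subframes)
    for s in range(subframes):
        lit = range(s * band, min(num_rows, (s + 1) * band))
        yield [r in lit for r in range(num_rows)]

# 8 rows, 4 subframes: each subframe lights a 2-row band, top to bottom.
masks = list(beam_subframes(8, subframes=4))
print(masks[0])  # [True, True, False, False, False, False, False, False]
```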
NeonPizza wrote:
31 May 2024, 00:07
It's getting very difficult to even find something like a 27" Sony WEGA Trinitron(2005) with low hours in this day and age. I'd rather just settle for QD-OLED + TINK4K at this point.
I've given up trying to buy a Sony FW900 CRT at this stage now.

Eventually, ~1000Hz, ~3000+ nit OLEDs will be available by the end of the decade, which will be fantastic for software-based CRT beam simulators. Did you know that a high-speed video (dynamic-range equalized) of a CRT tube, played back in realtime on an ultra-high-Hz OLED, kind of looks like the original CRT tube? The aim is now to do it via software-based means, including box-in-middle techniques.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter

