I hope SpecialK or ReShade manages to architect itself to support the subframe processors I've been talking about for years. Earlier it was multi-refresh BFI; today it's the CRT simulator I released at www.blurbusters.com/crt -- and tomorrow it'll be things like a Plasma TV Simulator / Adjustable LCD GtG on OLED / High Quality LCD Overdrive Algorithms / etc.
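For anyone unsure what a subframe processor even means: here is a minimal sketch of multi-refresh BFI expressed as a per-refresh-cycle hook (names and structure are my own hypothetical illustration, not a SpecialK/ReShade API) -- the key property is that it runs once per refresh cycle at display Hz, not once per content frame.

Code:
// Hypothetical sketch: multi-refresh BFI as a subframe processor.
// The display refreshes N times per content frame; the hook runs every
// refresh cycle and decides what each cycle shows.
#include <cstdint>

struct SubframeContext {
    uint32_t refreshIndex;      // refresh cycle within the current content frame (0-based)
    uint32_t refreshesPerFrame; // e.g. 4 for a 240 Hz display showing 60 fps content
};

// Brightness multiplier for this refresh cycle: show the frame on the
// first cycle, insert black on the rest (classic multi-refresh BFI).
float bfiMultiplier(const SubframeContext& ctx) {
    return (ctx.refreshIndex == 0) ? 1.0f : 0.0f;
}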
I don't do the driver programming; I do the algorithms. Please help Blur Busters by getting the ecosystem ready for subrefresh processors (including removal of native:simulated Hz ratios, so I can do CRT-VRR in the future too).
Simulated VRR + simulated CRT, like a software-based GSYNC Pulsar, on any fixed-Hz OLED.
Code:
Select Simulated Display: [CRT | Plasma | Slow LCD | Fast LCD | DLP | 35mm Projector | Others...]
Adjust Simulated Settings [Phosphor Speed Settings>> | GtG Speed Settings>> | Subfield Dither Settings >>]
Select Simulated Hz: 24Hz [||||||||||||||-----------------------------------------------] 1000Hz
Enable Simulated VRR: [Off | On]
Given enough brute Hz, access to linear colorspace, and a beautifully generic unspent-photons Talbot-Plateau energy-buffer approach, this becomes possible.
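Here is a minimal per-pixel sketch of that energy-buffer idea, under my own simplifying assumptions (the names and the normalized pulse envelope are mine, not the shipped CRT simulator code): the eye integrates linear-light energy over the frame per the Talbot-Plateau law, so whatever the simulated pulse can't emit this refresh cycle (a real panel clips at 100%) rolls over into the next cycle as unspent photons.

Code:
#include <algorithm>

struct EnergyBuffer {
    float owed = 0.0f; // unspent linear-light energy carried between refresh cycles
};

// target:         desired average linear brightness of the content frame (0..1)
// pulseWeight:    simulated emission envelope for this cycle (e.g. a phosphor
//                 decay shape), normalized to sum to 1.0 across the frame
// cyclesPerFrame: refresh cycles per content frame (e.g. 10 for 600 Hz @ 60 fps)
float emitRefresh(EnergyBuffer& buf, float target,
                  float pulseWeight, float cyclesPerFrame) {
    float desired = target * cyclesPerFrame * pulseWeight + buf.owed;
    float out = std::min(desired, 1.0f); // a real panel can't exceed full brightness
    buf.owed = desired - out;            // unspent photons spill into the next cycle
    return out;                          // time-averaged output still matches target
}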
When I turn off the generic crappy Chinese LCD GtG algorithms (obsolete 17x17 OD LUTs, ugh) and enable a superior LCD GtG overdrive in a GPU shader, it suddenly becomes almost as good as a GSYNC monitor's overdrive! I'll eventually open-source this within a year or two. But it can only happen on the first refresh cycle after a new frame (regardless of content framerate).
Yeah, that in-house experiment works but is not VRR compatible. But at least I can beat a cheap monitor's overdrive: you simply turn off the monitor's overdrive and use my shader overdrive instead. It might rob some % of your GPU and lower frame rates, but motion purists won't care during 60fps-locked material and/or movie playback and/or things like browser / desktop scrolling (where bad LCD overdrive can turn into a muddy mess).
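To be clear about what a shader overdrive even looks like, here's only the generic textbook form (my sketch, not the superior in-house algorithm described above): overshoot the commanded pixel value in proportion to the transition size, so the LCD's slow GtG lands near the target within one refresh cycle. As noted above, apply it only on the first refresh cycle after a new frame.

Code:
#include <algorithm>

// prev, target: previous and desired pixel levels (0..1, panel-native domain)
// gain:         per-panel tuning constant; a 17x17 OD LUT is essentially a
//               coarse per-transition table of this same correction
float overdrive(float prev, float target, float gain) {
    float od = target + gain * (target - prev); // overshoot in the transition direction
    return std::clamp(od, 0.0f, 1.0f);          // can't command below black or above white
}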
Don't worry (to the NVIDIA employees -- I know you all love me) -- NVIDIA is still far ahead. Gamma-corrected overdrive during VRR and strobing? Oh, that's Ph.D.-level math; I'll focus on easier stuff like plasma subfields and VRR-CRT. Yes, making my own software-based GSYNC Pulsar is easier than VRR LCD overdrive.
Yep. Thanks to generic brute Hz + shaders, plasma algorithms formerly done in FPGA/firmware can now be done in a shader on OLEDs too, and I can optionally slow down OLED GtG so 24fps Netflix looks better.
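The slow-GtG-on-OLED trick is conceptually just a temporal low-pass filter run per refresh cycle. A minimal sketch, under my own simplifying assumption of a single exponential step (real GtG curves vary per transition):

Code:
// current: pixel level actually shown last refresh cycle (0..1)
// target:  pixel level of the latest content frame (0..1)
// step:    fraction of the remaining transition completed per refresh cycle;
//          lower = slower simulated GtG (LCD-like softness on an instant OLED)
float simulateSlowGtG(float current, float target, float step) {
    return current + step * (target - current); // ease toward the target each cycle
}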
Make it happen, you software developers, you operating system developers, you driver developers.
Get ready for my temporal filters, which will gradually come out one by one over 2025-2030!
Refresh Cycle Shaders, please -- full stop -- independent of content framerate.
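Concretely, what I'm asking for is a present loop shaped roughly like this structural sketch (all hook names are hypothetical, not an existing SpecialK/ReShade API): the shader gets invoked once per refresh cycle, reusing the last content frame whenever the game hasn't produced a new one.

Code:
#include <cstdint>

struct Frame;                                  // last completed content frame

bool   running();                              // hypothetical main-loop condition
Frame* tryAcquireNewContentFrame();            // null if no new game frame this cycle
void   runRefreshCycleShader(const Frame&, uint32_t refreshIndex);
void   waitForVBlank();                        // pace to the display, not the game

void presentLoop() {
    Frame*   last = nullptr;
    uint32_t refreshIndex = 0;                 // cycles since the last new content frame
    while (running()) {
        if (Frame* f = tryAcquireNewContentFrame()) { last = f; refreshIndex = 0; }
        if (last) runRefreshCycleShader(*last, refreshIndex++); // runs at display Hz
        waitForVBlank();                       // 24 fps content still gets e.g. 240 calls/sec
    }
}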
I'm not addressing anyone in particular (RealNC can spread the gospel around channels like Discord). The software ecosystem is falling behind what is confirmed possible in the lab.
Even boxes-in-the-middle such as the Retrotink are beginning to port my algorithms (the CRT simulator is coming to the Retrotink 4K -- the Pro edition, not the CE edition).
The Open Source Display Revolution beckons: BYOA (Bring Your Own Algorithm). I've mostly given up trying to sell to stubborn display makers who cheap out and won't implement my stringent Blur Busters Approved rules. My revenue comes from contracts (TestUFO & tester related) as spinoffs anyway.