It would indeed be great to be able to apply arbitrary image-space processing techniques via a fake Windows display driver plus shader injection.
I'll look into it! If I get this working, of course. I'm planning on buying either the Optoma UHD65 or the Acer 4K model (the smaller one) shown at ISE 2017. Both should be out by June for under three grand. Not bad to get a 4K, 120 Hz, HDR, 140-inch projected image for $3k, innit?
For BFI on a 3D projector, it might be possible to do it with a custom Windows desktop-manager program that can span multiple virtual desktops across one output image: two pages side by side on the same monitor, with each one treated as a separate screen.
The page on the right need only be set to black; then you just switch the projector into SBS 3D mode and voilà: BFI, with no driver hacks or shaders required whatsoever. Of course, you lose 50% of your horizontal resolution doing that (or 50% of the vertical if you do it with over/under 3D).
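As a quick sketch of what the packed frame looks like (plain NumPy; the resolution and fill values are just placeholders, nothing here is projector-specific):

```python
import numpy as np

H, W = 1080, 1920
# Squeeze the live desktop into the left half; the right "eye" stays black.
desktop = np.full((H, W // 2, 3), 128, dtype=np.uint8)  # placeholder content
packed = np.zeros((H, W, 3), dtype=np.uint8)
packed[:, : W // 2] = desktop
# In SBS 3D mode the projector displays the two halves sequentially:
# content, black, content, black ... which is exactly BFI.
```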
You could also do it for games with SweetFX: set a custom Windows desktop resolution of 960 x 1080, render the game fullscreen, and have the shader scale the U texture coordinate by 2 and output black for U > 0.5. That would get you BFI in SBS mode, but your screen aspect ratio would be wrong, so you'd need a way to hack only the X part of the FOV. FOV sliders are typically implemented isotropically, scaling both axes together; what we'd need here is an anamorphic slider. I've implemented this in games before, to decouple the X/Y resolution multiple from the FOV, i.e. treat "X" as something else, or just set it to 2.37:1, for example, to use an anamorphic lens and get the full frame of real rendered pixels instead of scaling the image vertically. (You can always just set a custom Windows res of 2560 x 1080 and let the GPU scale it to 1920 x 1080 anamorphic, which works well in all games / movies / desktop, but I digress.)
It's probably easier to just render the game at 1920 x 1080 as you normally would, then use SweetFX to compress it all leftward and set the right half of the image to black. That amounts to a sort of supersampling AA in the X direction only, so the rendering power isn't entirely wasted.
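The per-pixel logic of that shader is trivial; here's the same math as a NumPy sketch (not actual SweetFX/HLSL, and the function name is made up), showing that sampling at U * 2 on the left half is just a 2x horizontal downsample, i.e. X-only supersampling:

```python
import numpy as np

def bfi_compress_left(frame):
    """Shader sketch: an output pixel at u < 0.5 samples the source at u * 2
    (averaging adjacent columns = 2x horizontal supersampling); u >= 0.5 is black."""
    h, w = frame.shape[:2]
    out = np.zeros_like(frame)
    pairs = frame.astype(np.uint16)  # widen so the sum doesn't overflow uint8
    out[:, : w // 2] = ((pairs[:, 0::2] + pairs[:, 1::2]) // 2).astype(frame.dtype)
    return out
```

Applied to a 1920 x 1080 render this yields the 960-wide image on the left and black on the right, ready for the projector's SBS mode.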
But BFI on a projector isn't ideal because of the lumens cost. It's much better to render two successive 960 x 1080 images and push them to the display as a single 1920 x 1080 SBS frame, which will then be shown in the proper sequence. There's no reason this shouldn't work with 3D-capable LCD monitors too, but those already support 144 Hz+ refresh rates, so it's pointless there. The tricky part is fooling the game into thinking it's rendering to a 120 Hz-capable device context. A DX11 / DX12 wrapper could intercept calls and do all this off-screen rendering internally, but that's not going to get it working on the Windows desktop.
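The compositor step itself is just a horizontal concatenate (hypothetical function; "left eye" = frame N, "right eye" = frame N+1):

```python
import numpy as np

def pack_temporal_sbs(frame_n, frame_n1):
    """Pack two successive 960x1080 renders into one 1920x1080 SBS frame.
    Sent at 60 Hz, the projector's 3D mode unpacks and shows the halves in
    sequence -- 120 distinct images per second, with no lumens penalty."""
    assert frame_n.shape == frame_n1.shape
    return np.concatenate([frame_n, frame_n1], axis=1)
```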
All these 1080p DLP projector hacks are doing is trading spatial resolution for temporal resolution. But on the 4K DLPs, you'd be trading spatial resolution for spatial + temporal resolution. If the screen is static, you would still see a perfectly resolved 4K image, because the samples for each subframe are jittered diagonally.
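A toy model of that jitter (my own sketch of how XPR-style wobulation is usually described, not TI's actual addressing scheme): each 60 Hz output frame is two half-res subframes whose samples interleave like a checkerboard, so on a static image the pair together resolves every 4K pixel:

```python
import numpy as np

src = np.arange(16).reshape(4, 4)   # stand-in for a static 4K image
ys, xs = np.indices(src.shape)
sub_a = src[(ys + xs) % 2 == 0]     # first subframe: "even" diagonal sites
sub_b = src[(ys + xs) % 2 == 1]     # second subframe: half-pixel diagonal shift
# Together the two subframes cover every source sample exactly once.
```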
Maybe Tim Lottes (who reads this forum sometimes) can get AMD to add a custom 4K mode to their drivers for this. He's probably way too busy, but I for one would certainly invest in an AMD GPU if it gave me the ability to do custom desktop hackery like this.
The way VR frame-compositing engines work is basically what we'd need for this to work in games. The game needs to believe it's running at 120 fps at 2716 x 1528, with each alternate frame given a 0.5-pixel diagonal offset; once two new frames are ready, pack them together using my shadertoy code above and present every 1/60th of a second to the actual projector, which only accepts 60 fps. But these projectors are definitely capable of running at 120 fps internally.
There's also the question of effective bit depth. I asked TI whether their new wobulation chips support 10-bit at 4K60, and they said yes. Since each 4K frame is built from two 2.7K subframes, that means they are in fact doing 10-bit 2716 x 1528 at 120 Hz internally, which is actually what we should render our games / movies at.
On some of these DLP models the diagonal sample jitter / offset can also be disabled via a "silent mode".
Of course this all raises the question: why don't they simply expose a PC resolution of 2716 x 1528 at 120 Hz and be done with it? Or 96 Hz with 10-bit, which is about the maximum HDMI 2.0a can carry over 18 Gbps for half-res 2160p.
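Rough arithmetic behind that claim, counting active pixels only and ignoring blanking (HDMI 2.0's 18 Gbps TMDS link carries about 14.4 Gbps of payload after 8b/10b encoding):

```python
pixels = 2716 * 1528            # one wobulation subframe
payload_gbps = 18 * 8 / 10      # ~14.4 Gbps usable after 8b/10b encoding

rate_120 = pixels * 120 * 30 / 1e9   # 10-bit RGB = 30 bits per pixel
rate_96 = pixels * 96 * 30 / 1e9

# 120 Hz at 10-bit (~14.9 Gbps) busts the link even before blanking;
# 96 Hz (~12.0 Gbps) leaves headroom for blanking overhead.
```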
I even asked several of these projector companies why they don't at least add 120 Hz modes at 1080p, and they all said "why would you need more than 60 Hz"... /facepalm