deama wrote:So I got ahold of a black frame insertion fx script from reshade, however running a game at 130-ish fps (I got a 144hz monitor) it flickers like crazy? What's the problem here? Shouldn't I not be able to see this at 130+fps/hz?
You need a refresh-cycle-level filter, rather than a frame-level filter, to prevent the erratic flicker effect of ReShade-based BFI.
BFI is great for niche purposes. It isn't intended to be any more than a workaround.
RealNC wrote:You should probably ask the person who wrote it how it's actually supposed to be used.
The BFI module is an interesting experiment, but it has a fundamental flaw: it's frame-based, not refresh-based. Proper BFI should be guaranteed at 72Hz on a 144Hz display, no matter what the underlying frame rate is. Done that way, framerate drops simply create double images (e.g. like 30fps at 60Hz), which is far less objectionable than the erratic flicker of frame-based BFI.
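To illustrate, here's a quick toy simulation (my own sketch for this post, not the ReShade script's code or any real driver code; the 130fps figure comes from the question above): frame-based BFI ties the black frame to every other rendered frame, so at 130fps on a 144Hz display the gaps between black frames jump around erratically, while refresh-based BFI keeps a rock-steady 72Hz black cadence no matter what the game does.

[code]
# Toy simulation: black-frame cadence of frame-based vs refresh-based BFI
# on a 144 Hz display when the game only renders 130 fps.

REFRESH_HZ = 144.0
GAME_FPS   = 130.0
CYCLES     = 288                 # simulate two seconds of refresh cycles

def frame_based_black_times():
    """Black frame tied to 'every other rendered frame' (shader-style BFI)."""
    blacks, frames_shown = [], 0
    next_frame_ready = 0.0
    for c in range(CYCLES):
        t = c / REFRESH_HZ
        if t >= next_frame_ready:          # a new game frame arrived this refresh
            frames_shown += 1
            next_frame_ready += 1.0 / GAME_FPS
            if frames_shown % 2 == 0:      # every other *frame* is blacked out
                blacks.append(t)
        # else: the previous image simply repeats; no black frame this cycle
    return blacks

def refresh_based_black_times():
    """Black frame tied to every other *refresh cycle*, ignoring frame rate."""
    return [c / REFRESH_HZ for c in range(CYCLES) if c % 2 == 1]

def intervals(times):
    return [round((b - a) * 1000, 2) for a, b in zip(times, times[1:])]

print("frame-based   black-to-black gaps (ms):", sorted(set(intervals(frame_based_black_times()))))
print("refresh-based black-to-black gaps (ms):", sorted(set(intervals(refresh_based_black_times()))))
# frame-based:   mixes ~13.9 ms and ~20.8 ms gaps -> visible erratic flicker
# refresh-based: constant 13.89 ms (a steady 72 Hz) even though frames repeat
[/code]

The erratic mix of ~13.9ms and ~20.8ms gaps in the frame-based case is exactly the flicker being complained about; the refresh-based cadence degrades to double images instead.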
Remember, mine was the world's first website to test 480Hz: http://www.blurbusters.com/480hz
Any BFI errors are visible even at 480Hz.
Even a single framedrop in BFI at 480Hz is visible.
TRUE, many artifacts are NOT visible at high frame rates, but this is one of those 'super sensitivity' behaviors. It's one of those "Milliseconds Matters" topics at Blur Busters (0.5ms vs 1.0ms GtG is human-visible, 0.5ms vs 1.0ms MPRT is human-visible, 480Hz vs 1000Hz is human-visible, etc.). For more reading, see http://www.blurbusters.com/1000hz-journey
Flicker sensitivity is so high that it goes all the way down to microseconds in some situations. Even a 10 microsecond erratic variance in a strobe backlight can become human-visible. A 1.0ms strobe flash versus a 1.01ms strobe flash is a 1% brightness difference. That's like RGB(252,252,252) versus RGB(255,255,255): a 1% brightness difference caused by 1% extra photons hitting your eyeballs, just because your strobe length (MPRT) erratically varied by a mere 10 microseconds (0.01ms). That was visible on a prototype monitor I had on my desk, and the firmware engineer had to clean up the interrupt scheduling and PWM logic so that the flicker stopped. Basically, it randomly varied between a 1.0ms and a 1.01ms strobe flash length, approximately once a second, and that looked like a faint candlelight flicker when displaying a full-screen Windows Notepad window.
The human eye can't see the timing difference itself, but the average number of photons is varying, and your eye DOES see that. If your eye captures more photons, things ARE going to look brighter, no matter how compressed the photons are (e.g. squeezed into the flash of a strobe backlight). This is one of those situations where flicker timing errors are easily seen. It's also why VRR strobing is extremely hard to make comfortable (e.g. why the unofficial GSYNC+ULMB hack never became very popular). So on a 144Hz monitor, any framerate error feeding into erratic frame-based BFI creates a 1/144sec (6.9ms) flicker error.
So if flicker errors as small as 0.01ms are visible to me, your BFI error is 690 times bigger whenever there's a framedrop during frame-based BFI. Try http://www.testufo.com/blackframes -- do something like drag a browser window, or interrupt the BFI so it temporarily runs at 143fps (71fps) instead of 144fps (72fps) -- and that flicker will become visible.
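For the arithmetic behind those two numbers (just back-of-envelope math using the figures in this post, not measured data):

[code]
# Back-of-envelope math for the strobe-variance and BFI-framedrop numbers above.

REFRESH_HZ = 144.0

def strobe_brightness_error(pulse_ms, error_ms):
    """Relative brightness change from a strobe pulse-width error."""
    return error_ms / pulse_ms

print(f"1.00 ms vs 1.01 ms strobe: {strobe_brightness_error(1.0, 0.01):.1%} brighter")
# -> 1.0% brighter (the RGB(252,252,252) vs RGB(255,255,255) comparison above)

framedrop_error_ms = 1000.0 / REFRESH_HZ   # one whole refresh cycle at 144 Hz
print(f"frame-based BFI framedrop error at 144 Hz: {framedrop_error_ms:.1f} ms")
# -> 6.9 ms, roughly 690x the 0.01 ms variance that was already human-visible
[/code]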
Anyway, this is going off on a tangent, but the point is that flicker falls very deep into the "Milliseconds Matters" sphere -- practically all the way to "microseconds made human-visible".
TL;DR: BFI should ideally be done at the refresh cycle level, not at the frame level.
TL;DR2: Doing it properly requires developing a Windows virtual display driver.