Trip wrote:I have been browsing the forums a bit, and I see that combining LightBoost and G-SYNC at the same time is not possible. But what I find strange is that Mark says it would not be possible via software (viewtopic.php?f=5&t=564). It seems rather odd, since I see applications made by Blur Busters that can control the strobe rate for BenQ monitors, and the frame render time on a computer is obviously known.
Trip wrote:It seems to me a combination of those two could easily create a strobe rate that changes dynamically with the frame rate. The formula for the required strobe length adjustment doesn't seem so difficult either.
Trip wrote:(Max refresh / fps) * starting strobe time = new strobe time.
Where starting strobe time is the strobe time used when setting up brightness / blur reduction.
Also, when the frame render time exceeds a certain threshold, say 11 ms (≈ 91 fps), it would simply stop strobing so it does not cause annoying flicker.
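Trip's formula and cutoff can be sketched in a few lines. This is a hypothetical illustration of the arithmetic only; the function names, the 120 Hz maximum, and the 11 ms cutoff are assumptions taken from the post, not a real backlight-driver API.

```python
# Sketch of the proposed variable-strobe formula (assumed values, not a real driver).

MAX_REFRESH_HZ = 120.0     # panel's maximum strobed refresh rate (assumed)
BASE_STROBE_MS = 1.0       # strobe length tuned at max refresh (assumed)
FLICKER_CUTOFF_MS = 11.0   # frame times above this (~91 fps) disable strobing

def strobe_length_ms(frame_time_ms):
    """Return the strobe length for a frame, or None to fall back to a
    steady backlight when flicker would become visible."""
    if frame_time_ms > FLICKER_CUTOFF_MS:
        return None  # below ~91 fps: stop strobing to avoid visible flicker
    fps = 1000.0 / frame_time_ms
    # (max refresh / fps) * starting strobe time = new strobe time
    return (MAX_REFRESH_HZ / fps) * BASE_STROBE_MS

print(strobe_length_ms(10.0))  # ≈ 1.2 ms at 100 fps
print(strobe_length_ms(12.0))  # None: strobing disabled below the cutoff
```

Note the formula keeps the duty cycle (and hence average brightness) constant as the refresh rate drops, which is the point of scaling the strobe length with frame time.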
Trip wrote:Is the software just not fast enough?
Chief Blur Buster wrote:Strobe length needs to be ultra precise, so software controlling exactly when the light comes on and when it goes off is extremely hard. With 1 millisecond of persistence, a 10 microsecond mistake in the strobe length causes a 1% brightness variance (1/100th of a 1 millisecond flash equals 10 microseconds). So if software inaccuracies vary the strobe length, there is noticeable flickering (it actually happened in my testing). If half of the strobes are randomly 1% longer and half are randomly 1% shorter, you've got a candlelight-flicker effect (a 1% increase/decrease in average brightness). Do you have the real-time programming knowledge to time a software-driven strobe backlight (driven via a logic or GPIO line directly from software) with less than 10 microseconds of error?
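The brightness math above is easy to verify: a strobed backlight's perceived brightness is proportional to on-time, so the relative brightness error equals the timing error divided by the strobe length. A back-of-the-envelope check (pure arithmetic, no hardware being timed):

```python
# Verify the quoted numbers: timing error as a fraction of strobe length.

def brightness_error_pct(strobe_len_us, timing_error_us):
    """Brightness is proportional to backlight on-time, so the relative
    brightness error is simply timing error / strobe length."""
    return 100.0 * timing_error_us / strobe_len_us

# A 10 microsecond mistake on a 1000 microsecond (1 ms) strobe:
print(brightness_error_pct(1000.0, 10.0))  # 1.0 (percent), as stated above
```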
Chief Blur Buster wrote:We welcome Blur Busters tweakers to mod their monitors. There are some monitor modders in the Area 51 Forum, including cirthix, who might be quite interested in hacking a G-SYNC monitor to do some strobing. But you'll likely need an external circuit to do so, or some monitor firmware / FPGA programming skills. And I'd bet NVIDIA has already tried variable-rate strobing, but found some quality issues (e.g. flicker).
Trip wrote:Yes, I did read it, but that involved a constant backlight or PWM combined with strobing, right? What I meant was to use only the strobe backlight, just with longer and longer strobes as the refresh rate goes down.
Trip wrote:But maybe that would worsen the situation even more; I thought it would only become apparent below 85 Hz (hence the 11 ms limit I mentioned).
Trip wrote:I thought it was possible via software; I just came up with the solution mentioned above.
Owning a BenQ Z-Series V2 monitor is very educational for researchers wanting to study the visual phenomena of a variable-persistence monitor. It lets you customize strobe timings (e.g. make the strobe earlier, later, longer, shorter), and it even lets you use ToastyX CRU or NVIDIA Custom Resolution to create mammoth blanking intervals (e.g. a 1500-scanline refresh cycle for 1080p), creating the huge pauses that let GtG settle between refreshes before strobing. Some monitors, such as the Z-Series, scan out in real time, so they work well with Vertical Total tweaks, while LightBoost does the fast scan-out in hardware (it slightly buffers the refresh coming from the cable, then does a faster-than-cable scan-out to the LCD).
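The benefit of a large Vertical Total can be quantified: the blanking interval is the dark pause (no scan-out) in which GtG can settle before the strobe flash. A rough sketch, using the 1500-scanline example from the post plus assumed values (1080 active lines, 120 Hz, and VT 1125 as a typical standard 1080p timing):

```python
# How long is the scan-out pause per refresh for a given Vertical Total?
# Example numbers are illustrative assumptions, not measurements.

def blanking_pause_ms(active_lines, vertical_total, refresh_hz):
    """Time per refresh spent in the blanking interval (no scan-out)."""
    refresh_period_ms = 1000.0 / refresh_hz
    line_time_ms = refresh_period_ms / vertical_total
    return (vertical_total - active_lines) * line_time_ms

# Typical 1080p timing (VT 1125) vs. a tweaked 1500-line VT, both at 120 Hz:
print(round(blanking_pause_ms(1080, 1125, 120), 2))  # ≈ 0.33 ms pause
print(round(blanking_pause_ms(1080, 1500, 120), 2))  # ≈ 2.33 ms pause
```

The larger Vertical Total roughly septuples the dark pause between the end of scan-out and the next refresh, which is why these tweaks reduce strobe crosstalk on monitors that scan out in real time.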
Trip wrote:Does this also mean LCD response time is effectively improved, since all the pixels are activated earlier?
Trip wrote:Can a lot of monitors do this, by the way, or is it specific to just this monitor? I have tweaked these settings on my monitor and it would not let me do this without an out-of-bounds error.
Trip wrote:I think I might have formulated it a bit wrong by saying the response time will be improved. What I meant is that the pixel is activated earlier, meaning it has more time to transition between colors, right? But if that is the case, what about IPS displays, which in general have slow response times? By using this technique you can pretty much eliminate the problem, since it all happens on the dark side of the moon, right? The fully scanned-out frame is only shown once the strobe fires, hiding the problem of slow response times (GtG). I don't see any high-Hz displays that are also IPS displays; I guess the biggest reason is the slow pixels. Wouldn't this technique help to actually mask the slow pixels?
Trip wrote:I can also see some potential in those overclockable 120 Hz 2560x1440 displays: since the sweep happens faster, the pixels have more time to transition properly before the next update, resulting in them properly showing their colors instead of just transitioning toward another subsequent color. These displays are running out of spec, and I can't imagine the pixels aren't affected.