PanzerIV wrote: Hi guys, I've finally got my XL2420Z replacement from BenQ Canada, with firmware v2.0 this time.

However, I'm encountering a very annoying issue. I'm sure there's a way to fix it, but for the moment it's a total deal breaker and even worse than v1.0. Let me explain below, but I'll also have a few questions, since this whole monitor tweaking is rather complex and a first for me.
PanzerIV wrote: 1- Setting the strobe phase to its earliest through the Blur Busters utility will lead to the least possible input lag but increase ghosting. What is the default value, as it seems to already be at the earliest? Also, from the earliest to the latest, how many milliseconds of difference in input lag is there? To have the cleanest center possible, is the best value the same for all Z-Series users, or can it differ from one panel to another?
The value is always the same for a specific set of timings (i.e. a specific refresh rate, resolution, and Vertical Total).
You have to recalibrate for a different refresh rate, or a different Vertical Total.
Usually, if you use a large Vertical Total (e.g. VT1350), you can set Crosstalk to 0 with fewer artifacts than on a V1 monitor.
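To see why a large Vertical Total helps crosstalk, here's a rough back-of-envelope sketch in Python (the VT1125 baseline and 1080 active lines are assumptions about typical 1920x1080 timings, not XL2420Z-specific values):

```python
# Rough numbers: at a fixed refresh rate, a larger Vertical Total (VT) scans
# the visible lines out faster, leaving a longer blanking pause for pixel
# transitions to finish before the backlight strobe flash.

ACTIVE_LINES = 1080  # visible lines of a 1920x1080 mode

def scanout_and_blanking_ms(refresh_hz, vertical_total):
    refresh_period_ms = 1000.0 / refresh_hz
    scanout_ms = refresh_period_ms * ACTIVE_LINES / vertical_total
    return scanout_ms, refresh_period_ms - scanout_ms

for vt in (1125, 1350):  # 1125 = a typical standard VT; 1350 = the VT tweak
    scan, blank = scanout_and_blanking_ms(120, vt)
    print(f"120Hz VT{vt}: scanout {scan:.2f} ms, blanking {blank:.2f} ms")

# 120Hz VT1125: scanout 8.00 ms, blanking 0.33 ms
# 120Hz VT1350: scanout 6.67 ms, blanking 1.67 ms
```

With VT1350, the strobe flash gets roughly five times as much blanking time to hide LCD pixel transitions in, which is why crosstalk can be pushed much lower than on default timings.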
PanzerIV wrote: 2- If it's recommended to run the refresh rate at 100Hz or higher to avoid flicker, but the VT trick only works at 120Hz or less, why were you guys trying so many very low refresh rates such as 50-60Hz?! Isn't it much worse for motion panning, using a 1000Hz mouse, and flickering???
It depends on your goal: input lag, or motion fluidity.
Prioritizing input lag & VSYNC OFF: If you use VSYNC OFF and you run ultrahigh frame rates, you definitely don't want to run a low refresh rate. You want to run at 120Hz or even 144Hz to get the best performance. You need minimum time between the control input and the onscreen reaction, even if it means microstutters. (During VSYNC OFF, even 200fps@120Hz can still have visible microstutters to people sensitive to them.) Reducing input lag is EXTREMELY critical in competitive gameplay -- especially first person shooters -- since reacting 10ms before your opponent can win the frag. Having less input lag than your competition helps a great deal. There's really no way around it; a gamer often has to compromise motion perfection (e.g. microstutters) in order to gain minimum input lag.
Prioritizing motion fluidity: If you prefer zero stutters and zero tearing, then sometimes a lower framerate at a lower Hz can actually look smoother, since it's easier for a GPU to run framerates matching refresh rates. The "perfect motion effect" (CRT effect, TestUFO-smooth effect, Super Mario Brothers Nintendo butter-smooth pan effect, space-shoot-em-up smooth pan effect) occurs on strobed displays at framerate == refreshrate == stroberate. Especially on CRT and plasma. If the "perfect motion effect" is what you are seeking, then the framerate physics are different. You want a form of low-latency VSYNC ON in order to achieve this kind of effect (not suitable for elite professional competitive FPS gameplay, as you really need VSYNC OFF for that).

So 75fps@75Hz@2ms persistence can look equally clear in motion as 100fps@100Hz@2ms persistence, since persistence (strobe length) dictates the amount of motion blur, rather than refresh rate or frame rate, provided you have an exact match between framerate, refresh rate, and strobe rate. You do get more flicker and stroboscopic effects at lower refresh rates, but the motion still looks "perfect" when you track your eyes along the motion, much like 8-bit smooth panning in old games at 60Hz. Consequently, 75fps@75Hz strobed looks MUCH smoother (butter-smooth pan effect) than 113fps@120Hz.

For fully refresh-rate-synchronized motion, there is a full and complete elimination of all forms of microstutters, provided the game engine isn't bottlenecked by anything else. E.g. keyboard strafing in Source Engine games produces a "perfect stutter-free pan effect" on strobed displays. Sliding the rails in BioShock Infinite looks like "perfect motion", with zero microstutters, if BioShock Infinite has no framedrops. (BioShock is singleplayer, so the input lag of VSYNC ON is less critical, if you're prioritizing motion fluidity.) There's certainly more flicker/stroboscopic effect, but you gain the butter-smooth, stutter-free, tear-free perfect pan effect, especially for keyboard-controlled or game-controlled panning (e.g. no mouse microstutters).
Some people around here prioritize input lag, and some prioritize perfect-looking motion (the CRT effect).
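To make the persistence argument above concrete, here's a small sketch (the 960 px/sec panning speed is an assumed example value, not a measurement):

```python
# Blur width is roughly the distance the eye tracks across the panel while
# the strobe flash is lit, so persistence (flash length) sets motion blur,
# not refresh rate -- provided framerate == refreshrate == stroberate.

def blur_width_px(persistence_ms, pan_speed_px_per_sec):
    return pan_speed_px_per_sec * persistence_ms / 1000.0

PAN_SPEED = 960  # px/sec panning speed, an assumed example value

for hz in (75, 100, 120):
    blur = blur_width_px(2.0, PAN_SPEED)
    print(f"{hz}fps@{hz}Hz @ 2.0ms persistence: ~{blur:.1f} px motion blur")

# All three cases print ~1.9 px: equal persistence, equally clear motion.
# Lower Hz just adds flicker/stroboscopic effects, not blur.
```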
PanzerIV wrote: 3- You say the image becomes brighter with VT1350, but how much brighter does it get? Is it like +40cd/m2? Is it also only noticeable when enabling BenQ's Motion Blur Reduction? I didn't seem to really notice a brightness difference.
It is only noticeable during BENQ Blur Reduction, and only at higher settings (e.g. above 2.5ms persistence). Apparently the monitor calculates different strobe flash lengths with large VT settings. Strobe Utility's persistence scale is calibrated for a VT1350-tweaked 120Hz mode. The brightness of the 0.5ms setting is exactly the same, while the 3.0ms setting is about 50% brighter, and the 5.0ms setting is actually twice as bright. If you're using low persistence settings, you won't notice much of a difference (if any).
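As a rough sketch of why brightness scales with the persistence setting (illustrative duty-cycle math only, not measured XL2420Z values):

```python
# Average strobe brightness scales (roughly) with duty cycle: the fraction
# of each refresh cycle the backlight is actually lit.

def duty_cycle(persistence_ms, refresh_hz):
    return persistence_ms / (1000.0 / refresh_hz)

for p in (0.5, 2.5, 3.0, 5.0):
    print(f"{p}ms persistence @ 120Hz: {duty_cycle(p, 120):.1%} duty cycle")

# 0.5ms -> 6.0%, 2.5ms -> 30.0%, 3.0ms -> 36.0%, 5.0ms -> 60.0%
# Doubling persistence roughly doubles brightness (at the cost of blur).
```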
PanzerIV wrote: 4- A very bad issue I'm having is that the Blur Busters Utility keeps resetting every time I open it. How can I make sure it saved the settings I last applied before closing it?
It should be automatically reloading the last-memorized settings.
When you adjust "Persistence", does the brightness of the monitor change? (it should).
PanzerIV wrote: 5- When I had firmware v1.0, I could set VT1350 at 100-120Hz and still use Blur Reduction. Now with my v2.0 replacement, I use the same high quality DVI-D cable and the same computer running Win 8.1 64-bit with GeForce GTX 670 SLI, yet now whenever I activate Blur Reduction with VT1350, no matter if at 100 or 120Hz, I get a black screen!

That is abnormal.
Are you adding the VT1350 tweak via NVIDIA or via ToastyX?
-- The default settings might be a bit messed up. Try activating the service menu (instructions at the bottom of the Strobe Utility page), and setting "Strobe Phase" to 00, before re-enabling BENQ Blur Reduction. Once this is done, reload Strobe Utility, and see if adjusting Persistence changes the brightness of the monitor.
-- Are you getting a green checkmark when you run Strobe Utility?
-- When you adjust Persistence (Blur Reduction ON), does the brightness of the screen change?
-- Try without the VT1350 tweak at first. Does it work?
-- Try testing BENQ Blur Reduction at 100Hz too. Does it work?
-- If it does, try setting Vertical Total to 1350 again, but change Horizontal Total to 2040 (to allow VT1350 to work at 120Hz without needing the ToastyX patcher). Usually you'll see 2080 for Horizontal Total, and sometimes larger (2200) depending on how the utility calculates timings. Large Horizontal Totals are not desirable in most cases, and can lead to a black screen when combined with large Vertical Totals (see the pixel clock sketch after this list).
-- If all else fails, apply the ToastyX dot clock patch, and see if that solves your black screen.
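For reference, here's the pixel clock math behind the Horizontal Total suggestion above, as a small sketch (the ~330 MHz dual-link DVI cap is a typical figure; exact limits vary by GPU and driver):

```python
# Required DVI pixel clock = Horizontal Total x Vertical Total x refresh rate.
# Dual-link DVI commonly tops out around 330 MHz (exact caps vary by GPU and
# driver -- removing that cap is what the ToastyX pixel clock patcher does).

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    return h_total * v_total * refresh_hz / 1e6

for ht in (2200, 2080, 2040):
    print(f"HT{ht} x VT1350 @ 120Hz: {pixel_clock_mhz(ht, 1350, 120):.2f} MHz")

# HT2200 x VT1350 @ 120Hz: 356.40 MHz  (well over the typical cap)
# HT2080 x VT1350 @ 120Hz: 336.96 MHz
# HT2040 x VT1350 @ 120Hz: 330.48 MHz  (right at the edge; may work
#                                       without the patcher, driver permitting)
```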
Please do tell us what solution works for you -- I am interested in what caused your black screens, as well as which modes worked and which didn't, as those will provide clues about what's going on.