Asus VG248QE: Problems with LightBoost in Battlefield 4

Ask about motion blur reduction in gaming monitors. Includes ULMB (Ultra Low Motion Blur), NVIDIA LightBoost, ASUS ELMB, BenQ/Zowie DyAc, Turbo240, ToastyX Strobelight, etc.
Falkentyne
Posts: 2796
Joined: 26 Mar 2014, 07:23

Re: Asus VG248QE: Problems with LightBoost in Battlefield 4

Post by Falkentyne » 16 Nov 2014, 14:26

I think you should try the 2420G.
One thing the 2420G does, which the other screens do NOT do, is let you switch between the G-Sync module (with all of Nvidia's own settings and the Nvidia-only ULMB - and remember, G-Sync boards do NOT have onboard scalers!) and the normal BenQ module, which has all of its own options: scaler, gamma settings, blur reduction, extra inputs, etc. That way you retain console compatibility AND AMD video card compatibility (you can use blur reduction on your Radeon or console; ULMB only works with Nvidia, and consoles, of course, don't have DisplayPort connections).

Otherwise, you can go for the expensive ROG Swift with its 8-bit TN panel, or the AOC. Between the BenQ and the AOC, I'd much rather choose the 2420G (more features, more useful, and more connectivity). So: the 2420G if you're sensible, and the ROG Swift if you're rich (and can get a good panel without inversion artifacts, dead pixels, or backlight bleed/oval issues). There have been some quality-control reports on the ROG boards, including a few that actually died or developed unfixable massive corruption (even on the BIOS screen or with the DisplayPort connector unplugged) after some time, and no one has determined whether that's a defective panel issue or firmware corrupting something. But an 8-bit 1440p panel looks GREAT if you get a good one.

hbjdk
Posts: 15
Joined: 15 Nov 2014, 13:21

Re: Asus VG248QE: Problems with LightBoost in Battlefield 4

Post by hbjdk » 16 Nov 2014, 22:42

What does "onboard scalers" mean? I don't understand. I haven't read about that before.

Good points about the 2420G. I won't be buying any consoles though - I'm a PC kind of guy - and as for AMD video cards, my Fractal Design R4 case has poor ventilation, so the extra heat from an AMD card compared to Nvidia is not tempting. But of course future products from AMD could be different in that regard, so it could very well be good to at least have the option of using an AMD card instead of Nvidia.

But can't you add an AMD video card to your computer even though you're using a GSync-enabled monitor?
I think you can disable GSync on most monitors, can't you?

The Rog Swift is just too expensive for me so that is not an option.
I've read about the quality issues also, that's really scary with such an expensive product.

Here's another option I just found though:
http://www.computersalg.dk/produkt/2210659

A somewhat reasonably priced 27" Acer 1080p screen with GSync - and it's available now.
Not sure if it has ULMB, though, and I'd miss out on all the nice gaming features of the BenQ.

I'm just eager to get a new monitor now and not in 1-2 months time.
I'm a little worried about continuing to use my current screen now that I've found out the built-in PWM hurts my eyes.
I have some permanent injuries from doing intensive sport in my youth, so I've learned the hard way that it's important to take care of yourself and listen to whatever signals your body is sending you.
Last edited by hbjdk on 16 Nov 2014, 22:44, edited 1 time in total.

hbjdk
Posts: 15
Joined: 15 Nov 2014, 13:21

Re: Asus VG248QE: Problems with LightBoost in Battlefield 4

Post by hbjdk » 16 Nov 2014, 22:44

Btw. I have a thread on another forum about this matter:
http://hardforum.com/showthread.php?t=1841311

Check what this guy Marm0t writes - he's got hands-on experience with using GSync and says it's something you can live without if you get high fps.

Falkentyne
Posts: 2796
Joined: 26 Mar 2014, 07:23

Re: Asus VG248QE: Problems with LightBoost in Battlefield 4

Post by Falkentyne » 17 Nov 2014, 00:20

hbjdk wrote:What does "onboard scalers" mean? I don't understand. I haven't read about that before.

Scaling by the GPU ("GPU scaling", i.e. software scaling) adds more input lag than scaling done by the monitor's own hardware scaler.
The G-Sync boards don't have a scaler.

A scaler lets the monitor accept a lower resolution as input and then use the monitor's own controls to stretch the image to your preferred aspect ratio or size, instead of relying on the video card drivers to set the aspect ratio. You can also use Custom Resolution Utility (CRU) to add a refresh rate and resolution that your game (usually an older game) uses (assuming you don't get cable errors), switch to it, and then use the monitor's scaling features to display the game at its original aspect ratio.
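To illustrate what an "aspect ratio" scaling mode computes (whether the math is done by the GPU driver or by the monitor's scaler), here's a small sketch. The function name and example numbers are mine, purely for illustration, not from any actual driver or firmware:

```python
def fit_aspect(src_w, src_h, panel_w, panel_h):
    """Scale a source resolution to fill as much of the panel as
    possible while preserving the source's aspect ratio."""
    scale = min(panel_w / src_w, panel_h / src_h)
    out_w = round(src_w * scale)
    out_h = round(src_h * scale)
    # The rest of the panel is filled with black borders
    # (pillarbox at the sides, letterbox at top/bottom).
    return out_w, out_h

# A 4:3 game (1024x768) shown on a 16:9 1080p panel:
print(fit_aspect(1024, 768, 1920, 1080))  # (1440, 1080) - pillarboxed
```

A "full screen" scaling mode would instead stretch 1024x768 to the full 1920x1080, distorting the 4:3 image; that's why being able to pick the scaling mode in the monitor matters for older games.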

I'm sure Chief or ToastyX will come and rip my post apart but hey, I tried :)
