Is having hardware G-Sync all that important these days? Also, just how big of a problem is burn-in?

High-Hz OLEDs produce excellent strobeless motion blur reduction thanks to fast GtG pixel response. 60Hz vs 120Hz vs 240Hz is easier to tell apart on OLED than on LCD, and more visible to mainstream users. Includes WOLED and QD-OLED displays.
lukeman3000
Posts: 21
Joined: 12 Jul 2014, 21:06

Is having hardware G-Sync all that important these days? Also, just how big of a problem is burn-in?

Post by lukeman3000 » 15 Mar 2023, 18:51

I have been learning a lot about the monitors currently available on the market, and I find myself heavily interested in the Alienware AW3423DWF. This monitor seems to be at the top of its class right now. However, a few things concern me -

1. I have a 4090 and as such, can take advantage of hardware G-Sync. But does this even matter these days? The DWF variant of this monitor does not have hardware G-Sync (though it supports G-Sync) - am I missing out on anything here?

2. Burn-in is my other big concern. If I let the monitor run its anti-burn-in features when it wants to, do I really have anything to worry about? Is this also affected by brightness levels and such (such that it's unlikely I'd be running at a brightness where burn-in would be a concern, given my desire for color accuracy and such)? I don't really know how all of these things play together or how big of a risk this is. I don't really like the feeling that I can't use my monitor for "regular stuff" out of fear that certain static images could be burning themselves into my screen over time…

3. What about a Mini LED 4K monitor as an alternative? As I understand it, these don't have the same risk of burn-in, so maybe it makes more sense to go for a higher resolution and trade the OLED for no burn-in risk? According to Hardware Unboxed, the MSI Mini LED competes fairly well with the Alienware in terms of HDR accuracy…

Just looking for some insight and opinions here. Basically, I’m really interested in the Alienware but afraid of burn-in. And also I don’t know how much more I might appreciate 4K instead, having not yet experienced it.

User avatar
jorimt
Posts: 2484
Joined: 04 Nov 2016, 10:44
Location: USA

Re: Is having hardware G-Sync all that important these days? Also, just how big of a problem is burn-in?

Post by jorimt » 15 Mar 2023, 19:19

lukeman3000 wrote:
15 Mar 2023, 18:51
1. I have a 4090 and as such, can take advantage of hardware G-Sync. But does this even matter these days? The DWF variant of this monitor does not have hardware G-Sync (though it supports G-Sync) - am I missing out on anything here?
Hardware G-SYNC primarily provides three advantages over equivalent official G-SYNC Compatible FreeSync displays:
1) Immunity to any (admittedly rare) driver-level behavioral differences from version to version, since VRR functionality is dependent on the module, not the GPU.
2) Module-level rather than driver-level LFC, making it more accurate/stable.
3) Dynamic/variable overdrive at the module level, better preventing ghosting at all framerates within the refresh rate range.

However, #3 is not needed on OLED due to its near instantaneous pixel response times.
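To make #2 a bit more concrete, here is a rough sketch of the idea behind LFC (illustrative logic only, not NVIDIA's actual module or driver implementation; the 48-165Hz VRR range is just an assumed example): when the framerate drops below the panel's minimum VRR rate, each frame is repeated enough times that the physical refresh rate stays inside the supported range.

Code: Select all

# Rough LFC (Low Framerate Compensation) sketch -- illustrative only,
# not the actual G-SYNC module or driver algorithm.
VRR_MIN_HZ = 48.0   # assumed lower bound of the panel's VRR range
VRR_MAX_HZ = 165.0  # assumed upper bound

def lfc_refresh(frametime_ms):
    """Return (frame repeat count, resulting physical refresh rate in Hz)."""
    framerate = 1000.0 / frametime_ms
    repeats = 1
    # Repeat the frame until the physical refresh rate re-enters the VRR range.
    while framerate * repeats < VRR_MIN_HZ:
        repeats += 1
    return repeats, min(framerate * repeats, VRR_MAX_HZ)

for fps in (120, 60, 40, 25):
    repeats, hz = lfc_refresh(1000.0 / fps)
    print(f"{fps} fps -> frame shown {repeats}x, panel refreshes at ~{hz:.0f} Hz")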
lukeman3000 wrote:
15 Mar 2023, 18:51
2. Burn-in is my other big concern.
OLED has uneven per-pixel aging, exacerbated by a lack of varied content over a very long period of time, so if you're using the monitor for desktop/productivity/browsing, and/or playing the same game with a prominent static HUD for hours at a time, OLED may not be the best panel type for your use case.

If, however, you're primarily using it with a variety of games and media, and using a secondary monitor for productivity, you shouldn't have any issues in this respect, so long as the panel ages evenly.

Also, it's worth mentioning that current-gen OLED gaming monitors don't have the best text clarity due to their subpixel structures. If you're gaming, it won't much matter, but for desktop/productivity, it can be noticeable vs. a traditional LCD.

Finally, current-gen QD-OLED-type monitors don't have a polarizer layer, so they can look grayish in certain lighting conditions.
lukeman3000 wrote:
15 Mar 2023, 18:51
3. What about a Mini LED 4K monitor as an alternative?
Mini LED beats OLED where HDR brightness and image retention are concerned, but it still falls short on black levels due to very noticeable blooming from the extreme backlight brightness capabilities and the limited number of dimming zones. FALD (full array local dimming) backlights can also create DSE (dirty screen effect) in medium to brighter scenes where the screen is more uniform (think deserts, snow, etc.).

If you plan on using VRR, however, OLED has one serious weakness that must be considered: near-black flicker, which can only be mitigated by ensuring frametime performance is rock solid per game. Large enough spikes or variances will cause very noticeable flicker.

That said, I'm not sure how the latest Mini LED models fare with VRR operation either. It's possible some of them have flicker issues with VRR in certain situations, especially models without G-SYNC modules; within the normal working range, VRR flicker on LCD-type panels is typically absent on monitors with modules, but present on some without.
lukeman3000 wrote:
15 Mar 2023, 18:51
I don’t know how much more I might appreciate 4K instead, having not yet experienced it.
How noticeable the difference between 1440p and 4K is can depend on the screen size. 27" 1440p vs 42" or larger 4K, for instance, is not a massive visual difference at typical viewing distances. You also have to factor in the performance cost of the higher resolution (around ~25%+ in most cases).
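For a sense of scale on that cost, the raw pixel-count arithmetic is below (just arithmetic, not a benchmark; the actual framerate hit depends on how GPU-bound the game is):

Code: Select all

# 4K renders 2.25x the pixels of 1440p.
qhd = 2560 * 1440   # 3,686,400 pixels
uhd = 3840 * 2160   # 8,294,400 pixels
print(f"4K / 1440p pixel ratio: {uhd / qhd:.2f}x")   # -> 2.25x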
(jorimt: /jor-uhm-tee/)
Author: Blur Busters "G-SYNC 101" Series

Displays: ASUS PG27AQN, LG 48CX VR: Beyond, Quest 3, Reverb G2, Index OS: Windows 11 Pro Case: Fractal Design Torrent PSU: Seasonic PRIME TX-1000 MB: ASUS Z790 Hero CPU: Intel i9-13900k w/Noctua NH-U12A GPU: GIGABYTE RTX 4090 GAMING OC RAM: 32GB G.SKILL Trident Z5 DDR5 6400MHz CL32 SSDs: 2TB WD_BLACK SN850 (OS), 4TB WD_BLACK SN850X (Games) Keyboards: Wooting 60HE, Logitech G915 TKL Mice: Razer Viper Mini SE, Razer Viper 8kHz Sound: Creative Sound Blaster Katana V2 (speakers/amp/DAC), AFUL Performer 8 (IEMs)

lukeman3000
Posts: 21
Joined: 12 Jul 2014, 21:06

Re: Is having hardware G-Sync all that important these days? Also, just how big of a problem is burn-in?

Post by lukeman3000 » 15 Mar 2023, 21:09

Thank you for the very detailed reply!

So on burn-in, is it really my responsibility to vary the media being consumed and babysit static images? Doesn't the AW have built-in automatic protection features that run every so many hours to mitigate this? I would be using it primarily for gaming, and I play different games, but sometimes I do indeed play the same game for a few hours at a time. Are you telling me that I can't safely do this? Surely my usage is not unusual; I have to imagine many other people game much longer than me in the same games, etc.

Regarding G-Sync - the Alienware has a version with a G-Sync module and a newer version without. The newer version apparently has some benefits over the older that make it a bit more attractive (such as user-updatable firmware and a quieter fan) — are you of the mind that I wouldn't really miss out by getting the newer version without the G-Sync module?

Finally, are you aware of any impending releases that might be better or more interesting? I guess you could always wait forever, but I wonder if more and perhaps better options are coming soon. I almost think I might prefer a Mini LED over an OLED due to the burn-in concerns, but so many people are telling me to get the OLED and that I won't regret it lol.

User avatar
jorimt
Posts: 2484
Joined: 04 Nov 2016, 10:44
Location: USA

Re: Is having hardware G-Sync all that important these days? Also, just how big of a problem is burn-in?

Post by jorimt » 15 Mar 2023, 21:56

lukeman3000 wrote:
15 Mar 2023, 21:09
So on burn-in, is it really my responsibility to vary the media being consumed and babysit static images? Doesn't the AW have built-in automatic protection features that run every so many hours to mitigate this? I would be using it primarily for gaming, and I play different games, but sometimes I do indeed play the same game for a few hours at a time. Are you telling me that I can't safely do this? Surely my usage is not unusual; I have to imagine many other people game much longer than me in the same games, etc.
No, what I'm saying is that if all you do is play Destiny 2 with the HUD on full blast 100 hours per week, and then browse the internet, read Reddit threads, and watch Twitch streams (and not in fullscreen) in between, you're probably going to want to go LCD.

If not, and you vary your content and don't walk away with static content left on your screen for hours at a time, the dangers of OLED IR are exaggerated, even without the protection features. Beyond the automatic compensation cycles that run after a few hours of use whenever you shut the display off, I have never kept those features enabled on any of my OLED displays, and I've owned three so far (over a period of several years) with no discernible permanent IR issues on any of them.
lukeman3000 wrote:
15 Mar 2023, 21:09
Regarding G-Sync - the Alienware has a version with a G-Sync module and a newer version without. The newer version apparently has some benefits over the older that make it a bit more attractive (such as user-updatable firmware and a quieter fan) — are you of the mind that I wouldn't really miss out by getting the newer version without the G-Sync module?
If the model in question is OLED, a G-SYNC module is much less useful.

I notice little difference in base G-SYNC operation between my two gaming displays (one LCD with native G-SYNC and the other OLED with G-SYNC Compatible). The biggest difference is variable overdrive, which isn't a factor with OLED.

Again though, keep OLED VRR flicker in mind; you will get noticeable near-black flicker in any game that doesn't have rock solid frametime performance, so for games that don't, you'll have to use RTSS to limit the framerate to a level your system can sustain on average the majority of the time so that RTSS remains the limiting factor, and thus keeps frametime performance stable enough to avoid the issue.
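For a starting point on that cap, one rough approach (an illustration only, not anything RTSS itself calculates; the 95th-percentile choice and the sample numbers are arbitrary) is to log frametimes from a demanding section of the game and cap just below the framerate the system holds the vast majority of the time:

Code: Select all

# Illustrative sketch: derive an RTSS-style framerate cap from logged frametimes.
def suggest_fps_cap(frametimes_ms, percentile=95.0):
    """Cap near the framerate the system sustains ~95% of the time."""
    ordered = sorted(frametimes_ms)                       # slowest frames at the end
    idx = min(len(ordered) - 1, int(len(ordered) * percentile / 100.0))
    return int(1000.0 / ordered[idx])                     # round down for headroom

# Example: frametimes hovering around 7-9 ms with an occasional 12 ms spike
samples = [7.1, 7.4, 8.0, 7.9, 8.3, 12.1, 7.6, 8.8, 7.2, 9.0]
print(suggest_fps_cap(samples))   # -> 82, i.e. set an ~82 fps cap in RTSS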
lukeman3000 wrote:
15 Mar 2023, 21:09
Finally, are you aware of any impending releases that might be better or more interesting?
Monitor features are a Venn diagram of trade-offs. It really depends on what you're going for specifically.

Me, for instance? I wouldn't buy another OLED display for use as a dedicated monitor right now (already own an LG CX) because of the near-black VRR flicker still not being "fixed," a lack of glossy screen and/or polarizer, the pixel-structure issues with text for desktop/productivity use cases, and the fact that more than half of the time I use my monitor for work purposes with lots of static scenarios.

As such, instead of buying one of the latest 240Hz OLED monitors, I own the best 240Hz 1440p native G-SYNC LCD monitor currently available (PG279QM).

That, and honestly, I calibrate all of my displays to the 100 nits standard for SDR mode, and while I watch HDR on my dedicated home theater setup (77" OLED), I play 95% of my games on my PG279QM at 100 nits in SDR mode.

In my opinion, HDR gaming just isn't up-to-par with movies right now (there's just no accuracy, let alone consistency from game-to-game, especially on PC), so even though I can play games in HDR at 120Hz 4k on my 48" LG CX OLED, I currently prefer the higher refresh rate and lower latency in SDR on my LCD monitor (which can't do "real" HDR in its HDR mode anyway, due to it being a lower contrast IPS-type panel).

I'm obviously in the minority/on the fringe with much of this, but that's the point; monitor requirements will vary heavily with the individual, their preferences, tolerances, and use case.

lukeman3000
Posts: 21
Joined: 12 Jul 2014, 21:06

Re: Is having hardware G-Sync all that important these days? Also, just how big of a problem is burn-in?

Post by lukeman3000 » 15 Mar 2023, 23:10

jorimt wrote:
15 Mar 2023, 21:56
Again though, keep OLED VRR flicker in mind; you will get noticeable near-black flicker in any game that doesn't have rock solid frametime performance, so for games that don't, you'll have to use RTSS to limit the framerate to a level your system can sustain on average the majority of the time so that RTSS remains the limiting factor, and thus keeps frametime performance stable enough to avoid the issue.
When you say OLED VRR flicker, and noticeable near-black flicker, does this mean that the monitor flickers when the image being displayed is predominantly black? As in, the monitor flickers on/off quickly in these scenarios, or something else?

Does VRR flicker only happen with G-Sync compatible OLED monitors, and not monitors that actually have a G-Sync module? Is it a problem equally across all OLED monitors?

And finally, do you mean that if your frame rate is constantly dipping you would want to limit it to the lowest point it can dip to in order to prevent this?

Interesting; this is the first time I've heard of such an issue regarding OLED monitors.

User avatar
Chief Blur Buster
Site Admin
Posts: 11653
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: Is having hardware G-Sync all that important these days? Also, just how big of a problem is burn-in?

Post by Chief Blur Buster » 16 Mar 2023, 00:13

jorimt wrote:
15 Mar 2023, 19:19
OLED has uneven per-pixel aging exacerbated by a lack of varied content over a very long period of time, so if you're using the monitor for desktop/productivity/browsing, and/or playing the same game with a prominent static hud for hours at a time, OLED may not be the best panel-type for your use-case.
The LG version of 240Hz OLED has been longevity-tested with office applications. I use Visual Studio on my 240Hz OLED now.

BTW, the new RTINGS OLED tests show that the latest LG-branded C2 OLED is much more burn-in resistant. There is a risk, but frankly, OLED is now office-ready if you don't drive it hard. People have been reporting years of OLED operation for office use now, though not everyone.

(Note: Disambiguating the permanent wear-and-tear-based burn-in from the phosphorescent-style image retention effect -- these are two independent mechanisms.)

OLED VRR flicker is less visible on the 240hz OLEDs than on the older LG 4K OLED TVs, so you might not see it as much.

Also, G-SYNC native has better gametime:photontime sync, and the LFC behavior is better -- e.g. fewer stutters punching through VRR. That said, this may not be important at the low framerates where VRR is the most helpful (e.g. 40-80fps erratic stutter mechanics).
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter

Forum Rules wrote:  1. Rule #1: Be Nice. This is published forum rule #1. Even To Newbies & People You Disagree With!
  2. Please report rule violations If you see a post that violates forum rules, then report the post.
  3. ALWAYS respect indie testers here. See how indies are bootstrapping Blur Busters research!

User avatar
jorimt
Posts: 2484
Joined: 04 Nov 2016, 10:44
Location: USA

Re: Is having hardware G-Sync all that important these days? Also, just how big of a problem is burn-in?

Post by jorimt » 16 Mar 2023, 08:51

lukeman3000 wrote:
15 Mar 2023, 23:10
When you say OLED VRR flicker, and noticeable near-black flicker, does this mean that the monitor flickers when the image being displayed is predominantly black? As in, the monitor flickers on/off quickly in these scenarios, or something else?
See:
https://youtu.be/RWwqzQ6CD2M
https://youtu.be/VpDBAZ0vNuo?t=794
lukeman3000 wrote:
15 Mar 2023, 23:10
Does VRR flicker only happen with G-Sync compatible OLED monitors, and not monitors that actually have a G-Sync module? Is it a problem equally across all OLED monitors?
VRR flicker has different causes on different display types. For LCD, it's typically because of a sub-optimal LFC range (starts too late), whereas for OLED, it's currently because the display's gamma response is fixed to the max refresh rate of the monitor, which means said response isn't able to dynamically adapt to the variable refresh rate.

So the particular LCD VRR flicker we're talking about is typically only present on some G-SYNC Compatible FreeSync displays that have poor LFC tuning, whereas on most LCD displays containing G-SYNC modules, this is avoided.

For OLED, however, its particular form of flicker is present with or without the module.
lukeman3000 wrote:
15 Mar 2023, 23:10
And finally, do you mean that if your frame rate is constantly dipping you would want to limit it to the lowest point it can dip to in order to prevent this?
No, gradual drops in average framerate typically won't trigger the flicker. What I mean is that when the frametime changes sharply enough from frame to frame, it will cause micro-brightness shifts in darker scenes due to the dynamically changing refresh rate during VRR operation, since each "Hz" is offset slightly differently from that fixed gamma value.
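Here is a toy numerical illustration of that mechanism (the tuning point and the drift-per-millisecond figure are invented numbers, not real panel data): the further a refresh interval lands from the interval the gamma curve was tuned for, the further off the dark greys are, and it's the frame-to-frame swing in that error that reads as flicker.

Code: Select all

# Toy model of OLED VRR gamma flicker -- invented numbers, illustration only.
MAX_HZ = 175.0
TUNED_PERIOD_MS = 1000.0 / MAX_HZ        # interval the gamma response is fixed to

def brightness_error_pct(frametime_ms):
    # Assume ~0.2% of drift per millisecond away from the tuned interval.
    return 0.2 * (frametime_ms - TUNED_PERIOD_MS)

steady = [10.0, 10.2, 9.9, 10.1]         # ~100 fps with stable frametimes
spiky  = [10.0, 25.0, 10.0, 24.0]        # ~100 fps with sharp dips toward 40 fps

for label, trace in (("steady", steady), ("spiky", spiky)):
    errors = [brightness_error_pct(ft) for ft in trace]
    swing = max(errors) - min(errors)    # frame-to-frame change is what you notice
    print(f"{label}: dark-grey brightness swings ~{swing:.1f}% between frames")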
Chief Blur Buster wrote:
16 Mar 2023, 00:13
The LG version of 240Hz OLED has been longevity-tested with office applications. I use Visual Studio on my 240Hz OLED now.
That's good to hear.

From recent RTINGS IR tests, it does appear that the latest generation of LG WOLED panels (C2/G2 series and newer) are more IR-resistant than first-gen QD-OLED. That probably goes for monitor models using those panel types as well.

Regardless, IR certainly isn't as much of an issue on either OLED panel type as many non-OLED owners fear.
Chief Blur Buster wrote:
16 Mar 2023, 00:13
OLED VRR flicker is less visible on the 240hz OLEDs than on the older LG 4K OLED TVs, so you might not see it as much.
Would be interesting to see how much it has improved over 120Hz WOLED and QD-OLED panels in like-for-like scenarios. I have yet to see a video (or text) review from any source that covers that yet though, so I'll take your word for it.

User avatar
Chief Blur Buster
Site Admin
Posts: 11653
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: Is having hardware G-Sync all that important these days? Also, just how big of a problem is burn-in?

Post by Chief Blur Buster » 16 Mar 2023, 19:50

Yes, we need to see more 240Hz OLEDs tested by reviewers. I'm mighty pleased with the 240Hz OLED pros/cons versus LCD pros/cons.

I think it has a big place in future desktop monitors, especially as longevity continues to increase.

BoredErica
Posts: 9
Joined: 19 Nov 2022, 08:41

Re: Is having hardware G-Sync all that important these days? Also, just how big of a problem is burn-in?

Post by BoredErica » 16 May 2023, 02:51

Chief Blur Buster wrote:
16 Mar 2023, 00:13
The LG version of 240Hz OLED has been longevity-tested with office applications. I use Visual Studio on my 240Hz OLED now.

BTW, the new RTINGS OLED tests show that the latest LG-branded C2 OLED is much more burn-in resistant. There is a risk, but frankly, OLED is now office-ready if you don't drive it hard. People have been reporting years of OLED operation for office use now, though not everyone.

(Note: Disambiguating the permanent wear-and-tear-based burn-in from the phosphorescent-style image retention effect -- these are two independent mechanisms.)

OLED VRR flicker is less visible on the 240hz OLEDs than on the older LG 4K OLED TVs, so you might not see it as much.
Where can I read about the longevity test of an LG 240Hz OLED monitor? That would be an interesting read.

When you say that VRR flicker is less visible on 240hz OLEDs, does this require achieving 240fps or does simply having the newer panel or 240hz refresh rate minimize the VRR flicker? How does it reduce the issue?

Thanks

User avatar
Chief Blur Buster
Site Admin
Posts: 11653
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: Is having hardware G-Sync all that important these days? Also, just how big of a problem is burn-in?

Post by Chief Blur Buster » 18 May 2023, 20:00

BoredErica wrote:
16 May 2023, 02:51
When you say that VRR flicker is less visible on 240hz OLEDs, does this require achieving 240fps or does simply having the newer panel or 240hz refresh rate minimize the VRR flicker? How does it reduce the issue?
Well, it's hard to answer this one. Fluctuations between 100<->200fps have less gamma flicker than fluctuations between 50<->100fps. The gamma flicker is simply like a 1%-3% brightness change in dark greys during sudden framerate changes. You don't always notice it.
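A quick bit of arithmetic shows why the lower range flickers more (just the refresh-interval math behind the comparison above):

Code: Select all

# Refresh-interval swing for the same 2:1 framerate fluctuation at two ranges.
for low_fps, high_fps in ((100, 200), (50, 100)):
    swing_ms = 1000.0 / low_fps - 1000.0 / high_fps
    print(f"{low_fps}<->{high_fps} fps: interval swings by {swing_ms:.1f} ms")
# 100<->200 fps: 5.0 ms of swing
# 50<->100 fps: 10.0 ms of swing -- twice the swing, so more visible gamma flicker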

IMHO, not a big deal. I've seen worse LCD VRR flickers before (uncertified VRR that is non-native GSYNC). Especially uncertified VRR LCDs that aren't well overdrive-tuned.

LCD vs OLED is a game of picking poisons, and OLED has way fewer image-quality poisons. If your calculus leans more towards the best motion quality in the 80fps-240fps regime, it's really hard to beat OLED + VRR now, even with the gamma-undulation quirk (more visible during sudden sharp dips from 100fps+ down to 30fps, and it only creates a 1-3% brightness difference in the dark-greys portion of the color spectrum). Motion even looks better than on 360Hz LCDs.

Many of the 240Hz OLEDs are even debuting at reasonably low lag (lower than the world's first 240Hz LCDs in 2016-2017). But if your priority is a couple of milliseconds lower lag, you may still lean towards an esports LCD.

And if you need to use a dashboard display like an airport arrivals board, or continuously-displayed weather or calendar, then LCD definitely is better.

However, motion-quality purists who don't want strobing should definitely consider OLED. Even with VRR.
lukeman3000 wrote:
15 Mar 2023, 18:51
1. I have a 4090 and as such, can take advantage of hardware G-Sync. But does this even matter these days? The DWF variant of this monitor does not have hardware G-Sync (though it supports G-Sync) - am I missing out on anything here?
For OLED VRR, the difference between G-SYNC native and non-native is actually smaller than it is for LCD VRR.

I always turn on non-native VRR on my OLED when playing games like Cyberpunk 2077. VRR, even non-native, is definitely worth it. I don't see the gamma flickers often (FALD blooming is WAY more noticeable than OLED VRR gamma flickers to my specific eyes for my specific games), as most of my framerates don't fluctuate sharply enough to amplify the gamma flickers. I do usually disable VRR for the Windows desktop, because, yes, the VRR gamma flickers are more visible in my grey windows -- it's still subtle and I can tolerate it.
lukeman3000 wrote:
15 Mar 2023, 18:51
2. Burn-in is my other big concern. If I let the monitor run its anti-burn-in features when it wants to, do I really have anything to worry about? Is this also affected by brightness levels and such (such that it's unlikely I'd be running at a brightness where burn-in would be a concern, given my desire for color accuracy and such)? I don't really know how all of these things play together or how big of a risk this is. I don't really like the feeling that I can't use my monitor for "regular stuff" out of fear that certain static images could be burning themselves into my screen over time…
The monitor you're considering is QD-OLED, which was recently tested to be more prone to burn-in than WOLED panels. I currently use my Corsair Xeneon Flex (prototype) for everyday Windows work, including Visual Studio. That being said, I have less experience with QD-OLED, but you can bring your brightness down quite a bit (e.g. 100-150 nits) whenever you are using the Windows desktop. On some OLEDs, halving the brightness can add 10x+ more lifetime before faint burn-in appears. So definitely don't neglect dialing back your OLED when using Windows.

You can also use switchable color profiles, one for desktop, and one for gaming. In some cases, you may be able to automate the profile-switching -- though that can be tricky.
lukeman3000 wrote:
15 Mar 2023, 18:51
3. What about a Mini LED 4K monitor as an alternative? As I understand it, these don't have the same risk of burn-in, so maybe it makes more sense to go for a higher resolution and trade the OLED for no burn-in risk? According to Hardware Unboxed, the MSI Mini LED competes fairly well with the Alienware in terms of HDR accuracy…
It's a personal decision based on the pros/cons of the various technologies. Both OLED and MiniLED represent massive upgrades over the very staid status quo of displays.

Plain non-FALD LCDs have changed very little in 15-20 years, with almost the same color quality over the whole era. Even my 2006-era Samsung 245BW looks remarkably similar in color quality to a typical 2022-era TN esports LCD, 16 years later, with the sole exception of 72% NTSC CCFL versus 99% NTSC LED backlight.

So going MiniLED FALD or OLED is just such a dramatic display-quality upgrade.
lukeman3000 wrote:
15 Mar 2023, 18:51
Just looking for some insight and opinions here. Basically, I’m really interested in the Alienware but afraid of burn-in. And also I don’t know how much more I might appreciate 4K instead, having not yet experienced it.
If you're worried about burn-in, why not consider the upcoming 240Hz WOLED ultrawides? I love my Corsair Xeneon Flex.

And 240Hz is definitely worth it even if you're a casual gamer. Smooth-scrolling content such as browser scrolling is exactly 4x clearer than at 60Hz, so I see the 240Hz-ness even in everyday use. OLED is an extremely efficient sample-and-hold technology that scales linearly with Hz (2x Hz = 1/2 scrolling blur), whether it be gaming, scrolling, panning, etc.
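That 4x figure falls straight out of the sample-and-hold math (assuming near-instant GtG, which OLED has, and that the eyes track the scrolling content; the 960 px/sec scroll speed is just an example number):

Code: Select all

# Sample-and-hold motion blur: tracked-eye blur = pixels moved per refresh.
SCROLL_SPEED_PX_PER_SEC = 960    # example scroll/pan speed

for hz in (60, 120, 240):
    blur_px = SCROLL_SPEED_PX_PER_SEC / hz
    print(f"{hz} Hz: ~{blur_px:.0f} px of motion blur while tracking")
# 60 Hz: ~16 px, 120 Hz: ~8 px, 240 Hz: ~4 px -- exactly 4x clearer at 240 Hz vs 60 Hz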
