VRR in conjunction with V-sync

jorimt
Posts: 2634
Joined: 04 Nov 2016, 10:44
Location: USA

Re: VRR in conjunction with V-sync

Post by jorimt » 23 Aug 2024, 09:04

Chief Blur Buster wrote:
22 Aug 2024, 21:51
jorimt wrote:
19 Aug 2024, 10:37
which means VRR can still tear within its range in instances where frametime performance is unstable enough.
Disambiguation required: this can make the same statement simultaneously true and false, depending on who is reading it.

There are infinite ways to compute frame rates (averaging, median, instantaneous, and how big a trailing window to average/median over). Different people make different assumptions, so I add this detail for those readers who might pounce on this technicality.

At 300 frames per second, there are potentially 300 different instantaneous frame rates involved (1/frametime = instantaneous framerate for a specific frame).

It is true that, with an average frame rate (over many frametimes) below the VRR range ceiling, you can still have tearing, since instantaneous frame rates (aka 1/frametime for a specific frame) can exceed the VRR range. For example, you may have a 142fps average that is actually a variable 137-147fps. In that case, you still have a few instantaneous 144-147fps frames that reach or exceed the VRR range of 144Hz.

Also, frametimes can be further jittered by GPU overheads and other factors beyond the application software, so a frametime a game thinks is within VRR range may momentarily exit VRR range (frametimes that are faster than 1/maxHz) once driver and GPU overheads are included. That error margin is generally under a millisecond or so, but can sometimes spike to multiple milliseconds during power-management situations.
Correct, as already echoed in entry #2 of my Closing FAQ:
https://blurbusters.com/gsync/gsync101- ... ttings/15/

That was just a conceptual one-line shorthand for the layman. I've answered this VRR + V-SYNC on/off question over the years more times than I can count, so I have a million and one ways of putting it. I'm tired :lol:

A somewhat less abbreviated, but still conceptual, layman's way of putting it would be: "VRR on + V-SYNC option off can still tear with an average framerate within the refresh rate whenever the frametime of a frame is significantly higher or slightly lower than the scanout time of the current physical refresh rate."
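To make the average-vs-instantaneous distinction concrete, here's a minimal sketch (Python; the frametime numbers are made up purely for illustration) that flags which individual frames momentarily exceed a hypothetical 144Hz VRR ceiling even though the average framerate sits inside the range:

```python
# Minimal sketch: average framerate can sit inside the VRR range while
# individual frames momentarily exceed it (1/frametime > max Hz).
# Frametime values below are made-up for illustration.

MAX_HZ = 144.0                                   # hypothetical VRR ceiling
frametimes_ms = [7.3, 6.8, 7.1, 6.7, 7.4, 6.9]   # ~135-149 fps instantaneous

avg_fps = 1000.0 / (sum(frametimes_ms) / len(frametimes_ms))
print(f"average framerate: {avg_fps:.1f} fps")   # ~142 fps, within the 144 Hz range

for i, ft in enumerate(frametimes_ms):
    inst_fps = 1000.0 / ft
    # A frame delivered faster than the fastest possible refresh can tear
    # when the V-SYNC option is off (no fallback sync to hold it back).
    status = "may tear (exceeds VRR range)" if inst_fps > MAX_HZ else "within VRR range"
    print(f"frame {i}: {ft:.1f} ms -> {inst_fps:.1f} fps, {status}")
```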
(jorimt: /jor-uhm-tee/)
Author: Blur Busters "G-SYNC 101" Series

Displays: ASUS PG27AQN, LG 48C4 Scaler: RetroTINK 4k Consoles: Dreamcast, PS2, PS3, PS5, Switch 2, Wii, Xbox, Analogue Pocket + Dock VR: Beyond, Quest 3, Reverb G2, Index OS: Windows 11 Pro Case: Fractal Design Torrent PSU: Seasonic PRIME TX-1000 MB: ASUS Z790 Hero CPU: Intel i9-13900k w/Noctua NH-U12A GPU: GIGABYTE RTX 4090 GAMING OC RAM: 32GB G.SKILL Trident Z5 DDR5 6400MHz CL32 SSDs: 2TB WD_BLACK SN850 (OS), 4TB WD_BLACK SN850X (Games) Keyboards: Wooting 60HE, Logitech G915 TKL Mice: Razer Viper Mini SE, Razer Viper 8kHz Sound: Creative Sound Blaster Katana V2 (speakers/amp/DAC), AFUL Performer 8 (IEMs)

Chief Blur Buster
Site Admin
Posts: 12077
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada

Re: VRR in conjunction with V-sync

Post by Chief Blur Buster » 25 Aug 2024, 11:44

Stizzie wrote:
23 Aug 2024, 02:52
Such neat magical trickery.
It's such a minor modification to a 100-year-old raster topology: simply vary the size of VBLANK by adding extra lines to the end of the blanking interval... until the software is ready. Instead of the computer having to sync to the display, the display is now syncing to the software, through this roundabout way.

Now, that being said, as refresh rates get higher (1000fps at 1000Hz), the fine granularity of fixed refresh cycles tends to behave more and more like VRR, since you can begin a new refresh cycle at the nearest ultra-fine-granularity fixed refresh cycle.

So ironically, as we get to extremely high refresh rates, VRR becomes less useful. Stutters are much more visible at 144Hz than at, say, 1000Hz. 1/144sec = 6.9ms, and a 6.9ms error from a missed VSYNC at 1000 pixels/sec is a ~7 pixel jump. Very easy to see. It's still possible to see VSYNC stutters at 500Hz, but it tends to require faster motion speeds and higher-detail graphics (e.g. 4000 pixels/sec at a 1/500sec error = 8 pixel jump).
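As a quick sanity check on that arithmetic, here's a tiny sketch (the motion speeds are hypothetical examples):

```python
# One missed-VSYNC stutter jump = motion speed x duration of one refresh cycle.
# Motion speeds below are hypothetical examples.

def stutter_jump_px(motion_px_per_sec: float, refresh_hz: float) -> float:
    return motion_px_per_sec / refresh_hz

print(stutter_jump_px(1000, 144))   # ~6.9 px -> easy to see
print(stutter_jump_px(4000, 500))   # 8.0 px  -> needs fast motion and detailed graphics
print(stutter_jump_px(1000, 1000))  # 1.0 px  -> essentially invisible
```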

I wrote an article about brute framerate-based motion blur reduction someday replacing flicker-based motion blur reduction. 1000fps at 1000Hz combines the benefits of VRR, strobing, flickerfree, and blurfree, all at the same time, while keeping HDR brightness, so it's kind of the Blur Busting holy grail nowadays -- see www.blurbusters.com/framegen#dev for the Developer Best Practices. Now, there are reprojection algorithms that avoid double-image artifacts (on sample-and-hold displays) and convert a varying frame rate to a constant framerate=Hz, so that is an alternative method of skipping the need for VRR. A bit of a brute hammer, but it does have other benefits (motion blur reduction). VR headsets flicker out of necessity because we don't have enough framerate to kill blur strobelessly. Quest 2 flickers at 0.3ms MPRT, which would require 3333fps at 3333Hz to do strobelessly.

And with OLED recently unlocking the refresh rate race (120Hz vs 480Hz on OLED is much more visible than 60Hz vs 120Hz on LCD, even to the mass-market mainstream and office productivity), some developments may also occur to produce technologies that eventually compete with VRR, while providing other benefits to humankind.
Stizzie wrote:
23 Aug 2024, 02:52
I've always assumed that the Vsync signal in the middle is the VBLANK that APIs try to wait for in "wait for VBLANK", and the 2 porches act as a sort of timing padding (similar to a delay for hotkeys: if a hotkey is sent to a window in real time, often it will not get registered). To reiterate Chief: the whole VBI (Front Porch + VSYNC + Back Porch) is read from the perspective of the API as VBLANK, meaning at any point in time, as long as the current pixel signal falls within the VBI, it is considered VBLANK and the API can flip if it's V-synced.
In theory, the VRR implementation in the GPU could loop on the middle VSYNC scanlines, and the GPU could simply begin transmitting the fixed (low) count of Vertical Back Porch scanlines (as specified in the EDID) right before the visible refresh cycle. That would only delay the refresh cycle by microseconds at most. VRR could be achieved this way with exactly the same visual benefits, without delaying the active refresh cycle much (just a few scanlines of Back Porch).

However, the industry decided over ten years ago that we'd just keep duplicating the last scanline of VBLANK (which is always the Vertical Back Porch) in an infinite loop that only exits when either (A) the software presents a new frame, or (B) the VRR range is violated. That produces the maximum flexibility and maximum backwards compatibility with displays not normally designed for VRR.
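As a conceptual sketch of that loop (not any vendor's actual firmware or driver code; the callables and timing are stand-ins invented for illustration):

```python
# Conceptual sketch of the "loop on the last VBLANK scanline" behaviour described
# above. Not any vendor's actual implementation; the callables are stand-ins.
import time

def vrr_vblank_loop(new_frame_ready, transmit_porch_scanline, scan_out_refresh,
                    min_hz: float, scanline_time_s: float):
    while True:
        waited_s = 0.0
        # Keep duplicating the final blanking scanline until either (A) the
        # software presents a new frame, or (B) the maximum frametime (1/min Hz)
        # is reached, i.e. the bottom of the VRR range would be violated.
        while not new_frame_ready() and waited_s < 1.0 / min_hz:
            transmit_porch_scanline()        # repeat the last VBLANK scanline
            time.sleep(scanline_time_s)      # stand-in for real signal timing
            waited_s += scanline_time_s
        scan_out_refresh()                   # new frame, or a repeat refresh (LFC)
```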

Even before VRR, there were laptop panels that could switch refresh rates seamlessly for power management; my old IBM ThinkPad could switch to 50Hz automatically in year 2005 during battery-saver mode, with no mode changes. Those panels were even able to function with VRR, and there are ways for an EDID override to force VRR over DVI and VRR over VGA! (In fact, there were forum members on Guru3D who successfully tested a couple of multisync CRTs without digital mode-change stabilizers or watchdog firmware, and they functioned fine with VRR.) Basically 800x600 with a VRR range of 56-75Hz over VGA, forced into some old Diamondtron. As long as framerates didn't suddenly vary too much, it worked fine without mode-change blackouts. This only worked with certain very old "dumbest" HDMI-to-VGA adaptors that did a blind 1:1 conversion and ignored all the weird behaviors on the HDMI side. It is incompatible with most newer HDMI-to-VGA adaptors that fritz out on signal weirdnesses. (It is convenient that 1080p is identical in analog and digital, timing-wise, for syncs and porches, so VRR is a bolt-on in both the analog and digital domains, which is what makes all of this possible in the first place.)

VRR existed on vector CRTs; if you had a Vectrex system you could see the erratic flicker, as more vectors meant the global refresh rate was lower. And the 1983 Star Wars arcade machine flickered erratically all the time (especially during the Death Star explosion, which used a massive number of vectors). But VRR actually works on a few raster CRTs too, as long as it's a multisync CRT that doesn't have very strict mode-change blackouts.

Changes to the Vertical Porch on such a tube tended to be fairly gentle even if they sometimes slightly shifted the position of the CRT image (slightly upwards and slightly squatter, as framerates changed), but the variable flicker of the raster CRT worked fine with the 800x600 56-75Hz range.
- Requires a Radeon card
- Requires a ToastyX CRU EDID override to force blind output to an EDID-less display
- Works only on certain tubes (multisync CRTs with less mode-change blackout circuitry or firmware range policing)
- Works only with certain adaptors (the dumbest and oldest HDMI 1.0 adaptors)

It is much harder to reproduce now, as you have to pull out some legacy equipment (extremely old HDMI 1.0-spec adaptors that converted HDMI to VGA much more blindly, fully preserving blind VRR behavior without glitching). Another technique to verify that your adaptor chain works with VRR is simply to use an HDMI-to-DVI adaptor (single link) together with a separate DVI-to-VGA adaptor. DVI-to-VGA adaptors are guaranteed to be "dumber" than HDMI-to-VGA adaptors, and therefore a two-adaptor chain may sometimes work more reliably for VGA CRT VRR experiments.

This underscores the fact that modern raster VRR is such a clever, minor modification to a 100-year-old raster topology that it is retroactively backportable to older tubes.

DVI sometimes worked fine with blind-forced generic VRR too, as some old DVI monitors (2006-2010ish) could do a VRR range of roughly 50-72Hz when a similar trick was done. The DVI and HDMI 1.0 protocols are an identical bitstream (it's why audio mysteriously worked over DVI on some early HDMI 1.0 monitors, as audio was a simple "audio-in-blankings" extension). HDMI 1.0 is merely extensions to DVI plus a brand new connector, and displays recycled the same chips for both DVI and HDMI. Funny how sound worked fine over DVI on certain computer monitors.

So there is a heavy specification blur between VGA and DVI (1080p = 1080p, temporally identical), a heavy specification blur between DVI and HDMI (same bitstream), and the VRR specification can be fully backported all the way back to analog VGA (it's all the same variable Vertical Back Porch trick). Voila.

VESA Adaptive-Sync is VRR at the panel level, and is used by all the major technologies (G-SYNC, FreeSync, etc.), as G-SYNC and FreeSync piggyback various extensions, requirements, and certification logos on top of it. FreeSync is just AMD-certified generic VRR, and G-SYNC Compatible is just NVIDIA-certified generic VRR. G-SYNC native (module) is proprietary from GPU to scaler/tcon, but the panel may still be VESA Adaptive-Sync internally, etc. It kind of blends together, but it all has the same commonality of variable VBLANK via end-of-VBI extension (looping on the Back Porch).

________

Anyway...

More Information about Digital Screens And Special Considerations

There are proprietary differences in how the VRR range is handled. At the panel level, all of them are nigh identical (since they have to use the same factory LCD panels). But at the display scaler/tcon level, the NVIDIA G-SYNC module does things a bit differently from FreeSync/Adaptive-Sync/HDMI VRR for frame rates below the refresh rate.

When too much time elapses before a new refresh cycle, there are problems -- like an image fading to black or white gradually. (This explains some of the VRR flicker behaviors too, by the way; the gamma at minimum framerate is slightly different from the gamma at maximum framerate, simply because the pixels have decayed a bit more at minimum framerate.)

Modern screens are, in some ways, essentially very similar to giant glass DRAM chips. Or 8LC/10LC SSD chips (8 bits per visible subpixel, unlike 3-bit-per-cell TLC SSDs). Yes, screens are essentially giant lithographed glass computer chips, complete with the same kind of semiconductor transistors (albeit much larger), with circuitry similar to a DRAM chip, except at ginormous scales, controlling light valves (liquid crystals) or light emitters (OLED/LED), just for our human eyes to see a picture on displays.

So, just like DRAM needs periodic refreshing, DRAM memory chips can become corrupt when not refreshed a few times a second. And flash memory can go bad if it's not refreshed roughly annually. For digital displays, there are visible artifacts if we wait more than a fraction of a second between refreshes (fade to black or fade to white). Even a 1% fade = flicker. So we want to avoid that.

Which is why VRR displays have a minimum Hz.

If a frametime gets too long for the minimum Hz (1/frametime < min Hz), the screen has to refresh again.

That's Low Framerate Compensation (LFC).
- Generic VRR requires the graphics drivers to automatically repeat the refresh (software-driven repeat refreshing).
- Proprietary VRR (G-SYNC module) uses the framebuffer stored in the monitor's memory to repeat-refresh (display-side repeat refreshing).
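Here's a minimal sketch of the LFC idea (purely illustrative; not any specific driver's or module's algorithm): when the incoming frametime would drop below the VRR floor, the previous frame is refreshed again enough times to keep the panel inside its range.

```python
# Minimal sketch of Low Framerate Compensation (LFC): when the incoming
# frametime would fall below the VRR floor (1/frametime < min Hz), repeat
# the previous refresh enough times to keep the panel inside its range.
# Numbers and structure are illustrative only.

def lfc_refreshes(frametime_s: float, min_hz: float) -> int:
    """How many refresh cycles (1 new + N repeats) cover one long frametime."""
    max_frametime = 1.0 / min_hz
    refreshes = 1
    while frametime_s / refreshes > max_frametime:
        refreshes += 1               # insert another repeat-refresh
    return refreshes

print(lfc_refreshes(1 / 25, min_hz=48))   # 25 fps on a 48 Hz floor -> 2 (panel refreshes at 50 Hz)
print(lfc_refreshes(1 / 25, min_hz=80))   # 25 fps on an 80 Hz floor -> 4 (panel refreshes at 100 Hz)
```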

Now, a lower MinHz is sometimes good, but sometimes I prefer a higher MinHz and simply rely on LFC. Sometimes LFC performs better than an ultra-low native MinHz. For generic FreeSync, native 30Hz can flicker too much, so 30fps via LFC (60Hz with repeat refreshing) often looks better.

That's why you see some people do range edits to set MinHz to different numbers, like 55 or 65 or even 80 (e.g. 360Hz OLEDs often look better with an 80-360Hz range, since LFC-driven 79fps-and-below can bypass some of the OLED flicker reported at low framerates).

Now, LFC is not without disadvantages. Just like Ethernet collisions, a new frame from software can "collide" with a monitor that is busy repeat-refreshing. That adds stutter if the software is suddenly ready with a new frame while the monitor is still repeat-refreshing: the new frame must wait for the display to finish the repeat refresh. Oops. Now the user has to endure a stutter caused by the LFC algorithm.

Displays with "1-240Hz" LFC is simply extremely smart display-side LFC that uses algorithms to predictively avoid those collisions. G-SYNC native is really good at LFC nowadays, while generic LFC can vary widely in quality (if it's just a very rudimentary LFC algorithm that doesn't try to predictively avoid collisions between new attempts to deliver refresh cycles while repeat refreshing).

A repeat-refresh cycle forces the monitor to be busy for (1/MaxHz) seconds, during which it cannot start a new refresh cycle. MaxHz generally dictates how long a monitor needs to deliver one refresh cycle (no matter the framerate or frametime); that's your shortest interval between refresh cycles, regardless of new frames or repeat-refresh frames. "Frames" and "refresh cycles" are essentially the same thing in VRR, so you may notice I sometimes use the terms interchangeably.

Either way, if you have variable framerates near the LFC region, you can get unwanted erratic stutters that show up if (A) the LFC algorithm isn't good, (B) the framerates are erratic enough to defeat the LFC algorithm, or (C) the MaxHz is relatively low. The average stutter error margin of LFC will be approximately half of one MaxHz refreshtime, which is why LFC becomes a nonissue at high MaxHz. On modern 480Hz VRR displays, repeat-refresh events take only 1/480sec and LFC stutter is only 0.5/480sec on average (the average penalty of any random collision between native and repeat refreshes). Those ~1ms stutters aren't going to be humanly visible at 33ms frametimes (30fps), so LFC looks like native refreshing with ginormous VRR ranges. That's why I recommend 80-480Hz VRR ranges for 480Hz OLEDs and suggest forgetting about 30-48Hz minimums, since 80-480Hz looks exactly like a 1-480Hz VRR range, but without the OLED VRR flicker.
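The "half a MaxHz refreshtime" figure above can be sanity-checked in a few lines (illustrative arithmetic only):

```python
# Rough sanity check of the LFC collision penalty described above.
# Worst case: a new frame arrives just as a repeat-refresh starts, so it can
# wait up to 1/MaxHz; a random collision waits about half of that on average.

def lfc_collision_penalty_ms(max_hz):
    worst_ms = 1000.0 / max_hz
    average_ms = worst_ms / 2
    return worst_ms, average_ms

print(lfc_collision_penalty_ms(144))   # (~6.9 ms worst, ~3.5 ms average)
print(lfc_collision_penalty_ms(480))   # (~2.1 ms worst, ~1.0 ms average) -> invisible at 33 ms frametimes
```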

Generalities:
1. Get a high MaxHz whenever you want to buy a brand new VRR display.
2. Make sure your Max:Min ratio is at least 3:1 for very good LFC that looks native (see the sketch below this list).
3. Purchasing a VRR range bigger than your framerate range lets you avoid worrying about the capping disadvantages.
4. Even 100fps at 480Hz VRR has much lower lag than 100fps at 144Hz VRR/VSYNC ON/VSYNC OFF.
5. Even a very crappy LFC algorithm performs well if you have a 4:1 Max:Min Hz range, so you've bought insurance if you got a 360Hz or 480Hz display.
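Here's the sketch referenced in item 2: a quick rule-of-thumb check of a display's VRR range against the generalities above (the thresholds mirror this list, not any official specification):

```python
# Rule-of-thumb check of a VRR display's range, mirroring the generalities
# above. Purely illustrative; not an official specification.

def vrr_range_verdict(min_hz, max_hz):
    ratio = max_hz / min_hz
    if ratio >= 4:
        return f"{min_hz:g}-{max_hz:g} Hz ({ratio:.1f}:1): even a crude LFC algorithm will look native"
    if ratio >= 3:
        return f"{min_hz:g}-{max_hz:g} Hz ({ratio:.1f}:1): very good, native-looking LFC expected"
    return f"{min_hz:g}-{max_hz:g} Hz ({ratio:.1f}:1): narrow range; LFC collisions more likely to show"

print(vrr_range_verdict(48, 144))   # 3:1 range
print(vrr_range_verdict(80, 480))   # 6:1 range
```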

So, my advice to new VRR purchasers who have no budget concerns and want maximum VRR benefits is to... just get way more MaxHz than you think you need. The new 240-480Hz OLEDs are actually great for office productivity too, and even some mainstream writers now agree high Hz is fantastic ergonomically for non-gaming uses. LCDs are great as well, although refresh rate differences are massively more visible on OLED than on LCD, which is why the refresh rate bang-for-the-buck is usually more apparent on OLED.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on: BlueSky | Twitter | Facebook

Forum Rules wrote:  1. Rule #1: Be Nice. This is published forum rule #1. Even To Newbies & People You Disagree With!
  2. Please report rule violations If you see a post that violates forum rules, then report the post.
  3. ALWAYS respect indie testers here. See how indies are bootstrapping Blur Busters research!

jorimt
Posts: 2634
Joined: 04 Nov 2016, 10:44
Location: USA

Re: VRR in conjunction with V-sync

Post by jorimt » 25 Aug 2024, 20:01

Chief Blur Buster wrote:
25 Aug 2024, 11:44
Now, LFC is not without disadvantages. Just like Ethernet collisions, a new frame from software can "collide" with a monitor that is busy repeat-refreshing. That adds stutter if the software is suddenly ready with a new frame while the monitor is still repeat-refreshing: the new frame must wait for the display to finish the repeat refresh. Oops. Now the user has to endure a stutter caused by the LFC algorithm.

[...]

A repeat-refresh cycle forces the monitor to be busy for (1/MaxHz) seconds, during which it cannot start a new refresh cycle. MaxHz generally dictates how long a monitor needs to deliver one refresh cycle (no matter the framerate or frametime); that's your shortest interval between refresh cycles, regardless of new frames or repeat-refresh frames. "Frames" and "refresh cycles" are essentially the same thing in VRR, so you may notice I sometimes use the terms interchangeably.
Yup, framerate, frametime, and the number of scanout cycles per second can be "variable," but the scanout time remains fixed to the currently set physical refresh rate regardless, bottle-necking us into some of these troublesome gotchas.

As such, VRR isn't actually 100% achievable in the truest sense, just as close as it can be within the constraints of current display scanout technology, hence the ultimate need for brute-force "infinite" frame/refresh rates.

VRR is a clever hack (and indeed a sophisticated one in the case of G-SYNC native module implementation), but still a hack.

Luviaz
Posts: 33
Joined: 07 Apr 2024, 16:34

Re: VRR in conjunction with V-sync

Post by Luviaz » 25 Aug 2024, 23:20

Chief Blur Buster wrote:
25 Aug 2024, 11:44

Now, LFC is not without disadvantages. Just like Ethernet collisions, a new frame from software can "collide" with a monitor that is busy repeat-refreshing. That adds stutter if the software is suddenly ready with a new frame while the monitor is still repeat-refreshing: the new frame must wait for the display to finish the repeat refresh. Oops. Now the user has to endure a stutter caused by the LFC algorithm.
I believe this "collision" is exactly the same as a potential tear within VRR range with V-sync OFF, right? And the added stutter with V-sync ON would translate to a tear with V-sync OFF.

In realistic use cases, though, it has never once happened to me. It's technically possible, but with an RTSS cap and a high enough max Hz, it's very unlikely to happen.
At this point it's fair to say most perceptible visual stutters are caused by an application's performance, rarely by display syncing.
Chief Blur Buster wrote:
25 Aug 2024, 11:44

A repeat-refresh cycle forces the monitor to be busy for (1/MaxHz) seconds, during which it cannot start a new refresh cycle. MaxHz generally dictates how long a monitor needs to deliver one refresh cycle (no matter the framerate or frametime); that's your shortest interval between refresh cycles, regardless of new frames or repeat-refresh frames. "Frames" and "refresh cycles" are essentially the same thing in VRR, so you may notice I sometimes use the terms interchangeably.
Chief Blur Buster wrote:
25 Aug 2024, 11:44
Generalities:
1. Get a high MaxHz whenever you want to buy a brand new VRR display.
2. Make sure your Max:Min ratio is at least 3:1 for very good LFC that looks native.
3. Purchasing a VRR range bigger than your framerate range lets you avoid worrying about the capping disadvantages.
4. Even 100fps at 480Hz VRR has much lower lag than 100fps at 144Hz VRR/VSYNC ON/VSYNC OFF.
5. Even a very crappy LFC algorithm performs well if you have a 4:1 Max:Min Hz range, so you've bought insurance if you got a 360Hz or 480Hz display.

So, my advice to new VRR purchasers who have no budget concerns and want maximum VRR benefits is to... just get way more MaxHz than you think you need. The new 240-480Hz OLEDs are actually great for office productivity too, and even some mainstream writers now agree high Hz is fantastic ergonomically for non-gaming uses. LCDs are great as well, although refresh rate differences are massively more visible on OLED than on LCD, which is why the refresh rate bang-for-the-buck is usually more apparent on OLED.
"Go for maximum Hz" is the next buying guide! It's encouraging to see that, in my region's market, high Hz monitors are becoming increasingly affordable. Of course, there are trade-offs, such as basic LFC integration, but overall, a sufficiently high max Hz can overpower any potential artifacts.
The concept of brute-forcing frame rate/refresh rate is quite straightforward :lol: The main issues for consumers now are, of course, the cost of such panels and the real-time rendering performance required to maintain such refresh rates.

Chief Blur Buster
Site Admin
Posts: 12077
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada

Re: VRR in conjunction with V-sync

Post by Chief Blur Buster » 27 Aug 2024, 06:42

Luviaz wrote:
25 Aug 2024, 23:20
I believe this "collision" is exactly the same as a potential tear within VRR range with V-sync OFF, right? And the added stutter with V-sync ON would translate to a tear with V-sync OFF.
Yes, refresh cycles (new or repeat) can be interrupted by VSYNC OFF. This is easy in the case of driver-side VRR; it just lets the new frame suddenly replace the frame being scanned out. Tearing appears at frame rates beyond the VRR range, but can also occur at framerates a bit below max Hz, as frametimes will often fluctuate to momentary values faster than the fastest possible refreshtime.

There is an exception: repeat-refresh cycles are never interrupted by new refresh cycles; otherwise, low frame rates would have sudden erratic tearing. So even with VSYNC OFF, the driver still essentially uses zero-frame-queue-depth VSYNC during LFC operations, and your VSYNC setting never affects LFC. The drivers of any generic LFC VRR implementation will merrily repeat-refresh with mandatory VSYNC for that specific refresh cycle at framerates below min Hz.

But on a 540Hz VRR monitor, who cares about an additional ~2ms of VSYNC lag during a 25fps worst-case LFC collision (new frame ready early in a repeat refresh)? Consider that 25 frames per second already has 1/25sec of rendertime lag = 40ms. :D .... And if you're playing competitive games on proper gaming rigs, you're not going to be dealing with 25fps in Fortnite or Valorant or CS2, unless you're playing on really low-end or older mobile hardware, at which point you're no longer caring much about lag anyway (e.g. playing more casually).
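Putting rough numbers on that proportion (the values are illustrative):

```python
# Rough proportion check: worst-case LFC collision wait vs. the render latency
# already inherent at a low framerate. Values are illustrative.

fps = 25
max_hz = 540

render_latency_ms = 1000 / fps        # ~40 ms just from the 25 fps rendertime
collision_wait_ms = 1000 / max_hz     # ~1.9 ms worst-case wait on a 540 Hz panel

print(f"render latency: {render_latency_ms:.0f} ms")
print(f"worst-case LFC collision wait: {collision_wait_ms:.1f} ms "
      f"({100 * collision_wait_ms / render_latency_ms:.0f}% extra)")
```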

Either way, the esports approach is this: if you want VRR in esports, purchase more refresh rate range than your planned uncapped frame rate range; then you're not worrying about what happens below/above the VRR range. That's why 360Hz+ monitors are excellent if you want to use VRR=ON in esports.

That said, this isn't important if you're just looking for motion fluidity ergonomics and just enjoying games without the stutter.
Luviaz wrote:
25 Aug 2024, 23:20
In realistic use cases, though, it has never once happened to me. It's technically possible, but with an RTSS cap and a high enough max Hz, it's very unlikely to happen.
Yes, if well-capped, the fallback sync technology (VSYNC ON or VSYNC OFF) doesn't engage, and those are essentially just vestigial settings for the majority of frametimes & refresh cycles.
Luviaz wrote:
25 Aug 2024, 23:20
At this point it's fair to say most perceptible visual stutters are caused by an application's performance, rarely by display syncing.
The elephant in the room: drivers and bugs. You've got generic VRR implementations with very jittery VRR-range handling in bad drivers, exacerbated by power management and bugs. So sometimes, even with a game that tries to framepace perfectly and RTSS that tries to cap perfectly, the driver still screws up at the end and adds tearing.
Luviaz wrote:
25 Aug 2024, 23:20
The concept of brute-forcing frame rate/refresh rate is quite straightforward :lol: The main issues for consumers now are, of course, the cost of such panels and the real-time rendering performance required to maintain such refresh rates.
Let's not forget this is true spatially. We have 4K displays that we still watch DVDs and 1080p on, and still enjoy them fine. Sometimes our copy of Fritz Lang's Metropolis doesn't really need the 4K treatment, so we can just enjoy it at its own merry low resolution, low frame rate, and in monochrome.

Likewise, the same is true for frame rate. Inclusion of a high native refresh rate is good, even if it's not constantly used. As one example, you can enjoy Hollywood 24fps, video 60fps, and gaming 100-500fps on the same display.

Having the refresh rate flexibility there (at low cost) is great; consider that 480Hz OLEDs are debuting at prices less than what an IBM T-221 monitor cost in year 2001. The price premium of 480Hz will become as cheap as the 120Hz premium someday, just not yet. 4K was a $10,000 luxury in 2001, and now it's a $299 bargain-basement television at your favourite discount shop.

While 120Hz is not quite there, it's now "higher-end mainstream" instead of a "gaming-only niche": Android, Apple, Xbox, PlayStation, televisions, etc. Heck, Dell now has a 120Hz office monitor too, and several Microsoft Surface tablets have now added 120Hz. You generally can't get an OLED television that doesn't have 120Hz included, and some of the cheaper $500 TVs now already include 120Hz.

Chief Blur Buster
Site Admin
Posts: 12077
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada

Re: VRR in conjunction with V-sync

Post by Chief Blur Buster » 27 Aug 2024, 07:00

Luviaz wrote:
25 Aug 2024, 23:20
"Go for maximum Hz" is the next buying guide!
There used to be a lot of caveats with that, since going to higher Hz often required degrading image quality, like older TN panels.

Fortunately, that's no longer true, since high Hz is now even available on certain FALD LCDs and HDR OLEDs, and generally no longer degrades image quality.

That being said, the first 240Hz monitors were a bit more motion-blurry than the best 144Hz monitors, due to poor overdrive and not-yet-optimized LCD GtG. And many had slightly more tape-delay latency than the best 144Hz monitors. There occasionally still are some drawbacks that feel like "two steps forward, one step back", but generally that no longer happens to a large extent.

But yes, "Go for maximum Hz" is now a bullet item that is no longer as incompatible with other items. Except maybe a preference to an exact size such as 24". It's still hard to find 24" OLEDs. Something our readers have been squealing for, due to its sweet-spot nature in esports.
