Display vs GPU scaling (almost 2025)

Everything about displays and monitors. 120Hz, 144Hz, 240Hz, 4K, 1440p, input lag, display shopping, monitor purchase decisions, compare, versus, debate, and more. Questions? Just ask!
LoveTilapia
Posts: 6
Joined: 17 Dec 2024, 00:19

Display vs GPU scaling (almost 2025)

Post by LoveTilapia » 20 Dec 2024, 12:32

Looking at a previous thread, our Chief's stance is that, at best, they're the same; at worst, the monitor screws it up and adds lag.

I've been playing with this lately with VRR, and I think this discussion could be extended if we introduce scaling quality and taste into the equation.

On my S90D, I used a sharpness test pattern like this one: Sharpness

Running 2560x1440 into 3840x2160, the smallest checkered squares and lines look even and neutral colored if I am using the S90D's scaling. But if I use my 7900 XT's GPU scaling, the smallest checker boxes pick up a slight color tinge and show a wavy diagonal zebra pattern when I step back a little. The vertical/horizontal line boxes have uneven thickness. I tried this on my 2080, and it's similar but even blurrier.
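Side note on why the line boxes come out uneven: 1440p into 2160p is a non-integer 1.5× ratio, so a 1-pixel source line has to cover 1.5 output pixels and ends up alternating between 1 and 2 pixels wide. A quick Python sketch (my own hypothetical helper, not anything from a driver) makes this visible by counting how many output pixels each source column covers under nearest-neighbour mapping:

```python
# Sketch: why 1-pixel lines look uneven after 1440p -> 2160p (1.5x) scaling.
# Under nearest-neighbour mapping, each source column covers alternately
# 1 or 2 output columns; bilinear instead blends neighbours, which trades
# the uneven widths for intermediate grey values (the "blur"/tinge).

def nearest_widths(src_len, dst_len):
    """Count how many output pixels each source pixel covers (nearest-neighbour)."""
    counts = [0] * src_len
    for x_out in range(dst_len):
        x_src = int(x_out * src_len / dst_len)  # nearest-neighbour source index
        counts[x_src] += 1
    return counts

# 1440 source rows scaled to 2160 output rows (the 1440p -> 4K case)
widths = nearest_widths(1440, 2160)
print(widths[:8])   # -> [2, 1, 2, 1, 2, 1, 2, 1]: alternating line thickness
print(set(widths))  # -> {1, 2}: no column keeps a uniform width at 1.5x
```

At an integer ratio (e.g. 1080p → 2160p, exactly 2×) the same function returns a uniform width for every column, which is why integer scaling avoids this artifact entirely.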

So, if we make some assumptions, that the bigger (reputable) brand TV/monitor engineers are not incompetent, and that they get their scaling done without broken input latency, then shouldn't the debate move to "how it looks" vs. a fraction-of-a-millisecond difference?

Would you sacrifice 1ms, for clearer scaling quality?

Perhaps a contrarian would also make the argument, when the image starts moving, it's zebra patterns all the way down, so LoveTilapia, you're full of sh**, give me my 1ms back. :lol:

juliaroberts
Posts: 2
Joined: 28 Sep 2025, 01:35

Re: Display vs GPU scaling (almost 2025)

Post by juliaroberts » 10 Oct 2025, 03:02

That’s actually a great take—and honestly one that doesn’t get discussed enough. Once you start factoring in scaling quality, sharpness handling, and the character of the upscaling itself, the “just use GPU scaling because it’s faster” argument starts looking a bit shallow.

naporitan
Posts: 300
Joined: 09 Jun 2021, 06:16

Re: Display vs GPU scaling (almost 2025)

Post by naporitan » 11 Oct 2025, 05:26

GPU scaling is better, but if you use your display at its native resolution, no scaling occurs and you get good image quality.

kyube
Posts: 573
Joined: 29 Jan 2018, 12:03

Re: Display vs GPU scaling (almost 2025)

Post by kyube » 11 Oct 2025, 10:50

juliaroberts wrote:
10 Oct 2025, 03:02
That’s actually a great take—and honestly one that doesn’t get discussed enough. Once you start factoring in scaling quality, sharpness handling, and the character of the upscaling itself, the “just use GPU scaling because it’s faster” argument starts looking a bit shallow.
See the explanation below. None of those matter when discussing the GPU vs display scaling topic.
naporitan wrote:
11 Oct 2025, 05:26
GPU scaling is better, but if you use your display at its native resolution, no scaling occurs and you get good image quality.
The last data set we have is from 2015, on Kepler-era hardware, and it shows a difference of ≤120µs. Even flood wasn't sure about his own data, and that was measured with microsecond-precision hardware.

Non-native resolutions can be used without any filters applied by using the "No Scaling" (NVIDIA) or "Center" (AMD) settings.
This way, you retain both the static image quality (PPD remains the same; no bilinear interpolation or nearest-neighbour filtering lowering effective PPI) and the dynamic image quality (no upscaling artifacts in motion due to a lower render resolution) compared to other solutions (DLSS, FSR, NIS, LSFG).
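For anyone unfamiliar with what Center/No Scaling actually does: the render is placed unscaled in the middle of the native panel with black borders around it, so every source pixel maps 1:1 to a display pixel and no filter ever runs. A trivial sketch (hypothetical helper, just the arithmetic) for the 1440p-on-4K case:

```python
# Sketch: "Center" (AMD) / "No Scaling" (NVIDIA) places the render 1:1 on the
# native panel, so no interpolation happens -- just black borders around it.

def center_offsets(src_w, src_h, dst_w, dst_h):
    """Top-left offset of an unscaled src image centred on a dst panel."""
    if src_w > dst_w or src_h > dst_h:
        raise ValueError("source larger than panel; centering is impossible")
    return (dst_w - src_w) // 2, (dst_h - src_h) // 2

print(center_offsets(2560, 1440, 3840, 2160))  # -> (640, 360)
```

So a 2560x1440 render sits 640 px in from the sides and 360 px in from the top/bottom of a 3840x2160 panel, smaller but pixel-perfect.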

The entire GPU vs display scaling topic is a black box.
With the sheer number of different GPU drivers, GPU architectures, scaler ICs, and display firmwares, and no testing data, there cannot be an accurate assessment of this topic.
