I've been playing around with this lately with VRR, and I think this discussion could be extended further if we bring scaling quality (and taste) into the equation.
On my S90D, I used a sharpness test pattern like this one: Sharpness
Running 2560x1440 into 3840x2160, the smallest checkered squares and lines look even and neutral in color when I use the S90D's scaling. But with my 7900 XT's GPU scaling, the smallest checker boxes pick up a slight color tinge and show a wavy diagonal zebra pattern if I step back a little, and the vertical/horizontal line boxes have uneven thickness. I tried this on my 2080, and it's similar but even blurrier.
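Part of why 1440p→4K is a worst case for the finest checkerboard: it's a non-integer 1.5x ratio, so a simple scaler has to map some source pixels to one output pixel and others to two, which is exactly where uneven line thickness and moire-like patterns come from. Here's a rough Python/Pillow sketch if anyone wants to see it for themselves. To be clear, this is not what the S90D's scaler or AMD's/NVIDIA's driver actually does, just generic filters on a small patch at the same 1.5x ratio, and the output filenames are made up:

```python
import numpy as np
from PIL import Image

# Same 1.5x ratio as 2560x1440 -> 3840x2160, on a small patch so it runs fast.
SRC_W, SRC_H = 256, 144
DST_W, DST_H = 384, 216

# Finest 1-pixel black/white checkerboard, like the smallest squares in a sharpness pattern.
ys, xs = np.indices((SRC_H, SRC_W))
checker = (((xs + ys) % 2) * 255).astype(np.uint8)
src = Image.fromarray(checker)  # grayscale ("L") image

# Nearest-neighbor: at 1.5x, every other source pixel maps to two output pixels,
# so checker/line thickness alternates 1-2-1-2 -> uneven thickness and zebra-ish moire.
src.resize((DST_W, DST_H), Image.NEAREST).save("checker_nearest.png")

# Bilinear and Lanczos: thickness evens out, at the cost of softness (and some ringing for Lanczos).
src.resize((DST_W, DST_H), Image.BILINEAR).save("checker_bilinear.png")
src.resize((DST_W, DST_H), Image.LANCZOS).save("checker_lanczos.png")
```

Zoom the three outputs to 100% and the trade-off the TV and GPU scalers are making (even-but-soft vs. sharp-but-uneven) is pretty obvious, even in this toy version.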
So, if we assume that the bigger (reputable) brand TV/monitor engineers are not incompetent, and that they get their scaling done without breaking input latency, then shouldn't the debate shift to "how it looks" vs. a fraction-of-1ms difference?
Would you sacrifice 1ms for clearer scaling quality?
Perhaps a contrarian would also make the argument that once the image starts moving, it's zebra patterns all the way down, so "LoveTilapia, you're full of sh**, give me my 1ms back."
