Why 4k? Why not call it 2160p?

RealNC
Site Admin
Posts: 3757
Joined: 24 Dec 2013, 18:32

Re: Why 4k? Why not call it 2160p?

Post by RealNC » 07 Jan 2014, 02:22

What we need are monitors with the same pixel size as CRTs. I'm getting tired of scaling blur... Has anyone calculated what resolution this would require? It would need to be insanely high, I assume, since the limiting factor with CRTs was the thickness of the electron beam?

joncppl
Posts: 10
Joined: 26 Dec 2013, 06:44

Re: Why 4k? Why not call it 2160p?

Post by joncppl » 07 Jan 2014, 03:13

RealNC wrote:What we need are monitors with the same pixel size as CRTs. I'm getting tired of scaling blur... Has anyone calculated what resolution this would require? It would need to be insanely high, I assume, since the limiting factor with CRTs was the thickness of the electron beam?
I don't really follow what you are saying, but if you mean that someone needs to work out how to make old content (say, a game or video that runs at 800x600) look good on a widescreen HD display, I agree with you.

RealNC
Site Admin
Posts: 3757
Joined: 24 Dec 2013, 18:32

Re: Why 4k? Why not call it 2160p?

Post by RealNC » 07 Jan 2014, 06:06

joncppl wrote:I don't really follow what you are saying, but if you mean that someone needs to work out how to make old content (say, a game or video that runs at 800x600) look good on a widescreen HD display, I agree with you.
I mean that running a 1280x720 resolution on a 1920x1080 LCD monitor doesn't map pixels one-to-one, so scaling is needed. CRTs didn't need scaling; every resolution looked sharp and crisp. To get the same effect on an LCD, the pixel size would need to be extremely small. Maybe one day, when the warp drive has also been invented, we'll get such displays :mrgreen:
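
To make the mismatch concrete, here's a quick back-of-the-envelope check (a minimal Python sketch, just illustrating the ratios involved):

Code: Select all

# Why 1280x720 can't map cleanly onto a 1920x1080 panel.
src_w, src_h = 1280, 720
dst_w, dst_h = 1920, 1080

rx, ry = dst_w / src_w, dst_h / src_h
print(rx, ry)            # 1.5 1.5 -- not a whole number
print(rx.is_integer())   # False: each source pixel would have to cover
                         # 1.5 LCD pixels, so the scaler must either
                         # interpolate (blur) or render uneven pixel widths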

spacediver
Posts: 505
Joined: 18 Dec 2013, 23:51

Re: Why 4k? Why not call it 2160p?

Post by spacediver » 07 Jan 2014, 17:55

RealNC wrote:What we need are monitors with the same pixel size as CRTs. I'm getting tired of scaling blur... Has anyone calculated what resolution this would require? It would need to be insanely high, I assume, since the limiting factor with CRTs was the thickness of the electron beam?
With a monochrome monitor that has a coating of phosphor with no mask structure, the resolution is only limited by the frequency with which the beam's intensity can be updated.

The aperture grille found in Trinitron tubes has essentially unlimited vertical resolution, but the horizontal resolution is limited by the phosphor pattern. The aperture grille has a repeating pattern of three vertical phosphor stripes: red, green, and blue. Thus, if you want to create, say, a high-frequency spatial grating pattern (alternating black and white stripes), the finest pattern you can create is limited by the horizontal width of each triplet. This width is known as the horizontal dot pitch.

To my knowledge, the finest dot pitch ever manufactured was on the Sony GDM-F520, which had a horizontal dot pitch of 0.22 mm. Even though the monitor can accept a signal containing up to 2048 horizontal pixels, one can only properly address about 1835 horizontal pixels, given the horizontal viewing size of the monitor. There is one caveat, however: you can scan with a signal that exceeds the dot-pitch resolution and not suffer a great loss of image quality.
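
To put rough numbers on that (a small sketch; the ~404 mm viewable width is an assumption inferred from the figures above, not a verified spec):

Code: Select all

# Addressable horizontal pixels = viewable width / horizontal dot pitch.
# 404 mm is an assumed viewable width for the GDM-F520; 0.22 mm is the
# horizontal dot pitch quoted above.
viewable_width_mm = 404.0
dot_pitch_mm = 0.22
print(viewable_width_mm / dot_pitch_mm)   # ~1836, matching the ~1835 figure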

The FW900 has a variable dot pitch, ranging from 0.23mm in the center to 0.27mm at the edges of the screen, so it's not as straightforward to figure out the maximum addressable resolution. It has a much wider screen than the F520, however, and the National Information Display Laboratory has tested the addressability of the FW900 and found that it passed the 1920x1200 test.

Retina displays go up to 326 pixels per inch. That works out to a dot pitch of 0.078 mm! This is on very small screens, however. The 15-inch retina MacBook Pro has 220 pixels per inch, which is a dot pitch of 0.115 mm. If you ever get the chance to look at a retina display, look carefully at rendered text. It is unbelievable.
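
For reference, pixels per inch and dot pitch are just reciprocals of each other (25.4 mm to the inch), so converting is trivial:

Code: Select all

# Convert pixels per inch (ppi) to dot pitch in millimetres.
def ppi_to_pitch_mm(ppi):
    return 25.4 / ppi

print(ppi_to_pitch_mm(326))   # ~0.078 mm (iPhone-class retina)
print(ppi_to_pitch_mm(220))   # ~0.115 mm (15-inch retina MacBook Pro)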

The advantage of a CRT is not necessarily in how fine a resolution it can display, but in the fact that it can natively render a variety of resolutions without artifacts. With fixed-pixel displays, one needs to employ scaling algorithms, which may or may not produce artifacts (depending on the scaling ratio and the algorithm) and may or may not increase input lag.
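
A tiny illustration of the artifact problem, using nearest-neighbour scaling on a one-dimensional row of pixels (plain Python, no libraries; just a sketch):

Code: Select all

# Nearest-neighbour upscale of a 1-D row of pixel values.
def nn_scale(row, factor):
    return [row[int(i / factor)] for i in range(int(len(row) * factor))]

row = [0, 255, 0, 255]      # alternating black/white columns

print(nn_scale(row, 2))     # [0, 0, 255, 255, 0, 0, 255, 255]
                            # integer 2x: every column doubled, no artifact
print(nn_scale(row, 1.5))   # [0, 0, 255, 0, 0, 255]
                            # 1.5x: column widths alternate 2,1,2,1 --
                            # a visible artifact unless you blur it away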

User avatar
Chief Blur Buster
Site Admin
Posts: 11653
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada
Contact:

Re: Why 4k? Why not call it 2160p?

Post by Chief Blur Buster » 07 Jan 2014, 18:07

Good reply, spacediver!

RealNC
Site Admin
Posts: 3757
Joined: 24 Dec 2013, 18:32

Re: Why 4k? Why not call it 2160p?

Post by RealNC » 08 Jan 2014, 02:14

Indeed, nice reply!

So scaling will always be needed. I wonder, though: do "true" retina displays (meaning displays where humans can no longer make out individual pixels) still produce visible blur when scaling? To avoid artifacts, scalers currently interpolate, averaging nearby pixels to hide them. If a retina display just kept the errors while scaling (in order to keep the image sharp), would those errors actually be visible?
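
One way to think about it: would a single-pixel error even be resolvable? A back-of-the-envelope check (assuming a 326 ppi panel at a 30 cm viewing distance, and taking ~1 arcminute as the 20/20 acuity limit):

Code: Select all

import math

# Angular size of one pixel on a 326 ppi panel viewed from 30 cm.
pitch_mm = 25.4 / 326     # ~0.078 mm
distance_mm = 300.0

arcmin = math.degrees(math.atan(pitch_mm / distance_mm)) * 60
print(arcmin)   # ~0.89 arcmin -- right at the ~1 arcmin acuity limit,
                # so single-pixel scaling errors should be barely visible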

Neo
Posts: 47
Joined: 17 Dec 2013, 20:43

Re: Why 4k? Why not call it 2160p?

Post by Neo » 08 Jan 2014, 17:11

It's the non-integer scaling ratio that gets ya! This is one reason why I like the chosen Rec. 2020 resolutions. Just as one can display low-framerate content on a high-Hz monitor without interpolation by simply repeating frames, one can display lower-resolution content on a higher-resolution display by repeating pixels: 1 px in, N px out. No slow calculations are needed, and higher-quality post-processing (treating the input as oversampled) remains an option for things like video and image quality. Properly prepared source content should have no jaggies and thus look smooth (with anti-aliasing) on a high-ppi display with no screen-door effect.

4K (3840x2160) does this (a quick sanity check follows the tables):
1280x720 → 3x linear / 9x total pixels
1920x1080 → 2x linear / 4x total pixels

8K (7680x4320) even supports integer QHD scaling:
1280x720 → 6x linear / 36x total pixels
1920x1080 → 4x linear / 16x total pixels
2560x1440 → 3x linear / 9x total pixels
3840x2160 → 2x linear / 4x total pixels
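
Checking those ratios (a throwaway Python sketch; scale_factor is just a hypothetical helper):

Code: Select all

# Return the integer scale factor from src to dst, or None if non-integer.
def scale_factor(src, dst):
    fx, fy = dst[0] / src[0], dst[1] / src[1]
    return fx if fx == fy and fx.is_integer() else None

uhd_4k, uhd_8k = (3840, 2160), (7680, 4320)
for src in [(1280, 720), (1920, 1080), (2560, 1440), (3840, 2160)]:
    print(src, scale_factor(src, uhd_4k), scale_factor(src, uhd_8k))
# (1280, 720)   3.0   6.0
# (1920, 1080)  2.0   4.0
# (2560, 1440)  None  3.0   <- QHD needs 8K for integer scaling (1.5x on 4K)
# (3840, 2160)  1.0   2.0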

I'm currently viewing a 2010 Samsung 350 31.5" 720p (1366x768) LCD TV at 3 ft (about a 40° viewing angle), and the quality-limiting factor is the screen-door effect. Not even the non-integer downscaling from 1080i to 768p and the low-pass filtering destroy quality the way the off-white crystalline shimmer of the subpixels and the repeating, regular pattern of horizontal and vertical lines do. The dot and line structure is obvious, and it doesn't take 20/10 vision to be bothered by it. Even if I never watch native 4K source content, I can say right now that, all things being equal, a 4K display would improve even 720p viewing.
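
For what it's worth, the 40° figure roughly checks out, and it also shows why the pixel structure is so visible (assuming a 16:9 panel; "retina" density is usually put somewhere near 60 pixels per degree):

Code: Select all

import math

# Viewing angle and pixels-per-degree for a 31.5" 16:9 panel at 3 ft.
diag_in, dist_in = 31.5, 36.0
width_in = diag_in * 16 / math.hypot(16, 9)   # ~27.4" screen width

angle = 2 * math.degrees(math.atan(width_in / 2 / dist_in))
print(angle)          # ~42 degrees, close to the 40 quoted above
print(1366 / angle)   # ~33 pixels per degree -- roughly half of "retina"
                      # density, so the dot/line structure is easy to see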
