CRT-like Emulation on Low Persistence display?

skrueck
Posts: 5
Joined: 14 May 2014, 17:36

CRT-like Emulation on Low Persistence display?

Post by skrueck » 14 May 2014, 18:31

I have a question.

If an OLED can have low persistence (Oculus's claim of 2 ms or 3 ms), could it be used in a way similar to old CRT interlacing?

So instead of 1920x1080 @ 75 hz

use 1920x540 @ 150 hz (every other line "interlacing")
or 1920x270 @ 300 hz (every fourth line)
or 1920x180 @ 450 hz (every sixth line)
etc.

Only render, transmit, and display those sets of lines each "refresh". Each refresh is a temporally accurate partial picture, with the rest of the display showing nothing. Complete frames for any single moment in time would never be displayed. Total light output would be the same. Bandwidth and processing power would be approximately the same. But it should reduce latency significantly, I would assume. And it could work over HDMI, or could be made to with a little work.
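To make the bandwidth claim concrete, a rough back-of-the-envelope sketch (just my own illustration in Python; not tied to any real display API):

modes = [(1920, 1080, 75), (1920, 540, 150), (1920, 270, 300), (1920, 180, 450)]
for width, lines, hz in modes:
    print(width, 'x', lines, '@', hz, 'Hz =', width * lines * hz, 'pixels per second')
# every mode above works out to 155,520,000 pixels per second,
# so total bandwidth (and total light output) stays roughly the same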

Is this possible?

Q83Ia7ta
Posts: 761
Joined: 18 Dec 2013, 09:29

Re: CRT-like Emulation on Low Persistence display?

Post by Q83Ia7ta » 14 May 2014, 19:02

no

spacediver
Posts: 505
Joined: 18 Dec 2013, 23:51

Re: CRT-like Emulation on Low Persistence display?

Post by spacediver » 14 May 2014, 20:11

don't see why not, but why would you want to do this?

Chief Blur Buster
Site Admin
Posts: 11653
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada

Re: CRT-like Emulation on Low Persistence display?

Post by Chief Blur Buster » 14 May 2014, 20:28

Alas, there are nasty temporal motion artifacts caused by interlacing, including 3-way and 4-way interlacing. I added some undocumented adjustments (&eastergg=1) to the www.testufo.com -> Tests -> Interlace animation to demonstrate this from a motion-artifacts perspective. The temporal offsetting of each refresh creates "venetian blinds" style artifacts.

TestUFO Demos of Interlacing
- Regular interlacing Animation
- 3-Way Interlacing Animation
- 4-Way Interlacing Animation
- 6-Way Interlacing Animation

For convenience, I've embedded the regular 2-way interlacing animation:



So, this is why you _really_ want progressive scan, even on a CRT.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter

spacediver
Posts: 505
Joined: 18 Dec 2013, 23:51

Re: CRT-like Emulation on Low Persistence display?

Post by spacediver » 14 May 2014, 21:26

Charles Poynton (one of the early voices on persistence-based motion blur) also discusses the problems with interlacing when it comes to motion, as Mark has illustrated.

But as for the poster's question, I don't see why it wouldn't be possible to implement.

skrueck
Posts: 5
Joined: 14 May 2014, 17:36

Re: CRT-like Emulation on Low Persistence display?

Post by skrueck » 14 May 2014, 21:50

Thanks for the responses. And thank you, Chief, for making an animated GIF to explain.

But no, I don't think you quite understand what I am saying. There should be no erratic temporal offsetting. Each set of lines is rendered, transmitted, and displayed exactly as it should be for that period of time. What you showed in your animated GIF appears to me to be full images of a moment in time, cut into lines, and then displayed in pieces at different times, thus creating offsetting when lines from different moments in time are displayed together. You seem to be showing me what normal interlacing on an LCD does. (At least that is how your animated image appears to me on my monitor; my apologies if I am incorrect.)

I'm talking about something slightly different. So for the 1920x180 @ 450 hz example I gave, it would be exactly the same as a true 1920x1080 @ 450 frames per second except that 5/6ths of the lines are blacked out for each refresh. No pixel on the screen should ever be more temporally offset than any other pixel on the screen. Would blacking out 5/6ths of the lines of each frame of a true 1920x1080 @ 450 hz display with a computer pumping out 450 fps cause the problem you are describing?

In response to spacediver: the reason you would do this is the massive decrease in latency. If it takes X milliseconds to render, transmit, and display a frame, then it should take around X/6 of that time to render, transmit, and display 1/6 of the lines of that frame. So if latency were 24 ms, you could drop it down to approximately 4 ms instead.
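As a back-of-the-envelope check of that claim (purely illustrative; the 24 ms figure is just my hypothetical example):

full_frame_latency_ms = 24          # hypothetical time to render, transmit, and display a full frame
interlace_factor = 6                # only 1/6 of the lines are handled per refresh
partial_latency_ms = full_frame_latency_ms / interlace_factor
print(partial_latency_ms)           # -> 4.0 ms, roughly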

Please correct me if I am wrong. And thank you again for any input.

Chief Blur Buster
Site Admin
Posts: 11653
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada

Re: CRT-like Emulation on Low Persistence display?

Post by Chief Blur Buster » 15 May 2014, 14:12

skrueck wrote:Thanks for the responses. And thank you, Chief, for making an animated GIF to explain.
It's actually embedded animated HTML5 (JavaScript in an IFRAME), since that has much more precision than an animated GIF. This is an embedded TestUFO.com webpage via the [testufo] tag, which I programmed into this forum system along with the embedding capability on the TestUFO.com side.
skrueck wrote:But no, I don't think you quite understand what I am saying.
I think we are talking about exactly the same thing, just at a different Hz.
skrueck wrote:There should be no erratic temporal offsetting. Each set of lines is rendered, transmitted and displayed exactly as they should be for that period of time.
That is exactly what the animation is doing for the framerate==refreshrate object. So there is no problem at full framerate, but you can see the major interlace artifact problems occur at framerates less than the refresh rate. And we haven't even started talking about the motion resolution problems that interlacing causes during vertical motion (even at 1000Hz).
skrueck wrote:What you showed in your animated gif appears to me to be full images of a moment in time, cut into lines, and then displayed in pieces at different times, thus creating offsetting when lines from different moments in time are displayed together.
This is essentially what interlacing does.
skrueck wrote:You seem to be showing me what normal interlacing on an LCD does.
And CRT. The effects look the same on a CRT too.

Even just looking at the regular http://www.testufo.com cover page on an interlaced CRT -- without loading the interlaced pattern -- will show the same venetian-artifacts problem for the lower-framerate objects. Just try it! And when you double the CRT refresh to 120Hz, the venetian artifact becomes half as much of an offset, but it is still there. And if you increase the motion speed, the temporal offsetting of the scanlines (from a RETINAL PERSPECTIVE) will be even greater.
skrueck wrote:I'm talking about something slightly different. So for the 1920x180 @ 450 hz example I gave, it would be exactly the same as a true 1920x1080 @ 450 frames per second except that 5/6ths of the lines are blacked out for each refresh. No pixel on the screen should ever be more temporally offset than any other pixel on the screen.
You still have the eye-tracking and the low-framerate problem. As you track your eyes, your eyes are in different positions at all times, even within 1/450sec. If you run at less than 450 frames per second, your eyes will have moved onwards in that 1/450sec (at 2000 pixels/second, that's a ~4 pixel offset) between the different passes. Say you are only able to run at 100fps or 200fps at 450Hz: you will have temporally-offset passes of the same frame interacting with the continuously moving eyes tracking the motion. Your eyes are in a different position 1/450sec later; during eye-tracking of 2000 pixels/sec motion, your eyes are approximately 4 pixels further down the motion vector in 1/450sec. That means the different interlace passes will be temporally offset from a retinal/sensor perspective (and this would also show up in pursuit camera images). Thus, it will STILL create venetian-style artifacts. You cannot guarantee 450fps@450Hz at all times to avoid the temporal offsetting artifacts caused by eye-tracking. The venetian blinds offset will be only 60/450ths as bad as it was at 60Hz, but it will still remain, especially at faster motion speeds. It will likely look really good at slow motion speeds, because the interlacing occurs so fast that the venetian blinds effect is not visible at moderate motion speeds, but when you start doing things like turning in an FPS (which creates motion speeds in the region of 2000 pixels/second), you will still start noticing the interlace artifacts even at 450Hz.
skrueck wrote:Would blacking out 5/6ths of the lines of each frame of a true 1920x1080 @ 450 hz display with a computer pumping out 450 fps cause the problem you are describing?
Yes, although at smaller offsets, since it scales with motion speed. At 120Hz, the offsetting caused by eye-tracking is half what it is at 60Hz, and it diminishes further beyond that. It will still be visible at 450Hz for material that runs at less than 450fps. At a motion speed of 450 pixels/second (only a quarter screen width per second), you will still have 1-pixel offsetting of 225fps material (half framerate), and 2-pixel offsetting of 112fps material (quarter framerate), because you're temporally offsetting the passes even as the eye keeps moving, so the pixels show up in different positions on the retina as the eye continues moving onwards. I understand the effect very well -- just look at my work at http://www.testufo.com/eyetracking and http://www.testufo.com/blackframes#count=3
skrueck wrote:In response to spacediver: the reason you would do this is the massive decrease in latency.
Are you aware that spacediver is a vision researcher? He knows this stuff well.
skrueck wrote:If it takes X milliseconds to render, transmit, and display a frame, then it should take around X/6 of that time to render, transmit, and display 1/6 of the lines of that frame. So if latency were 24 ms, you could drop it down to approximately 4 ms instead.
Yes, but if you display the next pass of lines 1/450sec later, your eyes (during eye-tracking motion) will have moved 1/450th of the motion vector. If you're not running at 450fps@450Hz, and have framerates lower than that, you're going to be temporally offsetting the lines from a retinal perspective. Eye-tracking is analog -- your eyes are in a different position at every single instant -- so even just 1/450th of a second later, your eyes have moved onwards along the motion vector. Displaying certain lines of the same frame at different times ALWAYS creates the artifacts outlined, for framerates less than refresh rates, regardless of refresh rate, even at ultrahigh Hz (1000Hz). There is certainly a diminishing effect of the interlace skewing, but even at 1000Hz, animations running at less than 1000fps still have a temporal offset from a retinal perspective, due to the nature of eye tracking. At a motion speed of 2000 pixels/second, that's still a 2 pixel line-offsetting artifact (guaranteed) even at 1000Hz 2-way interlaced: 1/1000th of 2000 pixels/second is 2 pixels.
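To put the eye-tracking arithmetic in one place, here is a quick sketch (illustrative Python only, nothing display-specific) of the minimum retinal offset between successive interlace passes:

def retinal_offset_px(motion_speed_px_per_sec, refresh_hz):
    # how far the tracking eye moves between two successive refresh passes;
    # when framerate < refresh rate, lines of the same frame shown on different
    # passes land this far apart on the retina (the venetian-blinds offset)
    return motion_speed_px_per_sec / refresh_hz

print(retinal_offset_px(2000, 60))    # 60Hz interlacing: ~33 pixel offset
print(retinal_offset_px(2000, 450))   # 450Hz: ~4.4 pixel offset -- smaller, but still there
print(retinal_offset_px(2000, 1000))  # 1000Hz 2-way: 2 pixel offset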

There are certainly benefits to low latency, and 450Hz is a worthwhile goal. But interlacing isn't a visually comfortable/desirable way to achieve this end goal.

This discussion is fun and worthwhile, but it is critical to understand the temporal offsetting problems caused by eye tracking. As Einstein said, everything is relative: the frame of reference here is the eye position relative to the position of the pixels. You cannot think only of the positions of pixels on the display itself; you must also consider the position of the eye, and how that creates artifacts (example: http://www.testufo.com/eyetracking#pattern=checkerboard -- I correctly predicted this artifact even before I created the test pattern, which shows how well I understand human vision from a temporal perspective).

TL;DR: There will be interlacing artifacts even at 450Hz. This is already mathematically provable, from the very nature of eye tracking.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter

skrueck
Posts: 5
Joined: 14 May 2014, 17:36

Re: CRT-like Emulation on Low Persistence display?

Post by skrueck » 15 May 2014, 18:02

First, thank you Chief for the long and very thorough explanation.

Sorry I mistook your incredibly useful HTML5 animation for a GIF.

I agree with everything you are saying, and we are on the same page everywhere except:
You cannot guarantee 450fps@450Hz at all times to avoid the temporal offsetting artifacts caused by eye-tracking.

I agree we can't get 1920x1080 @ 450 fps. But I'm not talking about that. I'm talking about 1920x180 @ 450 fps.

so Display Refresh Rate = Video Source Material Frames Per Second

One way to say this would be that the human eye is never given a complete snapshot of any single moment in time. For this particular example, it would instead be given 450 partial snapshots per second, each covering 1/6 of the screen at 450 different moments in time. The other 5/6 is never rendered, transmitted, or displayed.

A modern video card can handle 1920x1080 @ 75 fps. (155,520,000 pixels per second)

So why not 1920x180 @ 450 fps. (155,520,000 pixels per second)?

GPU should be able to handle it. CPU probably. And if it were CPU-limited, I imagine tweaks could be made to accommodate extremely high frame rates.

You also mention vertical motion may cause problems. But this approach is not limited to rows. Columns or other patterns are possible. For example, maybe just use 150 frames per second and use a checkerboard pattern. Or maybe it could be dynamic depending on direction of motion.

But overall, assuming CPU and GPU could handle this, and display refresh is equal to video source material frames per second, do you still see problems with this approach? Of course it's not as good as progressive at the same Hz. But could this approach be beneficial and achievable considering current hardware limitations?

Chief Blur Buster
Site Admin
Posts: 11653
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada

Re: CRT-like Emulation on Low Persistence display?

Post by Chief Blur Buster » 15 May 2014, 19:50

skrueck wrote:I agree we can't get 1920x1080 @ 450 fps. But I'm not talking about that. I'm talking about 1920x180 @ 450 fps.
Clarification: You mean 1920x180 progressive scan -- 180 scan lines at 450fps.
Theoretically, a monitor with the bandwidth to do 1920x1080p at 120Hz could do 540p at 240Hz (half resolution) or 270p at 480Hz (quarter resolution), so you wouldn't even need to go all the way down to 180p.
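A quick sketch of that bandwidth trade (illustrative arithmetic only, ignoring blanking intervals):

pixel_rate = 1920 * 1080 * 120          # a monitor that can already scan out 1920x1080 at 120Hz
for refresh_hz in (120, 240, 480):
    lines = pixel_rate // (1920 * refresh_hz)
    print(refresh_hz, 'Hz ->', lines, 'lines per refresh')
# 120Hz -> 1080 lines, 240Hz -> 540 lines, 480Hz -> 270 lines:
# the same pixel rate supports 540p at 240Hz or 270p at 480Hz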

Doing this on a higher-performance CRT would be quite easy, assuming the deflection was fast enough to pull it off. You can push about 240Hz on some CRTs, and sometimes a little beyond at really low resolutions.

However, doing this on an LCD would be extremely difficult due to the way panels are currently wired, but it would be good to hear from display engineers (familiar with LVDS signalling) whether running an LCD at ~480Hz is feasible by skipping scanout to certain parts of the panel. You would have the major ghosting problem of refreshes bleeding into each other, since even most TN LCDs do not respond fast enough for 480Hz to be really practical (true 240Hz LCDs, however, should be practical within a few years).

By not refreshing parts of the LCD, you will get white gaps between scanlines, since most gaming-monitor LCDs fade to white when there's no voltage driving the pixels. This would be one of the engineering problems, unless you put some bias voltage on those scanlines to keep those LCD pixels black, or simply drive multiple scanlines simultaneously with the same color values (which would look like crude point-sampled scaling). I'm not sure display technology is headed in this direction, but it could be an interesting academic exercise for a display modder (like cirthix, who did the 240Hz LCD mod).

So if this is what you meant, the topic title threw me off a little -- this isn't CRT-like emulation so much as achieving low persistence via ultrahigh frame rates instead of via flicker/strobing (like a CRT or LightBoost). You actually meant "Achieving CRT-like clarity with low persistence without strobing", using low-resolution progressive scan to get the bandwidth headroom for higher refresh rates.
Head of Blur Busters - BlurBusters.com | TestUFO.com | Follow @BlurBusters on Twitter

skrueck
Posts: 5
Joined: 14 May 2014, 17:36

Re: CRT-like Emulation on Low Persistence display?

Post by skrueck » 16 May 2014, 23:13

Chief, thanks again for the input and information. :)

We were almost exactly on the same page, and then I think we drifted apart again. Probably my fault for a poor explanation and improper terminology. If so, my apologies. I'll try one last time, if you will indulge me.

The main concept is that new low persistence displays (OLED displays with 2 ms persistence, for example) could provide a different way of displaying graphics than the primary one we are currently using.

When I was picturing an OLED screen with 2 ms persistence flashing an image for 2 ms and then going dark until the next frame was ready, the thought occurred that there might be a better way. Why not break this image up, send it in pieces, and display the pieces as they arrive? The reason you wouldn't do that is temporal offsetting, as you pointed out. But after you send a piece, why not update the rest of the scene with new, temporally accurate information before sending the next piece? So the source material frames per second increases to match the now-increased refresh rate of the display. The huge benefit is that latency is reduced by roughly the same factor as the number of pieces the scene is broken into. Latency is the answer to why one would do this.

In other words, "stream" temporally accurate image fragments (rows, for example) into the eye instead of trying to flash entire images with much longer black intervals in between as we do now. Do this in order to greatly reduce overall latency of rendering, transmitting and displaying pixels.

So for the example, I have been considering a 1920x1080 OLED display with 2 ms persistence. Oculus claimed 2 ms persistence and 75 Hz refresh rate capability. For simplicity, let's say I'm talking about a single 1920x1080 OLED display with the same 2 ms persistence, instead of two 960x1080 displays (one per eye).

Currently we can do:

1920 x 1080 @ 75 fps (source material at 75 fps and display at 75 Hz)
155,520,000 pixels per second

What if we then double the display refresh Hz, double the source fps, and interlace the display:

1920 x 1080 @ 150 fps (source material now at the higher rate of 150, display at 150 Hz, and only display every other line each display refresh)
Still only 155,520,000 pixels per second. Total light output is also the same.

You could then expand this all the way to the limit of the persistence of the pixels. In that case, we'd be talking about 500 Hz and 500 FPS if the persistence were 2ms.
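Back-of-the-envelope (my own arithmetic, purely illustrative):

persistence_ms = 2                        # the Oculus-quoted 2 ms persistence
max_streaming_hz = 1000 / persistence_ms  # back-to-back 2 ms flashes with no dark gap in between
print(max_streaming_hz)                   # -> 500.0, hence the 500 Hz / 500 FPS limit above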

Could an OLED display currently handle 500 Hz? I think it probably could. If not, I think it could be made to if light output is already good enough that persistence is possible at 2 ms.

Can a current PC or transmission technique (HDMI cable, for example) handle this? No, not full progressive 1920x1080 @ 500 FPS. PCs aren't that fast and we don't have the bandwidth available for this much data. This is the entire reason I was discussing breaking up the frames into smaller pieces, and never even rendering most of these pieces.

The reason I used 450 Hz and 450 FPS for my example was that the math was relatively clean when displaying every 6th line for each full 75 Hz scene refresh. And this got close to the 500 FPS target for 2 ms persistence displays.

At 1/450 seconds: render, transmit and display only rows 1, 7, 13, ...
At 2/450 seconds: render, transmit and display only rows 2, 8, 14, ...
At 3/450 seconds: render, transmit and display only rows 3, 9, 15, ...
At 4/450 seconds: render, transmit and display only rows 4, 10, 16, ...
At 5/450 seconds: render, transmit and display only rows 5, 11, 17, ...
At 6/450 seconds: render, transmit and display only rows 6, 12, 18, ...
At 7/450 seconds: render, transmit and display only rows 1, 7, 13, ...
...

Each set of rows represents a moment in time 1/450 seconds later than the previous set, and is displayed 1/450 seconds later. For this example, 5/6 of each source material frame is never rendered, transmitted, or displayed, and 5/6 of the screen is always black.
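For clarity, here is a little sketch that generates the row schedule above (illustrative Python only; row numbers are 1-based to match the list):

rows_total = 1080
ways = 6                                  # six passes per full 1080-line cycle
for refresh in range(1, 8):               # the first seven refreshes of the second, as listed above
    phase = (refresh - 1) % ways
    rows = [r + 1 for r in range(phase, rows_total, ways)]
    print('at %d/450 s: rows' % refresh, rows[:3], '...')
# refresh 1 -> rows 1, 7, 13, ...; refresh 7 wraps back around to rows 1, 7, 13, ...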

Using 1920x180 @ 450 Hz, the image could be transmitted over HDMI in this fashion without the HDMI interface needing to know (or support) that only every 6th line of a 1920x1080 image is being transmitted. That is assuming HDMI allows that resolution and transfer rate. But as long as it supports that frequency, the image could be trivially packed into any mode of identical or larger total resolution (480x720, for example, is the same number of pixels).

The concern expressed seemed to primarily be with temporal offsetting, which is not a concern if the source material is at the same fps as the Hz of the display, which is exactly what I am talking about.

Concern was also expressed about vertical motion if using rows of pixels. I'm not sure how bad that would be with 450 Hz rows compared with 75 Hz full frames. But I would assume there might be ways to alleviate it if it turned out to be much of a problem. In the case of a VR headset like the Rift, maybe display rows for one eye and columns for the other, to maintain clarity in at least one eye at all times; implementing that would seem relatively trivial. Other mitigations could perhaps be achieved by dynamically changing the pattern. But these specific details start to diverge from the main point.

So now that I hope I explained this more accurately and completely, are there other concerns or thoughts?

Again, apologies if I was not clear in my explanation or if I used any terms improperly. And thank you, Chief, for your time, and thanks to anyone else who reads this or comments.
