Post by Chief Blur Buster » 22 Sep 2014, 20:55
Yes, so many different considerations. Even the world's best OLEDs would not have the color gamut of real life. And even with VR, we are forced to present everything on a single focal plane regardless of virtual depth, which is still kind of artificial. Infinite resolution and infinite framerate still wouldn't pass the "Holodeck-vs-RealLife Turing Test", because of such considerations.
Still, 1000fps@1000Hz should be "good enough" of a Holy Grail for VR -- for the vast majority, even including videophiles: those who are simultaneously flicker/strobe sensitive (especially those still sensitive to the 360Hz rainbow artifacts of DLP) and blur sensitive (especially those who love CRTs and hate old-fashioned LCDs).
<GEEK DREAM>Dreaming forward to the 2030s or 2040s: massive wide-gamut 6-subpixel OLEDs (red/green/blue/yellow/cyan/magenta) at 8K resolution, running at 1000Hz in the laboratory, could be possible! This could cover more than 90% of the human vision color gamut, and could be "retina-quality" at 180 degree FOV for virtual reality. If combined with on-the-fly realtime eyeball focus-plane tracking (and DoF compensation of various forms, such as via onscreen effects & deformable liquid lenses), it might even finally start to successfully pass the "Holodeck-vs-RealLife" Turing Test, provided the computer kept a perfect 1000fps@1000Hz sync at ~1-2ms latency.

Let's remember, the year 2040 is as far forward into the future as the year 1988 was backwards in time (MPEG-1 was first tested in the laboratory around 1988 on what was literally a supercomputer for its era). I remember as a kid reading Scientific American and other magazines when compressed video was still a theoretical lab curiosity, and NASA was still trailblazing the art of highly compressed image transmission. The concept of video at less than 1 bit per pixel was outlandishly "mission impossible" (until the late 1980s, when we finally figured out macroblock-based compression with the advent of MPEG-1). Today, Netflix SuperHD video averages only about 0.05 bits per pixel. We can now do 1080p HEVC video at lower bitrates than 320x240 VCD MPEG-1 video, transmitting truecolor video at very high quality at an average of roughly 0.02 bits per pixel (via HEVC/H.265 compression), and 4K video compressed with HEVC is already available. (There's a quick bits-per-pixel back-of-envelope sketched after this dream section.)

Technologically, 1000fps 4K is likely possible within a mere decade or two, but the conceptual/engineering leap of understanding the necessity of 1000fps@1000Hz will take much longer to figure out -- for now, the world needs to move to "true 120fps HFR" first: a worldwide standardization of 120fps video. But even with the Moore's Law slowdown, 1000fps 8K 6-primary (RGBYMC) displays may actually be possible within 25 years, given the improvements already accomplished in the last 25 years. We already have 8K experiments, 6-primary-color experiments, and 1000Hz experimental displays being tested in the lab -- just not all in the same display. Even if the digital display technology equivalent of Moore's Law were 4x slower from now on, we might still be able to get a VR display capable of passing at least some "Holodeck Turing Test" in less than 25 years.

If the display needs to go from inky black all the way to blindingly bright white, then we need more bits per channel to prevent banding artifacts, possibly 16 bits per channel (96 bits per pixel total, for a 6-channel display). The GPU horsepower to generate "real-life-like" scenes at antialiased 8K, 6 channels, 16 bits per channel is actually probably a bigger engineering challenge than the display itself (see the raw-bandwidth sketch below).

But before that, there's an "earth is flat, humans can't benefit beyond 120Hz" mentality among many long-time display engineers. The display engineers' disbelief in some concepts ("who needs 1000fps? that's stupid") is the bigger challenge to overcome first. People and organizations like Oculus, John Carmack, Palmer Luckey, Michael Abrash -- combined with sites like Blur Busters and myself -- are among the trailblazers working to dispel display myths.</GEEK DREAM>
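For anyone curious how a "bits per pixel" figure like that is computed, here's a minimal back-of-envelope sketch. The bitrates, resolutions, and framerates below are illustrative assumptions of my own (not measured Netflix or VCD encode settings); the point is just the arithmetic: compressed bits divided by pixels delivered per second.

```python
# Back-of-envelope: average compressed "bits per pixel" = bitrate / (pixels displayed per second).
# All example inputs are illustrative assumptions, not measured encode settings.

def bits_per_pixel(bitrate_bps: float, width: int, height: int, fps: float) -> float:
    """Average number of compressed bits spent per displayed pixel."""
    pixels_per_second = width * height * fps
    return bitrate_bps / pixels_per_second

# A ~3 Mbps 1080p30 stream lands near the ~0.05 bpp ballpark:
print(round(bits_per_pixel(3_000_000, 1920, 1080, 30), 3))   # ~0.048

# A ~1.2 Mbps 1080p24 HEVC stream lands near the ~0.02 bpp ballpark:
print(round(bits_per_pixel(1_200_000, 1920, 1080, 24), 3))   # ~0.024

# A ~1.15 Mbps 320x240 MPEG-1 VCD stream, by contrast, spends roughly 0.5 bpp:
print(round(bits_per_pixel(1_150_000, 320, 240, 30), 3))     # ~0.499
```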
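And here's why I say the GPU/bandwidth side is probably the harder part: a rough, hypothetical back-of-envelope for the raw (uncompressed) pixel bandwidth of that dreamed-up 8K, 6-channel, 16-bit, 1000fps display, using only the numbers from the dream above.

```python
# Raw, uncompressed bandwidth of the hypothetical dream display described above:
# 8K resolution, 6 subpixel channels, 16 bits per channel, 1000 frames per second.

width, height = 7680, 4320      # "8K" resolution (assumption: UHD-2 pixel count)
channels = 6                    # red/green/blue/yellow/cyan/magenta subpixels
bits_per_channel = 16
fps = 1000

bits_per_pixel = channels * bits_per_channel        # 96 bits per pixel
bits_per_frame = width * height * bits_per_pixel    # ~3.19 gigabits per frame
bits_per_second = bits_per_frame * fps               # ~3.19 terabits per second

print(f"{bits_per_pixel} bits per pixel")
print(f"~{bits_per_frame / 1e9:.2f} Gbit per frame")
print(f"~{bits_per_second / 1e12:.2f} Tbit/s uncompressed")
```

That works out to terabits per second of raw pixel data before any rendering work even happens, which is why the content-generation side worries me more than the panel itself.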