Apple M1 DCP (Display Coprocessor)

stl8k
Posts: 23
Joined: 15 May 2019, 07:59

Apple M1 DCP (Display Coprocessor)

Post by stl8k » 27 Sep 2021, 15:39

...The silver lining of using this complicated DCP interface is that DCP does in fact run a huge amount of code – the DCP firmware is over 7MB! It implements complicated algorithms like DisplayPort link training, real-time memory bandwidth calculations, handling the DisplayPort to HDMI converter in the Mac mini, enumerating valid video modes and performing mode switching, and more.
https://youtu.be/uTZISTjqy9Q?t=7951
https://asahilinux.org/2021/08/progress ... gust-2021/
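
For a concrete flavor of what "enumerating valid video modes" involves, here is a hypothetical sketch (Python; not the real DCP interface, which is a proprietary serialized RPC protocol that Asahi Linux had to reverse-engineer) of the kind of mode-validation work the firmware does instead of the OS driver:

Code: Select all

# Hypothetical sketch, not the real DCP RPC API: illustrates the kind
# of mode enumeration/validation a display coprocessor performs.
from dataclasses import dataclass

@dataclass
class VideoMode:
    width: int
    height: int
    refresh_hz: float
    bits_per_pixel: int = 30   # 10-bit-per-channel RGB

    def payload_gbps(self) -> float:
        # Active pixel data rate only; real link training also accounts
        # for blanking intervals, FEC overhead, and DSC compression.
        return self.width * self.height * self.refresh_hz * self.bits_per_pixel / 1e9

def enumerate_valid_modes(candidates, link_gbps, mem_gbps):
    """Keep only modes that fit both the DisplayPort link budget and
    the real-time memory bandwidth budget (two of the checks the DCP
    firmware reportedly performs)."""
    return [m for m in candidates
            if m.payload_gbps() <= link_gbps and m.payload_gbps() <= mem_gbps]

candidates = [VideoMode(3840, 2160, 60.0),
              VideoMode(3840, 2160, 120.0),
              VideoMode(7680, 4320, 60.0)]
# DP HBR3 x 4 lanes: 32.4 Gbps raw, about 25.92 Gbps after 8b/10b encoding.
for mode in enumerate_valid_modes(candidates, link_gbps=25.92, mem_gbps=40.0):
    print(mode)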


Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada

Re: Apple M1 DCP (Display Coprocessor)

Post by Chief Blur Buster » 01 Oct 2021, 18:11

Very interesting! Thanks for posting this.

But here’s a bigger co-processing feat: coprocessor GPUs built into the display!

In-Headset GPU Being Used As Secondary Coprocessor GPU in Some VR Headsets Connected to PC GPUs

Long term, we may also have a co-GPU in the display for certain tasks. This architecture is already being used for PCVR streaming to a Quest 2 headset.

The main GPU renders the 3D graphics on the PC, but the GPU in the Quest 2 can do rotational 3dof reprojection (spherically rotating the 3D graphics locally in the in-headset GPU) to do lagless head turns without needing PC rerenders.

Due to the limited bandwidth of USB-C (compared to an uncompressed video signal), video is transmitted between PC and headset using the HEVC codec at e-cinema bitrates -- up to 300 Mbps over a USB-C cable, or about 100 Mbps over WiFi 6. Because HEVC stands in for DisplayPort or HDMI, there is a small amount of streaming lag from video compression/decompression. GPU coprocessing is used to solve this lag.

The new Quest 2 VR headset can run VR content locally (on its built-in mobile GPU), but for more complex graphics (e.g. a game like Half-Life: Alyx), you can connect the Quest 2 to a PC running a PC-based VR game for better graphics, via a cable (Oculus Link) or wirelessly (Air Link).

This is where a powerful local in-headset GPU comes in: it modifies previous frames (via a reprojection algorithm) based on real-time head tracker input, shifting the scenery during a head turn without waiting for a re-render from the PC. Head turning therefore feels instant on a Quest or Quest 2, instead of the graphics lagging behind. This GPU coprocessing can now run at up to 120 frames per second (90fps by default, but Quest 2 received a software upgrade allowing 120Hz), removing the video codec lag from head turns.
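
A minimal sketch of the idea (hypothetical Python, not Meta's actual Asynchronous TimeWarp code): if the last decoded frame is stored as an equirectangular sphere map, then a pure yaw head rotation is just a horizontal shift of that map. Pitch and roll need a full spherical remap, but the principle is identical:

Code: Select all

# Illustrative sketch of 3dof spherical reprojection (not Meta's actual
# code): rotate the last decoded frame to match the current head yaw.
import numpy as np

def reproject_yaw(sphere_map: np.ndarray, render_yaw: float, current_yaw: float) -> np.ndarray:
    """Shift a 360-degree equirectangular frame so it matches the current
    head yaw instead of the yaw it was rendered at."""
    height, width = sphere_map.shape[:2]
    delta = current_yaw - render_yaw                     # radians turned since render
    shift_px = int(round(delta / (2 * np.pi) * width))   # full width = 2*pi radians
    # Sign convention depends on how yaw is defined; the key point is that
    # this is a trivial operation for a mobile GPU, done every refresh.
    return np.roll(sphere_map, -shift_px, axis=1)

last_frame = np.zeros((2048, 4096, 3), dtype=np.uint8)   # last frame streamed from the PC
view = reproject_yaw(last_frame, render_yaw=0.00, current_yaw=0.05)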

Quest 2 also uses its GPU as a coprocessor during Air Link / Virtual Desktop sessions, as a rudimentary frame rate amplification technology -- even low frame rates (30fps or 60fps) streamed from the PC GPU still head-turn laglessly at 90fps or 120fps thanks to the local Quest 2 GPU. This is called 3dof spherical reprojection in current industry parlance, but fundamentally it's an emerging building block of frame rate amplification -- decoupling the frame rate from the media.

Ideally you want 120fps at 120Hz or 90fps at 90Hz, but the GPU coprocessing prevents the head-turn frame rate from ever falling below the display refresh rate! So the VR scenery pans past your view at 120fps whenever you're turning your head left and right, no matter what the original frame rate of the underlying VR graphics was.
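
The pacing logic is roughly this (a simplified, hypothetical sketch, not actual Oculus runtime code): the display loop runs at the full refresh rate, reprojecting the newest PC frame with a freshly sampled head pose on every single refresh:

Code: Select all

# Simplified hypothetical sketch of frame-rate-amplification pacing:
# display at full refresh rate; reproject the newest decoded PC frame
# with a freshly sampled head pose every refresh.
import time

REFRESH_HZ = 120
FRAME_PERIOD = 1.0 / REFRESH_HZ

def poll_decoder():          # hypothetical: newest decoded PC frame, or None
    return None

def read_head_pose():        # hypothetical: current head-tracker pose
    return 0.0

def reproject(frame, pose):  # e.g. the yaw-shift sketch above
    return frame

def scan_out(view):          # hand the reprojected view to the display
    pass

latest_pc_frame = None
for _ in range(REFRESH_HZ):                  # one second of refreshes, for the sketch
    fresh = poll_decoder()
    if fresh is not None:
        latest_pc_frame = fresh              # PC frames arrive at 30-120 fps, whatever it manages
    if latest_pc_frame is not None:
        pose = read_head_pose()              # sampled per refresh, not per PC frame
        scan_out(reproject(latest_pc_frame, pose))
    time.sleep(FRAME_PERIOD)                 # stand-in for a real vsync wait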

The result is that a monster in the VR world may be running at only 45fps at times (when the PC GPU struggles to generate complex VR graphics), but scenery on the VR screen scrolls left/right at 120fps (thanks to the GPU in the headset) whenever you turn your head left/right.

A nice GPU coprocessing task, already being done today! They don't call it GPU coprocessing (they call it “local reprojection”), but it essentially meets the definition of co-processing, since the PC GPU and the Quest 2 GPU work together concurrently to keep frame rates high.

So in-display co-GPUs have many use cases in today’s world.

Future Video Codecs May Require GPU Coprocessors

Tomorrow’s H.268 or H.269 codecs of the 2030s or 2040s may be 3D-geometry-based and/or framerateless (vectorized video that can be natively played back at any desired frame rate), which will definitely require display-side co-GPUs to decode. I expect gigabytes of texture data plus powerful GPUs will be needed to decode these video files. Some early lab tests show promise -- AI compressors can already do reasonably good 4K at VHS-like bitrates (e.g. mere kilobits per second), but they practically require a supercomputer and a large stock Earth texture library to encode/decode. The longer the video file, the easier it is to build the texture library into the video file itself, with AI algorithms autocompleting missing detail at excellent S/N ratios relative to the original uncompressed video. There are some pretty wild napkin exercises going on about the co-GPUs of the future.

Today’s equivalents of the 1970s/1980s science labs that tested HDTV are now researching advanced concepts like these, through multiple possible methods of eliminating the fixed frame rate from video files (codecs using AI, codecs utilizing 3D geometry, codecs utilizing temporally dense raytracing, codecs utilizing timecoded photons, etc). The theory is that switching to geometry instead of planar video allows incredible compression ratios, and also makes certain video files easier to convert to game geometry / VR geometry (6dof) without the odd “spherical map” look.

— Want the video file to play at 24fps like cinema? It plays perfectly at native 24fps with movie-camera shutter strobe.
— Want the video file to play at 59.94fps like television with a fast shutter? It plays perfectly at native 59.94fps.

Same video file: because the file is framerateless, frame rate becomes the decoder’s responsibility and/or the display’s responsibility (it can be either). There would be frame rate hinting (a “preferred frame rate”).

No interpolation; all frame rates are simultaneously native!
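
As a napkin sketch of the concept (speculative Python; no such codec exists today): the clip becomes a function of continuous time, and the player samples it at whatever frame rate and virtual shutter the display prefers:

Code: Select all

# Speculative sketch of a "framerateless" decoder: the clip is a
# function of continuous time; the player picks the frame rate.
import numpy as np

def clip(t: float) -> np.ndarray:
    """Stand-in for a vectorized / geometry-based clip: returns the scene
    at continuous time t (a dot moving across a 1-D 'screen', just to
    make the sampling visible)."""
    frame = np.zeros(32)
    frame[int(t * 10) % 32] = 1.0
    return frame

def render(clip_fn, fps: float, duration: float, shutter: float = 0.5):
    """Sample the clip at any frame rate; 'shutter' is the fraction of the
    frame period the virtual shutter stays open (0.5 = 180-degree film
    shutter; smaller = faster video-style shutter)."""
    period = 1.0 / fps
    for i in range(int(duration * fps)):
        t0 = i * period
        taps = np.linspace(t0, t0 + shutter * period, 8)  # integrate over the shutter
        yield np.mean([clip_fn(t) for t in taps], axis=0)

frames_cinema = list(render(clip, fps=24.0, duration=1.0, shutter=0.5))   # native 24fps
frames_tv = list(render(clip, fps=59.94, duration=1.0, shutter=0.1))      # native 59.94fps
# Same clip, two native frame rates, zero interpolation.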

More reading: Framerateless Video Files

Some of this will require display coprocessor GPUs to achieve ultra high frame rates at ultra high refresh rates (e.g. 1000fps 1000Hz) since video cables may have difficulty transmitting 8K 1000fps 1000Hz Dolby Vision HDR uncompressed. However, legacy displays will work too at lower frame rates (by having the player software run a frame rate that is appropriate for the legacy display).
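
A quick back-of-envelope calculation (my arithmetic, using published DisplayPort link rates) shows why the cable is the bottleneck:

Code: Select all

# Back-of-envelope: uncompressed 8K 1000fps HDR versus a modern cable.
width, height = 7680, 4320    # 8K UHD
bits_per_pixel = 30           # 10-bit-per-channel RGB (Dolby Vision's 12-bit is higher still)
fps = 1000

uncompressed_gbps = width * height * bits_per_pixel * fps / 1e9
print(f"{uncompressed_gbps:,.0f} Gbps uncompressed")            # ~995 Gbps, about 1 Tbps

dp_uhbr20_gbps = 80.0         # DisplayPort 2.0 UHBR20 raw link rate
print(f"{uncompressed_gbps / dp_uhbr20_gbps:.1f}x over a DP 2.0 link")  # ~12.4x too much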

Obviously, far-future stuff (10-25 year time window), at least two or three major H. revisions away.

Note: We have a design in-house for an 8K 1000fps 1000Hz non-DLP display that is technologically possible with today’s technologies (ideal for a simulator environment). But video codec, realtime rendering, and camera technology are behind at the moment.

stl8k
Posts: 23
Joined: 15 May 2019, 07:59

Re: Apple M1 DCP (Display Coprocessor)

Post by stl8k » 07 Dec 2021, 22:08

Fascinating, Chief! I now have some research streams to follow over the US Christmas break!

Chief Blur Buster
Site Admin
Posts: 11647
Joined: 05 Dec 2013, 15:44
Location: Toronto / Hamilton, Ontario, Canada

Re: Apple M1 DCP (Display Coprocessor)

Post by Chief Blur Buster » 07 Dec 2021, 23:05

stl8k wrote:
07 Dec 2021, 22:08
Fascinating, Chief! I now have some research streams to follow over the US Christmas break!
I am hoping to add one or two new articles to Blur Busters Area 51 (website section), which needs some new content soon.
