thatoneguy wrote: ↑20 Sep 2020, 04:54
Therein lies the philosophical dilemma.
thatoneguy wrote: ↑20 Sep 2020, 04:54
Now this might not even happen before I die but the thought of it saddens me. It definitely seems like that's where video games are heading towards. Video Games have already been going downhill in the past decade but the death of the medium as an artform seems inevitable.
<PandoraBox state="open">
Sure, various concerns do exist. We can continue that in the Lounge area though. There are bigger issues to worry about, such as general historical preservation: lots of old film reels are degrading in archival vaults, and data rot is making lots of data inaccessible due to obsolete equipment and formats (witness how hard it was to read the old Apollo mission videotapes). And of course, the deepfake concerns, AI-based newsfaking, etc. But these are multiple separate Pandora's boxes being spawned, so I'm leaving them out of this thread -- they can make for good vigorous discussion in the Lounge / Offtopic area instead of the Laboratory area.
</PandoraBox>
Without further ado, I'll focus solely on the technology, from the Blur Busters angle:
thatoneguy wrote: ↑20 Sep 2020, 04:54
For some people the AI Created world will always be an approximation and will never feel like the real thing.
Perhaps a Holodeck won't ever be five-sigma perfect, but an eventual objective is to get reasonably close, maybe two-sigma or better.
It may take three or four decades, but the objective is to pass a Holodeck Turing test: an A/B test of clear ski goggles versus a VR headset, where you cannot tell reality apart from virtual, at least for an economically significant percentage of the human population (say, >90%). That requires retina resolution + retina refresh rate + really good AI.
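The pass criterion above can be sketched as a simple chance-level check: if participants guessing "real or VR" are no better than a coin flip, the headset is indistinguishable. This is a minimal illustrative sketch -- the function name and the margin threshold are my own assumptions, not an established test protocol (a rigorous version would use a proper binomial significance test).

```python
# Hypothetical scoring of a "Holodeck Turing test" A/B trial.
# Each participant wears either clear goggles or the VR headset and
# guesses which one they got. If correct guesses hover near chance
# (50%), participants cannot tell reality from VR.

def passes_holodeck_test(correct_guesses, trials, chance_margin=0.05):
    """Pass if the correct-guess rate is within chance_margin of 50%.
    chance_margin is an illustrative threshold, not a standard."""
    rate = correct_guesses / trials
    return abs(rate - 0.5) <= chance_margin
```

So 52 correct guesses out of 100 would pass (indistinguishable from coin-flipping), while 80 out of 100 would fail (people can clearly tell).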
thatoneguy wrote: ↑20 Sep 2020, 04:54
Even if we were to achieve a somewhat feasible interpretation of it in VR that is similar to a 4D cinema experience
I'm not even talking about 4D experiences. I'm talking about being immersed in virtual reality without knowing you're in virtual reality. Many of us now know the various vision/display limitations that currently prevent this from being achievable.
Now, imagine AI generating that quality.
I think that, technologically, it potentially will happen this century -- the question is whether it's year 2040 or 2050 or 2060. It requires a convergence of retina resolution + retina refresh rate, plus the graphics horsepower to keep up (probably with the assistance of frame rate amplification technology). It might also need a realtime renderfarm for batch processing -- preparing unseen worlds beyond view (like around corners), and/or tough tasks such as ultra-fine-granularity global illumination that can be batch-tasked to different GPUs in very massive parallelization -- and some possible divergences away from traditional polygon architectures to make that ultramassive parallelization efficient.
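The frame rate amplification idea can be sketched in miniature: render at half rate, then synthesize an in-between frame by shifting the last rendered frame to match the newest head-tracking sample. This is a deliberately simplified 1D sketch with made-up function names -- real HMD runtimes (e.g. Oculus Asynchronous Spacewarp) warp full 2D frames using depth buffers and motion vectors, not a sideways pixel shift.

```python
# Toy frame rate amplification via reprojection.
# A "frame" is a 1D list of pixel columns; yaw is measured in whole
# pixels of sideways shift. Names and units are illustrative.

def reproject(frame, yaw_delta_px):
    """Shift a rendered frame sideways to approximate a small head turn."""
    if yaw_delta_px == 0:
        return list(frame)
    # Pixels revealed at the edge are unknown; duplicate the border pixel
    # (real reprojection fills these gaps with smarter techniques).
    if yaw_delta_px > 0:
        return frame[yaw_delta_px:] + [frame[-1]] * yaw_delta_px
    return [frame[0]] * (-yaw_delta_px) + frame[:yaw_delta_px]

def amplify(rendered_frames, yaw_deltas):
    """Interleave each rendered frame with one reprojected frame,
    doubling the displayed frame rate (e.g. 45 fps -> 90 fps)."""
    displayed = []
    for frame, yaw in zip(rendered_frames, yaw_deltas):
        displayed.append(frame)
        displayed.append(reproject(frame, yaw))
    return displayed
```

The point of the sketch: the GPU only pays for half the frames, while the display still updates at full rate with fresh head-tracking data -- which is exactly the kind of multiplier a future retina-refresh-rate headset would need.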
____
Although we will likely still have to wear equipment (glasses/headset), there is a theoretical engineering path to an A/B blind test between clear goggles and VR goggles: the person isn't told which they're wearing, and is then asked whether they're in reality or virtual reality when presented with a perfect 6DOF vertigo-sync situation. (That part is already being done today with the Oculus Rift -- standing in a virtual room, leaning to look underneath a table surface, taking a step forward and back, tilting your head.) Obviously, the touch problem will remain (trying to touch virtual objects that don't exist in real life), but perfect-feeling vertigo sync is now mostly a solved problem, at least for a roomwalk without touching objects.
Walking around a room in Oculus VR (like the Oculus trainer app, "First Contact") is MUCH less dizzying than watching a RealD 3D film or sitting in a 4D amusement ride, because vertigo sync is maintained in that Oculus VR app: you tilt your head 2cm, VR tilts 2cm. You look under a desk, it feels realistic. You crawl (physically, on the floor) under a VR desk, it feels realistic -- as long as you're not touching the nonexistent VR desk. Most of the better VR games default to a well vertigo-sync'd mode; Half-Life: Alyx is very well vertigo-sync'd (although you have to teleport to move distances farther than the size of your room), since traditional movement unsynced with real life can inject nausea/dizziness.
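The "tilt your head 2cm, VR tilts 2cm" rule can be stated as code: the virtual camera copies the tracked head pose 1:1, with no scaling or smoothing. A minimal sketch, with my own illustrative function names; a real runtime tracks full 6DOF (orientation as well as position).

```python
# 1:1 vertigo sync, positions only: virtual camera = tracked head
# offset from the play-space origin, applied with scale exactly 1.0.
# Any other scale (or added smoothing/lag) breaks vertigo sync and
# invites nausea.

def camera_pose(head_pos, play_origin):
    """Map a tracked head position (x, y, z in metres) to the virtual
    camera position, 1:1."""
    return tuple(h - o for h, o in zip(head_pos, play_origin))
```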
For me specifically, my current high-end VR headset already feels 100x better than the best 4D amusement ride, at least in a sufficiently big room that allows me to walk around (1:1 VR-to-real-life vertigo sync) instead of teleporting. So perfect vertigo immersion was achieved, but the low Oculus screen resolution + flicker/stroboscopics (90Hz strobed screen) + lens-based artifacts (aberration at the edges) betray that I'm in virtual reality. But if you can (eventually) pretty much retina-out all of that, AND bring the graphics rendering up to match, then the Matrix effect / Holodeck effect can become real -- not knowing whether you're in VR or reality.
Ignore the rollercoaster demos and focus on the immersion -- like being able to walk around on an actual public-transit vehicle, leaning physically to look out of the window (without touching it), and not knowing whether it's real life or virtual reality. You can already roam in VR today by walking around a room. The Oculus App Store uses a comfort rating system with three levels: "Comfortable" (good vertigo sync for most content; even grandma will go wow), "Moderate" (occasional locomotion moments with no vertigo sync to real life), and "Extreme" (VR rollercoasters and other nausea-inducing stuff go here). And many of these headsets will automatically pop up a grid (or fade the real room into view, from the headset's exterior cameras) if you almost walk out of bounds of your assigned VR room space -- preventing you from accidentally crashing into walls or the coffee table. That might temporarily break the immersion, but while staying within the room bounds, you're generally operating in perfect or near-perfect vertigo sync during roomscale operation.
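The pop-up boundary grid behaviour described above can be sketched as a simple distance check: the closer the tracked headset gets to the edge of the configured play area, the more opaque the warning grid becomes. This is a guessed-at sketch of the general idea (function name, rectangle model, and margin value are my assumptions, not Oculus's actual implementation).

```python
# Toy "guardian"-style boundary warning. The play area is modelled as
# an axis-aligned rectangle (x_min, x_max, z_min, z_max) in metres on
# the floor plane; head_x/head_z is the tracked headset position.

def guardian_alpha(head_x, head_z, play_area, warn_margin=0.4):
    """Return warning-grid opacity in [0, 1]: 0 when well inside the
    play area, rising to 1 as the head reaches (or crosses) the edge."""
    x_min, x_max, z_min, z_max = play_area
    # Distance to the nearest boundary wall (negative if outside).
    dist = min(head_x - x_min, x_max - head_x,
               head_z - z_min, z_max - head_z)
    if dist >= warn_margin:
        return 0.0  # fully transparent: immersion unbroken
    return min(1.0, max(0.0, 1.0 - dist / warn_margin))
```

Standing in the middle of a 2m x 2m space yields opacity 0 (no grid, full immersion); the grid fades in only inside the last ~40cm before a wall, which is why the immersion break only happens at the edges of the room.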
If you haven't used a modern 6-degrees-of-freedom VR headset in roomscale walkabout mode -- the most "Star Trek Holodeck" you can get today -- even just an Oculus Rift, or better yet an HTC Vive Pro (sharper than the Rift) on an RTX 2080+ rendering the best-resolution VR immersion demos with 1:1 vertigo sync (physically leaning/jumping/walking around the room = leaning/jumping/walking around the virtual room), you probably haven't yet properly imagined what I'm currently imagining. Walking up to a 4D ride, a demo VR headset, or a Google Cardboard and suddenly getting a nausea-inducing rollercoaster isn't the right way to approach this forum thread.