Fully Packaged Telepresence Future
Hi everybody,
if you feel like taking a look at another ongoing research project exploring a possible telepresence future, this is the right one for you to check out. It combines head-mounted display presentation, Kinect-based 3D scanning for capturing the remote person, and the ability to switch from partial to full immersion – adding remote elements in AR, or immersing oneself 100% in VR (or rather: in RR, remote reality). The concept looks promising, combining several technologies, though the video showing the intermediate results is a bit shaky (I would have liked to see the remote person on a beach, where I could immerse myself, too!). :-)
The corresponding paper, “General-Purpose Telepresence with Head-Worn Optical See-Through Displays and Projector-Based Lighting”, will appear at IEEE Virtual Reality 2013 in Orlando, FL, in March. It was written and coded by people from the University of North Carolina at Chapel Hill and Shanghai Jiao Tong University.
An interesting idea is to use a projector system to achieve better opacity for virtually added objects (which might otherwise appear translucent when presented via an optical see-through HMD): “… Projector-based lighting control is used to illuminate only local objects that are not occluded by remote objects, allowing those virtual objects to appear opaque. Kinect-based depth sensing also supports occlusion of virtual objects by real objects — only parts of virtual objects closer than real objects are drawn.”
The second part of the video shows this more clearly.
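To make the depth-based occlusion idea concrete, here is a minimal sketch (not the authors' code) of how such per-pixel compositing could work: a virtual pixel is drawn only where its depth is closer to the viewer than the real-scene depth measured by a Kinect-style sensor. All array names and the function are hypothetical illustrations.

import numpy as np

def composite_with_occlusion(real_rgb, real_depth, virtual_rgb, virtual_depth):
    """Overlay virtual content on the real view, hiding virtual pixels that lie
    behind real objects. Depths are in meters; 0 marks 'no virtual content'."""
    # Virtual content is visible only where it exists and is nearer than the real scene.
    visible = (virtual_depth > 0) & (virtual_depth < real_depth)
    out = real_rgb.copy()
    out[visible] = virtual_rgb[visible]
    return out

# Toy example: a 2x2 view where only one virtual pixel ends up in front of the real scene.
real_rgb = np.zeros((2, 2, 3), dtype=np.uint8)
real_depth = np.array([[1.0, 1.0], [2.0, 2.0]])
virtual_rgb = np.full((2, 2, 3), 255, dtype=np.uint8)
virtual_depth = np.array([[0.5, 1.5], [0.0, 3.0]])  # 0.0 = no virtual content here
print(composite_with_occlusion(real_rgb, real_depth, virtual_rgb, virtual_depth))

The projector-based lighting in the paper addresses the complementary problem: since an optical see-through display cannot blacken real light, the room illumination itself is controlled so that real surfaces behind virtual content stay dark, letting the virtual objects appear opaque.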
Looking forward to more of this! … and to the virtual beach!