ISMAR – Live Demos to Enjoy Real-World Interaction
Time for the first ISMAR demo summary! Since you can get a pretty good overview of all demos on the ISMAR page itself, it wouldn't make sense to quote them all here, so let's focus. My main interest lay in products using mobile planar tracking for iPads and the like, but also in advances in reconstructing the real world as polygon meshes, e.g. via the Kinect or even via monoscopic camera setups on mobile devices. Social media and social tagging (via facial recognition) were also hip. To show a few, let's start with two racing games:
AR Micromachines: An Augmented Reality Racing Game
Adrian Clark and Thammathip Piumsomboon from the University of Canterbury came up with this concept of AR Micromachines as a table-top car racing game. Real-world obstacles are recognized in real time via a Microsoft Kinect and combined with augmented reality graphics to create a racing game. Users create their own tracks from real objects, placing coffee cups and books, and then race virtual cars head to head with realistic physics, occlusion and shadows:
Again, it's the working concept of truly mixing the realities that makes this demo so great, and another reason why the Kinect (and similar depth sensors) will have such a huge impact on further AR development. This interaction and real-world collision handling is crucial for a good experience! With a Kinect you no longer need stupid markers to define walls or floors (as I still had to in my old AR:Race demo ;-)). Great demo!
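To make the occlusion idea concrete, here is a minimal sketch of my own (not the AR Micromachines code): with a per-pixel depth map from a Kinect-style sensor, a virtual car is only drawn where it is closer to the camera than the real object at that pixel. The function and variable names are illustrative assumptions.

```python
def composite(camera_rgb, kinect_depth, virtual_rgb, virtual_depth):
    """Per pixel: show the virtual content only where it is closer to the
    camera than the real geometry the Kinect sees (depths in metres)."""
    out = []
    for cam, d_real, virt, d_virt in zip(camera_rgb, kinect_depth,
                                         virtual_rgb, virtual_depth):
        if virt is not None and d_virt < d_real:
            out.append(virt)   # virtual content in front: draw it
        else:
            out.append(cam)    # real object in front: it occludes
    return out

# Toy 4-pixel example: a coffee cup at 0.8 m hides the car at 1.0 m behind it.
camera = ["bg", "cup", "bg", "bg"]
real_d = [2.0, 0.8, 2.0, 2.0]
virt   = [None, "car", "car", None]
virt_d = [9e9, 1.0, 1.0, 9e9]
print(composite(camera, real_d, virt, virt_d))  # ['bg', 'cup', 'car', 'bg']
```

The same depth test is what lets the cars drive behind a book instead of floating over it.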
Sketch a race
Another neat iPad-based racing demo comes from our blogger buddy at gamesalfresco, Ori Inbar. Ogmento came up with a sketch-a-race approach to creating your game world: through the iPad camera, the game detects a hand-drawn sketch on paper (any sketch) and accurately overlays a race track on top of it, in real time.
Again, it's the real-world interaction I like most: the winning combination of playful real-world actions yielding a truly augmented experience. Just putting a marker there would again only end in the same boring (well, not yet, but one day…) experience of "oops, there is something on a card that wasn't there before… well… ok". The problem with such demos is often that the real world is just another background for the game; most of the time it would work just as well as a virtual-only game. If you choose to include the real world as a component, you have to make proper use of it and integrate it. That is exactly what happens here when drawing the tracks! :-) Thanks, Ori.
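As a hedged illustration (not Ogmento's actual pipeline), the first step of detecting a hand-drawn track is typically isolating the dark pen strokes from the white paper, e.g. by thresholding the camera image:

```python
def stroke_mask(gray, threshold=128):
    """Binarise a grayscale image (rows of 0-255 intensity values).
    True marks a pixel dark enough to be part of a pen stroke."""
    return [[px < threshold for px in row] for row in gray]

# Tiny example "camera frame": white paper with one dark horizontal stroke.
page = [
    [250, 250, 250, 250],
    [250,  30,  40, 250],
    [250, 250, 250, 250],
]
mask = stroke_mask(page)
print(mask[1])  # [False, True, True, False]
```

A real system would then fit a track centerline through the stroke pixels and register it with the tracked paper pose; this sketch only shows the stroke extraction.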
AR Coloring Book
The coloring book demo from the HITLab, by Adrian Clark and Andreas Dünser (University of Canterbury), shows the same great understanding of true AR; let's quote: "users are able to color in the pages, and these pages are then recognized by the system and used to produce three-dimensional scenes and textured models reflecting the artwork created by the users. This three-dimensional virtual content is then overlaid on the real book pages, providing a three-dimensional experience using the users' own content."
It's just such a great combination for kids: 100% haptics and "traditional" media to be creative with, and then just flick a button to bring those drawings to life in 3D! Same verdict as above: just great! :-)
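For the curious, lifting the coloured drawing off the page can be sketched like this (my own illustration under simplifying assumptions, not the HITLab implementation): once the page corners are tracked, texture coordinates are mapped into the page quad and the camera image is sampled there, so the child's colouring becomes the 3D model's texture.

```python
def quad_point(corners, u, v):
    """Bilinearly map texture coords (u, v) in [0,1]^2 into the page quad.
    corners = (top-left, top-right, bottom-right, bottom-left) as (x, y)."""
    tl, tr, br, bl = corners
    top = (tl[0] + u * (tr[0] - tl[0]), tl[1] + u * (tr[1] - tl[1]))
    bot = (bl[0] + u * (br[0] - bl[0]), bl[1] + u * (br[1] - bl[1]))
    return (top[0] + v * (bot[0] - top[0]), top[1] + v * (bot[1] - top[1]))

def lift_texture(image, corners, size=4):
    """Sample a size x size texture from the detected page region
    (nearest-neighbour; a real system would warp the full image)."""
    tex = []
    for j in range(size):
        row = []
        for i in range(size):
            x, y = quad_point(corners, i / (size - 1), j / (size - 1))
            row.append(image[round(y)][round(x)])
        tex.append(row)
    return tex

# 4x4 "camera image" of colour labels; page corners cover the whole image,
# so the lifted texture is the drawing itself.
image = [
    ["r", "r", "g", "g"],
    ["r", "r", "g", "g"],
    ["b", "b", "w", "w"],
    ["b", "b", "w", "w"],
]
corners = ((0, 0), (3, 0), (3, 3), (0, 3))
print(lift_texture(image, corners))
```

With a tilted page the quad corners would no longer be axis-aligned, and the same bilinear mapping would still pull out an upright texture.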
Audio Augmented Reality
Touching another area of AR, I'd like to briefly mention the demo called "ARA Indoor-Outdoor Navigation for Pedestrians" from Mathieu Razafihamazo, Yohan Lasorsa, David Liodenot, Audrey Colbrant (INRIA), and Jacques Lemordant (UJF-INRIA-LIG):
We have built a high-precision (at the level of a single step) navigation system using path integration, 3D audio cues, and a structured environment representation, which are probably the navigation aids used implicitly by visually impaired people.
Audio guides via sound-based AR are still far too rare. The accuracy was amazing, and I hope we soon get a Google Maps plug-in that overlays audio cues onto our smartphone MP3 tracks during the day…
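The path-integration idea mentioned in the quote can be sketched as simple dead reckoning (my illustration, not the INRIA system): each detected step advances the position by one step length along the current compass heading.

```python
import math

def integrate_path(start, headings, step_length=0.7):
    """Dead reckoning: accumulate one step per detected footfall.
    headings are compass degrees (0 = north/+y, 90 = east/+x);
    step_length in metres is an assumed average stride."""
    x, y = start
    for heading in headings:
        rad = math.radians(heading)
        x += step_length * math.sin(rad)
        y += step_length * math.cos(rad)
    return x, y

# Four steps north, then four steps east:
pos = integrate_path((0.0, 0.0), [0, 0, 0, 0, 90, 90, 90, 90])
print(pos)  # roughly (2.8, 2.8)
```

In practice the drift of such dead reckoning has to be corrected against the structured environment map, which is presumably where the step-level accuracy comes from.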
BurnAR: Feel the Heat
One more… right when entering the demo room, you bumped into a black wall with markers and people in bulky HMDs waving their hands. Intrigued, you got closer and received a short introduction from the guys from the University of South Australia and Graz University of Technology (together with Fairlight):
In this demo, a user will experience their own hands interacting with complex graphics simulating smoke and fire effects in the environment. A user will look through a stereo head-worn display (HWD) at their own hands, which will start to smoke and interact with flames. A real-time fluid simulation calculates the volumetric effects, using the user's hands as input for motion and as an interaction surface. The hands' location and depth are estimated from the stereo view delivered by the HWD's camera pair. Overall, the immersive experience of the user's own body interacting with the striking, high-quality graphics effects creates an exciting demo.
I have to underline the word exciting. This demo – just a demonstrator, not a product – already clearly shows what matters for immersion within the real world: a close relation to the user (within-reach-of-your-hand augmentation) and plausible physical behaviour. The flames rising and reacting only to hands (skin tones) clearly make it a winner's demo! … and indeed they won the demo award! Congratulations to all involved, as named on the ISMAR page: Matt Swoboda (Fairlight), Thanh Nguyen (University of South Australia, Graz University of Technology), Ulrich Eck (University of South Australia), Gerhard Reitmayr, Stefan Hauswiesner, Rene Ranftl (Graz University of Technology), Christian Sandor (University of South Australia).
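As a hedged illustration of the skin-tone reaction (not the BurnAR implementation, which works on the stereo camera feed), a classic per-pixel RGB rule can separate hand pixels from the background so only hands attract flames. The thresholds below follow a well-known simple daylight heuristic:

```python
def is_skin(r, g, b):
    """Classic RGB skin-tone heuristic for daylight images
    (Peer et al.-style thresholds; values are 0-255)."""
    return (r > 95 and g > 40 and b > 20 and
            max(r, g, b) - min(r, g, b) > 15 and
            r - g > 15 and r > b)

print(is_skin(200, 140, 120))  # typical skin tone  -> True
print(is_skin(30, 30, 30))     # dark background    -> False
```

A production system would combine such a colour cue with the stereo depth estimate, so a skin-coloured poster on the wall would not catch fire.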
Thanks to all! To be continued… :-)