Keep focused with Magic Leap

Finally! Some more insight into Magic Leap’s work! A new video has appeared on their YouTube channel (without any comment on their blog, etc.)! Their website has also been updated (a jumping whale in the gym and other fun things), so take another look if you haven’t.

The new demo video briefly shows two scenes, claimed to be recorded directly through “Magic Leap technology”: first a flying tin-man robot hovering below a table, and second a solar system floating above a desk.

Three major technologies can be seen here: fairly stable tracking (seen before), real-world knowledge used for occlusion (seen before), and a focus switch from foreground (the augmented solar system) to background (the real world) with rather realistic-looking blur outside the focal plane (not seen before). Cool! Take a look:

The tracking and the occlusion have been seen before – sometimes even better and more stable, though it’s a bit hard to judge from such a brief clip (I want objects glued down to the ground, not floating things!). The new part is the focus change, which leaves out-of-focus areas blurred. This makes the integration of CGI content into the real environment far more convincing… well done! But the real question remains unanswered: what does the actual device look like, and was the re-focus triggered by the user’s eyes (jumping focus between near and far objects)? What does the term “ML technology” really refer to? If it were see-through technology (worn in front of your eyes), a recording of the CGI part alone would not capture the real world. It would only work by mounting a recording camera in front of the HMD (where your eyeball would be)…
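The out-of-focus blur in the video looks like a classic depth-of-field effect. As a rough illustration only (we have no idea how Magic Leap actually renders it), the thin-lens circle-of-confusion formula predicts how large the blur spot of a scene point gets depending on its distance from the focal plane; the aperture and focal-length values below are illustrative guesses loosely matching a human eye, not anything from the demo:

```python
def coc_diameter(depth_m, focus_m, aperture_m=0.005, focal_len_m=0.017):
    """Thin-lens circle-of-confusion diameter (metres) on the image plane.

    depth_m:     distance of the scene point from the lens
    focus_m:     distance the lens is currently focused at
    aperture_m:  aperture diameter (illustrative assumption)
    focal_len_m: lens focal length (illustrative assumption)

    A point exactly at the focus distance yields 0 (perfectly sharp);
    the blur spot grows as the point moves away from the focal plane.
    """
    return (aperture_m * focal_len_m * abs(depth_m - focus_m)
            / (depth_m * (focus_m - focal_len_m)))


# Re-focusing from the near solar system to the far background just means
# changing focus_m, which swaps which depths end up blurred:
near, far = 0.6, 4.0
print(coc_diameter(far, focus_m=near))   # background blurred while focused near
print(coc_diameter(near, focus_m=far))   # foreground blurred while focused far
```

A renderer would map this diameter to a per-pixel blur radius; an eye tracker (if that is what the device uses) would supply `focus_m` from where the user is looking.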

So, the most interesting part remains in utter darkness (as usual). They tease us again with two known tech parts and one snippet of news. Aaargh! Tell us more! :-)