Kniefall for VR – Blending Realities at FMX
The FMX conference, with its tagline "Conference on Animation, Effects, Games and Transmedia", is typically all about the visual effects industry and the technology around "classic" productions. But real-time tools have been showing up for a while now, and especially in the last two years, lots of virtual reality demos. This year VR was even the focus of a complete track. AR was touched on only rarely, but we can surely learn a thing or two for AR, too – while following the path of VR.
Many Vive, Oculus and GearVR demos were shown – many of them, again and still, with glued-on tracking targets for OptiTrack or Vicon systems to let users run around with their positionally untracked mobile VR devices. Many talks circled around VR productions, lessons learned, or the future of storytelling with the new medium. So much virtual reality in the VFX industry – a kniefall (a genuflection) for VR in Stuttgart. Let's take a look at some demos today.
Some of my favorite demos include the untethered multi-player Oculus demo from Artanim, in which you solve a simple task of inserting sticks into a box and walk through a small dungeon (don't step on the wrong stones, Indy!). The demo (video below, a video of the first sci-fi part here) shows nicely how easily VR is understood by first-time users and how immersive it really is when you have no wires attached and simple hand and feet tracking added. People don't dare to run through walls and really duck when appropriate. Shaking hands and passing a torch between players worked flawlessly. We all need bigger (and empty) living rooms in the near future! Multi-player VR can get huge like this. I want to become an unshaved, unwashed addict (like in Ernest Cline's Ready Player One – did I recommend it yet? Read it!).
Another multi-player demo was shown by Ken Perlin and his team from NYU. It has been on the road before, known as Holojam, and shows nicely how people engage in VR together and cooperate (or disturb each other) in the virtual space. The simple painting task is made a lot more fun by the simple avatar representations of oneself and the others. The lightweight GearVR made it definitely more comfortable than the backpacked demo above. We can see where this is all heading! Great!
First Lessons Learned
Of course, the big industry players had their demos as well. While many demos were built on Unreal Engine anyway, Epic showed their great Bullet Train demo again (see a chat with Tim Sweeney about it here, from Tested). For many, it was their first time laying hands on the Oculus Touch controllers. In a talk, Nick Whiting from Epic explained the development of this quick tech demo (ten weeks of overall production) and the lessons learned in detail. Of course it feels bigger than just a tech demo! It's so intuitive and fun to play – you can feel the brain power that was invested. Quick look first:
The demo is obviously a lot of fun, but many trials were made to optimize it and to explore how we can interact with virtual worlds. The controller-focused demo lets us grab and throw things, fire guns and teleport around, and some of the lessons Nick mentioned relate to exactly these topics. For example, teleportation can feel weird in VR if you just appear somewhere else instantly. They helped the user by letting them point-and-click the desired location first (with a time-bending slo-mo so you don't get lost) and only then teleporting: a short fade to white (like blinking) smooths over the cut, and a trail is left behind that you notice on arrival. This especially helps you find your way around (since your facing direction also changes on teleport arrival, turning you toward the center of action).
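The blink-teleport steps described above can be sketched roughly like this. This is my own minimal reconstruction in Python with hypothetical names (`Player`, `blink_teleport`, `look_at_yaw`), not Epic's actual Unreal code: fade out, drop a trail marker at the old spot, move and reorient the player toward the action, then fade back in.

```python
import math

class Player:
    """Toy stand-in for a VR player pawn (hypothetical, for illustration)."""
    def __init__(self):
        self.position = (0.0, 0.0)
        self.yaw = 0.0    # facing direction in degrees
        self.fade = 0.0   # 0 = fully visible, 1 = faded to white
        self.trail = []   # markers left behind at previous locations

def look_at_yaw(src, dst):
    """Yaw (degrees) that makes someone standing at src face dst."""
    dx, dy = dst[0] - src[0], dst[1] - src[1]
    return math.degrees(math.atan2(dy, dx))

def blink_teleport(player, destination, focus_point):
    # 1. Short fade to white, like a blink, to hide the hard cut.
    player.fade = 1.0
    # 2. Leave a trail marker so you can see where you came from on arrival.
    player.trail.append(player.position)
    # 3. Move, and reorient the player toward the center of action.
    player.position = destination
    player.yaw = look_at_yaw(destination, focus_point)
    # 4. Fade back in at the new location.
    player.fade = 0.0
```

In a real engine the fade and the slo-mo targeting would be spread over several frames; here they are collapsed into one call to keep the sequence of steps visible.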
Keeping it simple is key to success right now. Interaction with gamepads or keyboards is so well understood that you think: well, I can handle 10 buttons in VR! But that's just not the case, Nick commented. I found it to be true: while playing Bullet Train you only use three buttons – and yet you can easily get confused, as the button metaphor is not really realistic. Everything feels so natural in the immersion – but the buttons still break it a bit. You need to use one button (and hold it) to grab a gun and the index-finger button to trigger a shot. Here you can easily mix things up and drop the weapon (at least if you don't have practical shooting-range experience, like me). Keep it as easy as possible (maybe until we all become VR pros). The same is true for aiming and shooting – especially throwing rockets or grenades. They cheated a bit to support us: if we suck too hard, helper algorithms kick in to let us have fun and some success. Best quote on it: "The longer you suck, the less you suck!"
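One way such a helper algorithm could work is an aim-assist factor that grows with each consecutive miss and resets once the player lands a hit. This is a toy sketch of the idea in Python, entirely my own construction (the class and parameter names are made up, and Epic's actual implementation is not public in this talk):

```python
class AimAssist:
    """'The longer you suck, the less you suck' – assist grows with misses."""
    def __init__(self, step=0.1, cap=0.6):
        self.step = step      # how much each miss increases the assist
        self.cap = cap        # never fully take over the player's aim
        self.strength = 0.0   # 0 = pure player aim, 1 = fully corrected

    def register_shot(self, hit):
        if hit:
            self.strength = 0.0  # player is doing fine, back off entirely
        else:
            self.strength = min(self.cap, self.strength + self.step)

    def corrected_aim(self, player_aim, perfect_aim):
        # Blend the player's aim toward the ideal aim by the current strength.
        return tuple(p + self.strength * (q - p)
                     for p, q in zip(player_aim, perfect_aim))
```

The cap matters: the point is to nudge struggling players toward fun, not to turn the game into an aimbot.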
Many other minor optimizations help make the experience great: for example, an easy way of picking up objects by extending the collision volume of your hands, or a slight rumble-pack buzz whenever your hands overlap an object. Smoothing out the tracking data for your gun-holding hand makes it more stable – making up for the missing weight of a real gun. These minor adjustments help a lot. If you haven't tried it yet – go for it, to learn and for the gun fun of it!
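The two tricks above are simple to sketch. Here is a minimal Python version with made-up numbers (the `alpha` and `padding` values are my assumptions, not Epic's): an exponential low-pass filter to steady the gun hand, and a padded grab radius so picking things up feels forgiving.

```python
def smooth(previous, raw, alpha=0.2):
    """Low-pass filter: blend only a fraction of each new tracking sample in,
    so small jitters in the raw hand position get damped out."""
    return tuple(p + alpha * (r - p) for p, r in zip(previous, raw))

def can_grab(hand_pos, object_pos, object_radius, padding=0.08):
    """Extend the object's effective collision volume by a padding margin
    (in meters), so a near-miss with the hand still counts as a grab."""
    dist = sum((h - o) ** 2 for h, o in zip(hand_pos, object_pos)) ** 0.5
    return dist <= object_radius + padding
```

The trade-off with the filter is the usual one: a smaller `alpha` means a steadier hand but more lag, which is why it makes sense for the gun hand (where stability reads as weight) more than for fast gestures.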
The last demo (of many others) I want to mention today gave me quite an uncomfortable feeling – which was good. Why good? Because you believe the demo and the immersion. You really feel present in your teleported location and context, and we can see how (big words) mankind can build empathy and gain knowledge about the world through VR. Project Syria from Emblematic is a good example of this.
Be right back for more! :-)