Augmenting your Nightly Fears
Today, I’d like to look at Night Terrors, subtitled Augmented Reality Survival Horror. It’s a planned AR game from Novum Analytics, and they are trying to kick it off with a fund-raising campaign for the home stretch.
Their Kickstarter page describes it as “A photorealistic, ultra immersive gaming experience that transforms your home into a terrifying, unfamiliar hellscape,” featuring binaural spatial sound and on-the-fly scanning of your very own home to play inside (to make it more fun: you can only play at night)! A short quote:
The game controls what you see, what you hear, and where you go. Your device’s LED is all the light you get. The camera and microphone feeds are analyzed and processed in real time. Photorealistic elements are added to the camera feed. Audio is spatialized, mixed with the microphone feed, and then routed to the headphones delivering an immersive binaural audio experience.
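To give a feel for what the binaural spatialization in that quote involves, here is a minimal panning sketch in Python. This is purely my own toy illustration, not their audio pipeline: the interaural time/level difference model, the head-radius constant, and the gain curve are all simplifying assumptions.

```python
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s
HEAD_RADIUS = 0.0875     # m, rough average human head radius (assumption)
SAMPLE_RATE = 44100

def spatialize(mono, azimuth_rad):
    """Crude binaural pan via interaural time + level differences.

    azimuth_rad: 0 = straight ahead, positive = source to the right.
    Returns (left, right) channel arrays of the same length as `mono`.
    """
    # Woodworth-style approximation of the interaural time difference
    itd = HEAD_RADIUS / SPEED_OF_SOUND * (azimuth_rad + np.sin(azimuth_rad))
    delay_samples = int(round(abs(itd) * SAMPLE_RATE))

    # Toy level difference: the far ear is attenuated as the source moves aside
    near_gain = 1.0
    far_gain = 0.5 + 0.5 * np.cos(azimuth_rad)  # 1.0 ahead, 0.5 at the side

    # Delay the far-ear signal by the ITD
    delayed = np.concatenate([np.zeros(delay_samples), mono])[: len(mono)]
    if azimuth_rad >= 0:   # source on the right: left ear is far and delayed
        return far_gain * delayed, near_gain * mono
    return near_gain * mono, far_gain * delayed
```

A real engine would use measured HRTFs and room acoustics instead of these two cues, but delay-plus-attenuation already gives a surprisingly strong sense of direction on headphones.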
The story is said to be kept simple (“save the chic and survive”), but the game nonetheless aims to be highly immersive, turning your own home into a nightmare experience with responsive elements that react to the real environment and your actions. Maybe you have already seen the teaser from November; now a longer Kickstarter video has gone live. Let’s watch:
The video looks quite promising, but I’d love to learn more about the technological side of their concept. They state that they mainly focus on four algorithms to enable the experience in dark environments, relying only on the sensors of a mobile device (iPhones at first). They list algorithm names like INVERSE² (on lighting), PRIMES (on positioning and spatial information), 2.5D ARC (on inserted elements), and an otherwise undescribed FLEXTEL.
They make a few simplifying assumptions to help their algorithms (like assuming 90° wall relations in the house), but seem to get pretty far with the standard device sensors. Their “PRIMES” algorithm stands for “Position Ready Inertially Motivated Environment Scanner”: the gyro and other device sensors are exploited to locate the player and to generate a coarse 3D map of the world (without the use of “actual” depth sensors), combined with optical-flow algorithms on the camera feed.
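To illustrate the optical-flow side of such vision-based positioning, here is a toy translational flow estimator in Python. It is a brute-force block match with numpy and nothing more: my own minimal sketch of the general idea, not the actual PRIMES algorithm, which would also have to fuse the inertial readings and handle rotation, scale, and low-light noise.

```python
import numpy as np

def estimate_shift(prev, curr, max_shift=5):
    """Brute-force translational optical flow between two grayscale frames.

    Tries every integer (dx, dy) within max_shift and keeps the one that
    minimizes the mean squared difference on the overlapping region.
    Returns the (dx, dy) by which the scene content moved from prev to curr.
    """
    h, w = prev.shape
    best, best_err = (0, 0), np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            # Overlapping windows of prev and curr under this candidate shift
            a = prev[max(0, -dy):h + min(0, -dy), max(0, -dx):w + min(0, -dx)]
            b = curr[max(0, dy):h + min(0, dy), max(0, dx):w + min(0, dx)]
            err = np.mean((a - b) ** 2)
            if err < best_err:
                best, best_err = (dx, dy), err
    return best
```

In practice one would track many small patches (or use a pyramidal Lucas–Kanade tracker) and let the gyro predict the expected motion, which is exactly where the inertial “IM” part of their acronym would come in.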
Their “2.5D ARC” piece describes the concept of not using rendered 3D objects as the inserted story elements, but rather photographed/recorded real people and props (which I like a lot). These get layered into the camera feed with situation-reactive lighting, aiming for a more convincing feel for the AR story than cheap mobile renderings could deliver. The first edits in the video look very nice, but judge for yourself.
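As a rough illustration of layering a photographed element into a camera frame with scene-reactive lighting, here is a minimal compositing sketch in Python. The brightness-matching heuristic is my own assumption for illustration only, not their 2.5D ARC technique.

```python
import numpy as np

def composite(frame, sprite, alpha, x, y):
    """Alpha-composite a photographed sprite onto a grayscale camera frame,
    roughly matching its brightness to the region it covers.

    frame, sprite: float arrays with values in [0, 1]; alpha in [0, 1].
    (x, y): top-left corner of the sprite in the frame.
    """
    h, w = sprite.shape[:2]
    region = frame[y:y + h, x:x + w]
    # Toy "reactive lighting": scale the sprite toward the local scene brightness
    scale = (region.mean() + 1e-6) / (sprite.mean() + 1e-6)
    lit = np.clip(sprite * scale, 0.0, 1.0)
    out = frame.copy()
    out[y:y + h, x:x + w] = alpha * lit + (1 - alpha) * region
    return out
```

The appeal of the 2.5D approach is visible even in this toy: because the inserted layer is a photograph, all the realism work goes into relighting and blending rather than into rendering geometry.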
I’m very intrigued by the concept. I like the approach of focusing on one (small) game at first that showcases all of the technology. An AR game with vision-based tracking at night is quite challenging, and I wonder how good their combination of algorithms will be in the end. Looking for ghosts or night terrors with a special device is a good match for the story (it explains why you have to carry a hand-held device) – even though it has been done before. I’m looking forward to seeing more of it.
Their approach of really letting the game characters and logic react to your own real space (your house) and your actions is an important step that AR games still have to take! Changing the audio and behavior of enemies (behind walls, in other rooms) as they interact with your real home – it all sounds good…
…but the funding has yet to come. They are aiming for a pretty high $140,000, with 24 days to go as of today. Check it out for sure!
If you are too afraid, you might want to look at the latest family-fun video on the plans from CastAR instead! :-)
On a personal note, I’ll be traveling for a while, so expect the next big update only in early May!
Stay tuned for more AR!