Here’s what Project Tango isn’t. It isn’t an advertising strategy for an orange-flavoured fizzy drink. Nor is it anything to do with learning Latin-American ballroom dancing. Instead, it’s an ongoing commitment by Google to develop “a mobile device that can see how we see.”
This means building software that brings together motion-tracking data, area learning and depth perception. It means having hardware that can sense movement and orientation (using an accelerometer and gyroscope), plus multiple cameras that can perceive depth (calculating distance, size and surface textures) to map an area in 3D.
This high-tech combination will allow a smartphone or tablet equipped with Project Tango to understand ‘where’ it is in any given environment and how it moves through it. By tracking orientation and motion, a Project Tango device can do some amazing things — enhancing augmented reality applications and precisely measuring and scanning its environment.
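To get a feel for the inertial half of that tracking, here's a deliberately simplified sketch of dead reckoning: integrating gyroscope readings to track heading, then using that heading to turn accelerometer readings into movement. This isn't Tango's actual algorithm — the real system fuses inertial data with camera feature tracking to correct drift — and all the sample values are invented for illustration.

```python
import math

DT = 0.01  # 100 Hz sample interval, in seconds (assumed)

def integrate(samples):
    """samples: list of (gyro_z_rad_per_s, accel_forward_m_per_s2) pairs
    for a simplified device moving in a 2D plane."""
    heading = 0.0   # orientation in radians
    vx = vy = 0.0   # velocity in the world frame, m/s
    x = y = 0.0     # position in the world frame, m
    for gyro_z, accel in samples:
        heading += gyro_z * DT            # integrate gyro -> orientation
        ax = accel * math.cos(heading)    # rotate body-frame acceleration
        ay = accel * math.sin(heading)    #   into the world frame
        vx += ax * DT                     # integrate acceleration -> velocity
        vy += ay * DT
        x += vx * DT                      # integrate velocity -> position
        y += vy * DT
    return x, y, heading

# One second of constant forward acceleration with no turning:
x, y, heading = integrate([(0.0, 1.0)] * 100)
```

Because each step integrates the previous step's error, raw dead reckoning drifts quickly — which is exactly why Tango pairs these sensors with cameras that re-anchor the pose against visual features.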
The first wave of Project Tango hardware has been designed specifically for developers to play with. The Intel RealSense Smartphone Developer Kit, for example, is an Android smartphone with a 6-inch QHD (2560×1440) display, Intel Atom x7-Z8700 processor and 64GB of internal storage. It looks like a conventional phone and, like many conventional handsets, it features an 8MP rear camera and a 2MP front-facing lens.
But you’ll also find a ZR300 Intel RealSense camera package squeezed inside it. This combines a camera for computing high-density depth (more than 10 million points per second) with a wide field-of-view VGA camera (a FOV in excess of 160 degrees). These work together with a high-precision accelerometer and gyroscope to enable the motion and feature tracking that Project Tango is aiming for.
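The step from per-pixel depth readings to a 3D map rests on standard pinhole-camera back-projection. The sketch below shows the idea; the intrinsics (focal lengths `FX`/`FY` and principal point `CX`/`CY`) are invented placeholder values, not the ZR300's real calibration.

```python
FX, FY = 580.0, 580.0   # focal lengths in pixels (assumed values)
CX, CY = 320.0, 240.0   # principal point for a 640x480 sensor (assumed)

def pixel_to_point(u, v, depth_m):
    """Back-project pixel (u, v) with a measured depth in metres to an
    (X, Y, Z) point in the camera's coordinate frame."""
    x = (u - CX) * depth_m / FX
    y = (v - CY) * depth_m / FY
    return x, y, depth_m

def depth_image_to_cloud(depth_rows):
    """depth_rows: 2D list of depths in metres (0 means no reading).
    Returns a point cloud as a list of (X, Y, Z) tuples."""
    return [pixel_to_point(u, v, d)
            for v, row in enumerate(depth_rows)
            for u, d in enumerate(row)
            if d > 0]
```

Run at ten million points per second, this simple transform is what turns a stream of depth frames into the dense 3D geometry that Tango's area learning builds on.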
Developers have been busy exploring Project Tango’s potential. Early ideas include outdoor navigation apps for the visually impaired, mixed-reality filmmaking, virtual pets and real-time construction site scanning, while the Elementals Home AR Designer lets you try out virtual furniture via an augmented reality view of your home.
There are gaming opportunities too. Project Tango could give mobile-powered systems like Google Cardboard and Gear VR the ability to track and map the position of their users, much like high-end virtual reality rigs do. A Project Tango plugin for Unreal Engine 4 already allows developers to experiment with spatial awareness.
In 2014, NASA even rocketed two Google Tango devices to the International Space Station, where they were trialled in football-sized autonomous robots called SPHERES. The aim? To research whether the 3D modelling and mapping functionality of Tango could help a robot flyer learn and navigate its surroundings.
We’ve already seen how Intel RealSense camera technology is giving drones self-flying abilities and is helping fashion designers create made-to-measure clothing. Project Tango gives us a glimpse of what the technology can do inside the smartphones of the future.