How the RealSense 3D camera works (and some clever things you can do with it)


The RealSense 3D camera is arguably Intel’s most exciting and intriguing technology. The compact camera system, which can scan objects in 3D, bring onscreen avatars to life and let photos be refocused after they’ve been taken, is the product of over three years of R&D.

It has the potential to change the way computers see the world.

The RealSense camera itself isn’t a single lens but three: a conventional lens, an infrared lens, and an infrared laser projector. Together with onboard image-processing hardware, these components capture a 3D image, measuring the distance to each point on any objects that fall within the camera’s multiple fields of view.
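
To give a rough sense of what that depth data looks like to a developer, here’s a minimal sketch using the open-source pyrealsense2 bindings (an illustrative assumption on my part; it isn’t necessarily the SDK behind the demos described here). It starts the depth stream and reads the distance to whatever sits under the centre pixel.

```python
# Minimal depth-read sketch using the open-source pyrealsense2 bindings.
# This illustrates the idea of a per-pixel depth image; it is not the exact
# software stack described in the article.
import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
# Ask for a 640x480 depth stream at 30 frames per second.
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
pipeline.start(config)

try:
    frames = pipeline.wait_for_frames()   # block until a frameset arrives
    depth = frames.get_depth_frame()      # the infrared-derived depth image
    # Distance (in metres) to whatever sits under the centre pixel.
    centre_distance = depth.get_distance(320, 240)
    print(f"Object at image centre is {centre_distance:.2f} m away")
finally:
    pipeline.stop()
```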

The Intel RealSense camera technology is small enough to fit inside a laptop or a tablet.

Dedicated software makes sense of what the RealSense camera sees. It works by building a 3D model of whatever the camera is trying to track, then comparing that model to what the camera is currently seeing. The software that Intel has developed is so precise that it can model and track the 3D positions of 22 joints in a human hand.
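
Intel hasn’t published that tracking algorithm here, but the “model it, then compare” idea the article describes is essentially analysis by synthesis. The toy sketch below (plain NumPy, with a made-up one-parameter “hand”) only illustrates the shape of such a loop: render candidate poses, score each against the observed depth image, keep the best fit.

```python
# Toy illustration of tracking by model fitting ("analysis by synthesis").
# The real RealSense software fits a full 22-joint hand model; here a single
# made-up parameter (the "openness" of a fake hand) stands in for the idea.
import numpy as np

def render_depth(openness, size=64):
    """Fake renderer: produce a depth image for a given hand 'openness'."""
    y, x = np.mgrid[0:size, 0:size]
    radius = 10 + 20 * openness                 # wider hand = bigger blob
    blob = ((x - size / 2) ** 2 + (y - size / 2) ** 2) < radius ** 2
    depth = np.full((size, size), 1.0)          # background at 1 m
    depth[blob] = 0.5                           # hand at 0.5 m
    return depth

# Pretend this came from the camera: a hand that is 70% open.
observed = render_depth(0.7)

# Compare candidate model poses against the observation, keep the best fit.
candidates = np.linspace(0.0, 1.0, 21)
errors = [np.mean((render_depth(c) - observed) ** 2) for c in candidates]
best = candidates[int(np.argmin(errors))]
print(f"Estimated openness: {best:.2f}")        # ~0.70
```

A real tracker fits a full articulated hand with many joint parameters every frame, but the compare-and-refine loop has the same shape.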

RealSense can track gestures and recognise speech

At the Intel Future Showcase 2015, RealSense technology turned the HP Sprout from a basic all-in-one PC into a dual-touch, gesture-responsive, 2D/3D-scanning Windows 8 machine. With no keyboard or mouse, the HP Sprout offers you a different way to work.

RealSense is already starting to appear in laptops from Dell, Asus, Acer and Lenovo. But built into a tablet like the Dell Venue 8 7840, a RealSense camera can digitally scan anything, including me, as you’ll see in the video below.

Using a RealSense-equipped tablet, Intel Product Manager and Marketing Manager Scott Dwyer was able to capture a 3D model of my upper body. He did this by walking around me with the 3D camera, recording me from all sides to build up a complete 3D image.

Digitise your face with 3DMe and become a video game character

This unprocessed data is impressive on its own, if a little rough around the edges. But RealSense is more than just powerful hardware. The software Intel has developed to make the most of it is the technology’s secret sauce. In this case, it refines the raw 3D model, intelligently filling in any blanks and smoothing out jagged edges.
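
Intel doesn’t detail that clean-up pipeline, but the steps described (dropping stray points, filling blanks, smoothing jagged edges) map onto standard point-cloud processing. Here’s a hedged sketch using the open-source Open3D library, assuming the raw scan has been exported as a hypothetical file called scan.ply.

```python
# Sketch of the kind of clean-up the article describes, using the open-source
# Open3D library (an assumption; not Intel's actual RealSense software).
import open3d as o3d

# Load a raw scan captured by walking around the subject (hypothetical file).
pcd = o3d.io.read_point_cloud("scan.ply")

# Drop stray "rough around the edges" points far from their neighbours.
pcd, _ = pcd.remove_statistical_outlier(nb_neighbors=20, std_ratio=2.0)

# Estimate surface normals, then reconstruct a watertight mesh; Poisson
# reconstruction naturally fills small holes left by the scan.
pcd.estimate_normals()
mesh, _ = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(pcd, depth=8)

# A few smoothing passes take the jagged edges off the reconstructed surface.
mesh = mesh.filter_smooth_simple(number_of_iterations=3)
o3d.io.write_triangle_mesh("scan_cleaned.ply", mesh)
```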

This 3DMe scanning technology can be used to create an onscreen avatar that moves and responds in real time as your face moves. It could even be used to digitise your face and graft it onto a video game character, etch it in glass, or 3D print it to customise an action figure. Check out the process in the video below.

If these examples feel like gimmicks, then consider the wider possibilities of a camera that can map and model its surroundings over 200 times every 0.02 seconds. According to Scott Dwyer, a RealSense camera is also capable of scanning an environment and digitally augmenting it in real time.

RealSense has the potential to transform computing

Think of digital characters that run across your tabletop, or a new game level procedurally generated from the room you’re standing in. Or how about pointing a RealSense camera at something you want to fix, like a car engine: the camera could potentially model and map it, overlaying the names of its various parts so you can see what you’re working with.
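
The geometry underpinning that kind of overlay is back-projection: a depth pixel plus the camera intrinsics gives a point in 3D space where a label could be anchored. Here’s a small sketch, again using the open-source pyrealsense2 bindings as an illustrative stand-in for whatever Intel’s own demos use.

```python
# Back-project a depth pixel into a 3D point, the basic operation behind
# mapping a room or pinning a label on, say, an engine part.
# Illustrative sketch with the open-source pyrealsense2 bindings.
import pyrealsense2 as rs

pipeline = rs.pipeline()
pipeline.start()  # default configuration, depth stream included

try:
    frames = pipeline.wait_for_frames()
    depth = frames.get_depth_frame()
    intrinsics = depth.profile.as_video_stream_profile().get_intrinsics()

    u, v = 320, 240                         # a pixel of interest
    distance = depth.get_distance(u, v)     # metres along that pixel's ray
    # Turn the pixel plus its depth into an (X, Y, Z) point in camera space.
    point = rs.rs2_deproject_pixel_to_point(intrinsics, [u, v], distance)
    print(f"Pixel ({u}, {v}) maps to the 3D point {point}")
finally:
    pipeline.stop()
```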

Again, RealSense’s almost magical abilities require a combination of cutting-edge hardware and powerful software. The challenge is getting both to work together successfully outside the lab. After all, everyone remembers the problems Microsoft had with its Kinect system: the reality was often far more imprecise and underwhelming than the early tech demos suggested.

But get it right and RealSense could transform computing, ushering in an age of gesture-controlled PCs and a new generation of robots that can finally comprehend, navigate and interact with their surroundings. — Dean Evans (@evansdp)
