Drone360


News & Notes

Making Sense of Intel's RealSense

Yuneec's Typhoon H introduced us to collision avoidance, but how does it work?

August 17, 2016

At CES in Las Vegas, NV, earlier this year, the Yuneec Typhoon H with Intel RealSense Technology received an avalanche of recognition, mainly due to its groundbreaking collision avoidance system. In a live demonstration on the show floor, it detected not only a pillar placed in the center of its flight cage, but also the cage itself: a lattice of cords narrower than the thinnest twig.

Except for the few people developing the Typhoon H, no one realized that the seeds of its success had been sown five years earlier and 550 miles away, by a team working at the Intel Perceptual Computing Group in Santa Clara, CA.

“It was about having the computer be able to sense its environment, so that it could understand the world better and so that people could interact with it in a more intuitive, immersive, and natural way,” recalls Anders Grunnet-Jepsen, chief technology officer and director of the Advanced Technology Group, Perceptual Computing at Intel.

Drones were not high on the list of potential applications for this new technology. It was 2011, and pioneering enthusiasts were creating their own flight management systems for multirotors using parts scavenged from Nintendo Wii controllers. The notion of integrating a GPS receiver to enable position hold or autonomous flight was a cutting-edge concept — the idea that a drone could see and avoid obstacles on its own was laughable.

The fact that so much progress has been achieved in such a short time once again validates the concept of “drone years,” akin to dog years: a single calendar year brings the equivalent of seven years of development in drone tech.

The rapidity of drone tech development can make us complacent — making it seem as though advancements happen organically. However, the achievement of sense and avoid was hardly inevitable. It could not have been predicted with the same mathematical precision that Moore’s Law anticipated the evolution of computer processing power. Instead, it is a perfect example of technological serendipity, of luck being the happy result when preparation meets opportunity.
Start talking sense

RealSense camera technology is one of many sensory technology initiatives at Intel, according to Natalie Cheung, the company’s UAV products manager. Up until January 2016, sensory tech’s most prominent applications were, quite literally, yet to take flight.

“The goal is to add human-like senses to computer systems — ears, eyes, voice, and touch — to provide a natural connection with the user,” Cheung says. “At CES, our CEO Brian Krzanich talked about how this type of technology can be extended to all kinds of new markets: virtual reality, augmented reality, robotics, and smart homes, for example.”

While popular tech like Apple’s Siri and Amazon.com’s Alexa are busy replicating human hearing, the RealSense project deals with sight. Specifically, it focuses on depth perception by providing a computer with the ability to understand a 3D environment — the size, shape, and relative location of objects — in a manner comparable to human vision.

An example of this technology in use is the robotic hospitality butler Relay, made by robotics company Savioke. This service industry robot was demonstrated last year at the Intel Developers Forum in San Francisco, CA.

“The idea is the following: Let’s say that a guest forgot to bring a toothbrush. Instead of the receptionist having to step away from the front desk, they can just give it to the Relay Service Robot, which will take it up to the room,” Cheung says. “RealSense allows it to avoid obstacles, like luggage and housekeeping carts, or even people moving around in the hallway. It can even use the elevator!”

Another example she cites is Memomi, a smart mirror that is currently in use at Neiman Marcus stores.

“You can stand in front of it and change the color of the clothing you are wearing to see how you look in different outfits, without having to actually change clothes,” says Cheung. “RealSense technology not only tracks your position to map the different colors onto your body, you can also use gestures to make those changes.”
Get real

Scientists and engineers at Intel originally anticipated that the RealSense hardware would be built into the frame of a laptop computer or a tablet, where regular, visible-light cameras have found a ubiquitous perch. Consequently, they made the system small — very small.

The board is less than 4 millimeters thick and weighs less than 8 grams.

“We put a real premium on keeping it small and light,” Cheung says.

There is a remarkable amount of technology squished into that tiny package: a pair of calibrated infrared cameras to provide stereoscopic depth perception, a visible light camera, as well as an application-specific integrated circuit (ASIC). That chip takes the raw data provided by the cameras and uses it to create a 3D point cloud representing the objects in its field of view. The cloud is updated at a rate of 60 frames per second, with each frame requiring 300,000 calculations, for a total of 18 million depth-ranging measurements per second.
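The throughput figure quoted above is easy to verify with the numbers the article itself gives:

```python
# Verify the depth-measurement throughput quoted for the RealSense ASIC.
points_per_frame = 300_000   # depth points computed per frame (per the article)
frames_per_second = 60       # point-cloud update rate (per the article)

measurements_per_second = points_per_frame * frames_per_second
print(f"{measurements_per_second:,} depth measurements per second")
# 18,000,000 depth measurements per second
```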

The RealSense module even includes an infrared flashlight that projects a specific pattern of invisible markers to help the system cope with flat walls and other near-featureless surfaces.

According to engineers inside Perceptual Computing at Intel, depth is hard to calculate without reference points, for a camera just as for a human eye: stare at a uniform, featureless surface and you won’t know how far away it actually is.
Active IR Stereo Vision Technology
We humans can perceive depth because our eyes are placed next to each other, but just slightly apart. Close one of your eyes and notice how your depth perception degrades, especially as objects get farther away. The infrared cameras on Yuneec’s Typhoon H perform similarly. The cameras act as eyes, with the left camera providing a slightly different point of view from the right. Utilizing some phenomenal computing power, the software takes over, assessing the digital images and comparing pixels from what one camera sees against those from the other. From the differences between the images, the drone calculates depth and distance.
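The pixel-comparison step described above reduces to classic stereo triangulation: depth is inversely proportional to the disparity between matching pixels in the left and right images. A minimal sketch; the article does not publish the module’s actual optics, so the focal length and baseline below are purely illustrative:

```python
def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Pinhole stereo triangulation: Z = f * B / d.

    focal_px     -- camera focal length, in pixels
    baseline_m   -- separation between the two cameras, in meters
    disparity_px -- horizontal pixel shift of a feature between the two views
    """
    if disparity_px <= 0:
        raise ValueError("feature must appear shifted between the two views")
    return focal_px * baseline_m / disparity_px

# Illustrative numbers only: a 600 px focal length and a 70 mm baseline.
# A feature shifted 10 pixels between the images lies 4.2 m away.
print(depth_from_disparity(600, 0.07, 10))  # 4.2
```

Note how the formula captures the behavior described in the text: the farther the object, the smaller the disparity, which is why depth perception degrades with distance.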
Illustration/Rick Johnson
Outdoors, in sunlight, the RealSense system has an effective range of up to 30 feet. Indoors, the infrared projector’s range is much shorter: between 2 and 12 feet.

“Outside, the range is not limited by the small [infrared] flashlight on the module, because it can use the [infrared] light from the sun to see farther,” explains Grunnet-Jepsen. “However, at shorter ranges — less than 12 feet — it actually performs better indoors than outdoors because that pattern it projects helps the stereo matching and depth measurements.”

Of course, the accuracy of the measurement relies on the mechanical precision of the instrument itself: Specifically, the distance between the two infrared cameras. That value must be known to the tiniest fraction of a millimeter, because it serves as the basis for triangulating the distance between the sensor and the points being mapped in the surrounding environment.
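Because depth scales linearly with the baseline in stereo triangulation (Z = f·B/d), any relative error in the assumed camera separation propagates one-for-one into every depth estimate. A quick sketch of why sub-millimeter precision matters, using illustrative numbers rather than Intel’s actual specs:

```python
# Depth scales linearly with the assumed camera baseline (Z = f*B/d),
# so a baseline error shows up as the same relative error in depth.
# All figures below are illustrative, not Intel's actual specs.
focal_px = 600.0
true_baseline_m = 0.070      # the physically true camera separation
disparity_px = 10.0

true_depth = focal_px * true_baseline_m / disparity_px      # 4.2 m

# Suppose the calibrated baseline is off by just 0.1 mm:
assumed_baseline_m = true_baseline_m + 0.0001
estimated_depth = focal_px * assumed_baseline_m / disparity_px

error_pct = 100 * (estimated_depth - true_depth) / true_depth
print(f"{error_pct:.2f}% depth error from a 0.1 mm baseline error")
```

Even a 0.1 mm miscalibration on a 70 mm baseline skews every measurement by about 0.14 percent, which compounds at the long ranges where disparities are smallest.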

To meet this high standard, each RealSense system is individually calibrated before it leaves the factory. In order to ensure that the critical separation between sensors remains constant, the design incorporates a steel stiffener to which both cameras are affixed. But what if your system still needs adjusting?

“It is possible to use a process called dynamic calibration in the field in order to tweak the calibration constants by a small amount, to account for any mechanical bending that may occur,” Grunnet-Jepsen says. “This can be done without having a large calibration system and lots of known targets. Instead, you launch the app and wave the system around for a few seconds — not unlike when you re-calibrate the magnetometer in your cell phone.”
Drew Halverson
Up in the air

According to Cheung, several years after starting work on RealSense, Intel began to realize the potential value that this technology could contribute to the emerging field of drones.

“We publicly announced that we were getting involved with drones at CES in 2015,” she says. “At the show, we did a collision-avoidance demonstration with Ascending Technologies from Germany. We gave them six [RealSense] cameras and said, ‘Show us what you can do with these.’ ”

The demonstration — which involved a drone flying among people while maintaining a comfortable distance — must have made quite an impression. Intel subsequently purchased Ascending Technologies and did an encore performance on the steps of the U.S. Capitol building in Washington, D.C. Intel has stepped up its participation in the drone industry in other areas as well, joining the Federal Aviation Administration’s recently established Micro UAS Advisory and Rulemaking Committee.

While Intel was making that first public foray into the industry at CES, drone manufacturer Yuneec began searching for a partner to speed the development of its own systems in order to capitalize on the success of its Typhoon Q500.

“We were looking for technology investors,” explains Yuneec USA CEO Shan Phillips. “We wanted to find companies that could validate our technology and help us define our product road map. When you think about Intel’s decades of experience building systems and their desire to be right at the center of the Internet of Things, working with them started to make a lot of sense.”

The companies struck a deal, and Intel invested more than $60 million in Yuneec. The two then combined resources to make the first-ever collision-avoidance system on a consumer drone, just in time to unveil it at CES 2016.
"The goal is to add human-like senses to computer systems — ears, eyes, voice, and touch — to provide a natural connection with the user"
The chips are down

Cheung recalled her first visit to Yuneec’s manufacturing facility in China, located in the city of Kunshan — a distant suburb of the bustling megalopolis of Shanghai. The region is renowned for its lakes and canals, and portions of the city still reflect its ancient heritage. For Cheung, however, it wasn’t the landscape, history, or architecture that impressed her — it was the people.

“They fly drones right outside of the [manufacturing] building,” she says. “It was amazing how well some of those pilots could fly manually. You could just tell right away that they are experts.”

Nearly 2,000 employees work at the factory, which produces more than 1 million drones annually.

Phillips says, “We’ve got everything we need to build drones on site: from surface mount technology to injection molding machines. We can do everything but fabricate the chips ourselves.”

Using RealSense technology to create an effective collision-avoidance system requires more than bolting one of the modules to the front of a drone — the flight management system must be able to interpret the 3D map it creates, along with data from the aircraft’s accelerometers and gyroscopes, GPS antenna, and barometric pressure sensor, and the pilot’s control inputs.

With 300,000 depth points being delivered 60 times per second, the drone needs some substantial onboard computing power just to keep up. That capability arrived in the form of an Intel Atom chip: a small, high-efficiency processor developed specifically for mobile applications like smartphones and tablets.

“On the smart transmitter that ships standard with the Typhoon H — the ST16 — we’ve included a second Atom processor,” says Phillips. “Its integrated graphics capabilities let us do better video management, which is important because we’re providing full 720p HD video right there on the transmitter.”
Intel integration

From China to California, the engineers from Yuneec and Intel quickly came together and formed an effective team, according to Cheung.

“We’ve had some fun times together, but nothing compared with having all of us together in Las Vegas for CES,” says Phillips. “Winning all of those awards together, as a team, was really special to us. It represented the larger community of drones and personal electronics coming together to validate all of our hard work.”

The successful integration of RealSense with the Typhoon H brings forth a question: Is Yuneec planning to reach deeper into Intel’s vast trove of technological marvels to add even more advanced features to future drones?

“Certainly, yes,” Phillips says.

Note: A version of this story appears in Drone360’s September/October 2016 issue, which is available online and hits newsstands on Sept. 6. A photo that appeared in the print story was incorrectly placed; the correct photo appears in this web story.
Featured image: Drew Halverson