Embedded sensing in autonomous navigation

While autonomous navigation is still in its early stages, there have been many technological breakthroughs.

By AIA August 17, 2018

Autonomous navigation is difficult to achieve, and dozens of companies are racing to lead in this technology. While no single approach has yet proven most effective, many companies are converging on a similar set of technologies for autonomous navigation.

Whether it’s drones, boats, military vehicles or consumer vehicles, the world is full of unpredictable variables, and any autonomous vehicle has to account for them without pre-programming. This requires combining a number of different types of sensors and data streams, which is currently one of the biggest challenges in autonomous navigation.

A typical autonomous vehicle has to estimate its own position, calculating acceleration and angular rate, while mapping or sensing the environment around it. This requires layered sensing networks with machine vision and lidar technology at the heart of it all. However, there are other streams of data that must be integrated for true autonomy.
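
To make the position-estimation piece concrete, the sketch below dead-reckons a simple 2-D pose by integrating angular rate into heading and acceleration into speed and position. It is a minimal illustration rather than any vendor's implementation; the sample rate and IMU readings are invented for the example.

```python
import math

# Minimal 2-D dead-reckoning sketch: integrate angular rate into heading,
# then integrate acceleration into speed and position. All values are invented.

def dead_reckon(samples, dt):
    """samples: list of (forward_accel_m_s2, yaw_rate_rad_s) tuples."""
    x = y = heading = speed = 0.0
    for accel, yaw_rate in samples:
        heading += yaw_rate * dt                 # angular rate -> heading
        speed += accel * dt                      # acceleration -> speed
        x += speed * math.cos(heading) * dt      # speed + heading -> position
        y += speed * math.sin(heading) * dt
    return x, y, heading

# Hypothetical IMU stream at 100 Hz: gentle acceleration with a slow left turn.
imu_samples = [(0.5, 0.02)] * 200
print(dead_reckon(imu_samples, dt=0.01))
```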

Many autonomous vehicles depend heavily on GPS, which is fused with data from the accelerometers, gyroscopes, and magnetometers in on-board inertial measurement units (IMUs) for accurate position estimation. Microelectromechanical systems (MEMS) technology has made these sensors small and powerful enough to embed within a wide range of vehicles for autonomous navigation.
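
As a rough sketch of how GPS and IMU data can be combined, the example below propagates a position estimate with IMU-derived displacement and nudges it toward a GPS fix whenever one arrives. The gain and measurement values are assumptions for illustration; production systems typically rely on Kalman-filter-based fusion.

```python
# Sketch of blending a drift-prone IMU estimate with occasional GPS fixes
# using a simple fixed-gain correction (a stand-in for the Kalman filters
# typically used in practice). All numbers here are illustrative.

def fuse(position, imu_delta, gps_fix, gain=0.2):
    """Propagate the estimate with the IMU displacement, then pull it toward GPS."""
    position += imu_delta                 # prediction from inertial sensing
    if gps_fix is not None:               # correction when a fix is available
        position += gain * (gps_fix - position)
    return position

position = 0.0
# (IMU-integrated displacement, GPS fix or None) per cycle; fixes arrive sparsely.
stream = [(1.0, None), (1.1, None), (1.2, None), (1.0, 4.0), (1.1, None)]
for delta, fix in stream:
    position = fuse(position, delta, fix)
    print(f"fused position: {position:.2f} m")
```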

A real-world example of embedded sensing layers for autonomous navigation

An autonomous mobile robot (AMR) for use in warehouses leverages a lidar scanning range finder that uses time-of-flight measurement to locate objects up to 25 meters away. A host CPU combines data from an IMU and polar positional coordinates from the lidar to generate an accurate 2-D map of the surrounding environment. A short-range 3-D imaging sensor is also deployed for better visualization outside the lidar’s field of view. The data from the IMU, lidar, and wheel encoders are then combined to create a comprehensive map that allows for 3-D autonomous navigation.
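
The mapping step described above can be sketched as follows: lidar returns arrive as polar coordinates (range and bearing), and the host CPU places them into a world-frame map using the robot pose estimated from the IMU and wheel encoders. The pose, scan values, and function names below are hypothetical, intended only to show how the data streams fit together.

```python
import math

# Sketch of the mapping step: convert lidar range/bearing returns (polar
# coordinates) into world-frame 2-D points using the robot pose estimated
# from the IMU and wheel encoders. Pose and scan values are hypothetical.

def scan_to_map_points(pose, scan, max_range=25.0):
    """pose: (x, y, heading); scan: list of (range_m, bearing_rad) returns."""
    x, y, heading = pose
    points = []
    for rng, bearing in scan:
        if rng > max_range:                    # drop returns beyond sensor range
            continue
        angle = heading + bearing              # rotate the beam into the world frame
        points.append((x + rng * math.cos(angle),
                       y + rng * math.sin(angle)))
    return points

# Hypothetical fused pose and a three-beam scan; the last return exceeds 25 m.
pose = (2.0, 1.0, math.pi / 2)
scan = [(5.0, -0.1), (5.2, 0.0), (30.0, 0.1)]
print(scan_to_map_points(pose, scan))
```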

There are many real-world examples of embedded sensing layers for autonomous navigation, but the one above highlights well how the various data streams interact and are combined. Autonomy is easier in a warehouse than on the road because a warehouse is a far less dynamic environment than the outdoors; autonomous vehicles that operate outside must compensate for that variability with more advanced sensing systems.

Autonomous navigation is difficult. It requires a complex system of embedded sensing layers and data fusion to create intelligible information for a robot or vehicle to move on its own. While autonomous navigation is still in its early stages, there have been many promising breakthroughs.

This article originally appeared on the AIA website. The AIA is a part of the Association for Advancing Automation (A3). A3 is a CFE Media content partner. Edited by Liam Johnson, content specialist, CFE Media.

Original content can be found at www.visiononline.org.