Embedded vision systems improve 3-D mapping capabilities

Embedded vision technology combines known locations with movement tracking so autonomous systems can navigate new and diverse environments, playing an important role in feeding visual and location data to those systems.

By AIA June 16, 2018

Embedded vision is playing an important role in feeding visual and location data to autonomous systems for navigation. The technology combines known locations with movement tracking to autonomously navigate new and diverse environments.

The ability to quickly and accurately map surroundings in three dimensions is a key capability for autonomous vehicles. Whether they’re cars, trucks, mobile robots, drones, or other pilotless aircraft, the ability to see and react to dynamic surroundings is an indispensable component of autonomous operation.

Typically, multiple sensors are used on a single autonomous vehicle, feeding several different streams of visual and location data at the same time. These data streams usually undergo some form of data fusion during processing to handle the sheer volume of available data.
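One common form of data fusion is to weight each sensor's estimate by how reliable it is. The sketch below is a minimal, hypothetical one-dimensional example using inverse-variance weighting; real autonomous vehicles fuse full multi-dimensional state with techniques such as Kalman filtering.

```python
def fuse_estimates(estimates):
    """Fuse (value, variance) pairs into one estimate by inverse-variance
    weighting: lower-variance (more trusted) sensors contribute more."""
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    value = sum(w * v for (v, _), w in zip(estimates, weights)) / total
    # The fused variance is smaller than any single sensor's variance.
    return value, 1.0 / total

# Example: a camera-based position estimate (10.0 m, variance 1.0) fused
# with a noisier wheel-odometry estimate (12.0 m, variance 3.0).
position, variance = fuse_estimates([(10.0, 1.0), (12.0, 3.0)])
# The result is pulled toward the more trusted camera estimate.
```

The key property is that adding sensors never increases the fused variance, which is why autonomous vehicles carry several overlapping sensing modalities.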

For this to be possible, vision systems must be able to construct a map of the environment while simultaneously locating the vehicle within the map. This process is called simultaneous localization and mapping (SLAM).
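The two halves of SLAM can be illustrated with a toy loop: the vehicle updates its own pose from odometry (localization) while registering landmark observations, given as range and bearing relative to the vehicle, into a shared world map (mapping). All function and variable names below are hypothetical; a real visual SLAM pipeline adds feature matching, loop closure, and probabilistic optimization on top of this basic idea.

```python
import math

def integrate_odometry(pose, distance, turn):
    """Advance an (x, y, heading) pose by driving `distance`, then turning
    `turn` radians -- the 'localization' half of the loop."""
    x, y, theta = pose
    return (x + distance * math.cos(theta),
            y + distance * math.sin(theta),
            theta + turn)

def landmark_world_position(pose, rng, bearing):
    """Convert a range/bearing observation, measured relative to the
    vehicle, into world coordinates using the current pose estimate."""
    x, y, theta = pose
    return (x + rng * math.cos(theta + bearing),
            y + rng * math.sin(theta + bearing))

def run_slam(odometry, observations):
    """Build a landmark map and a trajectory estimate simultaneously."""
    pose = (0.0, 0.0, 0.0)          # start at the origin, facing +x
    trajectory = [pose]
    landmark_map = {}               # landmark id -> world position
    for (distance, turn), seen in zip(odometry, observations):
        pose = integrate_odometry(pose, distance, turn)
        trajectory.append(pose)
        for landmark_id, rng, bearing in seen:
            landmark_map[landmark_id] = landmark_world_position(pose, rng, bearing)
    return trajectory, landmark_map

# Example: drive straight ahead 1 m twice, observing landmark "A"
# 2 m ahead, then 1 m ahead -- both observations place "A" at x = 3 m.
traj, lmap = run_slam(
    odometry=[(1.0, 0.0), (1.0, 0.0)],
    observations=[[("A", 2.0, 0.0)], [("A", 1.0, 0.0)]],
)
```

Note the mutual dependence that gives SLAM its name: landmark positions are computed from the pose estimate, and in a full system the landmarks would in turn be used to correct the accumulating pose error.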

Visual SLAM technology is a relatively new but sophisticated method of 3-D mapping for autonomous vehicles, with key advantages over GPS and other systems. Embedded vision systems capable of high-speed streaming and processing enable the visual SLAM capabilities that are advancing 3-D mapping for autonomous vehicles.

Embedded vision and SLAM technology deliver autonomous operation in certain applications, providing a significant advantage. In logistics, for example, mobile robots were traditionally automated guided vehicles (AGVs), which were expensive to integrate into warehouses because they required some form of external guidance.

Now, embedded vision and SLAM technology enable autonomous mobile robots (AMRs) that need no external guidance because they generate 3-D maps of the warehouse as they move. This drastically lowers integration costs and makes these systems more flexible in adapting to changes in the flow of goods.

Embedded vision and SLAM technology take mobile robots to a new level of autonomy, presenting many benefits for industrial businesses, as well as a technological leap forward for mobile robots.

Embedded vision is a key component of autonomous vehicles. From cars to drones and robots, embedded vision is playing an increasingly central role in 3-D mapping for safe, reliable navigation in new and diverse environments.

This article originally appeared on the AIA website. The AIA is a part of the Association for Advancing Automation (A3). A3 is a CFE Media content partner. Edited by Chris Vavra, production editor, Control Engineering, cvavra@cfemedia.com.


Original content can be found at www.visiononline.org.
