Helping autonomous cars see with embedded vision

Sensor companies are creating a detection system that can sense a vehicle’s environment better than human eyesight through a combination of camera, radar, and light detection and ranging (LiDAR) sensors.

By AIA August 30, 2019

If autonomous cars are going to drive better than humans, they must use embedded vision to see better than humans. Developers are creating a detection system that can sense a vehicle’s environment better than human eyesight through a combination of camera, radar, and light detection and ranging (LiDAR) sensors. LiDAR is a remote sensing method that uses light in the form of a pulsed laser to measure ranges.

Working together, these diverse sensors can verify that a detection is accurate. Camera, radar, and LiDAR give the car a view of its surroundings and also detect the speed and distance of nearby objects. Inertial measurement units (IMUs) track the vehicle’s acceleration and position.
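As a rough illustration of how an IMU’s acceleration readings feed a position estimate, the sketch below integrates acceleration twice over time. This is a simplified, hypothetical example; real systems fuse IMU data with GPS and other sensors to correct for drift, and the function and variable names here are illustrative only.

```python
# Minimal dead-reckoning sketch: integrate IMU acceleration readings to
# estimate velocity and position along one axis. Hypothetical example;
# production systems fuse IMU data with other sensors to correct drift.
def dead_reckon(accel_samples, dt, v0=0.0, x0=0.0):
    """accel_samples: accelerations in m/s^2 sampled at a fixed interval dt (s)."""
    v, x = v0, x0
    for a in accel_samples:
        v += a * dt  # velocity from acceleration
        x += v * dt  # position from velocity
    return v, x

# Example: constant 2 m/s^2 acceleration for 1 s (100 samples at 10 ms)
v, x = dead_reckon([2.0] * 100, dt=0.01)
print(f"velocity ~ {v:.2f} m/s, position ~ {x:.2f} m")
```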

Cameras provide a visual

Cameras are the most accurate way for an embedded vision system to capture a visual representation of the world. Autonomous cars use cameras placed on every side: front, rear, left, and right. The images from these cameras are combined into a 360-degree view of the environment.

Some cameras use wide-angle lenses with a shorter range, while others offer a narrower field of view capable of long-range visuals. Fish-eye cameras provide panoramic views and are often used to help the vehicle park itself.

Vehicle manufacturers often use a series of CMOS imaging sensors that produce images of 1 to 2 megapixels; most use relatively inexpensive 2-D cameras, but some are incorporating 3-D cameras as well. Embedded vision systems need sensors with a high dynamic range of more than 130 dB to ensure a clear image in all conditions, including direct sunlight.
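To give a sense of scale, that dynamic range figure can be converted from decibels to a ratio between the brightest and darkest signals the sensor can resolve. A minimal sketch of the conversion, assuming the 20·log10 convention commonly used for image sensors:

```python
import math

def db_to_ratio(db):
    """Convert a dynamic range in decibels to a brightest:darkest signal ratio
    (20*log10 convention commonly used for image sensors)."""
    return 10 ** (db / 20)

print(f"{db_to_ratio(130):,.0f} : 1")  # roughly 3,162,278 : 1
```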

Radar sensors extend capabilities

Radar sensors are used to supplement camera vision in low-visibility conditions, such as night driving or poor weather. Radar transmits radio waves in pulses; those waves bounce off an object and return to the sensor, providing information about the object’s speed and location.
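A rough sketch of the physics behind those two measurements: range follows from the round-trip time of the pulse, and radial speed follows from the Doppler shift of the returned wave. The function names and sample numbers below are illustrative, not taken from any particular radar; 77 GHz is simply a typical automotive radar carrier frequency.

```python
C = 299_792_458.0  # speed of light, m/s

def range_from_round_trip(t_seconds):
    """Distance to the object: the pulse travels out and back, so halve the path."""
    return C * t_seconds / 2

def speed_from_doppler(freq_shift_hz, carrier_hz=77e9):
    """Radial speed from the Doppler shift of a typical 77 GHz automotive radar carrier."""
    return freq_shift_hz * C / (2 * carrier_hz)

print(range_from_round_trip(4e-7))  # ~60 m away
print(speed_from_doppler(5133))     # ~10 m/s closing speed
```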

Radar sensors also usually surround the car to detect objects at all angles. They can detect speed and distance, but they can’t determine what kind of object they have detected. This is where another sensor used in embedded vision systems comes into play.

LiDAR provides a 3-D view

Camera and radar are included on most new vehicles, where they support advanced driver assistance and park assist technologies. They can also offer some level of autonomy as long as a human remains in the driver’s seat and in control of the vehicle.

For a completely autonomous car, the embedded vision system also uses LiDAR. LiDAR can provide a 3-D view of the environment, with the added benefit of working well in low-light conditions. It detects the shape and depth of surrounding cars, pedestrians, and road geography. Only a few LiDAR sensors are needed for an effective embedded vision system.
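The 3-D view comes from combining each laser pulse’s measured range with the direction it was fired in. Below is a minimal sketch of that conversion from range plus beam angles to a Cartesian point; the angle convention is illustrative, not specific to any LiDAR product.

```python
import math

def lidar_point(range_m, azimuth_deg, elevation_deg):
    """Convert one LiDAR return (range + beam direction) to an x, y, z point
    relative to the sensor. Illustrative convention: azimuth measured in the
    horizontal plane, elevation measured above it."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = range_m * math.cos(el) * math.cos(az)
    y = range_m * math.cos(el) * math.sin(az)
    z = range_m * math.sin(el)
    return x, y, z

print(lidar_point(25.0, azimuth_deg=30.0, elevation_deg=2.0))
```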

Like the human brain, an autonomous car’s embedded vision system must take in the data from all of these sensors and interpret the information. Having different types of sensors provides redundancy and overlap that act as a fail-safe, allowing the system to respond in real time.
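As a toy illustration of that redundancy, the sketch below cross-checks the distance to an object as reported by camera, radar, and LiDAR and flags disagreement. It is a deliberately simplified, hypothetical example; real fusion stacks rely on probabilistic filtering rather than simple averaging.

```python
def fuse_distance(camera_m, radar_m, lidar_m, tolerance_m=1.0):
    """Cross-check three independent distance estimates. If they agree within
    tolerance, return their average; otherwise fall back to the nearest
    reading so the planner can act conservatively. Simplified, hypothetical logic."""
    readings = [camera_m, radar_m, lidar_m]
    if max(readings) - min(readings) <= tolerance_m:
        return sum(readings) / len(readings), True  # fused estimate, sensors agree
    return min(readings), False                     # fail-safe: assume the nearest

estimate, agree = fuse_distance(24.3, 24.9, 24.6)
print(estimate, agree)  # ~24.6 m, True
```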

This article originally appeared in Vision Online. AIA is a part of the Association for Advancing Automation (A3), a CFE Media content partner. Edited by Chris Vavra, production editor, CFE Media, cvavra@cfemedia.com.

Original content can be found at www.visiononline.org.