Advanced driver assistance systems (ADAS) have come a long way since their inception, and new technologies are helping them achieve greater degrees of autonomy in navigation and operation.

Advanced driver assistance systems (ADAS) in vehicles are relatively new but evolving quickly. They deploy a range of sensors that collect information about the vehicle's surroundings, allowing it to react to certain situations autonomously. Vision systems in ADAS go beyond LiDAR or radar sensors in providing vehicles with critical contextual cues about the surrounding environment.
Most new vehicles include some form of ADAS, and vision systems are playing an important role in creating more advanced ADAS, leading to higher degrees of autonomy in navigation and operation.
Vision-based ADAS in modern vehicles typically consist of two core components: a camera module, or embedded vision system, and ADAS algorithms. The embedded vision system itself typically consists of an image sensor, some form of image processing, a cable, and a lens module.
These compact cameras aren’t housed like traditional machine vision systems and are built specifically to integrate with other systems within the vehicle. Oftentimes, these vision systems are based around complementary metal-oxide-semiconductor (CMOS) sensors that can achieve high resolution at fast frame rates for accurate image capture and transfer of image data. The key feature here is an electronic global shutter capable of image capture at high speeds. Other sensors are often too slow, relying on rolling shutters that distort images captured at high speeds.
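To make the global-vs-rolling shutter trade-off concrete, the sketch below estimates how far a rolling shutter skews a vertical edge on a laterally moving object. All of the numbers (row readout time, sensor rows, meters per pixel, vehicle speed) are assumed, illustrative values, not figures from any particular sensor.

```python
# Illustrative sketch (assumed numbers): how much a rolling shutter skews a
# vertical edge on a fast-moving object. A global shutter exposes every row
# at once, so the equivalent skew would be zero.

def rolling_shutter_skew_px(object_speed_mps, row_readout_s, num_rows,
                            meters_per_pixel):
    """Horizontal displacement (pixels) between the first and last rows of a
    vertical edge on an object moving laterally at object_speed_mps."""
    total_readout_s = row_readout_s * num_rows            # time to scan the frame
    lateral_shift_m = object_speed_mps * total_readout_s  # object motion during scan
    return lateral_shift_m / meters_per_pixel

# A car passing at 30 m/s (~108 km/h), a 1080-row sensor reading 10 µs per
# row, and roughly 2 cm of scene per pixel at that distance (all assumptions):
skew = rolling_shutter_skew_px(30.0, 10e-6, 1080, 0.02)
print(f"Rolling-shutter skew: {skew:.1f} px")
```

Even these modest assumed values yield a skew of tens of pixels, which is why global-shutter sensors are preferred when the algorithms downstream depend on geometric accuracy.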
ADAS algorithms are the second core component of vision-based ADAS. Most ADAS can understand, to some degree, the context of the images being captured, which translates into autonomous reactions. These algorithms are specifically developed to recognize traffic lanes, signs, and potential sources of danger, among many other things.
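As a toy illustration of one step in lane recognition, the sketch below thresholds bright lane-marking pixels in a synthetic grayscale frame and fits a line through them by least squares. Real lane-detection pipelines are far more robust (perspective warping, Hough transforms, temporal tracking); the frame, threshold, and function name here are all hypothetical stand-ins.

```python
# Minimal illustrative sketch: pick out bright lane-marking pixels and fit a
# line x = slope*y + intercept through them. Not a production algorithm.

def fit_lane_line(image, threshold=200):
    """Return (slope, intercept) of x = slope*y + intercept fitted through
    pixels brighter than `threshold`. `image` is a list of rows of ints."""
    xs, ys = [], []
    for y, row in enumerate(image):
        for x, value in enumerate(row):
            if value >= threshold:
                xs.append(x)
                ys.append(y)
    mean_x, mean_y = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((y - mean_y) * (x - mean_x) for x, y in zip(xs, ys))
    var = sum((y - mean_y) ** 2 for y in ys)
    slope = cov / var
    return slope, mean_x - slope * mean_y

# Synthetic 5x5 frame: a bright diagonal marking (255) on a dark road (0).
frame = [[255 if x == y else 0 for x in range(5)] for y in range(5)]
slope, intercept = fit_lane_line(frame)
print(slope, intercept)  # a perfect diagonal fits as slope 1.0, intercept 0.0
```

Fitting x as a function of y (rather than the reverse) is a common choice for lane geometry, since markings are near-vertical in the image and a y-on-x fit would blow up for them.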
Advanced ADAS applications enabled by embedded vision
As mentioned, vision-based ADAS provide important contextual information that LiDAR and radar sensors cannot. This enables far more advanced ADAS functionality. Common applications include:
- Road sign detection
- Pedestrian detection
- Vehicle detection
- Vehicle localization
- Traffic lane detection
- Emergency braking systems
- Forward collision warning systems
These are among the most common ADAS applications enabled by advanced vision systems, though there are many other ways in which ADAS can be deployed.
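To illustrate the kind of autonomous reaction these applications involve, here is a simplified sketch of forward-collision-warning logic: estimate the time to collision (TTC) from the range to the lead vehicle and the closing speed, then warn below a threshold. The 2.5-second threshold and all inputs are assumed values; production systems fuse radar, vision, and ego-motion data and handle many more edge cases.

```python
# Illustrative sketch (assumed threshold and inputs) of forward collision
# warning logic based on time-to-collision (TTC).

def time_to_collision_s(range_m, closing_speed_mps):
    """TTC in seconds; None when the gap is not closing."""
    if closing_speed_mps <= 0:
        return None
    return range_m / closing_speed_mps

def should_warn(range_m, closing_speed_mps, warn_below_s=2.5):
    """True when the estimated TTC falls under the warning threshold."""
    ttc = time_to_collision_s(range_m, closing_speed_mps)
    return ttc is not None and ttc < warn_below_s

# Lead vehicle 40 m ahead, closing at 20 m/s: TTC is 2.0 s, below the
# (assumed) 2.5 s threshold, so the system warns.
print(should_warn(40.0, 20.0))  # True
# Same gap closing at only 5 m/s gives a TTC of 8 s: no warning.
print(should_warn(40.0, 5.0))   # False
```

An emergency braking system would layer a second, lower TTC threshold on the same estimate to trigger autonomous braking when the driver does not respond to the warning.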
Embedded vision systems play a critical role in the development of advanced ADAS, and in the end goal of fully autonomous vehicles, by delivering the clear, accurate image data that complex ADAS algorithms need in order to understand and respond to the road.