Embedded vision is driving ADAS technology advances
Vehicle manufacturers are always looking for ways to improve their systems and meet stricter safety standards. Advanced driver assistance systems (ADAS) simplify the driving task and reduce the impact of driver distraction and inattention. Embedded vision is integral to ADAS technology.
Automotive vision systems assist drivers by identifying and tracking potential hazards. Embedded vision systems provide input and warnings for problems such as lane drift or unobserved traffic, handle tasks like automated parallel parking and traffic-sign recognition for speed-change warnings, and can warn drivers when they become drowsy or distracted.
Designers today have more image-processing software available to them, along with vision processors and specialized development systems. Unlike a simple rear-view camera, embedded vision can provide 360-degree coverage. The open-source OpenCV library lets designers deploy highly sophisticated computer vision algorithms without specialized knowledge of image-processing theory.
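To illustrate the kind of logic such libraries make accessible, here is a minimal lane-drift check in pure Python. The function name, inputs, and tolerance value are all hypothetical; in a real system the lane-line positions would come from a vision pipeline such as OpenCV edge detection followed by a Hough line transform.

```python
def lane_drift_warning(left_lane_x, right_lane_x, vehicle_x, tolerance=0.25):
    """Warn when the vehicle drifts off-center within its lane.

    left_lane_x, right_lane_x: lateral positions (meters) of the detected
    lane lines -- in practice, output of a vision pipeline.
    vehicle_x: estimated lateral position of the vehicle center.
    tolerance: allowed offset from lane center, as a fraction of lane
    width (0.25 is a hypothetical tuning value).
    """
    lane_center = (left_lane_x + right_lane_x) / 2.0
    lane_width = right_lane_x - left_lane_x
    offset = vehicle_x - lane_center
    if abs(offset) > tolerance * lane_width:
        return "drift_left" if offset < 0 else "drift_right"
    return "centered"

# Example: a 3.6 m lane centered at x = 0; vehicle 1.1 m right of center.
print(lane_drift_warning(-1.8, 1.8, 1.1))   # drift_right
print(lane_drift_warning(-1.8, 1.8, 0.2))   # centered
```

A production lane-keeping system would also smooth detections over time before warning, so a single noisy frame does not trigger an alert.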
Manufacturers have started focusing on creating chips with dedicated on-chip image processing for computer vision applications. Some chips are capable of processing multiple data streams simultaneously.
The combination of image-processing software and dedicated image processors provides the foundation for building automotive embedded vision. Developers who want to create vision solutions also now have access to embedded vision starter kits: specialized hardware with the full complement of peripherals required to build a computer vision system.
Embedded vision and ADAS advancements get results
An ADAS vision sensor gives drivers the time they need to take corrective action or avoid a collision. The system identifies vehicles, cyclists, and pedestrians in the driver’s path; information such as relative speed and distance is measured in real time. The system determines whether there is a potential danger and warns the driver.
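One common way to turn relative distance and speed into a warning decision is a time-to-collision (TTC) estimate. The sketch below is illustrative only; the 2-second warning threshold is a hypothetical tuning value, not a value from any particular system.

```python
def time_to_collision(distance_m, closing_speed_mps):
    """Seconds until impact if the closing speed stays constant.
    Returns None when the gap is not closing."""
    if closing_speed_mps <= 0:
        return None
    return distance_m / closing_speed_mps

def collision_warning(distance_m, closing_speed_mps, threshold_s=2.0):
    """Warn when the estimated time-to-collision drops below threshold_s
    (2.0 s here is a hypothetical tuning value)."""
    ttc = time_to_collision(distance_m, closing_speed_mps)
    return ttc is not None and ttc < threshold_s

# 30 m gap closing at 20 m/s -> TTC = 1.5 s -> warn.
print(collision_warning(30.0, 20.0))   # True
# 30 m gap closing at 10 m/s -> TTC = 3.0 s -> no warning.
print(collision_warning(30.0, 10.0))   # False
```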
An ADAS is never distracted or fatigued, and it is highly accurate. After using ADAS, drivers quickly become aware of their behavior and start self-correcting. Not only do many fleets see a reduction in collisions, many have also seen a reduction in fuel consumption.
ADAS also requires sensors for monitoring the vehicle’s immediate surroundings. LiDAR (light detection and ranging), infrared, and radar sensors enable features like adaptive cruise control, automatic braking, and other responses to traffic changes. Sensor fusion combines the output of different sensors to provide predictive warnings.
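A simple form of sensor fusion is an inverse-variance weighted average, where each sensor's estimate is weighted by how much it is trusted. The sketch below is a minimal illustration with made-up numbers; real fusion stacks typically use Kalman filters or similar estimators over time.

```python
def fuse_measurements(measurements):
    """Inverse-variance weighted fusion of independent sensor estimates.

    measurements: list of (value, variance) pairs -- for example, a radar
    range and a camera-derived range for the same obstacle. Sensors with
    lower variance (higher confidence) get more weight.
    Returns (fused_value, fused_variance).
    """
    weights = [1.0 / var for _, var in measurements]
    total = sum(weights)
    fused = sum(w * v for (v, _), w in zip(measurements, weights)) / total
    return fused, 1.0 / total

# Radar reports 50.0 m (variance 1.0); camera reports 52.0 m (variance 4.0).
# The fused estimate sits closer to the more confident radar reading.
fused, var = fuse_measurements([(50.0, 1.0), (52.0, 4.0)])
print(round(fused, 2))   # 50.4
```

Note the fused variance is lower than either sensor's alone, which is why combining sensors supports earlier, more reliable predictive warnings.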
ADAS continues to be developed, taking advantage of new technologies that improve results and allow vehicles to meet the growing and shifting safety standards.
This article originally appeared in Vision Online. AIA is a part of the Association for Advancing Automation (A3), a CFE Media content partner. Edited by Chris Vavra, production editor, CFE Media, email@example.com.