Machine vision comes of age in automotive manufacturing applications
Automotive manufacturing has been a leader in the adoption of automation technology, including machine vision and robots—and for good reason. The use of automation has made automobiles more affordable to the masses and significantly safer due to higher-quality construction and advanced automotive systems, many of which wouldn’t be possible without the assistance of automation technology.
Given the automotive industry’s long embrace of automation, it’s no surprise that growth today comes less from applications being automated for the first time and more from retooling and retrofits of existing production lines. Integrated vision systems packed with intelligence that simplifies their setup and operation are driving vision’s penetration into the motor vehicle market, helping automotive manufacturers reach new heights in productivity and profitability.
A list of automotive systems that use vision technology during assembly or quality inspection reads like the table of contents from a service manual, covering every aspect of the automobile from chassis and power trains to safety, electronics, and tire and wheel. In most cases, machine vision is tracking the product through the use of 1-D and 2-D barcodes and performing quality inspections. But it’s also helping to assemble the products.
"Most of the applications we’re solving today involve material handling, moving parts and racks to assembly lines using either 2-D or 3-D vision," said David Bruce, engineering manager for the general industry and automotive segment at Fanuc America. "But the biggest buzzword right now is ‘3-D.’"
Automotive manufacturing is moving away from manual processes. Few remain on the plant floor, and robots may soon take over those as well.
"Today, one of the last manual processes on the automotive manufacturing line involves part feeding, getting parts out of bins, and so on," Bruce said.
In some of the most advanced material handling work cells, one robot with a 3-D sensor picks the parts out of the bin and places them on a table so a second robot with a 2-D vision system can easily pick up the part and feed another machine, conveyor, or other process. Bruce also noted end-of-arm tooling is one of the toughest challenges for bin picking applications; magnets and vacuum work best.
"By having the vision system controller directly integrated with the robot instead of using a PC, the engineers can focus on the mechanical engineering challenges and on developing a bin picking system with buffering to make sure acceptable cycle times are achieved," Bruce said.
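The two-robot cell described above can be sketched in a few lines of code. The following is an illustrative Python model, not any vendor's API: robot A (3-D sensor) stages parts from the bin onto a table that acts as a buffer, so its slower, more variable bin picks don't stall robot B (2-D vision), which feeds the downstream process at a steady cycle time. All class and part names are hypothetical.

```python
from collections import deque

class BinPickingCell:
    """Illustrative two-robot cell: robot A bin-picks with 3-D vision and
    stages parts on a table (the buffer); robot B picks the staged parts
    with 2-D vision and feeds the downstream machine or conveyor."""

    def __init__(self, buffer_capacity=4):
        self.buffer = deque()             # parts staged for robot B
        self.buffer_capacity = buffer_capacity
        self.fed = []                     # parts delivered downstream

    def robot_a_cycle(self, bin_parts):
        # Robot A keeps the buffer topped up while the bin has parts,
        # decoupling slow 3-D bin picks from the downstream cycle time.
        while bin_parts and len(self.buffer) < self.buffer_capacity:
            self.buffer.append(bin_parts.pop())

    def robot_b_cycle(self):
        # Robot B takes a neatly staged part -- an easy, fast 2-D locate --
        # and feeds the next machine, conveyor, or process.
        if self.buffer:
            self.fed.append(self.buffer.popleft())

cell = BinPickingCell()
bin_parts = ["part-%d" % i for i in range(6)]
cell.robot_a_cycle(bin_parts)   # stages up to 4 parts on the table
for _ in range(3):
    cell.robot_b_cycle()        # robot B feeds 3 parts downstream
```

The buffer is the key design point Bruce raises: sizing it correctly is what lets the cell absorb variability in bin-pick times while still meeting takt.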
Tighter integration between vision system and robot also makes it easier for end users to train the work station. "You can take images of the robot in 7, 8, or 10 different poses and the system will guide you through programming. Or if you’re looking at a large part that won’t fit in the field of view—not uncommon in automotive manufacturing—you can take images from several small fields of view of the part, and the robot controller can determine the full 3-D location of the part."
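Determining a part's full 3-D location from several small fields of view amounts to combining viewing rays from known camera poses. Below is a minimal, hedged sketch of the underlying math (not Fanuc's implementation): each observation contributes a ray from a camera center along a direction, and the 3-D point is found by least squares as the point closest to all rays. The poses and target are made-up numbers for illustration.

```python
import numpy as np

def triangulate(centers, directions):
    """Least-squares 3-D point closest to a set of viewing rays.
    Each ray is a camera center c plus a direction d (normalized here).
    Solves sum_i (I - d_i d_i^T)(x - c_i) = 0 for x."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for c, d in zip(centers, directions):
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)   # projector onto plane normal to d
        A += P
        b += P @ c
    return np.linalg.solve(A, b)

# Two hypothetical camera poses viewing the same feature on a large part.
target = np.array([0.5, 0.2, 1.0])
centers = [np.array([0.0, 0.0, 0.0]), np.array([1.0, 0.0, 0.0])]
directions = [target - c for c in centers]
located = triangulate(centers, directions)   # recovers ~[0.5, 0.2, 1.0]
```

With more than two views, the same least-squares system simply accumulates more rays, which is why adding poses (7, 8, or 10, as Bruce describes) improves the robustness of the fit.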
3-D vision is also enhancing the latest class of robot assembly: lightweight robots, also called collaborative robots due to their low-force operation and ability to work next to humans with minimal safety systems.
While the automotive industry is boosting the number of collaborative vision work cells, "right now the killer application is kitting," Bruce said. Kitting is the process of collecting parts into a bin for a specific product configuration or assembly.
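At its core, a vision-verified kitting step compares the part codes scanned into a bin against the bill of materials for one product configuration. A minimal sketch, with hypothetical part names:

```python
from collections import Counter

def verify_kit(required, scanned):
    """Compare scanned part codes in a kit bin against the required
    bill of materials. Returns (missing, extra) part lists."""
    need, have = Counter(required), Counter(scanned)
    missing = list((need - have).elements())
    extra = list((have - need).elements())
    return missing, extra

required = ["bracket-L", "bracket-R", "bolt-M8", "bolt-M8"]
scanned  = ["bracket-L", "bolt-M8", "bolt-M8", "clip-7"]
missing, extra = verify_kit(required, scanned)
# the kit is short one 'bracket-R' and contains a stray 'clip-7'
```

Using counters rather than sets matters here, since a kit often needs multiple copies of the same fastener or clip.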
The path to full traceability
Any kitting or assembly task is only as good as the quality and accuracy of the incoming parts, which is why track-and-trace vision applications are so important to the automotive industry. "Over the last 31 years, the industry average was 1,115 cars recalled for every 1,000 sold, according to the National Highway Traffic Safety Administration," said Adam Mull, business development manager for machine vision and laser marking at Datalogic. The recall rate can exceed 1,000 because a single car can have more than one recall.
"While we’re seeing applications across the board from inspection to vision-guided robotics [VGR], we’re definitely seeing a trend toward full traceability," said Bradley Weber, application engineering leader and industry product specialist—manufacturing industry at Datalogic. "There’s always been traceability of the most critical components of the car, but now it’s going everywhere. Every part is being laser marked or peened, read, and tracked. That’s part of what has opened a lot of doors for Datalogic because we have many types of laser markers, vision systems to verify those marks, and then both handheld and fixed barcode readers to read and track those marks all through the process."
According to Mull, where one manufacturing plant used to build only one type of vehicle, today plants either make parts for multiple vehicles or assemble several different vehicles.
Consumer demand is driving the need for more automation in the factory. "When you go to a dealership, there are so many more options than there were years ago, from the color of the dashboard to the onboard electronics," Weber said. "With all those choices, original equipment manufacturers (OEMs) need a strong manufacturing execution system that is being fed data from every part along the manufacturing process."
With machine-readable codes going on more and more components, it also opens up the possibility of reworking problem parts instead of scrapping them.
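The rework-versus-scrap decision rests on having a process history keyed to each part's machine-readable code. The sketch below is an illustrative, in-memory version of that idea (a real manufacturing execution system would use a database); the station names and the rework rule are invented for the example.

```python
# Minimal traceability ledger keyed by each part's 2-D code (illustrative).
history = {}

def record(code, station, result):
    """Append one process step ('pass'/'fail') to a part's history."""
    history.setdefault(code, []).append((station, result))

def disposition(code):
    """Hypothetical rule: scrap only if a non-reworkable step ('weld')
    failed; any other failure routes the part to rework instead."""
    failed = [station for station, result in history.get(code, [])
              if result == "fail"]
    if not failed:
        return "pass"
    return "scrap" if "weld" in failed else "rework"

record("DM-0042", "press", "pass")
record("DM-0042", "torque", "fail")
record("DM-0099", "weld", "fail")
# DM-0042 can be reworked; DM-0099 must be scrapped
```

Because every step is tied to the marked code, the same ledger that enables rework also provides the full traceability Weber describes.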
As automation and machine-to-machine communication continue to blur the lines between robot, vision, marking system, and production equipment, the benefit to the manufacturer is greater ease of use, leading to greater machine vision adoption.
Advanced vision software is aiding new adopters to set up ID, VGR, and inspection routines using simple flow-chart programming and automated sample image acquisition, according to Fabio Perelli, product manager at Matrox Imaging.
Better automation integration is also helping to educate engineers, opening up even more opportunities for vision and other automation solutions.
"In automotive, engineers often work in bubbles," Mull said. "Everyone’s running with their own part of the project. But as one system works more closely with another system, the team members start to cross-pollinate, opening up the opportunity to teach the engineer who only knows about smart cameras how to use embedded controllers and advanced vision or marking systems. And since our systems all use the same software environment, it makes it seamless for an engineer to move from smart cameras to embedded controllers to other Datalogic solutions."
Winn Hardin is contributing editor for AIA. This article originally appeared in Vision Online. AIA is a part of the Association for Advancing Automation (A3), a CFE Media content partner. Edited by Chris Vavra, production editor, CFE Media, firstname.lastname@example.org.