Sensors, Vision

Envisioning more at Ford

Plans are underway at Ford to use machine vision with robotics to increase quality and throughput, as explained at Automate 2019. See video.
By Mark T. Hoske June 14, 2019
An inverted Universal Robots 10 kg payload robot with a two-camera configuration makes a test sweep over an automobile in progress, using machine vision to improve manufacturing quality. Walter LaPlante, advanced manufacturing systems engineer at Ford Motor Co.'s Redford, Mich., facility, explained some Ford machine vision applications at Automate 2019. Courtesy: Mark T. Hoske, Control Engineering, CFE Media

More collaborative robotics are under consideration at Ford Motor Co., explained Walter LaPlante, advanced manufacturing systems engineer at Ford's Redford, Mich., facility. Expected benefits include flexibility, scalability, robustness, cost savings, and quality, LaPlante said at an Automate 2019 conference session.

A 10 kg payload robot has been tested in multiple applications, inspecting connectors and checking for oil leaks in one- and two-camera configurations. The system was designed to work with the robot and handshake with existing production software. It can be easily replicated, uses a programmable logic controller (PLC) template, and provides quality data collection. It also can handle increasing complexity.
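
The handshake with production software is not detailed in the article, but a request/result/acknowledge exchange between a vision system and a station PLC can be sketched as a small state machine. Everything below (the state names, the methods, the `VisionCell` class) is a hypothetical illustration, not Ford's implementation.

```python
from enum import Enum, auto

class Handshake(Enum):
    IDLE = auto()          # waiting for the next part
    PART_PRESENT = auto()  # production software signals a part is in station
    INSPECTING = auto()    # vision system is running its sweep
    RESULT_READY = auto()  # pass/fail posted for the PLC to read

class VisionCell:
    """Minimal vision/PLC handshake state machine (illustrative only)."""
    def __init__(self):
        self.state = Handshake.IDLE
        self.result = None

    def part_arrived(self):
        assert self.state == Handshake.IDLE
        self.state = Handshake.PART_PRESENT

    def start_inspection(self):
        assert self.state == Handshake.PART_PRESENT
        self.state = Handshake.INSPECTING

    def post_result(self, passed: bool):
        assert self.state == Handshake.INSPECTING
        self.result = passed
        self.state = Handshake.RESULT_READY

    def plc_ack(self):
        # PLC acknowledged the result; clear for the next part
        assert self.state == Handshake.RESULT_READY
        self.state = Handshake.IDLE
        self.result = None
```

Explicit states like these are what make the cycle easy to replicate across stations and to mirror in a PLC template: each side only advances when the other confirms the prior step, which is also the basis for robust error handling.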

Fixed cameras were eliminated, resulting in less hardware, less engineering design, an improved manufacturing process, and fewer warranty claims, LaPlante said.

Safety standards used

Because the robot is operating collaboratively with humans in an industrial setting, safety standards were used.

  • ISO/TS 15066 on safety of collaborative robot systems requires a safety risk assessment for all applications and end-of-arm tooling (EOAT).
  • ISO 13849-1 on machine safety also was used; it complies with Ford safety standards, LaPlante said.

The design includes speed and separation monitoring. When a human approaches, the robot equipment in motion slows and stops to reduce risk.
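ISO/TS 15066 derives a protective separation distance from robot speed, human speed, stopping performance, and sensing uncertainty; the slow-then-stop behavior described above can be sketched as a zoned speed command. The zone boundaries and speeds below are illustrative placeholders, not values from the standard or from Ford's system.

```python
def allowed_speed(separation_m: float, full_speed: float = 1.0,
                  slow_zone: float = 2.0, stop_zone: float = 0.5) -> float:
    """Return a commanded robot speed (m/s) given the measured
    human-to-robot separation distance.

    Zones and speeds are placeholders. ISO/TS 15066 instead computes
    a protective distance from robot and operator speeds, stopping
    time, and measurement uncertainty -- this sketch only shows the
    slow-then-stop shape of the response.
    """
    if separation_m <= stop_zone:
        return 0.0  # protective stop: human too close
    if separation_m <= slow_zone:
        # scale speed linearly between the stop and slow boundaries
        frac = (separation_m - stop_zone) / (slow_zone - stop_zone)
        return full_speed * frac
    return full_speed  # human outside the monitored zones
```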

Cameras use GigE communications, allowing flexibility and scalability, LaPlante said. The system is adaptable to 2-D vision and to off-the-shelf 3-D vision systems, is easier to use than prior implementations, adds accuracy and repeatability, and comes with global service support. Inspection of black-on-black connectors is very difficult, LaPlante noted, but this system could work.
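Black-on-black inspection is hard because the connector and its housing occupy a narrow band of gray levels. One common first step is a percentile-based contrast stretch, sketched below with NumPy; this is a generic technique, not a description of Ford's vision pipeline.

```python
import numpy as np

def stretch_contrast(img, lo_pct: float = 1, hi_pct: float = 99):
    """Percentile-based contrast stretch to the full 0..255 range.

    A low-contrast ("black-on-black") region spanning only a few gray
    levels gets expanded so edges between, say, a connector body and
    its housing become separable by downstream thresholding.
    """
    lo, hi = np.percentile(img, [lo_pct, hi_pct])
    if hi <= lo:  # degenerate case: flat image
        return np.zeros_like(img, dtype=np.uint8)
    out = (img.astype(np.float64) - lo) / (hi - lo)
    return (np.clip(out, 0.0, 1.0) * 255).astype(np.uint8)
```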

More collaborative robotics are under consideration, said Walter LaPlante, advanced manufacturing systems engineer at Ford Motor Co.'s Redford, Mich., facility. He spoke at Automate 2019. Courtesy: Mark T. Hoske, Control Engineering, CFE Media

Expansion after testing

After testing, more than 20 systems have been replicated globally at Ford Powertrain, and they are being expanded to vehicle manufacturing, LaPlante said. The application reduces engineering design cost. A system handshake ensures robust error handling. Images and data are stored in a quality database for statistical trend monitoring.
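Statistical trend monitoring on stored inspection measurements can be as simple as a control-chart rule that flags a new value falling outside mean ± 3 standard deviations of recent history. The function below is a generic sketch of that idea; the article does not describe Ford's actual analytics.

```python
import statistics

def out_of_control(history, new_value, k: float = 3.0) -> bool:
    """Flag a measurement outside mean +/- k*stdev of the history.

    A basic Shewhart-style control-chart rule -- the kind of trend
    check a quality database of inspection results enables. Generic
    sketch only; not Ford's method.
    """
    mean = statistics.fmean(history)
    sd = statistics.stdev(history)  # sample standard deviation
    return abs(new_value - mean) > k * sd
```

In practice such a rule would run per station and per measured feature, so a drifting fixture or camera shows up as a trend long before parts fail outright.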

Flexible automation applications can extend well beyond automotive implementations. Think again about how robot-guided machine vision can help improve manufacturing processes.

Mark T. Hoske is content manager, Control Engineering, CFE Media, mhoske@cfemedia.com.


Mark T. Hoske
Author Bio: Content Manager, Control Engineering, CFE Media