Machine Vision: Not Just for Metrology Anymore

  • Robotics communication

  • Simulation speeds startup

  • Higher CNC throughput

  • Machine vision pick and place

To many manufacturing engineers, machine vision (MV) on the factory floor is just a metrology tool. They’ve gotten used to applying machine vision to speed up inspection and gauging applications for quality control, especially for 100% inspection situations. What they haven’t come to grips with, yet, is something that electronic printed-circuit assembly engineers embraced years ago: machine vision can turn robots into fast, accurate and very, very flexible parts-handling machines.

If you think back to pre-CNC machine-shop practice, a human picked up a part to be machined, locked it into a fixture, then manually applied a cutter. Whether the cutter was a drill, an end mill or a lathe cutting bit, the fixture itself took care of guiding the cutter to the right place on the part.

The human, however, took care of orienting the piece correctly in the fixture using what behavioral psychologists call “eye-hand coordination.” Eye-hand is the operant term. To automate that part of the process, you not only need to have a hand analog (which is a robot), but you also need an eye analog. MV is the eye analog.

Just as a human operator manually loading a fixture can do the job faster, more reliably and more safely when not blindfolded, adding machine vision to a robotic handling system makes it faster and more reliable. It also makes it more capable of handling uncertainty.

A couple of demonstrations that GE Fanuc set up in its booth at the International Manufacturing Technology Show (IMTS) in Chicago really showed how useful MV can be in machine-tool applications. In one demonstration, they used MV-guided robots to move parts along a simulated processing line that included milling and deburring stations. The second used two synchronized robots to assemble a gear box.

Watching robots on the simulated assembly line held all the fascination of peeking through a knothole to watch activity at an urban construction site. Being an old vision maven, the part I found most fascinating was watching two uncoordinated robots load parts onto a CNC miller.

By “uncoordinated” I don’t mean to imply “clumsy” in any way. Uncoordinated robots work entirely independently, each driven by its own controller. They pass a minimum of signals between them, just as two humans in a bucket brigade need to pass only two bits between them. The downstream human signals only: “I’m ready for another bucket,” and the upstream human simply indicates: “Here’s another bucket.”
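The bucket-brigade handshake can be sketched in a few lines, with two threads standing in for the two independent controllers. All the names here are illustrative, not a real robot API: each side owns one flag, and that is the whole protocol.

```python
# Two-bit "bucket brigade" handshake between two independent controllers,
# modeled as threads. Each side owns exactly one signal.
import threading

ready_for_part = threading.Event()   # downstream: "I'm ready for another bucket"
part_available = threading.Event()   # upstream:   "Here's another bucket"

handoffs = []

def upstream(n_parts):
    for part in range(n_parts):
        ready_for_part.wait()        # wait until downstream signals ready
        ready_for_part.clear()
        handoffs.append(part)        # place the part at the handoff point
        part_available.set()         # signal "here's another bucket"

def downstream(n_parts):
    for _ in range(n_parts):
        ready_for_part.set()         # signal "I'm ready"
        part_available.wait()        # wait for the part to arrive
        part_available.clear()

t1 = threading.Thread(target=upstream, args=(3,))
t2 = threading.Thread(target=downstream, args=(3,))
t1.start(); t2.start()
t1.join(); t2.join()
print(handoffs)  # parts passed in order: [0, 1, 2]
```

Note that neither side ever needs to know anything about the other's internal state; that independence is exactly what lets each robot run on its own controller.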

In the demonstration, the first robot picked randomly oriented parts from a large tote bin. Before gripping the next part to pick it up, the robot had to:

  1. Locate all available parts in the bin;

  2. Choose the next part to pick up;

  3. Figure out the best gripper orientation for picking and moving the part to a stand.

Task 1 is a lot more complex than one might think. The vision system has to recognize a lot of lumps in the bin, then match them with what the correct part would look like in various orientations, including upside down and lying on its side.

The task is further complicated by the possibility that one part might overlie another, making two parts look like one lump. To help solve this problem, the robot moved the camera around to see different points of view.
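A toy version of the orientation-matching idea shows the structure of the search: compare an observed lump outline against the known part outline rotated to a set of candidate angles, and keep the best fit. Real bin-picking vision works on images and 3D data and is far more involved; this sketch only illustrates the search over orientations.

```python
# Match an observed outline against rotated copies of the known part
# outline; the best-fitting rotation is the part's estimated orientation.
import math

def rotate(points, theta):
    c, s = math.cos(theta), math.sin(theta)
    return [(c * x - s * y, s * x + c * y) for x, y in points]

def mismatch(observed, template):
    # Mean distance from each observed point to its nearest template point.
    total = 0.0
    for ox, oy in observed:
        total += min(math.hypot(ox - tx, oy - ty) for tx, ty in template)
    return total / len(observed)

def best_orientation(observed, part_outline, n_angles=36):
    angles = [2 * math.pi * k / n_angles for k in range(n_angles)]
    return min(angles, key=lambda a: mismatch(observed, rotate(part_outline, a)))

# An L-shaped part, and the same part observed rotated 90 degrees.
part = [(0, 0), (2, 0), (2, 1), (1, 1), (1, 3), (0, 3)]
observed = rotate(part, math.pi / 2)
theta = best_orientation(observed, part)
print(round(math.degrees(theta)))  # 90
```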

This robot probably didn’t have the additional problem of recognizing the existence of a wrong lump, such as a different part type mixed into the bin: a gum wrapper tossed into the bin instead of the trash, or the third-shift worker’s forgotten lunch pail. In a real factory, the robot would have to deal with all of those.

Once the MV system identified all the (correct) parts and their orientations in the bin, task 2 is not too bad. Simply choose the part with the fewest interferences, or, if they’re all equally easy to reach, the one that takes the least motion.
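That selection rule reduces to a two-level sort: prefer the part with the fewest interferences, breaking ties by the shortest move from the gripper's current position. The `Part` records here are hypothetical placeholders for whatever the vision system actually reports.

```python
# Pick the next part: fewest interferences first, then least motion.
from dataclasses import dataclass

@dataclass
class Part:
    name: str
    interferences: int    # neighboring lumps that block the grip
    motion_cost: float    # e.g., distance from the current gripper pose

def choose_next(parts):
    return min(parts, key=lambda p: (p.interferences, p.motion_cost))

bin_contents = [
    Part("A", interferences=2, motion_cost=0.3),
    Part("B", interferences=0, motion_cost=0.9),
    Part("C", interferences=0, motion_cost=0.4),
]
print(choose_next(bin_contents).name)  # "C": unobstructed and closest
```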

To accomplish task 3, map out a series of joint movements to grip the part, then place it right side up on the stand. Just write a few five-axis CNC programs and pick the one that accomplishes the task with the fewest motions. It’s a piece of cake, right?
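Stripped to its skeleton, that "pick the shortest program" step is trivial once the hard part (generating feasible candidate plans) is done. The named move sequences below are purely hypothetical stand-ins for generated motion programs.

```python
# Among candidate motion plans that all achieve the grip-and-place,
# pick the one with the fewest moves.
plans = {
    "over_the_top": ["approach", "rotate_wrist", "close", "lift", "flip", "place"],
    "side_grip":    ["approach", "close", "lift", "place"],
    "regrip":       ["approach", "close", "lift", "set_down", "regrip", "lift", "place"],
}
best = min(plans, key=lambda name: len(plans[name]))
print(best)  # "side_grip"
```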

That’s the first robot’s job. The second robot knows the part is on the stand right side up. It does not, however, know its orientation around the vertical axis or its exact position in the horizontal plane. It uses MV to get that information, then moves its gripper for the exact grip it needs to put the part in the exact spot in space for the miller. By the way, this second robot has a double gripper, so it can pick up the finished part already in the miller before placing the new part to be milled.
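Because the part is known to be right side up, the second robot's vision problem collapses to finding the part's (x, y) position and its rotation about the vertical axis, then mapping a nominal grasp point from the part's own frame into the work frame. The frames and numbers below are illustrative.

```python
# Map a grasp point from the part frame into the work frame, given the
# part pose (x, y, yaw) that machine vision measured on the stand.
import math

def grasp_pose(part_x, part_y, part_theta, grasp_dx, grasp_dy):
    """2D rigid transform of the grasp point into the work frame."""
    c, s = math.cos(part_theta), math.sin(part_theta)
    gx = part_x + c * grasp_dx - s * grasp_dy
    gy = part_y + s * grasp_dx + c * grasp_dy
    return gx, gy, part_theta  # gripper yaw matches the part's yaw

# Vision reports the part at (120, 80) mm, rotated 90 degrees;
# the grasp point sits 10 mm along the part's own x axis.
x, y, yaw = grasp_pose(120.0, 80.0, math.pi / 2, 10.0, 0.0)
print(round(x), round(y))  # 120 90
```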

Even more impressive is the coordinated action of two robots assembling a gearcase. They work together very closely, like the coordination between a human’s two hands, rather than a handoff between two humans. Positioning errors between the grippers can, and will, lead to stripped screw threads and broken components.

To gain this coordination, Fanuc developed a multi-robot controller along with robot-simulation software. The one controller has access to all of the sensor information from all the load cells and encoders on both robots simultaneously. Instead of dealing with solid geometry in a five-dimensional space (one dimension for each robot axis), the controller for two robots has to deal with a 10-dimensional space!

Robot simulation software allows the controller to check for interferences before running the program on expensive real robots in a production environment. Typically, this part of the process takes place off line on an engineer’s workstation, but it’s an important part of the workcell safety net.
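The core of such an interference check can be sketched as a time-stepped sweep: step through the two robots' planned paths in lockstep and flag any instant where they come closer than a safety clearance. A real simulator checks the full link geometry of both arms, not just tool points; this only shows the structure of the check, with hypothetical waypoint data.

```python
# Offline interference check: walk two synchronized tool paths and
# verify the robots never come closer than a safety clearance.
import math

def min_clearance(path_a, path_b):
    """Smallest tool-to-tool distance over synchronized waypoints."""
    return min(math.dist(a, b) for a, b in zip(path_a, path_b))

def interference_free(path_a, path_b, clearance=50.0):
    return min_clearance(path_a, path_b) >= clearance

# Two hypothetical three-waypoint tool paths (mm).
robot1 = [(0, 0, 300), (100, 0, 300), (200, 0, 300)]
robot2 = [(400, 0, 300), (300, 0, 300), (230, 0, 300)]
print(interference_free(robot1, robot2))  # False: final step is 30 mm apart
```

Catching that 30 mm approach on a workstation costs nothing; catching it on the shop floor costs a crashed gripper.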

In the demonstration, one robot picks up one half of the gearcase housing and presents it to the other robot. The second carefully inserts one gear and then the other on a bearing shaft already in the housing. It has to rotate the second gear during the assembly to get the gears to mesh properly.

The second robot then picks up the housing’s second half and locks it in place over the assembly. The two robots then—together—reorient the housing to present it to a third robot, which inserts and drives the screws that ultimately hold the gearcase together. This third robot, of course, has to be run by the same controller as the first two in order to gain the precision needed.

These demonstrations show how versatile robotic equipment can be when guided by machine vision. Instead of dumb brutes fumbling around in the dark, machine vision-equipped robots can complete sophisticated tasks in the face of a large amount of environmental uncertainty. Their flexibility can rival or exceed that of human assemblers, and they don’t leave forgotten lunch pails in parts bins.
