Machine Vision: Not Just for Metrology Anymore

  • Robotics communication

  • Simulation speeds startup

  • Higher CNC throughput

  • Machine vision pick and place

To many manufacturing engineers, machine vision (MV) on the factory floor is just a metrology tool. They’ve gotten used to applying machine vision to speed up inspection and gauging applications for quality control, especially for 100% inspection situations. What they haven’t come to grips with, yet, is something that electronic printed-circuit assembly engineers embraced years ago: machine vision can turn robots into fast, accurate and very, very flexible parts-handling machines.

If you think back to pre-CNC machine-shop practice, a human picked up a part to be machined, locked it into a fixture, then manually applied a cutter. Whether the cutter was a drill, an end mill or a lathe cutting bit, the fixture itself took care of guiding the cutter to the right place on the part.

The human, however, took care of orienting the piece correctly in the fixture using what behavioral psychologists call “eye-hand coordination.” Eye-hand is the operant term. To automate that part of the process, you not only need to have a hand analog (which is a robot), but you also need an eye analog. MV is the eye analog.

Just as a human operator manually loading a fixture can do the job faster, more reliably and more safely when not blindfolded, adding machine vision to a robotic handling system makes it faster and more reliable. It also makes it more capable of handling uncertainty.

A couple of demonstrations that GE Fanuc set up in its booth at the International Manufacturing Technology Show (IMTS) in Chicago really showed how useful MV can be in machine-tool applications. In one demonstration, they used MV-guided robots to move parts along a simulated processing line that included milling and deburring stations. The second used two synchronized robots to assemble a gear box.

Watching robots on the simulated assembly line held all the fascination of peeking through a knothole to watch activity at an urban construction site. Being an old vision maven, the part I found most fascinating was watching two uncoordinated robots load parts onto a CNC miller.

By “uncoordinated” I don’t mean to imply “clumsy” in any way. Uncoordinated robots work entirely independently, each driven by its own separate controller. They pass a minimum of signals between them, just as two humans in a bucket brigade need to pass only two bits between them. The downstream human only signals: “I’m ready for another bucket,” and the upstream human simply indicates: “Here’s another bucket.”
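That two-bit bucket-brigade handshake can be sketched in a few lines. This is a hypothetical illustration, not any real robot controller's API; the class and function names are invented.

```python
# Hypothetical sketch of the two-bit handshake between two independently
# controlled robots. Each cycle, the upstream robot places a part only when
# asked, and the downstream robot re-raises its request after taking it.

class Station:
    def __init__(self):
        self.part_ready = False   # upstream's bit: "here's another bucket"
        self.want_part = True     # downstream's bit: "I'm ready for another"

def upstream_cycle(station, parts):
    """Upstream robot: place a part only when downstream has asked for one."""
    if station.want_part and parts:
        part = parts.pop(0)
        station.part_ready = True
        station.want_part = False
        return part
    return None

def downstream_cycle(station):
    """Downstream robot: take the part, then re-raise the request flag."""
    if station.part_ready:
        station.part_ready = False
        station.want_part = True
        return True
    return False

station = Station()
parts = ["gear_a", "gear_b"]
moved = 0
for _ in range(4):
    upstream_cycle(station, parts)
    if downstream_cycle(station):
        moved += 1
print(moved)  # both parts handed off
```

Nothing else needs to cross the boundary: each robot plans its own motions, and the two bits alone keep the brigade flowing.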

In the demonstration, the first robot picked randomly oriented parts from a large tote bin. Before gripping the next part to pick it up, the robot had to:

  1. Locate all available parts in the bin;

  2. Choose the next part to pick up;

  3. Figure out the best gripper orientation for picking and moving the part to a stand.

Task 1 is a lot more complex than one might think. The vision system has to recognize a lot of lumps in the bin, then match them with what the correct part would look like in various orientations, including upside down and lying on its side.

The task is further complicated by the possibility that one part might overlie another, making two parts look like one lump. To help solve this problem, the robot moved the camera around to see different points of view.
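The matching problem in task 1 can be caricatured as comparing each lump's measured signature against reference views of the correct part in each orientation. The signatures, tolerances, and orientation names below are all invented for illustration; a real system would match full 2-D or 3-D shape models, not two numbers.

```python
# Toy sketch of matching lumps against reference views of the correct part.
# Each "signature" is a made-up pair of shape measurements; a lump that
# matches no reference view within tolerance is rejected as a wrong lump.

REFERENCE_VIEWS = {              # hypothetical signature per orientation
    "upright": (1.00, 0.60),
    "upside_down": (1.00, 0.62),
    "on_side": (0.60, 0.35),
}

def match_lump(lump_signature, tolerance=0.05):
    """Return the best-matching orientation, or None for a wrong lump."""
    best, best_err = None, tolerance
    for orientation, ref in REFERENCE_VIEWS.items():
        err = max(abs(a - b) for a, b in zip(lump_signature, ref))
        if err < best_err:
            best, best_err = orientation, err
    return best

print(match_lump((0.61, 0.34)))   # a part lying on its side
print(match_lump((0.30, 0.90)))   # a gum wrapper: matches nothing
```

The rejection path is what handles the gum wrapper and the forgotten lunch pail: anything that matches no orientation of the correct part is simply not a pick candidate.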

This robot probably didn’t have the additional problem of recognizing the existence of a wrong lump, such as a different part type mixed into the bin: a gum wrapper tossed into the bin instead of the trash, or the third-shift worker’s forgotten lunch pail. In a real factory, the robot would have to deal with all of those.

Once the MV system identified all the (correct) parts and their orientations in the bin, task 2 is not too bad. Simply choose the part with the fewest interferences, or, if they’re all equally easy to reach, the one that takes the least motion.
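That selection rule maps directly onto a two-level sort key. The candidate data below is invented; only the ranking logic reflects the rule described above.

```python
# Sketch of task 2: prefer the part with the fewest interferences, and
# break ties by the least gripper motion. Candidates are hypothetical
# (x, y, interference_count) tuples from the vision system.

def pick_next(candidates, gripper_xy):
    """Rank by (interferences, squared distance from the gripper)."""
    def cost(c):
        x, y, interferences = c
        dx, dy = x - gripper_xy[0], y - gripper_xy[1]
        return (interferences, dx * dx + dy * dy)
    return min(candidates, key=cost)

candidates = [(0.9, 0.9, 0),   # clear, but far away
              (0.2, 0.1, 0),   # clear and close: the winner
              (0.1, 0.1, 3)]   # closest, but buried under other lumps
print(pick_next(candidates, gripper_xy=(0.0, 0.0)))
```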

To accomplish task 3, map out a series of joint movements to grip the part, then place it right side up on the stand. Just write a few five-axis CNC programs and pick the one that accomplishes the task with the fewest motions. It’s a piece of cake, right?
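Stripped of the hard motion-planning details, that last step is a filter-and-minimize. The candidate programs below are hypothetical stand-ins for real joint-motion programs.

```python
# Sketch of task 3's final selection: keep only candidate programs that
# end with the part right side up on the stand, then pick the one with
# the fewest motions. All program data here is invented.

def plan(programs):
    valid = [p for p in programs if p["final_pose"] == "right_side_up"]
    return min(valid, key=lambda p: len(p["motions"]))

programs = [
    {"motions": ["approach", "grip", "lift", "flip", "place"],
     "final_pose": "right_side_up"},
    {"motions": ["approach", "grip", "place"],
     "final_pose": "upside_down"},       # shortest, but fails the goal
    {"motions": ["approach", "grip", "flip", "place"],
     "final_pose": "right_side_up"},     # shortest valid program
]
best = plan(programs)
print(len(best["motions"]))
```

The "piece of cake" sarcasm stands, of course: generating valid candidate programs in the first place is where the real work hides.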

That’s the first robot’s job. The second robot knows the part is on the stand right side up. It does not, however, know its orientation around the vertical axis or its exact position in the horizontal plane. It uses MV to get that information, then moves its gripper for the exact grip it needs to put the part in the exact spot in space for the miller. By the way, this second robot has a double gripper, so it can pick up the finished part already in the miller before placing the new part to be milled.
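Because the part is known to be right side up, the second robot's MV correction reduces to one rotation about the vertical axis plus an offset in the horizontal plane. The sketch below transforms a grip point defined in the part's own frame into world coordinates using the measured pose; the numbers and function names are illustrative assumptions.

```python
# Sketch of the second robot's pose correction: rotate the part-frame
# grip point by the measured angle about the vertical axis, then
# translate it to the measured position in the horizontal plane.

import math

def grip_in_world(grip_in_part, part_xy, part_theta_deg):
    """2-D rigid transform of the grip point from part frame to world."""
    t = math.radians(part_theta_deg)
    gx, gy = grip_in_part
    wx = part_xy[0] + gx * math.cos(t) - gy * math.sin(t)
    wy = part_xy[1] + gx * math.sin(t) + gy * math.cos(t)
    return wx, wy, t

# grip point 10 mm along the part's own x-axis; MV measures the pose
wx, wy, t = grip_in_world((10.0, 0.0), part_xy=(100.0, 50.0),
                          part_theta_deg=90.0)
print(round(wx, 3), round(wy, 3))
```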

Even more impressive is the coordinated action of two robots assembling a gearcase. They work together very closely, like the coordination between a human’s two hands, rather than a handoff between two humans. Positioning errors between the grippers can—and will—lead to stripped screw threads and broken components.

To gain this coordination, Fanuc developed a multi-robot controller along with robot-simulation software. The one controller has access to all of the sensor information from all the load cells and encoders on both robots simultaneously. Instead of dealing with solid geometry in a five-dimensional space (one dimension for each robot axis), the controller for two robots has to deal with a 10-dimensional space!

Robot simulation software allows the controller to check for interferences before running the program on expensive real robots in a production environment. Typically, this part of the process takes place offline on an engineer’s workstation, but it’s an important part of the workcell safety net.
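A toy version of that offline check steps both robots' trajectories in lockstep and flags any instant where they come too close. Real simulators check the full link geometry of both arms in the combined joint space; the tool-tip points and safety margin below are illustrative only.

```python
# Toy interference check in the spirit of the offline simulation: step
# two tool-tip trajectories together and report the smallest clearance.
# If it dips below the safety margin, the program goes back to planning.

def min_clearance(traj_a, traj_b):
    """Smallest tip-to-tip distance over synchronized trajectory steps."""
    def dist(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5
    return min(dist(p, q) for p, q in zip(traj_a, traj_b))

traj_a = [(0.0, 0.0, 0.5), (0.2, 0.0, 0.5), (0.4, 0.0, 0.5)]  # robot 1
traj_b = [(1.0, 0.0, 0.5), (0.8, 0.0, 0.5), (0.5, 0.0, 0.5)]  # robot 2
SAFETY_MARGIN = 0.15
clearance = min_clearance(traj_a, traj_b)
print(clearance < SAFETY_MARGIN)   # interference: rerun the planner
```

Catching that near miss on a workstation costs seconds; catching it with two colliding robots on the floor costs considerably more.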

In the demonstration, one robot picks up one half of the gearcase housing and presents it to the other robot. The second carefully inserts one gear and then the other on a bearing shaft already in the housing. It has to rotate the second gear during the assembly to get the gears to mesh properly.

The second robot then picks up the housing’s second half and locks it in place over the assembly. The two robots then—together—reorient the housing to present it to a third robot, which inserts and drives the screws that ultimately hold the gearcase together. This third robot, of course, has to be run by the same controller as the first two in order to gain the precision needed.

These demonstrations show how versatile robotic equipment can be when guided by machine vision. Instead of dumb brutes fumbling around in the dark, machine vision-equipped robots can complete sophisticated tasks in the face of a large amount of environmental uncertainty. Their flexibility can rival or exceed that of human assemblers, and they don’t leave forgotten lunch pails in parts bins.
