Measure Up with Machine Vision

Vision systems help manufacturers improve quality and productivity by automating many part gauging and inspection tasks quickly, accurately, and with a high degree of repeatability. By generating valuable inspection data at each step of a manufacturing process, vision systems also can help control engineers augment process diagnostics.

By Mark T. Hoske, Control Engineering September 1, 2006
AT A GLANCE
  • Vision system calibration

  • Tools for setup

  • Point-and-click software

  • Online, real-time inspection

  • Gauging, diagnostics

Non-contact gauging is faster

Among the most common applications for machine vision in quality control is gauging. Because vision systems can measure part features to within a thousandth of an inch, they are suitable for a number of applications traditionally handled by contact gauging, says Bryan Boatner, Cognex product marketing manager for In-Sight vision sensors.

“Manufacturers implement vision-based gauging for a number of reasons. Speed is a primary driver. With contact gauging, it’s impossible to keep pace with high-throughput production lines, and consequently measurements are typically performed on an audit basis.” Machine vision systems, in contrast, can keep pace with the fastest production lines, perform thousands of measurements per minute, and are designed for in-line, 100% inspection, he says.

Without physical contact, vision-based gauging “avoids damage to parts and eliminates maintenance associated with wear and tear on mechanical gauge surfaces. Finally, vision systems allow certain types of parts to be measured that cannot be gauged using contact methods,” Boatner says.

Arnaud Lina, Imaging Software Group manager, Matrox Imaging, concurs that machine vision has much to offer physical metrology, the precise measurement of objects. Ben Dawson, director of strategic development, ipd, a Dalsa Digital Imaging group, says a few tips can improve set-up for machine-vision gauging and other applications.

Diagnostics, uncorked

In addition to gauging, machine vision provides an effective data collection tool that can deliver process metrics and trends data for diagnostics. Vision systems provide real-time images to monitor operations or archive digital images with time stamps for later retrieval, Boatner says.

On a bottling line, for example, a camera may capture an image of each bottle as it passes and measure it against a set of acceptable parameters, checking fill level; cap, seal, and label presence; and skew, Boatner says; an out-of-spec parameter prompts a signal that diverts the bad bottle into a scrap bin. “Additionally, the system can be set up to save an image of the scrap bottle to a database for analysis. Rather than picking through a reject bin to try to figure out the source of the defective product, engineers can look at images for root-cause analysis. Archiving failed images for use as a diagnostic tool can assist troubleshooting and reduce downtime.”
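
As a rough illustration of the image-archiving step Boatner describes, the following Python sketch (hypothetical; the directory name, file-naming scheme, and the per-bottle `checks` callable are placeholders, not any vendor's API) saves each failed-inspection image with a timestamp so engineers can review rejects later for root-cause analysis.

```python
import time
from pathlib import Path

import cv2

ARCHIVE_DIR = Path("reject_images")        # hypothetical archive location
ARCHIVE_DIR.mkdir(exist_ok=True)

def archive_reject(image, failed_checks):
    """Save a failed-inspection image with a timestamp and the names of the failed checks."""
    stamp = time.strftime("%Y%m%d_%H%M%S")
    path = ARCHIVE_DIR / f"reject_{stamp}_{'-'.join(failed_checks)}.png"
    cv2.imwrite(str(path), image)
    return path

def inspect_bottle(image, checks):
    """Run all checks (fill level, cap, seal, label, skew, ...); archive and reject on any failure."""
    failed = [name for name, ok in checks(image).items() if not ok]
    if failed:
        archive_reject(image, failed)      # keep the evidence for later diagnostics
        return False                       # caller signals the diverter (e.g., via a PLC output)
    return True
```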

Best practices

Vision products generally fall into three primary categories: vision sensors (typically self-contained, low-cost inspection devices), PC-based vision systems (combining the power and flexibility of advanced programming with simple graphical programming), and expert sensors (lower-cost technology for specific automation problems), Boatner says. Components common to each of the three include a camera, processor, image analysis software, lighting, and networking capability.

Sharing integration tips and tricks for machine vision technologies helps end-users know what to do and not to do, suggests Ken Brey, technical director, DMC Inc., a system integration firm.

Some best practices for setup and selection follow, from sources cited above. Much of the advice is interdependent, so the list isn’t exactly sequential. For instance, “strong” software can compensate for less-than-optimal lighting; wise use of lenses can lower hardware horsepower.

Setup tools: Every vision setup area or lab should have a stock of accessories, lenses, and various other components in addition to a sample machine vision camera, lights, and mounting options for engineers to try when setting up any vision application. Brey suggests that an acceptable stock would cost around $300.

Setup accessories: A clear plastic ruler, about $2, can be used to measure the field of view (FOV) for front-lit or back-lit applications. Held at a 45-degree angle and focused near its middle (not at an edge), it can also measure depth of field: at 45 degrees, divide the in-focus length along the ruler by 1.41 (15 mm in focus corresponds to roughly 10.6 mm of depth). For high-magnification applications, use a simple, $40 Ronchi ruling to check optical-system aberrations. Change lenses and adjust camera position to reach the smallest field of view required. Not buying a camera with higher-than-needed resolution can be well worth the time it takes to do the math.
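
As a quick illustration of the ruler arithmetic described above, here is a small Python sketch; the readings passed in are hypothetical examples.

```python
import math

def field_of_view_mm(ruler_reading_mm):
    """The FOV is simply the length of ruler visible across the image (front- or back-lit)."""
    return ruler_reading_mm

def depth_of_field_mm(in_focus_length_mm, ruler_angle_deg=45.0):
    """With the ruler tilted at 45 degrees, only cos(45) ~ 1/1.41 of the in-focus
    span lies along the optical axis, so divide the in-focus length by 1.41."""
    return in_focus_length_mm * math.cos(math.radians(ruler_angle_deg))

print(field_of_view_mm(32.0))     # e.g., 32 mm of ruler spans the image -> 32 mm FOV
print(depth_of_field_mm(15.0))    # 15 mm in focus at 45 deg -> about 10.6 mm depth of field
```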

Setup accessories: A color filter set, for about $150, can reduce the effect of ambient light and filter out wavelengths that different materials have in common, increasing contrast.

Setup accessories: A 2x focal-length extender, $95, can simulate a higher-focal-length lens, turning a 50 mm lens into 100 mm or a 35 mm lens into 70 mm. However, exposure time will be 4x longer than with the higher-focal-length lens at the same indicated f-stop, Brey says; that is the downside of extenders. If exposure time is not a concern, the focal-length extender can become part of the implementation: it provides a shorter optics stack where space is tight and can lower total optics cost.
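
The extender trade-off is straightforward arithmetic, sketched below in Python: effective focal length scales with the extender power, while exposure time scales with its square (the 50 mm lens and 10 ms exposure are example values).

```python
def with_extender(focal_length_mm, exposure_ms, extender_power=2.0):
    """A 2x extender doubles the effective focal length but spreads the same light
    over four times the sensor area, so exposure time rises by the power squared."""
    return {
        "effective_focal_length_mm": focal_length_mm * extender_power,
        "required_exposure_ms": exposure_ms * extender_power ** 2,
    }

print(with_extender(50.0, 10.0))   # 50 mm lens + 2x extender -> 100 mm, 40 ms exposure
```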

Setup accessories: A mirror-type beamsplitter, for $50, can be used in conjunction with a ring light or integrated camera light to align camera and lighting perpendicular to the part. At 45 degrees it simulates a diffuse on-axis light (DOAL; see diagram) by reflecting half of the light and transmitting the other half. Using the reflection off the diffuser, the camera can be adjusted so the ring light’s reflection circles the part, eliminating measurement error from misalignment. In a robotics application, the robot can be taught each measuring point around an engine block, for instance; for distant measurements, this also increases accuracy. DOAL lighting reduces reflection and glare, eliminating bright spots and revealing, for instance, a tiny lump in paint by giving its edge contrast.

Application needs, know what you’re testing: Understand what features distinguish a good part/product from a bad one. The more precise the specifications, the easier it is to solve an application. Most importantly, obtain and test marginal product to minimize false accepts/rejects.

Using a mirror-type beamsplitter at 45 degrees simulates diffuse on-axis lighting (DOAL). It can be used in conjunction with a ring light or integrated camera light to align the camera perpendicular to the part, ensuring image quality.

Application needs, vision system performance: Before specifying a vision sensor, define the entire scope of the application to ensure that the vision system has enough performance headroom in terms of speed, accuracy, and acquisition requirements. As users become familiar with the power of machine vision, they tend to want to accomplish as many vision tasks as possible. Also, consider any future requirements for increased throughput, ability to accommodate a new product, or changes to the existing product.

Application needs, camera performance, resolution: The camera must have enough pixels to resolve the smallest detail of interest. For spot gauging, such as measuring pinhole defects, you would like the minimum spot size to be at least 3 x 3 pixels. Good machine vision software can gauge edge-to-edge distances to 1/10 of a pixel or better. This greatly increases resolution for some kinds of measurements.
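
To do the resolution math, a minimal Python sketch (the 4 in. field of view and 0.01 in. defect size are hypothetical inputs) estimates how many pixels are needed across the FOV so the smallest feature spans at least 3 pixels.

```python
import math

def pixels_needed(fov_in, smallest_feature_in, pixels_per_feature=3):
    """Pixels across the field of view so the smallest feature of interest
    covers at least `pixels_per_feature` pixels (3 is the rule of thumb for spots)."""
    return math.ceil(fov_in / smallest_feature_in * pixels_per_feature)

# Example: 4 in. FOV and 0.01 in. pinhole defects need 1,200 pixels across,
# so a 1,280-pixel-wide sensor is enough; a higher-resolution camera adds cost
# without improving this particular measurement.
print(pixels_needed(4.0, 0.01))
```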

Application needs, hardware: Image acquisition hardware (the “frame grabber”) must have low noise and jitter to give stable measurements.

Application needs, software: Ensure the vision-based metrology reproduces the specific definition of the tolerances to measure. There are several different methods for testing perpendicularity, for example, so you must ensure the vision solution can perform the required test.

Application, repeat performance: Any sensing task, machine vision included, requires measurement repeatability to maximize reliable inspections. Test repeatability by presenting a part to the vision system for measurement “at least five times without changing part position, lighting, or any other variables. From this, you should be able to plot the repeatability of the measurements, and make sure that any variance in the results stays within the measurement tolerance,” Boatner says.
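
A small Python sketch of that test, using made-up readings: repeat the measurement on a fixed part and confirm the spread stays well inside the measurement tolerance.

```python
import statistics

def repeatability(measurements, tolerance):
    """Repeat the same measurement without moving the part or changing lighting,
    then compare the spread and standard deviation against the tolerance."""
    spread = max(measurements) - min(measurements)
    return {
        "range": spread,
        "stdev": statistics.stdev(measurements),
        "within_tolerance": spread < tolerance,
    }

# Hypothetical: five repeated width readings (inches) against a 0.002 in. tolerance band.
readings = [0.5003, 0.5004, 0.5003, 0.5005, 0.5004]
print(repeatability(readings, tolerance=0.002))
```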

Before purchase, consider add-ons: Don’t base the buying decision on price alone without considering all potential add-on items required. Frequently, spending more initially on a vision system with more powerful software can save money by reducing the need for more costly lighting, optics, or part fixtures.

Before purchase, test actual parts in a pre-sale demo: Have the vision vendor do a pre-sale demonstration to ensure proof of concept and verify operation on a range of product/part samples, from the worst good part to the best bad part.

Lens selection, avoid distortion: Standard machine vision lenses have optical distortion, typically negative or “barrel” distortion, and perspective distortion when you gauge at varying distances from the lens. Distortions can be partially corrected by the machine vision system. A better solution is to use a low-distortion lens or a telecentric lens. “Ask the vendor for help,” Dawson suggests.

Lens selection, field of view, resolution: Select a lens for the required “field of view” and the ability to resolve the smallest detail of interest. The camera, not the lens, should be the component that limits resolution, Dawson says.

Lighting, avoid reflections: Provide lighting that enhances what you want to gauge and suppresses artifacts, such as unwanted reflections. For example, on parts that can be back-lighted, such as stamped-metal parts, use collimated light to give “crisp” edges. Proper lighting takes some experience and experimentation. Again, ask the vision system vendor for help.

Lighting, contrast: Acquire images in the best possible conditions since accuracy will be affected by contrast and noise, says Lina.

Lighting, contrast, color: Remember the importance of lighting to creating observable contrast and forming a good image. When considering lighting, consider both the type of light and its color.

Lighting, depth of field: Ensure that lighting properly coincides with the depth of field and field of view, Boatner says.

Mounting: In a typical gauging application, a camera mounted above or to the side of a part captures an image of the part to be measured as it enters the field of view. The image is then analyzed using gauging software tools, which calculate distances between various points in the image. Based on these calculations, the vision system determines if the part dimensions are within tolerance. If dimensions fall outside of tolerance, the vision system sends a fail signal to a logic-based device (such as a programmable logic controller, PLC), which actuates a mechanism to eject the product from the line.
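
A minimal Python sketch of that cycle; the `measure` callable, the fail-signal hook, and the nominal dimensions are placeholders for whatever the real gauging tools and PLC interface provide.

```python
# Hypothetical nominal dimensions and tolerances, in inches.
NOMINAL = [1.000, 0.250]
TOLERANCE = [0.002, 0.001]

def within_tolerance(distances, nominal, tol):
    """Compare each measured edge-to-edge distance against nominal +/- tolerance."""
    return all(abs(d - n) <= t for d, n, t in zip(distances, nominal, tol))

def inspect_part(image, measure, send_fail_signal):
    """One inspection cycle: gauge the image; if any dimension is out of tolerance,
    raise the fail output so the PLC can eject the part."""
    distances = measure(image)                 # gauging tools return edge-to-edge distances
    ok = within_tolerance(distances, NOMINAL, TOLERANCE)
    if not ok:
        send_fail_signal()                     # e.g., a discrete output wired to the PLC
    return ok
```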

Software, automatic repositioning: Choose a metrology tool that can automatically reposition the metrology template (layout of regions). For example, if a geometric pattern-recognition operation is used to locate part position and orientation, the system should move the metrology regions accordingly, based on those results.

Software, based on geometry: Rely on basic geometrical manipulation rather than complex math. By adopting a metrology package that uses multiple sub-coordinate systems, you can easily construct new features that are geometrically derived from other features.

Software, camera positioning, calibration: Calibrate! Choose a software package that works in real-world coordinates, and position the camera to avoid significant distortion and perspective. Extract the metrology features from the source (distorted) image and measure them in the plane in which the calibration was performed; performing the metrology operations on the source image this way, rather than on a warped copy, maintains the level of accuracy.
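
One way to realize this advice is sketched below with OpenCV; the camera matrix, distortion coefficients, and plane homography are placeholder values that a real calibration routine would supply. Feature points extracted from the raw (distorted) image are mapped into the calibration plane before the distance is computed.

```python
import cv2
import numpy as np

# Placeholder calibration results; real values come from a calibration procedure.
K = np.array([[1500.0, 0.0, 640.0], [0.0, 1500.0, 480.0], [0.0, 0.0, 1.0]])
DIST = np.array([-0.12, 0.05, 0.0, 0.0, 0.0])   # radial/tangential distortion coefficients
H_PIX_TO_MM = np.eye(3)                          # homography from undistorted pixels to the plane (mm)

def pixels_to_world(points_px):
    """Map feature points from the raw image into the calibration plane (mm),
    so measurements are made in real-world coordinates."""
    pts = np.asarray(points_px, dtype=np.float64).reshape(-1, 1, 2)
    undistorted = cv2.undistortPoints(pts, K, DIST, P=K)      # remove lens distortion, keep pixel units
    world = cv2.perspectiveTransform(undistorted, H_PIX_TO_MM)
    return world.reshape(-1, 2)

# Distance between two extracted edge points, measured in the calibrated plane.
a, b = pixels_to_world([(310.2, 402.7), (958.4, 405.1)])
print(float(np.linalg.norm(a - b)), "mm")
```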

Software, ease of use: Pick machine vision software that is easy to set up and use, provides correction for optical and perhaps perspective distortion, and provides sub-pixel precision; software setup and distortion correction can be done in minutes, with gauging resolution of 1/25 of a pixel or better.

Software, feature-based: Imaging-based metrology should be feature-based, that is, it should extract geometric features from grayscale pixel values. Algorithms can mimic how real-world metrology is performed, says Lina, and should be robust enough to handle changes in illumination.

Software, field of view: To optimize operations, make regions as tight as possible around the feature being measured.

Software, measuring tools: Libraries of pre-configured software tools ease configuration of a machine vision application, including gauging. Gauging tools work by measuring the distance between edges in an image. An edge is any significant change in gray-value intensity in the pixel grid that makes up the image; the change can be from dark to light or light to dark. In addition to calculating distances between edges, gauging tools measure angles formed by intersecting edges, as well as the size and location of holes in a part.
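
A simplified caliper-style measurement is sketched below in Python (the intensity profile is a made-up example): find the strongest dark-to-light and light-to-dark transitions along a 1-D profile and refine each edge position to sub-pixel accuracy with a parabolic fit.

```python
import numpy as np

def caliper_width(profile):
    """Return the sub-pixel distance between the strongest rising and falling edges."""
    grad = np.gradient(profile.astype(float))
    rising, falling = int(np.argmax(grad)), int(np.argmin(grad))

    def subpixel(i):
        # Parabolic fit through the gradient peak and its neighbors.
        if 0 < i < len(grad) - 1:
            y0, y1, y2 = grad[i - 1], grad[i], grad[i + 1]
            denom = y0 - 2.0 * y1 + y2
            return i + 0.5 * (y0 - y2) / denom if denom else float(i)
        return float(i)

    return abs(subpixel(falling) - subpixel(rising))

# Hypothetical profile across a bright bar on a dark background (back-lit part).
profile = np.array([10, 12, 11, 60, 200, 210, 208, 205, 150, 30, 12, 11])
print(caliper_width(profile), "pixels")
```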

Software, setup/programming: “Some vision software environments are easier than others,” Boatner points out. They provide users with “point and click” control instead of requiring the user to know an advanced programming language for tool set-up.

Software tools for vision-based gauging include (a blob-analysis sketch follows this list):

• Edge detection: Locates edges in an image while ignoring variations in its background, and calculates the magnitudes and angles of edges;

• Caliper: Performs high-speed, sub-pixel measurement of the width of part features;

• Blob analysis: Performs highly repeatable measurements of the area, size, and centroid of part features;

• Calibration: Calibrates camera pixels to real-world engineering units;

• Non-linear calibration: Optimizes system accuracy, corrects for lens and perspective distortion; and

• Multi-pose nonlinear calibration: Optimizes system accuracy, corrects for lens and perspective distortion, and enables the use of a smaller, more manageable calibration plate in large field-of-view applications.
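
As an example of the blob-analysis tool listed above, here is a minimal OpenCV sketch; the threshold value and the synthetic test image are hypothetical, and calibration to engineering units would be applied separately.

```python
import cv2
import numpy as np

def blob_measurements(gray, threshold=128):
    """Threshold the image, then report area (pixels) and centroid of each connected feature."""
    _, binary = cv2.threshold(gray, threshold, 255, cv2.THRESH_BINARY)
    count, _, stats, centroids = cv2.connectedComponentsWithStats(binary)
    # Label 0 is the background, so skip it.
    return [{"area_px": int(stats[i, cv2.CC_STAT_AREA]), "centroid_px": tuple(centroids[i])}
            for i in range(1, count)]

# Synthetic test image: one bright rectangular blob on a dark background.
img = np.zeros((100, 100), np.uint8)
img[40:60, 30:55] = 255
print(blob_measurements(img))
```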

Set inspection thresholds using simple statistics: Repeatedly moving a threshold and evaluating results is time-consuming and does not generate demonstrable proof that the threshold is in the best place when you are done. Moreover, you can’t say something won’t work until hours have been wasted trying it, and even then you can’t be absolutely sure. To save time, Brey says (a statistics sketch follows this list):

• Configure your vision system to output the criterion against which you are setting a pass/fail threshold;

• Run your full sample of collected images and collect the output values into a spreadsheet;

• Segregate values from parts which should pass and parts which should fail;

• Calculate the standard deviation(s) of both groups;

• Make a histogram. The two distributions should not overlap. If they do, find a better criterion;

• Select a threshold which is at least 3 Sigma from the mean of the fail group. The false reject rate will be based on the Sigma level of the pass group; and

• If no threshold value exists that is 3 Sigma from both pass and fail, find a new criterion.
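
A Python sketch of that procedure, with made-up criterion values standing in for the spreadsheet data: compute the statistics of the pass and fail groups and check whether a 3-sigma threshold cleanly separates them.

```python
import statistics

def pick_threshold(pass_values, fail_values, n_sigma=3.0):
    """Place the threshold at least n_sigma above the fail-group mean and check
    that it is also at least n_sigma below the pass-group mean (assumes the pass
    group scores higher; flip the signs otherwise)."""
    mu_pass, sd_pass = statistics.mean(pass_values), statistics.stdev(pass_values)
    mu_fail, sd_fail = statistics.mean(fail_values), statistics.stdev(fail_values)
    threshold = mu_fail + n_sigma * sd_fail
    return {
        "threshold": threshold,
        "three_sigma_from_both": threshold <= mu_pass - n_sigma * sd_pass,
    }

# Hypothetical criterion values exported from the vision system into a spreadsheet.
good = [0.92, 0.95, 0.91, 0.94, 0.93, 0.96]
bad = [0.55, 0.61, 0.58, 0.52, 0.60]
print(pick_threshold(good, bad))   # if three_sigma_from_both is False, find a better criterion
```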

Setup, inspection development, using stored images: Vision engineers spend hours running a process while refining an inspection. Costs include money, parts, and the time to run the process. Waiting for a rare defect may be necessary, but waiting for one, or trying to create one, wastes time, and it takes a long time to evaluate the effectiveness of any change. Instead (an off-line evaluation sketch follows this list):

• Align the camera and lights optimally. Be sure all objects of interest are visible and have optimum contrast;

• Verify system is focused;

• Collect dozens, sometimes hundreds of images of different parts. Create a smaller working sample of 10 to 20 images with a few good parts and a couple examples of each failure mode. Doing this may require collecting gigabytes of images;

• Create the inspection and refine it until all images in the working sample evaluate correctly;

• Inspect all images from the larger sample. Refine the inspection until there are no “False Accepts,” and the rate of “False Rejects” is acceptable; and

• Apply the inspection to the system. Continue collecting all images. Add any problem images to the working sample and further refine the inspection off-line, reducing the time required in front of a running machine.
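
A minimal Python sketch of the off-line evaluation step; the directory layout, the image format, and the `inspect` and `truth` objects are assumptions standing in for the inspection under development and the known disposition of each stored image.

```python
from pathlib import Path

import cv2

def evaluate_offline(image_dir, inspect, truth):
    """Run every stored image through the inspection and tally false accepts and
    false rejects, so refinement happens off-line instead of on the machine."""
    false_accepts, false_rejects = [], []
    for path in sorted(Path(image_dir).glob("*.png")):
        passed = inspect(cv2.imread(str(path), cv2.IMREAD_GRAYSCALE))
        is_good = truth[path.name]             # True = known good part, False = known defect
        if passed and not is_good:
            false_accepts.append(path.name)
        elif not passed and is_good:
            false_rejects.append(path.name)
    return false_accepts, false_rejects

# Hypothetical usage:
# fa, fr = evaluate_offline("collected_images", inspect, truth)
# print(len(fa), "false accepts;", len(fr), "false rejects")
```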

ONLINE EXTRA

Is +/-0.001-in. repeatability enough?

To meet the demands for higher throughput rates, improved product quality, and increased yields, manufacturers will continue to rely on machine vision technology as a non-contact way to measure parts, says Bryan Boatner, Cognex product marketing manager for In-Sight vision sensors.

Machine vision systems provide 100% inspection, measuring parts at high speeds with a high degree of repeatability, he says, and they can measure products that cannot be measured using mechanical gauges.

With new capabilities such as multi-pose calibration, a high-accuracy gauging application using a 2-megapixel camera can maintain better than +/-0.001-in. repeatability over an 8-in. horizontal field of view, which is 1/4-pixel repeatability in an image with lens distortion. This capability can help reduce the need for more expensive optics in large field-of-view applications.
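
The figure can be sanity-checked with back-of-the-envelope arithmetic, sketched below in Python; the sensor widths are assumptions, since 2-megapixel cameras range from roughly 1,600 to 2,048 pixels across.

```python
def repeatability_in(fov_in, pixels_across, subpixel_fraction=0.25):
    """Pixel size across the field of view times the achievable sub-pixel fraction."""
    return fov_in / pixels_across * subpixel_fraction

# At 2,048 pixels over an 8 in. FOV, a quarter pixel is about 0.001 in.;
# narrower sensors need slightly better sub-pixel performance to match it.
for width in (1600, 1920, 2048):
    print(width, round(repeatability_in(8.0, width), 5), "in.")
```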

Other useful references
Related reading from Control Engineering includes the “Product Research” article on machine vision (CE, Aug.’06, p.64) and:

  • High Horsepower Vision Sensors

  • Vision Systems (how to buy and specify; checklist)

  • Adjust Quality in Real Time

  • Machine Vision Looks Well Beyond Inspection

  • Cognex

Control lighting fluctuations
Because lighting types and angles highlight part features in different ways, changes in part illumination can challenge any software, suggests Banner Engineering. To see how lighting changes part details, see below or visit https://www.bannerengineering.com/comparelighting:

  • Ring light that surrounds the sensor lens hides raised features;

  • Low angle light accentuates rough areas and raised edges. It hides shiny areas;

  • Dome light evenly lights the whole part;

  • Backlight shows only the outline;

  • On-axis light can reveal a fingerprint; and

  • Area light at 30 degrees can highlight a raised area and reduce glare.

DMC Inc. is a CSIA member as of 3/1/2015