Sensor selection depends on the application, and so does lighting, which makes optics just as important. Optics suppliers, however, struggle to keep pace with the machine vision industry's rapid evolution.
New applications and sensors tend to drive most innovation in optics and lighting for machine vision, and this innovation must often advance in parallel. Sensor selection is highly dependent on the end-application in question and, often, so is lighting.
For optics, however, the sensor and lighting are usually the “application in question.” Even the sharpest-eyed sensor will fail to perform to its potential unless a lens distributes light correctly and evenly over its active area. The challenge for optics suppliers is that machine vision applications are evolving so quickly it is difficult to keep pace.
“There’s so much new application demand as sensors and lighting technology advance. The question for optics suppliers has become ‘What lens design needs to be optimized to work with what’s going on downstream?’” said Greg Hollows, vice president, Imaging Business Unit at Edmund Optics.
Optical design challenges
A remote sensing unit mounted on a drone, for example, might favor a larger format sensor with a high resolution simply because packing more pixels into a larger array enables the drone to gather as much image data as possible during its limited flight time. This application demand is easy to intuit and therefore anticipate if you’re a sensor manufacturer. So, it is reasonable to proactively invest in the development of larger format sensors. That option is not so straightforward for optics manufacturers, who cannot design a larger lens without also considering many other variables.
Many of these systems incorporate lens assemblies and filters together, which can become complicated for the wide field-of-view applications usually associated with drones today.
“In the past, the application needs were less demanding. Since systems were more forgiving, an optics designer did not always need to account for effects seen by the filters. Now, these very wide angles can cause the light that is passed by the filter to shift towards the blue part of the spectrum (blue shift) as you go across the field of view,” Hollows explained with regard to our hypothetical remote-sensing application. “Today, however, you need to design a lens with accommodations for a filter in mind that will minimize blue shift from light passing through at a wide angle, say 45 degrees. It can make things harder as it removes some of the off-the-shelf plug-and-play options that were available in the past.”
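For a sense of scale, the center wavelength of an interference filter shifts with angle of incidence roughly as λ(θ) = λ(0)·√(1 − (sin θ / n_eff)²), where n_eff is the filter's effective index. The short Python sketch below illustrates that relation only; the 850 nm bandpass filter and the effective index of 1.8 are assumed values for the example, not figures from Edmund Optics.

```python
import math

def shifted_center_wavelength(cwl_normal_nm, aoi_deg, n_eff=1.8):
    """Approximate an interference filter's center wavelength at a given
    angle of incidence (AOI) using the standard thin-film relation
        lambda(theta) = lambda(0) * sqrt(1 - (sin(theta) / n_eff)**2).
    n_eff = 1.8 is a placeholder; the real value comes from the filter
    manufacturer's datasheet."""
    theta = math.radians(aoi_deg)
    return cwl_normal_nm * math.sqrt(1.0 - (math.sin(theta) / n_eff) ** 2)

# Hypothetical 850 nm bandpass filter viewed across a wide field of view.
for aoi in (0, 15, 30, 45):
    cwl = shifted_center_wavelength(850.0, aoi)
    print(f"AOI {aoi:2d} deg -> center wavelength ~{cwl:6.1f} nm "
          f"(blue shift {850.0 - cwl:5.1f} nm)")
```

At 45 degrees the assumed filter's passband has moved tens of nanometers toward the blue, which is why a narrow filter chosen for on-axis performance can stop passing the intended wavelength at the edge of a wide field.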
The complications don’t stop with larger formats. The stacking of CMOS image sensors can help speed image capture, but it presents a design problem similar to the one posed by the remote-sensing application.
“The stack up of CMOS imaging boards is creating deeper wells that require more narrow-incident-angle image side lens design,” said Jason Baechler, president of Moritex. “The lenses that were compatible with a CCD sensor larger than a two-thirds-inch format might not be compatible with a CMOS sensor of the same size. They might share the same resolution, but [CMOS image capture] will suffer in contrast and uniformity. The center of field will appear brighter, while intensity drops off at the corners. So overall uniformity is poor.”
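One way to get intuition for that roll-off is the classic cos⁴ falloff, which ties corner brightness to the chief ray angle at the image edge. The sketch below is a deliberate simplification (real lenses, microlens arrays, and pixel-well geometry all change the numbers) and does not model any specific Moritex lens or sensor design.

```python
import math

def cos4_relative_illumination(chief_ray_angle_deg):
    """Classic cos^4 approximation: relative corner illumination of a
    simple, non-telecentric lens as a function of the chief ray angle
    at the image corner. Deep pixel wells make the penalty for large
    angles even worse, which is the effect described above."""
    return math.cos(math.radians(chief_ray_angle_deg)) ** 4

for angle_deg in (5, 10, 20, 30):
    ri = cos4_relative_illumination(angle_deg)
    print(f"chief ray angle {angle_deg:2d} deg -> corner illumination "
          f"~{ri * 100:5.1f}% of center")
```

Even before any sensor-level vignetting, a 30-degree chief ray angle leaves the corners at roughly half the brightness of the center, which matches the uniformity complaint above.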
Because lens components are less likely than sensors to be commercial-off-the-shelf products, design and production costs are higher. This also creates a potential bottleneck for camera-makers and integrators who are focused on time-to-market.
“In the past, three people in the world might have had two machines that could make an optic for a certain scenario,” Hollows said. “And they were operated by an engineer, not a tech, so, it was cost-prohibitive. What’s advanced is that it’s now only expensive instead of prohibitive. People understand how to do the hard stuff, but it’s become how to do it cost-effectively. For example, designing the filter with the optic allows you to design it smaller, better, and less expensively.”
Making light do
This appeal to a more holistic design approach extends to the lighting component. Ideally, lighting and imaging optics should be designed together as a system to ensure optimal light rays are collected by the lens and suboptimal rays are not. Light that falls outside a lens’ field of view contributes to glare and reduced image contrast. Designing the lighting and optical elements together makes it much easier to match both for best effect. It can also minimize costs and improve time-to-market.
The wavelength range of an imaging system’s light source also influences lens selection, as well as overall system performance. Image contrast can often be heightened simply by narrowing that range to a few nanometers. Using filters with a broadband light source can help, but this is not always as flexible a solution as selecting a narrowband light source. For one, lenses focus different wavelengths at different distances, so a narrowband light source can actually improve effective lens performance by limiting chromatic focal shift. In addition, what filters can achieve depends on the light source; by themselves, they do not improve contrast.
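To make the chromatic argument concrete, consider an idealized thin singlet: its focal length scales as 1/(n(λ) − 1), so dispersion in the glass translates directly into focal shift. The catalog-style BK7 indices below are standard values, but the single-element lens and the 100 mm reference focal length are assumptions made purely for this sketch.

```python
# Axial chromatic focal shift of an idealized thin singlet (illustrative only).
# Standard BK7-like refractive indices at the F, d, and C spectral lines.
N_BK7 = {486.1: 1.5224, 587.6: 1.5168, 656.3: 1.5143}  # wavelength (nm): index

F_REF_MM = 100.0           # assumed focal length at the reference wavelength
REF_WAVELENGTH_NM = 587.6

# Thin-lens relation: 1/f = (n - 1) * C, where C depends only on the surface
# curvatures, so f(lambda) scales as 1 / (n(lambda) - 1).
curvature_term = 1.0 / (F_REF_MM * (N_BK7[REF_WAVELENGTH_NM] - 1.0))

for wavelength_nm in sorted(N_BK7):
    n = N_BK7[wavelength_nm]
    f_mm = 1.0 / ((n - 1.0) * curvature_term)
    print(f"{wavelength_nm:6.1f} nm: focal length ~{f_mm:7.2f} mm "
          f"(shift {f_mm - F_REF_MM:+6.2f} mm)")
```

Across the visible band the uncorrected element's focus wanders by more than a millimeter, which is exactly the error a lens designer no longer has to fight when the light source is confined to a few nanometers.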
“If the ink on a label absorbs near-infrared light, for example, the label will appear blank even if the near infrared is filtered out at the camera,” Baechler said. “This makes it increasingly important to find light sources that emit within certain bands and design optics around them.”
Leveraging the wavelength of a light source can accomplish a great deal in machine vision. Using narrowband light sources, such as LEDs, minimizes what a lens design needs to correct and introduces interesting new options for increasing contrast in an image. The way a particular material absorbs or reflects a specific wavelength can also highlight defects that are invisible under broadband light.
Each color in the visible range has an opposite color that can be used to heighten image contrast. So, illuminating a green feature with green light, for instance, will make it appear brighter on an image sensor, while illuminating it with red light — green’s opposite — will make it appear darker.
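A toy three-band example makes the rule easy to see. The reflectance numbers below are invented for illustration (a real inspection would use measured spectra), but they show how the same part flips from bright to dark when the illumination moves from a like color to its opposite.

```python
# Hypothetical reflectances in the red, green, and blue bands (made-up values).
features = {
    "green gasket": {"R": 0.15, "G": 0.80, "B": 0.20},
    "grey housing": {"R": 0.50, "G": 0.50, "B": 0.50},
}

def sensor_signal(reflectance, led_band):
    """With a narrowband LED and a monochrome sensor, the signal is
    roughly the part's reflectance inside the LED's emission band."""
    return reflectance[led_band]

for led_band in ("G", "R"):
    print(f"Illumination: {led_band}-band LED")
    for name, refl in features.items():
        print(f"  {name:13s} -> relative signal {sensor_signal(refl, led_band):.2f}")
```

Under green light the gasket outshines its grey surroundings; under red light the contrast reverses and the gasket goes dark, matching the like-brightens, opposite-darkens rule above.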
Improving image contrast need not rely on a narrowband light source, however. The short wavelengths in the ultraviolet (UV) range, for example, scatter off surface features more strongly than visible or infrared rays. Or, if UV rays are absorbed, they tend to be absorbed at the surface. In either case, using gas-discharge tubes or UV LEDs to image how UV light interacts with a material can help detect contaminants or shallow scratches that would be invisible elsewhere on the spectrum. This offers clear benefits when inspecting glass displays, lenses, and other materials that appear transparent in the visible spectrum.
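The wavelength dependence can be put in rough numbers. For defects much smaller than the wavelength, scattered intensity in the Rayleigh regime scales as 1/λ⁴; the wavelengths below (a 365 nm UV LED against common visible and NIR sources) are illustrative choices, not values from the article, and real scratches larger than the wavelength will show a weaker advantage.

```python
# Rayleigh-regime comparison: scattered intensity ~ 1 / wavelength**4
# for features much smaller than the wavelength (rough rule of thumb).
UV_NM = 365.0  # assumed UV LED wavelength

for other_nm in (525.0, 660.0, 850.0):
    gain = (other_nm / UV_NM) ** 4
    print(f"{UV_NM:.0f} nm UV scatters ~{gain:4.1f}x more strongly than "
          f"{other_nm:.0f} nm light from the same sub-wavelength defect")
```

Even against green light the assumed UV source returns several times the scattered signal, which is why shallow surface damage that vanishes under visible illumination can stand out clearly in a UV image.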
Similar benefits await on the other end of the electromagnetic spectrum, where short-wave infrared (SWIR) light sources and sensors are producing high-contrast images that were difficult or impossible to produce with visible light. Water absorbs very strongly in SWIR at 1450 nm, which makes moisture appear dark in an InGaAs camera image. This introduces new options for inspecting produce for bruises or measuring moisture content in crops or seedlings. SWIR also penetrates thin plastic walls, which allows automated measurement of liquid levels through otherwise opaque containers.
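The Beer-Lambert law shows why even thin layers of moisture go dark at that wavelength. The absorption coefficient used below (roughly 26 per centimeter for water near 1450 nm) is an assumed order-of-magnitude figure rather than a number from the article, so treat the output as a sketch.

```python
import math

ALPHA_WATER_PER_CM = 26.0  # assumed absorption coefficient of water near 1450 nm

def transmission(path_length_mm):
    """Beer-Lambert: fraction of 1450 nm light surviving a water path."""
    return math.exp(-ALPHA_WATER_PER_CM * path_length_mm / 10.0)

for mm in (0.1, 0.5, 1.0, 2.0):
    print(f"{mm:3.1f} mm of water -> ~{transmission(mm) * 100:5.2f}% transmitted")
```

A single millimeter of water passes less than a tenth of the incident 1450 nm light, so a bruise or a liquid fill line shows up as a sharply darker region to an InGaAs camera.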
Winn Hardin is contributing editor for AIA. This article originally appeared in Vision Online. AIA is a part of the Association for Advancing Automation (A3), a CFE Media content partner. Edited by Chris Vavra, production editor, CFE Media, [email protected].