Six tips for picking the correct machine vision lens

Picking the right machine vision lens requires the user to consider the type of lens, sensor, and more.

By AIA July 12, 2019

Deciding on the right lens for a machine vision application calls for a review of the required specifications, some math, and a consideration of how the lens will integrate with the camera setup. Chief among the factors to weigh is the sensor the lens will pair with: sensor size and pixel size drive the selection, and the lens must properly illuminate the complete sensor area to avoid shading and vignetting.
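The coverage requirement above can be checked with simple geometry: a lens avoids vignetting only if its specified image circle is at least as large as the sensor diagonal. The sketch below is not from the article; the sensor dimensions and image-circle figure in the example are illustrative assumptions.

```python
import math

def sensor_diagonal_mm(width_mm: float, height_mm: float) -> float:
    """Diagonal of the active sensor area in millimetres."""
    return math.hypot(width_mm, height_mm)

def lens_covers_sensor(image_circle_mm: float,
                       width_mm: float, height_mm: float) -> bool:
    """True when the lens's image circle fully covers the sensor,
    i.e. no mechanical vignetting at the corners."""
    return image_circle_mm >= sensor_diagonal_mm(width_mm, height_mm)

# Example (illustrative numbers): a sensor of roughly 7.2 mm x 5.4 mm
# has a 9 mm diagonal, so a lens rated for an ~11 mm image circle covers it,
# while one rated for ~8 mm would vignette the corners.
print(lens_covers_sensor(11.0, 7.2, 5.4))  # True
print(lens_covers_sensor(8.0, 7.2, 5.4))   # False
```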

An ideal lens would produce an image that perfectly matches the object captured, including all details and brightness variations. Standard lenses, typically rated at about a megapixel of resolution, come in fixed focal lengths of roughly 4.5-100 mm. Macro lenses are optimized for close-up focus, while telecentric lenses serve specialized metrology applications, where they eliminate dimensional and geometric variations in the image.

1. Calculate focal length

When selecting the correct lens for an application, designers calculate the needed working distance from three factors: the focal length, the length of the inspected object, and the sensor size. The required working distance may also be calculated from the object size and the opening angle of the lens.

This object distance refers to the distance between the object and the first principal plane of the lens groups in the optical system. It can only be calculated once the positions of the two principal planes and the real lens length are specified.
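The relationship between these quantities can be sketched with the thin-lens model: the magnification is the ratio of sensor size to field of view, and focal length follows from the working distance. This is a first-order approximation (it ignores principal-plane separation, which the paragraph above notes matters for real lenses), and the example numbers are assumptions, not values from the article.

```python
def focal_length_mm(sensor_size_mm: float, working_distance_mm: float,
                    field_of_view_mm: float) -> float:
    """Thin-lens estimate of focal length.

    Magnification m = sensor size / field of view; with the object at the
    working distance u, the lens equation 1/f = 1/u + 1/v and v = m*u
    give f = u * m / (1 + m).
    """
    m = sensor_size_mm / field_of_view_mm
    return working_distance_mm * m / (1.0 + m)

# Example (illustrative): an 8.8 mm wide sensor imaging a 200 mm field of
# view from 500 mm away wants a ~21 mm lens; in practice one rounds to the
# nearest stock focal length and re-derives the exact working distance.
print(round(focal_length_mm(8.8, 500.0, 200.0), 1))  # 21.1
```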

2. Choosing the right machine vision camera

Recent advancements in machine vision technology now let cameras transfer high-megapixel images at fast frame rates. Selecting the best interface requires a review of several considerations.

3. Choose a sensor type (CMOS or CCD)

The lines between charge-coupled device (CCD) and complementary metal-oxide-semiconductor (CMOS) sensors are blurring. Traditionally, CCDs have offered higher image quality, better light sensitivity, better noise performance, and a true global shutter. CMOS sensors are known for their high speed, on-chip system integration, and low manufacturing cost. But CMOS sensors have shown significant improvements in image quality.

4. Color camera or monochrome camera

Many vision systems make use of color cameras because users are more comfortable with color images. Some designers assume monochrome cameras cost less, but that’s not always true. It’s also not always necessary to use a color camera to differentiate between only a few colors. Processing time, resolution, pass/fail accuracy, power consumption, and cost must all be considered.

5. Camera output format

It’s not just a matter of choosing the “fastest” interface available. Compatibility with existing systems and integration with other available technologies must be considered. Some interfaces allow for “hot-swapping,” while others may require a full system shutdown. Engineers must consider cable lengths and signal degradation, as well as power requirements.

6. Frame rates

The output format limits the frame rates achievable by the system, so it’s crucial to calculate the frame rate the application requires before settling on the other camera factors.

This article originally appeared in Vision Online. AIA is a part of the Association for Advancing Automation (A3), a CFE Media content partner. Edited by Chris Vavra, production editor, CFE Media, cvavra@cfemedia.com.

Original content can be found at www.visiononline.org.

