Machine vision starts thinking for itself

Machine vision developments such as image registration, computer vision, deep learning and other artificial intelligence (AI) technologies are beginning to enable radical market change.

Machine vision technology has proved invaluable for quality control during the processing and packaging of food products. In the last 20 years, applications for machine vision have grown significantly, cementing the technology's place as a vital aspect of food manufacturing. The market growth of automation and robotics is a key driver of the machine vision industry, as automated technologies rely heavily on vision and imaging to perform their roles effectively. Now, developments in image registration, computer vision, deep learning and other AI technologies are beginning to enable radical change in the market.

A core application for machine vision technology is in quality assurance (QA), helping to determine whether food products or packaging are defective before leaving the factory. Food processing companies may check the size, shape and color of a potato product, or may want to assess the fat content of a meat product. They will also check that packaging, both primary and secondary, is not defective and that any labeling is correct. Regulatory and customer demands for quality, combined with increasing competitive pressure, have raised the requirement for QA managers to ensure visual inspection is performed.
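To make these checks concrete, the following is a minimal, hypothetical sketch of a size and color pass/fail test of the kind described above, written with OpenCV. It assumes a top-down image of a single product on a plain conveyor background; the thresholds, target hue and file name are illustrative placeholders, not values from any real system.

```python
import cv2
import numpy as np

MIN_AREA_PX = 5_000          # assumed acceptable size range, in pixels
MAX_AREA_PX = 20_000
TARGET_HUE = 22              # assumed "golden" hue for a fried potato product
HUE_TOLERANCE = 8

def inspect(image_bgr: np.ndarray) -> bool:
    """Return True if the product passes the size and color checks."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

    # Size check: area of the largest connected contour.
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return False
    area = max(cv2.contourArea(c) for c in contours)
    if not MIN_AREA_PX <= area <= MAX_AREA_PX:
        return False

    # Color check: mean hue inside the product mask.
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
    mean_hue = cv2.mean(hsv, mask=mask)[0]
    return abs(mean_hue - TARGET_HUE) <= HUE_TOLERANCE

frame = cv2.imread("potato_sample.png")   # hypothetical sample image
if frame is not None:
    print("PASS" if inspect(frame) else "FAIL")
```

In practice a production check would also handle multiple products per frame, calibrated units rather than pixels, and controlled lighting, but the structure of the test is the same.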

Identifying defective products

Food processing best practice requires investment in machine vision QA products to identify defective products or packaging and, if possible, determine why the defects occurred. Until the introduction of autonomous machine vision systems, however, the industry was rife with challenges, because solutions were complex, time-consuming and expensive to install and set up.

Traditionally, machine vision solutions have been complex, often requiring a systems integrator to complete the project — from creating proof-of-concept and test plans, to selecting components and bringing it all together as a hard-engineered solution on the production line.

Another challenge that has plagued the industry is the lack of flexibility in machine vision solutions. Any minor environmental change, or alteration to the product being inspected, requires the return of the integrator. In many cases, the original solution is rendered completely obsolete and unsuitable for the new application, despite the vast investment required to implement it.

To overcome these challenges, and to directly address QA managers' needs on industrial production lines across the globe, Inspekto developed what it believes to be the world's first autonomous machine vision system. It allows food businesses to benefit from visual QA at any point on the production line, at one-tenth of the cost and at least 1,000 times the installation speed of traditional solutions. The system is suitable for any handling method, from manual to automated, and can be used for visual quality inspection and sorting.

In stark contrast to the complex assemblies of traditional machine vision, this autonomous machine vision solution consists of a single, standalone appliance that integrates software and hardware, allowing easy, low-cost installation and removing the need for a systems integrator.

The system consists of four main parts: the vision sensor, the arm, the mounting adaptor and the controller. A straightforward process, designed to take less than one hour, is all that is required to install it on any production line. The operator must first choose the position of the vision sensor so that it captures the entire surface of the part to be inspected. Next, a mounting point close to the vision sensor must be selected. The mounting adaptor can then be assembled onto any profile in the plant, and the telescopic arm assembled and attached to the vision sensor.

Following these steps, the operator can connect the controller and the system will launch automatically. To set up inspection, the operator draws a polygon to form an outline around the part to be inspected, selecting regions of interest and regions to exclude. The entire process, from the moment the system is taken out of the box, requires no training.
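The polygon step amounts to building a binary inspection mask from the operator's clicked points. The sketch below shows one simple way this could be done with OpenCV; the point coordinates and image size are hypothetical placeholders, not part of the described product.

```python
import cv2
import numpy as np

def build_inspection_mask(image_shape, include_poly, exclude_polys=()):
    """Return a binary mask: 255 inside the inspected region, 0 elsewhere."""
    mask = np.zeros(image_shape[:2], dtype=np.uint8)
    cv2.fillPoly(mask, [np.array(include_poly, dtype=np.int32)], 255)
    for poly in exclude_polys:
        # Exclusion regions (for example, a date-code window) are zeroed out.
        cv2.fillPoly(mask, [np.array(poly, dtype=np.int32)], 0)
    return mask

# Example: a four-point outline around the part with one excluded window.
include = [(120, 80), (520, 80), (520, 400), (120, 400)]
exclude = [[(300, 200), (360, 200), (360, 260), (300, 260)]]
mask = build_inspection_mask((480, 640, 3), include, exclude)
```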

The AI approach

Once the system is up and running, its holistic AI approach kicks in. To ensure the clearest, most informative images are acquired, an AI algorithm optimizes the camera and illumination settings for the object and its environment. From this point, the algorithm can detect and localize the object without any operator input: rather than requiring a machine vision expert to define the detection and localization algorithms, the system is ready to go.
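The article does not disclose how the system's optimization works internally; the following is a generic, hedged sketch of the underlying idea, in which candidate exposure settings are swept and scored for sharpness and contrast, and the best-scoring setting is kept. The `capture_frame` callable and the exposure values are hypothetical stand-ins for whatever camera interface is actually used.

```python
import cv2
import numpy as np

def image_quality(frame_gray: np.ndarray) -> float:
    """Score a frame: variance of the Laplacian (sharpness) plus contrast."""
    sharpness = cv2.Laplacian(frame_gray, cv2.CV_64F).var()
    contrast = frame_gray.std()
    return sharpness + contrast

def auto_tune_exposure(capture_frame, exposures_ms=(1, 2, 5, 10, 20, 50)):
    """Pick the exposure whose captured frame scores best."""
    best_exposure, best_score = None, -np.inf
    for exposure in exposures_ms:
        frame = capture_frame(exposure)               # hypothetical camera call
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        score = image_quality(gray)
        if score > best_score:
            best_exposure, best_score = exposure, score
    return best_exposure
```

A real system would tune illumination, gain and focus together and revisit the settings when the environment changes, but the search-and-score pattern is the same.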

The final step is for the operator to verify a handful of good sample references so the system can learn what a gold-standard product looks like. It will use as few references as possible, depending on the part being inspected and its movement profile. The inspection process can then begin.

Each new image is compared with the gold-standard references to verify shape tolerances and surface variations. Because the system is not searching for predefined defects or comparing against defective parts, it can detect defects the manufacturer had never considered.
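The system's actual comparison method is not published; as a hedged illustration of reference-based anomaly detection, the sketch below flags a part as defective when its best structural similarity (SSIM) against the stored gold-standard images falls below a hypothetical threshold. It assumes grayscale images that have already been localized and resampled to the same size.

```python
import numpy as np
from skimage.metrics import structural_similarity

PASS_THRESHOLD = 0.85        # hypothetical acceptance threshold

def is_defective(part_gray: np.ndarray, references: list[np.ndarray]) -> bool:
    """Flag a part if it is not sufficiently similar to any golden reference."""
    scores = [structural_similarity(part_gray, ref) for ref in references]
    return max(scores) < PASS_THRESHOLD
```

Note that this approach needs only good samples: anything that deviates too far from the references is rejected, which is why previously unseen defect types can still be caught.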

Because of the self-adapting capabilities of the system, manufacturers can easily move it to another point on the production line as and when required. Furthermore, because the system is simple and affordable to install, it enables total QA, where machine vision technology can be incorporated at any point of the production line to track every step. This makes it easy to identify a defective product, pinpoint where the defect was introduced and therefore determine why the product is faulty. As one plant manager put it: "The move to autonomous machine vision systems enables plants not only to protect their customers from defective products, but also to protect the plants from scrap and flawed manufacturing processes."

This article originally appeared on the Control Engineering Europe website. Edited by Chris Vavra, production editor, Control Engineering, CFE Media, [email protected].