How industrial vision systems differ from the human brain

Industrial machine vision systems don't work quite the way the human brain does; they are less flexible and have other limitations we may not realize.

By Jon Breen February 5, 2021

I see a lot of vision applications, and sometimes it surprises me what people expect a camera to do. Industrial vision inspection is very powerful, but it doesn't work the same way the human brain does, so you can't expect it to do the same things in the same way. This can be both a help and a hindrance. As with any engineering task, understanding how the system works can help us maximize the benefits and mitigate the weaknesses.

How the human brain processes images

Your brain is a pattern-recognizing machine. Primed with a lifetime of learned patterns, it can quickly identify and categorize nearly anything you can see. Even when pictures aren't clear, you can probably make a good guess just based on context. You know where you are, what you're looking for, and you've got the additional benefit of all your other senses.

When you’re considering a vision inspection task, you also have deep knowledge about the product you’re inspecting, the process that created it, and the things that can go wrong. That helps in determining if something is a good part or reject.

Unlike a computer, your brain is very flexible and readily learns from experience. You can quickly and effortlessly learn new product categories or defect types. You can recognize patterns without having to be taught. If you don’t see what you’re looking for in the place you expect, you’ll look elsewhere. Even fuzzy or distorted images are recognizable to the human brain, and you’ll usually get better over time.

While the human brain is supreme when it comes to pattern recognition (also called categorization in vision systems), it has some limitations. It can be inconsistent from person to person or day to day, it is sometimes slow, and it isn't numerical, so tasks like measurement from visual inputs are difficult.

How industrial vision inspection works

On the other hand, when you've got an automated vision inspection system, it only knows what you tell it. It has no idea about the process or the part, no stored bank of patterns to recognize, no awareness of context, and no other senses.

The programmed system is very rigid in its approach to finding, categorizing, and measuring features. Each step of the process has to be specifically set up by the programmer, so it won't improve over time without help, it won't learn new patterns, and it won't look anywhere else if it doesn't find what it's looking for.

This rigid approach has its benefits, as well. It tends to be fast, consistent, and numerical, all important features for industrial processes. It's also very good at measurement tasks.
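As a concrete illustration of that numerical strength, here is a minimal sketch (in Python, with invented numbers) of how a vision measurement typically works: calibrate against a target of known size, then convert pixel measurements to real-world units.

```python
# Hypothetical sketch: converting a pixel measurement to millimeters
# using a calibration factor from a reference target of known size.
# All values here are made up for illustration.

def calibrate(reference_width_mm, reference_width_px):
    """Return the mm-per-pixel scale from a target of known size."""
    return reference_width_mm / reference_width_px

def measure_mm(feature_width_px, mm_per_px):
    """Convert a measured pixel width to millimeters."""
    return feature_width_px * mm_per_px

# A 10.0 mm calibration target spans 400 pixels in the image.
scale = calibrate(10.0, 400)     # 0.025 mm per pixel
width = measure_mm(842, scale)   # a part feature spanning 842 pixels
print(f"{width:.3f} mm")         # 21.050 mm
```

Because the system works in numbers from the start, checking the result against a tolerance is a one-line comparison, something the human eye simply can't do reliably.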

Simplicity is important for automated inspection. Set up the camera and lighting to collect images with high contrast on the features of interest and low contrast on everything else. This helps simple algorithms (compared to your brain) to be effective. Simple algorithms process faster, and they’re more feasible from a programming perspective.
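To see why high contrast keeps the algorithms simple, consider this toy sketch: when the feature of interest is much brighter than everything else, a single threshold is enough to separate it from the background. (The image values and threshold here are invented for illustration.)

```python
# Toy sketch: with good lighting, a single threshold separates
# feature pixels from background pixels in a grayscale image.

def threshold(image, level):
    """Binarize a grayscale image (rows of 0-255 values):
    1 where a pixel is at or above the level, 0 elsewhere."""
    return [[1 if px >= level else 0 for px in row] for row in image]

# High-contrast image: feature pixels near 200, background near 30.
image = [
    [30, 32, 198, 201],
    [28, 35, 205, 197],
]
mask = threshold(image, 128)
feature_area = sum(sum(row) for row in mask)  # counts feature pixels: 4
```

With poor contrast, the feature and background values overlap, and no single threshold works, which is exactly when the programming effort (or the hardware bill for better lighting) starts to climb.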

Machine learning works as a hybrid approach

Very recently, we've gained a new tool in industrial vision that's like a cross between your brain and a traditional automated vision system. Machine learning (ML), a branch of artificial intelligence, can now be purchased for automated vision inspection, granting machines some of the power of the human brain.

These systems tend to be good for inspection tasks that can't be set up for high contrast on the features of interest. For example, ML approaches can inspect weld quality using discoloration. Some processes also have defect modes that are hard to quantify for a traditional system, things like scratches, burn marks, and dents, where ML techniques can be a good fit.
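As a toy illustration of the difference in approach, here is a minimal nearest-centroid classifier in Python. It learns its decision from labeled examples rather than from rules a programmer wrote. Real ML inspection systems use far richer features and models; the feature names and numbers here are invented.

```python
# Toy sketch of learning from examples: classify a part as "good" or
# "scratched" by comparing its features to the average (centroid) of
# labeled training examples. Features and values are made up.

def centroid(samples):
    """Average each feature across a list of feature vectors."""
    n = len(samples)
    return [sum(s[i] for s in samples) / n for i in range(len(samples[0]))]

def classify(sample, centroids):
    """Return the label of the nearest class centroid (squared Euclidean)."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist2(sample, centroids[label]))

# Labeled training examples: [mean_brightness, edge_density]
good = [[0.80, 0.10], [0.78, 0.12], [0.82, 0.09]]
scratched = [[0.60, 0.40], [0.58, 0.45], [0.62, 0.38]]
centroids = {"good": centroid(good), "scratched": centroid(scratched)}

print(classify([0.79, 0.11], centroids))   # good
print(classify([0.59, 0.42], centroids))   # scratched
```

The key point isn't this particular algorithm; it's that adding a new defect type means adding labeled examples, not writing new inspection rules.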

This is still a new area for automated inspection. The tools are expensive, they require a lot of processing power, and they aren't as well suited to the tasks traditional tools excel at: location, measurement, and high-speed applications. They have a big part to play in the future of automated inspection, but aren't yet a primary choice for most applications.

Closing thoughts

Engineering is always about weighing pros and cons, applying the right tools for the application in a way that maximizes benefits and minimizes drawbacks. Vision is a perfect example. When you know how it works, you can apply it to the right applications with the right lighting and programming to do the job well. And the big takeaway is: don’t expect the computer to be as smart as a human brain. The technology just isn’t there yet.

This article originally appeared on Breen Machine Automation Services' blog. Edited by Chris Vavra, web content manager, Control Engineering, CFE Media and Technology.


Author Bio: Jon Breen, owner, Breen Machine Automation Services, LLC