Vision Sensor Helps Robot Rock On
A vision sensor, edge tool software, and a PLC make robot "Roxanne" a Guitar Hero winner, and give her student-inventor crucial vision system experience.
Machine vision sensors are often likened to human eyes: They scan and “see” the world around them, then transmit the data elsewhere for some action to occur. But can a vision sensor watch TV? Turns out it can. And when put together with mechanical actuators and a PLC brain, it can also play video games.
Casting about for a project for his robotics class at Minnesota West Community and Technical College, engineering student Pete Nikrin hit on the idea to design a robot to compete against a friend newly introduced to the Guitar Hero electronic game. Playing Guitar Hero requires fast button-pushing on a guitar-shaped input device in response to dots (notes) that move down a path on a video screen.
For the project, Nikrin decided to use a mannequin—complete with Minnesota West sweatshirt and painted fingernails—with a vision sensor in its left eye and mechanical actuators attached to its painted nails. And, he wanted to keep the video game and the robot two separate entities.
At first, he says, “I couldn't find anything online like it. Then, a month or two into it, a bunch of videos showed up [using other methods]. To the best of my knowledge, though, mine is the only one that has the robot linked only by vision to the game.” Other projects tap directly into the game's electronic signal, which Pete, as a robotics guy, thinks “is kind of cheating.”
Not unlike problem-solving in an industrial application, Nikrin ran into two frustrations early on: finding the right sensor, and building a vision system fast enough for the application.
“We had to find a way for the vision system to pick up on the source of light coming from the TV. [The first] type of sensor I tried would only see the back of the tube—which was pretty cool in itself, but didn't fit the application,” says Nikrin.
Bill Manor, robotics instructor at Minnesota West, suggested Nikrin incorporate a PresencePLUS P4 Omni vision sensor from Banner Engineering. “Students have used Banner vision sensors in many projects over the years—to inspect containers, for example, as they come down a conveyor,” Manor says. Minnesota West had purchased a vision system at a discount through Banner as a start-up education kit, he adds.
Nikrin and Manor drew on the 20-plus years of experience of Jeff Curtis, senior applications engineer at Banner, to help troubleshoot the details of the application.
“Bill and I both thought that they [Banner] went above and beyond to help with a school project, which might seem trivial to some companies,” Nikrin added.
“Pete worked hard to do this,” says Curtis. “He did a lot of work, and all the programming. We're happy to help with vision applications, but we won't write logic for other people. It was personally satisfying to work on, though. Lots of us are geeks around here and have played Guitar Hero.”
Nikrin installed the vision sensor as the robot's left eye and positioned it toward the video screen. Nikrin's team needed to ensure Roxanne could play within a range of lighting conditions—since she would be relocated from classrooms to gymnasiums for demonstrations—as well as confirm the robot was correctly oriented with the monitor displaying the video game.
On screen, Roxanne “sees” the little dots (notes) move down a path. The robot identifies the notes to be played using edge software, a simple linear tool for locating edges. Edge tools help locate objects in free space by detecting, counting, and locating the transitions between bright and dark pixels in an image area.
“When the robot was first set up, it would slouch over time,” says Curtis. “Using two edge tools, you can tell the imager where you should be vertically and horizontally. Then if the head moves slightly, you can find the correct scan area again.”
Curtis says other types of locate tools are patterning tools. “Edge tools react in only a few milliseconds or fractions of milliseconds—much quicker than cumbersome area-based tools,” he says. They find the absolute or relative position of the target in an image by finding its first edge.
To correctly orient the sensor with the monitor, “we honed a locate tool and gave it a fixed point—a piece of reflective tape on the PC monitor—to focus on,” Curtis explains. “This ensures the edge tools are in the proper location to detect each note as it comes along, and allows for any slight vibration in the application environment that could result in some deviation.”
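The locate-tool idea, finding a fixed reference mark and then shifting the inspection regions by the measured offset, can be sketched in a few lines. This is an illustrative model only, not Banner's implementation; the coordinates, ROI layout, and function name are invented for the example.

```python
def shift_rois(rois, expected_ref, found_ref):
    """Shift each region of interest (x, y, width, height) by the offset
    between where the reference mark (the reflective tape) should appear
    and where the locate tool actually found it."""
    dx = found_ref[0] - expected_ref[0]
    dy = found_ref[1] - expected_ref[1]
    return [(x + dx, y + dy, w, h) for (x, y, w, h) in rois]

# Suppose the camera drifted 4 px right and 2 px down since setup:
rois = [(100, 300, 20, 10), (140, 300, 20, 10)]
print(shift_rois(rois, expected_ref=(10, 10), found_ref=(14, 12)))
# [(104, 302, 20, 10), (144, 302, 20, 10)]
```

The same correction compensates for the slouching head Curtis describes: as long as the reference mark stays in view, every edge tool's scan area follows it.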
Nikrin says the team set up five edge tools that ran horizontally across the screen, one for each fret, and positioned the tools to focus on the notes at the bottom of each. “The edge tools sent a constant signal as the five vertical fret lines progressed, and when a bright white dot appeared in the middle of a dark-colored circle, the edge tool allowed the sensor to detect it.”
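The core of an edge tool, detecting where brightness crosses a threshold along a line of pixels, can be sketched as follows. The threshold and pixel values here are invented for illustration; the real sensor works on live imager data.

```python
def find_edges(pixel_row, threshold=128):
    """Return the indices along a row of grayscale pixel values where
    brightness crosses the threshold (a dark-to-bright or
    bright-to-dark transition), i.e. the edges."""
    edges = []
    for i in range(1, len(pixel_row)):
        prev_bright = pixel_row[i - 1] >= threshold
        curr_bright = pixel_row[i] >= threshold
        if prev_bright != curr_bright:
            edges.append(i)
    return edges

# A dark fret circle (values around 30) with a bright note dot (around 240):
row = [30, 30, 30, 240, 240, 240, 30, 30]
print(find_edges(row))  # [3, 6] — the dot's leading and trailing edges
```

Because this is a single pass over one line of pixels rather than a search over a whole image region, it runs in the millisecond-scale times Curtis describes.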
Curtis says vision sensors and edge tools are frequently used in industrial applications because they're very simple and very fast, making them good for verifying the simple presence or absence of parts. “They're used big time in packaging verification. You don't need an optical character recognition system for that. We can do that with a simple area or linear tool,” he says. “They're also good at checking a PC motherboard. With a basic green circuit board, you can check the solder nodes and verify that all are there. With lasers and photoelectric sensors, that's difficult.”
When it came to speed, Nikrin decided the camera's processor was too slow, so he chose a PLC to do the actuating. The camera also had only four I/O points, and he needed five (one for each fret line). Once a note was identified, communicating the signal efficiently depended on a good deal of programming, as well as Ethernet communication through a Modbus register, he says.
“Rather than use the processor in the camera, I hooked up a [Rockwell Automation] MicroLogix 1100, which has Ethernet,” explains Nikrin. “Using the discrete I/O on the camera, the scans took 35 ms. By bitmapping to the PLC and having the PLC do the processing, scans took 9 ms total: 8 ms in the camera and 1 ms in the PLC.”
The MicroLogix PLC was programmed so that it constantly looked at the vision sensor's register. “Once the edge tool senses a note, the PLC notices the change in the register, and the logic in the PLC fires a solenoid that activates the robot's finger. Just as a human player would react, the robot's finger then presses down on the appropriate note on the guitar,” Nikrin explains.
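The PLC-side logic described above, polling the sensor's bitmapped register each scan and firing a solenoid for every newly set fret bit, might be sketched like this. The register layout, scan function, and solenoid callback are assumptions for illustration; the actual system ran as MicroLogix ladder logic reading the sensor's Modbus register over Ethernet.

```python
NUM_FRETS = 5  # one bit per fret line in the sensor's register

def scan(register, prev_register, fire_solenoid):
    """One PLC scan: compare the sensor's bitmapped register against the
    previous scan's value and fire a solenoid for each fret bit that has
    just turned on (a newly detected note)."""
    for fret in range(NUM_FRETS):
        bit = 1 << fret
        if register & bit and not prev_register & bit:
            fire_solenoid(fret)
    return register  # kept as prev_register for the next scan

# Example: notes on fret 0 and fret 3 appear between two scans.
fired = []
scan(0b01001, 0b00000, fired.append)
print(fired)  # [0, 3]
```

Comparing against the previous scan's value is what makes the finger press once per note rather than holding the button for as long as the dot is visible.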
Nikrin was surprised by “the simplicity of the project,” he says. “I was going for simplicity and elegance of execution.” His only disappointment? Lack of time meant he didn't get to work on the mechanics of the robot much. “The mechanics are lacking; it slows down the systems,” he says.
Regardless, after two weeks of playing, Roxanne had surpassed Nikrin's own skill at the game. On Medium mode, Roxanne at times hit 100% accuracy, and she averaged 98% for the remainder of Nikrin's tenure at Minnesota West. She achieved up to 95% accuracy on Hard mode and 80% on Expert mode (limited by the increased mechanical demands on the robot's fingers).
Today, Roxanne still engages current and prospective Minnesota West engineering students, and Nikrin looks back on the project with both a sense of accomplishment and a hefty dose of gratitude. He graduated from Minnesota West in 2008 and is now working as a manufacturing engineer at Meier Tool & Engineering. Not surprisingly, his first project there involved a vision system: “We're straightening, cutting, and forming a wire to a set length, and we're inspecting the finished product with a vision system,” he says. Soon he'll be working on robotic guidance systems, he says, work Roxanne has prepared him for well.
Renee Robbins is senior editor of Control Engineering. Reach her at firstname.lastname@example.org .