How can robots get our attention?

Georgia Institute of Technology research helps robots work more in tune with human coworkers by picking up on nonverbal cues. Smarter software and machine vision could make industrial robots safer and more efficient.

03/10/2011


Getting someone’s attention can be easy with a loud noise or a shout, but what if the situation calls for a little more tact? How can a robot use subtle cues to attract a human’s notice and tell when it has captured it? In a preliminary study, researchers at the Georgia Institute of Technology have found that they can program a robot to understand when it gains a human’s attention and when it falls short. The research is being presented at the Human-Robot Interaction conference in Lausanne, Switzerland.

Control Engineering context: Such research could be helpful for industrial robots, which could use such software to change their behavior or routes depending on interactions with human counterparts in the workplace. Automated guided vehicles (AGVs), for instance, might eventually slow less, or take a more direct route, if they have a cue that a nearby human sees where the roving robot is and where it is going. Such software could add productivity to industrial applications without increasing risk. Software that helps computers better understand human language has also advanced. For more about AGVs from Control Engineering, see the AIMing for Automated Vehicles blog at www.controleng.com/blogs.
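The AGV behavior described above can be sketched as a simple planning rule. This is a hypothetical illustration only, not part of the Georgia Tech system: the `HumanCue` structure, `plan_motion` function, distances, and speed factors are all assumed for the example.

```python
from dataclasses import dataclass

# Hypothetical sketch: an AGV picks a speed and route depending on whether
# a nearby human has visibly acknowledged it. All names and thresholds
# here are illustrative assumptions.

@dataclass
class HumanCue:
    distance_m: float        # distance to the nearest person
    acknowledged: bool       # did perception see the person look at the AGV?

def plan_motion(cue: HumanCue, cruise_speed: float = 1.5) -> tuple[float, str]:
    """Return (speed in m/s, route choice) for one planning step."""
    if cue.distance_m > 5.0:
        return cruise_speed, "direct"          # nobody close: full speed, direct route
    if cue.acknowledged:
        return cruise_speed * 0.8, "direct"    # seen by the human: modest slowdown only
    return cruise_speed * 0.4, "detour"        # unseen: slow sharply and keep clear

print(plan_motion(HumanCue(distance_m=3.0, acknowledged=True)))
```

The point of the sketch is the middle branch: an attention cue lets the vehicle trade a small safety margin for throughput, rather than always taking the most conservative action.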

“The primary focus was trying to give Simon, our robot, the ability to understand when a human being seems to be reacting appropriately, or in some sense is interested now in a response with respect to Simon, and to be able to do it using a visual medium, a camera,” said Aaron Bobick, professor and chair of the School of Interactive Computing in Georgia Tech’s College of Computing.

Using the socially expressive robot Simon, from Assistant Professor Andrea Thomaz’s Socially Intelligent Machines lab, researchers wanted to see if they could tell when he had successfully attracted the attention of a human who was busily engaged in a task and when he had not.

“Simon would make some form of a gesture, or some form of an action when the user was present, and the computer vision task was to try to determine whether or not you had captured the attention of the human being,” said Bobick.

With close to 80 percent accuracy, Simon was able to tell, using only his cameras as a guide, whether someone was paying attention to him or ignoring him.

“We would like to bring robots into the human world. That means they have to engage with human beings, and human beings have an expectation of being engaged in a way similar to the way other human beings would engage with them,” said Bobick.

“Other human beings understand turn-taking. They understand that if I make some indication, they’ll turn and face someone when they want to engage with them and they won’t when they don’t want to engage with them. In order for these robots to work with us effectively, they have to obey these same kinds of social conventions, which means they have to perceive the same thing humans perceive in determining how to abide by those conventions,” he added.

Researchers plan to go further with their investigations into how Simon can read communication cues, studying whether he can tell from a person’s gaze, elements of language, or other actions whether they are paying attention.

“Previously people would have pre-defined notions of what the user should do in a particular context and they would look for those,” said Bobick. “That only works when the person behaves exactly as expected. Our approach, which I think is the most novel element, is to use the user’s current behavior as the baseline and observe what changes.”
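The idea Bobick describes, taking the user's current behavior as the baseline and watching for change, can be sketched in a few lines. This is a minimal illustration under assumed details, not the researchers' actual method: the feature vectors (e.g., head pose or motion energy per video frame), the z-score deviation measure, and the threshold are all assumptions for the example.

```python
import numpy as np

def attention_changed(baseline_frames, response_frames, threshold=2.0):
    """Return True if post-gesture behavior deviates from the pre-gesture baseline.

    Frames are arrays of shape (n_frames, n_features). Deviation is measured
    as the mean z-score of the response frames against baseline statistics.
    """
    baseline = np.asarray(baseline_frames, dtype=float)
    response = np.asarray(response_frames, dtype=float)
    mu = baseline.mean(axis=0)                   # typical behavior before the gesture
    sigma = baseline.std(axis=0) + 1e-8          # spread (epsilon avoids divide-by-zero)
    z = np.abs((response - mu) / sigma)          # per-frame, per-feature deviation
    return bool(z.mean() > threshold)

# A person working steadily, then turning toward the robot (features jump):
steady = [[0.10, 0.00], [0.12, 0.01], [0.09, -0.01], [0.11, 0.00]]
turned = [[0.90, 0.80], [1.00, 0.85]]
print(attention_changed(steady, turned))
```

The key property matching the quote is that nothing about the "correct" reaction is pre-defined: the detector only asks whether behavior departed from whatever the person was already doing.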

The research team for this study consisted of Bobick, Thomaz, doctoral student Jinhan Lee and undergraduate student Jeffrey Kiser.

- Edited by Gust Gianos, Control Engineering, www.controleng.com


