How can robots get our attention?

Research at the Georgia Institute of Technology helps robots function more in tune with human coworkers by picking up on nonverbal cues. Smarter software and machine vision could make industrial robots safer and more efficient.

Getting someone’s attention can be easy with a loud noise or a shout, but what if the situation calls for a little more tact? How can a robot use subtle cues to attract a human’s notice, and how can it tell when it has succeeded? In a preliminary study, researchers at the Georgia Institute of Technology have found that they can program a robot to understand when it gains a human’s attention and when it falls short. The research is being presented at the Human-Robot Interaction conference in Lausanne, Switzerland.

Control Engineering context: Such research could be helpful for industrial robots, which could use software like this to change their behavior or routes depending on interactions with human counterparts in the workplace. An automated guided vehicle (AGV), for instance, might eventually slow less, or take a more direct route, if it has a cue that a nearby human sees where the roving robot is and where it is going. Such software could add productivity to industrial applications without increasing risk. For more about AGVs from Control Engineering, see the AIMing for Automated Vehicles blog at www.controleng.com/blogs.
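As a rough illustration of that idea, the sketch below shows how an AGV speed controller might use an awareness cue. The speeds, function names, and the boolean awareness signal are assumptions made for illustration, not anything from the Georgia Tech study; the awareness input would come from attention-detection software of the kind described here.

```python
# Hypothetical sketch: an AGV controller that relaxes its slowdown near a
# person only when it has a cue that the person is aware of the vehicle.
# Speeds and thresholds are assumed values, not from any real AGV spec.

NORMAL_SPEED = 1.0   # m/s, assumed cruising speed
CAUTION_SPEED = 0.3  # m/s, assumed speed near a person who has not noticed it

def select_speed(person_nearby: bool, person_aware: bool) -> float:
    """Pick a travel speed based on human proximity and awareness."""
    if not person_nearby:
        return NORMAL_SPEED
    # A person who visibly sees the AGV lets it keep more of its speed.
    return 0.8 * NORMAL_SPEED if person_aware else CAUTION_SPEED
```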

“The primary focus was trying to give Simon, our robot, the ability to understand when a human being seems to be reacting appropriately, or in some sense is interested now in a response with respect to Simon and to be able to do it using a visual medium, a camera,” said Aaron Bobick, professor and chair of the School of Interactive Computing in Georgia Tech’s College of Computing.

Using the socially expressive robot Simon, from Assistant Professor Andrea Thomaz’s Socially Intelligent Machines lab, researchers wanted to see if they could tell when he had successfully attracted the attention of a human who was busily engaged in a task and when he had not.

“Simon would make some form of a gesture, or some form of an action when the user was present, and the computer vision task was to try to determine whether or not you had captured the attention of the human being,” said Bobick.
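A minimal sketch of that gesture-then-observe loop might look like the following, assuming hypothetical robot and camera interfaces (make_gesture, capture_frame) and a stand-in attention_score classifier rather than the lab’s actual code:

```python
import time

ATTENTION_THRESHOLD = 0.5  # assumed cutoff on a 0-1 attention score

def attention_score(frame) -> float:
    """Stand-in for a vision model that rates user attention from 0 to 1."""
    return 0.0  # placeholder; a real system would run a classifier here

def bid_for_attention(robot, camera, timeout_s=3.0) -> bool:
    """Gesture, then watch briefly for visual signs of engagement."""
    robot.make_gesture()  # e.g., a wave or a head turn toward the user
    deadline = time.time() + timeout_s
    while time.time() < deadline:
        frame = camera.capture_frame()
        if attention_score(frame) > ATTENTION_THRESHOLD:
            return True  # the person appears to have noticed the robot
    return False  # the bid for attention fell short
```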

With close to 80 percent accuracy, Simon was able to tell, using only his cameras as a guide, whether someone was paying attention to him or ignoring him.

“We would like to bring robots into the human world. That means they have to engage with human beings, and human beings have an expectation of being engaged in a way similar to the way other human beings would engage with them,” said Bobick.

“Other human beings understand turn-taking. They understand that if I make some indication, they’ll turn and face someone when they want to engage with them and they won’t when they don’t want to engage with them. In order for these robots to work with us effectively, they have to obey these same kinds of social conventions, which means they have to perceive the same thing humans perceive in determining how to abide by those conventions,” he added.

Researchers plan to go further with their investigations into how Simon can read communication cues, studying whether he can tell from a person’s gaze, elements of language, or other actions whether that person is paying attention.

“Previously people would have pre-defined notions of what the user should do in a particular context and they would look for those,” said Bobick. “That only works when the person behaves exactly as expected. Our approach, which I think is the most novel element, is to use the user’s current behavior as the baseline and observe what changes.”
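One simple way to realize that baseline idea is sketched below, under the assumption that some scalar behavior feature (head yaw, motion energy, or similar) can be extracted from each video frame: learn the feature’s running mean and variance while the user works, then flag a response when a new value departs from that baseline. This illustrates the stated approach, not the study’s actual algorithm.

```python
# Minimal sketch of the baseline-and-change idea in the quote above: model
# the user's ongoing task behavior as a running mean/variance of a behavior
# feature (the feature choice is an assumption, not the study's measure),
# then flag a response when the current value departs from that baseline.

import math

class BaselineChangeDetector:
    def __init__(self, k: float = 3.0):
        self.k = k       # how many standard deviations count as a change
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0    # running sum of squared deviations (Welford's method)

    def update_baseline(self, x: float) -> None:
        """Fold one pre-gesture observation into the baseline."""
        self.n += 1
        d = x - self.mean
        self.mean += d / self.n
        self.m2 += d * (x - self.mean)

    def changed(self, x: float) -> bool:
        """True if x deviates from the learned baseline by more than k sigma."""
        if self.n < 2:
            return False  # not enough data to judge yet
        std = math.sqrt(self.m2 / (self.n - 1))
        return abs(x - self.mean) > self.k * max(std, 1e-6)
```

After the robot makes its gesture, feeding the same feature stream through changed() would mark the moment the user’s behavior shifts away from what it was before, without any predefined notion of what the response should look like.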

The research team for this study consisted of Bobick, Thomaz, doctoral student Jinhan Lee and undergraduate student Jeffrey Kiser.

– Edited by Gust Gianos, Control Engineering, www.controleng.com