MIT’s Bouncing Baby Robot?
Sci-fi movies are full of them: walking, talking, human-like robots. In fact, for as long as the field of artificial intelligence has been around, people have dreamed of creating robots that not only look like humans but can think, feel, learn, and make mistakes, just like the real thing.
MIT’s Artificial Intelligence Lab has turned science fiction into fact with Cog, a humanoid robot in development since the early ’90s. Although it lacks legs and a flexible spine, Cog is about as human as a robot gets. It has a head, trunk, and arms similar in shape, structure, and degrees of freedom to the human body. A sophisticated set of sensors and actuators, coordinated by QNX-based distributed control algorithms, approximates human sensory and motor dynamics with eerie accuracy.
Like humans, Cog can hear: a pair of hearing-aid microphones is mounted directly on the head assembly. Cog can also see. The robot’s visual system copies several human visual capabilities, including binocularity and space-variant sensing; two gray-scale cameras in each ‘eye’ give Cog both a wide field of view and high-resolution vision. Cog even has a vestibular system, which in humans senses the rotation and acceleration of the head and is critical to coordinating motor responses, eye movement, posture, and balance. Cog mimics this system with three rate gyroscopes and two linear accelerometers mounted in the robot’s head. Systems for vocalization and tactile sensing are in the works.
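One common way to fuse such sensors, a fast but drifting gyroscope with noisy but drift-free accelerometers, is a complementary filter. The C sketch below is purely illustrative; the sample rate, constants, and function names are assumptions, not details of the MIT design.

/* Illustrative only: fuse a rate gyro with an accelerometer-derived
 * tilt angle using a complementary filter. Not the MIT algorithm. */
#include <stdio.h>

#define DT    0.005f   /* sample period, s (hypothetical 200 Hz loop) */
#define ALPHA 0.98f    /* weight given to the integrated gyro signal  */

/* Integrate the gyro rate, then pull the estimate gently toward the
 * tilt angle implied by the accelerometers. */
float fuse_tilt(float tilt, float gyro_rate, float accel_tilt)
{
    return ALPHA * (tilt + gyro_rate * DT) + (1.0f - ALPHA) * accel_tilt;
}

int main(void)
{
    float tilt = 0.0f;
    /* Fake samples standing in for real head-mounted sensor readings. */
    float gyro[]  = { 0.10f, 0.12f, 0.08f };   /* rad/s */
    float accel[] = { 0.00f, 0.01f, 0.01f };   /* rad   */
    int i;

    for (i = 0; i < 3; i++) {
        tilt = fuse_tilt(tilt, gyro[i], accel[i]);
        printf("estimated head tilt: %f rad\n", tilt);
    }
    return 0;
}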
To MIT’s researchers, getting Cog to look and sense in as human a way as possible is more than aesthetics; it’s central to their research. Cog is learning to think like a human. With this robot, MIT is exploring the theory of embodiment, the belief that human cognition develops through the physical body’s interaction with its surroundings and with other humans. So, instead of being pre-programmed with detailed information about its environment, Cog is learning everything about itself and the world just as babies do: by trial and error.
Cog’s computational control is a heterogeneous network of many different processor types operating at different levels of the control hierarchy. Cog’s brain, once a network of 16MHz Motorola 68332 microcontrollers connected through dual-port RAM, is now a network of 200MHz industrial PCs running QNX and linked by 100VG Ethernet. The old and new brains communicate via a custom-built ISA shared-memory card. As with any QNX system, the network can be expanded simply by plugging new nodes into the network hub. The network also includes C40-based framegrabbers, display boards, and audio I/O ports. Other processors controlling Cog’s bodily functions range from small microcontrollers for joint-level control to digital signal processor (DSP) networks for audio and visual preprocessing.
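The glue for a network like this is QNX’s synchronous message passing. The C sketch below shows the QNX 4-era Send/Receive/Reply style of talking to a named server; the server name, message layout, and units are hypothetical stand-ins, not the lab’s actual code.

/* A minimal sketch of QNX 4-style interprocess messaging. The server
 * name "/cog_motor" and the message layout are hypothetical. */
#include <stdio.h>
#include <sys/types.h>
#include <sys/kernel.h>   /* Send(), Receive(), Reply() */
#include <sys/name.h>     /* qnx_name_locate()          */

struct joint_cmd {
    int joint_id;         /* which joint to drive                    */
    int torque;           /* commanded torque, in made-up units      */
};

int main(void)
{
    struct joint_cmd cmd = { 3, 150 };
    int reply;
    pid_t server;

    /* Locate a named server; by QNX 4 convention a leading slash marks
     * a global, network-visible name. The name itself is invented. */
    server = qnx_name_locate(0, "/cog_motor", sizeof(cmd), NULL);
    if (server == -1) {
        perror("qnx_name_locate");
        return 1;
    }

    /* Send() blocks until the server Reply()s; the same synchronous
     * message passing works whether the server is local or remote. */
    if (Send(server, &cmd, &reply, sizeof(cmd), sizeof(reply)) == -1) {
        perror("Send");
        return 1;
    }
    printf("motor server acknowledged: %d\n", reply);
    return 0;
}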
When asked why QNX was chosen for the project, MIT researcher Matt Marjanovic explains: ‘With Cog, obviously there’s a realtime visual and auditory requirement. We wanted an OS that could enforce a schedule so that Cog could perceive people at interactive rates. Transparent networking is also a big plus. Cog has eight QNX nodes at this time, which are all accessed from the primary server. It’s also been very convenient to develop and run code remotely. Often we have several graduate students developing code at the same time on the same system, some in their offices, some at home.’
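Marjanovic’s point about transparent networking is easy to see in code. Under QNX 4, prefixing a pathname with //<node number> points ordinary POSIX file I/O at another node on the network, so remote work looks just like local work. The node number and path in this sketch are made up for illustration.

/* Read a (hypothetical) log file that lives on QNX node 3. No special
 * networking code is needed; the OS routes the I/O transparently. */
#include <stdio.h>
#include <fcntl.h>
#include <unistd.h>

int main(void)
{
    char buf[128];
    int n;
    int fd = open("//3/home/cog/vision.log", O_RDONLY);

    if (fd == -1) {
        perror("open");
        return 1;
    }
    n = read(fd, buf, sizeof(buf) - 1);
    if (n > 0) {
        buf[n] = '\0';
        printf("%s", buf);
    }
    close(fd);
    return 0;
}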
QNX has worked so well on the Cog project that all of the lab’s robots are now being built around the OS. That includes Kismet, a high-profile MIT robot designed to study human emotion and featured on the cover of the October issue of Discover magazine (look for ‘The Robot that Loves People’ at www.discover.com ). According to MIT researcher Cynthia Breazeal, ‘It’s our goal to have all of the robots share the same computing platform so that code can be shared across projects.’
Since its ‘birth,’ Cog has learned to turn and fixate on a moving object, imitate ‘yes’ and ‘no’ head nods, reach out to touch objects, and play with a Slinky toy. Most importantly, Cog has learned to recognize faces and eyes, the first step toward realistic social interactions. And although Cog can’t yet be considered conscious, with this humanoid robot MIT is fulfilling a long-held dream: recreating aspects of human intelligence with computer and robotic technology.
Recently, Guinness World Records 2000 (Millennium Edition) named Cog ‘the most intelligent robot.’ For more information about Cog and Kismet, including the latest research papers and video clips, visit www.ai.mit.edu/projects/cog .
Comments? E-mail gmintchell@cahners.com