Professor receives $50,000 Honda grant to develop holographic instrument panel controls

Tarek El Dokor, director of the Machine Vision Lab at Embry-Riddle Aeronautical University, was awarded a $50,000 Honda Initiation Grant for development of holographic instrument panel controls and displays. “It implements a software alternative to what is currently a hardware solution to various controls,” said El Dokor.
By Control Engineering Staff March 1, 2008


Professor El Dokor says the concept is analogous to that of an iPhone, where a software-based touchscreen replaces the hardware-based keyboard commonly found in other phones and PDAs.

“You don’t need to touch any screens,” said El Dokor. “Content is projected away from the dashboard and toward the user, where the user can manipulate it in many ways.”

The panel under development provides vehicle operators with faster, safer, and more efficient access to information. According to Embry-Riddle vice president for research Christina Frederick-Recascino, “Professor El Dokor’s work is a creative and innovative advancement in human-machine interface. We are proud that his work has been recognized by Honda as advancing the development of new means of interacting with computers and other technologies.”

El Dokor’s proposal was chosen from among 300 grant submissions. Five other U.S. university professors also received grants. The purpose of Honda’s program is to fund, in early stages of research, innovative ideas that are likely to contribute value to technology over five to ten years.

The university’s Machine Vision Lab is designed to investigate and develop machine vision, machine perception and robotics applications that range from video games and unmanned aerial vehicles to training programs and outdoor signage.

One of the developments enables an individual to control the movement of video game characters by moving their own body instead of using a joystick or controller. A camera captures the movements, and the computer translates them into commands that direct on-screen objects or content.
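The camera-to-command pipeline described above could be sketched roughly as follows. This is a minimal illustration, not the lab's actual code: the function name, the normalized coordinate input, and the threshold value are all assumptions made for the example.

```python
# Hypothetical sketch of the gesture-control idea: a vision system reports
# a tracked body-part position each frame, and the game maps its motion to
# the commands a joystick would otherwise send.

def gesture_to_command(prev_x, curr_x, threshold=0.1):
    """Map horizontal motion of a tracked point (normalized 0..1 image
    coordinates) to a simple game command."""
    delta = curr_x - prev_x
    if delta > threshold:
        return "MOVE_RIGHT"
    if delta < -threshold:
        return "MOVE_LEFT"
    return "STAND_STILL"

# Example: the tracked hand moved from x=0.40 to x=0.65 between frames.
print(gesture_to_command(0.40, 0.65))  # prints "MOVE_RIGHT"
```

In a real system, the tracked positions would come from the machine-vision stage (segmenting and following the user's limbs frame to frame); the mapping step itself can stay this simple.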

http://vision.pr.erau.edu