If it’s 15 years away from the showroom, you know it had to come from Japan. Last week at the 2009 Automotive Engineering Exposition, Toshiba Corp exhibited a system that monitors and tracks a driver’s head and eye movements. It can scold inattentive drivers and even operate the navigation system via a combination of manual switches and eye movement.
Upon startup, the driver looks forward while the camera mounted atop the steering wheel captures features such as face shape and eye position. Then the driver glances at the navigation system so the computer can calculate eye orientation and head movements associated with common driving tasks.
Once programmed, the system can lock onto the driver’s face and identify which of eight distinct areas he or she is focusing on: the left, center and right portions of the windshield, the left and right mirrors, the gauges, the navigation system and the audio system. What, no HVAC? What about my Super Big Gulp perched precariously in the undersized Japanese cup holder? Oh well, I suppose not getting killed will have to do for now.
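For the curious, the eight-region lookup described above could be sketched roughly like this. To be clear, this is a hypothetical illustration, not Toshiba’s actual code: the region names, angle ranges, and the idea of using simple yaw/pitch bounding boxes are all assumptions for the sake of the example.

```python
# Hypothetical gaze-region classifier. Each region is a box of
# (yaw_min, yaw_max, pitch_min, pitch_max) in degrees, measured
# relative to the driver looking straight ahead at (0, 0).
# All values below are illustrative guesses, not real calibration data.
REGIONS = {
    "windshield_left":   (-30, -10,  -5, 15),
    "windshield_center": (-10,  10,  -5, 15),
    "windshield_right":  ( 10,  30,  -5, 15),
    "mirror_left":       (-50, -30,   0, 10),
    "mirror_right":      ( 30,  50,   0, 10),
    "gauges":            (-10,  10, -25, -5),
    "navigation":        ( 10,  30, -25, -5),
    "audio":             ( 10,  30, -40, -25),
}

def classify_gaze(yaw, pitch):
    """Return the name of the region containing the gaze angle,
    or None if the driver is looking somewhere unmapped."""
    for name, (y0, y1, p0, p1) in REGIONS.items():
        if y0 <= yaw < y1 and p0 <= pitch < p1:
            return name
    return None
```

In practice the calibration step described earlier would be what fills in those boxes: by having the driver glance at known targets (the nav screen, the mirrors), the system can fit per-driver angle ranges instead of relying on fixed defaults.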
The system may even be able to detect blinking patterns and alert drowsy drivers. Personally, I think it should blast audio of screaming pedestrians as a wake-me-up. If the software were more advanced, facial recognition could also be used as an anti-theft device.
There are limitations, however. As it stands, the system has trouble recognizing drivers wearing glasses or masks, but Toshiba is hard at work on a solution. Definitely good news for the Burger King.