Eye-controlled Computer Operation
- Date: September 27, 2006
- Source: Fraunhofer Institute
- Summary: A new system makes it possible to guide the computer mouse with your eyes. The technology is designed to facilitate the task of maintenance technicians and make it easier for paraplegics to work at the PC. A software program interprets the user’s pupil movements.
No sooner have we become accustomed to talking to computers than we can begin to control them with our eyes. The “Eye-Controlled Interaction” system (EYCIN), developed by researchers at the Fraunhofer Institute for Industrial Engineering IAO in Stuttgart in cooperation with industrial partners, tracks the user’s eye movements and translates them into movements of the mouse pointer on the monitor. A camera observes the movement of the pupils from a distance of up to one meter; a software program calculates the coordinates of the area being viewed and passes them on. It all happens so quickly that the mouse pointer moves smoothly.
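To make the principle concrete, here is a minimal sketch of the gaze-to-pointer mapping described above, assuming the eye tracker delivers gaze coordinates normalized to the monitored area. The names (GazeSample, map_gaze_to_screen) and the normalization convention are illustrative assumptions, not EYCIN’s actual interface.

```python
# Minimal sketch of gaze-to-pointer mapping, assuming the eye tracker
# delivers gaze coordinates normalized to [0, 1] for the monitored area.
# Names (GazeSample, map_gaze_to_screen) are illustrative, not EYCIN's API.

from dataclasses import dataclass

@dataclass
class GazeSample:
    x: float  # normalized horizontal gaze position, 0.0 = left edge
    y: float  # normalized vertical gaze position, 0.0 = top edge

def map_gaze_to_screen(sample: GazeSample, width: int, height: int) -> tuple[int, int]:
    """Convert a normalized gaze sample into pixel coordinates for the pointer."""
    px = min(max(int(sample.x * width), 0), width - 1)
    py = min(max(int(sample.y * height), 0), height - 1)
    return px, py

# Example: a gaze sample slightly right of center on a 1920x1080 display.
print(map_gaze_to_screen(GazeSample(0.6, 0.5), 1920, 1080))  # -> (1152, 540)
```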
Calculating the motion is comparatively easy, but clicking presents a real challenge. The mouse pointer has to be guided accurately to the required “button”, which then has to be activated by eye movements alone. One of the most important tasks of the research team led by IAO project managers Wolfgang Beinhauer and Fabian Hermann was therefore to develop a functional user interface. Its elements must not be too small, since the mouse cannot be controlled as accurately with the eyes as it can with the hand. Clicking a button is also more difficult with the eyes. To solve this problem, the researchers developed sensitive areas that are activated when the user fixates on them for a certain length of time. The button changes color twice before it goes “click” – important feedback for users, who can thus tell whether or not the computer has understood their commands.
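The dwell-based clicking described above can be sketched as follows: a sensitive area accumulates fixation time and passes through two intermediate color states before the click fires. The class name, the dwell duration, and the two-thirds threshold for the second color change are assumptions for illustration, not the values used by the IAO team.

```python
# Sketch of dwell-based clicking: a sensitive area is activated by fixating
# on it, and its color changes twice before the "click" fires, giving the
# user feedback. Thresholds and names are assumptions.

class DwellButton:
    def __init__(self, rect, dwell_time=1.0):
        self.rect = rect                # (x, y, width, height) in pixels
        self.dwell_time = dwell_time    # seconds of fixation needed to click
        self.elapsed = 0.0
        self.state = "idle"             # idle -> armed -> about_to_click -> clicked

    def contains(self, px, py):
        x, y, w, h = self.rect
        return x <= px < x + w and y <= py < y + h

    def update(self, px, py, dt):
        """Advance the dwell timer by dt seconds; return True when the click fires."""
        if not self.contains(px, py):
            self.elapsed, self.state = 0.0, "idle"   # gaze left the area: reset
            return False
        self.elapsed += dt
        if self.elapsed >= self.dwell_time:
            self.state = "clicked"                   # dwell complete: click
            return True
        # two intermediate color stages as visual feedback before the click
        self.state = "about_to_click" if self.elapsed >= self.dwell_time * 2 / 3 else "armed"
        return False

# Example: gaze held on a button for three 0.4-second frames.
button = DwellButton(rect=(80, 80, 120, 48), dwell_time=1.0)
for _ in range(3):
    clicked = button.update(100, 100, dt=0.4)
    print(button.state, clicked)   # armed False, about_to_click False, clicked True
```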
The engineers have invested a tremendous amount of meticulous work in the new eye-controlled interaction system. One of the problems they faced was the tiny, jerk-like movements, or microsaccades, that the eye constantly makes. If the pupil movements were transmitted to the monitor without first being filtered, the pointer would dash around all over the screen. The software must first suppress these microsaccades by means of a filter function and then determine the main direction of movement. Another challenge is finding the best way to subdivide the monitor.
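A filter of the kind described might look like the following sketch, which treats displacements below a small pixel threshold as microsaccade jitter (holding the pointer still) and smooths larger movements with a short moving average. The threshold, window size, and class name are assumptions for illustration; the IAO’s actual filter function is not published here.

```python
# Illustrative stand-in for microsaccade filtering: jitter below a small
# threshold is ignored, and larger movements are smoothed with a short
# moving average before the pointer is updated.

from collections import deque

class GazeFilter:
    def __init__(self, jitter_threshold=15.0, window=5):
        self.jitter_threshold = jitter_threshold  # pixels; below this, hold position
        self.window = deque(maxlen=window)        # recent samples for smoothing
        self.last = None                          # last reported pointer position

    def update(self, px, py):
        """Return a stabilized pointer position for a raw gaze sample (px, py)."""
        if self.last is not None:
            dx, dy = px - self.last[0], py - self.last[1]
            if (dx * dx + dy * dy) ** 0.5 < self.jitter_threshold:
                return self.last   # microsaccade-sized jitter: keep the pointer still
        self.window.append((px, py))
        avg_x = sum(x for x, _ in self.window) / len(self.window)
        avg_y = sum(y for _, y in self.window) / len(self.window)
        self.last = (round(avg_x), round(avg_y))
        return self.last

# Example: a jittery fixation followed by a genuine gaze shift.
f = GazeFilter()
for sample in [(500, 300), (504, 297), (498, 302), (700, 420)]:
    print(f.update(*sample))   # stays at (500, 300), then moves toward (700, 420)
```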
EYCIN will be used to facilitate the assembly or maintenance of industrial equipment: a technician can click through the maintenance menu with eye movements while holding the relevant parts in both hands. The system could also make it easier for paraplegics to work with a computer.
Story Source:
Materials provided by Fraunhofer Institute. Note: Content may be edited for style and length.