February 1, 2008 Computer scientists worked with electrical engineers to move the cursor on a computer screen with the user's voice instead of a mouse. Software interprets vocal commands to move the cursor, allowing people who cannot use their arms to browse the web, play video games, or use photo editing software.
Simple sounds matched with new software are helping people get where they want to go -- and it's all hands-free.
Rich Eldridge suffered a spinal cord injury in a car accident ten years ago. Limited mobility in his arms and hands makes using a computer mouse difficult. But Vocal Joystick software being developed at the University of Washington is making hands-free mouse movement a reality.
"So what I am doing is making vowel sounds to direct the mouse pointer on the screen," Eldridge said. "So when I go 'ahhhh,' the pointer will go to the right, when I say 'ohhhh,' it will move down. If I wanted to click on something I just go 'kkk,' as in click."
"The vocal joystick is to a mouse what speech recognition is to a keyboard," Jeff Bilmes, Ph.D., Associate Professor of Electrical Engineering at the University of Washington told Ivanhoe.
Dr. Bilmes, with the help of some of his electrical engineering students, has paired a regular personal computer and an inexpensive computer microphone with software to create fluid computer movements -- using only the voice.
"About 100 times per second, so every 10 milliseconds, the computer listens to what your voice is doing," Dr. Bilmes said.
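To make the idea concrete, here is a minimal sketch (not the actual Vocal Joystick code) of how per-frame vowel classifications could drive a cursor. The direction table follows the sounds Eldridge describes ("ahhhh" moves right, "ohhhh" moves down, "kkk" clicks); the other vowels and the frame-by-frame classifier output are invented for illustration.

```python
# Illustrative sketch: map classified vowel sounds to cursor motion.
# The article says the system listens every 10 ms (100 frames/second);
# here each list entry stands in for one frame's classifier output.

# Hypothetical vowel-to-direction table. "ah" -> right and "oh" -> down
# come from the article; "ee" and "oo" are assumed for the example.
DIRECTIONS = {
    "ah": (1, 0),   # move right
    "oh": (0, 1),   # move down (screen y grows downward)
    "ee": (-1, 0),  # assumed: move left
    "oo": (0, -1),  # assumed: move up
}

def update_cursor(pos, sound):
    """Advance the cursor one frame based on the recognized sound.
    A 'k' burst acts as a click; unrecognized sounds do nothing."""
    x, y = pos
    if sound == "k":
        return (x, y), True  # click at the current position
    dx, dy = DIRECTIONS.get(sound, (0, 0))
    return (x + dx, y + dy), False

# Simulated stream of per-frame classifier outputs.
frames = ["ah", "ah", "ah", "oh", "oh", "k"]
pos, clicked = (0, 0), False
for sound in frames:
    pos, clicked = update_cursor(pos, sound)

print(pos, clicked)  # -> (3, 2) True
```

In the real system, continuous qualities of the voice (such as loudness and pitch) can also modulate the cursor's speed, which is what makes the movement fluid rather than stepwise.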
They are also experimenting with using the software to control a small robotic arm.
As for the symphony of sounds it takes to run the software -- "I don't really think about the sounds, I am more focused on what I am looking at and where I want to go," said Eldridge.
Thanks to this human-computer interaction, Eldridge can now rely on his voice to get him where his hands can't take him anymore.
Other voice control technologies: People who use a new kind of wheelchair can now move it using their tongue rather than their hands. Mechanical engineers and computer scientists worked together to develop the wheelchair controls. Moving the tongue changes the air pressure in the user's ears. In the new wheelchair design, a microphone near each ear picks up the change in air pressure and issues commands to a computer chip, which moves the chair in a specified direction.
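The pressure-to-command step described above can be sketched roughly as follows. This is purely illustrative, not the wheelchair's actual firmware; the threshold value and the command set are assumptions for the example.

```python
# Illustrative sketch: turn ear-pressure changes into drive commands.
# Assumption: one microphone per ear reports a pressure change, and a
# fixed threshold (invented here) decides whether a signal is present.

THRESHOLD = 0.5  # assumed pressure-change threshold, arbitrary units

def pressure_to_command(delta_left, delta_right):
    """Map pressure changes measured near each ear to a chair command."""
    left_active = delta_left > THRESHOLD
    right_active = delta_right > THRESHOLD
    if left_active and right_active:
        return "forward"
    if left_active:
        return "left"
    if right_active:
        return "right"
    return "stop"

print(pressure_to_command(0.8, 0.9))  # forward
print(pressure_to_command(0.8, 0.1))  # left
print(pressure_to_command(0.1, 0.1))  # stop
```

The chip in the real chair would run a loop like this continuously, sampling the microphones and updating the motors on each pass.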
The Human Factors and Ergonomics Society contributed to the information contained in the video portion of this report.
Editor's Note: This article is not intended to provide medical advice, diagnosis or treatment.