
Operating smart devices from the space on and above the back of your hand

Date:
May 3, 2017
Source:
Saarland University
Summary:
Smartwatches such as the Apple Watch have been called a 'revolution on the wrist', but operating these devices is awkward because the screen is so small. Researchers have therefore developed a novel input method that expands the input space to the back of the hand wearing the watch and to the 3D space above it.

The method relies on a depth sensor that tracks the movements of the thumb and index finger on and above the back of the hand. In this way, not only smartwatches but also smartphones, smart TVs and devices for augmented and virtual reality can be controlled.

They're called the "Apple Watch Series 2," "LG Watch," "Samsung GEAR S3" or "Moto 360 2nd Gen," but they all share the same problem. "Every new product generation has better screens, better processors, better cameras, and new sensors, but regarding input, the limitations remain," explains Srinath Sridhar, a researcher in the Graphics, Vision and Video group at the Max Planck Institute for Informatics.

Together with Christian Theobalt, head of the Graphics, Vision and Video group at MPI, Anders Markussen and Sebastian Boring at the University of Copenhagen, and Antti Oulasvirta at Aalto University in Finland, Srinath Sridhar has therefore developed an input method that requires only a small camera to track the fingertips in mid-air as well as the touch and position of the fingers on the back of the hand. This combination enables more expressive interactions than any previous sensing technique.

Regarding hardware, the prototype, which the researchers have named "WatchSense," requires only a depth sensor, a much smaller version of the well-known "Kinect" motion sensor for the Xbox 360 video game console. With WatchSense, the depth sensor is worn on the user's forearm, about 20 cm from the watch. Acting as a sort of 3D camera, it captures the movements of the thumb and index finger, not only on the back of the hand but also in the space above it. The software developed by the researchers recognizes the position and movement of the fingers within the 3D image, allowing the user to control apps on smartphones or other devices. "The currently available depth sensors do not fit inside a smartwatch, but from the trend it's clear that in the near future, smaller depth sensors will be integrated into smartwatches," Sridhar says.
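To illustrate the general idea (this is a hypothetical sketch, not the team's actual software), the following Python fragment shows how per-frame fingertip estimates from a wrist-worn depth sensor might be turned into on-skin "touch" versus "mid-air" events. The names, coordinates and the 5 mm touch threshold are illustrative assumptions; in the real system the fingertip positions would come from the researchers' machine-learned tracker.

# Illustrative sketch: label fingertip estimates from a depth frame as
# touching the back of the hand or hovering in the space above it.
from dataclasses import dataclass

@dataclass
class Fingertip:
    finger: str        # "thumb" or "index"
    x_mm: float        # position on the plane of the back of the hand
    y_mm: float
    height_mm: float   # distance above the skin surface

TOUCH_THRESHOLD_MM = 5.0   # assumed: closer than this counts as a touch

def classify(tip: Fingertip) -> dict:
    """Turn one fingertip estimate into a touch or mid-air event."""
    if tip.height_mm <= TOUCH_THRESHOLD_MM:
        return {"finger": tip.finger, "event": "touch", "pos": (tip.x_mm, tip.y_mm)}
    return {"finger": tip.finger, "event": "mid_air",
            "pos": (tip.x_mm, tip.y_mm, tip.height_mm)}

if __name__ == "__main__":
    frame = [Fingertip("index", 12.0, 30.0, 2.1),   # index touching the skin
             Fingertip("thumb", -8.0, 15.0, 41.5)]  # thumb hovering above
    for tip in frame:
        print(classify(tip))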

But this is not all that's required. According to Sridhar, the scientists' software also had to cope with the unevenness of the back of the hand and with the fact that the fingers can occlude each other when they are moved. "The most important thing is that we can not only recognize the fingers, but also distinguish between them," explains Sridhar, "which nobody else had managed to do before in a wearable form factor. We can now do this even in real time." The software recognizes the exact positions of the thumb and index finger in the 3D image from the depth sensor because the researchers trained it to do so using machine learning.

In addition, the researchers have successfully tested their prototype in combination with several mobile devices and in various scenarios. "Smartphones can be operated with one or more fingers on the display, but they do not use the space above it. If both are combined, this enables previously impossible forms of interaction," explains Sridhar. He and his colleagues were able to show that, with WatchSense, the volume in a music program could be adjusted and a new song selected more quickly than with a smartphone's Android app. The researchers also tested WatchSense for tasks in virtual and augmented reality, in a map application, and for controlling a large external screen. Preliminary studies showed that in each case WatchSense was more satisfactory than conventional touch-sensitive displays.

Sridhar is confident that "we need something like WatchSense whenever we want to be productive while moving. WatchSense is the first to enable expressive input for devices while on the move."
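Purely as an illustration of how such combined input could drive the music-player scenario described above (the mapping and thresholds below are assumptions, not the published system), one might map the height of the thumb in the space above the hand to the volume and treat an index-finger tap on the back of the hand as "next song":

# Illustrative sketch: map classified fingertip events to music-player actions.
MAX_HEIGHT_MM = 80.0   # assumed usable range of the space above the hand

def height_to_volume(height_mm: float) -> int:
    """Map a mid-air height (0..MAX_HEIGHT_MM) to a volume level (0..100)."""
    clamped = min(max(height_mm, 0.0), MAX_HEIGHT_MM)
    return round(clamped / MAX_HEIGHT_MM * 100)

def handle_event(event: dict, player_state: dict) -> dict:
    """Update a toy player state from a classified fingertip event."""
    if event["finger"] == "thumb" and event["event"] == "mid_air":
        player_state["volume"] = height_to_volume(event["pos"][2])
    elif event["finger"] == "index" and event["event"] == "touch":
        player_state["track"] += 1          # on-skin tap means "next song"
    return player_state

if __name__ == "__main__":
    state = {"volume": 50, "track": 1}
    state = handle_event({"finger": "thumb", "event": "mid_air", "pos": (0, 0, 60.0)}, state)
    state = handle_event({"finger": "index", "event": "touch", "pos": (12.0, 30.0)}, state)
    print(state)   # e.g. {'volume': 75, 'track': 2}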

From May 6, the researchers will present WatchSense at the Conference on Human Factors in Computing Systems, or CHI for short, which this year takes place in Denver in the US.

Find more information at: http://handtracker.mpi-inf.mpg.de/projects/WatchSense/


Story Source:

Materials provided by Saarland University. Note: Content may be edited for style and length.


