
In the future, we will control our mobiles using gestures

Date:
April 4, 2017
Source:
Linnaeus University
Summary:
Being able to interact with mobile phones and other smart devices using gestures with our hands and fingers in three dimensions would make the digital world more like the real one. A new project will help develop this next-generation interface.

Being able to interact with mobile phones and other smart devices using gestures with our hands and fingers in three dimensions would make the digital world more like the real one. A new project at Linnaeus University will help develop this next-generation interface.

Width, height, depth -- the world is three-dimensional. So why are almost all interactions with mobile devices based on two dimensions only?

Sure, touchscreens have made interaction much easier, but as new devices such as smart watches and virtual-reality glasses emerge, we will no longer be content with two dimensions. We want to interact in 3D, using our hands in front of and around our digital devices. This is where the new project Real-Time 3D Gesture Analysis for Natural Interaction with Smart Devices comes in -- a project that aims to deliver the next big development in interface technology.

"The goal of our project is that you will get the same experience of, for example, grabbing and twisting an object in the digital world as in the real one," says Shahrouz Yousefi, senior lecturer in media technology at Linnaeus University and project manager.

The applications are many and diverse -- virtual and augmented reality (VR and AR), medical settings, robotics, e-learning, 3D games and much more. Analysing the movements of the hands and individual fingers in real time, however, demands both high capacity and high intelligence from the system handling it. It involves large amounts of data and advanced analyses, especially when the system must track the movements of several hands simultaneously.

"A key issue is how we can develop and use new techniques for the analysis of so-called big data to analyse gestures and movements, and how we can integrate them with existing solutions for computer vision and pattern recognition to tackle high-degree-of-freedom 3D gesture analysis," Yousefi continues.

The project is funded with SEK 4,393,590 by the HÖG 16 programme of KK-stiftelsen (the Knowledge Foundation) and will run for three years. Three companies -- Screen Interaction, MindArk PE and Globalmouth -- participate as active partners. They will contribute unique knowledge and expertise in the field, as well as equipment, so that research, development and implementation lead as quickly as possible to practical applications that can be demonstrated.


Story Source:

Materials provided by Linnaeus University. Note: Content may be edited for style and length.


Cite This Page:

Linnaeus University. "In the future, we will control our mobiles using gestures." ScienceDaily. ScienceDaily, 4 April 2017. <www.sciencedaily.com/releases/2017/04/170404085749.htm>.
