
'Magic Bench' lets users see, hear and feel animated characters

Date:
July 26, 2017
Source:
Disney Research
Summary:
Sit on Disney Research's Magic Bench and you may have an elephant hand you a glowing orb. Or you might get rained on. Or a tiny donkey might saunter by and kick the bench. It's a combined augmented and mixed reality experience, but not the type that involves wearing a head-mounted display or using a handheld device. Instead, the surroundings are instrumented rather than the individual, allowing people to share the 'magical' experience as a group.
FULL STORY

Sit on Disney Research's Magic Bench and you may have an elephant hand you a glowing orb. Or you might get rained on. Or a tiny donkey might saunter by and kick the bench.

It's a combined augmented and mixed reality experience, but not the type that involves wearing a head-mounted display or using a handheld device. Instead, the surroundings are instrumented rather than the individual, allowing people to share the magical experience as a group.

People seated on the Magic Bench can see themselves in a mirrored image on a large screen in front of them, creating a third-person point of view. The scene is reconstructed using a depth sensor, allowing participants to occupy the same 3D space as a computer-generated character or object rather than having one video feed simply superimposed onto another.

"This platform creates a multi-sensory immersive experience in which a group can interact directly with an animated character," said Moshe Mahler, principal digital artist at Disney Research. "Our mantra for this project was: hear a character coming, see them enter the space, and feel them sit next to you."

The research team will present and demonstrate the Magic Bench at SIGGRAPH 2017, the Computer Graphics and Interactive Techniques Conference, beginning July 30 in Los Angeles.

The researchers used a color camera and a depth sensor to create a real-time, HD-video-textured 3D reconstruction of the bench, its surroundings, and the participants. The reconstruction algorithm aligns the RGB camera data with the depth sensor data to rebuild the scene.
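
The paper has the details of the team's pipeline, but the core idea of fusing the two sensors can be illustrated with a short sketch. The snippet below is a simplified illustration, not Disney's actual code: it back-projects each depth pixel into 3D using the depth camera's intrinsics, transforms the points into the color camera's frame, and samples the RGB image to texture them. All camera parameters (intrinsics and the depth-to-color extrinsic transform) are assumed inputs.

```python
# Minimal RGB-D fusion sketch (illustrative only): texture a depth-derived
# point cloud with pixels from a separate color camera.
import numpy as np

def rgbd_to_colored_points(depth_m, color, K_depth, K_color, T_depth_to_color):
    """depth_m: HxW depth in meters; color: HcxWcx3 uint8 image;
    K_*: 3x3 intrinsics; T_depth_to_color: 4x4 depth-to-color transform."""
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_m.ravel()
    valid = z > 0                                  # zero depth = no measurement
    u, v, z = u.ravel()[valid], v.ravel()[valid], z[valid]

    # Back-project depth pixels into 3D points in the depth camera's frame.
    fx, fy, cx, cy = K_depth[0, 0], K_depth[1, 1], K_depth[0, 2], K_depth[1, 2]
    pts = np.stack([(u - cx) * z / fx, (v - cy) * z / fy, z], axis=1)

    # Transform the points into the color camera's frame.
    pts_c = pts @ T_depth_to_color[:3, :3].T + T_depth_to_color[:3, 3]
    front = pts_c[:, 2] > 1e-6                     # keep points in front of the color camera
    pts, pts_c = pts[front], pts_c[front]

    # Project into the color image and sample RGB values as the texture.
    uc = np.round(pts_c[:, 0] * K_color[0, 0] / pts_c[:, 2] + K_color[0, 2]).astype(int)
    vc = np.round(pts_c[:, 1] * K_color[1, 1] / pts_c[:, 2] + K_color[1, 2]).astype(int)
    inside = (uc >= 0) & (uc < color.shape[1]) & (vc >= 0) & (vc < color.shape[0])
    return pts[inside], color[vc[inside], uc[inside]]   # 3D points + per-point RGB
```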

Depth shadows occur in areas the color camera can see but the depth sensor cannot; to eliminate them, a modified algorithm creates a 2D backdrop. The 3D and 2D reconstructions are positioned in virtual space and populated with 3D characters and effects, so the resulting real-time rendering is a seamless composite that interacts fully with virtual physics, light, and shadows.
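
As a rough illustration of that idea (an assumption-laden sketch, not the modified algorithm the team describes), the snippet below flattens pixels with missing depth onto a fixed-distance 2D backdrop and then composites a rendered character with a simple depth test, so the character can pass in front of or behind the real scene without holes where depth shadows occur.

```python
# Sketch of backdrop fill plus depth-test compositing (illustrative only).
import numpy as np

def fill_depth_shadows(depth_m, backdrop_dist_m=3.0):
    """Return (filled_depth, shadow_mask). Zero depth marks a shadowed pixel."""
    shadow_mask = depth_m <= 0
    filled = depth_m.copy()
    filled[shadow_mask] = backdrop_dist_m   # push unknown regions onto a flat 2D backdrop
    return filled, shadow_mask

def composite(color, depth_m, char_rgba, char_depth_m):
    """Depth-test composite: draw the rendered character wherever it is
    closer to the camera than the (backdrop-filled) real scene."""
    depth_filled, _ = fill_depth_shadows(depth_m)
    char_in_front = (char_rgba[..., 3] > 0) & (char_depth_m < depth_filled)
    out = color.copy()
    out[char_in_front] = char_rgba[..., :3][char_in_front]
    return out
```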

"The bench itself plays a critical role," Mahler said. "Not only does it contain haptic actuators, but it constrains several issues for us in an elegant way. We know the location and the number of participants, and can infer their gaze. It creates a stage with a foreground and a background, with the seated participants in the middle ground. It even serves as a controller; the mixed reality experience doesn't begin until someone sits down and different formations of people seated create different types of experiences."

The Magic Bench was the work of a large team of Disney Research scientists. In addition to Mahler, the team included Kyna McIntosh, John Mars, James Krahe, Jim McCann, Alexander Rivera, Jake Marsico, Ali Israr and Shawn Lawson.


Story Source:

Materials provided by Disney Research. Note: Content may be edited for style and length.


