Perfecting digital imaging

Date:
July 23, 2013
Source:
Harvard School of Engineering and Applied Sciences
Summary:
Computer graphics and digital video lag behind reality; despite advances, the best software and video cameras still cannot seem to get computer-generated images and digital film to look exactly the way our eyes expect them to.

The subtleties in these computer-generated images of translucent materials are important. Texture, color, contrast, and sharpness combine to create a realistic image.
Credit: Image courtesy of Ioannis Gkioulekas and Shuang Zhao

Computer graphics and digital video lag behind reality; despite advances, the best software and video cameras still cannot seem to get computer-generated images and digital film to look exactly the way our eyes expect them to.

But Hanspeter Pfister and Todd Zickler, computer science faculty at the Harvard School of Engineering and Applied Sciences (SEAS), are working to narrow the gap between 'virtual' and 'real' by asking a common question: how do we see what we see?

Between them, Pfister and Zickler are presenting three papers this week at SIGGRAPH 2013, the 40th International Conference and Exhibition on Computer Graphics and Interactive Techniques.

Realistic soap

One project led by Zickler, the William and Ami Kuan Danoff Professor of Electrical Engineering and Computer Science, tries to find better ways to mimic the appearance of a translucent object, such as a bar of soap. The paper elucidates how humans perceive and recognize real objects and how software can exploit the details of that process to make the most realistic computer-rendered images possible.

"If I put a block of butter and a block of cheese in front of you, and they're the same color, and you're looking for something to put on your bread, you know which is which," says Zickler. "The question is, how do you know that? What in the image is telling you something about the material?"

His hope is to eventually understand these properties well enough to instruct a computer with a camera to identify what material an object is made of and to know how to properly handle it -- how much it weighs or how much pressure to safely apply to it -- the way humans do.

The new approach focuses on a translucent material's phase function -- the part of a mathematical description of how light scatters inside an object and, therefore, of how we see what we see.
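
For concreteness, one widely used textbook example of a phase function (not necessarily one of the functions explored in this study) is the Henyey-Greenstein function, which gives the likelihood that light traveling through a material scatters by a given angle. A minimal sketch in Python:

import numpy as np

def henyey_greenstein(cos_theta, g=0.5):
    """Henyey-Greenstein phase function: probability density (per unit solid angle)
    that light inside a material scatters by an angle whose cosine is cos_theta.
    g in (-1, 1) sets how strongly forward (g > 0) or backward (g < 0) the
    scattering is; g = 0 means light scatters equally in all directions."""
    return (1.0 - g**2) / (4.0 * np.pi * (1.0 + g**2 - 2.0 * g * cos_theta) ** 1.5)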

In the past, the shape of the phase function was considered relevant to an object's translucent appearance, but formal perceptual studies had never been carried out: the space of possible phase functions is so vast, and so perceptually diverse to the human brain, that generating and analyzing enough images to explore it only became practical with modern computational tools.

Zickler's team took advantage of increased computational power to trim the potential space of images down to a manageable size. They first rendered thousands of computer-generated images of a single object, each with a different computer-simulated phase function, so that each image's translucency differed slightly from the next. A program then compared the pixel colors and brightness of every pair of images and scored how different the two were. From those scores, the software built a map of the phase function space, making it easy for the researchers to identify a much smaller set of images and phase functions that were representative of the whole space.
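
A rough sketch of that pipeline in Python -- with hypothetical helper names standing in for the team's actual renderer, distance measure, and embedding method, none of which are specified here:

import numpy as np
from itertools import combinations
from sklearn.manifold import MDS  # one possible way to build the map; an assumption

# Hypothetical: render the same translucent object once per candidate phase function.
# render_with_phase_function() stands in for a physically based renderer.
images = [render_with_phase_function(pf) for pf in candidate_phase_functions]

# Compare the pixel colors and brightness of every pair of renderings.
n = len(images)
distances = np.zeros((n, n))
for i, j in combinations(range(n), 2):
    d = np.linalg.norm(images[i].astype(float) - images[j].astype(float))
    distances[i, j] = distances[j, i] = d

# Lay the phase functions out on a map according to those pairwise differences,
# then keep a small, well-spread set of representatives for the human study.
layout = MDS(n_components=2, dissimilarity="precomputed").fit_transform(distances)
representatives = pick_well_spread_subset(layout, k=20)  # hypothetical selection step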

Finally, the researchers asked people to compare these representative images and judge how similar or different they were, shedding light on the properties that help us decide which objects are plastic and which are soap simply by looking at them.

"This study, aiming to understand the appearance space of phase functions, is the tip of the iceberg for building computer vision systems that can recognize materials," says Zickler. The next step in this research will involve finding ways to accurately measure a material's phase functions instead of making them up computationally, and Zickler's team is already making progress on this, with a new system that will be presented at SIGGRAPH Asia in December.

Zickler's coauthors were Ioannis Gkioulekas, a graduate student at Harvard SEAS; Bei Xiao and Edward H. Adelson of MIT; and Shuang Zhao and Kavita Bala of Cornell University.

Adaptive displays

A second study involving Zickler investigates a new type of screen hardware that displays different images when lit or viewed from different directions.

By creating tiny grooves of varying depths across the screen's surface, Zickler's team created optical interference effects that cause the thin surface to look different when illuminated or viewed from different angles.

The paper essentially asks, "If I know what appearances I want the screen to have, how do I optimize the geometric structure to get that?" Zickler explains.

The solution takes advantage of mathematical functions (called bidirectional reflectance distribution functions, or BRDFs) that describe how light arriving from one direction reflects off a surface toward each viewing direction.
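
As a toy illustration of what such a function encodes (a standard Lambertian-plus-Phong model, not the surface model from the paper):

import numpy as np

def simple_brdf(normal, light_dir, view_dir, albedo=0.7, specular=0.3, shininess=32):
    """Toy Lambertian-plus-Phong reflectance: how much light arriving from
    light_dir leaves toward view_dir. All vectors are unit length and point
    away from the surface."""
    diffuse = albedo / np.pi  # matte part, the same in every viewing direction
    mirror = 2.0 * np.dot(normal, light_dir) * normal - light_dir  # mirror direction
    highlight = specular * max(np.dot(mirror, view_dir), 0.0) ** shininess
    return diffuse + highlight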

Past attempts to control surface reflection for graphics applications have only succeeded at coarse scales -- surfaces whose "pixels" are, for example, roughly a square inch in size -- because their analyses did not account for interference effects. Zickler's work, however, demonstrates that interference effects can be exploited to control reflection from a screen at micron scales using well-known photolithographic techniques.
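
A minimal sketch of why groove depth matters, using textbook two-level interference rather than the paper's full analysis: light reflected from the bottom of a groove travels farther than light reflected from the top, and the two contributions interfere. For a surface split evenly between the two heights, the mirror-direction reflection varies with wavelength, angle, and depth roughly as follows:

import numpy as np

def specular_fraction(groove_depth_nm, wavelength_nm, incidence_deg=0.0):
    """Toy model: fraction of light surviving in the mirror direction when half
    the surface sits groove_depth_nm lower than the other half, assuming equal
    reflection amplitudes from both levels."""
    theta = np.radians(incidence_deg)
    path_difference = 2.0 * groove_depth_nm * np.cos(theta)  # extra round trip
    phase = 2.0 * np.pi * path_difference / wavelength_nm
    # Equal-amplitude interference of the two contributions: |(1 + e^{i*phase}) / 2|^2
    return np.cos(phase / 2.0) ** 2

# Example: a roughly quarter-wavelength groove cancels green light (550 nm)
# in the mirror direction while leaving other wavelengths partly reflected.
print(specular_fraction(groove_depth_nm=137.5, wavelength_nm=550.0))  # ~0.0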

In the future, this kind of optimization could enable multi-view, lighting-sensitive displays, where a viewer rotating around a flat surface could perceive a three-dimensional object while looking at the surface from different angles, and where the virtual object would correctly respond to external lighting.

"Looking at such a display would be exactly like looking through a window," Zickler says.

He was joined on this paper by Ying Xiong, a graduate student at Harvard SEAS; Anat Levin and Daniel Glasner at the Weizmann Institute of Science; and Frédo Durand, William Freeman, and Wojciech Matusik at MIT.

Vivid color

A third paper, led by Hanspeter Pfister, An Wang Professor of Computer Science, tackled a problem in digital film editing. (Video: http://youtu.be/cYbDJ4NR6WY)

Color grading -- editing a video to impose a particular color palette -- has historically been a painstaking, manual process requiring many hours' work by skilled artists. Amateur filmmakers therefore cannot achieve the characteristically rich color palettes of professional films.

"The starting idea was to appeal to broad audience, like the millions of people on YouTube," says lead author Nicolas Bonneel, a postdoctoral researcher in Pfister's group at SEAS.

Pfister's team hopes to make frame-by-frame editing unnecessary by creating software that lets users simply select, hypothetically, the Amélie look or the Transformers look. The computer would then apply that color palette to the user's video via a few representative frames. The user only has to indicate where the foreground and background are in each frame, and the software does the rest, interpolating the color transformations throughout the video.
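
A much-simplified illustration of the idea -- matching each color channel's statistics in a frame to those of a reference frame, in the spirit of classic color-transfer methods but not the authors' actual algorithm, which treats foreground and background separately and interpolates the transformation across the whole video:

import numpy as np

def match_color_stats(frame, reference):
    """Shift and scale each color channel of `frame` so its mean and standard
    deviation match those of `reference` -- a crude stand-in for color grading."""
    frame = frame.astype(float)
    reference = reference.astype(float)
    out = np.empty_like(frame)
    for c in range(frame.shape[2]):
        f_mean, f_std = frame[..., c].mean(), frame[..., c].std()
        r_mean, r_std = reference[..., c].mean(), reference[..., c].std()
        out[..., c] = (frame[..., c] - f_mean) / (f_std + 1e-8) * r_std + r_mean
    return np.clip(out, 0, 255).astype(np.uint8)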

Bonneel estimates that the team's new color grading method could be incorporated into commercially available editing software within the next few years.

Pfister and Bonneel were joined on this paper by Kalyan Sunkavalli and Sylvain Paris of Adobe Systems, Inc.


Story Source:

The above story is based on materials provided by Harvard School of Engineering and Applied Sciences. The original article was written by Manny Morone. Note: Materials may be edited for content and length.

