
New Breed Of Supercomputers Proposed To Improve Climate Change Prediction Accuracy

Date: May 7, 2008
Source: DOE/Lawrence Berkeley National Laboratory
Summary: Three researchers have proposed an innovative way to improve global climate change predictions by using a supercomputer with low-power embedded microprocessors, an approach that would overcome limitations posed by today's conventional supercomputers.

Berkeley Lab has signed a collaboration agreement with Tensilica®, Inc. to explore the use of Tensilica's Xtensa processor cores as the basic building blocks in a massively parallel system design. Tensilica's Xtensa processor is about 400 times more efficient in floating point operations per watt than the conventional server processor chip shown here.
Credit: Image courtesy of DOE/Lawrence Berkeley National Laboratory

Three researchers from the U.S. Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) have proposed an innovative way to improve global climate change predictions by using a supercomputer with low-power embedded microprocessors, an approach that would overcome limitations posed by today’s conventional supercomputers.

In a paper published in the May issue of the International Journal of High Performance Computing Applications, Michael Wehner and Lenny Oliker of Berkeley Lab’s Computational Research Division, and John Shalf of the National Energy Research Scientific Computing Center (NERSC) lay out the benefits of a new class of supercomputers for modeling climate conditions and understanding climate change. Using the embedded microprocessor technology found in cell phones, iPods, toaster ovens and most other modern-day electronic conveniences, they propose designing a cost-effective machine for running these models and improving climate predictions.

In April, Berkeley Lab signed a collaboration agreement with Tensilica®, Inc. to explore such new design concepts for energy-efficient high-performance scientific computer systems. The joint effort is focused on novel processor and systems architectures using large numbers of small processor cores, connected together with optimized links, and tuned to the requirements of highly-parallel applications such as climate modeling.

Understanding how human activity is changing global climate is one of the great scientific challenges of our time. Scientists have tackled this issue by developing climate models that use the historical data of factors that shape the earth’s climate, such as rainfall, hurricanes, sea surface temperatures and carbon dioxide in the atmosphere. One of the greatest challenges in creating these models, however, is to develop accurate cloud simulations.

Although cloud systems have been included in climate models in the past, they lack the details that could improve the accuracy of climate predictions. Wehner, Oliker and Shalf set out to establish a practical estimate for building a supercomputer capable of creating climate models at 1-kilometer (km) scale. A cloud system model at the 1-km scale would provide rich details that are not available from existing models.

To develop a 1-km cloud model, scientists would need a supercomputer that is 1,000 times more powerful than what is available today, the researchers say. But building a supercomputer powerful enough to tackle this problem is a huge challenge.
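One way to see why the jump to 1-km resolution is so demanding is a simple scaling estimate. The sketch below is purely illustrative and is not taken from the researchers' paper: it assumes a nominal 10-kilometer baseline grid and that computational cost grows roughly with the cube of the horizontal refinement factor (more grid columns in each horizontal direction plus a proportionally shorter timestep).

```python
# Illustrative resolution-scaling estimate; the baseline grid spacing and the
# cubic cost law are assumptions for illustration, not figures from the paper.
# Refining the grid by a factor n multiplies the number of grid columns by n^2
# and, through the shorter stable timestep, the number of timesteps by n.

baseline_km = 10.0   # assumed grid spacing of an existing global model
target_km = 1.0      # cloud-resolving scale discussed in the article

refinement = baseline_km / target_km        # n = 10
cost_multiplier = refinement ** 3           # n^2 columns * n timesteps

print(f"Refinement factor: {refinement:.0f}x")
print(f"Estimated increase in computing requirement: {cost_multiplier:,.0f}x")
```

Under these assumptions the requirement grows by roughly a factor of 1,000, in line with the researchers' estimate.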

Historically, supercomputer makers have built larger and more powerful systems by increasing the number of conventional microprocessors — usually the same kinds of microprocessors used to build personal computers. Although this approach is feasible for building computers large enough to solve many scientific problems, using it to build a system capable of modeling clouds at a 1-km scale would cost about $1 billion. The system would also require 200 megawatts of electricity to operate, enough energy to power a small city of 100,000 residents.

In their paper, “Towards Ultra-High Resolution models of Climate and Weather,” the researchers present a radical alternative that would cost less to build and require less electricity to operate. They conclude that a supercomputer using about 20 million embedded microprocessors would deliver the results and cost $75 million to construct. This “climate computer” would consume less than 4 megawatts of power and achieve a peak performance of 200 petaflops.
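Taking the published figures for the two designs together makes the trade-off concrete. The short script below simply restates the numbers quoted in this article and derives a few ratios from them; the 4-megawatt figure is treated as an upper bound, as stated.

```python
# Back-of-envelope comparison using only figures quoted in the article.

conventional = {"cost_usd": 1_000_000_000, "power_mw": 200}  # conventional-processor design
embedded     = {"cost_usd":    75_000_000, "power_mw":   4}  # proposed design ("less than 4 MW")
peak_pflops = 200           # stated peak performance of the proposed machine
city_residents = 100_000    # the article equates 200 MW with a city of this size

print(f"Cost ratio (conventional / embedded):  {conventional['cost_usd'] / embedded['cost_usd']:.1f}x")
print(f"Power ratio (conventional / embedded): {conventional['power_mw'] / embedded['power_mw']:.0f}x")
print(f"Implied power per resident:            {conventional['power_mw'] * 1e6 / city_residents:.0f} W")
print(f"Proposed machine: {peak_pflops} Pflop/s peak at under {embedded['power_mw']} MW")
```

On those numbers, the embedded-processor design would cost roughly a thirteenth as much to build and draw at most a fiftieth of the power.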

“Without such a paradigm shift, power will ultimately limit the scale and performance of future supercomputing systems, which will therefore fail to meet the demanding computational needs of important scientific challenges like climate modeling,” Shalf said.

The researchers arrive at their findings by extrapolating performance data from the Community Atmospheric Model (CAM). CAM, developed at the National Center for Atmospheric Research in Boulder, Colorado, is a series of global atmosphere models commonly used by weather and climate researchers.

The “climate computer” is not merely a concept. Wehner, Oliker and Shalf, along with researchers from UC Berkeley, are working with scientists from Colorado State University to build a prototype system in order to run a new global atmospheric model developed at Colorado State.

“What we have demonstrated is that in the exascale computing regime, it makes more sense to target machine design for specific applications,” Wehner said. “It will be impractical from a cost and power perspective to build general-purpose machines like today’s supercomputers.”

Under the agreement with Tensilica, the team will use Tensilica’s Xtensa LX extensible processor cores as the basic building blocks in a massively parallel system design. Each processor will dissipate a few hundred milliwatts of power, yet deliver billions of floating point operations per second and be programmable using standard programming languages and tools. This equates to an order-of-magnitude improvement in floating point operations per watt, compared to conventional desktop and server processor chips. The small size and low power of these processors allow tight integration at the chip, board and rack level and scaling to millions of processors within a power budget of a few megawatts.
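A quick arithmetic check shows that these per-processor figures are consistent with the system-level numbers quoted earlier. The specific values below are assumptions used only to illustrate the arithmetic: “a few hundred milliwatts” is taken as 0.2 watts and “billions of floating point operations per second” as 10 gigaflops per core.

```python
# Rough consistency check of per-core versus system-level figures.
# Per-core values are illustrative assumptions, not exact numbers from the article.

n_cores     = 20_000_000   # embedded processors in the proposed machine
core_watts  = 0.2          # assumed per-core power draw ("a few hundred milliwatts")
core_gflops = 10.0         # assumed per-core rate ("billions of flops per second")

system_mw     = n_cores * core_watts / 1e6     # total power in megawatts
system_pflops = n_cores * core_gflops / 1e6    # total peak in petaflop/s

print(f"System power:            ~{system_mw:.0f} MW (article: under 4 MW)")
print(f"System peak performance: ~{system_pflops:.0f} Pflop/s (article: 200 Pflop/s)")
print(f"Per-core efficiency:     {core_gflops / core_watts:.0f} Gflop/s per watt")
```

Twenty million such cores at a fifth of a watt apiece land at the stated 4-megawatt budget and 200-petaflop peak.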


Story Source:

The above story is based on materials provided by DOE/Lawrence Berkeley National Laboratory. Note: Materials may be edited for content and length.


