Get ready for computers of the future

Date:
May 29, 2014
Source:
Sandia National Laboratories
Summary:
Experts expect multiple device-level computing technologies in the future, rather than one dominant architecture. About a dozen possible next-generation candidates exist, including tunnel FETs (field-effect transistors, in which the output current is controlled by a variable electric field), carbon nanotubes, superconductors and fundamentally new approaches, such as quantum computing and brain-inspired computing.

Computing experts at Sandia National Laboratories have launched an effort to help discover what computers of the future might look like, from next-generation supercomputers to systems that learn on their own -- new machines that do more while using less energy.

"We think that by combining capabilities in microelectronics and computer architecture, Sandia can help initiate the jump to the next technology curve sooner and with less risk," said Rob Leland, head of Sandia's Computing Research Center. Leland recently outlined a major effort into next-generation computing called Beyond Moore Computing that's part of Sandia's overall work on future computing.

For decades, the computer industry operated under Moore's Law, named for Intel Corp. co-founder Gordon Moore, who in 1965 postulated it was economically feasible to improve the density, speed and power of integrated circuits exponentially over time. But speed has plateaued, the energy required to run systems is rising sharply and industry can't indefinitely continue to cram more transistors onto chips.

The plateauing of Moore's Law is driving up the energy costs of modern scientific computers to the point that, if current trends hold, more powerful future supercomputers would be impractical to operate because of their enormous energy consumption.

Solving that conundrum will require new computer architecture that reduces energy costs, which are principally associated with moving data, Leland said. Eventually, computing also will need new technology that uses less energy at the transistor device level, he added.

Sandia experts expect multiple device-level computing technologies in the future, rather than one dominant architecture. About a dozen possible next-generation candidates exist, including tunnel FETs (field-effect transistors, in which the output current is controlled by a variable electric field), carbon nanotubes, superconductors and fundamentally new approaches, such as quantum computing and brain-inspired computing.

Sandia's facilities will play key role in researching future computing technology

Sandia is well positioned to work on future computing technology due to its broad and long history in supercomputing, from architecture to algorithms to applications. Leland said Sandia can play a key role because of that far-reaching background and two key facilities: the Microsystems and Engineering Sciences Applications (MESA) complex, which performs multidisciplinary microsystems research and development and fabricates chips to test ideas; and the Center for Integrated Nanotechnologies (CINT), a Department of Energy Office of Science national user facility operated by Sandia and Los Alamos national laboratories.

No one is sure what tomorrow's high performance computers will look like. "We have some ideas, of course, and we have different camps of opinion about what it might look like, but we're really right in the midst of figuring that out," Leland said.

Erik DeBenedictis of Sandia's Advanced Device Technologies department said Sandia can play an important role in creating breakthroughs that are not simply variations on the transistor -- developments such as computers that learn, or technologies that move data from one part of the computer to another more efficiently, a capability crucial for big data problems.

What ultimately prevails might well be something not yet invented, Leland said.

"That's the first challenge, to figure out what the new device technology is, then work through what the implications of that are, what sort of computer architecture is required to assemble that device into components and subsystems and systems," he said.

New technology must be broadly adopted to drive improvements

Sandia needs both capability computing, which means finer resolution and more accuracy, and capacity computing, or running many different jobs simultaneously.

"So what does efficiency buy you? It allows you to have a bigger computer or more computers with the same amount of operating expense -- paying your power bill," said Advanced Device Technologies department manager John Aidun. "There's no limit to the amount of efficiency we would like to achieve because really there's no limit to the amount of computing we would like to do."

Whatever technology comes next must be broadly adopted so it will drive continual improvements, similar to the way the 1947 invention of the transistor transformed society. It's not enough to have a device that's fast; it has to be something that can be built into a complete computer system, Aidun said.

Thus, new technology must have commercial uses. "There will have to be some industrial base that supports it and produces it and that can be used to assemble a large number of these into a system that can be deployed for national security," Leland said. "What we'd really like to do is figure out how to advance the state of the art for national security in a way that is more broadly deployable across society."

The computer industry is exploring technologies that in essence are drop-in replacements for transistors with improved characteristics: different designs such as the fin FET, a 3-D rather than a flat configuration on a computer chip, Aidun said. While the design would be moderately disruptive for industry, it's still compatible with standard silicon fab technology and opens the potential for generations of ever-smaller fin FETs on a chip, he said.

While industry views a beyond-transistor technology as something far off, Sandia's national security interests anticipate bigger changes will be needed sooner than industry would develop them on its own, Aidun said. He estimated Sandia could have a prototype new technology within a decade.

Identifying best computer designs can help accelerate innovation

To accelerate the process, Sandia wants to identify computer designs that could take advantage of new device technologies and to demonstrate key components or fabrication steps, lowering the risk for industry by establishing technological feasibility.

"We'd be doing it with an eye toward helping industry give due attention to national security needs in computing," Aidun said.

The numerical capability developed in computers during World War II remains valuable today for such tasks as nuclear weapons simulations. But the modern era's largest computing development -- the Internet -- deals largely with text and relies on a different kind of computation, integer calculation, which is also used in mobile computing.

Improving mobile computing could allow much more efficient and rapid data processing aboard satellites, so less data would need to be sent to Earth for processing.

"The mobility we see in cell phones and tablets is the closest match for the mobility needs of UAVs and satellites," DeBenedictis said. "The energy and time required to transmit data to the ground, process it there and send the answer back is a bottleneck, and it can be more resource-intensive than just computing on the device."

He also suggested turning more programming over to cognitive computers to help programmers manage ever-faster computers. "While computers have gotten millions of times faster, programmers and analysts are pretty much as efficient as they've always been," he said.

Cognitive computing can play role in pattern recognition

Cognitive computers might be able to do more to recognize patterns in satellite imagery, for example. People would still make the judgments, but computers would help by recognizing some lower-level patterns, he said. Until now, programmers have had to create ways for computers to recognize images; the computers have not learned on their own. A cognitive computer, however, would learn to identify patterns, DeBenedictis said.

"A computer can learn to recognize images pretty well. Humans assisted by a computer recognizing images could improve the ability significantly," he said.

Researchers also must determine what hardware and software changes are needed so new devices are both possible to manufacture and practical to operate. "You have to design over all those different considerations," Leland said. "That's what makes this a particularly challenging problem."

Today's computer systems rely on longstanding investments in massive amounts of software.

"So we are strongly motivated to develop computers that will run old software that was optimized for traditional computer architectures that are not used today," DeBenedictis said. "To break out of that, we have to find different architectures that are more energy efficient at running old code and are more easily programmed for new code, or architectures that can learn some behaviors that once required programming."

Since the software of today won't unleash the full capabilities of the hardware of tomorrow, he expects computers in about a decade that can run both today's software and new software. New software "would learn or would process information in fundamentally different ways, and become the most powerful aspect of the computer over time," he said.


Story Source:

Materials provided by Sandia National Laboratories. Note: Content may be edited for style and length.


