Faster than 50 million laptops – the race to go exascale

Story highlights

ExaFLOP supercomputers will perform a billion billion operations every second

Exascale computer would be around 100 times faster than today's quickest machine

Issues over power consumption remain a stumbling block to their creation

First exaFLOP computers expected to go online around 2020

CNN  — 

A new era in computing that will see machines perform around 100 times faster than today’s most powerful supercomputer is almost upon us.

By the end of the decade, exaFLOP computers are predicted to go online, heralding a new chapter in scientific discovery.

The United States, China, Japan, the European Union and Russia are all investing millions of dollars in supercomputer research. In February, the EU announced it was doubling investment in research to €1.2 billion ($1.6 billion).


What is an exaFLOP?

Computer scientists measure a supercomputer’s performance in FLOPS, an acronym for FLoating Point Operations per Second, while “exa” is a metric prefix which stands for quintillion (or a billion billion). An exascale computer could perform approximately as many operations per second as 50 million laptops.
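For readers who want to see where the 50 million figure comes from, here is a rough back-of-envelope check in Python. The per-laptop number is an assumption used for illustration, not a figure from the article: a typical laptop of the era sustaining around 20 gigaFLOPS.

```python
# Rough check of the "50 million laptops" comparison.
# Assumption (not from the article): a typical laptop sustains
# about 20 gigaFLOPS, i.e. 2e10 floating point operations per second.
EXAFLOP = 10**18            # "exa" = quintillion (a billion billion)
LAPTOP_FLOPS = 20 * 10**9   # assumed per-laptop performance

laptops_needed = EXAFLOP / LAPTOP_FLOPS
print(f"{laptops_needed:,.0f} laptops")  # 50,000,000 laptops
```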

“It is the next frontier for high-performance computing,” says Dimitrios Nikolopoulos, professor at the School of Electronics at the UK’s Queen’s University of Belfast.


How fast are today’s supercomputers?

Today, the fastest supercomputers operate at the petaFLOP level, says Nikolopoulos, performing in excess of one quadrillion (or a million billion) operations per second.

The first computer to break through the petaFLOP barrier was IBM’s Roadrunner in 2008. But its reign as the fastest computer in the world didn’t last long, with the Cray Jaguar, installed at Oak Ridge National Laboratory in the United States, taking the top spot in 2009 with a performance of 1.75 petaFLOPS.

Today, the crown is held by Japan’s K computer, developed by RIKEN and Fujitsu, according to TOP500, a project that tracks trends in high-performance computing.

The machine, installed at the RIKEN Advanced Institute for Computational Science in Kobe, Japan, currently operates at over 10 petaFLOPS. It is more than four times faster than its nearest rival, China’s NUDT YH MPP computer (2.57 petaFLOPS).
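Those figures also pin down the gap that exascale would close. The short Python sketch below works out the ratios; the 10.5 petaFLOPS value for the K computer is an assumed round figure based on its reported "over 10 petaFLOPS", not an exact number from the article.

```python
# Rough speed ratios implied by the figures above.
# Assumption: 10.5 petaFLOPS for the K computer ("over 10 petaFLOPS" in the text);
# 2.57 petaFLOPS is the quoted figure for the NUDT machine.
PETA, EXA = 10**15, 10**18

k_computer = 10.5 * PETA
nudt_yh_mpp = 2.57 * PETA

print(round(k_computer / nudt_yh_mpp, 1))  # ~4.1: "more than four times faster"
print(round(EXA / k_computer))             # ~95: an exaFLOP machine is roughly 100x the K computer
```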

How big are they?

“The kind of space that you need is similar to that of a football field. You’re talking about many, many lanes of computer racks and thousands of processors,” says Nikolopoulos.

The K computer contains a mind-boggling 88,128 computer processors and is made up of 864 refrigerator-sized cabinets.

Physically, exascale computers won’t get any bigger, says Nikolopoulos, and might even get a little smaller. But the number of processors will rise substantially, to anywhere between one million and 100 million.


What are the challenges of reaching exascale?

Nikolopoulos says “severe technology barriers” remain, the most important being power. “Power consumption of supercomputers in general is not sustainable,” he says.

“The current projections suggest that power consumption of exascale computers will be 100 megawatts. It’s impossible to build a suitable facility and have enough power.”

Historically, a computer’s processor has used the most power (around 40-50% of the total), says Nikolopoulos, but memory is rapidly catching up.

“Changing materials and also the architecture of processors and memories is critical to exascale’s success,” he says.

“We are beginning to understand the challenges of exascale in terms of hardware, software and applications. We are at the stage where we can make mental projections and set up directions for research.”

What benefits could exascale computing bring?

It will enable discovery in many areas of science, says Nikolopoulos. “Aerospace engineering, astrophysics, biology, climate modeling and national security all have applications with extreme computing requirements,” he says.


Bill Cabbage, public information officer at Oak Ridge National Laboratory, says exascale will attempt to tackle very serious challenges in energy supply and sustainability.

“These are very difficult problems and will require the development of new forward-thinking technologies to deal with them,” Cabbage said.

“We are bringing all our resources to bear on these problems,” he added.

Social sciences could also profit, says Nikolopoulos.

“More and more people are interested in understanding the behaviors of societies as a whole. These require simulations – how people interact, communicate, how they move. That will require exascale computing,” he said.
