The US Department of Energy is making $258 million available to fund the construction of the country’s next giant supercomputer, according to a report on Futurism.com.
When completed, the supercomputing system could be capable of one quintillion operations per second.
A quintillion is a 1 with 18 zeros after it. Or in mathematical notation, it’s 10 to the power of 18. Or, in straightforward terms, it’s… 1,000,000,000,000,000,000.
Achieving 1 quintillion operations per second would make the US DoE supercomputer the fastest in the world.
The current fastest computer in the world, China’s Sunway TaihuLight, can perform 93 quadrillion calculations per second.
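To put those two figures side by side, here is a minimal sketch of the arithmetic, using only the numbers quoted above (1 quintillion operations per second for the planned machine, 93 quadrillion for TaihuLight):

```python
# Comparing the planned exascale machine to the current Top500 leader.
# Both figures are taken from the article; they are headline numbers,
# not benchmark results.
exascale_target = 10**18       # 1 quintillion operations per second
taihulight = 93 * 10**15       # 93 quadrillion operations per second

speedup = exascale_target / taihulight
print(f"An exascale machine would be roughly {speedup:.1f}x faster than TaihuLight")
```

In other words, hitting the exascale target would make the new machine a little over ten times faster than today’s front-runner.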
At the moment, the fastest supercomputer in the US is Titan, a Cray XK7 system based on 2.2 GHz AMD Opteron processors and Nvidia K20X graphics processing units, which are designed for servers.
Titan is owned by the Department of Energy, and is currently ranked the fourth-fastest supercomputer in the world, according to Top500.org, a geeky website for supercomputing nerds.
The top 10 supercomputers in the world, according to the latest list published by Top 500, are:
- Sunway TaihuLight (China) – > 10 million cores; ~ 93,000 teraflops
- Tianhe-2 MilkyWay-2 (China) – > 3 million cores; ~ 34,000 teraflops
- Piz Daint (Switzerland) – > 360,000 cores; ~ 20,000 teraflops
- Titan (USA) – > 560,000 cores; ~ 18,000 teraflops
- Sequoia (USA) – > 1.5 million cores; ~ 17,000 teraflops
- Cori (USA) – > 622,000 cores; ~ 14,000 teraflops
- Oakforest (Japan) – > 550,000 cores; ~ 14,000 teraflops
- K (Japan) – > 700,000 cores; ~ 10,000 teraflops
- Mira (USA) – > 780,000 cores; ~ 9,000 teraflops
- Trinity (USA) – > 300,000 cores; ~ 8,000 teraflops
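The list also hints at how differently these machines are built. Dividing the approximate flops figures by the core counts above gives a rough per-core throughput – rough, because the list gives lower bounds for cores and rounded flops totals:

```python
# Rough gigaflops-per-core estimates for four machines from the list.
# Core counts and teraflops figures are the approximate values quoted
# above, so these are order-of-magnitude comparisons only.
systems = {
    "Sunway TaihuLight": (93_000, 10_000_000),
    "Tianhe-2":          (34_000, 3_000_000),
    "Piz Daint":         (20_000, 360_000),
    "Titan":             (18_000, 560_000),
}

for name, (teraflops, cores) in systems.items():
    gflops_per_core = teraflops * 1_000 / cores  # 1 teraflop = 1,000 gigaflops
    print(f"{name}: ~{gflops_per_core:.1f} gigaflops per core")
```

The GPU-accelerated machines (Piz Daint, Titan) get far more out of each core than the many-core Chinese systems – different routes to the same top-ten table.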
In the past, designing and building supercomputers may have been a purely academic business – both in terms of who worked on them and what they were used for.
Often they had applications in government, whether in atmospheric science – analysing and predicting the weather – or in the military.
Now, however, with the expansion of the internet and the worldwide web, such colossal computing power is increasingly seen as the only way to cope with the massive, accelerating growth in both the volume of data and the demand to process it.
But supercomputers have long been used in the commercial sector, in such industries as automaking, aerospace and energy.
Mainly they’re used to analyse vast quantities of data and make predictions based on that data.
And it seems it’s not a fringe activity either. According to a study by supercomputing boffins Earl Joseph and Steve Conway, “97 per cent of companies that had adopted supercomputing say they could no longer compete or survive without it”.
So the supercomputing industry, such as it is, could be said to be maturing.
Meanwhile, quantum computing is just emerging, with the promise of even faster and more powerful systems.
But while most people can just about understand how modern conventional computers work – binary transistors at the lowest level – far fewer have any real grasp of how quantum computing works.