Four European supercomputers in the top 10, but how relevant is that?
The top 10 of the November 2024 list of supercomputers is very much a western affair: every system is located in the US, Europe or Japan. But if China’s fastest supercomputers were listed, they would rank third and fifth. And the commercial sector is building machines that are even faster than the number one.
The list of the 500 most powerful supercomputers of November 2024 is very much a western government affair. The entire top 10 consists of systems in western countries, if we count Japan as a western country: five are in the US, four in Europe and one in Japan.
A solid lead in supercomputers
Of the entire list, about a third (34%) of the systems are located in the US, with Europe only slightly behind (32%); China has just over 12%. When it comes to performance, the picture is even more skewed: more than half (55%) of all the computing power on the Top 500 list is in the US, the EU has about a quarter (27%) and China only about 3%.
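The gap between those two kinds of share comes down to simple aggregation: counting systems weighs every machine equally, while summing performance lets a handful of very large machines dominate. A minimal sketch with made-up numbers (not the actual Top 500 data) illustrates the difference:

```python
# Toy sketch of why the share of systems and the share of performance diverge:
# a few very large machines dominate the aggregate compute power.
from collections import defaultdict

systems = [  # (country, Rmax in PFlop/s) -- illustrative values only
    ("US", 1700), ("US", 560), ("EU", 380), ("Japan", 440),
    ("China", 60), ("China", 55),
]

counts, perf = defaultdict(int), defaultdict(float)
for country, rmax in systems:
    counts[country] += 1
    perf[country] += rmax

total_count, total_perf = len(systems), sum(perf.values())
for country in counts:
    print(f"{country}: {100 * counts[country] / total_count:.0f}% of systems, "
          f"{100 * perf[country] / total_perf:.0f}% of performance")
```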
The top three systems on the list are in the exascale range, meaning that their performance is over 1,000 PFlop/s (that is, one exaFlop/s), and all three are in the US. The fastest computer is a new entry on the list, El Capitan at Lawrence Livermore National Laboratory. The fact that this laboratory is dedicated to security issues (in a broad sense) illustrates the strategic importance that is assigned to supercomputers.
This picture is quite different from only four years ago. Then, China had the highest number of systems in the Top 500, almost twice as many as the US, with a total performance that was almost the same as that of the US. Now, China’s total compute power on the list is actually lower than in November 2020, because many of its new systems don’t appear on the 2024 list. Its fastest computers on the list, the Sunway TaihuLight (number 15) and the Tianhe-2A (number 24), still list the same performance as in 2020, whereas the fastest US system now lists a performance ten times as high as in 2020.
Europe is keeping up in the supercomputer race: its fastest system on the list is also around ten times faster now than in 2020. At the moment, an exascale supercomputer is being assembled in Germany; it currently ranks 18th on the list only because it is still under construction.
A misleading picture
However, the list is not necessarily a good representation of the biggest compute systems in the world, because of the specific way in which it is compiled.
First of all, rumor has it that the most powerful Chinese supercomputers are actually in the same league as El Capitan. The Tianhe-3, which came online early in 2024, is believed to reach a sustained performance of 1.57 exaFlop/s, over 15 times as fast as the fastest Chinese system in 2020. If it were on the list, it would be in third place. And with 1.22 exaFlop/s, the Chinese OceanLight system would be fifth. The reason they are not included is that no official benchmark results are available for these systems. But their absence probably makes the list less representative.
Then there is the commercial sector. From the Top 500 list, it would seem that Microsoft is by far the largest player, with its Eagle system at 561,200 TFlop/s the fourth most powerful system on the list, at about a third of the performance of the top research systems. But here again, there are probably other commercial systems that are not listed and that are at least as powerful.
In the field of AI there is a real scramble for huge systems. According to Elon Musk, the Colossus supercomputer of xAI, consisting of 100,000 Nvidia H100 (Hopper) accelerators, would have a theoretical peak performance of 3.4 exaFlop/s. This is higher than that of El Capitan, which has a theoretical peak performance of 2.7 exaFlop/s. When used for training AI models, Colossus will use smaller number representations (typically FP8 or INT8), for which it is expected to reach a peak performance of as much as 396 exaFlop/s. xAI plans to eventually double the number of accelerators.
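These headline numbers follow from simple multiplication. A back-of-the-envelope sketch, assuming per-accelerator peak figures consistent with Nvidia’s published H100 specifications (roughly 34 TFlop/s at FP64 and about 3,958 TFlop/s at FP8 with sparsity; the exact per-GPU figures behind the quoted totals are assumptions here), reproduces the totals above:

```python
# Back-of-the-envelope peak-performance estimate for a large GPU cluster.
# Per-accelerator figures are assumptions based on Nvidia's public H100 (SXM)
# specifications; sustained performance on real workloads would be much lower.

H100_FP64_TFLOPS = 34       # double precision (vector units), per GPU
H100_FP8_TFLOPS = 3958      # FP8 tensor cores with sparsity, per GPU

def cluster_peak_exaflops(num_gpus: int, per_gpu_tflops: float) -> float:
    """Theoretical peak in exaFlop/s: GPUs x TFlop/s per GPU / 1,000,000."""
    return num_gpus * per_gpu_tflops / 1_000_000

colossus_gpus = 100_000
print(f"FP64 peak: {cluster_peak_exaflops(colossus_gpus, H100_FP64_TFLOPS):.1f} exaFlop/s")  # ~3.4
print(f"FP8 peak:  {cluster_peak_exaflops(colossus_gpus, H100_FP8_TFLOPS):.0f} exaFlop/s")   # ~396
```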
Similarly, OpenAI and Microsoft are developing their Stargate system, which is projected to eventually contain 2 million GPUs/XPUs, in terms of pure numbers ten times as many as what xAI is building.
Note, though, that the performance of these systems cannot really be compared with that of the systems on the Top 500 list. They are tailored to training AI systems, and their performance on more general applications is essentially unknown. That is also the reason why they are not on the list.
Last but not least, another category of compute hardware that is missing from the list is quantum computers. Again, the reason for not including them on the Top 500 list is that quantum computers are not as general-purpose as traditional supercomputers and therefore cannot run the benchmark applications that define the list. But for specific computations, they can easily outrun even the fastest supercomputers.
Quantum computing technology is developing fast. The size of a quantum computer is often expressed in terms of the number of qubits. Even if this is a very dubious measure of performance, it is at least somewhat telling that the largest quantum computer now being planned is projected to have one million qubits, whereas the current largest systems have around 1,000 qubits. And, as with supercomputers overall, the western lead in the field of quantum computing is far from assured.
So, even with all the caveats that come with comparing supercomputers, it seems safe to say that the Top 500 list is hardly representative of the growth in compute power in the world. As China and commercial parties build up their capabilities and quantum computing matures, the lead of the US and European research systems suddenly seems much less solid.
Why supercomputers matter
This is certainly something to worry about. Computing capabilities are a matter of strategic importance. Just remember that computing played a central role in the Second World War: it helped to decrypt the German army’s messages and was crucial in the development of the atomic bomb. These days, quantum computers are expected to break current encryption mechanisms, laying bare everything that was thought to be secret until now. Also, biotechnology, including the mitigation of contagious diseases (if not their development), is unthinkable without advanced computing. Clearly, whoever has the strongest computing capability has a strategic advantage.
The US and Europe have identified this issue. The US launched its National Strategic Computing Initiative as early as 2015, with the aim “to advance U.S. leadership in high performance computing (HPC)”. In Europe, the EuroHPC program was established in 2018 in order to “make Europe a world leader in supercomputing, [boosting] Europe’s scientific excellence and industrial strength, support[ing] the digital transformation of its economy while ensuring its technological sovereignty.”
Don’t forget the algorithms
Nevertheless, such leadership is far from assured. Even if the west stays competitive in terms of hardware, computing is more than just metal: it is also about algorithms and about capabilities in the workforce. In that respect, a lot is changing too, with the increasing use of machine learning and quantum algorithms. The west is not necessarily going to lead on that front.
This is well understood, given reports like the one from the Society for Industrial and Applied Mathematics in the US and, in the Netherlands, the National Agenda Computational Science. But again, as China is rapidly increasing its R&D workforce, it can also be expected to remain a serious competitor in the field of computational science. In fact, since 2016 Chinese teams have won several Gordon Bell Prizes for outstanding achievement in high-performance computing, and there are again Chinese finalists in 2024.
These are interesting times for anyone with an interest in computing.