By the early 2020s, many experts had observed that Moore's law, Gordon Moore's 1965 observation that the number of transistors on a chip doubles roughly every two years, no longer holds as reliably as it once did.

This is because today's 3-nanometer process nodes, and the 2-nanometer nodes currently being tested by Samsung and other manufacturers, are approaching the physical limits of how small transistors can be made and how densely they can be packed onto a chip.

However, progress in computer performance and efficiency continues along other technological pathways, such as improvements in processor architecture, software, and algorithms. To reach still higher speeds, there has also been a growing trend toward building supercomputers that combine thousands of processors.

## Supercomputers

Frontier topped the June 2024 edition of the TOP500 list of the world's fastest supercomputers, published at top500.org. It sustained more than 1.1 exaFLOPS, that is, more than one quintillion (a 1 followed by 18 zeros) floating-point operations per second. The machine is housed at the U.S. Department of Energy's Oak Ridge National Laboratory in Tennessee; it cost $600 million to build and became fully operational in 2022.

This supercomputer is used for various tasks, including advanced scientific research, climate change simulations, molecular biology, genome analysis, and drug design, aiding in the development of new treatments and understanding diseases at a deeper level. It is also used for big data analysis, AI development, and many other applications that require enormous computational power.

Equipped with 9,472 CPUs and 37,888 GPUs from AMD, Frontier combines a total of 47,360 processors to deliver its enormous computational strength. Yet even such supercomputers cannot solve every computational problem of the AI era, and conventional computing chips are nearing their physical limits. This has created an urgent need for new computing models capable of sustaining the pace of performance gains, fueling the rise of **quantum computing**.

### Quantum Computing

The theoretical foundations of quantum computing were laid in 1981 by Richard Feynman, the Nobel Prize-winning American physicist, and extended by the British physicist David Deutsch in 1985. Feynman's insight was that quantum computers would be uniquely suited to simulating nature, which fundamentally operates according to quantum mechanics.

Traditional computing uses classical bits, each of which is either 0 or 1, whereas quantum computing uses quantum bits, or qubits, which can be 0, 1, or a superposition of both at once. Superposition is the source of quantum computing's power: a register of qubits can encode many possible values simultaneously, which allows quantum computers to solve certain complex problems far more efficiently than classical machines.
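The idea of superposition can be illustrated with a few lines of classical simulation. This is only a sketch using NumPy: a qubit's state is a 2-component complex vector, and the Hadamard gate is the standard textbook way to create an equal superposition. Real quantum hardware is not programmed this way.

```python
import numpy as np

# A qubit's state is a 2-component complex vector; |0> is [1, 0].
ket0 = np.array([1, 0], dtype=complex)

# The Hadamard gate turns |0> into an equal superposition of |0> and |1>.
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

state = H @ ket0

# Measurement probabilities are the squared magnitudes of the amplitudes.
probs = np.abs(state) ** 2
print(probs)  # equal chance (0.5 each) of reading 0 or 1
```

Reading out the qubit collapses the superposition: each measurement yields a definite 0 or 1, with the probabilities shown above.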

With two qubits, each in superposition, the system can represent four states simultaneously: 00, 01, 10, and 11. Each additional qubit doubles the number of states that can be represented: three qubits span 2^3 = 8 states, while 100 qubits span 2^100 states, a 31-digit number. This exponential growth reflects the potential strength of quantum computing, but harnessing it efficiently requires continued development of quantum algorithms and error correction.

In 1998, researchers at IBM and Stanford University performed the first experimental quantum computation, running a 2-qubit quantum algorithm. They used **nuclear magnetic resonance (NMR)** to manipulate the nuclear spins of molecules as qubits.

#### Development of Qubits

Researchers began experimenting with different systems to create qubits in the 1990s. As the understanding and control of individual qubits improved, efforts shifted toward increasing the number of qubits for more complex computations, with companies like IBM, Google, and D-Wave building more advanced quantum systems.

These NMR experiments of the late 1990s stand among the most significant early milestones, turning theoretical concepts into laboratory reality.

In October 2019, Google announced that its experimental 53-qubit quantum computer had completed, in just over three minutes, a calculation that it claimed would take the world's fastest supercomputer around 10,000 years. IBM disputed the claim, arguing that a classical supercomputer could perform the same task in a matter of days rather than millennia.

#### Logical and Physical Qubits

Quantum computers have two types of qubits: physical qubits, which are the actual quantum units, and logical qubits, which are created by encoding groups of physical qubits to reduce errors and make computations more reliable. Achieving practical quantum computing requires creating logical qubits with minimal errors, which remains a major technical challenge.
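The principle behind logical qubits, spreading one unit of information across several noisy physical units so that errors can be detected and corrected, can be sketched with the simplest classical analogue: a 3-bit repetition code with majority voting. This is only an illustrative toy; real quantum error correction (such as the codes used by Microsoft and Quantinuum) is far more involved, because quantum states cannot simply be copied.

```python
import random

def encode(bit):
    """Encode one 'logical' bit into three 'physical' bits."""
    return [bit, bit, bit]

def apply_noise(bits, p):
    """Flip each physical bit independently with probability p."""
    return [b ^ (random.random() < p) for b in bits]

def decode(bits):
    """Majority vote: corrects any single bit-flip among the three."""
    return 1 if sum(bits) >= 2 else 0

random.seed(0)
p = 0.05          # physical error rate (assumed for illustration)
trials = 10_000
failures = sum(decode(apply_noise(encode(0), p)) != 0 for _ in range(trials))
logical_rate = failures / trials
print(logical_rate)  # well below the physical rate p
```

The decoded error rate is roughly 3p^2, far below the raw physical rate p, which is why grouping many physical qubits into one logical qubit pays off, at the cost of heavy overhead.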

Microsoft and Quantinuum made significant progress in 2024 by creating 4 logical qubits from just 30 physical ones, conducting 14,000 experiments without detecting a single error. IBM and Google are also making strides toward scaling up logical qubits, with IBM aiming to develop 12 logical qubits from 244 physical ones by 2026.

### The Threat of Quantum Computing

Quantum computers could potentially break, within seconds, encryption that today's classical computers would need thousands of years to crack. Breaking RSA-2048, however, is estimated to require around 4,000 logical qubits, equivalent to roughly 400,000 physical qubits with current technologies, a capability not yet achieved but possibly reachable within a few years.

#### Quantum Winter

Despite notable progress, quantum computing faces significant technical challenges, such as keeping qubits stable (coherent) and suppressing noise-induced errors. Some experts warn of a potential "quantum winter": if fundamental solutions to these problems are not found, interest in quantum computing could wane for years. The Russian physicist Mikhail Dyakonov has voiced deeper skepticism, suggesting that noise, scalability, and efficiency issues may prevent a practical quantum computer from ever being built.

Nevertheless, the ongoing research by numerous teams around the world suggests a promising future, with many seeking solutions to the challenges of creating useful quantum computers.