Quantum Computers: Is the End of the Silicon Age Near?
Classical digital computers are approaching the limits of their computational power, and the silicon-based architecture that has transformed the world over several decades may be entering its final generations. An evolution toward computers that operate at the atomic, or quantum, level is underway, with nations and technology companies competing intensely to advance quantum computing because of the extraordinary capabilities this new technology promises.
Quantum computers are so powerful that they have been described as the “ultimate computer.” In theory, a sufficiently large quantum computer could break most of the encryption schemes in use today. The threat is serious enough that the U.S. National Institute of Standards and Technology (NIST), which sets national standards, has issued guidelines to help major companies and government agencies plan the transition to this new quantum era. NIST has projected that quantum computers could be capable of breaking 128-bit AES encryption by 2029; AES is used by countless companies to protect their data, and governments spend tens of billions of dollars safeguarding information encrypted with it.
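To make the threat concrete: a quantum computer running Grover's search algorithm can brute-force a symmetric key in roughly the square root of the classical number of trials, effectively halving the key's bit strength. The sketch below is a back-of-the-envelope illustration of that arithmetic, not a statement about any particular machine's timeline; the figures are idealized query counts that ignore error correction and hardware overhead.

```python
# Back-of-the-envelope comparison of classical brute force vs. Grover's
# algorithm for symmetric keys. Idealized query counts only: real quantum
# hardware adds enormous error-correction and clock-speed overheads.

def classical_trials(key_bits: int) -> int:
    """Worst-case classical brute-force attempts: 2^n."""
    return 2 ** key_bits

def grover_trials(key_bits: int) -> int:
    """Idealized Grover iterations: about 2^(n/2)."""
    return 2 ** (key_bits // 2)

for bits in (128, 256):
    print(f"AES-{bits}: classical ~2^{bits} trials, "
          f"Grover ~2^{bits // 2} trials "
          f"(effective strength ~{bits // 2} bits)")
```

This halving is why AES-128 is considered exposed in a quantum future while AES-256, reduced to roughly 128-bit effective strength, is generally regarded as a safer margin.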
The applications of quantum computers extend beyond security and military domains to fields such as medicine, space, the environment, climate, and energy. They can perform calculations that traditional computers cannot complete in any practical amount of time. In 2019, for instance, Google announced that its Sycamore processor had solved a computational problem in 200 seconds, a task Google estimated would take the most powerful classical supercomputers 10,000 years. In 2020, China's Quantum Innovation Institute announced Jiuzhang, a photonic quantum computer reported to be 100 trillion times faster than a supercomputer on a specific sampling task.
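For a rough sense of scale, Google's claim corresponds to a speedup factor of about 1.6 billion. The short calculation below works that out from the published figures (which IBM disputed at the time), assuming a 365-day year.

```python
# Rough scale of Google's 2019 Sycamore claim: 200 seconds on the quantum
# processor vs. an estimated 10,000 years on a classical supercomputer.
# Figures are Google's published estimates, disputed by IBM at the time.

SECONDS_PER_YEAR = 365 * 24 * 3600   # ignoring leap years

classical_seconds = 10_000 * SECONDS_PER_YEAR
quantum_seconds = 200

speedup = classical_seconds / quantum_seconds
print(f"Claimed speedup factor: ~{speedup:.2e}")   # ~1.58e9
```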
Consequently, many tech giants, including Google, Microsoft, Intel, IBM, Rigetti, and Honeywell, are racing to develop quantum computers, each building its own prototypes. In 2021, IBM announced Eagle, a 127-qubit quantum processor. The competition is not limited to American companies; Chinese firms are also in the race.
The End of the Silicon Age:
In the 1940s and 1950s, computers were the size of large rooms, and only major institutions such as the Pentagon and large banks could afford them. The ENIAC, for example, could perform in 30 seconds a calculation that would have taken a human about 20 hours. Computers remained expensive and bulky until a revolution in chip fabrication shrank them, over the following decades, to a single chip the size of a fingernail containing billions of transistors. That development allowed engineers and scientists to build smaller, portable computers and eventually led to cellular phones, IoT devices, and other microelectronics.
As chip development accelerated, Gordon Moore, a co-founder of Intel, formulated what became known as Moore's Law. In 1965 he observed that the number of transistors on a chip was doubling roughly every year; in 1975 he revised the period to about every two years, with performance rising and unit costs falling in step (the often-quoted 18-month figure is a later popularization attributed to Intel executive David House). Moore's Law is not a physical law in the strict sense but an empirical observation that has guided the semiconductor industry, driving innovation, reducing production costs, and enabling ever more powerful and compact electronic devices at affordable prices.
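A quick way to feel the force of this observation is to compound the doubling. The sketch below projects transistor counts under a strict two-year doubling rule; the 2,300-transistor baseline is the Intel 4004 from 1971, used here purely as an illustrative starting point.

```python
# Compounding Moore's Law: transistor count doubling every two years.
# Baseline: Intel 4004 (1971), roughly 2,300 transistors -- used here
# purely as an illustrative starting point.

BASE_YEAR = 1971
BASE_TRANSISTORS = 2_300
DOUBLING_PERIOD_YEARS = 2

def projected_transistors(year: int) -> float:
    """Transistor count projected by a strict two-year doubling rule."""
    doublings = (year - BASE_YEAR) / DOUBLING_PERIOD_YEARS
    return BASE_TRANSISTORS * 2 ** doublings

for year in (1971, 1981, 1991, 2001, 2011, 2021):
    print(f"{year}: ~{projected_transistors(year):.3g} transistors")
```

Fifty years of doubling every two years multiplies the count by 2^25, about 33 million, taking a few thousand transistors in 1971 to the tens of billions found on flagship chips today.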
However, in recent years many experts have pointed out that Moore's Law is reaching its physical limits. Transistors have become so small that they are approaching the boundaries of what current fabrication technology can achieve: the smallest features are now only tens of atoms wide. When a transistor's gate shrinks to roughly five atoms across, the position of the electrons passing through it becomes uncertain under the principles of quantum mechanics. (Electrons are the charge carriers in transistors and are central to how a computer switches and processes data.)
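To put "tens of atoms" in perspective, the sketch below converts a gate length in nanometers into an approximate atom count, using the roughly 0.235 nm spacing between neighboring atoms in crystalline silicon. The gate lengths are illustrative assumptions: modern "node" names such as "5 nm" are marketing labels, not measured gate dimensions.

```python
# How many silicon atoms span a transistor gate of a given length?
# Nearest-neighbor spacing in crystalline silicon is about 0.235 nm.
# The gate lengths below are illustrative assumptions -- modern "node"
# names (e.g. "5 nm") are marketing labels, not physical gate sizes.

SI_ATOM_SPACING_NM = 0.235

def atoms_across(gate_length_nm: float) -> float:
    """Approximate number of silicon atoms spanning the gate."""
    return gate_length_nm / SI_ATOM_SPACING_NM

for gate_nm in (20.0, 10.0, 5.0, 1.2):
    print(f"{gate_nm:>5.1f} nm gate -> ~{atoms_across(gate_nm):.0f} atoms wide")
```

On these assumptions, a gate only about 1.2 nm long is already down to the five-atom scale at which quantum effects dominate.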
This positional uncertainty, combined with the tiny scale of the transistors, lets electrons "leak" from their designated paths by tunneling through the gates meant to block them, which can short-circuit the chip or generate enough heat to destroy it. In other words, if we continue to rely primarily on shrinking silicon, the laws of physics dictate that Moore's Law must eventually break down. That computational trajectory is nearing its end, and we may be witnessing the end of the silicon age.
This does not mean that progress in computing has stopped; rather, innovation has shifted from simply packing more transistors onto a chip toward better system designs, new materials, and new computing architectures. While Moore's Law as we know it may be reaching its end, innovation in computing continues in new forms, with quantum computing leading the charge.