In the mid-20th century, Alan Turing laid the foundations of classical computing, showing what a machine could and could not calculate step by step. At that time everything seemed clear: computers would always be machines that executed instructions sequentially on bits, tiny units that take the value 0 or 1. Yet nature itself does not work that way. In the microscopic world of quantum physics, particles do not have only one state but can exist in many possibilities at once.
In the 1980s, physicists Richard Feynman and David Deutsch opened the path by suggesting that if nature is quantum, then a computer based on quantum rules could simulate phenomena that classical computers fail to capture. In the mid-1990s, two algorithms gave substance to this idea: Peter Shor showed that a quantum computer could factor large numbers efficiently, breaking the encryption schemes that rely on the difficulty of that task, while Lov Grover showed how an unstructured search could be carried out quadratically faster than by any classical method.
From then on began the transition from theory to practice. Laboratories around the world tried to implement qubits using different physical systems: trapped ions, superconducting circuits, even photons. The first devices could handle only a handful of qubits, but within three decades systems with dozens or even hundreds of qubits appeared, enough to perform experiments that classical supercomputers struggle to replicate. Today, major companies such as IBM, Google, and Microsoft, as well as university research centres, already operate quantum computers, though still noisy, error-prone ones with significant limitations.
But how do they actually work? A classical computer is based on bits that are either on or off, like a light switch. A quantum computer uses qubits that can be, in a well-defined mathematical sense, both at the same time. A simple picture is a coin spinning in the air: while it spins, it is both heads and tails, and only when it lands does it become one or the other. This is superposition.
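To make the picture concrete, here is a minimal sketch that simulates a single qubit as a two-component state vector using numpy. This is a classical simulation of the mathematics, not how real quantum hardware is programmed, but the arithmetic of superposition and measurement is exactly this:

```python
import numpy as np

# A qubit is a two-component complex vector: the amplitudes of |0> and |1>.
ket0 = np.array([1, 0], dtype=complex)

# The Hadamard gate turns a definite state into an equal superposition.
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

qubit = H @ ket0                    # now (|0> + |1>) / sqrt(2)

# Measurement probabilities are the squared magnitudes of the amplitudes.
probs = np.abs(qubit) ** 2
print(probs)                        # [0.5 0.5] -- the spinning coin

# Sampling a measurement "lands the coin": one definite outcome.
outcome = np.random.choice([0, 1], p=probs)
print(outcome)                      # 0 or 1, each with probability 1/2
```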
In classical computers, multiple combinations of bits exist as possibilities but only one is real at any given time. With ten bits there are 2^10 = 1,024 possible combinations, yet the computer holds exactly one of them at a time and must check them sequentially. Ten qubits, by contrast, are not in one of these states but in a superposition of all of them. It is as if all the possible keys to a lock are laid out on the table at once, not hidden in a drawer to be tested one by one. A quantum operation acts on this whole superposition in a single step, and interference then ensures that the amplitudes of the wrong answers cancel out while the amplitude of the right one is strengthened. A quantum computer therefore does not run through the cases one by one; it manipulates all of them together, though the art lies in arranging the interference so that a useful answer can actually be read out.
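A short numpy sketch of what this means mathematically: the joint state of ten qubits is a single vector of 1,024 amplitudes, and one gate applied to every qubit updates all of them in a single linear-algebra step. Simulating this classically is precisely what becomes infeasible as qubits are added, since the vector doubles in size with each one:

```python
import numpy as np
from functools import reduce

n = 10                                    # ten qubits
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

# The joint state of n qubits is a vector of 2**n amplitudes.
state = np.zeros(2 ** n)
state[0] = 1.0                            # every qubit starts in |0>

# A Hadamard on each qubit is the Kronecker product of n copies of H.
H_all = reduce(np.kron, [H] * n)          # a 1024 x 1024 matrix

# One matrix-vector product updates all 1024 amplitudes at once.
state = H_all @ state
print(state.size)                                  # 1024
print(np.allclose(state, 1 / np.sqrt(2 ** n)))     # True: every key on the table
```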
Interference is familiar from waves. When two waves meet, they can reinforce each other and grow stronger, or cancel each other out. Quantum computing exploits the same effect: with the right operations, the wrong solutions vanish and the correct one stands out, so that when we measure the result, the probability of obtaining the right answer is far greater than that of any other.
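The simplest worked example of quantum interference is applying the Hadamard gate twice. The first application creates an equal superposition; the second makes the two computational paths leading to |1⟩ carry opposite signs and cancel, while the paths to |0⟩ reinforce, returning the qubit to |0⟩ with certainty. A sketch, again in plain numpy:

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
ket0 = np.array([1.0, 0.0])

once = H @ ket0             # equal superposition: amplitudes [0.707, 0.707]

# Second Hadamard: the two paths into |1> have opposite signs and cancel
# (destructive interference); the two paths into |0> add up (constructive).
twice = H @ once
print(np.round(twice, 3))   # [1. 0.] -- back to |0> with certainty
```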
Seen practically, it is like having a lock with countless possible keys. A classical computer would test the keys one by one. A quantum computer can hold them all in superposition at once and, through interference, let only the correct one emerge. For certain problems, tasks that would take today's most powerful supercomputers centuries could in principle be solved by a quantum computer in minutes.
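Grover's algorithm is the textbook version of this key-finding picture. The sketch below simulates it with numpy for a toy search over 16 keys; the index marked is an arbitrary choice for illustration. Each iteration flips the sign of the marked amplitude (the oracle) and then reflects all amplitudes about their mean (the diffusion step), and after roughly √N iterations the correct key dominates:

```python
import numpy as np

n = 4                        # 4 qubits -> 16 possible "keys"
N = 2 ** n
marked = 11                  # hypothetical index of the one correct key

# Start with every key in equal superposition.
state = np.full(N, 1 / np.sqrt(N))

# Roughly (pi/4) * sqrt(N) Grover iterations are optimal.
iterations = int(np.pi / 4 * np.sqrt(N))
for _ in range(iterations):
    state[marked] *= -1                  # oracle: flip the marked amplitude's sign
    state = 2 * state.mean() - state     # diffusion: reflect about the mean

probs = state ** 2
print(probs.argmax())                    # 11: the marked key dominates
print(round(float(probs[marked]), 3))    # ~0.96 after only 3 iterations
```

The point of the sketch is the scaling: about √N amplification steps instead of the N/2 attempts a classical search needs on average.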
This opens new paths for discovering medicines, designing innovative materials, modelling the climate, and securing communications. It is not just a faster machine, but a turning point in the way we think about computation itself. It rests on principles that may feel alien to everyday experience but are deeply logical: that something can be many things at once, and that from the interplay of those possibilities order can emerge and a clear answer be revealed. That different perspective is what makes it unique.
References
Feynman, R. P. (1982). Simulating physics with computers. International Journal of Theoretical Physics, 21(6–7), 467–488.
Deutsch, D. (1985). Quantum theory, the Church–Turing principle and the universal quantum computer. Proceedings of the Royal Society of London A, 400(1818), 97–117.
Shor, P. W. (1994). Algorithms for quantum computation: Discrete logarithms and factoring. In Proceedings of the 35th Annual Symposium on Foundations of Computer Science (pp. 124–134). IEEE.
Grover, L. K. (1996). A fast quantum mechanical algorithm for database search. In Proceedings of the 28th Annual ACM Symposium on Theory of Computing (pp. 212–219). ACM.
Nielsen, M. A., & Chuang, I. L. (2010). Quantum Computation and Quantum Information (10th anniversary ed.). Cambridge University Press.
IBM Quantum, McKinsey & Co., RAND Corporation, CSIS – contemporary analyses and popular science articles on the progress and applications of quantum computing.