
by Jop Briët (CWI) and Simon Perdrix (CNRS, LORIA)

For more than a century now, we’ve understood that we live in a quantum world. Even though quantum mechanics cannot be ignored in the development of the atomic-scale components of everyday computers, the computations those computers perform are governed, like the Turing machine, by the laws of classical Newtonian mechanics. But the most striking and exotic features of quantum mechanics, superposition and entanglement, currently play no part in everyday information processing. This is about to change, and in some specialised applications already has. In academia, the field of quantum computation has been growing explosively since its inception in the 1980s, and the importance of these devices is widely recognised by industry and governments. Big players in the tech industry like IBM and Google regularly announce that they have built yet another, larger rudimentary quantum computation device, and in 2016 the European Commission launched a one-billion-euro Flagship Initiative on Quantum Technologies.

The development of this technology has both a hardware and a software component, both of which have been the subject of intense research for the past few decades. On the hardware side, past efforts went into controlling and storing small numbers (5-10) of qubits, the fundamental building blocks of quantum computation. The current state of the art is that this can be done for arrays of about fifty qubits. Although a seemingly modest scale, this is where things become very interesting, because quantum systems of 50-100 qubits cannot be simulated on our current classical computers and so hold the possibility of harnessing a computational power we have not seen before. A more long-term goal (on the order of a few decades) is to scale up further, to millions of qubits. This scale is what the NSA is worried about, because quantum computers of this size could break most modern-day cryptography.

Hardware, however, is not much good without interesting software. Some important research areas on this side of the development include:

  • Quantum simulation, which includes research on applications of medium-sized quantum computers. Potential application areas include chemistry and materials science.
  • Algorithms and complexity, which explores what larger-scale quantum computers could do. Think, for instance, of machine learning, search and optimisation, and tackling number-theoretic problems (relevant to cryptography).
  • Cryptography, important because larger-scale quantum computers will break our current cryptosystems, but quantum mechanics also holds the key to future cryptosystems.
  • Quantum software frameworks, including quantum programming languages, together with formal and logical methods that ease the use of quantum computers and help us understand the fundamental structures of quantum information processing.
  • Quantum information science in general, which provides the mathematical theory behind quantum information processing and is important for the development of error-correction schemes, for understanding counterintuitive quantum phenomena, and for the physical theory behind the hardware.

The articles in this special theme offer a more detailed look into the complexities of some of the above-mentioned aspects of the emerging paradigm shift currently happening in computation. Below you will find lightning-fast introductions to the basic concepts appearing in the articles.

Basic features of quantum computation

Qubits and superpositions
The basic building block of classical computation is the bit, a physical system that can be toggled between one of two possible states, 0 and 1. For quantum computers, this building block is fundamentally different: it is the qubit, short for quantum bit. A qubit is a physical system that can be in a sort of intermediate state, given by one of the infinitely many possible superpositions of two basis states. If the two basis states are represented by the two-dimensional row vectors (1,0) and (0,1), then a superposition is a complex Euclidean unit vector of the form (x,y), where x and y are complex numbers, referred to as probability amplitudes, whose absolute values squared sum to 1. A measurement of the qubit results in finding it in the first or second state with probability |x|² or |y|², respectively. Toggling of quantum states can be done via linear, length-preserving operations, in other words, unitary transformations. A measurement performed after a unitary operation is also referred to as a measurement in a different basis. Whereas the state of n classical bits can be represented by an n-bit string, the state of an n-qubit system is in general given by a superposition of 2ⁿ basis states: a complex 2ⁿ-dimensional Euclidean unit vector. The observation that n-qubit states require an exponential number of parameters to describe is what makes simulating quantum-mechanical systems hard to do on classical computers. It is also one of the key features of quantum mechanics that gives quantum computers their power.
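To make this concrete, here is a minimal sketch in Python with NumPy (an illustration, not part of the article) of how qubit states, unitary gates and measurement probabilities can be represented numerically; the names and the choice of the Hadamard gate are illustrative assumptions.

    # A qubit is a complex unit vector; gates are unitary matrices.
    import numpy as np

    ket0 = np.array([1, 0], dtype=complex)   # basis state (1,0)
    ket1 = np.array([0, 1], dtype=complex)   # basis state (0,1)

    # The Hadamard gate, a unitary that creates an equal superposition.
    H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

    state = H @ ket0                          # (1/sqrt(2), 1/sqrt(2))

    # Measurement probabilities are the squared absolute amplitudes.
    probs = np.abs(state) ** 2
    print(probs)                              # [0.5 0.5]

    # An n-qubit state lives in a 2**n-dimensional space: the tensor
    # (Kronecker) product of single-qubit states.
    two_qubits = np.kron(state, ket0)
    print(two_qubits.shape)                   # (4,)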

Entanglement
Arguably the most striking feature of quantum mechanics is entanglement, which manifests itself when two or more quantum systems are measured locally in one of two or more possible bases. As an argument against quantum mechanics, Einstein, Podolsky and Rosen observed that compound systems allow superpositions that can result in measurement statistics that defy classical explanation. Celebrated work of Bell later showed that one can actually test this feature experimentally, using what is nowadays referred to as a Bell test. The most basic example of such a test can be cast as a game involving two players, Alice and Bob, and a referee. The referee picks two bits at random and sends these to Alice and Bob, respectively. Without communicating, and thus without knowing the other’s bit value, the players each return a binary answer to the referee. They win the game if the sum of the answers modulo 2 equals the product of the questions. A simple calculation shows that if the players are constrained by the laws of classical physics, then they can win this game with probability at most 3/4. However, by performing local measurements on a two-qubit “EPR pair”, the players can produce a probability distribution over their answer pairs with which they win with probability roughly 0.85!
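The quoted winning probability can be checked numerically. The following Python/NumPy sketch (an illustration, not from the article) evaluates the standard optimal measurement angles for this game on an EPR pair and recovers cos²(π/8) ≈ 0.85.

    import numpy as np

    # The EPR pair (|00> + |11>) / sqrt(2) as a 4-dimensional vector.
    epr = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)

    def basis(theta):
        """Orthonormal measurement basis rotated by angle theta."""
        return [np.array([np.cos(theta), np.sin(theta)]),
                np.array([-np.sin(theta), np.cos(theta)])]

    # Standard optimal angles: Alice uses 0 or pi/4, Bob pi/8 or -pi/8.
    alice = [basis(0), basis(np.pi / 4)]
    bob   = [basis(np.pi / 8), basis(-np.pi / 8)]

    win = 0.0
    for x in (0, 1):              # referee's bit to Alice
        for y in (0, 1):          # referee's bit to Bob
            for a in (0, 1):      # Alice's answer
                for b in (0, 1):  # Bob's answer
                    if (a + b) % 2 == x * y:      # winning condition
                        amp = np.kron(alice[x][a], bob[y][b]) @ epr
                        win += abs(amp) ** 2 / 4  # uniform questions
    print(win)  # ~0.8536 = cos^2(pi/8), beating the classical 3/4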

Quantum algorithms
Quantum algorithms consist of a carefully chosen sequence of unitary transformations that are applied one by one to a register of qubits, followed at the end by a measurement to determine an output. A crucial property of unitary transformations is that they can cause cancellations among the probability amplitudes describing the overall state of the register. These cancellations can in turn cause large superpositions to quickly converge to a state that, when measured, with near-certainty gives only a single possible outcome. Two early, but still important, examples demonstrating the power of this phenomenon are Shor’s factoring algorithm and Grover’s search algorithm. Shor’s algorithm factors a given number into its prime components exponentially faster than the best-known classical algorithm to date, with dramatic consequences for important cryptographic schemes (see below). Grover’s algorithm finds the location of a 1 in a given n-bit sequence, provided there is one, with good probability, using a number of basic computational steps roughly equal to the square root of n. The best possible classical algorithm, by contrast, needs roughly n steps in the worst case.
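As an illustration (assumed, not from the article), the following Python/NumPy toy simulation runs Grover's algorithm on a 16-item search space; the sign-flip oracle and the iteration count of roughly (π/4)·√N follow the textbook construction.

    import numpy as np

    n, target = 4, 11                 # 16 items; the single 1 sits at index 11
    N = 2 ** n

    state = np.full(N, 1 / np.sqrt(N))       # uniform superposition

    oracle = np.eye(N)
    oracle[target, target] = -1              # flip the sign of the target

    # "Inversion about the mean": 2|s><s| - I for the uniform state |s>.
    diffusion = 2 * np.full((N, N), 1 / N) - np.eye(N)

    steps = int(np.round(np.pi / 4 * np.sqrt(N)))   # ~sqrt(N) iterations
    for _ in range(steps):
        state = diffusion @ (oracle @ state)

    probs = np.abs(state) ** 2
    print(np.argmax(probs), probs[target])   # 11, probability ~0.96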

Quantum cryptography
Shor’s algorithm can break the most important cryptographic schemes we have today, which are based on the assumption that factoring is hard. A fully operational large-scale quantum computer shatters this assumption. This means that alternative cryptography is needed immediately, as some of today’s information may need to be kept secret even after quantum computers are built. Multiple lines of research on post-quantum cryptography address this issue. On the one hand, one can look for problems that might be hard even for quantum computers, an important motivation for lattice-based cryptography. On the other hand, quantum mechanics itself offers alternatives too, as was pointed out long before Shor’s discovery. Around 1970, Wiesner introduced a quantum money scheme, a proposal in which bank notes are made unforgeable by placing qubits on them. This was the first attempt to use quantum mechanics in a cryptographic protocol. More than a decade later, in 1984, Bennett and Brassard introduced the revolutionary quantum key distribution protocol, an unconditionally secure communication protocol relying on the laws of quantum mechanics. In this protocol, Alice wants to share a random key with Bob. To do so, she sends random bits, each encoded in a basis chosen at random from two complementary bases. Complementary bases are subject to the uncertainty principle: if a bit of information is encoded in one of the bases, measuring according to the other produces a random bit, uncorrelated with the encoded information. Roughly speaking, a spy who wants to observe the sent qubits will not only gain no information without guessing the appropriate basis, but will also be detected by Alice and Bob with high probability. The protocol is relatively simple to implement and has been commercialised. Note, however, that the proof that the protocol is actually unconditionally secure took 15 years!
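The sifting step of this protocol is easy to sketch. The Python snippet below is an assumed illustration: it models measurement outcomes statistically rather than simulating qubits, and omits the eavesdropper-detection and privacy-amplification steps. It shows how Alice and Bob keep only the positions where their randomly chosen bases agree.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 20

    alice_bits  = rng.integers(0, 2, n)   # the raw key material
    alice_bases = rng.integers(0, 2, n)   # 0 = one basis, 1 = the complementary one
    bob_bases   = rng.integers(0, 2, n)   # Bob guesses independently

    # If Bob measures in Alice's basis he recovers her bit exactly; in the
    # complementary basis the uncertainty principle gives him a uniformly
    # random, uncorrelated bit.
    bob_bits = np.where(alice_bases == bob_bases,
                        alice_bits,
                        rng.integers(0, 2, n))

    keep = alice_bases == bob_bases        # bases are announced publicly, bits are not
    print("sifted key:", alice_bits[keep]) # identical to bob_bits[keep]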

Scalability and error correction
One of the major challenges in the quest for a quantum computer is scalability. Prototypes of quantum computers already exist, but their memory, subject to decoherence, is limited to a few dozen qubits. The scalability of quantum computers is not only a technological challenge; error-correcting codes are also crucial: a large-scale, robust quantum computer requires that the error rate of the quantum device stay below the maximal amount of errors that quantum codes can correct.
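The basic idea of outvoting errors can be illustrated with the simplest example, a three-qubit repetition code protecting against bit flips. The sketch below is a hypothetical, classical-flavoured illustration (real quantum codes must also correct phase errors, and must do so without directly measuring the data); it shows how a physical error rate p turns into a logical error rate of roughly 3p².

    import numpy as np

    rng = np.random.default_rng(1)

    def encode(bit):
        return np.array([bit] * 3)         # one logical bit over three carriers

    def noisy(codeword, p):
        flips = rng.random(3) < p          # each carrier flips with probability p
        return codeword ^ flips

    def decode(codeword):
        return int(codeword.sum() >= 2)    # majority vote corrects one flip

    p, trials = 0.05, 100_000
    errors = sum(decode(noisy(encode(0), p)) != 0 for _ in range(trials))
    # Logical error rate ~3p^2 = 0.7%, well below the physical rate p = 5%.
    print(errors / trials)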

Please contact:
Jop Briët
CWI, The Netherlands

Simon Perdrix
LORIA, CNRS, Inria Mocqua, Université de Lorraine, France
