Thursday, September 14, 2017

A few clarifications on the quantum revolution

First of all, let us clarify a few concepts, since recent news published in the media tends to encourage confusion.
·         Qubit: the quantum unit of information. While classical information is expressed in bits, which can only take the values 0 or 1, a qubit is a superposition (or linear combination) of two quantum states, |0> and |1> (e.g. the horizontal or vertical polarization of a photon), and its value is α|0>+β|1>, where α and β are two complex numbers, called probability amplitudes, whose squared moduli add up to 1 (a minimal numerical sketch appears after this list).
·         Quantum cryptography: encrypts information through a protocol that takes advantage of the quantum properties of matter. The procedures devised so far could be deciphered by opponents using quantum procedures, but it is known (or believed) that they are impossible to decipher by classical means. The first quantum cryptography protocol, BB84, was proposed in 1984 by Charles Bennett of IBM and Gilles Brassard of the Université de Montréal (a toy simulation is sketched after this list).
[Figure: quantum teleportation diagram]
·         Quantum teleportation: classical transmission (at the speed of light) of the quantum state of an atom or photon to another atom or photon, previously entangled with the first and subsequently separated from it. It does not transmit matter (the atom or photon) but its quantum state (the distribution of electrons, in the case of an atom); in other words, it transmits one qubit of information from one particle to another. So far it has not been performed with anything larger than an atom (not with molecules, for example). It was first demonstrated experimentally in 1998. Since then, the distance separating the two entangled particles has been increased from a few meters to hundreds of kilometers: in 2017 a Chinese experiment performed teleportation over a distance of 1200 km, using an artificial satellite as the receiving end (the protocol is sketched after this list).
·         Quantum computing: a new type of computer, which uses quantum concepts such as superposition and entanglement to perform calculations. It follows from the theory of computation founded by Alan Turing in the 1930s that a quantum computer cannot solve any problem that an ordinary computer cannot solve, although in theory it could do it faster. In Turing's nomenclature: a quantum computer can be simulated by a deterministic Turing machine, although possibly at the cost of a large increase in execution time. There are some kinds of problems where quantum computing could, in principle, speed up the computation, but this does not apply to every problem. For some of those problems, such as the factorization of large integers, the time required by the best known classical algorithms grows exponentially as a function of the size of the data to be analyzed (factorization belongs to the class NP, although it is not known to be NP-complete).
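To make the qubit definition concrete, here is a minimal numerical sketch in Python: a qubit is stored as a pair of complex amplitudes, and a measurement yields |0> or |1> with probabilities equal to the squared moduli of those amplitudes. The particular amplitudes used here are arbitrary example values.

    import numpy as np

    # A qubit alpha|0> + beta|1>, stored as two complex amplitudes
    # satisfying |alpha|^2 + |beta|^2 = 1 (normalization).
    alpha, beta = 1 / np.sqrt(2), 1j / np.sqrt(2)   # arbitrary example values
    qubit = np.array([alpha, beta])
    assert np.isclose(np.sum(np.abs(qubit) ** 2), 1)

    # Measuring gives |0> with probability |alpha|^2 and |1> with |beta|^2.
    p0, p1 = np.abs(qubit) ** 2
    print(p0, p1)   # 0.5 0.5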
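The flavor of BB84 can be conveyed with a toy simulation. The sketch below is a deliberate simplification (photons are modeled as bits that come out random whenever they are measured in the wrong basis, and the eavesdropper's intercept-and-resend attack is reduced to the same rule): Alice and Bob keep only the positions where their random basis choices coincide, and an eavesdropper inevitably introduces errors that they can detect by publicly comparing a sample of the key.

    import random

    def bb84(n_bits=32, eavesdrop=False):
        # Alice chooses random bits and random bases (0 = rectilinear, 1 = diagonal).
        alice_bits = [random.randint(0, 1) for _ in range(n_bits)]
        alice_bases = [random.randint(0, 1) for _ in range(n_bits)]

        # Simplified channel: a measurement in the wrong basis randomizes the bit.
        channel = alice_bits[:]
        if eavesdrop:
            for i in range(n_bits):
                if random.randint(0, 1) != alice_bases[i]:  # Eve guesses a basis
                    channel[i] = random.randint(0, 1)

        # Bob measures each photon in a random basis of his own.
        bob_bases = [random.randint(0, 1) for _ in range(n_bits)]
        bob_bits = [channel[i] if bob_bases[i] == alice_bases[i]
                    else random.randint(0, 1) for i in range(n_bits)]

        # Publicly compare bases; keep only the positions where they match.
        keep = [i for i in range(n_bits) if alice_bases[i] == bob_bases[i]]
        return [alice_bits[i] for i in keep], [bob_bits[i] for i in keep]

    key_a, key_b = bb84()
    print(key_a == key_b)   # True: without an eavesdropper, the sifted keys agree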
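The teleportation protocol itself can be simulated exactly with a three-qubit state vector. The following sketch is a textbook-style simulation, not a model of any particular experiment: Alice applies a CNOT and a Hadamard gate, measures her two qubits, and sends the two resulting classical bits; Bob's conditional corrections then reproduce the original state on his side.

    import numpy as np

    rng = np.random.default_rng()

    def teleport(alpha, beta):
        # Qubit order |q0 q1 q2>: q0 = Alice's unknown qubit,
        # q1 = Alice's half of the entangled pair, q2 = Bob's half.
        psi = np.kron(np.array([alpha, beta]),
                      np.array([1, 0, 0, 1]) / np.sqrt(2))  # (|00>+|11>)/sqrt(2)

        # Alice: CNOT (control q0, target q1), then Hadamard on q0.
        CNOT = np.array([[1,0,0,0],[0,1,0,0],[0,0,0,1],[0,0,1,0]])
        H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
        psi = np.kron(CNOT, np.eye(2)) @ psi
        psi = np.kron(H, np.eye(4)) @ psi

        # Alice measures q0 and q1; each outcome's probability is the
        # squared norm of the corresponding block of two amplitudes.
        blocks = psi.reshape(4, 2)
        probs = (np.abs(blocks) ** 2).sum(axis=1)
        outcome = rng.choice(4, p=probs)
        m0, m1 = divmod(outcome, 2)
        bob = blocks[outcome] / np.sqrt(probs[outcome])

        # Bob applies X if m1 = 1, then Z if m0 = 1 (the two classical bits).
        if m1: bob = np.array([[0, 1], [1, 0]]) @ bob
        if m0: bob = np.array([[1, 0], [0, -1]]) @ bob
        return bob

    print(teleport(0.6, 0.8))   # -> [0.6 0.8], whatever the measurement outcome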
Let us list a few advances in the field of quantum computing:
·         In 1994, mathematician Peter Shor of MIT devised a quantum algorithm that would make it possible to decompose an integer into its prime factors. This problem is very difficult if the number to be decomposed is very large (hundreds of digits, for example), although it is trivial if the number is small, as I mentioned in another article. In 2001, an IBM group led by Isaac L. Chuang executed Shor's algorithm on a 7-qubit quantum computer and managed to decompose 15 into its prime factors (3 times 5); the classical half of this computation is sketched after this list. In 2012, a team from the University of Bristol was able to factorize 21 (3 times 7), the current record with Shor's algorithm. In 2012, using a different approach (adiabatic quantum computation), 143 was factored (into 11 times 13) with 4 qubits. In 2014, using the same procedure, 56153 was factorized (into 233 times 241), also with 4 qubits.
·         In 1996, computer scientist Lov K. Grover devised a quantum algorithm that would make it possible to find a piece of data in a non-indexed database in a time proportional to the square root of the number of entries in the database. In a classical computer, the average search time is proportional to half the number of entries. Since the square root of a number grows much more slowly than its half, the procedure should result in an acceleration for large databases. In 1997 the algorithm was experimentally tested on a two-qubit chloroform computer by Isaac L. Chuang of IBM, Neil A. Gershenfeld of MIT and Mark G. Kubinec of the University of California, Berkeley, with a database containing four entries (in this particular case, a quantum computer offers no advantage over a classical one, for the square root of four equals one half of four); a small simulation of this case is sketched after this list.
·         In 2009, researchers at Yale University built the first solid-state quantum processor: a superconducting chip with two qubits. The same year, another team at the University of Bristol built a silicon chip that ran Shor's algorithm. In 2012, a multinational team built a two-qubit quantum computer made of diamond, which works at room temperature; this team ran Grover's algorithm, obtaining the correct answer in 95% of cases. In 2016, IBM launched the IBM Quantum Experience, which makes it possible to use their 5-qubit superconducting quantum computer through the web. In 2017, IBM announced that it had managed to build prototypes of two new quantum processors, with 16 and 17 qubits respectively.
·         In September 2017, a group of Australian researchers announced a new procedure for building long chains of qubits on silicon using phosphorus atoms. The first prototype is expected to be ready around 2022.
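As promised above, here is a sketch of the classical side of Shor's algorithm for the case factored in the 2001 IBM experiment, N = 15. The period r of a^x mod N, which a quantum computer finds efficiently with a quantum Fourier transform, is obtained here by brute force; the base a = 7 is simply one choice that happens to work.

    from math import gcd

    def factor_via_period(N=15, a=7):
        # Quantum step (replaced by brute force here): find the smallest
        # r > 0 with a^r = 1 (mod N), the period of f(x) = a^x mod N.
        r = 1
        while pow(a, r, N) != 1:
            r += 1
        # Classical post-processing: if r is even and a^(r/2) != -1 (mod N),
        # then gcd(a^(r/2) +/- 1, N) gives nontrivial factors of N.
        if r % 2 or pow(a, r // 2, N) == N - 1:
            return None   # unlucky base; try another a
        return gcd(pow(a, r // 2) - 1, N), gcd(pow(a, r // 2) + 1, N)

    print(factor_via_period())   # -> (3, 5)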
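The 1997 chloroform experiment corresponds to the smallest interesting instance of Grover's algorithm: two qubits, four entries, a single iteration. The state-vector sketch below is purely illustrative (it simulates the ideal algorithm, not the NMR hardware): the oracle flips the sign of the marked entry's amplitude, and the diffusion step inverts every amplitude about the mean.

    import numpy as np

    def grover_two_qubits(marked):
        # Four basis states, starting in the uniform superposition.
        state = np.full(4, 0.5)
        # Oracle: flip the sign of the marked entry's amplitude.
        state[marked] *= -1
        # Diffusion: inversion about the mean amplitude.
        state = 2 * state.mean() - state
        return state

    probs = np.abs(grover_two_qubits(marked=2)) ** 2
    print(probs)   # [0. 0. 1. 0.]: one iteration finds the marked entry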
Some researchers believe that quantum computing could become genuinely useful starting at 50-100 qubits. D-Wave Systems has announced commercial quantum devices with 128 and 512 qubits, but these are not really quantum computers; they are quantum annealers, which use different techniques. Companies like Google and NASA are using them. Another US team announced in July 2017 that it had built a device with 51 qubits, although this is not a quantum computer either, but a quantum simulator: it cannot be programmed; it can only solve a single equation.

Another big problem is the difficulty of keeping the value of the qubits stable for some time (in the IBM machine it lasts less than a tenth of a millisecond). If this is not achieved, errors are introduced into the calculations, which would make quantum computers unreliable. Some think that correcting those errors will require multiplying the number of qubits by 10, 100 or even 1000; a fully error-corrected quantum computer may need millions of qubits, which is still far away.
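A classical analogy shows why error correction multiplies the number of qubits. The three-bit repetition code sketched below protects one logical bit against any single flip, at the price of tripling the storage; quantum error-correcting codes rest on the same redundancy principle, but typically need far more physical qubits per logical qubit (hence the estimates of millions).

    import random

    def encode(bit):
        # One logical bit is stored redundantly in three physical bits.
        return [bit] * 3

    def apply_noise(bits, p=0.05):
        # Each physical bit flips independently with probability p.
        return [b ^ (random.random() < p) for b in bits]

    def decode(bits):
        # Majority vote recovers the logical bit despite any single flip.
        return int(sum(bits) >= 2)

    print(decode(apply_noise(encode(1))))   # almost always 1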
In short: progress is being made, but the commercial use of real quantum computers is still a long way off. On the other hand, despite what some say, it has not been convincingly proved that quantum computers will actually provide dramatic speed increases in solving specific problems. In theory they should, but theory is different from practice.

The same post in Spanish
Thematic Thread on Particle Physics: Previous Next
Manuel Alfonseca
