- The goal is for quantum devices rather than classical computers to solve problems requiring massive processing power
- However we are still in the era of noisy, intermediate-scale quantum (NISQ) devices
- ‘Noise’ is the long-term problem for fault tolerance, scalability and commercialisation
- The focus now is on getting rid of that noise by any means possible
- Part of the solution might involve running quantum compute platforms in the cloud
During the past four or five years, the public have quickly learned to recognise that the elegant and intricate golden chandeliers – like upside-down, skeletal, multi-tiered metal wedding cakes – that now routinely feature in TV shows and films (often in a mad scientist’s laboratory or a megalomaniac villain’s lair) are popular and approximate representations of quantum computers.
However, in an actual quantum computer, the quantum processor itself is sited at the bottom of the chandelier and most of the rest of the device is a cooler – a ‘dilution refrigerator’ that mixes helium-3 and helium-4 – to maintain the processor’s temperature at 0.015 kelvin, or 15 millikelvin (equal to -273.135°C or -459.643°F, so rather chilly).
Such an extremely low temperature is necessary because quantum computers carry their units of information in quantum bits, or qubits. Each qubit is built around a quantum system (an electron, say) that has two possible states, and while that system is in superposition, quantum computers and quantum algorithms can take advantage of the potential power of both of those states at once.
But qubits are very fragile and impossible to manipulate or manage at ‘normal’ temperatures. Even at close to absolute zero, qubits remain incredibly delicate and easily disturbed by ‘noise’, such as variations in the earth’s magnetic field, fluctuations in temperature, electromagnetic interference (including from Wi-Fi terminals and mobile phones), microscopic imperfections in quantum gates, and the impingement of vibrations and acoustic interference, such as the rumble of a passing subway train or the rattle of taxis. Any of these factors (and others) will cause the quantum state to ‘decohere’, which either totally randomises the data, rendering it useless, or erases it completely.
The problem of noise in a quantum computing environment is a long-term one and, while not insoluble, it is delaying the development of practical, robust, large-scale, fault-tolerant quantum machines, because even the slightest amount of noise can cause decoherence, whereby the qubits lose their remarkable properties of superposition and entanglement.
Superposition is the ability of a quantum system to exist in multiple states (or locations) at the same time until it is measured, so that a single object can effectively be in two states at once. Holding particles in superposition is the fundamental way in which quantum computers store information.
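To make the idea concrete, here is a minimal sketch in plain Python/NumPy (no quantum hardware or quantum SDK assumed, purely illustrative): a qubit in equal superposition is just a two-element complex vector, and ‘measuring’ it collapses that superposition to one definite outcome with probabilities given by the squared amplitudes.

```python
import numpy as np

# Minimal illustration: a qubit in equal superposition is a length-2 vector a|0> + b|1>.
rng = np.random.default_rng(7)

state = np.array([1.0, 1.0]) / np.sqrt(2)   # equal superposition of |0> and |1>
probs = np.abs(state) ** 2                  # Born rule: measurement probabilities

# 'Measuring' collapses the superposition to a single definite outcome.
samples = rng.choice([0, 1], size=10_000, p=probs)
print("probabilities before measurement:", probs)                  # [0.5, 0.5]
print("observed frequencies:", np.bincount(samples) / len(samples))
```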
Quantum entanglement occurs when two or more quantum particles are coupled so that measuring one instantly determines the corresponding state of the other, even if they are separated by immense, galaxy-spanning distances. Entanglement is what lets quantum computers process information across many qubits at once, which massively increases their processing power and, for certain classes of problem, makes them far faster than even the biggest, most advanced and most powerful traditional supercomputers.
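The same toy state-vector picture extends to entanglement. The sketch below (again plain NumPy, an illustrative simplification rather than a model of real hardware) prepares a two-qubit Bell state and samples measurements of the pair: the two qubits always agree, even though neither outcome is decided in advance.

```python
import numpy as np

# Toy Bell state (|00> + |11>)/sqrt(2): measurements only ever return '00' or '11'.
rng = np.random.default_rng(7)

bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)   # amplitudes for |00>, |01>, |10>, |11>
probs = np.abs(bell) ** 2                            # [0.5, 0, 0, 0.5]

outcomes = rng.choice(4, size=10, p=probs)
print([format(int(k), "02b") for k in outcomes])     # e.g. ['11', '00', '11', ...]
```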
So, the search for a solution (or a series of solutions) to abate noise continues apace, and various paths are being followed, including the physical isolation of qubits and the development of ever more precise control techniques. The realisation has dawned that, as with so many things in life, those developing noise-immune, fault-tolerant quantum computers will have to walk before they can run. Currently we are in the era of noisy, intermediate-scale quantum (NISQ) devices and it seems very unlikely that the noise problem can be solved completely with the devices and algorithms of today. The consensus seems to be that the performance of NISQ devices will improve, steadily and incrementally, but some technological tour de force will be needed before quantum computing is robust and cheap enough for routine commercial use.
Approaches include quantum error suppression, error mitigation and error correction
The difficulties inherent in the development and production of quantum computers are evident but, it is believed, they can and will be overcome. That's why companies including the likes of Google, IBM, Intel and Microsoft, having between them already spent billions of dollars on the technology, are ramping up their R&D investments in the sector, even as specialist startups developing solutions based on a combination of hardware and software for the prevention or mitigation of quantum errors are beginning to emerge.
All players are working towards the establishment of the ‘quantum utility era’, at which point it will be natural, effective and economically sensible to choose quantum devices rather than classical computers to solve problems requiring massive processing power. As part of that, different approaches are being taken. Some companies are working to place quantum computing in the cloud, the rationale being that it will allow services to be accessed from anywhere in the world. In fact, it has already happened at an experimental level. The world’s first five-qubit cloud-access quantum computer went live almost eight years ago, on 4 May 2016. It was instantly popular amongst research scientists, with 7,000 signing up for access to the facilities in its first week of operation – and there were 22,000 registered users less than a month later.
Over the years, as more has been learned about the bizarre nature of quantum computing and the many difficulties of controlling a computational process based on quantum waves, three main categories of solutions have emerged. The first is error suppression, which continually uses classical software and machine-learning algorithms to analyse what is going on in the quantum circuitry and qubits, and reconfigures the process design so that the information held in the qubits is better protected.
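A textbook physical-layer cousin of this idea – offered here only as an assumed illustration, not the machine-learning-driven reconfiguration described above – is the Hahn ‘spin echo’, a simple form of dynamical decoupling: flipping a qubit halfway through its evolution makes slowly varying frequency noise cancel itself out. The toy NumPy model below shows the effect.

```python
import numpy as np

# Toy Hahn (spin) echo, a simple example of error *suppression*: slow, random
# frequency offsets dephase a qubit, but a refocusing flip at the halfway point
# makes the phase picked up in the second half cancel the first half.
rng = np.random.default_rng(7)

shots = 10_000
T = 1.0
detuning = rng.normal(0.0, 5.0, shots)     # random low-frequency noise, constant per shot

phase_free = detuning * T                              # no echo: phase = detuning * time
phase_echo = detuning * (T / 2) - detuning * (T / 2)   # echo: second half has opposite sign

print("coherence without echo:", abs(np.mean(np.exp(1j * phase_free))))   # ~0, dephased
print("coherence with echo:   ", abs(np.mean(np.exp(1j * phase_echo))))   # 1.0, preserved
```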
The second solution is error mitigation and is based on the reality that not every error caused by noise will result in decoherence and thus make a quantum program fail. However, such a computation will wander off into paths that go nowhere, and by examining the errors that caused it to go off track it may be possible to load a quantum system with something analogous to ‘echo cancellation’ in telecom networks – a sort of anti-noise filter to limit the propagation of errors, both during the computational process itself and in the final output. Such a solution is partial – it estimates noise rather than identifying every aspect of an occurrence – and for it to work, an algorithm must be run multiple times, which increases the cost of the operation. Nonetheless, it is another step in the right direction.
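One widely used recipe of this kind is zero-noise extrapolation: the same computation is deliberately re-run at amplified noise levels, and the results are extrapolated back towards the zero-noise limit. The sketch below is a toy NumPy version that assumes, purely for illustration, that noise damps the measured value exponentially.

```python
import numpy as np

# Toy zero-noise extrapolation (ZNE): measure at stretched noise levels, then
# extrapolate the expectation value back to an estimated zero-noise result.
rng = np.random.default_rng(7)

ideal = 1.0      # the noiseless expectation value we would like to recover
decay = 0.3      # assumed exponential damping per unit of noise (toy model)

def noisy_run(noise_scale, shots=4000):
    # Each 'run' returns the damped value plus statistical shot noise.
    value = ideal * np.exp(-decay * noise_scale)
    return value + rng.normal(0.0, 1.0 / np.sqrt(shots))

scales = np.array([1.0, 2.0, 3.0])               # noise can only be amplified, never removed
measured = np.array([noisy_run(s) for s in scales])

# Fit log(value) against scale and read the fit off at scale = 0.
slope, intercept = np.polyfit(scales, np.log(measured), 1)
print("raw result at scale 1:", measured[0])
print("mitigated estimate:   ", np.exp(intercept), "(true value:", ideal, ")")
```

Note that the mitigated answer is only an estimate, and obtaining it costs several runs of the same computation – exactly the overhead described above.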
The third solution is the application of quantum error correction (QEC). Here, information is redundantly encoded across multiple physical qubits so that errors caused by noise can be detected and corrected. It does work, but for it to do so the system must carry a supercargo of multiple physical qubits to protect and control a single logical qubit. The ratio is commonly quoted as being 1,000 physical qubits to one logical qubit (a huge and hugely expensive overhead), but recently some developers have claimed that in certain circumstances the ratio can be reduced to 100:1 or even, in some experiments, down to just 13:1. That may or may not be feasible or make commercial sense but, whatever the ratio, QEC is expensive and very difficult to manage.
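The simplest way to see why that redundancy helps is the three-qubit repetition (majority-vote) code, sketched below as a classical caricature in NumPy. Real QEC must also handle phase errors and cannot read the data qubits directly, so this is only an assumed simplification, but it captures the trade-off: extra physical bits per logical bit in exchange for a much lower logical error rate.

```python
import numpy as np

# Toy majority-vote repetition code: encode 1 logical bit into 3 physical bits,
# flip each physical bit independently with probability p, decode by majority vote.
rng = np.random.default_rng(7)

def logical_error_rate(p, trials=200_000):
    flips = rng.random((trials, 3)) < p        # independent physical-bit flips
    decoded_wrong = flips.sum(axis=1) >= 2     # majority vote fails only if 2+ bits flip
    return decoded_wrong.mean()

for p in (0.01, 0.05, 0.10):
    print(f"physical error rate {p:.2f} -> logical error rate {logical_error_rate(p):.4f} "
          f"(theory: {3 * p**2 - 2 * p**3:.4f})")
```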
Meanwhile, more powerful algorithms are also being devised, with the Quantum Approximate Optimisation Algorithm (QAOA) proving to be more resilient to noise and capable of being applied in today’s limited and less-than-perfect quantum devices.
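For the curious, a single QAOA layer is small enough to simulate directly. The sketch below is a toy, pure-NumPy rendering (no quantum SDK assumed) of one-layer QAOA for MaxCut on a triangle graph: a diagonal ‘cost’ layer, an identical mixing rotation on each qubit, and a brute-force grid search over the two angles – enough to show the shape of the algorithm, not a statement about its noise resilience.

```python
import numpy as np
from functools import reduce

# Toy single-layer (p = 1) QAOA for MaxCut on a triangle: the 3-qubit state is
# just a length-8 complex vector, so everything can be done with plain NumPy.
edges = [(0, 1), (1, 2), (0, 2)]
n, dim = 3, 8

# Diagonal 'cost Hamiltonian': the number of cut edges for every 3-bit assignment.
cost = np.array([sum(((k >> i) & 1) != ((k >> j) & 1) for i, j in edges)
                 for k in range(dim)], dtype=float)

plus = np.ones(dim) / np.sqrt(dim)           # uniform superposition over all bitstrings

def rx(beta):
    # Single-qubit mixer exp(-i * beta * X)
    return np.array([[np.cos(beta), -1j * np.sin(beta)],
                     [-1j * np.sin(beta), np.cos(beta)]])

def expected_cut(gamma, beta):
    state = np.exp(-1j * gamma * cost) * plus        # cost (phase-separation) layer
    state = reduce(np.kron, [rx(beta)] * n) @ state  # same mixing rotation on each qubit
    return np.sum(np.abs(state) ** 2 * cost)         # expected number of cut edges

grid = np.linspace(0, np.pi, 60)
best = max((expected_cut(g, b), g, b) for g in grid for b in grid)
print(f"best expected cut ~ {best[0]:.3f} (maximum possible cut is 2) "
      f"at gamma = {best[1]:.2f}, beta = {best[2]:.2f}")
```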
That said, none of the above options is a long-term solution and error correction, being a work in progress, is not yet applicable to commercially available quantum computers, especially given that it cannot easily be run as a real-time process in a commercial setting. However, those involved in quantum computing are increasingly optimistic that a full solution to the noise problem will eventually be found. The question is, when will that be? The answer most often given (with a confident smile but fingers crossed behind the back) seems to be by 2030... maybe.
- Martyn Warwick, Editor in Chief, TelecomTV