Digital Platforms and Services

The dream of fault-tolerant quantum computing comes a step closer to reality

By Martyn Warwick

May 15, 2023

  • Quantum computers can only run for a few microseconds before “noise” induces faults and crashes the system
  • Eliminating “noise” is therefore vital to the development of quantum computers that will run for prolonged periods
  • Braided non-abelian anyons anyone?
  • Seven years in development, Quantinuum’s new H2 quantum processor could be a major breakthrough

The general public are getting quite used to seeing the classic “chandelier” design of quantum computers in movies and TV programmes. They are usually hanging mysteriously and ominously in the exotic lair of a James Bond-type villain or hidden away in the laboratories of mad scientists plotting to boil the oceans or orchestrate earthquakes and volcanic eruptions.

Such machines are, of course, just props and set decoration. The narrative flow, car chases and gunplay are never interrupted by any effort to explain how quantum computers work, how incredibly difficult it is to get them up and running in the first place, or how hard it is to keep them in operation for more than a few microseconds before the quantum state collapses and the computer stops working.

The basic unit of quantum information is the qubit (quantum bit), the quantum equivalent of the binary bit, which in classical computing is physically realised with a two-state device. Like a classical bit, a qubit can have a value of 1 or 0, but it can also attain a “superposition” in which it is both 1 and 0 simultaneously – until it is measured, at which point it adopts a set value. A related phenomenon, “quantum entanglement”, inextricably links the states of two or more particles even if they are separated by billions of light years and exist on opposite sides of the universe. Einstein called this “spooky action at a distance”, and it certainly is.
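For readers who like to see the machinery, a qubit’s state can be written as a two-entry vector of complex numbers whose squared magnitudes give the probabilities of measuring 0 or 1. Below is a minimal, illustrative Python sketch of superposition and measurement – a pen-and-paper model only, not how any real quantum processor is programmed:

```python
import numpy as np

# A qubit state is a unit vector a|0> + b|1>: the squared magnitudes
# |a|^2 and |b|^2 are the probabilities of measuring 0 or 1.
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# An equal superposition: "both 1 and 0 simultaneously" until measured.
plus = (ket0 + ket1) / np.sqrt(2)

probs = np.abs(plus) ** 2                     # [0.5, 0.5]
outcome = np.random.choice([0, 1], p=probs)   # measurement picks a set value
print(f"P(0)={probs[0]:.2f}, P(1)={probs[1]:.2f}, measured: {outcome}")
```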

However, a qubit can’t stand “noise” and thus has to be kept in total isolation at a temperature minutely close to absolute zero – zero on the Kelvin scale, equivalent to -273.15°C. Any thermal perturbation, or the slightest vibration from an adjacent atom, causes a qubit to go full diva and throw a wobbly. When that happens, the incredibly fragile qualities of superposition and entanglement cannot be maintained for long enough – the so-called “coherence time” – for a quantum computer even to begin to run a calculation program.

Given that governmental and military research into quantum computing is conducted in great secrecy, wherever in the world it takes place, we must rely on information from academic and commercial sources for some indication of how things are progressing. IBM, for example, plans this year to unveil Condor, which it bills as the world’s first universal quantum processor with more than 1,000 qubits, and says it is on track to construct a quantum machine with more than 4,000 qubits by 2025. IBM has also demonstrated a working 50-qubit quantum computer that can maintain its quantum state for 90 microseconds.

That is a very short period of time, but it’s a major improvement on what was possible only a few years ago, and research continues. The biggest problem in quantum computing is how to mitigate – and eventually eliminate or render irrelevant – “noise”: the perpetual interference from a plethora of thermal, radio-frequency and magnetic sources that causes the qubits in a quantum computer to degrade, or “decohere”.

Within a thousandth of a second, decoherence randomises the information held in a qubit. Mistakes proliferate instantly, algorithms produce spurious outputs, fatal errors bring the process to an immediate halt, and the quantum computer crashes.

Ideally, a quantum algorithm carrying out many operations across a large number of qubits should be running trillions of operations. At the moment, such algorithms can run only a few hundred operations before noise overwhelms the system, the quantum state collapses and fatal errors occur – hence the race by companies such as IBM and Google to develop quantum error correction (QEC).

QEC is yet another algorithm, but one specifically designed to identify and correct errors in quantum computers, and it is vital to enabling quantum computing “at scale”, whereby an increasing number of qubits can be deployed and handled in an increasingly stable manner. In QEC, the information of a single logical qubit is encoded redundantly across many supporting physical qubits, which preserves the integrity of the original quantum information even while the quantum processor is running.
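The flavour of the idea can be caught with the oldest textbook example, the three-bit repetition code, simulated classically below. Real QEC schemes – including those pursued by IBM and Quantinuum – are far more sophisticated and must detect errors via “syndrome” measurements without reading out the encoded data, but the redundancy principle is the same. This is an illustrative sketch, not anyone’s production code:

```python
import random

def encode(logical_bit):
    """Spread one logical bit across three physical bits (repetition code)."""
    return [logical_bit] * 3

def add_noise(bits, p_flip=0.1):
    """Each physical bit flips independently with probability p_flip."""
    return [b ^ 1 if random.random() < p_flip else b for b in bits]

def decode(bits):
    """Majority vote: recovers the logical bit if at most one bit flipped."""
    return int(sum(bits) >= 2)

random.seed(1)
trials = 100_000
failures = sum(decode(add_noise(encode(1))) != 1 for _ in range(trials))
# An unprotected bit fails 10% of the time; the encoded version fails
# only when two or more bits flip: roughly 3p^2, i.e. about 2.8%.
print(f"logical error rate: {failures / trials:.3%}")
```

The catch, as the next paragraph explains, is the overhead: protecting a single logical qubit can consume a great many physical ones.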

However, the overhead and cost necessary to run QEC are immense. Currently it takes about 1,000 physical qubits to support just one logical qubit – at that ratio, a machine with even 100 error-corrected logical qubits would demand some 100,000 physical ones. For fault-tolerant universal quantum computing ever to be realised, physical qubits will have to be improved to the point where they can meet, and even surpass, the most demanding error-correction thresholds, and that has been regarded as a very long-term proposition.

À la recherche du temps qui n’a pas été perdu – in search of time that was not lost (as Proust did not write)

However, help may be at hand. Back in the 1980s, the US theoretical physicist Frank Wilczek – who is now a Nobel laureate and Professor of Physics at the Massachusetts Institute of Technology (MIT) – was researching the concept of particles that could exist only in a two-dimensional, flat universe.

He postulated that such “quasi-particles” would, when produced at temperatures close to absolute zero and in a very strong magnetic field, bizarrely retain a “memory” of their own history. It is this property that might be – and, it looks increasingly likely, will be – instrumental in the evolution of error-correcting, crash-proof quantum computers.

Since Wilczek’s theoretical work, many companies and institutions have spent a lot of time and money on research into… wait for it… “non-abelian anyons”.

On 3 May this year, less than two weeks ago, scientists at Quantinuum – the US/UK company formed two years ago when Honeywell spun out its quantum business and combined it with the British quantum company Cambridge Quantum – introduced the H2, its 32-qubit, second-generation quantum computer. Quantinuum claims that the H2 has been in secret development for the past seven years and is the most precise quantum computer yet built – anywhere. Microsoft, which is developing its own quantum computer, and graphics chip giant Nvidia are partnering with Quantinuum.

Apart from Flat Stanley, the lead character in US author Jeff Brown’s children’s books, the rest of us live in a three-dimensional world in which there are just two fundamental classes of particles: fermions (such as electrons), no two of which can ever occupy the same quantum state, and bosons (such as photons), which happily crowd together into the same state. However, in a two-dimensional system, another class of quasi-particle can exist. Proving that some physicists do have a sense of humour, Frank Wilczek called these two-dimensional particles ‘anyons’ because he reckoned they could be more or less anything. Obviously he knows his anyon onions…

Anyons are classified as ‘abelian’ or ‘non-abelian’. Swapping two abelian anyons merely multiplies the overall quantum state by a phase factor, much as swapping ordinary fermions or bosons does. Non-abelian anyons, however, have some particular – and particularly strange – properties that, it is believed, will potentially be of great importance in topological quantum memory, because they retain a “memory” of where they were in the past relative to one another. It is that ‘memory’ that may allow quantum computers to function without the need for conventional error correction.

In essence, identical non-abelian anyons share a memory: when two of them exchange positions, the joint quantum state is transformed in a way that depends on the order of the exchanges. This process is described as “braiding”, and when anyons are braided, a measurable trace is left behind as evidence that they have crossed one another in spacetime: the historical record of the braiding event is preserved. Researchers say this phenomenon will eventually allow the construction of fault-tolerant quantum computers, and that will change the world.
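A compact way to see what “non-abelian” means in practice: an abelian exchange merely multiplies the state by a phase, and phases commute, so the order of swaps leaves no trace; a non-abelian exchange acts on the shared state as a matrix, and matrices generally do not commute. The sketch below uses the textbook braid matrices for Fibonacci anyons – one well-studied non-abelian model, chosen here purely for illustration and not necessarily what runs on the H2:

```python
import numpy as np

phi = (1 + np.sqrt(5)) / 2   # golden ratio
tau = 1 / phi

# Textbook braid matrices for Fibonacci anyons: each unitary describes
# swapping one neighbouring pair of anyons in a group of three.
sigma1 = np.diag([np.exp(-4j * np.pi / 5), np.exp(3j * np.pi / 5)])
sigma2 = np.array([
    [tau * np.exp(4j * np.pi / 5),           np.sqrt(tau) * np.exp(-3j * np.pi / 5)],
    [np.sqrt(tau) * np.exp(-3j * np.pi / 5), -tau],
])

# Non-abelian statistics: the order of exchanges changes the final state.
print("unitary:", np.allclose(sigma2 @ sigma2.conj().T, np.eye(2)))         # True
print("order matters:", not np.allclose(sigma1 @ sigma2, sigma2 @ sigma1))  # True

# An abelian anyon swap would be a plain phase factor e^{i*theta},
# and phases always commute, leaving no braiding "memory" behind.
```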

Of course, it’s all enormously more complex and mind-bending than that, as you’ll find out if you are intrigued enough to follow up on this article.

It looks as though Quantinuum’s H2 processor is a major breakthrough in that it permits the controlled creation and manipulation of non-abelian anyons. As Tony Uttley, president and COO of Quantinuum, said, “With our second-generation system, we are entering a new phase of quantum computing. H2 highlights the opportunity to achieve valuable outcomes that are only possible with a quantum computer. The development of the H2 processor is also a critical step in moving towards universal fault-tolerant quantum computing.”

And now I’d like to sit down in a darkened room and quietly suck my thumb for a while…

- Martyn Warwick, Editor in Chief, TelecomTV
