- The quantum computing market is set to grow in value significantly in the coming years and become a multibillion-dollar industry
- Technical advances, such as breaking the 1,000 qubit barrier, are intensifying competition and promise very powerful quantum machines
- But investors are currently more excited about the money to be made in the AI and biotech sectors
- As a result, the ‘talent crisis’ is intensifying in the quantum computing sector and impacting progress, according to a new report
The quantum computing sector is set to grow at a furious pace over the next 20 years and ultimately become a multibillion-dollar industry as technical milestones are reached and the appetite for related services grows. However, the sector faces significant funding, human resources and consolidation challenges in the coming years, according to a new report from Cambridge, UK-based IDTechEx.
Many companies and organisations are now competing to create a large-scale, fault-tolerant, gate-based quantum computer. Following the recent breaking of the ‘1,000 qubit barrier’ by both IBM and Atom Computing, hardware systems are scaling up to the point at which they will begin to compete with existing supercomputers, according to the authors of IDTechEx’s Quantum Computing Market 2024-2044: Technology, Trends, Players, Forecasts report, led by senior technology analyst Dr. Tess Skyrme.
As a result of growing competition and demand, the quantum computing market is predicted to grow at a compound annual growth rate (CAGR) of 34% over the next 20 years, with the related hardware sector set to be worth more than $800m by 2034. Growth will be driven by early adopters in aerospace, chemical, financial and pharmaceutical institutions, leading to increased installation of quantum computers into colocation datacentres and private networks alike.
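To put that growth rate in perspective, here is a minimal sketch of what compounding at 34% a year implies. The CAGR and the $800m-by-2034 hardware figure come from the report; the starting value in the example is a made-up round number used purely to illustrate the arithmetic.

```python
def compound(start_value: float, cagr: float, years: int) -> float:
    """Value after `years` of growth at a constant annual rate `cagr`."""
    return start_value * (1 + cagr) ** years

# A market compounding at 34% a year multiplies roughly 18-fold in a decade:
growth_factor = compound(1.0, 0.34, 10)
print(f"10-year multiple at 34% CAGR: {growth_factor:.1f}x")
```

The point of the illustration is simply that a seemingly modest annual rate, sustained over a decade or two, produces the multibillion-dollar totals the report forecasts.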
However, the immediate downside is that, at the moment, most venture capital and private equity funding is being pumped into the AI and biotech sectors, where the prospects for immediate financial returns are higher than with quantum. The IDTechEx team notes that the number of quantum computing startups being formed appears to have “plateaued”, despite companies such as Photonic Inc, Oxford Quantum Circuits (OQC) and Quantinuum raising sums of $100m or more in the past year or so.
As a result, the ‘quantum talent crisis’ is intensifying as quantum computing companies find it more and more difficult to recruit the physicists, quantum engineers, chip designers, and computer scientists they need if they are to grow at speed and scale.
Competition in quantum computing is increasing, not only between different companies but also between different types of quantum computing technologies. Today, the focus is on the need for logical, or error-corrected, qubits, and the challenge for tomorrow will be to scale up hardware and increase qubit numbers while greatly reducing errors.
The fact is that it remains very difficult to build a quantum computer and get it to operate with any degree of consistency or continuing accuracy, even under optimally controlled experimental conditions in a laboratory environment. Constructing machines able to work in real-world circumstances and to be used continually to provide solutions to complex commercial problems is several orders of magnitude harder. Part of the answer is the creation of quantum computers of a size exceeding 1,000 qubits.
In the world of quantum mechanics, quantum bits (qubits) are the analogue of the system we are all familiar with in so-called classical computing, where information is encoded in bits, each of which can have the value zero or one and nothing else. A qubit, however, can also exist in superposition: a combination of zero and one in which each value carries a probability amplitude, with the outcome determined only when the qubit is measured. Superposition is what allows quantum algorithms, for certain classes of problems, to process information in a fraction of the time it would take even the fastest classical systems.
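The distinction can be made concrete with a short sketch, using nothing beyond the Python standard library: a single-qubit state is just a pair of complex amplitudes, and the Hadamard gate turns a definite zero into an equal superposition of both values.

```python
import math

def hadamard(state):
    """Apply the Hadamard gate to a single-qubit state (a0, a1)."""
    a0, a1 = state
    s = 1 / math.sqrt(2)
    return (s * (a0 + a1), s * (a0 - a1))

def probabilities(state):
    """Born rule: the probability of measuring 0 or 1 is |amplitude|^2."""
    a0, a1 = state
    return (abs(a0) ** 2, abs(a1) ** 2)

zero = (1 + 0j, 0 + 0j)   # the classical-like 'definitely 0' state
plus = hadamard(zero)     # equal superposition of 0 and 1
print(probabilities(plus))  # ≈ (0.5, 0.5): a 50/50 measurement outcome
```

A classical bit could only ever print (1.0, 0.0) or (0.0, 1.0); the superposed qubit holds both possibilities until it is measured.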
Difficulties remain, though: current quantum computers are easily affected by a huge range of ‘noise’ sources, such as temperature fluctuations, electromagnetic radiation and ‘crosstalk’, which generate multiplying errors that quickly and catastrophically decohere the quantum state and bring processing to a halt. Hence the need for multiple dedicated error-correcting qubits to protect the smaller number of programmable qubits that actually do the calculations.
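The principle behind spending many physical qubits to protect one logical qubit has a simple classical analogue: encode one bit as several copies and decode by majority vote. Real quantum error correction is far subtler (quantum errors are continuous, and measurement disturbs the state), but this hedged sketch shows the underlying redundancy idea.

```python
from collections import Counter

def encode(bit: int) -> list[int]:
    """Three-copy repetition code: one logical bit, three physical bits."""
    return [bit, bit, bit]

def decode(copies: list[int]) -> int:
    """Majority vote recovers the logical bit if at most one copy flipped."""
    return Counter(copies).most_common(1)[0][0]

codeword = encode(1)
codeword[0] ^= 1          # a single bit-flip error hits one physical copy
print(decode(codeword))   # majority vote still recovers the logical 1
```

The cost is the same in both worlds: redundancy multiplies the hardware needed per useful bit, which is why error correction drives qubit counts up so sharply.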
The 1,000 qubit barrier, which has now been reached and breached in a few experimental cases, is regarded as a major milestone point on the road to large-scale quantum computing hardware.
Guiding quantum data through the gates
Quantum circuits comprise three main types of gates: single-qubit gates, two-qubit gates, and measurement gates. Single-qubit gates operate on one qubit and can be used to create superpositions or perform rotations. These rotations are three-dimensional: the state of a qubit can be transformed by rotating it about the x, y or z axis.
Two-qubit gates operate on two qubits and are used to create entanglement between qubits. Two-qubit gates can process four possible combinations of 0s and 1s simultaneously, and three-qubit gates can process eight possible combinations. Each additional qubit doubles the number of combinations the gate can process at the same time, so there is an exponential increase with each new qubit. A gate-based quantum computer can execute very powerful commands, such as “write all possible values into the register simultaneously” or “entangle two qubits”.
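The doubling claim can be seen directly in a sketch: a two-qubit register is a vector of 2**2 = 4 amplitudes (ordered 00, 01, 10, 11), and each added qubit doubles that vector's length. Applying a Hadamard to the first qubit and then a CNOT produces the classic entangled Bell state.

```python
import math

def apply(matrix, state):
    """Multiply a 4x4 gate matrix by a 4-amplitude two-qubit state vector."""
    return [sum(m * a for m, a in zip(row, state)) for row in matrix]

s = 1 / math.sqrt(2)
# Hadamard on the first qubit (identity on the second):
H0 = [[s, 0, s, 0],
      [0, s, 0, s],
      [s, 0, -s, 0],
      [0, s, 0, -s]]
# CNOT: flip the second qubit when the first is 1 (swaps the 10/11 amplitudes):
CNOT = [[1, 0, 0, 0],
        [0, 1, 0, 0],
        [0, 0, 0, 1],
        [0, 0, 1, 0]]

bell = apply(CNOT, apply(H0, [1, 0, 0, 0]))  # start from the 00 state
print(bell)  # ≈ [0.707, 0, 0, 0.707]: only 00 and 11 remain, now correlated
```

The entanglement shows up in the result: measuring either qubit instantly fixes the other, because only the 00 and 11 outcomes have non-zero amplitude.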
That’s the state of play today and, as the IDTechEx report points out, while reducing errors in quantum computing is very important, overall size and power consumption must also be addressed if sprawling infrastructure is to be kept in check. Scaling quantum computer hardware is easier said than done because almost all experimental research quantum machines require powerful, and bulky, cooling systems. Thus, as the race to manufacture quantum computers with 1,000 qubits or more heats up (if you’ll forgive the pun), hardware roadmaps are adapting to consider the connection of multiple modular systems as the answer to some serious problems.
Many hardware roadmaps have now been revised to show this modular approach, with multiple systems connected. As the report makes clear, “On the one hand, quantum computing is designed for high-value problems – to be solved over the cloud, and so requiring a large footprint within a datacentre is not necessarily a huge barrier to adoption. However… to truly follow the trend of classical computing from vacuum tube to smartphone, it’s time to start making components smaller before capabilities can get bigger.”
The IDTechEx report has it that, “the coming years will illuminate which strategies hold the greatest promise for securing a lasting quantum commercial advantage. This task will be an uphill balancing act between reducing errors and scaling up logical qubit numbers while also optimising for resource efficiency.”
It concludes: “This is without even considering gate-speed, algorithm development, and many other crucial factors. The enormity of the task will likely see many players fail to survive until the end of the decade. Yet, with market consolidation and convergence of talent, increased clarity should come as to where and when quantum advantage could be offered first, serving only to increase end-user confidence and engagement. Despite the headwinds, the world-changing potential of quantum computers within finance, healthcare, sustainability, and security will remain a tantalising enough carrot for not only individual companies but entire nations to chase.”
- Martyn Warwick, Editor in Chief, TelecomTV