Cutting through the noise around quantum computing – where are we now and when will we reach quantum practicality?

By Jim Clarke, Director of Quantum Hardware, Intel Corporation.


Quantum computers have been decades in the making. Hailed as the next big thing, with the potential to address many of today’s unsolvable problems, they have spurred a market expected to reach US$1.76 billion by 2026, fueled by public-sector investment in research and development, according to MarketsandMarkets Research Private Ltd.

Supercomputers and quantum computing – what’s the difference?

Most laypersons think about computational power in terms of how fast a computer can perform a task. For commercial workloads that involve massive computations and databases, such as weather forecasting and molecular modeling, even the best consumer desktops are unsuitable. This is where supercomputers come in. Supercomputers, like all classical computers, operate on binary data: one or zero. Yes or no. On or off. The complexity comes in the form of long strings of binary information.

In comparison, quantum computers operate on the principles of quantum physics and therefore rely on quantum bits, or qubits. A simple way to understand a qubit is to think of it as a coin that can be in either a heads or a tails state. Now imagine the coin is spinning: while it spins, it is, in a sense, in both the heads and tails states at the same time. This is known as a "superposition" of the two states. With two of these spinning, entangled coins, we can represent four states at once. A quantum computer’s power therefore grows exponentially with the number of qubits. In theory, 50 entangled qubits could access more states than a supercomputer can track, and 300 entangled qubits could represent more states than there are atoms in the observable universe.
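
To put that exponential growth in concrete terms, here is a minimal Python sketch (not from the article) that computes the 2^n basis states an n-qubit register can occupy and compares the 300-qubit case with the commonly cited rough estimate of about 10^80 atoms in the observable universe.

```python
# Illustrative sketch: the state space of n entangled qubits grows as 2**n.
# The atom count is the commonly cited rough estimate of ~10**80 atoms in
# the observable universe, used here only as a point of comparison.

ATOMS_IN_OBSERVABLE_UNIVERSE = 10 ** 80  # order-of-magnitude estimate

def quantum_states(n_qubits: int) -> int:
    """Number of basis states an n-qubit register can hold in superposition."""
    return 2 ** n_qubits

for n in (2, 50, 300):
    print(f"{n:>3} qubits -> 2^{n} = {float(quantum_states(n)):.3e} states")

print("300 qubits exceed the atom estimate:",
      quantum_states(300) > ATOMS_IN_OBSERVABLE_UNIVERSE)
```

Two qubits give 4 states, 50 qubits give roughly 10^15, and 300 qubits give roughly 2 x 10^90, which exceeds that atom estimate – the comparison the paragraph above makes.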

Unlike supercomputers, quantum computers treat data in a non-binary manner and perform calculations based on probabilities. The practical uses of quantum computers are still largely being discovered, but the likelihood that they could break the strongest encryption algorithms available today is already giving governments and organizations pause about the potential of these systems. For instance, a conventional computer would take about 300 trillion years to break RSA's 2,048-bit encryption algorithm today; a 4,099-qubit quantum computer, on the other hand, would need just 10 seconds.

As of November 2021, the largest quantum processor announced holds just 127 qubits, so we are still a long way from a 4,099-qubit quantum computer.

The long road towards quantum practicality

In reality, we need more than a million high-quality qubits to commercialize quantum computing – also known as reaching “quantum practicality”: the point at which a quantum computer achieves commercial viability and can solve relevant, real-world problems.

The challenge lies in the fact that qubits are very fragile. They have very short lifetimes (microseconds), and the tiniest “noise”, such as external interference from magnetic fields or variations in temperature, can cause loss of information. Here are three key areas we must address in order to advance the scalability of viable quantum computing systems.

Managing qubits at higher temperatures with spin qubits

The fragile nature of qubits requires them to operate at extremely cold temperatures (20 millikelvin, or about -273 degrees Celsius), which creates challenges for the material design of the chips themselves and for the control electronics needed to make them work. In working towards scaling quantum chips, Intel, in collaboration with QuTech, developed a silicon spin qubit fabrication process that enabled more than 10,000 arrays, each with several silicon spin qubits, to be produced on a single wafer with greater than 95 percent yield. Spin qubits closely resemble transistors and are built on 300mm process technology in the same fabs as Intel’s complementary metal-oxide-semiconductor (CMOS) chips. The joint research illustrated that it is possible for qubits to eventually be produced alongside conventional chips in the same manufacturing facilities.

These spin qubits are much smaller, have a longer coherence time, and can function at higher temperatures (around 1 kelvin, or -272.15 degrees Celsius) than superconducting qubits – an advantage for scalability. These characteristics drastically reduce the complexity of the system required to operate the chips by allowing control electronics to be integrated much closer to the processor. The research also demonstrated individual coherent control of two qubits with single-qubit fidelities of up to 99.3 percent. These advancements signal the potential for the cryogenic controls of a future quantum system and silicon spin qubits to come together in an integrated package.
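
To illustrate why fidelity figures such as this matter for scaling, the following minimal Python sketch (not from the article) compounds a per-operation success probability over a sequence of operations; 0.993 is the single-qubit fidelity quoted above, while the operation counts and the independent-error model are simplifying assumptions chosen only to show the trend.

```python
# Illustrative sketch: how a per-operation fidelity compounds over a circuit.
# 0.993 is the single-qubit fidelity quoted in the text; the operation counts
# below and the independent-error model are simplifying assumptions.

def circuit_success_probability(fidelity: float, n_operations: int) -> float:
    """Probability that every operation in a sequence succeeds,
    assuming errors are independent."""
    return fidelity ** n_operations

FIDELITY = 0.993  # single-qubit fidelity figure quoted in the article

for n_ops in (10, 100, 1_000, 10_000):
    p = circuit_success_probability(FIDELITY, n_ops)
    print(f"{n_ops:>6} operations at {FIDELITY:.1%} fidelity -> "
          f"{p:.2%} chance of an error-free run")
```

Under this simplified model, even 99.3 percent fidelity leaves essentially no chance of an error-free run after a few thousand operations, which is one reason qubit quality has to keep improving as systems scale towards the million-qubit regime mentioned earlier.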

Simplifying system design to accelerate setup time and improve qubit performance

Another key challenge in today’s quantum systems is the use of room-temperature electronics and the many coaxial cables that are routed to the qubit chip inside a dilution refrigerator. This approach does not scale to a large number of qubits due to form factor, cost, power consumption, and thermal load on the fridge. To operate a quantum machine, it is crucial to address this challenge and radically reduce the need for multiple racks of equipment and thousands of wires running into and out of the refrigerator.
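
As a rough, back-of-the-envelope illustration of why this approach cannot scale, the sketch below (not from the article) multiplies an assumed number of control lines per qubit by different qubit counts; the lines-per-qubit figures are hypothetical round numbers used only to show the order of magnitude at the million-qubit scale discussed above.

```python
# Illustrative sketch: coaxial-cable counts if every qubit keeps its own
# room-temperature control lines. The lines-per-qubit values are hypothetical
# round numbers; real systems vary by qubit type and control scheme.

def cables_needed(n_qubits: int, lines_per_qubit: int) -> int:
    """Total control lines routed into the dilution refrigerator."""
    return n_qubits * lines_per_qubit

for n_qubits in (100, 1_000, 1_000_000):
    for lines_per_qubit in (2, 4):
        total = cables_needed(n_qubits, lines_per_qubit)
        print(f"{n_qubits:>9,} qubits x {lines_per_qubit} lines/qubit "
              f"= {total:>9,} cables")
```

Even at only a few lines per qubit, a million-qubit machine would need millions of cables routed into the dilution refrigerator, which is exactly the form-factor, cost, power, and thermal-load problem described above.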

Intel replaced these bulky instruments with a highly integrated system-on-a-chip (SoC): the first cryogenic quantum computing control chip, which simplifies system design. This approach uses sophisticated signal processing techniques to accelerate setup time, improve qubit performance, and enable engineering teams to efficiently scale the quantum system to larger qubit counts.

A full-stack scalable approach to quantum computing

Since quantum computing is an entirely new type of compute, running programs on it requires new hardware, software, and applications developed specifically for these systems. This means quantum computers need all-new components at every level of the stack, from the qubit chip device and control electronics to the qubit control processor and more. Intel is hard at work developing all of these components for the full stack. The challenge is getting all those components to work together, which is like choreographing a quantum dance.

Clearly, quantum computers are not meant to replace classical compute infrastructure; they are meant to augment it. Their continued development aims to eventually solve some of the world’s most intractable challenges that have stumped today’s classical computers. But the road to building a viable system that works on a practical, commercial level will require persistence, patience, and partnerships.

In some ways, our work on classical computers makes us uniquely suited to this endeavor, given the scale needed to address the top challenges facing quantum computing development. The advancements we have made in spin qubit technology, cryogenic control, and the development of a full technology stack are just some of the ways Intel is making progress towards making quantum computing fully viable and practical in the not-too-distant future.
