QC82: Shaping the Future of Photonic Continuous-Variable Quantum Computing

Quantum computing promises to revolutionize how we process information and reshape fields ranging from cryptography to drug discovery.

Yet the road to realizing its potential is paved with immense technical challenges, particularly in scaling quantum systems to a level where they can outperform classical computers. What if the key to scalable, fault-tolerant quantum computing lies in harnessing the power of photons?

QC82 was founded in 2022 by Hussain Zaidi, Olivier Pfister, Andreas Beling, Xu Yi, and Joe Campbell to develop integrated photonic chips for fault-tolerant quantum computing at room temperature. The team tackles challenges ranging from building the architecture for a large-scale photonic quantum computer to engineering groundbreaking individual components, such as room-temperature photon detectors that can be commercial game changers in the short term.

Along the way, they’re advancing a new paradigm of encoding and processing quantum information: continuous-variable quantum computing. In early 2022, QC82 raised a seed round from Tensor Ventures, Qubits Ventures, and the University of Maryland Discovery Fund.

Learn more about the future of photonic continuous-variable quantum computing from our interview with the co-founder and CEO, Hussain Zaidi:

Why Did You Start QC82?

I come from a multidisciplinary background. I’ve always been fascinated by how we build computers to address real-world applications. At the same time, I’ve always been interested in physics, particularly photonics and the use of light. Eventually, I wanted to combine both passions by working on quantum computing, and photonic quantum computing in particular.

A few years back, I attended a quantum computing conference in Berkeley. A few quantum startups were presenting, and people were talking about small-scale quantum computers and algorithms in what they called the noisy intermediate-scale quantum (NISQ) era. It sounded interesting, but it wasn’t clear those machines would be useful for anything commercial.

Quantum computers must scale to millions of qubits and billions of gates to become useful and provide real value. Photonics at telecom wavelength has already demonstrated that it can carry information globally for telecommunications, so it could also help scale quantum computers with many qubits in a quantum warehouse. 

But this couldn’t happen in academia. No single lab can build and integrate all the components to develop a large-scale quantum computer for commercial applications. It’s like building a spaceship for the first time without a blueprint and only a vague idea of how to get to Mars. 

You need to develop the qubits and classical control electronics, optimize the performance, keep the losses exceptionally low, and integrate everything into a machine. Different labs have different expertise, and by founding a startup, my co-founders and I put all of our expertise together in a commercial setting to eventually build useful quantum computers. 

What’s Your Approach To Building Quantum Computers? 

When I tell people outside the physics lab about what we’re doing, sometimes they wonder whether our quantum computers are solar-powered. 

That’s, of course, not the case.

We differ from other quantum computing companies broadly by building so-called photonic quantum computers, and specifically by focusing on a new paradigm called continuous-variable quantum computing.

Photonic quantum computers use light for quantum computations, specifically the light particles called photons. By changing their quantum properties, we can implement quantum algorithms. 

Why Photonic Quantum Computers? 

The good thing about photons is that they interact far less with their outside environment than most matter qubits, which are constantly perturbed by it. Yet the bad thing is that when photons do interact with matter, they can get absorbed, and the quantum information they carry vanishes entirely.

This is also true if you measure them: an electron, ion, or atom qubit stays there, but a photon qubit gets absorbed and vanishes when you measure it. That’s why we must create and entangle millions of photon qubits per second, which are then measured to implement quantum gates.

Another challenge is creating entanglement between photons. Other quantum hardware platforms use so-called two-qubit gates to entangle qubits and exchange quantum information, but that’s not viable for photonic quantum computing. Instead, we pursue a different approach called measurement-based quantum computing.

It requires creating a large-scale entangled quantum state, and in a continuous-variable approach, we can achieve that using so-called two-mode squeezing gates and beam splitters. Once you have this cluster state, you only need single-qubit gates to perform a quantum computation.
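To make this concrete, here is a minimal sketch of those building blocks using Strawberry Fields, an open-source photonics simulator that is not affiliated with QC82; the mode count, squeezing parameter, and beam-splitter angle are illustrative choices, not details of QC82’s architecture.

```python
# Hypothetical illustration of the continuous-variable building blocks described above,
# simulated with the open-source Strawberry Fields library (not QC82's actual stack).
import numpy as np
import strawberryfields as sf
from strawberryfields.ops import S2gate, BSgate, MeasureX

prog = sf.Program(4)  # four optical modes ("qumodes")
with prog.context as q:
    S2gate(1.0) | (q[0], q[1])           # two-mode squeezing entangles modes 0 and 1
    S2gate(1.0) | (q[2], q[3])           # ...and modes 2 and 3
    BSgate(np.pi / 4, 0) | (q[1], q[2])  # a beam splitter knits the two pairs together
    MeasureX | q[0]                      # homodyne (x-quadrature) measurement on mode 0

eng = sf.Engine("gaussian")              # Gaussian backend simulates these operations exactly
result = eng.run(prog)
print(result.samples)                    # measured quadrature value for mode 0
```

In a real measurement-based machine, this pattern is repeated across many more modes, and the outcome of each homodyne measurement steers the operations applied to the remaining modes.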

Photonics really shines for large-scale quantum computers (pun intended!). Other technology platforms also have their advantages, like low loss or impressive single-qubit or two-qubit gate fidelities, but the problem is in scaling up to millions of qubits. 

We’ll need millions of physical qubits to run quantum chemistry algorithms that deliver value for developing novel drugs or new materials like high-temperature superconductors. 

Optical fiber has almost no loss, and our qubits are already photonic, so we don’t have to convert them back and forth from matter qubits. Having very low-loss chip-fiber coupling will help us create very large quantum computers—not desktop computers but data center-sized quantum computers. Photonics is the best approach to getting to those scales. 

What Is Continuous Variable Quantum Computing?

Traditionally, quantum computers are built from qubits, which can be any quantum system with two discrete quantum states, like an electron’s spin pointing either up or down. The superposition of those two discrete states encodes a unit of quantum information, similar to a bit in a classical computer, but distinct because a qubit can exist in a superposition of both states at once.

Continuous Variable Quantum Computing (CVQC) is a different quantum computing paradigm that performs quantum computations using continuous quantum degrees of freedom instead of discrete quantum states. 
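In the photonic case, the continuous degrees of freedom are typically the amplitude and phase quadratures of a mode of light; the interview does not spell this out, but in standard notation (with ħ = 1) the contrast with a qubit looks roughly like this:

```latex
% Discrete-variable qubit: superposition of two discrete states
|\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1

% Continuous-variable mode: quadrature observables with a continuous spectrum
\hat{x} = \tfrac{1}{\sqrt{2}}\,(\hat{a} + \hat{a}^\dagger), \qquad
\hat{p} = \tfrac{1}{i\sqrt{2}}\,(\hat{a} - \hat{a}^\dagger), \qquad
[\hat{x}, \hat{p}] = i
```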

The upside is that the continuous-variable approach allows us to create photonic qubits for error correction at extremely high probabilities. In fact, we have shown that the probabilities can theoretically be above 90%. The other advantage is that these qubits can be entangled deterministically to create large-scale quantum states for quantum error correction and algorithm implementation. This is distinct from other photonic approaches, in which entanglement generation is probabilistic.

What Have You Built So Far?

We’re developing both the architecture of the photonic quantum computer and the hardware components needed to build large-scale, commercially useful quantum computers, what we call industrial-scale quantum computers. 

The architecture was released on arXiv as the first of its kind shown to be feasible for large-scale photonic quantum computing.

Building on this architecture and as part of a DARPA project, we have modeled the entire quantum computation end-to-end, from generating quantum states to implementing quantum operations for quantum error correction. In doing so, we have identified the error rates that must stay below a critical threshold for quantum error correction to work and produce meaningful results.

On the hardware side, the QC82 co-founders have demonstrated up to seventy quantum modes on a single photonic chip, a result published in the journal Optica. We’ve also developed the first prototypes of a single-photon avalanche diode and a high-efficiency, room-temperature photon detector, which is coming out soon.

How Do You Build Quantum Computers at Room Temperature?

Photons maintain their quantum properties even at room temperature and thus allow photonic quantum computers to operate without needing to be cooled to cryogenic temperatures—a distinct advantage over many other quantum hardware platforms.

However, photonic quantum computers don’t yet operate entirely at room temperature throughout the computing pipeline: they still rely on photon detectors that need cryogenic temperatures to operate.

We’re developing photon detectors that work at room temperature, feature low dark counts, and are compatible with the 1550 nm telecommunications wavelength. This opens up many use cases in fiber-based quantum communications. And we are solving the cryogenic photon detection bottleneck en route to developing a fully room-temperature photonic quantum computer.

The detectors can be game-changing stand-alone products for third parties. A challenge in the quantum landscape is the long timeline for commercialization. We see ourselves as providing a shorter path to commercialization by making ground-breaking detectors. The same detectors will be used in our proprietary designs for industrial-scale quantum computing.

What Applications Do You Plan to Target?

Quantum computing is typically used to solve computationally expensive problems. Some of the most touted examples are encryption and assisting classical machine learning algorithms with quantum enhancements. However, our thesis is that quantum chemistry applications will yield the most commercial value. 

If you take a look at the history of physics, a perennial theme is approximating solutions to quantum many-body problems, such as those describing the interactions of matter at the scales governing processes in the energy and chemical sectors, including biopharma. 

For decades, researchers have tried to approximate quantum systems, developing density functional theory and other approximation methods. However, it’s very hard to approximate large quantum systems, and it will be easier to use a quantum computer to solve such quantum problems. 

Many machine learning algorithms align well with human intuition, and hence, many improvements are possible on classical computers. However, quantum problems fundamentally contradict human intuition, and this is where quantum computers can be useful. For now, we’re focusing on chemistry applications: solving models like the Fermi-Hubbard model and running quantum phase estimation, a fundamental algorithm needed everywhere in chemistry.
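For reference, the Fermi-Hubbard model mentioned here is the standard lattice model of interacting electrons, with a hopping term of strength t and an on-site repulsion U:

```latex
\hat{H} = -t \sum_{\langle i,j \rangle,\,\sigma}
          \left( \hat{c}^{\dagger}_{i\sigma}\hat{c}_{j\sigma} + \mathrm{h.c.} \right)
        + U \sum_{i} \hat{n}_{i\uparrow}\,\hat{n}_{i\downarrow}
```

Quantum phase estimation is then the algorithm used to extract eigenvalues, such as ground-state energies, of Hamiltonians like this one.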

What’s The Relation of AI and Quantum Computing?

In the future, quantum computers could help with AI tasks, but for now, they’re too small to show an advantage in machine learning, e.g., quantum machine learning algorithms, or solve important real-world problems.

For now, it’s mostly the other way around: AI can offer a lot of value to quantum computing, such as using machine learning to develop quantum algorithms and control the operations of quantum computers. Google recently showed a way to use AI to improve quantum error correction. The challenge with machine learning is making it work significantly faster for real-time quantum decoding. 

The biggest challenge right now for quantum computers is to scale the number of qubits. AI might be able to reduce the number of gates we need to implement a certain quantum operation, but we’ll still need millions of physical qubits to do something useful. But it’s a fast-moving field, and there can be breakthroughs: With the development of new LDPC error correction codes, we may need fewer qubits. 

What Advice Would You Give Fellow Deep Tech Founders? 

Learn how to manage the highs and lows. It’s important to manage that reality while simultaneously taking care of yourself. Don’t forget to take time off and unplug. Many things contribute to a startup’s success, but if you’re not consistently delivering your best self to the startup, little else matters. Be your best self by taking care of yourself!
