QEDMA Quantum Computing: Shaping the Future of Quantum Operating Systems

Quantum computing has long been heralded as the next frontier in computing. However, despite their immense potential, quantum computers today still make too many errors to be useful. 

While it may become possible to correct these errors in the future, there is still a long way to go to reach fault tolerance. For now, the best strategy is to minimize errors and mitigate their impact on quantum computations by devising methods that can work with the existing quantum hardware.

QEDMA Quantum Computing was founded in 2020 by Asif Sinay, Netanel Lindner, and Dorit Aharonov to develop the quantum operating system of the future. QEDMA’s vision encompasses not only methods to characterize quantum hardware but also robust error mitigation strategies to get optimal results from the current generation of quantum computers.

With continuous improvements in quantum hardware, fueled by advancements in materials, control systems, and fabrication techniques, alongside the implementation of sophisticated error mitigation strategies, there is optimism that quantum computing will witness substantial progress in the coming years.

Learn more about the future of quantum operating systems from our interview with the co-founder and CTO, Netanel Lindner.

Why Did You Start QEDMA?

As a condensed matter physicist, I am interested in characterizing quantum systems with many constituents – and I realized that those very same questions are actually relevant to quantum computing. 

I didn’t think of co-founding a startup when talking to our now-CEO Asif over lunch. We were exploring what technology trends would become impactful, and I suggested that characterizing quantum computers could become very important in reducing their errors. 

Nowadays, the main challenge for quantum computers is that they still make too many errors to be useful – characterizing individual quantum computers to learn which errors they make would allow us to take care of those errors. When Asif talked to Dorit, now CSO of QEDMA, some time later, she independently told him the same thing. He was the glue that brought us together as a founding team.

We met about once a week, mapping out the potential for a startup and the tech roadmap. After half a year, we decided to go for it, and we co-founded QEDMA. The timing was great since I had planned for a sabbatical but hadn’t left due to Covid, so I could take the time to work on the company. 

How Does Error Characterization and Mitigation Work?

Let’s define the problem first. There are two main challenges when you’re trying to build a quantum computer. One is the number of qubits you need. The other one is the quality of the qubits. While the number of qubits is growing steadily, the main hurdle is still their quality, as decoherence and noise make them lose their quantum properties and lead to errors that spoil the result of a quantum computation. At QEDMA, we characterize these errors in order to subsequently mitigate them.

The reason we don’t observe quantum effects in our everyday life is that quantum systems constantly interact with their environment. They are very sensitive, and these interactions with the environment lead to so-called decoherence – the quantum effects get washed out. Even if you try to isolate a quantum system, the isolation won’t be perfect, and you get only a limited time, the so-called coherence time, to harness quantum effects for computing before they get washed out by decoherence. 
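To give a feel for what that limited time means in practice, here is a back-of-the-envelope sketch of a simple exponential dephasing model; the coherence time and gate duration below are illustrative numbers, not figures from the interview.

```python
import numpy as np

T2 = 100e-6        # illustrative coherence time: 100 microseconds
gate_time = 50e-9  # illustrative single-gate duration: 50 nanoseconds

# Fraction of coherence (off-diagonal amplitude) left after a circuit of a given
# depth, in a simple exp(-t / T2) dephasing model.
for depth in [100, 1_000, 10_000]:
    remaining = np.exp(-depth * gate_time / T2)
    print(f"{depth:6d} gates: ~{remaining:.2%} of the coherence left")
```

Even with these optimistic numbers, only a fraction of a percent of the coherence survives a ten-thousand-gate circuit, which is why deep computations need error handling.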

The other challenge to the quality of qubits is noise: random fluctuations, e.g., in the voltages or magnetic fields you apply to operate a quantum computer. Noise leads to a discrepancy between the state you intend to reach and the actual state you get by performing a step in your quantum computation – and this discrepancy introduces errors in your quantum computation.

For a classical computer, there’s just one type of basic error: a bit flip from 0 to 1 or vice versa. In the quantum world, since qubits can be in a superposition of 0 and 1, there are two types of basic errors. There’s the bit flip, but in addition, there can be a phase flip – flipping the sign of the superposition from |0⟩ + |1⟩ to |0⟩ − |1⟩.
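As a small concrete illustration (not from the interview itself), the two basic error types correspond to the Pauli X and Z operators; a minimal NumPy sketch:

```python
import numpy as np

# Computational basis states |0> and |1>
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# Equal superposition (|0> + |1>) / sqrt(2)
plus = (ket0 + ket1) / np.sqrt(2)

# Pauli X (bit flip) and Pauli Z (phase flip)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

print(X @ ket0)  # bit flip: |0> becomes |1>
print(Z @ plus)  # phase flip: (|0> + |1>)/sqrt(2) becomes (|0> - |1>)/sqrt(2)
```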

Classically you can perform error correction, but in the quantum world, this is really hard to do. It requires enormous resources and is only possible when the error rate is not too high. Error correction is really about fault-tolerant quantum computing, and interestingly enough, our co-founder Dorit has proven the threshold theorem, showing that error correction is possible in principle. (See this arXiv paper for more context.)

The error rate of current quantum computers is much too high, so the best we can do in the meantime is to mitigate errors. Of course, in the long term, fault tolerance is the overall goal, but it’s going to be a long journey, gradually shifting from error mitigation to a mix of mitigation and correction to achieve ultimate fault tolerance.

For error mitigation, we need to distinguish between reversible and irreversible errors. Reversible errors stem from coherent noise and can be corrected at no extra cost. That is, they won’t significantly increase the run time of a quantum computation. You can undo these errors with a unitary transformation. For example, imagine you want to rotate a qubit by 90 degrees, but due to an imperfect implementation of the gate, you end up rotating it by 91 degrees. Given that you have characterized this problem precisely, you can rotate it back by 1 degree at no extra cost. Cancelling coherent errors in this way is often called error suppression; it still leaves us with the need to mitigate irreversible errors.
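Here is a minimal sketch of that over-rotation example, assuming a rotation about the Y axis; once the 1-degree miscalibration has been characterized, applying the inverse rotation undoes it exactly.

```python
import numpy as np

def ry(theta_deg):
    """Single-qubit rotation about the Y axis by an angle given in degrees."""
    theta = np.deg2rad(theta_deg)
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]], dtype=complex)

ket0 = np.array([1, 0], dtype=complex)

ideal = ry(90) @ ket0           # intended 90-degree rotation
over_rotated = ry(91) @ ket0    # miscalibrated gate over-rotates by 1 degree
corrected = ry(-1) @ over_rotated  # apply the characterized inverse rotation

print(np.allclose(corrected, ideal))  # True: the coherent error is undone exactly
```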

Irreversible errors cannot be undone without paying a price. The fundamental reason for this is that some information is lost due to these errors, and you need to rescale the data you have to get that information back. Yet rescaling requires lots of time and additional effort, so the run time of the quantum computation increases.
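The interview doesn't specify which rescaling scheme QEDMA uses; one widely known instance of the general idea is zero-noise extrapolation, sketched below with made-up numbers purely to show the shape of the procedure.

```python
import numpy as np

# Hypothetical expectation values measured at deliberately amplified noise levels
# (e.g., by stretching gates); all numbers are made up for illustration.
noise_scale = np.array([1.0, 2.0, 3.0])     # 1.0 = the device's native noise level
measured = np.array([0.72, 0.55, 0.42])     # noisy estimates of the observable <O>

# Fit a simple model in the noise scale and extrapolate to the zero-noise limit.
coeffs = np.polyfit(noise_scale, measured, deg=2)
zero_noise_estimate = np.polyval(coeffs, 0.0)

print(f"Mitigated estimate at zero noise: {zero_noise_estimate:.3f}")
```

The price shows up in the extra circuits run at amplified noise and in the larger statistical uncertainty of the extrapolated value.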

Our product is a SaaS offering that allows a user to run a job through QEDMA by basically adding a line to their Python script, and we take care of running the quantum algorithm in an automated and optimal way. Under the hood, we first send a characterization job to the quantum machine to learn which errors are prevalent. Next, based on this characterization, we compile the quantum algorithm with error suppression and mitigation and let it run on the quantum hardware. We then do some classical post-processing of the result to achieve the highest precision.
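The actual API isn't shown in the interview, so the sketch below only illustrates the described pipeline (characterize, compile with mitigation, run, post-process) with invented stub functions; none of these names are QEDMA's real interface.

```python
# Hypothetical sketch only: the names below are invented to illustrate the
# characterize -> compile -> run -> post-process pipeline, not QEDMA's real API.

def characterize(device):
    """Run characterization circuits to learn the device's dominant errors."""
    return {"dominant_errors": ["over-rotation", "readout bias"]}  # illustrative

def compile_with_mitigation(circuit, error_model):
    """Recompile the circuit, suppressing reversible errors up front."""
    return circuit  # placeholder: the real step rewrites gates and schedules

def execute(circuit, device, shots):
    """Stand-in for running the compiled circuit on the quantum hardware."""
    return {"00": 512, "11": 488}  # made-up raw counts

def post_process(raw_counts, error_model):
    """Classically rescale the raw results to remove the residual bias."""
    return raw_counts  # placeholder for the mitigation post-processing

def run(circuit, device, shots=1_000):
    """The single call a user would add to their script."""
    error_model = characterize(device)
    compiled = compile_with_mitigation(circuit, error_model)
    raw_counts = execute(compiled, device, shots)
    return post_process(raw_counts, error_model)

print(run("my_circuit", "some_device"))
```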

The end result will have no systematic error. But note that it will still have a statistical error, which you always have when evaluating quantities resulting from quantum computations. This statistical error is due to the fundamentally probabilistic nature of quantum computers, so-called quantum shot noise, and you would have it even for fault-tolerant quantum computers.
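Here is a minimal illustration of quantum shot noise: even with a perfect device, an expectation value estimated from a finite number of shots carries a statistical error that shrinks only as one over the square root of the number of shots (the probability below is arbitrary).

```python
import numpy as np

rng = np.random.default_rng(0)

p = 0.7          # arbitrary "true" probability of measuring 1
shots = 10_000   # number of repeated circuit executions

samples = rng.binomial(1, p, size=shots)
estimate = samples.mean()
std_error = samples.std(ddof=1) / np.sqrt(shots)  # shrinks as 1/sqrt(shots)

print(f"estimate = {estimate:.4f} +/- {std_error:.4f}")
```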

Now, the tricky part is the rescaling required to mitigate irreversible errors, since it introduces overhead that grows exponentially with the number of steps in a quantum computation. But the advantage you expect from many quantum algorithms is also exponential: dividing two exponentials still gives you an exponential, so you keep an exponential advantage. Through the methods we develop, we achieve the best, i.e., the smallest, exponent for the overhead, which allows us to run larger quantum computations.
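To make the "dividing two exponentials" argument concrete, here is a toy calculation; the growth rates are purely illustrative, since the actual exponents depend on the algorithm and the mitigation method.

```python
# Toy illustration of "dividing two exponentials still gives an exponential".
# Suppose the quantum speedup grows like 2**n while the mitigation overhead
# grows like 1.2**n; both rates are purely illustrative.
for n in [10, 20, 40, 80]:
    speedup = 2.0 ** n        # exponential advantage of the quantum algorithm
    overhead = 1.2 ** n       # exponential cost of mitigating irreversible errors
    net = speedup / overhead  # = (2 / 1.2) ** n, still exponential in n
    print(f"n = {n:2d}: net advantage ~ {net:.3e}")
```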

Given the speed at which things are currently improving, both in terms of lower error rates and better error mitigation techniques, soon quantum computers will be able to run complex quantum algorithms with thousands of gates, exceeding the capabilities of classical supercomputers. 

That’s not to say that there will be a quantum advantage for a practical application in the very near future. But I am confident that once any advantage exists, people will find ways to use it. And secondly, quantum computers will definitely be useful for solving quantum problems, e.g., in materials science or chemistry, like finding the ground state or time evolution of a material or a molecule. Classical methods require run times that grow exponentially with the complexity of a molecule or the physical dimensions of a piece of material. Quantum computers have a real shot at simulating these more efficiently. For other use cases, like in optimization, it’s less clear whether there will be an actual, exponential advantage for solving a generic problem.

How Did You Evaluate Your Startup Idea?

When the three of us got together initially, we started brainstorming to see if we had a coherent set of ideas and a sufficiently novel and advantageous technology to build a successful company.

For the first piece of our solution, the characterization of quantum hardware, we talked a lot to quantum hardware manufacturing companies that we knew personally and through our scientific network. The second part of our offering, error mitigation, will evolve from where we started, addressing a much larger market. 

What Advice Would You Give Fellow Deep Tech Founders?

Quantum computing is a unique, research-heavy ecosystem, as quantum computers still don’t have an advantage over classical computers. So end users don’t see a return on investment (ROI) in financial terms.

So you need to take any advice with a grain of salt. Traditionally, an industry is built on ROI, and most people only know industries based on ROI. However, you need to keep in mind that quantum is not a standard industry. 

Product-market fit needs to be evaluated extremely carefully, i.e., you need to understand whether the value you’re bringing is high on your customer’s priority list. Since end users don’t see a direct ROI, try carefully to understand what they really need.

Hardware manufacturers are pouring a lot of money into R&D and manufacturing the first quantum computers, which will be a scarce and valuable asset. So it’s a long process to convince them to collaborate with you, as they don’t want any critical component to be external and to become dependent on an outside supplier.

Further Reading

Qedma Quantum Computing: Characterization and Error Suppression in Multi-Qubit Devices – Check out this talk by Netanel Lindner at the Q2B 2022 conference in Silicon Valley

‘The Big Money Is Here’: The Arms Race to Quantum Computing – QEDMA featured in a news article in Haaretz

Israel Innovation Authority making major push to develop quantum computing technologies – QEDMA is part of the largest consortium ever to develop quantum computing tech
