Florian Preis: How Quantum Brilliance Pioneers Quantum Utility For Real-World Applications

A lot of quantum hype hinges on the idea that quantum computers can solve certain computational problems extremely fast. 

But if there are no practical applications or the costs are too high, no one apart from academics will care. Welcome to the concept of quantum utility, a paradigm where a quantum computer’s value is gauged not just by its computational power but by its ability to deliver business value, which includes aspects such as cost-effectiveness, energy efficiency, and size constraints. 

Quantum Brilliance develops diamond-based quantum accelerators to achieve quantum utility at room temperature – and we covered their founding history and technology in our previous interview. This time, we had the pleasure of speaking with Florian Preis, Head of Software and Applications, about their software development kit, Qristal, and their quantum emulator, helping to benchmark their quantum algorithms. 

Why Did You Join Quantum Brilliance?

After my physics studies, I joined IBM, initially as a data scientist working on applied machine learning. But since I am a physicist, soon after, they asked me whether I wanted to work on quantum computing, and I was like, absolutely! I was immediately hooked and supported the technical sales team as a quantum ambassador.

At the same time, I started following the developments around quantum computing based on nitrogen-vacancy (NV) centers in diamonds and got fascinated by their properties, such as being able to operate at room temperature and their very long coherence times. 

Quantum Brilliance uses such diamond NV centers to build quantum accelerators the size of a GPU – how cool is that? It looked like one of the best opportunities to realize value from quantum computing in the near term. So, I took the chance to join them on the software side, specifically to develop variational quantum algorithms that may lead us to quantum utility in the not-too-distant future.

How Do You Plan to Reach Quantum Utility?

The term quantum utility captures the idea that quantum advantage is only real if it also gives a commercial benefit. You have to incorporate costs into your benchmarks and consider the specifics of a quantum device, like its size, weight, or power consumption – in particular for edge applications.

Since you published the last interview about Quantum Brilliance, we deployed our first quantum computer at the Pawsey Supercomputing Research Centre in Australia. It taught us a lot about building quantum computers robust enough for the harsh environment of supercomputing centers, where electromagnetic stray fields, vibrations, noise, and airflow all present challenges. 

On the software side, we geared our efforts toward reaching quantum utility with variational quantum algorithms, which is the focus of my team. We’ve built a software development kit called Qristal, which we open-sourced in February 2023 so that others can also develop quantum algorithms. 

When people use the software development kit, they typically start from quantum algorithms in the literature and then adjust them to the problem at hand. Say you’re looking at graph problems; you could reuse a lot of the intuition from physicists working on lattice models. It’s very exploratory and a lot of reverse-engineering well-known algorithms, as there’s no standard way of coming up with a new quantum algorithm.

We also developed a proprietary quantum emulator and have since brought our first customers on board. It mimics a quantum computer using GPU acceleration, reproducing both the fundamentally probabilistic way it works and the noise that today’s quantum machines face. NVIDIA’s cuQuantum SDK works nicely for this, especially for tensor networks, for which we provide three different backends.

The emulator is mostly for getting a feeling for how a quantum algorithm scales with the number of qubits. Of course, it can’t simulate a hundred logical qubits. But you can steadily increase the qubit count and see how an algorithm’s performance scales as you go from 10 to 20 to 35 qubits on GPUs, or even 45-50 simulated qubits on supercomputers.
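
The two points above – emulators sample from a probability distribution like real hardware, and full statevector simulation hits a memory wall long before 50 qubits – can be sketched in a few lines. This is a minimal illustration of the general technique, not Quantum Brilliance’s emulator; the function names are hypothetical:

```python
import numpy as np

def statevector_bytes(n_qubits):
    # A full statevector holds 2**n complex amplitudes
    # (complex128 = 16 bytes each), so memory doubles with every qubit.
    return (2 ** n_qubits) * 16

def sample_bitstrings(state, shots, seed=0):
    # Emulate projective measurement: draw basis states with
    # probability |amplitude|^2, as a real (noiseless) device would.
    probs = np.abs(state) ** 2
    probs = probs / probs.sum()
    rng = np.random.default_rng(seed)
    outcomes = rng.choice(len(state), size=shots, p=probs)
    return np.bincount(outcomes, minlength=len(state))

# The memory wall: 20 qubits fit in 16 MiB, 45 qubits need 512 TiB,
# which is why 45-50 simulated qubits require a supercomputer.
print(statevector_bytes(20) // 2**20, "MiB")   # 16 MiB
print(statevector_bytes(45) // 2**40, "TiB")   # 512 TiB
```

Noise models (the other half of a realistic emulator) would perturb the sampled distribution on top of this.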

Which Applications May See Early Quantum Utility?

Variational quantum algorithms have various applications in chemistry, quantum machine learning, and combinatorial optimization, where they may prove valuable.

For example, we demonstrated a quantum machine learning model on our quantum accelerators deployed at the Pawsey Supercomputing Research Centre, which had to find the decision boundaries between differently colored points in a 2D plane. We found that it solved this task with better accuracy than a classical machine-learning model restricted to the same number of free parameters, ensuring a fair comparison.

Our team has also worked on combining variational quantum eigensolvers (VQE) with classical machine-learning models for electronic structure calculations in quantum chemistry. In the past two years, studies (such as this one) showed that the VQE suffers from excessively long run times. Even for medium-sized molecules, calculating the ground-state energy, for example, would take a very long time, as a large number of shots is required to achieve decent accuracy. The cost function contains many terms that cannot be evaluated simultaneously, which makes it computationally expensive. 
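
A back-of-the-envelope estimate shows why the shot count blows up. The two ingredients are textbook scalings, not Quantum Brilliance’s benchmark figures: a second-quantized molecular Hamiltonian has on the order of N⁴ Pauli terms, and sampling error shrinks only as 1/√shots. The helper below is hypothetical and purely illustrative:

```python
def vqe_shot_estimate(n_spin_orbitals, epsilon, variance_per_term=1.0):
    # A molecular Hamiltonian in second quantization has O(N^4) Pauli
    # terms; many do not commute, so they need separate shot budgets.
    n_terms = n_spin_orbitals ** 4
    # Sampling error ~ 1/sqrt(shots), so reaching accuracy epsilon on
    # one term costs roughly variance / epsilon**2 shots.
    shots_per_term = variance_per_term / epsilon ** 2
    return round(n_terms * shots_per_term)

# Even a modest 20-spin-orbital problem at chemical-accuracy-like
# precision lands in the tens of billions of shots.
print(vqe_shot_estimate(20, 1.6e-3))
```

Term grouping and smarter measurement schemes reduce this, but the basic scaling is what motivates hybrid approaches like the one described next.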

We bypass this limitation by following a hybrid quantum-classical approach: We use a quantum computer to sample a wave function from Hilbert space and machine learning to reconstruct the full wave function. We can then reuse the reconstruction to calculate observables other than the ground-state energy.  

For chemistry applications, you often face a combinatorial problem of which electron belongs to which spin state and orbital, so you use a quantum computer as a sampler. We let the quantum and machine learning parts work in parallel and then combine the results.
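
The sample-then-reconstruct workflow can be sketched end to end on a toy two-qubit state. This is a deliberately simplified stand-in – the “model” below just takes empirical amplitudes from sample frequencies, whereas a real implementation would fit something like a neural ansatz, including phases – but it shows the three steps: sample, reconstruct, reuse for other observables:

```python
import numpy as np

rng = np.random.default_rng(0)

# Step 1: the quantum device acts as a sampler. Here a mock: draw
# bitstrings from the |amplitude|^2 distribution of a known 2-qubit
# Bell state (|00> + |11>) / sqrt(2) standing in for the hardware.
true_state = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
probs = np.abs(true_state) ** 2
samples = rng.choice(4, size=5000, p=probs)

# Step 2: classically reconstruct the wave function from the samples.
# Toy model: empirical amplitudes from sample frequencies (no phases).
counts = np.bincount(samples, minlength=4)
psi = np.sqrt(counts / counts.sum())

# Step 3: reuse the reconstruction for a different observable.
Z0 = np.diag([1.0, 1.0, -1.0, -1.0])   # Pauli Z on the first qubit
expectation = psi @ Z0 @ psi           # <psi|Z0|psi>, ~0 for a Bell state
print(round(expectation, 2))
```

The key point is step 3: once the wave function is reconstructed classically, no further quantum shots are needed to evaluate additional observables.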

Computational chemistry relies on many efficient approximations, which generally work amazingly well but, in some cases, break down. That’s where quantum computers could have an impact: helping to understand molecules that you can’t simulate classically, or cases where you need very high accuracy. Speedups are then the wrong way to look at near-term quantum utility – it’s really about solving a problem with better accuracy, or reaching the same accuracy with less computational effort. 

One example is catalysts, which lower the energy barrier of a chemical reaction and may thus reduce its carbon footprint – but using them effectively depends on knowing their energy levels very accurately. Other examples are developing better solar cells or electrodes for hydrogen production.

What’s the State of Developing Quantum Algorithms?

To reach quantum utility in the near term, we still have to close the gap between abstract quantum algorithms and current hardware, tailoring quantum algorithms to the respective quantum hardware to exploit its quirks. General-purpose quantum computing is, of course, the ultimate end goal, but until then, we need to make do with what we have. 

For variational quantum algorithms like VQE specifically, we still need to figure out better ansätze to make them work for chemistry problems, solve the run-time problem, and get high-accuracy results. 

Last but not least, I wanted to say that I really enjoy working in the quantum industry – it’s a really cool, collaborative environment with friendly competition.