Computer simulations in chemistry are a powerful way of understanding and optimizing chemical processes. But as the size of the system under consideration grows, so does the computational cost of solving the underlying quantum chemical equations, because the space of possible quantum states grows exponentially with system size. As a result, the exact solution of quantum chemical equations remains intractable on modern classical computers for all but the smallest systems.
Quantum computers, whose own state spaces grow exponentially with the number of qubits, offer a natural way to represent such systems. In a study published in Science today, the Google AI Quantum team sought to advance current quantum chemistry simulation techniques. The researchers used the famous 54-qubit Sycamore processor, which achieved quantum supremacy last year, to run the largest chemical simulation performed on a quantum computer to date.
Though the calculation focused on the Hartree-Fock approximation of a real chemical system, it was twice as large as previous chemistry calculations on a quantum computer, and contained ten times as many quantum gate operations.
Part of the simulation involved modeling the isomerization of diazene and the binding energies of hydrogen chains:
Here, we perform a series of quantum simulations of chemistry, the largest of which involved a dozen qubits, 78 two-qubit gates, and 114 one-qubit gates. We model the binding energy of H6, H8, H10 and H12 chains as well as the isomerization of diazene.
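For a sense of what the target quantity looks like, here is a minimal sketch of a classical Hartree-Fock reference calculation for an H6 chain. It assumes the PySCF package is installed; the bond spacing and minimal STO-3G basis are illustrative choices for this sketch, not values taken from the paper.

```python
# Classical Hartree-Fock reference for a linear H6 chain (a sketch,
# assuming PySCF is available; spacing and basis are illustrative).
from pyscf import gto, scf

d = 1.3  # illustrative H-H spacing in angstroms

# Build a linear chain of six hydrogen atoms along the z axis.
mol = gto.M(
    atom=[("H", (0.0, 0.0, i * d)) for i in range(6)],
    basis="sto-3g",
)

# Restricted Hartree-Fock: the same mean-field approximation the
# Sycamore experiment computed on quantum hardware.
mf = scf.RHF(mol)
energy = mf.kernel()  # total HF energy in Hartree
print(f"HF energy of H6 chain: {energy:.6f} Ha")
```

Sweeping the spacing `d` and repeating this calculation traces out the binding-energy curve that the quantum experiment reproduces.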
To achieve this, the team used a noise-robust variational quantum eigensolver (VQE) to simulate the chemistry directly on the quantum processor. VQE was essential because quantum hardware is noisy, and that noise introduces inaccuracies into the results. Essentially, the technique treats the quantum processor like a neural network: a classical optimizer adjusts the quantum circuit's parameters to minimize a cost function, compensating for errors that occur during the computation.
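The structure of that optimization loop can be shown in a few lines. The sketch below replaces the quantum processor with exact linear algebra on a single qubit; the Hamiltonian and one-parameter ansatz are toy stand-ins chosen for illustration, not the paper's Hartree-Fock objective.

```python
# Minimal VQE sketch: a classical optimizer tunes circuit parameters to
# minimize the measured energy. Toy single-qubit problem, simulated exactly.
import numpy as np
from scipy.optimize import minimize

# Toy Hamiltonian H = Z + 0.5 X (illustrative choice).
Z = np.array([[1.0, 0.0], [0.0, -1.0]])
X = np.array([[0.0, 1.0], [1.0, 0.0]])
H = Z + 0.5 * X

def ansatz(theta):
    """One-parameter ansatz |psi(theta)> = Ry(theta)|0>."""
    return np.array([np.cos(theta / 2.0), np.sin(theta / 2.0)])

def energy(params):
    """Cost function: expectation value <psi|H|psi>."""
    psi = ansatz(params[0])
    return float(psi @ H @ psi)

# The classical outer loop, analogous to training a neural network.
result = minimize(energy, x0=[0.1], method="Nelder-Mead")
print(f"VQE estimate:       {result.fun:.6f}")
print(f"Exact ground state: {np.linalg.eigvalsh(H)[0]:.6f}")
```

On real hardware the energy would be estimated from repeated noisy measurements rather than computed exactly, which is why the noise robustness of the outer optimization loop matters.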
Sycamore has 54 qubits and consists of more than 140 individually tunable elements, each controlled with high-speed analog electrical pulses. Achieving precise control over the whole device requires fine-tuning more than 2,000 control parameters, and even small errors in those parameters can quickly add up to large errors in the overall computation.
The next challenge was calibrating this control system. For this, the team used an automated framework that mapped the problem onto a graph with thousands of vertices, where each vertex represents a physics experiment that determines a single unknown parameter.
Traversing this graph takes the team from basic priors about the device to a high-fidelity quantum processor, and can be done in less than a day. Ultimately, these techniques, along with algorithmic error mitigation, enabled an orders-of-magnitude reduction in errors.
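The core idea of that traversal can be illustrated with a small dependency graph: calibrate each parameter only after the parameters it depends on are known. The node names and dependency structure below are hypothetical, and Google's actual framework is far more involved; this only sketches the ordering logic.

```python
# Sketch of graph-based calibration: run each "experiment" in an order
# that respects dependencies between control parameters (Python 3.9+).
from graphlib import TopologicalSorter

# Hypothetical dependency graph: each key depends on the listed nodes.
calibration_graph = {
    "qubit_frequency": set(),
    "readout_threshold": {"qubit_frequency"},
    "single_qubit_gate_amplitude": {"qubit_frequency"},
    "two_qubit_gate_phase": {"single_qubit_gate_amplitude"},
}

def run_experiment(parameter):
    """Placeholder for the physics experiment that fixes one parameter."""
    print(f"calibrating {parameter}")

# Traverse from basic priors to a fully tuned processor.
for parameter in TopologicalSorter(calibration_graph).static_order():
    run_experiment(parameter)
```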
With this setup in place, Google's team not only ran the largest chemical simulation performed on a quantum computer to date but also provided a proof of concept that chemical accuracy is achievable when VQE is combined with error-mitigation strategies. Moving forward, the researchers hope the experiment serves as a blueprint for future simulations on quantum processors.
The complete code for running this experiment has been posted on GitHub. Further details and the original research paper can be found here.