Quantum physics exponentially improves some types of machine learning


Machine learning can get a boost from quantum physics.

On certain types of machine learning tasks, quantum computers have an exponential advantage over standard computation, scientists report in the June 10 Science. The researchers proved, using the mathematics of quantum theory, that the advantage applies when machine learning is used to understand quantum systems. And the team showed that the advantage holds up in real-world tests.

“People are very excited about the potential of using quantum technology to improve our learning ability,” says theoretical physicist and computer scientist Hsin-Yuan Huang of Caltech. But it wasn’t entirely clear if machine learning could benefit from quantum physics in practice.

In certain machine learning tasks, scientists attempt to glean information about a quantum system, such as a molecule or a group of particles, by performing repeated experiments and analyzing the resulting data.

Huang and colleagues studied several such tasks. In one, scientists aim to discern properties of the quantum system, such as the position and momentum of the particles within it. Quantum data from multiple experiments could be input into a quantum computer’s memory, and the computer would process the data jointly to learn the quantum system’s characteristics.

The researchers proved theoretically that doing the same characterization with standard, or classical, techniques would require exponentially more experiments in order to learn the same information. Unlike a classical computer, a quantum computer can exploit entanglement — ethereal quantum linkages — to better analyze the results of multiple experiments.
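To give a flavor of what analyzing multiple experiments jointly can buy, here is a minimal toy sketch in Python, not the team's actual protocol: for a single qubit, measuring two copies of the state at once in the entangled Bell basis yields estimates of the squared expectation values of X, Y and Z from one and the same set of joint measurements, while a single-copy strategy has to split its measurement budget across one Pauli basis at a time. The state, the shot budget and all variable names are illustrative assumptions.

```python
# Toy sketch (an assumed setup, not the experiment in the paper): it compares two ways
# of estimating the squared Pauli expectation values <X>^2, <Y>^2, <Z>^2 of one qubit.
#   1) single copies: measure one Pauli basis per shot (a "conventional" strategy)
#   2) joint measurement: measure two copies per shot in the entangled Bell basis,
#      which reveals all three squared expectation values from the same outcomes
import numpy as np

rng = np.random.default_rng(0)

# An illustrative, slightly mixed (i.e. noisy) single-qubit state rho.
theta, phi = 0.7, 1.3
psi = np.array([np.cos(theta / 2), np.exp(1j * phi) * np.sin(theta / 2)])
rho = 0.9 * np.outer(psi, psi.conj()) + 0.1 * np.eye(2) / 2

paulis = {
    "X": np.array([[0, 1], [1, 0]], dtype=complex),
    "Y": np.array([[0, -1j], [1j, 0]]),
    "Z": np.array([[1, 0], [0, -1]], dtype=complex),
}
true_sq = {P: np.real(np.trace(M @ rho)) ** 2 for P, M in paulis.items()}

n_copies = 6000  # total copies of rho each strategy is allowed to consume

# --- Strategy 1: single copies, one Pauli basis at a time -----------------------
single_est = {}
for P, M in paulis.items():
    evals, evecs = np.linalg.eigh(M)                        # measurement basis for P
    probs = np.real(np.diag(evecs.conj().T @ rho @ evecs))  # Born-rule probabilities
    shots = rng.choice(evals, size=n_copies // 3, p=probs / probs.sum())
    single_est[P] = shots.mean() ** 2                       # each P gets 1/3 of the budget

# --- Strategy 2: pairs of copies, measured jointly in the Bell basis ------------
bell = np.array([[1, 0, 0, 1],      # |Phi+>
                 [1, 0, 0, -1],     # |Phi->
                 [0, 1, 1, 0],      # |Psi+>
                 [0, 1, -1, 0]]) / np.sqrt(2)
rho_pair = np.kron(rho, rho)        # two fresh copies of rho consumed per shot
bell_probs = np.real(np.einsum("bi,ij,bj->b", bell.conj(), rho_pair, bell))
# Each Bell outcome assigns a +/-1 sign to X^2, Y^2, Z^2; the signs average to the
# true squared expectation values (e.g. <Phi+|P(x)P|Phi+> = +1, -1, +1 for X, Y, Z).
sign_table = np.array([[+1, -1, +1],    # Phi+
                       [-1, +1, +1],    # Phi-
                       [+1, +1, -1],    # Psi+
                       [-1, -1, -1]])   # Psi-
outcomes = rng.choice(4, size=n_copies // 2, p=bell_probs / bell_probs.sum())
bell_est = dict(zip("XYZ", sign_table[outcomes].mean(axis=0)))

for P in "XYZ":
    print(f"<{P}>^2  true={true_sq[P]:.3f}  "
          f"single-copy={single_est[P]:.3f}  joint-Bell={bell_est[P]:.3f}")
```

On a single qubit the two strategies perform comparably; the separation the researchers proved shows up as the number of qubits grows, where, roughly speaking, joint measurements on pairs of copies can estimate the magnitudes of all Pauli observables from a number of experiments that grows only modestly with system size, while any strategy that measures one copy at a time needs exponentially many runs.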

But the new work goes beyond just the theoretical. “It’s crucial to understand if this is realistic, if this is something we could see in the lab or if this is just theoretical,” says Dorit Aharonov of Hebrew University in Jerusalem, who was not involved with the research.

So the researchers tested machine learning tasks with Google’s quantum computer, Sycamore (SN: 10/23/19). Rather than measuring a real quantum system, the team used simulated quantum data, and analyzed it using either quantum or classical techniques.

Quantum machine learning won out there, too, even though Google’s quantum computer is noisy, meaning errors can slip into calculations. Eventually, scientists plan to build quantum computers that can correct their own errors (SN: 6/22/20). But for now, even without that error correction, quantum machine learning prevailed.