
Saturday, August 7, 2021

Scientists just simulated quantum technology on classical computing hardware

Lurking in the background of the quest for true quantum supremacy is an awkward possibility: hyper-fast number-crunching based on quantum trickery might just be a load of hype.


Now, a pair of physicists from École Polytechnique Fédérale de Lausanne (EPFL) in Switzerland and Columbia University in the US have come up with a better way to judge the potential of near-term quantum devices – by simulating the quantum mechanics they rely upon on more traditional hardware.

Their study made use of a neural network developed by EPFL's Giuseppe Carleo and his colleague Matthias Troyer back in 2016, one that uses machine learning to approximate how a quantum system behaves when tasked with running a specific process.
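
The article doesn't walk through that machinery, but Carleo and Troyer's approach belongs to the family of "neural-network quantum states", in which a compact network such as a restricted Boltzmann machine stands in for the full quantum wavefunction. Below is a minimal sketch of the idea in Python; the network sizes, random parameters and the `rbm_amplitude` function are invented purely for illustration and aren't taken from the study.

```python
import numpy as np

# Illustrative restricted-Boltzmann-machine (RBM) wavefunction, in the spirit
# of neural-network quantum states. Sizes and random parameters are made up
# for this sketch; the actual study uses a far more elaborate setup.

rng = np.random.default_rng(0)
n_visible, n_hidden = 6, 12          # 6 spins (qubits), 12 hidden units

a = 0.01 * rng.standard_normal(n_visible)               # visible biases
b = 0.01 * rng.standard_normal(n_hidden)                # hidden biases
W = 0.01 * rng.standard_normal((n_hidden, n_visible))   # couplings

def rbm_amplitude(spins):
    """Unnormalised amplitude psi(s) for a spin configuration s in {-1, +1}^n."""
    theta = b + W @ spins
    return np.exp(a @ spins) * np.prod(2.0 * np.cosh(theta))

# |psi(s)|^2 is proportional to the probability of observing configuration s.
s = rng.choice([-1.0, 1.0], size=n_visible)
print(s, abs(rbm_amplitude(s)) ** 2)
```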

Known as the Quantum Approximate Optimization Algorithm (QAOA), the process sifts through a list of possible energy states to identify near-optimal solutions to a problem, solutions that should produce the fewest errors when applied.
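
For readers who want to see the shape of the algorithm, here is a deliberately tiny, brute-force depth-one QAOA run on a toy MaxCut problem. The four-node graph, the angle grid and the helper functions are assumptions made purely for illustration; the study's approximate simulation targets far larger circuits than anything a full state vector like this can hold.

```python
import numpy as np
from functools import reduce

# Toy, brute-force QAOA (depth p = 1) for MaxCut on a 4-node ring graph.
# Everything is simulated with an explicit state vector, which only works for
# a handful of qubits -- hence the interest in approximate methods.

edges = [(0, 1), (1, 2), (2, 3), (3, 0)]   # example graph (assumption)
n = 4
dim = 2 ** n

def bit(z, i):
    return (z >> i) & 1

# Cost of each computational basis state = number of cut edges.
cost = np.array([sum(bit(z, i) != bit(z, j) for i, j in edges)
                 for z in range(dim)], dtype=float)

def qaoa_state(gamma, beta):
    psi = np.full(dim, 1 / np.sqrt(dim), dtype=complex)    # |+...+> start state
    psi *= np.exp(-1j * gamma * cost)                       # cost (phase) layer
    x_rot = np.array([[np.cos(beta), -1j * np.sin(beta)],
                      [-1j * np.sin(beta), np.cos(beta)]])  # exp(-i*beta*X)
    mixer = reduce(np.kron, [x_rot] * n)                    # mixing layer
    return mixer @ psi

def expected_cut(gamma, beta):
    psi = qaoa_state(gamma, beta)
    return float(np.real(np.vdot(psi, cost * psi)))

# Crude grid search over the two angles.
best = max(((expected_cut(g, b), g, b)
            for g in np.linspace(0, np.pi, 20)
            for b in np.linspace(0, np.pi, 20)))
print("best expected cut:", best[0])
```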

"There is a lot of interest in understanding what problems can be solved efficiently by a quantum computer, and QAOA is one of the more prominent candidates," says Carleo.

The QAOA simulation developed by Carleo and Matija Medvidović, a graduate student at Columbia University, mimicked a 54-qubit device – sizeable, but well in line with the latest achievements in quantum tech.




While it was an approximation of how the algorithm would run on an actual quantum computer, it did a good enough job to stand in for the real thing.

Time will tell whether physicists of the future will be quickly crunching out ground states in an afternoon of QAOA calculations on a bona fide quantum machine, or taking their time using tried-and-true binary code.

Engineers are still making incredible headway in harnessing the spinning wheel of probability trapped in quantum boxes. Whether current innovations will ever be enough to overcome the biggest hurdles in this generation's attempt at quantum technology is the pressing question.

At the core of every quantum processor are units of calculation called qubits. Each represents a wave of probability, one without a single defined state yet robustly captured by a relatively straightforward equation.
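
For a single qubit, that straightforward equation boils down to two complex amplitudes whose squared magnitudes give the odds of reading out a 0 or a 1. A quick sketch, with amplitudes chosen arbitrarily for illustration:

```python
import numpy as np

# A single qubit's "wave" is just two complex amplitudes (alpha, beta)
# with |alpha|^2 + |beta|^2 = 1; the squared magnitudes are the
# probabilities of measuring 0 or 1.

alpha, beta = 1 / np.sqrt(2), 1j / np.sqrt(2)   # an equal superposition
state = np.array([alpha, beta])

probs = np.abs(state) ** 2
print(probs)         # [0.5 0.5] -- fifty-fifty odds of measuring 0 or 1
print(probs.sum())   # ~1.0, as a valid (normalised) state requires
```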

Link together enough qubits – what's known as entanglement – and that equation becomes increasingly complex.

As the linked qubits rise in number, from dozens to scores to thousands, the kinds of calculations their waves can represent will leave anything we can manage using classical bits of binary code in the dust.
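
How quickly classical hardware gets left in the dust is easy to estimate: an exact description of n entangled qubits needs 2^n complex amplitudes. The back-of-the-envelope arithmetic below (assuming 16 bytes per double-precision complex amplitude) shows why a 54-qubit device like the one mimicked in this study can't simply be brute-forced:

```python
# Exact classical simulation stores 2**n complex amplitudes for n qubits.
# At 16 bytes per complex128 amplitude, the memory bill grows like this:

for n in (10, 30, 54):
    amplitudes = 2 ** n
    memory_bytes = amplitudes * 16
    print(f"{n} qubits -> {amplitudes:.2e} amplitudes, {memory_bytes:.2e} bytes")
    # 54 qubits already lands in the hundreds of petabytes.
```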




But the whole process is like weaving a lace rug from spider silk: every wave is a breath away from entangling with its environment, resulting in catastrophic errors. While we can reduce the risk of such mistakes, there's no easy way right now to eliminate them altogether.

However, we might be able to live with the errors if there's a simple way to compensate for them. For now, the anticipated quantum speedup risks being a mirage physicists are desperately chasing.

"But the barrier of 'quantum speedup' is all but rigid and it is being continuously reshaped by new research, also thanks to the progress in the development of more efficient classical algorithms," says Carleo.

As tempting as it might be to use simulations to argue that classical computing retains an advantage over quantum machines, Carleo and Medvidović insist the approximation's ultimate benefit is to establish benchmarks for what could be achieved in the current era of newly emerging, imperfect quantum technologies.

Beyond that, who knows? Quantum technology is already enough of a gamble. So far, it's one that seems to be paying off nicely.

This research was published in npj Quantum Information.





#Tech | https://sciencespies.com/tech/scientists-just-simulated-quantum-technology-on-classical-computing-hardware/
