A University of Oxford study has used machine learning to overcome a major obstacle facing quantum devices. The results show, for the first time, how to bridge the “reality gap”: the discrepancy between the predicted and observed behavior of quantum devices. The findings are published in Physical Review X.
Quantum computing could greatly enhance numerous applications, including drug development, artificial intelligence, financial forecasting, and climate modeling. But this will require efficient methods for combining and scaling up individual quantum bits (qubits). A significant obstacle is inherent variability: even apparently identical units behave differently.
This functional variability is presumed to arise from nanoscale imperfections in the materials from which quantum devices are made. Because this internal disorder cannot be measured directly, it cannot be captured in simulations, which accounts for the discrepancy between predicted and observed outcomes.
To overcome this, the research team used a “physics-informed” machine learning approach to infer these disorder characteristics indirectly, based on how the internal disorder affected the flow of electrons through the device.
Using “crazy golf” as an analogy, lead researcher Associate Professor Natalia Ares of the University of Oxford’s Department of Engineering Science said: “The ball may enter a tunnel and exit with a speed or direction that doesn’t match our predictions. But with a few more shots, a crazy golf simulator, and some machine learning, we could get better at predicting the ball’s trajectory and narrow the reality gap.”
The researchers tested the approach on a quantum dot device, measuring the output current across it at a range of voltage settings. The data were fed into a simulation, which computed the difference between the measured current and the theoretical current expected in the absence of internal disorder. By monitoring the current at many different voltage settings, the simulation was constrained to find an arrangement of internal disorder that could explain the measurements at all voltages. This approach combined deep learning with mathematical and statistical techniques.
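The idea of inferring hidden disorder from current–voltage measurements can be illustrated with a toy sketch. Everything below is hypothetical: the `simulate_current` model (a disorder-shifted conductance peak with three made-up parameters) and the simple random-search optimiser stand in for the paper’s actual physical model and deep-learning machinery, which are far more sophisticated.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy model: current through a quantum dot depends on the gate
# voltage and an unknown internal disorder profile (here, just 3 numbers
# controlling the position, width, and height of a conductance peak).
def simulate_current(voltages, disorder):
    shift, width, scale = disorder
    return scale * np.exp(-((voltages - shift) ** 2) / (2 * width**2))

# "Measured" data: generated from a hidden disorder profile plus noise,
# standing in for real current measurements across many voltage settings.
true_disorder = np.array([0.3, 0.15, 1.0])
voltages = np.linspace(-1.0, 1.0, 50)
measured = simulate_current(voltages, true_disorder) \
    + 0.01 * rng.standard_normal(voltages.size)

# Mismatch between simulation and measurement, averaged over ALL voltage
# settings -- monitoring many settings is what pins down the disorder.
def loss(disorder):
    return np.mean((simulate_current(voltages, disorder) - measured) ** 2)

# Infer the disorder indirectly: adjust the simulator's disorder parameters
# until the simulated current reproduces the measurements. A crude random
# search replaces the paper's deep-learning optimiser; note that only the
# measured currents are used, never the hidden disorder itself.
best = np.array([0.0, 0.2, 0.5])
best_loss = loss(best)
for _ in range(5000):
    candidate = best + 0.02 * rng.standard_normal(3)
    if candidate[1] > 0.01 and (l := loss(candidate)) < best_loss:
        best, best_loss = candidate, l

print("inferred disorder:", np.round(best, 2))
print("remaining mismatch:", best_loss)
```

The key point mirrored here is that the disorder is never observed directly: it is recovered because only certain disorder profiles can reproduce the current at every voltage setting simultaneously.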
Associate Professor Ares added: “In the crazy golf analogy, it would be like placing a series of sensors along the tunnel, so that we could measure the ball’s speed at different points. Although we still can’t see inside the tunnel, the data help us make much better predictions of how the ball will behave when we take the shot.”
The new model not only identified internal disorder profiles consistent with the measured current values; it could also accurately predict the voltage settings required for specific device operating regimes.
Crucially, the model provides a new way to quantify the variability between quantum devices. This could enable more accurate predictions of device performance, help guide the engineering of optimal materials for quantum devices, and inform compensation strategies to mitigate the unwanted effects of material imperfections.
Co-author David Craig, a PhD student at the University of Oxford’s Department of Materials, added: “Similar to how we cannot observe black holes directly but infer their presence from their effect on surrounding matter, we have used simple measurements as a proxy for the internal variability of nanoscale quantum devices. Although the real device is still more complex than the model can capture, our study has demonstrated the utility of physics-aware machine learning in closing the reality gap.”