Since 2012, the amount of compute used in the largest AI training runs has grown exponentially, doubling every 3.4 months (compared with Moore’s Law’s two-year doubling period), as found by OpenAI.
The three key pillars that drive AI forward are algorithmic innovation, data (either supervised or unsupervised), and the amount of computing power available for training. Of the three, algorithmic innovation and data are quite difficult to measure; computing power, however, is not only quantifiable but bound to see a significant rise.
AI needs a quantum hand
With growing investment in R&D for AI programming and computer hardware, artificial intelligence has progressed substantially over time. Now, AI needs quantum computing to achieve its next significant leap. Before delving into the issue, let’s understand what quantum AI (QAI) is. QAI is the use of quantum computing to run machine learning algorithms. Thanks to its computational advantages, quantum AI can help reach results that are hard to derive with traditional computers, as it:
Widens the scope: Quantum computing starts where the boundaries of classical binary computers end. Rather than being limited to Boolean logic on 1s and 0s, QC allows us to apply linear algebra to quantum bits, or qubits, whose states are described by complex vectors and manipulated with matrices, and which exhibit quantum phenomena such as superposition, entanglement, and interference.
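The linear-algebra view described above can be sketched in a few lines of NumPy: a qubit state is a complex 2-vector, and a gate (here the standard Hadamard gate) is a matrix that, applied to the basis state |0⟩, produces an equal superposition. This is a minimal illustration, not a full quantum simulator.

```python
import numpy as np

# A qubit's state is a complex 2-vector; [1, 0] is the basis state |0>.
zero = np.array([1, 0], dtype=complex)

# The Hadamard gate, a 2x2 unitary matrix, maps a basis state
# into an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

superposed = H @ zero

# Measurement probabilities are the squared magnitudes of the amplitudes:
# each outcome here has probability ~0.5.
probs = np.abs(superposed) ** 2
print(probs)
```

Applying H a second time returns the state to |0⟩, which is one way interference shows up in the maths.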
Executes behemoth tasks with ease: Quantum computing holds the potential to tackle complex computational tasks beyond the reach of traditional computers. Take, for instance, brute-forcing the key used to encrypt a piece of data with a 256-bit algorithm. AES-256-encrypted data is deemed secure against brute-force attack: cracking it is possible in principle, but with the current set of available technologies it would take thousands of years, hence it is practically impossible. Quantum computers could, in principle, speed up exactly this kind of exhaustive search.
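A back-of-the-envelope calculation makes the scale concrete. The attack rate below (a trillion keys per second) is an illustrative assumption, and the quantum figure assumes Grover's search, which reduces an exhaustive search over 2^256 keys to roughly 2^128 steps, a quadratic rather than exponential speedup.

```python
# Rough estimate of brute-force time against a 256-bit keyspace.
keyspace = 2 ** 256
rate = 10 ** 12                      # keys tried per second (assumed)
seconds_per_year = 365 * 24 * 3600

classical_years = keyspace // rate // seconds_per_year

# Grover's algorithm would need on the order of 2**128 search steps.
grover_years = (2 ** 128) // rate // seconds_per_year

# Print only the order of magnitude, since the exact figures are meaningless.
print(f"classical: ~1e{len(str(classical_years)) - 1} years")
print(f"with Grover: ~1e{len(str(grover_years)) - 1} years")
```

Even with the quadratic speedup, the search remains far beyond any realistic timescale, which is why 256-bit symmetric keys are considered quantum-resistant in practice.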
Fulfils training demands: In the first week of April 2022, Google AI introduced the Pathways Language Model (PaLM), with 540 billion parameters. Deep learning, the most recent generation of machine learning, is pushing the boundaries of what standard computers can manage, and future models will take much longer to train as parameter counts grow into the trillions.
Ensures speed with accuracy: The success of AI systems relies heavily on data. Global data creation is increasing enormously and, according to one study, is projected to exceed 180 zettabytes. Quantum computers are designed to handle massive amounts of data while swiftly revealing patterns and recognising anomalies. They can also manage and integrate numerous data sets from multiple sources.
Strengthens fraud detection: The application of quantum computing to AI in the BFSI sector will help enhance fraud detection. Simply put, models trained on quantum computers will be good enough to spot patterns that are difficult to detect with traditional methods. Further to this, QCs can offer precise insights from humongous data sets in very little time, helping institutions customise products for their customers.
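To make the idea of "spotting patterns" less abstract, here is a deliberately classical stand-in: a minimal z-score anomaly detector over transaction amounts. The data and the two-standard-deviation threshold are illustrative assumptions; real fraud systems (quantum-assisted or not) use far richer models.

```python
import statistics

# Illustrative transaction amounts; the last one is an obvious outlier.
amounts = [120.0, 95.5, 110.2, 102.3, 98.7, 105.1, 5000.0]

mean = statistics.mean(amounts)
stdev = statistics.stdev(amounts)

# Flag transactions more than 2 standard deviations from the mean.
anomalies = [a for a in amounts if abs(a - mean) > 2 * stdev]
print(anomalies)  # the 5000.0 transaction is flagged
```

The hope expressed in the article is that quantum-trained models could find much subtler patterns than a simple statistical rule like this one.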
Hurdles at hand
A cautious approach is required: it would be premature to put QCs into action and expect their true potential to be realised before some of the existing challenges are solved.
Technical obstacles: Quantum computing works with qubits, which are volatile in nature. A qubit can exist in a superposition of zero and one, and qubits can interact with one another. Managing these interactions is extremely difficult, and the volatility of qubits means inputs can be lost or altered, producing inaccurate results.
High sensitivity: Isolating qubits from their environment is a problem that persistently plagues researchers. Heat and light can produce quantum decoherence: when qubits are exposed to these conditions, they lose quantum features such as entanglement, and the data contained in the qubits is lost.
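Decoherence can be pictured with a toy model: a qubit's phase coherence decays roughly exponentially with time, governed by a coherence time T2. The T2 value below is an illustrative assumption (of the order seen in some superconducting qubits), not a measured figure.

```python
import math

# Illustrative coherence time in seconds (assumed).
T2 = 100e-6

def coherence(t):
    """Fraction of a qubit's phase coherence surviving after time t,
    under a simple exponential-decay model."""
    return math.exp(-t / T2)

for t in (1e-6, 100e-6, 500e-6):
    print(f"after {t * 1e6:.0f} us: {coherence(t):.3f} of coherence remains")
```

After one coherence time only about 37% of the phase information survives, which is why computations must finish, or be error-corrected, well within that window.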
To sum up, there are plenty of doubts about when the two highly-hyped technologies, AI and quantum computing, can come together. “Few concepts in computer science cause as much excitement—and perhaps as much potential for hype and misinformation—as quantum machine learning,” stated IBM in its July 2021 blog.
Source: indiaai.gov.in