IBM Researchers Propose A Quantum Kernel Algorithm That, Given Only Classical Access To Data, Provides A Provable Exponential Speedup Over Classical Machine Learning Algorithms

Many quantum machine learning algorithms are believed to provide exponential speed-ups over classical machine learning (ML) approaches, based on the assumption that classical data can be provided to the algorithm in the form of quantum states. Yet no study has shown that such data can be loaded into quantum states efficiently.

Many quantum machine learning algorithms have been proposed, but without proof of their performance. These proposals are driven by the challenge of finding algorithms that can be implemented and tested on near-term hardware using only conventional access to the data. One such proposal is quantum-enhanced feature spaces, also known as quantum kernel methods, in which a quantum computer handles only one part of the overall algorithm.

However, whether these methods genuinely outperform classical machine learning algorithms has remained an open question.

Recently, IBM researchers developed a specific classification task for which quantum kernel methods are provably better than traditional algorithms. Their algorithm is given only classical access to data and provides a provable exponential speed-up over classical ML algorithms for certain classification problems.

It pairs a time-proven conventional machine learning model with a kernel function, estimated on a quantum computer, that determines which features of the data are essential for classification. This also enables the construction of a family of datasets in which only quantum computers can recognize the intrinsic labeling patterns, while to classical computers the data looks like random noise.
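To make this division of labor concrete, here is a minimal Python sketch of the classical half of a kernel method, using scikit-learn's precomputed-kernel mode. The `quantum_kernel_matrix` function is a hypothetical stand-in: in the actual algorithm its entries would be overlaps of quantum feature states estimated on quantum hardware, whereas here a classical kernel is substituted purely so the example runs.

```python
import numpy as np
from sklearn.svm import SVC

# Hypothetical placeholder: in the actual method each entry K[i, j]
# would be estimated on a quantum computer as the overlap of the
# quantum feature states of x_i and x_j. A classical RBF-style kernel
# is used here only so the sketch runs end to end.
def quantum_kernel_matrix(X_rows, X_cols):
    diffs = np.subtract.outer(X_rows.ravel(), X_cols.ravel())
    return np.exp(-0.5 * diffs ** 2)

X_train = np.array([0.1, 0.4, 0.8, 1.2])
y_train = np.array([+1, +1, -1, -1])
X_test = np.array([0.2, 1.0])

# The classical half of the algorithm: a support vector machine
# trained directly on the precomputed (quantum-estimated) kernel.
svm = SVC(kernel="precomputed")
svm.fit(quantum_kernel_matrix(X_train, X_train), y_train)
print(svm.predict(quantum_kernel_matrix(X_test, X_train)))
```

The design point is that the SVM never sees the feature states themselves, only the kernel matrix, so the quantum computer is needed just for estimating those entries.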

Demonstrating the Quantum Algorithm's Advantage

To show the power of the quantum method, they used the problem of computing logarithms in a cyclic group, known as the discrete log problem. A cyclic group is one in which all members can be generated by repeatedly applying a single mathematical operation, and the discrete log problem is widely believed to separate classical from quantum computation: a quantum computer can solve it efficiently with Shor's algorithm, while no efficient classical algorithm is known.
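The asymmetry can be seen in a minimal sketch (the toy parameters p = 23 and g = 5 are illustrative only; real instances use primes hundreds of digits long): exponentiation in the group is cheap, but inverting it classically is only known through searches that scale badly with the group size.

```python
# Toy-sized parameters: real instances use primes hundreds of digits long.
p, g = 23, 5          # prime modulus and a generator of the cyclic group Z_p^*

# Forward direction is easy: modular exponentiation takes O(log x) steps.
x = 13
y = pow(g, x, p)      # y = g^x mod p

# Inverse direction (the discrete log) has no known efficient classical
# algorithm; the naive search below scales linearly in the group size.
dlog = next(k for k in range(p - 1) if pow(g, k, p) == y)
assert dlog == x
```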

They built a family of classification problems based on the discrete log to exploit this structure. Assuming the hardness of the discrete log problem, no efficient classical algorithm can do better than random guessing on this family. Furthermore, they created a quantum feature map to view the data in a higher-dimensional space and extract the patterns needed to predict labels with high accuracy. Importantly, the high accuracy can be demonstrated even in the presence of the finite sampling noise that arises from taking measurements.
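A rough sketch of how such a dataset family can be generated follows. The half-interval labeling rule mirrors the construction described in the paper, but the tiny prime, the choice of generator, and the brute-force log recovery are illustrative conveniences; at realistic sizes no efficient classical procedure could compute these labels, which is exactly what makes the data look like noise to a classical learner.

```python
import random

# Toy parameters for illustration only; at cryptographic sizes the labels
# below could not be computed classically, which is the point of the paper.
p, g = 23, 5                      # prime and a generator of Z_p^*
s = random.randrange(p - 1)       # secret offset defining the concept
half = (p - 3) // 2               # half the range of possible logs

def label(x):
    # Brute-force recovery of log_g(x), feasible only at this toy scale.
    dlog = next(k for k in range(p - 1) if pow(g, k, p) == x)
    # +1 if the discrete log lands in the half-interval starting at s.
    return +1 if (dlog - s) % (p - 1) <= half else -1

# To a classical learner, these (x, label) pairs look like random noise.
dataset = [(x, label(x)) for x in range(1, p)]
print(dataset)
```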

The team acknowledges that there are many real-world problems for which this quantum algorithm will not outperform conventional classical ML algorithms; a classification problem must have the aforementioned cyclic structure to benefit from the quantum approach.

In future work, they intend to generalize this structure and to run the quantum kernel estimation algorithm on use cases that may be beyond the reach of the quantum hardware available at present.

This study adds significant value to the quantum machine learning domain by demonstrating an end-to-end quantum speed-up for a fault-tolerant quantum kernel method implemented under realistic assumptions. The team's priority is to deliver rigorously proven quantum advantages with robust speed-ups, while also working to build a quantum ecosystem that can generate further research.

Paper: https://arxiv.org/pdf/2010.02174.pdf

Source: https://research.ibm.com/blog/quantum-kernels

Tanushree Shenwai is a consulting intern at MarktechPost. She is currently pursuing her B.Tech from the Indian Institute of Technology (IIT), Bhubaneswar. She is a Data Science enthusiast and has a keen interest in the scope of application of artificial intelligence in various fields. She is passionate about exploring new advancements in technologies and their real-life applications.
