Can Entanglement-enhanced Quantum Kernels Improve Data Classification?
Abstract
Classical machine learning, extensively utilized across diverse domains, faces limitations in speed, efficiency, parallelism, and the processing of complex datasets. In contrast, quantum machine learning algorithms offer significant advantages, including exponentially faster computations, enhanced data handling capabilities, inherent parallelism, and improved optimization for complex problems. In this study, we used an entanglement-enhanced quantum kernel in a quantum support vector machine (QSVM) to train on complex respiratory datasets. Compared to classical algorithms, our findings reveal that the QSVM, executed on a 2-qubit system, performs better, achieving approximately 45% higher accuracy on complex respiratory datasets while maintaining comparable performance on linear datasets. Through our study, we investigate the efficacy of the QSVM-Kernel algorithm in harnessing the enhanced dimensionality of the quantum Hilbert space for effectively training on complex datasets.
I Introduction
In the ever-evolving landscape of machine learning across various sectors, from accelerating industrial automation to revealing fundamental aspects of nature, machine learning algorithms have demonstrated remarkable efficacy in processing and analyzing data across multiple dimensions.[1, 2, 3] However, the performance of ML algorithms depends strongly on the input dataset, and they face limitations when training on random datasets and performing intricate optimizations. Classical algorithms, such as classical support vector machines (SVMs), are extensively utilized to solve problems in diverse domains. Their strength lies in their ability to solve classification problems effectively, particularly through the use of kernel functions, and their capability to handle nonlinear relationships between features makes them suitable for a wide range of applications, including bioactivity modeling, protein classification, and image enhancement.[4, 5] As the feature space becomes large and the kernel functions become computationally expensive to estimate, SVMs face challenges in solving such problems successfully. The choice of kernel function, kernel parameters, and regularization parameter is key to training effectively on a given dataset [5]. Additionally, the computational cost of increasing the non-linearity of kernels can lead to higher power consumption, posing practical challenges in real-world applications. [6, 7]

In contrast, quantum machine learning algorithms, including quantum support vector machines, offer advantages in speed, efficiency, and parallel processing of complex datasets compared to their classical counterparts.[8, 9, 10] Quantum machine learning algorithms have been utilized for various tasks, including drug discovery [11], classification of particles produced at the Large Hadron Collider (LHC) [12], quantum anomaly detection [13], electronic structure calculations [14], and healthcare monitoring [15]. The quantum SVM offers a significant speed-up in overall run-time complexity compared to its classical counterpart. [16] The inherent volatility of random data, its high-dimensional feature space, and the absence of clear patterns compromise accuracy and computational efficiency. Despite concerted efforts to enhance the performance of classical SVMs on such datasets through custom kernel functions and dimensionality reduction techniques, the problem persists. The ZZ feature map of the quantum support vector machine (QSVM) plays a crucial role in transforming random data into a higher-dimensional space, thereby enhancing the training of the QSVM in comparison to the classical SVM. It is a non-linear mapping that extracts local properties of the input data, allowing for a more effective representation of the data in a higher-dimensional space. [17] This transformation is significant, as it changes the relative positions between data points, making the dataset easier to classify in the feature space. [8, 9] Additionally, the QSVM kernel method utilizes the large dimensionality of the quantum Hilbert space to replace the classical feature space, further enhancing the discriminative power of the QSVM. [18]
In this work, we used a QSVM to classify a random dataset of different breathing patterns acquired with a piezoelectric sensor. By merging the principles of quantum computing with the established SVM framework, our approach harnesses the intrinsic parallelism of the quantum realm and its ability to exploit superposition and entanglement. Using quantum-enhanced kernel functions, the kernel-enhanced quantum SVM (KQ-SVM) navigates the intricacies of random data distributions and offers a viable solution to the limitations of the classical SVM. Through empirical analyses spanning randomness-infused datasets, our research validates the superior performance of the KQ-SVM, with approximately 45% higher accuracy than its classical counterparts. Thus, our study makes a pivotal advancement in quantum machine learning, setting a precedent for future explorations of the integration of quantum computing into data analysis.
II Methods
Kernel methods and quantum computing represent two intriguing yet distinct approaches to deciphering complex data, and while both have their merits, quantum algorithms, particularly quantum support vector machines (QSVMs), demonstrate superiority, especially when dealing with random datasets. Kernel methods rely on the application of kernel functions to project data into a higher-dimensional feature space, unraveling intricate relationships within the data. This method, while effective, operates within the constraints of classical computation. Quantum computing, in contrast, leverages the principles of quantum mechanics, utilizing qubits that exhibit superposition and entanglement to manipulate information in ways beyond classical capabilities. [19, 20] Quantum SVMs, specifically designed for quantum computers, provide a unique advantage by harnessing the power of quantum parallelism to process information more efficiently than classical SVMs. One notable distinction lies in the data representation paradigms employed by these approaches. Kernel methods represent data as points residing within the feature space, a representation limited by the classical computational framework. [21, 22] Quantum computers, on the contrary, utilize qubits existing in a vast Hilbert space, allowing for a more nuanced and flexible representation of the data. This fundamental difference underscores the diverse avenues through which information can be captured and manipulated, giving quantum algorithms an edge in handling complex, unpredictable datasets.
Although kernel methods have excelled in various machine learning tasks, boasting a well-established theoretical framework and diverse algorithms, they may face challenges when dealing with highly random datasets where the underlying patterns are elusive and non-linear. Quantum SVMs, on the other hand, offer a promising solution to this issue. The inherent quantum parallelism allows these algorithms to explore multiple solutions simultaneously, providing a more robust approach to capturing intricate patterns in seemingly chaotic data. Such problems are computationally demanding, and classical SVMs and kernel methods may struggle with them due to their inherent limitations. The quantum advantage lies in the ability to process large amounts of information in parallel, offering a potential breakthrough for solving problems that were once deemed impractical for classical computation.
The captivating journey into the heart of a quantum support vector machine (QSVM) is a meticulous exploration of the intricate dance of quantum states, feature transformations, and learning algorithms that orchestrate this powerful machine learning tool. The journey begins with the preparation of qubits, the fundamental building blocks of quantum computation, in a specific configuration, laying the foundation for subsequent transformations. [22] The dynamical map then takes center stage, orchestrating the evolution of the quantum state under the combined influence of the input data and the chosen kernel function. [19, 23] This map acts as a translator, encoding the complex relationship between raw data and the feature space where classification ultimately occurs. [8] As the qubits evolve through this map, their state transforms into the evolved density matrix, reflecting the inherent uncertainty that defines the quantum realm. The measured feature vector then collapses the quantum wavefunction, transforming the probabilistic quantum state into a concrete classical vector suitable for classification algorithms. This vector serves as the bridge between the quantum realm and the classical world, carrying the distilled essence of the data within the feature space. The feature map plays a pivotal role in this transformation, acting as a portal that transports the data from its original input space to a higher-dimensional realm known as the feature space. Within this expanded canvas, complex relationships between data points that were previously hidden can become readily apparent, potentially leading to superior classification accuracy in challenging datasets. The training function plays a crucial role in guiding the behavior of the dynamical map and the resulting feature map, ultimately enabling the QSVM to navigate the vast feature space and distinguish between classes effectively. By meticulously optimizing this function through a training process, the QSVM gradually refines its ability to separate the data in the feature space, ultimately leading to more accurate classifications (Figure 1). [10, 24] Classical SVM seeks to find a hyperplane that maximizes the margin between the two classes. The decision function f(x) for SVM is
f(x) = \mathrm{sign}\big(\mathbf{w}\cdot\varphi(x) + b\big)  (1)
where \varphi : \mathbb{R}^d \to F is the feature map that transforms the input data into a higher-dimensional feature space F, \mathbf{w} is the weight vector, and b is the bias. The optimization problem is
\min_{\mathbf{w},\, b}\ \tfrac{1}{2}\lVert\mathbf{w}\rVert^{2} \quad \text{subject to} \quad y_i\big(\mathbf{w}\cdot\varphi(x_i) + b\big) \geq 1,\ \ i = 1, \dots, N  (2)
The classical SVM uses the feature map \varphi(x) to map the input data into the higher-dimensional feature space F, with the kernel function
K(x_i, x_j) = \langle \varphi(x_i), \varphi(x_j) \rangle  (3)
This leads to the following decision function of the SVM:
f(x) = \mathrm{sign}\Big(\textstyle\sum_{i=1}^{N} \alpha_i\, y_i\, K(x_i, x) + b\Big)  (4)
Here, \alpha_i are the Lagrange multipliers, y_i are the labels, and b is the bias term.
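For concreteness, the classical kernel trick of Eqs. (3) and (4) can be sketched with scikit-learn's SVC and a precomputed Gram matrix; this is a minimal illustration, with the RBF kernel and the toy data chosen here purely as assumptions for the example.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics.pairwise import rbf_kernel

# Toy two-class data (illustrative only).
rng = np.random.default_rng(0)
X_train = rng.normal(size=(40, 2))
y_train = (X_train[:, 0] * X_train[:, 1] > 0).astype(int)
X_test = rng.normal(size=(10, 2))

# Eq. (3): Gram matrix K(x_i, x_j) = <phi(x_i), phi(x_j)>, here via an RBF kernel.
K_train = rbf_kernel(X_train, X_train, gamma=1.0)
K_test = rbf_kernel(X_test, X_train, gamma=1.0)

# Eq. (4): SVC with a precomputed kernel learns the alpha_i and the bias b.
clf = SVC(kernel="precomputed", C=1.0)
clf.fit(K_train, y_train)
predictions = clf.predict(K_test)
```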
A quantum feature map, in contrast, maps the input x to a quantum state |\phi_q(x)\rangle in a Hilbert space \mathcal{H}_q,
|\phi_q(x)\rangle = U(x)\,|0\rangle^{\otimes n}  (5)
where U(x) is a quantum circuit parameterized by x. Entangling gates such as the CNOT gate create correlations between qubits,
\mathrm{CNOT}\,|a\rangle|b\rangle = |a\rangle|a \oplus b\rangle, \qquad a, b \in \{0, 1\}  (6)
which leads to the quantum kernel, defined as an inner product between quantum states,
K_q(x_i, x_j) = \big|\langle \phi_q(x_i) \,|\, \phi_q(x_j) \rangle\big|^{2}  (7)
Entanglement-Enhanced Quantum Kernel
Quantum feature maps embed data into an exponentially larger space, enabling better separation of complex data distributions:
|\phi_q(x)\rangle \in \mathcal{H}_q, \qquad \dim \mathcal{H}_q = 2^{n}  (8)
Entangled states represent dependencies between features more effectively than classical methods, since in general they cannot be factored into single-qubit states,
|\phi_q(x)\rangle \neq |\phi^{(1)}(x)\rangle \otimes |\phi^{(2)}(x)\rangle \otimes \cdots \otimes |\phi^{(n)}(x)\rangle  (9)
The quantum kernel naturally incorporates non-linear boundaries, making it well suited to datasets with complex structures,
K_q(x_i, x_j) = \big|\langle 0|^{\otimes n}\, U^{\dagger}(x_i)\, U(x_j)\, |0\rangle^{\otimes n}\big|^{2}  (10)
Adding further entanglement maps the input to an entangled quantum state |\phi_{q,e}(x)\rangle,
|\phi_{q,e}(x)\rangle = U_e(x)\,|0\rangle^{\otimes n}  (11)
where U_e(x) is an entangling circuit conditioned on the input x. This leads to the entanglement-enhanced quantum kernel
K_{q,e}(x_i, x_j) = \big|\langle \phi_{q,e}(x_i) \,|\, \phi_{q,e}(x_j) \rangle\big|^{2}  (12)
and the corresponding decision function
f(x) = \mathrm{sign}\Big(\textstyle\sum_{i=1}^{N} \alpha_i\, y_i\, K_{q,e}(x_i, x) + b\Big)  (13)
The higher accuracy of the quantum SVM is attributed to the entangled quantum states, which map data to a much higher-dimensional space than classical or non-entangled quantum mappings, capture intricate correlations between features, and represent complex patterns more effectively; the decision function in Eq. (13) then leverages this enhanced kernel.
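As a concrete illustration of Eqs. (11) and (12), the entanglement-enhanced kernel matrix can be assembled from statevector overlaps of an entangling feature map; the sketch below assumes Qiskit's ZZFeatureMap as the entangling map U_e(x), with the number of repetitions and the toy samples chosen arbitrarily.

```python
import numpy as np
from qiskit.circuit.library import ZZFeatureMap
from qiskit.quantum_info import Statevector

def entangled_kernel_matrix(X1, X2, reps=2):
    """Eq. (12): K_{q,e}(x_i, x_j) = |<phi_{q,e}(x_i)|phi_{q,e}(x_j)>|^2, using an
    entangling ZZ feature map (Hadamards plus CNOT-based ZZ phase rotations)."""
    fmap = ZZFeatureMap(feature_dimension=X1.shape[1], reps=reps, entanglement="full")
    states1 = [Statevector(fmap.assign_parameters(x)) for x in X1]
    states2 = [Statevector(fmap.assign_parameters(x)) for x in X2]
    return np.array([[abs(s1.inner(s2)) ** 2 for s2 in states2] for s1 in states1])

# Example Gram matrix for four 2-dimensional samples; it can be passed to a
# classical SVC with kernel="precomputed", which realizes Eq. (13).
X = np.array([[0.1, 0.9], [0.8, 0.2], [0.4, 0.5], [0.9, 0.7]])
print(entangled_kernel_matrix(X, X))
```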
III Results and discussion
To investigate the power of kernel-enhanced quantum machine learning (QML) models, such as the kernel-enhanced quantum support vector machine (KQ-SVM), on random datasets in comparison to classical SVMs, several datasets have been selected: the breast cancer dataset, the Iris dataset, and the randomly generated respiratory datasets. Figure 2(a) provides a visual representation of the respiratory dataset in a two-dimensional feature space, where f1 on the x-axis and f2 on the y-axis represent the two features. The breast cancer dataset has been taken as a linear dataset, in which the two classes are distinguishable (Figure 2(b)). A quantum circuit of 2 qubits, comprising two Hadamard gates and two controlled-X (CNOT) entangling gates, has been utilized to perform the quantum measurements for both datasets (Figure 2c). The kernel-enhanced QSVM has been found to perform more accurately, with approximately 45% higher accuracy for the randomly acquired respiratory dataset, while providing almost comparable performance for the well-separated classes of the breast cancer dataset.
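A minimal sketch of such a 2-qubit entangling feature-map circuit is shown below; it assumes a single repetition of a ZZ-type map (as implemented in Qiskit's ZZFeatureMap), so the exact gate sequence and phase angles are illustrative rather than the circuit used in the experiment.

```python
from math import pi
from qiskit import QuantumCircuit
from qiskit.circuit import ParameterVector

# Hypothetical 2-qubit entangling feature-map circuit: Hadamards create
# superposition, single-qubit phases encode the features f1 and f2, and the
# CNOT-phase-CNOT block entangles the two qubits.
x = ParameterVector("x", 2)
qc = QuantumCircuit(2)
qc.h([0, 1])
qc.p(2 * x[0], 0)
qc.p(2 * x[1], 1)
qc.cx(0, 1)
qc.p(2 * (pi - x[0]) * (pi - x[1]), 1)  # ZZ-type interaction term
qc.cx(0, 1)
print(qc.draw())
```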

This approach holds promise for more accurate classification by capturing complex relationships within the data. [25] Moreover, the learning curve of the quantum circuit shows that as the circuit depth increases, the training accuracy increases, indicating an ability to grasp more refined patterns in the data. This suggests that the model effectively uses quantum computing to learn complex relationships and improve its diagnostic capability.


Understanding the specific operations and interactions within this circuit is crucial for interpreting its predictions and ensuring its transparency and reliability in medical applications. [26, 27] The potential of combining kernel methods and quantum computing for breast cancer diagnosis is further supported by the literature.
To extend our investigation beyond binary classes, we have utilized the Iris dataset, which has three classes. Figure 3a presents a scatter plot of instances of the dataset with features relevant to classification; however, the lack of clear labeling obscures the specific attributes used for classification, making precise interpretation difficult. The quantum circuit's learning curve shows an increase in training accuracy as the depth of the circuit increases, indicating its ability to grasp more refined patterns in the data (Figure 3b). The increased accuracy in Figure 3b prompts further investigation to confirm whether it indicates successful information extraction or potential overfitting. Interpreting these results, and discerning the potential advantages of this approach, requires an understanding of the quantum circuit's functionality, its specific gates, and their connections. The 4-qubit quantum circuit consists of four Hadamard gates to create the entanglement (Figure 3c). Additionally, the ability of quantum circuits to uniformly address the Hilbert space has been linked to classification accuracy, emphasizing the relevance of quantum computing in machine learning tasks. [28, 29]
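As a rough illustration of this multiclass setting, a 4-qubit fidelity-kernel QSVC can be trained on the Iris features; the sketch below is a minimal example whose repetition count, entanglement pattern, and train/test split are assumptions rather than values taken from the study.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import MinMaxScaler
from qiskit.circuit.library import ZZFeatureMap
from qiskit_machine_learning.kernels import FidelityQuantumKernel
from qiskit_machine_learning.algorithms import QSVC

# Four Iris features map onto four qubits.
X, y = load_iris(return_X_y=True)
X = MinMaxScaler().fit_transform(X)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, stratify=y, random_state=42)

feature_map = ZZFeatureMap(feature_dimension=4, reps=2, entanglement="full")
qsvc = QSVC(quantum_kernel=FidelityQuantumKernel(feature_map=feature_map))
qsvc.fit(X_tr, y_tr)  # the underlying SVC handles the three classes one-vs-one
print("Iris test accuracy:", qsvc.score(X_te, y_te))
```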
The experiment depicted in Figure 4a explores the impact of different kernel types and learning rates on the performance of a machine learning model. The study involved a linear kernel, a polynomial kernel, a radial basis function (RBF) kernel, and a sigmoid kernel, with variations in the learning rate for each kernel. Performance was evaluated on both the training and test data. The findings revealed that the choice of kernel and learning rate significantly influences model performance. For example, the RBF kernel with a learning rate of 0.01 exhibited the highest accuracy of 80% on the training data but the lowest accuracy of 30% on the test data, indicating potential overfitting. In contrast, the linear kernel with a learning rate of 0.5 achieved the best performance on the test data with an accuracy of 60%, suggesting better generalization to unseen data; however, it showed a lower accuracy of 57% on the training data, indicating potential underfitting. The other kernels yielded mixed results, with the polynomial kernel achieving 53% accuracy on the training data and 50% on the test data, and the sigmoid kernel achieving 48% accuracy on the training data and 40% on the test data. These results underscore the critical importance of carefully selecting the kernel and learning rate for machine learning models. The evaluation metrics for training the classical and quantum algorithms indicate that there is little deviation in accuracy when training on the linear data, while for the random datasets, quantum machine learning performs better, with approximately 45% higher accuracy (Figure 4b). Figure 4b reports the evaluation metrics, namely accuracy, precision, recall, and F1 score, for the different datasets, where i indicates the Iris dataset, r the randomly generated respiratory dataset, and b the breast cancer dataset. The optimal choice depends on the specific problem and dataset, emphasizing the need for experimentation to identify the best combination for a given task. [30, 31, 32, 33] A minimal classical sketch of such a kernel sweep and its evaluation metrics is given below.
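The following is a minimal sketch of that comparison using scikit-learn; the dataset, the fixed C/gamma values standing in for the reported learning-rate sweep, and the train/test split are assumptions for illustration only.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score

X, y = load_breast_cancer(return_X_y=True)
X = StandardScaler().fit_transform(X)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, stratify=y, random_state=42)

# Sweep the four kernel families discussed above and report the metrics of Figure 4b.
for kernel in ["linear", "poly", "rbf", "sigmoid"]:
    clf = SVC(kernel=kernel, C=1.0, gamma="scale").fit(X_tr, y_tr)
    y_pred = clf.predict(X_te)
    print(f"{kernel:8s}"
          f" acc={accuracy_score(y_te, y_pred):.2f}"
          f" prec={precision_score(y_te, y_pred):.2f}"
          f" rec={recall_score(y_te, y_pred):.2f}"
          f" f1={f1_score(y_te, y_pred):.2f}")
```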
IV Conclusion
Classical SVMs often struggle with complex and randomly distributed datasets, compromising their accuracy and efficiency. Our proposed KQ-SVM leverages quantum-enhanced kernel functions and quantum parallelism to address these challenges. Empirical analysis across diverse datasets shows KQ-SVM significantly outperforms classical SVMs, achieving over 45% higher accuracy on complex datasets while maintaining comparable performance on linear datasets. This research demonstrates the transformative potential of quantum computing in machine learning, paving the way for enhanced performance and accuracy in real-world applications.
V Experimental Section
V.1 Fabrication and Characterization of the Sensor
The nylon-11 nanofibers were produced using the electrospinning technique. A 10 wt% PVDF solution was prepared in a mixed solvent of trifluoroacetic acid and acetone in a 6:4 ratio and heated at 60 °C for 6 hours. The solution was then loaded into a 10 ml syringe, a voltage of 18 kV was applied to the syringe tip, and the produced nanofibers were collected on a rotating drum collector at 1200 rpm. The nanofiber mat was sandwiched between aluminum electrodes to fabricate the piezoelectric sensor.
V.2 Characterization Techniques
A digital storage oscilloscope (DSOX1102G, Keysight) was used to acquire the open-circuit voltage and respiratory signals. All measurements were acquired non-invasively on one of the authors and on volunteers. Written consent was obtained prior to data recording.
V.3 Development of Machine Learning and Quantum Machine Algorithms
V.3.1 Quantum Machine Learning Algorithms
We utilized the pandas library for data manipulation and scikit-learn for data preprocessing tasks such as feature scaling, dimensionality reduction, and train-test splitting. The Wisconsin Breast Cancer (Diagnostic) dataset was obtained from the UCI Machine Learning Repository. We used Qiskit for the quantum computing functionalities of the quantum-based classification model, Qiskit Machine Learning for implementing quantum kernels and QSVC, and Qiskit Algorithms for algorithmic support. Additionally, we employed scikit-learn for the traditional machine learning models.
Our quantum-based model consisted of feature mapping using the ZZFeatureMap from Qiskit's circuit library, a fidelity quantum kernel implemented with the FidelityQuantumKernel class from Qiskit Machine Learning, and the QSVC model for training and classification. For comparison, we trained a classical support vector classifier (SVC) using scikit-learn. We initialized the experiment with a fixed random seed for reproducibility and split the dataset into training and testing sets with an 80:20 ratio using stratified sampling. All quantum measurements were carried out on the IBM Quantum computing platform. A minimal sketch of this workflow is given below.
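The sketch below assembles that pipeline end to end; the PCA reduction to two components, the seed value, and the repetition count of the feature map are assumptions used only to make the example self-contained and runnable.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import MinMaxScaler
from sklearn.svm import SVC
from qiskit.circuit.library import ZZFeatureMap
from qiskit_machine_learning.kernels import FidelityQuantumKernel
from qiskit_machine_learning.algorithms import QSVC

SEED = 42  # fixed random seed for reproducibility (the actual value used is an assumption)

# Preprocessing: reduce the Wisconsin features to 2 dimensions so 2 qubits suffice,
# then rescale for the angle-encoding feature map.
X, y = load_breast_cancer(return_X_y=True)
X = PCA(n_components=2, random_state=SEED).fit_transform(X)
X = MinMaxScaler().fit_transform(X)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, stratify=y, random_state=SEED)

# Quantum model: ZZFeatureMap -> fidelity quantum kernel -> QSVC.
feature_map = ZZFeatureMap(feature_dimension=2, reps=2, entanglement="linear")
qsvc = QSVC(quantum_kernel=FidelityQuantumKernel(feature_map=feature_map))
qsvc.fit(X_tr, y_tr)

# Classical baseline trained on the same split.
svc = SVC(kernel="rbf").fit(X_tr, y_tr)

print("QSVC test accuracy:", qsvc.score(X_te, y_te))
print("SVC  test accuracy:", svc.score(X_te, y_te))
```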
V.3.2 Classical Machine Learning Algorithms
The classical support vector machine (SVM) pipeline was built using Python libraries (NumPy, TensorFlow, and Matplotlib) on sequential data for a classification task. We loaded the data into separate X and Y data frames, performed one-hot encoding on the labels, and split the dataset into 80% training and 20% testing. The model was trained for 10 epochs with a batch size of 32, using categorical cross-entropy loss and the Adam optimizer. We evaluated model performance on the test set, visualized the results with a confusion matrix using Seaborn, and tracked training accuracy over the epochs. A sketch of this classical pipeline is given below.
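The following is a minimal sketch of the classical pipeline described above; the file name respiratory_signals.csv, the label column, and the small dense-network architecture are hypothetical, since only the training settings (one-hot labels, 80:20 split, 10 epochs, batch size 32, categorical cross-entropy, Adam) are stated in the text.

```python
import pandas as pd
import seaborn as sns
import matplotlib.pyplot as plt
import tensorflow as tf
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix

# Hypothetical file and column names; the respiratory CSV layout is an assumption.
df = pd.read_csv("respiratory_signals.csv")
X = df.drop(columns=["label"]).values.astype("float32")
y = pd.get_dummies(df["label"]).values.astype("float32")  # one-hot encoded labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=42)

# Small dense classifier trained with the reported settings
# (10 epochs, batch size 32, categorical cross-entropy, Adam).
model = tf.keras.Sequential([
    tf.keras.layers.Dense(32, activation="relu", input_shape=(X.shape[1],)),
    tf.keras.layers.Dense(y.shape[1], activation="softmax"),
])
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
history = model.fit(X_tr, y_tr, epochs=10, batch_size=32, validation_data=(X_te, y_te))

# Confusion matrix on the test set, visualized with Seaborn.
cm = confusion_matrix(y_te.argmax(axis=1), model.predict(X_te).argmax(axis=1))
sns.heatmap(cm, annot=True, fmt="d")
plt.show()
```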
Acknowledgements.
The authors express their gratitude to the entire Quantum Accelerated Computing workshop team. AB extends appreciation to the University Grants Commission (UGC) for the fellowship (1354/(CSIR-UGC NET DEC. 2018)). The authors deeply appreciate Param Smriti for providing the high-performance computing facility essential for conducting this work. Furthermore, we are sincerely grateful for the support from the IBM quantum computing facility and Pennylane for their contributions to quantum computing resources. The authors are very grateful for the fruitful discussion with Dr. Gurumohan Singh, CDAC-Mohali.

References
- Carleo and Troyer [2017] G. Carleo and M. Troyer, Solving the quantum many-body problem with artificial neural networks, Science 355, 602 (2017).
- Brunton et al. [2021] S. Brunton, J. Kutz, K. Manohar, A. Aravkin, K. Morgansen, and J. K. et al., Data-driven aerospace engineering: Reframing the industry with machine learning, AIAA Journal , 1 (2021).
- Babu et al. [2023] A. Babu, S. Ranpariya, D. Sinha, and D. Mandal, Deep learning enabled perceptive wearable sensor: An interactive gadget for tracking movement disorder, Advanced Materials Technologies 8, 10.1002/admt.202300046 (2023).
- He [2019] T. He, High-performance support vector machines and its applications (2019).
- Kumar et al. [2019] B. Kumar, O. Vyas, and R. Vyas, A comprehensive review on the variants of support vector machines, Modern Physics Letters B 33, 1950303 (2019).
- Opper and Urbanczik [2001] M. Opper and R. Urbanczik, Universal learning curves of support vector machines, Physical Review Letters 86, 4410 (2001).
- Sassi et al. [2019] I. Sassi, S. Ouftouh, and S. Anter, Adaptation of classical machine learning algorithms to big data context: Problems and challenges: Case study: Hidden markov models under spark, in 2019 1st International Conference on Smart Systems and Data Science (ICSSD) (2019) pp. 1–7.
- Rebentrost et al. [2014] P. Rebentrost, M. Mohseni, and S. Lloyd, Quantum support vector machine for big data classification, Physical Review Letters 113, 10.1103/physrevlett.113.130503 (2014).
- García and José [2022] D. P. García and F. J. José, Systematic literature review: Quantum machine learning and its applications (2022), accessed May 27, 2024.
- Simões et al. [2023] R. D. M. Simões, P. Huber, N. Meier, N. Smailov, R. M. Füchslin, and K. Stockinger, Experimental evaluation of quantum machine learning algorithms, IEEE Access 11, 6197 (2023).
- Batra et al. [2021] K. Batra, K. Zorn, D. Foil, E. Minerali, V. Gawriljuk, and T. L. et al., Quantum machine learning algorithms for drug discovery applications, Journal of Chemical Information and Modeling 61, 2641 (2021).
- Wu et al. [2021] S. Wu, S. Sun, W. Guan, C. Zhou, J. Chan, and C. C. et al., Application of quantum machine learning using the quantum kernel algorithm on high energy physics analysis at the lhc, Physical Review Research 3, 10.1103/physrevresearch.3.033221 (2021).
- Liu and Rebentrost [2018] N. Liu and P. Rebentrost, Quantum machine learning for quantum anomaly detection, Physical Review A 97, 10.1103/physreva.97.042315 (2018).
- Xia and Kais [2018] R. Xia and S. Kais, Quantum machine learning for electronic structure calculations, Nature Communications 9, 10.1038/s41467-018-06598-z (2018).
- Flöther [2023] F. Flöther, The state of quantum computing applications in health and medicine, Research Directions: Quantum Technologies , 1 (2023).
- Khan and Robles-Kelly [2020] T. M. Khan and A. Robles-Kelly, Machine learning: Quantum vs classical, IEEE Access 8, 219275 (2020).
- Schuld and Killoran [2019] M. Schuld and N. Killoran, Quantum machine learning in feature hilbert spaces, Physical Review Letters 122, 10.1103/physrevlett.122.040504 (2019).
- Sajjan et al. [2022] M. Sajjan, J. Li, R. Selvarajan, S. Sureshbabu, S. Kale, and R. G. et al., Quantum machine learning for chemistry and physics, Chemical Society Reviews 51, 6475 (2022).
- Blank et al. [2020] C. Blank, D. Park, J. Rhee, and F. Petruccione, Quantum classifier with tailored quantum kernel, npj Quantum Information 6, 10.1038/s41534-020-0272-6 (2020).
- Jäger and Krems [2023] J. Jäger and R. Krems, Universal expressiveness of variational quantum classifiers and quantum kernels for support vector machines, Nature Communications 14, 10.1038/s41467-023-36144-5 (2023).
- Park et al. [2023] S. Park, D. Park, and J. Rhee, Variational quantum approximate support vector machine with inference transfer, Scientific Reports 13, 10.1038/s41598-023-29495-y (2023).
- Duan et al. [2019] B. Duan, J. Yuan, J. Xu, and D. Li, Quantum algorithm and quantum circuit for a-optimal projection: Dimensionality reduction, Physical Review A 99, 10.1103/physreva.99.032311 (2019).
- Bennett and DiVincenzo [2000] C. Bennett and D. DiVincenzo, Quantum information and computation, Nature 404, 247 (2000).
- Vasques et al. [2023] X. Vasques, H. Paik, and L. Cif, Application of quantum machine learning using quantum kernel algorithms on multiclass neuron m-type classification, Scientific Reports 13, 10.1038/s41598-023-38558-z (2023).
- Xu et al. [2022] L. Xu, X. Zhang, J. Wang, M. Li, L. Jian, and S. Shen, Variational quantum support vector machine based on hadamard test, Communications in Theoretical Physics 74, 055106 (2022).
- Ding et al. [2022] C. Ding, T. Bao, and H. Huang, Quantum-inspired support vector machine, IEEE Transactions on Neural Networks and Learning Systems 33, 7210 (2022).
- Moradi et al. [2022] S. Moradi, C. Brandner, C. Spielvogel, D. Krajnc, S. Hillmich, R. Wille, and L. Papp, Clinical data classification with noisy intermediate scale quantum computers, Scientific Reports 12, 10.1038/s41598-022-05971-9 (2022).
- Li et al. [2015] Z. Li, X. Liu, N. Xu, and J. Du, Experimental realization of a quantum support vector machine, Physical Review Letters 114, 10.1103/physrevlett.114.140504 (2015).
- Shan et al. [2022] Z. Shan, J. Guo, and X. D. et al., Demonstration of breast cancer detection using qsvm on ibm quantum processors (2022).
- Elmaghraby [2020] A. Elmaghraby, Dementia prediction applying variational quantum classifier (2020), accessed May 27, 2024.
- Kumar et al. [2023] T. Kumar, D. Kumar, and G. Singh, Brain tumour classification using quantum support vector machine learning algorithm, IETE Journal of Research , 1 (2023).
- Saxena and Saxena [2023] A. Saxena and S. Saxena, Pancreatic cancer data classification with quantum machine learning, Journal of Quantum Computing 5, 1 (2023).
- Neill et al. [2018] C. Neill, P. Roushan, K. Kechedzhi, S. Boixo, S. Isakov, and V. S. et al., A blueprint for demonstrating quantum supremacy with superconducting qubits, Science 360, 195 (2018).