Re-uploading Data in Tensor Network

Authors

Victor Sineglazov, Petro Chynnyk

DOI:

https://doi.org/10.18372/1990-5548.85.20428

Keywords:

machine learning, quantum computing, quantum machine learning, re-uploading, tensor network, barren plateaus, differential evolution, quantum neural network

Abstract

In this paper, we present an approach for enhancing quantum tensor networks through the method of data re-uploading. The proposed framework integrates multiple layers of classical data encoding into tensor network architectures, thereby improving their approximation capacity and reducing the impact of barren plateaus in training. The model construction relies on tree tensor networks combined with RX, RZ, and RY rotational gates and CNOT entanglement, while optimization is performed using differential evolution as a gradient-free algorithm. Experimental evaluation was carried out on the iris and wine datasets, comparing baseline tensor networks with architectures incorporating one to three re-uploading layers. The results demonstrate a consistent reduction in training and test loss, with accuracy, recall, and precision reaching 100% on the iris dataset for three layers and improvements of up to 40% in prediction quality on the wine dataset. These findings confirm that data re-uploading significantly enhances the performance and expressiveness of tensor network-based quantum models.

Author Biographies

Victor Sineglazov, State University "Kyiv Aviation Institute"

Petro Chynnyk, National Technical University of Ukraine “Igor Sikorsky Kyiv Polytechnic Institute”

PhD Student

Institute for Applied System Analysis

Faculty of Artificial Intelligence

References

M. Larocca, S. Thanasilp, S. Wang, K. Sharma, J. Biamonte, P. J. Coles, L. Cincio, J. R. McClean, Z. Holmes, and M. Cerezo, “A review of barren plateaus in variational quantum computing,” arXiv preprint, 2024. [Online]. Available: https://arxiv.org/html/2405.00781v1

J. McClean, S. Boixo, V. N. Smelyanskiy, R. Babbush, and H. Neven, “Barren plateaus in quantum neural network training landscapes,” Nature Communications, vol. 9, no. 1, p. 4812, 2018. https://doi.org/10.1038/s41467-018-07090-4

A. Pérez-Salinas, A. Cervera-Lierta, E. Gil-Fuster, and J. I. Latorre, “Data re-uploading for a universal quantum classifier,” Quantum, vol. 4, p. 226, 2020. https://doi.org/10.22331/q-2020-02-06-226

A. Araujo, V. Madhok, and A. Datta, “Practical overview of image classification with tensor-network quantum circuits,” Machine Learning: Science and Technology, vol. 3, no. 1, p. 015002, 2022. https://doi.org/10.1088/2632-2153/ac41bb

R. Storn and K. Price, “Differential evolution – a simple and efficient heuristic for global optimization over continuous spaces,” Journal of Global Optimization, vol. 11, pp. 341–359, 1997. https://doi.org/10.1023/A:1008202821328

J. Schuld, B. Sweke, and M. Schuld, “General parameter-shift rules for quantum gradients,” Quantum, vol. 5, p. 410, 2021. https://doi.org/10.22331/q-2021-03-15-410

M. Schuld and N. Killoran, “Quantum machine learning in feature Hilbert spaces,” Physical Review Letters, vol. 122, no. 4, p. 040504, 2019. https://doi.org/10.1103/PhysRevLett.122.040504

M. Schuld, V. Bergholm, C. Gogolin, J. Izaac, and N. Killoran, “Evaluating analytic gradients on quantum hardware,” Physical Review A, vol. 99, no. 3, p. 032331, 2019. https://doi.org/10.1103/PhysRevA.99.032331

M. Cerezo, A. Arrasmith, R. Babbush, S. C. Benjamin, S. Endo, K. Fujii, J. R. McClean, K. Mitarai, X. Yuan, L. Cincio, and P. J. Coles, “Variational quantum algorithms,” arXiv preprint, 2020. [Online]. Available: https://arxiv.org/pdf/2012.09265.pdf

L. Wright, F. Barratt, J. Dborin, V. Wimalaweera, B. Coyle, and A. G. Green, “Deterministic tensor network classifiers,” arXiv preprint, 2022. [Online]. Available: https://arxiv.org/pdf/2205.09768.pdf

D. P. Kingma and J. L. Ba, “Adam: A method for stochastic optimization,” arXiv preprint, 2014. [Online]. Available: https://arxiv.org/pdf/1412.6980

T. Hur, L. Kim, and D. K. Park, “Quantum convolutional neural network for classical data classification,” arXiv preprint, 2022. [Online]. Available: https://arxiv.org/abs/2108.00661

M. Cerezo, A. Sone, T. Volkoff, L. Cincio, and P. J. Coles, “Cost-function-dependent barren plateaus in shallow parametrized quantum circuits,” Nature Communications, vol. 12, no. 1, p. 1791, 2021. https://doi.org/10.1038/s41467-021-21728-w

J. Biamonte and V. Bergholm, “Tensor networks in a nutshell,” arXiv preprint, 2017. [Online]. Available: https://arxiv.org/abs/1708.00006

S. Lloyd, M. Schuld, A. Ijaz, J. Izaac, and N. Killoran, “Quantum embeddings for machine learning,” arXiv preprint, 2020. [Online]. Available: https://arxiv.org/abs/2001.03622

A. Skolik, S. Jerbi, A. Elben, D. Garcia-Pintos, and H. J. Briegel, “Layerwise learning for quantum neural networks,” Quantum Machine Intelligence, vol. 5, p. 5, 2023. https://doi.org/10.1007/s42484-023-00080-z

E. Grant, M. Benedetti, S. Cao, A. Hallam, A. Lockhart, V. Stojevic, L. Wossnig, and S. Severini, “Hierarchical quantum classifiers,” npj Quantum Information, vol. 4, p. 65, 2018. https://doi.org/10.1038/s41534-018-0116-9

M. Schuld and F. Petruccione, Supervised Learning with Quantum Computers. Springer International Publishing, 2018. https://doi.org/10.1007/978-3-319-96424-9

J. R. McClean, J. Romero, R. Babbush, and A. Aspuru-Guzik, “The theory of variational hybrid quantum-classical algorithms,” arXiv preprint, 2015. [Online]. Available: https://arxiv.org/pdf/1509.04279

A. Arrasmith, M. Cerezo, L. Cincio, and P. J. Coles, “Effect of barren plateaus on gradient-free optimization,” Quantum, vol. 5, p. 558, 2021. https://doi.org/10.22331/q-2021-10-05-558

E. M. Stoudenmire and D. J. Schwab, “Supervised learning with quantum-inspired tensor networks,” Advances in Neural Information Processing Systems, vol. 29, 2016. [Online]. Available: https://arxiv.org/abs/1605.05775

Published

2025-09-29

Issue

Section

COMPUTER SCIENCES AND INFORMATION TECHNOLOGIES