Bulletin of the American Physical Society
APS March Meeting 2022
Volume 67, Number 3
Monday–Friday, March 14–18, 2022; Chicago
Session S37: Quantum Machine Learning I (Focus Session; Recordings Available)

Sponsoring Units: DQI, GDS. Chair: Zoe Holmes, Los Alamos National Laboratory. Room: McCormick Place W194B
Thursday, March 17, 2022, 8:00 AM – 8:12 AM
S37.00001: Representation Learning via Quantum Neural Tangent Kernels
Junyu Liu, Francesco Tacchino, Jennifer R Glick, Liang Jiang, Antonio Mezzacapo
Variational quantum circuits are used in quantum machine learning and variational quantum simulation tasks. How to design good variational circuits, or how to predict their performance on given learning or optimization tasks, remains unclear. In this work, we address these problems by studying variational quantum circuits through the theory of neural tangent kernels. We define quantum neural tangent kernels and derive the dynamical equation of their loss function in optimization and learning tasks. We define and analyze quantum neural tangent kernels in the frozen limit, where the variational angles change slowly and a linear perturbation of the angles suffices to describe the dynamics; in machine learning this is commonly known as the lazy training regime. We then extend the analysis to a dynamical setting, including quadratic corrections in the variational angles. We define a large-width limit for quantum kernels, showing that a hybrid quantum-classical neural network can be approximately Gaussian. Our results elucidate a regime in which an analytical understanding of the training dynamics of variational quantum circuits, used for quantum machine learning and optimization problems, is possible.
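In the lazy-training regime described above, the model is linearized around its initial parameters, so the training residuals evolve under a frozen tangent kernel. A minimal classical sketch of that picture, using a hypothetical toy model in place of a variational circuit (the model form and all numbers here are illustrative, not the authors'):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a variational model (hypothetical form, illustration only):
# f(theta, x) = mean_j sin(theta_j + x_j).
def f(theta, x):
    return np.mean(np.sin(theta + x))

def jacobian(theta, X):
    # d f(theta, x_i) / d theta_j = cos(theta_j + x_ij) / P
    return np.cos(theta[None, :] + X) / theta.size

P, N = 8, 5                       # parameters, training points
theta = rng.uniform(0, 2 * np.pi, P)
X = rng.uniform(0, 2 * np.pi, (N, P))
y = rng.uniform(-1, 1, N)

J = jacobian(theta, X)            # shape (N, P)
K = J @ J.T                       # empirical tangent kernel, shape (N, N)

# In the frozen (lazy) limit, one gradient-descent step on the squared loss
# maps the residual r = f(X) - y to (I - eta * K) r.
preds = np.array([f(theta, x) for x in X])
r0 = preds - y
eta = 0.5
r1 = (np.eye(N) - eta * K) @ r0
print(np.linalg.norm(r1) <= np.linalg.norm(r0))
```

The kernel is fixed during training in this limit, which is what makes the dynamics analytically tractable.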
Thursday, March 17, 2022, 8:12 AM – 8:24 AM
S37.00002: Importance of Kernel Bandwidth in Quantum Machine Learning
Ruslan Shaydulin, Stefan Wild
Quantum kernel methods are considered a promising avenue for applying quantum computers to machine learning problems. However, recent results overlook the central role hyperparameters play in determining the performance of machine learning methods, including quantum ones. In this work, we show how optimizing the bandwidth of a quantum kernel can improve the performance of the kernel method from random guessing to being competitive with the best classical methods. We show that without hyperparameter optimization, the kernel bandwidth decreases exponentially with qubit count, and we identify this as the cause of the recent observation that the performance of quantum kernel methods degrades with qubit count. We reproduce these negative results and show that if the kernel bandwidth is optimized, the performance instead improves with growing qubit count, leading to the opposite conclusion about the possibility of quantum advantage. We provide numerical evidence of improved performance with an increasing number of qubits using multiple quantum kernels and classical datasets.
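The bandwidth effect can be illustrated with a simple product-state fidelity kernel, a common toy quantum feature map (the abstract's actual kernels may differ): at a fixed bandwidth, off-diagonal kernel values shrink exponentially with the number of features, while shrinking the bandwidth as 1/sqrt(n) keeps them O(1):

```python
import numpy as np

rng = np.random.default_rng(1)

# Product-state fidelity kernel with bandwidth c (illustrative choice):
# k(x, y) = prod_j cos^2(c * (x_j - y_j) / 2)
def kernel(x, y, c):
    return np.prod(np.cos(c * (x - y) / 2) ** 2)

def mean_offdiag(n_features, c, n_samples=50):
    X = rng.uniform(-np.pi, np.pi, (n_samples, n_features))
    vals = [kernel(X[i], X[j], c)
            for i in range(n_samples) for j in range(i + 1, n_samples)]
    return np.mean(vals)

# Fixed bandwidth: typical kernel values decay exponentially in n.
k_small = mean_offdiag(2, c=1.0)
k_large = mean_offdiag(20, c=1.0)

# Bandwidth rescaled as 1/sqrt(n): values stay order one.
k_scaled = mean_offdiag(20, c=1.0 / np.sqrt(20))
print(k_small, k_large, k_scaled)
```

With all off-diagonal entries near zero the kernel matrix approaches the identity and the model cannot generalize, which is one way to read the "random guess" regime above.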
Thursday, March 17, 2022, 8:24 AM – 8:36 AM
S37.00003: Quantum Enhanced Optimization for Industrial-Scale Problems
William P Banner, Tim Menke, Shima B Hadiashar, Grzegorz Mazur, Marcin Ziolkowski, Ken Kennedy, Jeffrey A Grover, Jhonathan Romero, William D Oliver
Many of the most challenging optimization problems faced by industry today are combinatorial in nature. Quantum computing and related approaches offer new heuristics for tackling these problems that might provide advantages over traditional optimization methods. Establishing such advantages requires benchmarking on specific problem instances. In this work, we consider the production plant optimization problem under realistic conditions. We characterize the problem and carry out a benchmark of multiple classical and quantum-inspired optimizers, including techniques based on generative modeling for quantum-enhanced optimization. By comparing classical optimizers, quantum-enhanced optimizers, and mixed optimizers that combine the two, we gain insights into which aspects of the problem influence optimizer performance. In addition, we perform a scaling analysis of the optimization methods and estimate thresholds for advantage.
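Benchmarks like this one typically compare candidate optimizers against a simple classical baseline on the same problem instance. A minimal sketch, assuming a toy QUBO as a stand-in for the production-plant problem (the instance, size, and baseline here are ours, not the authors'):

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy QUBO instance (hypothetical stand-in for an industrial problem):
# minimize E(z) = z^T Q z over binary vectors z.
n = 12
Q = rng.normal(0, 1, (n, n))
Q = (Q + Q.T) / 2

def energy(z):
    return z @ Q @ z

def greedy_descent(z):
    # Flip single bits as long as any flip lowers the energy.
    improved = True
    while improved:
        improved = False
        for i in range(n):
            z2 = z.copy()
            z2[i] ^= 1
            if energy(z2) < energy(z):
                z = z2
                improved = True
    return z

# Random-restart local search: the kind of cheap classical reference point
# quantum and quantum-inspired optimizers would be benchmarked against.
starts = [rng.integers(0, 2, n) for _ in range(20)]
best = min((greedy_descent(z) for z in starts), key=energy)
print(energy(best))
```

Scaling such a baseline in n is what makes "thresholds for advantage" estimable: an advantage claim needs the quantum method to beat this curve, not just any single run.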
Thursday, March 17, 2022, 8:36 AM – 8:48 AM
S37.00004: Generative Quantum Learning of Joint Probability Distribution Functions
Sonika Johri, Elton Zhu, Dave Bacon, Mert Esencan, Jungsang Kim, Mark Muir, Nikhil Murgai, Jason Nguyen, Neal Pisenti, Adam Schouela, Ksenia Sosnova, Ken Wright
Modeling joint probability distributions is an important task in a wide variety of fields. One popular technique employs a family of multivariate distributions with uniform marginals called copulas. While the theory of modeling joint distributions via copulas is well understood, accurately modeling real data with many variables remains practically challenging. In this work, we design quantum machine learning algorithms to model copulas. We show that any copula can be naturally mapped to a multipartite maximally entangled state. A variational ansatz we christen a 'qopula' creates arbitrary correlations between variables while maintaining the copula structure, starting from a set of Bell pairs for two variables or GHZ states for multiple variables. As an application, we train a Quantum Generative Adversarial Network (QGAN) and a Quantum Circuit Born Machine (QCBM) using this variational ansatz to generate samples from joint distributions of two variables for historical stock market data. We demonstrate our generative learning algorithms on trapped-ion quantum computers from IonQ for up to 8 qubits and show that our results outperform those obtained through equivalent classical generative learning. Further, we present theoretical arguments, based on communication and computational complexity, for an exponential advantage in our model's expressivity over classical models.
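The copula idea above rests on separating marginals from dependence: transform each variable to a uniform marginal and the remaining joint structure is the copula. A classical sketch of that transform via empirical ranks (the data here are synthetic Gaussians, not the stock-market data of the abstract):

```python
import numpy as np

rng = np.random.default_rng(3)

# Correlated bivariate normal data (stand-in for e.g. two stock returns).
rho = 0.8
cov = np.array([[1.0, rho], [rho, 1.0]])
data = rng.multivariate_normal([0, 0], cov, size=4000)

# Copula trick: map each marginal to uniform [0, 1] via its empirical CDF
# (rank transform). The dependence survives; the marginals become uniform.
def to_uniform(col):
    ranks = np.argsort(np.argsort(col))
    return (ranks + 1) / (len(col) + 1)

U = np.column_stack([to_uniform(data[:, 0]), to_uniform(data[:, 1])])

print(U.mean(axis=0))             # each marginal mean is 0.5 by construction
print(np.corrcoef(U.T)[0, 1])     # the rank correlation remains strong
```

A generative model (quantum or classical) trained on U only has to learn the dependence structure; marginals are restored afterwards by inverting the empirical CDFs.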
Thursday, March 17, 2022, 8:48 AM – 9:00 AM
S37.00005: Learning exotic phases of matter via hidden Born machines
Khadijeh Najafi, Abigail McClain Gomez, Susanne F Yelin
Quantum-inspired generative models are becoming some of the most appealing tools in machine learning. Owing to their quantum nature, they can express complex distributions that are intractable for a classical computer, potentially leading to quantum advantage. However, their learning power and efficient training schemes are far from understood and remain under active investigation. Here, we focus on Born machines as quantum-inspired generative models, which parameterize the joint probability distributions of target data via the Born probabilities of quantum states. We first focus on the complex Born machine, harnessing existing tools such as tensor networks as an efficient ansatz, and present our results on learning exotic phases of quantum states obtained from XY and Rydberg spin chains. We further introduce the periodic Born machine and show that matching the boundary condition of the Born machine to that of the training data improves performance when limited data is available. We finally comment on the learning power of complex Born machines when data measured in a different basis is available.
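A Born machine assigns each configuration the squared modulus of a parameterized amplitude. A deliberately tiny sketch with a two-site matrix-product (tensor-network) ansatz over real amplitudes (bond dimension and tensors are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(4)

# Two-site matrix-product-state Born machine (minimal sketch): the amplitude
# of bitstring (a, b) is A[a] . B[b]; its probability is |amplitude|^2 / Z.
chi = 3                            # bond dimension
A = rng.normal(size=(2, chi))      # site 1: physical index x bond index
B = rng.normal(size=(2, chi))      # site 2: physical index x bond index

amps = np.array([[A[a] @ B[b] for b in range(2)] for a in range(2)])
probs = amps ** 2
Z = probs.sum()
probs /= Z                         # Born probabilities over {00, 01, 10, 11}

print(probs)
```

Training would adjust A and B so these Born probabilities match the empirical distribution of measured spin configurations; the tensor-network form keeps that tractable as the chain grows.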
Thursday, March 17, 2022, 9:00 AM – 9:12 AM
S37.00006: Quantum kernels for electronic health records classification
Omar Shehab, Zoran Krunic, Frederik Floether, George Seegan, Nate Earnest-Noble
In this session, we present the first systematic study of the quantum support vector machine (QSVM) complexity space and the first quantum classification of an electronic health records dataset. We classified the persistence of rheumatoid arthritis patients on biologic therapies, predicting 6-month persistence via binary classification. In addition, we developed an end-to-end framework to study empirical quantum advantage that can be generalized to other machine learning and optimization problems. This was achieved by comparing the landscapes of classical and quantum models via the introduction of the terrain ruggedness index. We selected data subsets and created a grid of 5–20 features and 200–300 samples. For each grid coordinate (number of features, number of training samples), we trained classical SVM models based on radial basis function kernels and quantum models with custom kernels, using Qiskit with IBM Quantum simulators and real hardware. We observed partial empirical quantum advantage, and our generalizable framework enables a priori identification of datasets where quantum advantage could exist.
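The terrain ruggedness index mentioned above comes from geomorphology; in the standard Riley et al. definition (the abstract's exact variant may differ), each grid cell scores the root of the summed squared differences to its eight neighbours. Applied to a (features x samples) grid of model accuracies, it quantifies how bumpy the performance landscape is:

```python
import numpy as np

# Terrain Ruggedness Index (Riley et al. definition; illustrative use here):
# per cell, sqrt of summed squared differences to the 8 neighbours,
# then averaged over the grid. Edges are padded by replication.
def tri(grid):
    g = np.pad(grid, 1, mode="edge")
    total = np.zeros_like(grid, dtype=float)
    for di in (-1, 0, 1):
        for dj in (-1, 0, 1):
            if di == dj == 0:
                continue
            shifted = g[1 + di:1 + di + grid.shape[0],
                        1 + dj:1 + dj + grid.shape[1]]
            total += (grid - shifted) ** 2
    return np.mean(np.sqrt(total))

# Hypothetical accuracy landscapes over a (features x samples) grid.
flat = np.full((4, 4), 0.8)                              # smooth landscape
rugged = np.array([[0.5, 0.9] * 2, [0.9, 0.5] * 2] * 2)  # checkerboard

print(tri(flat), tri(rugged))
```

Comparing this statistic for the classical and quantum accuracy grids is one concrete way to compare the two landscapes, as the framework above proposes.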
Thursday, March 17, 2022, 9:12 AM – 9:48 AM
S37.00007: Covariant quantum kernels for data with group structure
Invited Speaker: Jennifer R Glick
The use of kernel functions is a common technique for extracting important features from data sets. A quantum computer can be used to estimate kernel entries as transition amplitudes of unitary circuits. It can be shown that quantum kernels exist that, subject to computational hardness assumptions, cannot be computed classically. An important challenge is to find quantum kernels that provide an advantage in the classification of real-world data. Here we introduce a class of quantum kernels that are related to covariant quantum measurements and can be used for data with a group structure. The kernel is defined in terms of a single fiducial state that can be optimized using a technique called kernel alignment. Quantum kernel alignment optimizes the kernel family to minimize the upper bound on the generalization error for a given data set. We apply this general method to a specific learning problem that we refer to as labeling cosets with error, and implement the learning algorithm on 27 qubits of a superconducting processor.
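Kernel alignment, in its classical formulation (Cristianini et al.), scores a kernel matrix K by its cosine similarity to the ideal label kernel yy^T; the quantum version above optimizes the fiducial state to improve this kind of score. A minimal sketch of the alignment quantity itself:

```python
import numpy as np

# Kernel-target alignment: cosine similarity (Frobenius inner product)
# between a kernel matrix K and the ideal kernel y y^T, labels y in {-1, +1}.
def alignment(K, y):
    Y = np.outer(y, y)
    return np.sum(K * Y) / (np.linalg.norm(K) * np.linalg.norm(Y))

y = np.array([1, 1, -1, -1])
K_ideal = np.outer(y, y).astype(float)   # perfectly aligned kernel
K_blind = np.eye(4)                      # carries no label information

print(alignment(K_ideal, y), alignment(K_blind, y))
```

Maximizing this quantity over a kernel family tightens the generalization bound the abstract refers to, since higher alignment means the kernel already separates the classes.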
Thursday, March 17, 2022, 9:48 AM – 10:00 AM
S37.00008: Direct implementation of a perceptron in superconducting circuit quantum hardware
Stefan Filipp, Marek Pechal, Federico Roy, Samuel A Wilkinson, Gian Salis, Max Werninghaus, Michael J Hartmann
The utility of classical neural networks as universal approximators suggests that their quantum analogues could play an important role in quantum generalizations of machine-learning methods. In this work, we demonstrate a superconducting qubit implementation of an adiabatic controlled gate, which generalizes the action of a classical perceptron as the basic building block of a quantum neural network. We show full control over the steepness of the perceptron activation function, the input weight, and the bias by tuning the adiabatic gate length, the coupling between the qubits, and the frequency of the applied drive, respectively. In its general form, the gate realizes an N-qubit entangling operation in a single step, whose decomposition into single- and two-qubit gates would require a number of gates exponential in N. Its demonstrated direct implementation as a perceptron in quantum hardware may therefore lead to more powerful quantum neural networks when combined with suitable additional standard gates.
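The three hardware knobs above map onto the three parameters of a classical perceptron: activation steepness, input weight, and bias. A classical sketch of that parameterization, where the steepness k stands in for the role the adiabatic gate length plays in hardware (this mapping is our illustrative assumption, not the paper's calibration):

```python
import numpy as np

# Perceptron with tunable activation steepness k (classical analogue only):
# output = sigmoid(k * (w . x + b)).
def activation(x, w, b, k):
    return 1.0 / (1.0 + np.exp(-k * (np.dot(w, x) + b)))

w, b = np.array([0.7, -0.3]), 0.1
x = np.array([1.0, 0.5])

gentle = activation(x, w, b, k=1.0)      # soft, nearly linear response
steep = activation(x, w, b, k=50.0)      # approaches a hard step function

print(gentle, steep)
```

In the steep limit the unit behaves like a classical threshold perceptron; at small k it is a smooth, trainable activation, which is why continuous control of the steepness matters.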
Thursday, March 17, 2022, 10:00 AM – 10:12 AM
S37.00009: On the Expressive Power of Quantum versus Classical Generative Models
Luis Serrano, Alejandro Perdomo-Ortiz
Generative models are among the key candidates in the race for practical quantum advantage. We compare a family of quantum generative models (e.g., Quantum Circuit Born Machines) with well-known families of classical generative models (e.g., Restricted Boltzmann Machines) using several different metrics. We study the distribution of eigenvalues of the Fisher information matrix corresponding to each model and its impact on the effective dimension, computed as defined in Abbas et al., Nat. Comput. Sci. 1, 403–409 (2021). As in that study, we find that the output of the quantum generative models has a higher effective dimension than that of their classical counterparts, implying that the quantum models have higher expressivity and a greater capacity to generate different probability distributions.
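The effective dimension of Abbas et al. aggregates Fisher-matrix spectra into one scalar: roughly, d_eff = 2 log E_theta[ sqrt(det(I + kappa * F(theta))) ] / log(kappa) with kappa = gamma*n / (2*pi*log n). A sketch of that computation from sampled Fisher matrices (the formula is reproduced from memory of that paper; verify against it before reuse):

```python
import numpy as np

# Effective dimension in the sense of Abbas et al. (2021), estimated from a
# set of Fisher information matrices sampled over parameter space.
def effective_dimension(fishers, n, gamma=1.0):
    kappa = gamma * n / (2 * np.pi * np.log(n))
    d = fishers[0].shape[0]
    # log sqrt(det(I + kappa * F)) for each sampled Fisher matrix
    logdets = [0.5 * np.linalg.slogdet(np.eye(d) + kappa * F)[1]
               for F in fishers]
    # log of the average sqrt-determinant, via log-sum-exp for stability
    m = max(logdets)
    log_avg = m + np.log(np.mean(np.exp(np.array(logdets) - m)))
    return 2 * log_avg / np.log(kappa)

# Sanity check: identity Fisher matrices of size d give d_eff close to d.
d = 6
fishers = [np.eye(d) for _ in range(10)]
print(effective_dimension(fishers, n=10**6))
```

Models whose Fisher spectrum is flatter (fewer near-zero eigenvalues) score a higher effective dimension, which is the sense in which the quantum models above are called more expressive.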
Thursday, March 17, 2022, 10:12 AM – 10:24 AM
S37.00010: Classification using quantum similarity learning
Casey Jao, Santosh Radha
Similarity learning seeks a function that measures the likeness of two objects; once trained, the similarity measure can be used to classify new data by comparing them with existing data. We present a classifier whose similarity function takes the form of a parametrized quantum kernel. Our method treats binary and multiclass classification problems in a unified fashion. We present numerical experiments from both classical simulations and tests on trapped-ion quantum computing hardware.
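The unified binary/multiclass treatment falls out naturally once classification is phrased as "pick the class most similar to the query." A minimal sketch with a classical RBF similarity standing in for the parametrized quantum kernel (the decision rule here, mean similarity per class, is our illustrative choice):

```python
import numpy as np

# Classical RBF similarity as a stand-in for a parametrized quantum kernel.
def similarity(x, y, gamma=1.0):
    return np.exp(-gamma * np.sum((x - y) ** 2))

# A query gets the label of the class with the highest mean similarity to
# its training examples; binary and multiclass are handled identically.
def classify(x, X_train, y_train):
    labels = np.unique(y_train)
    scores = [np.mean([similarity(x, t) for t in X_train[y_train == c]])
              for c in labels]
    return labels[int(np.argmax(scores))]

# Three well-separated clusters, three classes.
X = np.array([[0, 0], [0.1, 0], [5, 5], [5.1, 5], [0, 5], [0, 5.1]])
y = np.array([0, 0, 1, 1, 2, 2])

print(classify(np.array([0.05, 0.05]), X, y))   # near cluster 0
print(classify(np.array([5.0, 5.05]), X, y))    # near cluster 1
```

Swapping in a quantum kernel only changes the `similarity` function; the decision rule, and hence the unified handling of any number of classes, is untouched.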
Thursday, March 17, 2022, 10:24 AM – 10:36 AM
S37.00011: Compression and dimensionality reduction techniques for Quantum Machine Learning applications in High Energy Physics
Panagiotis Barkoutsos, Denis-Patrick Odagiu, Vasileios Belis, Lennart Schulze, Christina Reissel, Elias Fernandez-Combarro Alvarez, Jennifer R Glick, Sofia Vallecorsa, Guenther Dissertori, Ivano Tavernelli
Currently available quantum processors suffer from short coherence times, small numbers of qubits, and limited connectivity. Applying quantum machine learning techniques requires embedding classical data in quantum circuits, and data I/O can become a practical challenge that hinders the advantages of quantum algorithms. In this talk, we discuss the quantum counterpart of support vector machines (quantum SVMs) for the binary classification of High Energy Physics data associated with the production of the Higgs boson. Recent proposals employ a one-feature-to-one-qubit mapping for encoding the classical data, prohibiting the simulation of datasets with a large number of features. This imposes the need for feature compression on complex datasets, with the challenge of maintaining sufficient information to achieve high classification accuracy. In particular, we implement and compare feature extraction and dimensionality reduction techniques with respect to the quantum machine learning algorithm and identify the ones that have minimal effect on classification accuracy.
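With a one-feature-to-one-qubit encoding, the number of input features must be compressed down to the available qubit count. A sketch of the simplest such reduction, PCA via SVD (the feature and qubit counts here are illustrative; the talk compares several techniques, not only PCA):

```python
import numpy as np

rng = np.random.default_rng(6)

# PCA via SVD: project d classical features onto the n_components directions
# of largest variance before a one-feature-to-one-qubit encoding.
def pca_compress(X, n_components):
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

X = rng.normal(size=(200, 28))     # e.g. 28 detector-level features
n_qubits = 8                       # hypothetical hardware budget
X_red = pca_compress(X, n_qubits)

print(X_red.shape)
```

The open question the talk addresses is precisely which such compression preserves enough class-discriminating information for the downstream quantum SVM.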
Thursday, March 17, 2022, 10:36 AM – 10:48 AM
S37.00012: Machine learning phase transitions in a scalable manner
Marina Krstic Marinkovic, Arturo DeGiorgi
The rapid progress of quantum technologies and AI over the past few years has the potential to revolutionize many areas of physics where analytical solutions are not feasible and conventional simulation techniques fail. Condensed matter and particle physics systems hampered by the sign problem, as well as the real-time evolution of large open quantum systems, are examples where applications of quantum machine learning are particularly sought after, since research performed to date has not provided a systematic solution to the sign problem on classical computers.