Bulletin of the American Physical Society
APS March Meeting 2023
Volume 68, Number 3
Las Vegas, Nevada (March 5–10)
Virtual (March 20–22); Time Zone: Pacific Time
Session N73: Quantum Generative Models (Focus)

Sponsoring Units: DQI
Chair: Kathleen Hamilton, Oak Ridge National Laboratory
Room: Room 405
Wednesday, March 8, 2023 11:30AM – 11:42AM
N73.00001: Predicting Properties of Quantum Systems with Conditional Generative Models Haoxiang Wang, Maurice Weber, Josh Izaac, Cedric Lin Classical machine learning has emerged recently as a powerful tool for predicting properties of quantum many-body systems. For many ground states of gapped Hamiltonians, generative models can reconstruct the state accurately enough to predict one- and two-body observables, given that they are trained on the output of repeated measurements on the same state. Alternatively, kernel methods can predict local observables after being trained on measurement outcomes on different but related quantum states, but effectively require a new model to be trained for each observable. In this work, we combine the benefits of both approaches and propose the use of conditional generative models to simultaneously represent a family of states, by learning shared structures of different quantum states from measurements. The trained model allows us to predict arbitrary local properties of ground states, even for states not present in the training data, and without requiring further training for new observables. We numerically validate our approach (with simulations of up to 45 qubits) for two quantum many-body problems: 2D random Heisenberg models and Rydberg atom systems.
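The training data this approach relies on, repeated computational-basis measurements of a state, directly supports estimating one- and two-body Z observables. The sketch below is a generic illustration of our own (the function name `local_observables` is hypothetical), not code from the talk:

```python
import numpy as np

def local_observables(bitstrings):
    """Estimate <Z_i> and <Z_i Z_j> from repeated computational-basis
    measurements of one state (shape: shots x qubits, entries 0/1)."""
    spins = 1 - 2 * bitstrings.astype(float)   # map bit 0 -> +1, bit 1 -> -1
    z = spins.mean(axis=0)                     # one-body <Z_i>
    zz = spins.T @ spins / spins.shape[0]      # two-body <Z_i Z_j>
    return z, zz

# Toy data: 1000 shots on 3 qubits; qubits 0 and 1 are perfectly correlated
rng = np.random.default_rng(0)
b0 = rng.integers(0, 2, size=1000)
shots = np.stack([b0, b0, rng.integers(0, 2, size=1000)], axis=1)
z, zz = local_observables(shots)
print(zz[0, 1])   # +1: qubits 0 and 1 always agree in this toy data
```

A conditional generative model trained on many such measurement records (one per Hamiltonian in the family) would replace the raw empirical averages above with model-generated samples.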
Wednesday, March 8, 2023 11:42AM – 11:54AM
N73.00002: Fast Quantum-Assisted Sampling as a Potential Source of Quantum Advantage at Training Generative Models with Latent Variables Carla M Quispe Flores Approximating the underlying structure of real-world data is a central quest in unsupervised machine learning, where generative models with latent variables have proven to be a powerful tool. Restricted Boltzmann machines (RBMs) and variational autoencoders (VAEs) are important graphical models capable of learning multimodal distributions over high-dimensional datasets. However, log-likelihood gradients must be approximated via sampling, which generally requires computationally expensive MCMC chains. Given this challenging task of approximating a thermal state, quantum annealers are promising candidates to sample classical or quantum Gibbs distributions, replacing slow classical MCMC schemes. In particular, we introduce a non-conventional annealing protocol, so-called Random Frequency Quantum Annealing (RFQA) [1], a promising candidate to offer noise-tolerant speed-ups for optimization and sampling tasks. This work explores the performance of VAEs [2] and RBMs trained with state-of-the-art, and computationally expensive, classical sampling algorithms, such as persistent contrastive divergence (PCD), gradient-centered methods [3], parallel tempering (PT), and contrastive divergence (CD-k) with several k steps, as proxies for the quantum device. Results on image reconstruction assessed on the MNIST dataset show that gradients estimated with samples close to the equilibrium Gibbs distribution generalize better, yielding less biased and higher log-likelihood scores even when facing data scarcity, that is, in numerical experiments run with reduced samples containing only 4% of the original data. Finally, it is shown that deep convolutional VAEs with BM priors placed in the latent space achieve higher log-likelihood scores than the commonly used Gaussian priors.
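For context, the CD-k gradient that the abstract treats as a classical baseline can be sketched for k = 1 on a Bernoulli RBM. This is the standard textbook estimator, not the authors' implementation; all names are our own:

```python
import numpy as np

rng = np.random.default_rng(1)
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

def cd1_gradient(W, b, c, v0):
    """One CD-1 gradient estimate for a Bernoulli RBM.
    W: (n_vis, n_hid) weights; b, c: visible/hidden biases; v0: data batch."""
    ph0 = sigmoid(v0 @ W + c)                        # P(h=1 | v0), positive phase
    h0 = (rng.random(ph0.shape) < ph0).astype(float)
    pv1 = sigmoid(h0 @ W.T + b)                      # one Gibbs step back to visibles
    v1 = (rng.random(pv1.shape) < pv1).astype(float)
    ph1 = sigmoid(v1 @ W + c)                        # negative phase
    n = v0.shape[0]
    dW = (v0.T @ ph0 - v1.T @ ph1) / n               # positive minus negative phase
    db = (v0 - v1).mean(axis=0)
    dc = (ph0 - ph1).mean(axis=0)
    return dW, db, dc

# Shapes only: 4 visible units, 3 hidden units, batch of 8 binary vectors
W = 0.01 * rng.standard_normal((4, 3))
b, c = np.zeros(4), np.zeros(3)
v0 = rng.integers(0, 2, size=(8, 4)).astype(float)
dW, db, dc = cd1_gradient(W, b, c, v0)
print(dW.shape)  # (4, 3)
```

A quantum annealer would replace the single Gibbs step with samples drawn closer to the equilibrium distribution, which is the hypothesis the talk probes with PCD, PT, and larger k as proxies.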
Wednesday, March 8, 2023 11:54AM – 12:06PM
N73.00003: Qubit seriation: Undoing data shuffling using spectral ordering Atithi Acharya, Manuel S Rudolph, Jing Chen, Jacob E Miller, Alejandro Perdomo-Ortiz With the advent of quantum and quantum-inspired machine learning, adapting the structure of learning models to match the structure of target datasets has been shown to be crucial for obtaining high performance. Probabilistic models based on tensor networks (TNs) are prime candidates to benefit from data-dependent design considerations, owing to their bias towards correlations that are local with respect to the topology of the model. In this work, we use methods from spectral graph theory to search for optimal permutations of model sites that are adapted to the structure of an input dataset. Our method uses pairwise mutual information estimates from the target dataset to ensure that strongly correlated bits are placed closer to each other relative to the model's topology. We demonstrate the effectiveness of such preprocessing for probabilistic modeling tasks, finding substantial improvements in the performance of generative models based on matrix product states (MPS) across a variety of datasets. We also show how spectral embedding, a dimensionality reduction technique from
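The seriation step described above can be sketched with plain NumPy: estimate pairwise mutual information from samples, build the graph Laplacian, and sort sites by the Fiedler vector (the eigenvector of the second-smallest eigenvalue). This is our own minimal illustration of spectral ordering, not the authors' code, and the function names are hypothetical:

```python
import numpy as np

def pairwise_mi(data):
    """Pairwise mutual information between binary variables (columns of data)."""
    n, d = data.shape
    mi = np.zeros((d, d))
    for i in range(d):
        for j in range(i + 1, d):
            for a in (0, 1):
                for b in (0, 1):
                    pij = np.mean((data[:, i] == a) & (data[:, j] == b))
                    pi = np.mean(data[:, i] == a)
                    pj = np.mean(data[:, j] == b)
                    if pij > 0:
                        mi[i, j] += pij * np.log(pij / (pi * pj))
            mi[j, i] = mi[i, j]
    return mi

def spectral_order(mi):
    """Order sites by the Fiedler vector of the graph Laplacian of the MI matrix."""
    L = np.diag(mi.sum(axis=1)) - mi
    vals, vecs = np.linalg.eigh(L)
    return np.argsort(vecs[:, 1])   # eigenvector of the second-smallest eigenvalue

# Toy data: columns 0 and 2 are copies of each other, column 1 is independent noise
rng = np.random.default_rng(0)
x = rng.integers(0, 2, size=(2000, 1))
data = np.hstack([x, rng.integers(0, 2, size=(2000, 1)), x])
order = spectral_order(pairwise_mi(data))
print(order)  # the strongly correlated pair (0, 2) ends up adjacent
```

The resulting permutation is then applied to the dataset's columns before training an MPS, so that the model's local correlations line up with the data's.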
Wednesday, March 8, 2023 12:06PM – 12:18PM
N73.00004: Towards a scalable discrete quantum generative adversarial neural network Alexey Galda, Smit Chaudhary, Patrick Huembeli, Ian MacCormack, Jean Kossaifi, Taylor Patty We introduce a fully quantum generative adversarial network intended for use with binary data. The architecture incorporates several features found in other classical and quantum machine learning models, which up to this point had not been used in conjunction. In particular, we incorporate noise reuploading in the generator, auxiliary qubits in the discriminator to enhance expressivity, and a direct connection between the generator and discriminator circuits, obviating the need to access the generator's probability distribution. We show that, as separate components, the generator and discriminator perform as desired. We empirically demonstrate the expressive power of our model on both synthetic data and low-energy states of an Ising model. Our demonstrations suggest that the model is not only capable of reproducing discrete training data, but also of potentially generalizing from it.
Wednesday, March 8, 2023 12:18PM – 12:30PM
N73.00005: Enhancing Optimization Techniques with Quantum-Inspired Generative Models (Part 1) William P Banner, Shima Bab Hadiashar, Grzegorz Mazur, Tim Menke, Marcin Ziolkowski, Jeffrey A Grover, Jhonathan Romero, William D Oliver Large-scale integer combinatorial problems represent some of the most commonly occurring optimization problems in industrial settings. Quantum-inspired optimizers based on tensor networks can find unique optimization routes that may solve these problems faster than traditional approaches. In this work, we utilize such a quantum-inspired optimizer to enhance traditional optimization methods and analyze performance on a BMW plant optimization problem. Specifically, we investigate optimizer performance under basic data encodings and parameterizations. We also explore a subspace of the hyperparameters for the quantum-inspired optimizer and identify the configurations that maximize performance. Finally, we compile these datasets to show the limits of quantum-inspired improvement of traditional optimization methods in cases of little problem knowledge.
Wednesday, March 8, 2023 12:30PM – 12:42PM
N73.00006: Enhancing Optimization Techniques with Quantum-Inspired Generative Models (Part 2) Shima Bab Hadiashar, William P Banner, Grzegorz Mazur, Tim Menke, Marcin Ziolkowski, Jeffrey A Grover, Jhonathan Romero, William D Oliver Optimization with classical, quantum-inspired, or quantum methods often displays best performance when a high degree of problem knowledge is incorporated. In typical industrial applications this domain-specific knowledge already exists. In this work we identify a method of data encoding and search space reduction in a BMW plant optimization problem. The approach is based on a problem relaxation that incorporates varying degrees of problem knowledge. We then use this method to improve upon previous no-knowledge results. In particular, we show how problem knowledge helps the quantum-inspired optimizer find better solutions.
Wednesday, March 8, 2023 12:42PM – 1:18PM
N73.00007: Generative Learning with Quantum Models Invited Speaker: Kaitlin M Gili Generative machine learning (ML) tasks are prominent across a wide range of industries, 
Wednesday, March 8, 2023 1:18PM – 1:30PM
N73.00008: Benchmarking the Generalization of Quantum-Inspired and Classical Generative Models Brian Chen, Mohamed Hibat-Allah, Javier Lopez-Piqueres, Marta Mauri, Daniel Varoli, Francisco J Fernandez Alcazar, Brian Dellabetta, Alejandro Perdomo-Ortiz Recent work has demonstrated the effectiveness of tensor networks as "quantum-inspired" generative models for modeling and sampling from unknown probability distributions, but it remains an open question how good tensor networks are at producing new, high-quality samples (i.e., generalization) compared to the various other classes of generative models. Gili et al. (2022) showed that tensor networks may indeed generalize better than classical generative adversarial networks (GANs), in the case of a cardinality-constrained discrete distribution. In this work, we conduct the first comprehensive study of the generalization capabilities of tensor networks against a wide range of classical generative models in addition to GANs, such as autoregressive models and variational autoencoders. Furthermore, we explore the instances where tensor networks seem to demonstrate an advantage in terms of generalization over the classical neural networks, and cases where they do not. Our goal with this study is to provide insight into instances where different classes of generative models, including the tensor networks, perform well, thus advancing on the path towards practical quantum-inspired and quantum advantage.
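One simple way to make "producing new, high-quality samples" concrete is to count validity and novelty over a constrained sample space. The sketch below is our own toy illustration in the spirit of such sample-based benchmarks, not the metrics of the cited framework; all names are hypothetical:

```python
from itertools import product

def generalization_metrics(samples, train_set, valid_set):
    """Toy sample-based generalization metrics: the fraction of generated
    samples that satisfy the constraint (validity), and the fraction of
    valid samples that were never seen during training (novelty)."""
    samples = [tuple(s) for s in samples]
    valid = [s for s in samples if s in valid_set]
    novel = [s for s in valid if s not in train_set]
    validity = len(valid) / len(samples)
    novelty = len(novel) / len(valid) if valid else 0.0
    return validity, novelty

# Toy cardinality-constrained space: 4-bit strings with exactly two 1s
valid_set = {s for s in product((0, 1), repeat=4) if sum(s) == 2}
train_set = {(1, 1, 0, 0), (0, 0, 1, 1)}
samples = [(1, 1, 0, 0), (1, 0, 1, 0), (1, 1, 1, 0)]
validity, novelty = generalization_metrics(samples, train_set, valid_set)
print(validity, novelty)  # 2/3 of samples are valid; half of those are novel
```

A model that memorizes its training set scores high validity but zero novelty, which is why both quantities (and their trade-off) matter when comparing tensor networks against GANs, autoregressive models, and VAEs.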
Wednesday, March 8, 2023 1:30PM – 1:42PM
N73.00009: Symmetric Tensor Networks for Constrained Combinatorial Optimization and Generative Modeling (Part I) Javier Lopez-Piqueres, Jing Chen, Alejandro Perdomo-Ortiz Constrained combinatorial optimization problems abound in industry, from portfolio optimization to logistics. One of the major roadblocks in solving these problems is the presence of nontrivial hard constraints which limit the valid search space. In some heuristic solvers, these are typically addressed by introducing certain Lagrange multipliers in the cost function, by relaxing them in some way, or worse yet, by generating many samples and only keeping valid ones, which leads to very expensive and inefficient searches. In this work, we encode these constraints directly into symmetric tensor networks (TNs) and leverage their applicability as quantum-inspired generative models to assist in the search for solutions to combinatorial optimization problems. This allows us to exploit the generalization capabilities of TN generative models while constraining them so that they only output feasible samples.
Wednesday, March 8, 2023 1:42PM – 1:54PM
N73.00010: Symmetric Tensor Networks for Constrained Combinatorial Optimization and Generative Modeling (Part II) Jing Chen, Javier Lopez-Piqueres, Alejandro Perdomo-Ortiz Constrained combinatorial optimization problems abound in industry, from portfolio optimization to logistics. One of the major roadblocks in solving these problems is the presence of nontrivial hard constraints which limit the valid search space. In some heuristic solvers, these are typically addressed by introducing certain Lagrange multipliers in the cost function, by relaxing them in some way, or worse yet, by generating many samples and only keeping valid ones, which leads to very expensive and inefficient searches. In this work, we encode these constraints directly into symmetric tensor networks (TNs) and leverage their applicability as quantum-inspired generative models to assist in the search for solutions to combinatorial optimization problems. This allows us to exploit the generalization capabilities of TN generative models while constraining them so that they only output feasible samples.
Wednesday, March 8, 2023 1:54PM – 2:06PM
N73.00011: Hamiltonian Quantum Generative Adversarial Networks Leeseok Kim, Seth Lloyd, Milad Marvian We propose Hamiltonian Quantum Generative Adversarial Networks (HQuGANs) to learn to generate unknown input quantum states using two competing quantum optimal controls. The game-theoretic framework of the algorithm is inspired by the success of classical generative adversarial networks in learning high-dimensional distributions. The quantum optimal control approach not only makes the algorithm naturally adaptable to the experimental constraints of near-term hardware, but also has the potential to provide better convergence due to overparameterization compared to circuit-model implementations. We numerically demonstrate the capabilities of the proposed framework to learn various highly entangled many-body quantum states, using simple two-body Hamiltonians and under experimentally relevant constraints such as low-bandwidth controls. We analyze the computational cost of implementing HQuGANs on quantum computers and show how the framework can be extended to learn quantum dynamics.
Wednesday, March 8, 2023 2:06PM – 2:18PM
N73.00012: Comparing Generalization Performances of Quantum and Classical Generative Models Mohamed Hibat-Allah, Marta Mauri, Manuel S Rudolph, Alejandro Perdomo-Ortiz Generating novel samples of high quality is the most desirable feature in generative modeling tasks. This property can be very beneficial to a wide range of applications including molecular discovery and combinatorial optimization. Recently, a well-defined framework [1] has been proposed to quantify generalization for different generative models on an equal footing. In this work, we aim to build on top of these results to compare the generalization performances of quantum and classical generative models. On the quantum side, we use Quantum Circuit Born Machines (QCBMs), which are known for their ability to model complex probability distributions, and which can be implemented on near-term quantum devices. On the classical side, we use different generative models including autoregressive recurrent neural networks, which are known to be universal approximators of sequential data and have promoted significant progress in natural language processing. In our experiments, we choose a synthetic but application-inspired dataset as test bed [2]. Our results show that different rules for comparing generative models can lead to different outcomes, sometimes yielding an advantage of quantum over classical models.
Wednesday, March 8, 2023 2:18PM – 2:30PM
N73.00013: Generative Learning of Continuous Data by Tensor Networks Alexander H Meiburg, Jacob E Miller, Jing Chen, Alejandro Perdomo-Ortiz Beyond their origin in modeling many-body quantum systems, tensor networks have emerged as a promising class of models for solving machine learning problems, notably in unsupervised generative learning. While possessing many desirable features arising from their quantum-inspired nature, tensor network generative models have previously been restricted to binary or categorical data, limiting their usefulness in real-world modeling tasks. We overcome this by introducing a new family of tensor network generative models for continuous data, which are capable of learning from distributions containing continuous random variables. We develop our method in the setting of matrix product states, first deriving a universal expressivity theorem proving the ability of this model family to approximate any reasonably smooth probability density function with arbitrary precision. We then benchmark the performance of this model on synthetic and real-world datasets, finding that the model learns and generalizes well on distributions of continuous and discrete variables. We develop methods for modeling different data domains, and introduce a trainable compression layer which we show increases model performance for a given amount of memory and computational resources.
© 2024 American Physical Society. All rights reserved.