Bulletin of the American Physical Society
APS March Meeting 2018
Monday–Friday, March 5–9, 2018; Los Angeles, California
Session E34: Machine Learning in Condensed Matter Physics I (Focus)

Sponsoring Units: DCOMP, DCMP. Chair: Ehsan Khatami, San Jose State University. Room: LACC 409A
Tuesday, March 6, 2018, 8:00 AM–8:36 AM
E34.00001: From Boltzmann machines to Born machines Invited Speaker: Lei Wang Statistical physics has made profound impacts on machine learning, e.g., energy-based models for generative modeling and mean-field approaches for variational inference. We argue that quantum physics can be equally inspirational by exploiting mind-provoking analogies between the "image space" and the Hilbert space. The exchange of ideas, insights, techniques, and even intuitions developed for machine learning and quantum physics will cross-fertilize both research fields. In particular, I shall talk about quantum-inspired generative models with explicit and implicit probability densities.
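The contrast between the two model families in the talk title can be illustrated in a few lines. The sketch below (not the speaker's code; all weights are random placeholders) normalizes a Boltzmann-machine-style distribution p(x) ∝ exp(−E(x)) and a Born-machine-style distribution p(x) ∝ |ψ(x)|² over the same space of 3-bit strings:

```python
import numpy as np

# Toy contrast between an energy-based model and a "Born machine"
# on 3-bit strings (illustrative sketch with random parameters).
rng = np.random.default_rng(1)
states = np.array([[(k >> i) & 1 for i in range(3)] for k in range(8)], float)

# Boltzmann-machine-style model: p(x) proportional to exp(-E(x)),
# with a quadratic energy function E(x) = x^T J x.
J = rng.normal(size=(3, 3)); J = (J + J.T) / 2
E = np.einsum('ki,ij,kj->k', states, J, states)
p_boltzmann = np.exp(-E); p_boltzmann /= p_boltzmann.sum()

# Born-machine-style model: p(x) proportional to |psi(x)|^2
# for a complex amplitude psi(x), as in the Born rule.
psi = rng.normal(size=8) + 1j * rng.normal(size=8)
p_born = np.abs(psi) ** 2; p_born /= p_born.sum()

print(p_boltzmann.sum(), p_born.sum())
```

In a trained Born machine the amplitudes ψ(x) would come from a parametrized quantum-inspired ansatz rather than random draws; the point here is only the different normalization objects (energies vs. amplitudes).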
Tuesday, March 6, 2018, 8:36 AM–8:48 AM
E34.00002: Neural-network quantum state tomography Giacomo Torlai, Guglielmo Mazzola, Juan Carrasquilla, Matthias Troyer, Roger Melko, Giuseppe Carleo The reconstruction of an unknown quantum state from simple experimental measurements, quantum state tomography (QST), is a fundamental tool for investigating complex quantum systems, validating quantum devices, and fully exploiting quantum resources. In this talk, we introduce a novel scheme for QST using machine learning. The wavefunction of an arbitrary many-body system is parametrized with a standard neural network, which is trained on raw data to approximate both the amplitudes and the phases of the target quantum state. This approach allows one to reconstruct highly entangled states and reproduce challenging quantities, such as the entanglement entropy, from simple measurements already available in experiments. We show the main features of neural-network QST and demonstrate its performance on a variety of examples, ranging from the prototypical W state to unitary dynamics and ground states of many-body Hamiltonians in one and two dimensions.
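The core object of such a scheme is a network that maps a basis configuration to a complex amplitude. A minimal sketch of that ansatz, assuming an illustrative single-hidden-layer architecture with untrained random weights (in the actual scheme the weights are fitted to raw measurement data):

```python
import numpy as np

# Sketch of a neural-network wavefunction ansatz: a small feedforward net
# maps a measurement-basis configuration s to a log-amplitude a(s) and a
# phase phi(s), so that psi(s) ~ exp(a(s) + i*phi(s)). Architecture and
# weights are illustrative stand-ins, not the trained tomography model.
rng = np.random.default_rng(0)
n, hidden = 3, 8
W1 = rng.normal(scale=0.2, size=(hidden, n)); b1 = np.zeros(hidden)
W2 = rng.normal(scale=0.2, size=(2, hidden)); b2 = np.zeros(2)

def psi(s):
    h = np.tanh(W1 @ s + b1)
    logamp, phase = W2 @ h + b2
    return np.exp(logamp + 1j * phase)

# Enumerate the full computational basis and normalize the state.
basis = [np.array([(k >> i) & 1 for i in range(n)], float) for k in range(2 ** n)]
amps = np.array([psi(s) for s in basis])
amps /= np.linalg.norm(amps)
print(np.abs(amps) ** 2)
```

For more than a few qubits the explicit normalization above is replaced by sampling, which is what makes the parametrization scale to highly entangled many-body states.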
Tuesday, March 6, 2018, 8:48 AM–9:00 AM
E34.00003: Approximating quantum many-body wavefunctions using artificial neural networks Zi Cai In this talk, we demonstrate the expressibility of artificial neural networks (ANNs) in quantum many-body physics by showing that a feedforward neural network with a small number of hidden layers can be trained to approximate with high precision the ground states of some notable quantum many-body systems. We consider one-dimensional free bosons and fermions, spinless fermions on a square lattice away from half-filling, as well as frustrated quantum magnets with a rapidly oscillating ground-state characteristic function. In the latter case, an ANN with a standard architecture fails, while one with a slightly modified architecture successfully learns the frustration-driven complex sign rule in the ground state. The practical application of this method to explore unknown ground states is also discussed.
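The training task described here, fitting signed ground-state amplitudes with a small feedforward net, can be sketched on the smallest nontrivial example: the two-site antiferromagnetic Heisenberg model, whose singlet ground state (0, 1, −1, 0)/√2 already carries a sign structure. The architecture and finite-difference training loop below are illustrative choices, not those of the talk:

```python
import numpy as np

# Fit a tiny feedforward net to the signed singlet amplitudes by
# minimizing the infidelity 1 - |<psi_net|psi_target>|^2 with plain
# gradient descent (finite-difference gradients for brevity).
rng = np.random.default_rng(2)
target = np.array([0.0, 1.0, -1.0, 0.0]) / np.sqrt(2.0)
inputs = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1]], float)  # spins as +-1

hidden = 6
theta = rng.normal(scale=0.3, size=hidden * 2 + hidden + hidden + 1)

def amps(t):
    W1 = t[:12].reshape(hidden, 2); b1 = t[12:18]
    W2 = t[18:24]; b2 = t[24]
    out = np.tanh(inputs @ W1.T + b1) @ W2 + b2
    return out / np.linalg.norm(out)   # normalized trial wavefunction

def infidelity(t):
    return 1.0 - (amps(t) @ target) ** 2

loss0 = infidelity(theta)
eps, lr = 1e-5, 0.5
for _ in range(400):
    grad = np.zeros_like(theta)
    for i in range(theta.size):        # central finite differences
        d = np.zeros_like(theta); d[i] = eps
        grad[i] = (infidelity(theta + d) - infidelity(theta - d)) / (2 * eps)
    theta -= lr * grad
print(loss0, infidelity(theta))
```

A "rapidly oscillating characteristic function" in the abstract's sense corresponds to target amplitudes whose sign flips with the configuration, which is exactly what the tanh output layer here must learn to reproduce.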
Tuesday, March 6, 2018, 9:00 AM–9:12 AM
E34.00004: Hunting for Hamiltonians: A Computational Approach to Learning Quantum Models Eli Chertkov, Bryan Clark The machine learning community has been widely successful in developing computational methods for learning models, i.e., probability distributions, from data sets. We present a novel numerical method, similar in spirit to machine learning techniques, for learning quantum models, i.e., Hamiltonians, from wave functions. The method receives as input a target wave function and produces as output a space of Hamiltonians with the target wave function as an energy eigenstate. We demonstrate that our method is able to discover multidimensional spaces of Hamiltonians with ground states exactly identical to the ground states of known model Hamiltonians, such as the Kitaev chain, the XX chain, the Heisenberg chain, and the Majumdar–Ghosh model. Using this method, we also find a large space of Hamiltonians with a new type of antiferromagnetic ground state, exhibiting triplet dimer ordering, which has not been previously observed in other models. Our results indicate that our new computational approach can systematically discover new Hamiltonians, and thereby potentially new materials, with exactly specified ground state properties.
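One way such a search can be set up (a sketch in the spirit of the abstract, with an operator basis chosen purely for illustration) is via the quantum covariance matrix: for a basis of Hermitian operators O_i, null vectors c of C_ij = ⟨{O_i, O_j}⟩/2 − ⟨O_i⟩⟨O_j⟩ give Hamiltonians H = Σ c_i O_i with zero energy variance in the target state, i.e., with that state as an exact eigenstate. For the two-site singlet:

```python
import numpy as np

# Eigenstate-to-Hamiltonian sketch: the null space of the covariance
# matrix C over an operator basis spans the Hamiltonians that have the
# target wavefunction as an exact eigenstate (zero energy variance).
X = np.array([[0, 1], [1, 0]], complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.array([[1, 0], [0, -1]], complex)
I2 = np.eye(2)
kron = np.kron

psi = np.array([0, 1, -1, 0], complex) / np.sqrt(2)   # two-site singlet
ops = [kron(X, X), kron(Y, Y), kron(Z, Z), kron(Z, I2)]

def ev(A):  # expectation value in |psi>
    return (psi.conj() @ A @ psi).real

C = np.array([[ev(Oi @ Oj + Oj @ Oi) / 2 - ev(Oi) * ev(Oj)
               for Oj in ops] for Oi in ops])
_, s, Vt = np.linalg.svd(C)
null = Vt[s.size - np.sum(s < 1e-10):]   # rows spanning the null space
print(len(null))

H = sum(c * O for c, O in zip(null[0], ops))
var = ev(H @ H) - ev(H) ** 2             # energy variance of |psi> under H
print(abs(var) < 1e-9)
```

Here the singlet is an eigenstate of XX, YY, and ZZ but not of Z⊗I, so the null space is three-dimensional: a whole space of Heisenberg-like Hamiltonians shares this eigenstate, which is the "space of Hamiltonians" output the abstract describes.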
Tuesday, March 6, 2018, 9:12 AM–9:24 AM
E34.00005: Complexity and geometry of quantum state manifolds Zhoushen Huang, Alexander Balatsky A ubiquitous notion in quantum physics is that of a wavefunction manifold. Examples include the ground state of a parametrized Hamiltonian, the Hilbert-space trajectory generated by time evolution, etc. In this talk, we will discuss an efficient description of the Hilbert space spanned by such manifolds. In particular, we will show that the effective size of this Hilbert space is a geometric quantity related to the manifold's information-theoretic complexity. We also discuss the implications for topologically nontrivial manifolds.
Tuesday, March 6, 2018, 9:24 AM–9:36 AM
E34.00006: Recurrent Neural Networks for Quantum Feedback Thomas Foesel, Talitha Weiss, Petru Tighineanu, Florian Marquardt Neural networks have become powerful tools for tackling complex problems in real-world applications and have recently attracted increasing attention from various fields. Recurrent network architectures (such as long short-term memory) provide a built-in memory that makes them suitable for situations where knowledge about the true state of a system is only collected over time. We apply this to the search for optimal control sequences in quantum feedback, where actions (the application of quantum gates) must be chosen solely on the basis of previous actions and measurement outcomes, i.e., without knowledge of the full quantum state. We have investigated how to combine reinforcement learning techniques with recurrent neural networks in order to train an agent to preserve an arbitrary quantum state by choosing actions from a set of available quantum gates.
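The architectural point, that a recurrent hidden state lets the policy condition on the whole measurement history rather than on the (unknown) quantum state, can be shown with a hand-rolled RNN cell. The weights below are random and untrained; in the actual work such a network would be trained with reinforcement learning, and the gate set is a hypothetical placeholder:

```python
import numpy as np

# Minimal recurrent policy sketch: the hidden state h accumulates the
# history of measurement outcomes, and a softmax head scores a discrete
# set of candidate gates at every time step.
rng = np.random.default_rng(3)
n_hidden, n_gates = 16, 4
Wh = rng.normal(scale=0.3, size=(n_hidden, n_hidden))
Wx = rng.normal(scale=0.3, size=(n_hidden, 1))
Wo = rng.normal(scale=0.3, size=(n_gates, n_hidden))

def policy(measurements):
    h = np.zeros(n_hidden)
    choices = []
    for m in measurements:                 # one recurrent step per outcome
        h = np.tanh(Wh @ h + Wx @ np.array([float(m)]))
        logits = Wo @ h
        p = np.exp(logits - logits.max()); p /= p.sum()  # softmax over gates
        choices.append(p)
    return np.array(choices)

probs = policy([0, 1, 1, 0, 1])            # a sequence of binary outcomes
print(probs.shape)
```

Because h depends on every past outcome, two histories with the same latest measurement can still yield different gate distributions, which is exactly the memory property the abstract motivates.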
Tuesday, March 6, 2018, 9:36 AM–9:48 AM
E34.00007: Interaction Distance: Measuring ManyBody Freedom via Quantum Correlation Structure Konstantinos Meichanetzidis, Christopher Turner, Ashk Farjami, Zlatko Papic, Jiannis Pachos The entanglement spectrum [Li, Haldane (2008)], obtained by reducing a pure state, 
Tuesday, March 6, 2018, 9:48 AM–10:00 AM
E34.00008: Machine learning modeling of superconducting critical temperature Valentin Stanev, Corey Oses, A. Gilad Kusne, Efrain Rodriguez, Johnpierre Paglione, Stefano Curtarolo, Ichiro Takeuchi The connection between superconductivity and the chemical and structural properties of materials is key to understanding the mechanisms of superconductivity, yet finding this connection is a major experimental and theoretical challenge. We have developed several machine learning methods for modeling the critical temperatures Tc of the 12,000+ known superconductors available via the SuperCon database. Materials are first divided into two classes based on their Tc values, above and below 10 K, and a classification model predicting this label is trained. The model uses coarse-grained features based only on the chemical compositions. It shows strong predictive power, with an out-of-sample accuracy of 92%. Separate regression models are developed to predict the values of Tc for cuprate, iron-based, and low-Tc compounds. These models demonstrate good performance, with the learned predictors offering insights into the mechanisms behind superconductivity in the different families. We combined the classification and regression models into a single pipeline and employed it to search the entire Inorganic Crystal Structure Database for potential new superconductors. We have identified 35 oxides as candidate materials.
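The first stage of such a pipeline, composition-derived features in, binary "Tc above/below 10 K" label out, can be sketched with a simple logistic-regression fit. The data below are synthetic stand-ins for SuperCon entries (the actual study used coarse-grained chemical features and more powerful classifiers):

```python
import numpy as np

# Sketch of the classification stage: features -> binary label
# (Tc above/below 10 K), fitted by gradient-descent logistic regression
# on synthetic, well-separated stand-in data.
rng = np.random.default_rng(4)
n = 400
X_lo = rng.normal(loc=-1.0, size=(n // 2, 3))    # "low-Tc" cluster
X_hi = rng.normal(loc=+1.0, size=(n // 2, 3))    # "high-Tc" cluster
X = np.vstack([X_lo, X_hi])
y = np.array([0] * (n // 2) + [1] * (n // 2))

w = np.zeros(3); b = 0.0
for _ in range(300):                     # plain gradient descent on log-loss
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    g = p - y
    w -= 0.1 * (X.T @ g) / n
    b -= 0.1 * g.mean()

p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
acc = float(((p > 0.5) == y).mean())     # in-sample accuracy on the toy data
print(acc)
```

In the real pipeline the materials classified as high-Tc are then routed to family-specific regression models (cuprate, iron-based, low-Tc) to predict the actual Tc value.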
Tuesday, March 6, 2018, 10:00 AM–10:12 AM
E34.00009: Neural network prediction of Tc for conventional and unconventional superconductors Ethan Shapera, Suraj Dhanak, Andre Schleife We demonstrate the use of artificial neural networks to predict the experimental superconducting transition temperature for conventional and unconventional superconductors. The training sets consist of 580 BCS superconductors, 6,489 uncategorized superconductors, 1,375 iron-based superconductors, and 4,226 copper- and oxygen-containing superconductors. Descriptors are limited to quantities that can be obtained from the chemical formula and standard tables (e.g., atomic masses, electronegativities). Despite not explicitly accounting for crystal structure, the neural networks are shown to predict Tc with mean absolute errors of 2 K for BCS superconductors, 5 K for iron-based superconductors, and 12 K for cuprate superconductors. The approach fails to produce a usable single-network model if multiple classes are combined in one training set. Several potential new superconductors are predicted by the neural network, and their Tc values are compared to values computed using Migdal–Eliashberg theory for BCS-type systems.
Tuesday, March 6, 2018, 10:12 AM–10:24 AM
E34.00010: Data-Driven Design of Nanoscale Features to Obtain High-zT Thermoelectrics Emily Conant, Timothy Brown, Raymundo Arroyave, Joseph Ross, Patrick Shamberger We describe methods for improving the efficiency of thermoelectric materials, explored through a statistical-learning investigation into the design of optimal structural features. Structure, composition, dimensionality, and the corresponding thermoelectric properties have been extracted from publications on 57 different thermoelectric materials systems and compiled into a database. Feature-selection methods such as ANOVA, lasso regression, and PCA were used to refine the dataset and remove unimportant features that have negligible impact on zT. Afterwards, extra-trees, nonlinear SVM, and Gaussian-process regressors were built on the dataset, with the efficacy of each evaluated by the estimated error via leave-one-out cross-validation. Finally, we will discuss optimal experimental design techniques, which have been implemented to verify the model and to exploit the design space so that criteria for high-zT thermoelectrics can be extracted.
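The screening-and-validation loop described above can be sketched end to end: rank features by their correlation with zT (a lasso-like filter standing in for the ANOVA/lasso/PCA step), keep the strongest, and score a regressor by leave-one-out cross-validation. The data are synthetic stand-ins for the curated thermoelectrics database, and ridge regression stands in for the extra-trees/SVM/Gaussian-process models:

```python
import numpy as np

# Feature screening + leave-one-out cross-validation sketch on synthetic
# data where zT truly depends on only two of six candidate features.
rng = np.random.default_rng(5)
n, d = 60, 6
X = rng.normal(size=(n, d))
zT = 2.0 * X[:, 0] - 1.5 * X[:, 2] + 0.1 * rng.normal(size=n)

# Correlation screening: keep the two features most correlated with zT.
corr = np.abs([np.corrcoef(X[:, j], zT)[0, 1] for j in range(d)])
keep = np.argsort(corr)[-2:]
Xs = X[:, keep]

def loocv_mse(A, y, lam=1e-3):
    errs = []
    for i in range(len(y)):                      # leave sample i out
        mask = np.arange(len(y)) != i
        Atr, ytr = A[mask], y[mask]
        w = np.linalg.solve(Atr.T @ Atr + lam * np.eye(A.shape[1]),
                            Atr.T @ ytr)         # ridge fit on the rest
        errs.append((A[i] @ w - y[i]) ** 2)
    return float(np.mean(errs))

print(loocv_mse(Xs, zT), loocv_mse(X, zT))
```

The LOOCV error of the reduced model approaches the noise floor, which is the criterion the abstract uses to judge each regressor before moving on to optimal experimental design.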
Tuesday, March 6, 2018, 10:24 AM–10:36 AM
E34.00011: Materials prediction using machine learning: comparing MBTR, MTP and deep learning Chandramouli Nyshadham, Wiley Morgan, Brayden Bekker, Gus Hart With the advancement of supercomputers and electronic-structure methods such as density functional theory, materials scientists have built large computational materials databases over the last two decades. The rate at which these repositories grow determines the rate at which we can discover new materials, which necessitates faster surrogate models to replace expensive density-functional-theory calculations. To this end, the materials community has developed quantum-mechanics machine learning models that are fast and accurate enough to describe the materials space of solids. We tested three such machine learning models (MBTR [1], MTP [2], and deep learning) for predicting the ground-state energies of solids. The database is generated using standard interatomic potentials. We present a comparison of the performance of the three models.
Tuesday, March 6, 2018, 10:36 AM–10:48 AM
E34.00012: Evaluation of Machine Learning Methods for the Prediction of Key Properties for Novel Transparent Semiconductors Christopher Sutton, Christopher Bartel, Xiangyue Liu, Mario Boley, Matthias Rupp, Luca Ghiringhelli, Matthias Scheffler Transparent conductors are crucial for the operation of a variety of technological devices such as photovoltaic cells and light-emitting diodes; however, only a small number of compounds are currently known to display both the transparency and the conductivity required of transparent conducting materials. To address the need for new materials with this functionality, an open big-data competition was organized by the Novel Materials Discovery Repository (NOMAD) and hosted by Kaggle for the prediction of both the formation enthalpy (an indicator of stability) and the bandgap energy (an indicator of optical transparency) for a dataset of ca. 3,000 group-III binary, ternary, and quaternary oxide alloys.
Tuesday, March 6, 2018, 10:48 AM–11:00 AM
E34.00013: A robust artificial neural network potential for Si(001) Duy Le, Talat Rahman Despite being a powerful method for materials research, density functional theory (DFT) finds limited application at the large length scales and long time scales at which realistic properties emerge, owing to its computational expense. Thus, the development of simple potentials whose accuracy is at the level of DFT and that are fast enough for large-length-scale and long-time-scale simulations is of interest. We will report our development of a robust artificial neural network (ANN) potential for Si. The ANN potential was trained using about 15,000 data points obtained from ab initio molecular dynamics simulations of 2×2×6 and 2×2×8 Si(001) slabs at 500 K and 1000 K. The trained ANN potential is capable of producing energies in agreement with DFT values (within a few meV per atom). More importantly, it can reproduce the different reconstruction phases of the Si(001) surface, i.e., (2×1), p(2×1), p(2×2), and c(4×2). Molecular dynamics simulations of Si(001) using the ANN potential show the existence of the p(2×1) phase at low temperature (10–20 K) and the p(2×2) or c(4×2) phase at higher temperature (100–150 K).
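ANN potentials of this kind typically follow the Behler–Parrinello decomposition: each atom gets a descriptor built from its environment, a shared per-atom network maps the descriptor to an atomic energy, and the total energy is their sum, which makes the model invariant under permutation of identical atoms. A minimal sketch with an illustrative two-component distance descriptor and random, untrained weights (not the trained Si potential of the talk):

```python
import numpy as np

# Behler-Parrinello-style total energy: E_total = sum_i NN(G_i), where
# G_i is a descriptor of atom i's environment and NN is a shared per-atom
# network. Summing per-atom energies gives permutation invariance.
rng = np.random.default_rng(6)
W1 = rng.normal(scale=0.5, size=(8, 2)); b1 = rng.normal(size=8)
W2 = rng.normal(scale=0.5, size=8)

def descriptor(i, pos):
    r = np.linalg.norm(pos - pos[i], axis=1)
    r = r[r > 1e-12]                      # distances to all other atoms
    return np.array([np.exp(-r).sum(), np.exp(-r ** 2).sum()])

def energy(pos):
    return sum(np.tanh(W1 @ descriptor(i, pos) + b1) @ W2
               for i in range(len(pos)))

pos = rng.normal(size=(5, 3))             # random toy geometry
E1 = energy(pos)
E2 = energy(pos[::-1])                    # same atoms, permuted order
print(abs(E1 - E2) < 1e-10)
```

Training then amounts to fitting the shared network weights so that E_total matches the DFT energies of the ab initio molecular dynamics snapshots, the roughly 15,000-point dataset mentioned above.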
© 2018 American Physical Society