Bulletin of the American Physical Society
APS March Meeting 2018
Monday–Friday, March 5–9, 2018; Los Angeles, California
Session F34: Machine Learning in Condensed Matter Physics II (Focus Session)

Sponsoring Units: DCOMP DCMP
Chair: Roger Melko, Univ of Waterloo
Room: LACC 409A
Tuesday, March 6, 2018 11:15AM – 11:51AM
F34.00001: Machine learning a dynamical phase diagram for many-body localization Invited Speaker: Evert Van Nieuwenburg We analyze the dynamical phase diagram of a 1-dimensional disordered and interacting spin chain with a many-body localization transition, using a recurrent neural network trained on magnetization dynamics. The obtained phase diagram shows good agreement with previously known results obtained from time-dependent data and entanglement spectra, but was obtained using the dynamics of only physically measurable quantities, namely the magnetization of the spins obtained from exact time evolution.
Tuesday, March 6, 2018 11:51AM – 12:03PM
F34.00002: Machine learning out-of-equilibrium phases of matter Jordan Venderley, Vedika Khemani, Eun-Ah Kim Neural-network-based machine learning is emerging as a powerful tool for obtaining phase diagrams when traditional regression schemes using local equilibrium order parameters are not available, as in many-body localized or topological phases. Here we show that a single feed-forward neural network can decode the defining structures of two distinct MBL phases and a thermalizing phase, using entanglement spectra obtained from individual eigenstates. For this, we introduce a simplicial-geometry-based method for extracting multipartite phase boundaries. We find that this method outperforms conventional metrics (like the entanglement entropy) for identifying MBL phase transitions, revealing a sharper phase boundary and shedding new insight into the topology of the phase diagram. Furthermore, the phase diagram we acquire from a single disorder configuration confirms that the machine-learning-based approach we establish here can enable speedy exploration of large phase spaces and assist with the discovery of new MBL phases.
Tuesday, March 6, 2018 12:03PM – 12:15PM
F34.00003: Finite-Size Effects in Machine Learning the Kosterlitz–Thouless Transition Anna Golubeva Recently, machine learning (ML) algorithms have found use in physics as promising novel tools that might overcome the limitations faced by standard computational methods. First applications proved successful in classifying conventional phases of matter, which can be detected based on a local order parameter. The natural next step is the examination of topological phases and phase transitions. In this talk I present a study of the Kosterlitz–Thouless transition in the classical 2D XY model with supervised ML methods. I will show the results obtained from training a simple feed-forward and a convolutional network on Monte Carlo-sampled spin configurations. In particular, I will discuss why the networks fail to learn topological features from raw input data, and how the semi-supervised confusion scheme indicates that the networks' classification relies on local magnetization, which is a finite-size artifact.
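The confusion scheme mentioned in this abstract can be illustrated with a minimal sketch: a synthetic scalar diagnostic per sample stands in for the spin configurations, and a simple threshold classifier stands in for the neural networks used in the talk. Training-set labels are assigned from a proposed critical temperature; plotting the best achievable accuracy against that proposal traces the characteristic W-shape, peaking at the true transition. The data, temperatures, and the helper `confusion_accuracy` are all illustrative assumptions, not the speaker's actual setup.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic diagnostic (e.g. |m|): high below the true transition, low above.
temps = np.repeat(np.linspace(0.0, 2.0, 21), 40)   # 21 temperatures, 40 samples each
true_tc = 1.0
feature = np.where(temps < true_tc, 0.8, 0.2) + 0.05 * rng.standard_normal(temps.size)

def confusion_accuracy(t_prop):
    """Best accuracy of a threshold classifier trained on labels 'T < t_prop'."""
    y = (temps < t_prop).astype(int)
    cuts = np.linspace(feature.min(), feature.max(), 200)
    best = 0.0
    for c in cuts:
        pred = (feature > c).astype(int)
        # Try both orientations of the decision rule at this cut.
        best = max(best, np.mean(pred == y), np.mean((1 - pred) == y))
    return best

props = np.linspace(0.1, 1.9, 19)                  # proposed critical temperatures
curve = np.array([confusion_accuracy(t) for t in props])
# curve is high at both endpoints (trivial majority labels) and peaks at the
# true transition in between: the W-shape of the confusion scheme.
```

The middle peak of `curve` sits at the proposal that matches the structure actually present in the data, which is how the scheme locates a transition without prior labels.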
Tuesday, March 6, 2018 12:15PM – 12:27PM
F34.00004: Machine Learning Vortices in the XY Model Matthew Beach Supervised machine learning algorithms can easily classify symmetry-broken phases in classical statistical physics. In this talk, I will discuss steps towards learning unconventional phase transitions driven by the emergence of topological defects. I will focus on the 2D XY model, which exhibits a Kosterlitz–Thouless transition due to vortex–antivortex unbinding. Specifically, I will talk about a custom network architecture that (theoretically) can detect topological effects. For small lattice sizes, I claim that learning about vortices typically hinders the performance of a learning algorithm. For larger lattices, I will discuss some of the challenges of learning about vortices with artificial neural networks.
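The vortex and antivortex defects at the heart of the Kosterlitz–Thouless transition can be counted directly from a spin configuration by accumulating the wrapped angle differences around each lattice plaquette. The sketch below is the standard lattice winding-number construction (not the speaker's network architecture), assuming open boundaries and a synthetic single-vortex configuration:

```python
import numpy as np

def wrap(d):
    """Map angle differences into (-pi, pi]."""
    return (d + np.pi) % (2 * np.pi) - np.pi

def vorticity(theta):
    """Winding number on each plaquette of a 2D XY configuration.

    theta: (L, L) array of spin angles (open boundaries for simplicity).
    Returns an (L-1, L-1) integer array: +1 vortex, -1 antivortex, 0 otherwise.
    """
    # Accumulate wrapped angle differences around each plaquette (closed loop).
    w = (wrap(theta[:-1, 1:] - theta[:-1, :-1])     # bottom edge, left to right
         + wrap(theta[1:, 1:] - theta[:-1, 1:])     # right edge, bottom to top
         + wrap(theta[1:, :-1] - theta[1:, 1:])     # top edge, right to left
         + wrap(theta[:-1, :-1] - theta[1:, :-1]))  # left edge, top to bottom
    return np.rint(w / (2 * np.pi)).astype(int)

# A single +1 vortex centred between lattice sites.
L = 8
y, x = np.mgrid[0:L, 0:L]
theta = np.arctan2(y - (L - 1) / 2, x - (L - 1) / 2)
charges = vorticity(theta)
```

Only the plaquette enclosing the vortex core carries charge +1; summing `charges` over the open lattice recovers the total winding around the boundary.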
Tuesday, March 6, 2018 12:27PM – 12:39PM
F34.00005: Machine Learning of Frustrated Classical Spin Models Ce Wang, Hui Zhai This work addresses whether artificial intelligence can recognize phase transitions without prior human knowledge. If successful, this approach could be applied, for instance, to analyze data from quantum simulations of unsolved physical models. Toward this goal, we first apply the machine learning algorithm to well-understood models and check whether the outputs are consistent with our prior knowledge, which serves as a benchmark of the approach. In this work, we feed the computer with data generated by classical Monte Carlo simulation of the XY model on frustrated triangular and union-jack lattices, which has two order parameters and exhibits two phase transitions. We show that the outputs of principal component analysis agree very well with our understanding of the different orders in the different phases, and that the temperature dependences of the major components detect the nature and locations of the phase transitions. Our work offers promise for using machine learning techniques to study sophisticated statistical models, and our results can be further improved by using principal component analysis with kernel tricks and the neural network method.
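As a rough illustration of the principal-component approach, the sketch below applies PCA to synthetic XY configurations whose angular spread stands in for temperature (the actual work uses Monte Carlo data on frustrated lattices, which this toy does not reproduce). The two leading components span the magnetization plane, so the projection radius tracks the order parameter and separates ordered from disordered samples:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 64          # spins per configuration
n_per_T = 100   # configurations per "temperature"

def configs(width):
    """Synthetic XY configurations: angles spread by `width` about a random direction.
    Small width mimics an ordered phase; width near pi mimics a disordered one."""
    theta0 = rng.uniform(0, 2 * np.pi, size=(n_per_T, 1))
    theta = theta0 + rng.uniform(-width, width, size=(n_per_T, N))
    # Feature vector per sample: all spin components (sx_1..sx_N, sy_1..sy_N).
    return np.concatenate([np.cos(theta), np.sin(theta)], axis=1)

widths = np.linspace(0.1, np.pi, 8)           # proxy for increasing temperature
X = np.vstack([configs(w) for w in widths])   # (800, 128) data matrix

# PCA via SVD of the mean-centred data matrix.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
proj = Xc @ Vt[:2].T                          # projection onto the two leading PCs

# Radius in the leading-PC plane, averaged per temperature: an order-parameter proxy.
r = np.linalg.norm(proj, axis=1).reshape(len(widths), n_per_T).mean(axis=1)
```

The monotone drop of `r` with the angular spread is the "temperature dependence of the major components" that locates the transition in the PCA-based analysis.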
Tuesday, March 6, 2018 12:39PM – 12:51PM
F34.00006: Machine Learning the Spin-glass State Humberto Munoz-Bauza, Firas Hamze, Helmut Katzgraber Machine learning techniques have found use in the supervised classification of condensed matter phases, generalizing from their training data to situations where Monte Carlo data may be hard to obtain or is unavailable. We present evidence that a convolutional neural network can characterize the spin-glass state and distinguish it from paramagnetic and ferromagnetic states in short-range three-dimensional Ising spin-glass models. As a generalization test, we find that such a neural network does not find a significant spin-glass state in the Gaussian-disordered Edwards–Anderson model in a field, in accord with previous Monte Carlo results attempting to find a de Almeida–Thouless line in short-ranged models.
Tuesday, March 6, 2018 12:51PM – 1:03PM
F34.00007: Extrapolating the properties of lattice polarons with Machine Learning Rodrigo Alejandro Vargas-Hernández, John Sous, Mona Berciu, Roman Krems Predicting a phase transition for condensed matter systems can be a relatively easy task for Machine Learning methods if they are trained with data from both phases. On the contrary, predicting a phase transition with only training data from one of the phases is challenging.
Tuesday, March 6, 2018 1:03PM – 1:15PM
F34.00008: The dangers of inadvertently poisoned training sets in physics applications Chao Fang, Helmut Katzgraber An increasing number of attacks on online services that use machine learning techniques rely on training set poisoning where an attacker manipulates a fraction of the training data to subvert the training process and, for example, overcome advanced spam filters. While these 
Tuesday, March 6, 2018 1:15PM – 1:27PM
F34.00009: Identification of phase transitions in molecular systems using unsupervised machine learning methods Nicholas Walker, Ka-Ming Tam, Mark Jarrell Recent advancements in data science and machine learning provide interesting new avenues for studying physical systems. Pattern recognition is a distinct advantage of using machine learning for data analysis. This feature can be leveraged to study phase transitions in molecular systems by assuming that a phase transition is accompanied by a change in structure, which is valid for many systems. For example, molecular dynamics (MD) simulations that encapsulate phase transitions provide data for calculating structure information at different temperatures. Various dimensionality reduction methods can be used to isolate the most statistically relevant features of the structure information in the sample space. From there, clustering methods allow for partitioning the structure information into different groups based on a similarity criterion. These results can be used in combination with the MD temperature information to predict a transition temperature. The advantage of this approach is that it requires minimal a priori information about the system and very little parameter tuning. For systems that are not yet well understood, such as high-entropy alloys, this is an important advantage.
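The pipeline this abstract describes (structure descriptors, then dimensionality reduction, then clustering, then a transition-temperature estimate) can be sketched end to end with synthetic data standing in for MD structure information. The descriptor values, temperature grid, and the minimal 2-means loop are illustrative assumptions, not the authors' actual analysis:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for per-frame structure descriptors from an MD trajectory:
# below the transition (temperature index 6) the descriptor sits near 0, above near 1.
temps = np.repeat(np.arange(11), 50)               # 11 temperature indices, 50 frames each
X = np.where(temps >= 6, 1.0, 0.0)[:, None] + 0.15 * rng.standard_normal((temps.size, 3))

# Dimensionality reduction: project onto the leading principal component.
Xc = X - X.mean(axis=0)
pc1 = np.linalg.svd(Xc, full_matrices=False)[2][0]
z = Xc @ pc1

# Minimal 2-means clustering in the reduced space, seeded at the extremes.
c = np.array([z.min(), z.max()])
for _ in range(25):
    labels = (np.abs(z - c[0]) > np.abs(z - c[1])).astype(int)
    c = np.array([z[labels == 0].mean(), z[labels == 1].mean()])

# Estimated transition: the temperature index where cluster membership jumps.
frac = np.array([labels[temps == t].mean() for t in range(11)])
t_est = int(np.argmax(np.abs(np.diff(frac))) + 1)
```

Note how little is specified in advance: no order parameter, just a structural descriptor, a similarity criterion, and the temperature tags, which is the minimal-a-priori-information advantage the abstract emphasizes.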
Tuesday, March 6, 2018 1:27PM – 1:39PM
F34.00010: Linear Scaling, Quantum-accurate Interatomic Potentials with SNAP: Accessing those Hard-to-reach Places in Classical Molecular Dynamics Mitchell Wood, Aidan Thompson Of the many ways to improve the predictive capacity of classical molecular dynamics (MD), progress in the fidelity of the underlying interatomic potential (IAP) seems to consistently lag behind the hardware and software advancements that expand the accessible length and time scales. In this talk we report on the current capability of SNAP, a machine-learned IAP that has been demonstrated to preserve quantum-mechanical accuracy for a number of different material systems, most notably bcc metals. Much of the SNAP development has focused on 'hard-to-reach' problems in classical MD; here we will show examples of materials in extreme environments, such as plasma-facing materials in fusion reactors and extremes of temperature and pressure.
Tuesday, March 6, 2018 1:39PM – 1:51PM
F34.00011: Learning Force Fields using Covariant Compositional Networks Brandon Anderson, Risi Kondor, Horace Pan, Shubhendu Trivedi, Truong Son Hy Deep neural networks have emerged as a powerful technique for augmenting molecular dynamics simulations. Recent work from our group has introduced the concept of a Covariant Compositional Network, a neural network architecture that by construction respects the symmetries of the underlying training data. These networks were shown to be competitive with recent deep learning techniques in quantum chemistry, including work from Google Brain. In this talk, we generalize these networks to learn potential energy surfaces and their corresponding force fields. Our technique produces highly accurate representations of molecular force fields. More generally, this provides a technique for developing a highly accurate universal force field for atomistic simulations.
Tuesday, March 6, 2018 1:51PM – 2:03PM
F34.00012: Towards Exact Molecular Dynamics Simulations with Machine-Learned Force Fields Stefan Chmiela, Huziel Sauceda, Klaus-Robert Müller, Alexandre Tkatchenko Molecular dynamics (MD) simulations employing classical force fields constitute the cornerstone of contemporary atomistic modeling. However, the predictive power of these simulations is only as good as the underlying interatomic potential. Classical potentials are based on mechanistic models of interatomic interactions, which often fail to faithfully capture key quantum effects in molecules and materials. Here we enable the direct construction of molecular force fields from high-level ab initio calculations by incorporating spatial and temporal physical symmetries into a gradient-domain machine learning (sGDML) model in a data-driven way, thus greatly reducing the intrinsic complexity of the force-field learning problem. The developed sGDML approach faithfully reproduces global force fields at quantum-chemical CCSD(T) level of accuracy [coupled cluster with single, double, and perturbative triple excitations] and for the first time allows converged molecular dynamics simulations at the level of fully quantized electrons and nuclei for flexible molecules with up to a few dozen atoms.
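Only loosely in the spirit of learned force fields, and emphatically not the sGDML model itself, the following toy sketch fits kernel ridge regression to energies of a 1D double-well potential and obtains forces from the analytic gradient of the kernel, so the predicted force field is exactly the negative gradient of the predicted energy surface. The potential, kernel width, and regularization are all illustrative choices:

```python
import numpy as np

# Toy 1D double-well standing in for an ab initio potential energy surface.
def energy(x):
    return (x**2 - 1.0)**2

x_train = np.linspace(-1.5, 1.5, 31)      # training "geometries" (1D configurations)
y_train = energy(x_train)

# Gaussian kernel ridge regression on the energies.
sigma, lam = 0.3, 1e-8
def kernel(a, b):
    return np.exp(-(a[:, None] - b[None, :])**2 / (2 * sigma**2))

alpha = np.linalg.solve(kernel(x_train, x_train) + lam * np.eye(x_train.size), y_train)

def predict_energy(x):
    return kernel(x, x_train) @ alpha

def predict_force(x):
    # F = -dE/dx; the Gaussian kernel's derivative is analytic, so the predicted
    # force is by construction the negative gradient of the predicted energy.
    d = x[:, None] - x_train[None, :]
    dk = -(d / sigma**2) * np.exp(-d**2 / (2 * sigma**2))
    return -(dk @ alpha)

x_test = np.linspace(-1.2, 1.2, 9)        # evaluation points inside the training range
```

Building the force as an exact gradient of the learned energy is what keeps MD with such a model energy-conserving; gradient-domain methods like sGDML go further by training on forces directly.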
Tuesday, March 6, 2018 2:03PM – 2:15PM
F34.00013: Noncovalent interactions across organic and biological subsets of chemical space: Physics-based potentials parametrized from machine learning Tristan Bereau, Robert Distasio, Alexandre Tkatchenko, O. Anatole Von Lilienfeld Computer simulations are increasingly acquiring the necessary speed and accuracy to tackle rational materials design. The screening of many compounds requires transferable force fields, so as to alleviate tedious parametrization efforts for every new compound. Here, we report on work aimed at optimizing classical intermolecular potentials that do away with a manual parametrization of every new molecule. By combining a machine-learning-based prediction of atomic properties with specific physics-based interactions, the model includes only 8 global parameters, optimized once and for all across compounds. The model is validated on gas-phase dimers, where chemical accuracy (1 kcal/mol) is reached for several datasets representative of noncovalent interactions in biologically relevant molecules. We further focus on hydrogen-bond complexes, essential but challenging due to their directional nature, where datasets of DNA base pairs and amino acids yield extremely encouraging results.
© 2018 American Physical Society