Bulletin of the American Physical Society
APS March Meeting 2021
Volume 66, Number 1
Monday–Friday, March 15–19, 2021; Virtual; Time Zone: Central Daylight Time, USA
Session B60: AI and Statistical/Thermal Physics (Focus Session; Live)
Sponsoring Units: GDS, GSNP, DCOMP
Chair: Wolfgang Losert, University of Maryland, College Park
Monday, March 15, 2021, 11:30AM - 12:06PM
B60.00001: Learning about learning by many-body systems Invited Speaker: Nicole Yunger Halpern Diverse many-body systems, from soap bubbles to suspensions to polymers, learn and remember patterns in the drives that push them far from equilibrium. This learning may be leveraged for computation, memory, and engineering. Until now, many-body learning has been detected with thermodynamic properties, such as work absorption and strain. We progress beyond these macroscopic properties first defined for equilibrium contexts: We quantify statistical mechanical learning using representation learning, a machine-learning model in which information squeezes through a bottleneck. By calculating properties of the bottleneck, we measure four facets of many-body systems' learning: classification ability, memory capacity, discrimination ability, and novelty detection. Numerical simulations of a classical spin glass illustrate our technique. This toolkit exposes self-organization that eludes detection by thermodynamic measures: Our toolkit more reliably and more precisely detects and quantifies learning by matter.
Monday, March 15, 2021, 12:06PM - 12:42PM
B60.00002: Can artificial intelligence learn and predict molecular dynamics? Invited Speaker: Pratyush Tiwary The ability to rapidly learn from high-dimensional data to make reliable predictions about the future of a given system is crucial in many contexts. This could be a fly avoiding predators, or the retina processing terabytes of data almost instantaneously to guide complex human actions. In this talk we draw parallels between such tasks and the efficient sampling of complex molecules with hundreds of thousands of atoms. Such sampling is critical for predictive computer simulations in condensed matter physics and biophysics, including but not limited to problems such as crystal nucleation and drug unbinding. For this we use the Predictive Information Bottleneck (PIB) and long short-term memory (LSTM) frameworks from artificial intelligence and re-formulate them for the sampling of biomolecular structure and dynamics, especially when plagued with rare events. We demonstrate the methods on different test pieces, where we calculate dissociation pathways and timescales much longer than milliseconds. These include ligand dissociation from the protein lysozyme and from flexible RNA.
Monday, March 15, 2021, 12:42PM - 12:54PM
B60.00003: Optimal machine intelligence near the edge of chaos Ling Feng, Lin Zhang, Choy Heng Lai We develop a general theory revealing that, for generic non-linear systems, the exact edge of chaos is the boundary between the chaotic phase and the (pseudo)periodic phase arising from a Neimark-Sacker bifurcation. This edge is determined analytically by the asymptotic Jacobian norm values of the non-linear operator and is influenced by the dimensionality of the system. The optimality at the edge of chaos is associated with the highest information transfer between input and output at this point, inferred from the maximal information content of the system's asymptotic periodic states, similar to that of the logistic map. As empirical validation, our experiments on deep learning models in computer vision, trained on benchmark datasets, demonstrate the optimality of models near the edge of chaos. Our finding contributes to the theoretical and empirical foundations of the edge-of-chaos hypothesis, while our theory provides a fundamental understanding of machine intelligence in deep learning.
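The role of the asymptotic Jacobian norm described in this abstract can be illustrated on the simplest nonlinear system, the logistic map: the asymptotic average of the log-Jacobian-norm (the Lyapunov exponent) is negative in the (pseudo)periodic phase and positive in the chaotic phase, with the edge of chaos at zero. A minimal sketch (illustrative only, not the authors' code; the map and parameter values are assumptions for demonstration):

```python
import math

def lyapunov_logistic(r, x0=0.5, burn_in=500, n=2000):
    """Asymptotic average of log|f'(x)| for the logistic map f(x) = r*x*(1-x).
    Negative => (pseudo)periodic phase; positive => chaotic phase;
    the edge of chaos sits where this quantity crosses zero."""
    x = x0
    for _ in range(burn_in):          # discard the transient
        x = r * x * (1 - x)
    total = 0.0
    for _ in range(n):
        total += math.log(abs(r * (1 - 2 * x)))   # log of the 1D Jacobian norm
        x = r * x * (1 - x)
    return total / n
```

For example, `lyapunov_logistic(3.2)` is negative (stable period-2 orbit) while `lyapunov_logistic(3.9)` is positive (chaos); scanning `r` locates the boundary between the two phases.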
Monday, March 15, 2021, 12:54PM - 1:06PM
B60.00004: Using learning by confusion to identify the order of a phase transition Monika Richter-Laskowska, Maciej Maska Conventional methods of classifying phase transitions rely on identifying order parameters and singularities in the free energy and its derivatives. Recently, artificial neural networks have proven to be an efficient tool for this task: it has been shown that properly trained neural networks can precisely determine the critical temperature. One approach is based on "learning by confusion", a combination of supervised and unsupervised learning. We show that the degree of the neural network's "confusion" can be used to determine the order of the phase transition. For a few selected models we demonstrate how this method distinguishes between first- and second-order phase transitions.
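The learning-by-confusion scheme referenced here can be sketched with a toy model: sweep a trial transition temperature, relabel the data as if the transition sat there, and record the best achievable classification accuracy; the resulting curve has the characteristic W shape whose central peak marks the true transition. A minimal illustration (the synthetic "magnetization" data and the single-threshold classifier are assumptions for demonstration, not the authors' setup):

```python
import numpy as np

rng = np.random.default_rng(0)
Tc_true = 1.0
T = np.linspace(0.0, 2.0, 400)
# toy "magnetization": finite below Tc_true, near zero above, plus noise
m = np.where(T < Tc_true, 0.8, 0.0) + 0.15 * rng.standard_normal(T.size)

def confusion_accuracy(T_star):
    """Best accuracy of a single-threshold classifier trained on labels
    that assume the transition sits at the trial temperature T_star."""
    y = (T < T_star)                          # proposed (possibly wrong) labels
    thresholds = np.linspace(m.min(), m.max(), 200)
    # brute-force the optimal threshold, trying both label orientations
    return max(max(np.mean((m > t) == y), np.mean((m < t) == y))
               for t in thresholds)

# sweeping the trial temperature traces the W-shaped accuracy curve:
# near-perfect at the sweep's edges (labels almost uniform) and at Tc_true
accs = [confusion_accuracy(ts) for ts in np.linspace(0.05, 1.95, 39)]
```

Here `confusion_accuracy(Tc_true)` is high because the proposed labels match the data's actual structure, while intermediate trial temperatures force the classifier to mislabel a whole band of samples, producing the dip on either side of the central peak.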
Monday, March 15, 2021, 1:06PM - 1:18PM
B60.00005: Asymptotic stability of the neural network and its generalization power Lin Zhang, Ling Feng, Kan Chen, Choy Heng Lai The generalization power of neural networks is of great importance in both the theoretical and practical development of neural network models. Through the theoretical framework of dynamical stability analysis, we find that a fully connected neural network's generalization power is associated with its asymptotic stability: a network that is more unstable at its asymptotic fixed points has lower generalization power due to the emergence of chaotic behavior. We further find that the neural network's training is a random-walk-like diffusion process toward chaos in parameter space, and that the regularization technique of weight decay effectively reverses it. Specifically, regularization is effective only in pulling the model out of the unstable phase; once the model is in the stable phase, test losses are similar regardless of how large the regularization strength is. Therefore, a model at the boundary of stability achieves a balance between underfitting and overfitting. Based on this, we propose a method to calculate a lower bound on the regularization strength that maintains the model at the boundary of stability. Lastly, an analogy with spin glasses also explains why the training process deviates from random-walk behavior after it enters the chaotic phase.
Monday, March 15, 2021, 1:18PM - 1:30PM
B60.00006: Renormalized Mutual Information for Artificial Scientific Discovery Leopoldo Sarra, Andrea Aiello, Florian Marquardt From statistical physics to hydrodynamics, collective coordinates are among the most useful general concepts in the analysis of physical systems. However, there is no general procedure to find them; they are usually engineered by hand. These low-dimensional "features" can be defined as low-dimensional variables that preserve the largest mutual information with the original coordinates of the system. However, mutual information is ill-defined when one continuous random variable depends deterministically on the other. We develop a new "renormalized" version, which has the same physical meaning but remains finite. This quantity can be used to find out how useful a given macroscopic quantity would be in characterizing a system. In addition, we can employ a neural network to parametrize the feature function and optimize it to obtain the best feature. This high-level representation is learned in a completely unsupervised way. We show examples involving many-particle systems and fluctuating fields, but the technique has potential applications not only in the most diverse physical scenarios, from statistical physics to dynamical systems or even quantum mechanics, but also as a tool to study neural networks themselves from an information-theoretic perspective.
Monday, March 15, 2021, 1:30PM - 1:42PM
B60.00007: How neural nets compress invariant manifolds Jonas Paccolat, Leonardo Petrini, Mario Geiger, Kevin Tyloo, Matthieu Wyart The success of neural networks is often attributed to their ability to learn relevant features from data while becoming insensitive to invariants by compressing them.
Monday, March 15, 2021, 1:42PM - 1:54PM
B60.00008: Perturbation Theory for the Information Bottleneck Vudtiwat Ngampruetikorn, David J. Schwab Extracting relevant information from data is crucial for all forms of learning. The information bottleneck (IB) method formalizes this, offering a mathematically precise and conceptually appealing framework for understanding learning phenomena. However, the nonlinearity of the IB problem makes it computationally expensive and analytically intractable in general. Here we derive a perturbation theory for the IB method and report new analytical results for the learning onset, the limit of maximum relevant information per bit extracted from data. We test our results on synthetic probability distributions, finding good agreement with the exact numerical solution near the onset of learning. Our work also provides a fresh perspective on the intimate relationship between the IB method and the strong data processing inequality.
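For context, the exact numerical solution mentioned here is conventionally obtained by the standard self-consistent IB iterations: alternately update the encoder q(t|x), the marginal q(t), and the decoder q(y|t). Below the learning onset the encoder collapses to the trivial solution (I(T;X) = 0); above it, the relevant information I(T;Y) becomes nonzero. A sketch on a toy joint distribution (the distribution and trade-off parameter beta values are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)

def mi(pab):
    # mutual information (in nats) of a discrete joint distribution
    pa = pab.sum(1, keepdims=True)
    pb = pab.sum(0, keepdims=True)
    mask = pab > 0
    return float((pab[mask] * np.log(pab[mask] / (pa @ pb)[mask])).sum())

def ib(pxy, beta, n_t=2, iters=500):
    """Self-consistent IB iterations (Blahut-Arimoto style).
    Returns (I(T;X), I(T;Y)) at the converged encoder."""
    nx, ny = pxy.shape
    px = pxy.sum(1)
    py_x = pxy / px[:, None]                        # p(y|x)
    qt_x = rng.dirichlet(np.ones(n_t), size=nx)     # random initial encoder q(t|x)
    for _ in range(iters):
        qt = px @ qt_x                              # marginal q(t)
        qy_t = (qt_x * px[:, None]).T @ py_x / qt[:, None]    # decoder q(y|t)
        # D_KL( p(y|x) || q(y|t) ) for every pair (x, t)
        kl = (py_x[:, None, :] * np.log(py_x[:, None, :] / qy_t[None, :, :])).sum(-1)
        qt_x = qt[None, :] * np.exp(-beta * kl)     # encoder update
        qt_x /= qt_x.sum(1, keepdims=True)
    qxt = qt_x * px[:, None]                        # joint q(x, t)
    qty = (qt_x * px[:, None]).T @ py_x             # joint q(t, y)
    return mi(qxt), mi(qty)

# symmetric toy channel with I(X;Y) ~ 0.193 nats
pxy = np.array([[0.4, 0.1], [0.1, 0.4]])
ITX, ITY = ib(pxy, beta=10.0)
```

Well above the onset (large beta) the encoder becomes nearly deterministic and I(T;Y) approaches I(X;Y); rerunning with a small beta (e.g. 0.5) yields the trivial solution with I(T;X) near zero, which is the collapse the perturbation theory characterizes analytically.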
Monday, March 15, 2021, 1:54PM - 2:06PM
B60.00009: Real-space mutual information neural estimation algorithm for single-step extraction of renormalisation group-relevant degrees of freedom Doruk Efe Gokmen, Zohar Ringel, Sebastian Huber, Maciej Koch-Janusz Deriving the emergent macroscopic properties of matter from microscopic models of interacting constituents is a perpetual theoretical challenge. In statistical physics a powerful framework for addressing it is provided by the renormalization group (RG), which relates physical theories at different scales by recursively combining local degrees of freedom. It has been proposed that an optimal RG rule for a given system maximises the real-space mutual information (RSMI). Here we show that the RSMI-optimal coarse-graining is a remarkable object in itself, extracting the relevant degrees of freedom in a single step instead of relying on the iterative nature of the RG procedure. It can be used to characterise spatial correlations, locate phase transitions and construct order parameters. In this sense it allows one to extract and interpret low-energy physics at the outset of the RG flow. We develop an efficient numerical algorithm based on recent rigorous results on mutual information estimation with neural networks. We validate its capabilities on an interacting lattice dimer model with a nontrivial RG flow and discuss further applications, including to non-equilibrium problems. Our findings introduce a new conceptual paradigm and a numerical tool for investigating statistical systems.
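The neural mutual-information estimation this abstract builds on rests on variational lower bounds such as the Donsker-Varadhan representation, I(X;Y) >= E_p[T] - log E_{p x p}[e^T], optimized over a critic function T. A minimal sketch with a hand-picked one-parameter critic on correlated Gaussians, where the exact answer is known (the critic family and data are illustrative assumptions, not the RSMI algorithm itself):

```python
import numpy as np

rng = np.random.default_rng(2)
rho = 0.8
n = 20000
x = rng.standard_normal(n)
y = rho * x + np.sqrt(1 - rho**2) * rng.standard_normal(n)
y_shuffled = rng.permutation(y)      # samples from the product p(x)p(y)

def dv_bound(a):
    # Donsker-Varadhan lower bound with the one-parameter critic T(x,y) = a*x*y:
    # E_p[T] estimated on paired samples, log E_{pxp}[e^T] on shuffled pairs
    return np.mean(a * x * y) - np.log(np.mean(np.exp(a * x * y_shuffled)))

# optimize the critic over a coarse grid (kept below 1 so the
# exponential moment of the Gaussian products stays finite)
best = max(dv_bound(a) for a in np.linspace(0.1, 0.9, 17))
true_mi = -0.5 * np.log(1 - rho**2)  # exact MI of correlated Gaussians, ~0.511 nats
```

The optimized bound is positive but stays below the exact value, since the restricted critic family cannot represent the true log-density ratio; replacing the scalar parameter with a neural network, as in MINE-style estimators, tightens the bound.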
Monday, March 15, 2021, 2:06PM - 2:18PM
B60.00010: Deep learning in phase transition prediction of disordered materials Serveh Kamrava, Muhammad Sahimi Percolation and fracture propagation in disordered solids represent important problems in science and engineering that are characterized by a phase transition: the loss of macroscopic connectivity at the percolation threshold pc. An important unsolved problem is the accurate prediction of physical properties of systems undergoing such transitions, given limited data far from the transition point. There is currently no theoretical method that can use limited data from a region far from the transition point pc and predict the physical properties all the way to that point, including its location. We present a deep neural network (DNN) for predicting such properties of two- and three-dimensional systems, in particular their percolation probability and the threshold pc. All the predictions are in excellent agreement with the data. This opens up the possibility of using DNNs to predict the physical properties of many types of disordered materials that undergo phase transformations and for which limited data are available only far from the transition point.