Bulletin of the American Physical Society
APS March Meeting 2019
Volume 64, Number 2
Monday–Friday, March 4–8, 2019; Boston, Massachusetts
Session H52: Machine Learning in Nonlinear Physics and Mechanics (Focus Session)

Sponsoring Units: GSOFT, GSNP
Chair: Shmuel Rubinstein, Harvard University
Room: BCEC 253B
Tuesday, March 5, 2019 2:30PM–3:06PM
H52.00001: A case study in neural networks for scientific data: generating atomic structures Invited Speaker: Tess Smidt Expertise in both the scientific domain of interest and deep learning techniques is essential in order to properly translate scientific problems into tasks amenable to deep learning. Scientific data carries a lot of context; for example, the laws of physics obey certain symmetries. Can the network learn this context from the data, or should we impose it as constraints in our network or training procedures? Additionally, the data representations, neural network operations, and loss functions appropriate for scientific applications can be very different from those most prevalent in the deep learning literature; when is it appropriate to use existing methods, and when is it necessary to develop new ones?
Tuesday, March 5, 2019 3:06PM–3:18PM
H52.00002: A computational model for crumpled thin sheets to complement data-driven machine learning Jovana Andrejevic, Jordan Hoffmann, Yohai Bar-Sinai, Lisa Lee, Shruti Mishra, Shmuel Rubinstein, Christopher Rycroft Crumpling is ubiquitous across length scales and diverse structures in nature, yet a complete theoretical description of the mechanisms underlying ridge formation remains elusive. To characterize the intricate damage networks of crumpled thin sheets, our recent work has shown that appropriate simulations can assist data-driven machine learning to overcome the scarcity of high-quality experimental data. Inspired by this data augmentation approach, here we detail a computational model for thin, viscoelastic sheets and demonstrate its ability to capture the properties and behavior of crumpled sheets. We validate the model's robustness through statistical comparison with high-quality experimental data, and discuss the prospects for its application in assisting data-driven machine learning.
Tuesday, March 5, 2019 3:18PM–3:30PM
H52.00003: Machine Learning in a data-limited regime: Augmenting experiments with synthetic data uncovers order in crumpled sheets Lisa Lee, Jordan Hoffmann, Yohai Bar-Sinai, Jovana Andrejevic, Shruti Mishra, Shmuel Rubinstein, Christopher Rycroft Machine learning is a powerful tool for uncovering structure in complex, high-dimensional data. However, a large amount of data is necessary in order to properly train a machine learning network, making it difficult to apply to experimental systems where data is limited. Here we resolve this difficulty by augmenting an experimental dataset with synthetically generated data from a simpler relevant system. Specifically, we study the local order in crease patterns of crumpled sheets, a paradigmatic example of spatial complexity. We supplement sparse crumpled experimental data with abundant simulated sheets of synthetic folds. This technique significantly improves the predictive power in a test problem of pattern completion, demonstrating the usefulness of machine learning in experiments where data may be scarce. Additionally, assessing the accuracy of networks trained with varying types of simulated data reveals the relevance of various physical rules to understanding crease patterns.
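The augmentation strategy described above, keeping every scarce experimental sample and padding the training set with abundant synthetic ones, can be sketched in a few lines. This is a minimal illustration, not the authors' pipeline; the mixing ratio and all names here are hypothetical.

```python
import random

def build_training_set(experimental, synthetic, synth_per_real=9, seed=0):
    """Mix scarce experimental samples with synthetic ones.

    Every experimental sample is kept; `synth_per_real` (a hypothetical
    knob, not a value from the paper) sets how many synthetic samples
    accompany each real one.
    """
    n_synth = min(synth_per_real * len(experimental), len(synthetic))
    rng = random.Random(seed)  # fixed seed for reproducibility
    return experimental + rng.sample(synthetic, n_synth)

# Toy usage: 10 "experimental" crease patterns, 1000 simulated sheets.
experimental = [("exp", i) for i in range(10)]
synthetic = [("sim", i) for i in range(1000)]
train = build_training_set(experimental, synthetic)
```

The point of the design is that the scarce real data is never subsampled; only the cheap synthetic pool is.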
Tuesday, March 5, 2019 3:30PM–3:42PM
H52.00004: Search and design of stretchable graphene kirigami using convolutional neural networks Paul Hanakata, Ekin Dogus Cubuk, David K Campbell, Harold Park Making kirigami-inspired cuts in a sheet has been shown to be an effective way to design stretchable materials with metamorphic properties, where the 2D shape can transform into complex 3D shapes. However, a systematic understanding of how cutting patterns alter the mechanical properties of the resulting kirigami remains elusive. Here, we use machine learning (ML) to approximate the objective functions, such as yield stress and yield strain, as functions of the cutting pattern[1]. Our approach enables the rapid discovery of graphene kirigami designs that yield extreme stretchability, as verified by molecular dynamics (MD) simulations. We find that convolutional neural networks (CNN) can be applied for regression to achieve an accuracy close to the precision of the MD simulations. This approach can then be used to search for optimal designs that maximize elastic stretchability with only 1000 training samples in a large design space of roughly 4,000,000 candidate designs. This example demonstrates the power and potential of ML in finding optimal kirigami designs at a fraction of the iterations that would be required of a purely MD- or experiment-based approach, where no prior knowledge of the governing physics is available.
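The search loop described above, training a cheap surrogate on a small labeled sample, ranking the full design space with it, and verifying only the top candidates with the expensive simulator, can be sketched as follows. The toy "simulator" and the crude per-site surrogate below are stand-ins for the paper's MD simulations and CNN; the design space and all parameters are hypothetical.

```python
import itertools
import random

N_SITES = 10  # toy design: each of 10 sites carries a cut (1) or not (0)

def simulate(design):
    """Expensive ground-truth objective (toy stand-in for MD stretchability):
    reward cuts at alternating sites, penalize adjacent cuts."""
    reward = sum(b for i, b in enumerate(design) if i % 2 == 0)
    penalty = sum(design[i] and design[i + 1] for i in range(N_SITES - 1))
    return reward - 2 * penalty

def fit_surrogate(samples):
    """Per-site difference of mean objectives -- a crude linear surrogate,
    standing in for the CNN regressor."""
    weights = []
    for i in range(N_SITES):
        on = [y for x, y in samples if x[i] == 1]
        off = [y for x, y in samples if x[i] == 0]
        w = (sum(on) / len(on) if on else 0) - (sum(off) / len(off) if off else 0)
        weights.append(w)
    return lambda design: sum(w * b for w, b in zip(weights, design))

rng = random.Random(1)
space = list(itertools.product([0, 1], repeat=N_SITES))  # 2^10 = 1024 designs
train = rng.sample(space, 100)                           # small labeled set
surrogate = fit_surrogate([(x, simulate(x)) for x in train])

# Rank the whole space cheaply; verify only the top 20 expensively.
top = sorted(space, key=surrogate, reverse=True)[:20]
best = max(top, key=simulate)
```

The expensive function is called only on the labeled sample and the short verification list, which is the source of the claimed savings over exhaustive MD search.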
Tuesday, March 5, 2019 3:42PM–3:54PM
H52.00005: Clog prediction in granular hoppers using machine learning methods Jesse Hanlan, Douglas Durian Grains discharge from a hopper at constant rate, proceeding probabilistically until a stable arch forms. Thomas and Durian (PRL 2015) showed that recasting the characteristic measure of a flow event from the average mass discharged to the fraction of flow microstates that precede (i.e. cause) a clog explains why the former grows as an exponential function of hole diameter, rather than a critical power law. This makes clear that clogs form as the flow brings new microstates into the vicinity of the outlet, which are randomly sampled until a stable arch is found. Characterizing the flow microstates that cause clogs should then better inform a predictive framework. As a first step, Koivisto and Durian (PRE 2017) found that the same statistics governed hoppers in air or submerged in a viscous fluid. This implies that clog formation depends primarily on position degrees of freedom; however, the phase space of grain microstates in a hopper is extremely high-dimensional. Here, we apply deep learning to probe the function space of position microstates in a two-dimensional hopper, to identify and separate out characteristic structures responsible for clogging. Preliminary analysis of a small dataset gives a cross-validation success rate of 90%.
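The microstate picture above implies a geometric waiting time: if each flow microstate near the outlet independently causes a clog with probability f, the mean number of microstates sampled before clogging is 1/f, so the mean discharge grows exponentially whenever f falls exponentially with hole diameter. A toy numerical check of that waiting-time claim (the value of f here is arbitrary, not from the paper):

```python
import random

rng = random.Random(0)

def microstates_until_clog(f):
    """Count sampled microstates until one causes a clog (probability f each)."""
    n = 1
    while rng.random() >= f:
        n += 1
    return n

f = 0.02
trials = [microstates_until_clog(f) for _ in range(20000)]
mean_wait = sum(trials) / len(trials)  # expect about 1/f = 50
```

This is only the statistical scaffolding of the argument; the talk's contribution is identifying which position microstates carry a high f.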
Tuesday, March 5, 2019 3:54PM–4:06PM
H52.00006: Tracking Topological Defects in 2D Active Nematics Using Convolutional Neural Networks Ruoshi Liu, Pengyu Hong, Michael Norton, Seth Fraden The motion of topological defects in active nematics has been modeled by a number of hydrodynamic theories that remain to be fully tested, which we will do by measuring defect dynamics using video microscopy and comparing with theory. However, classical image processing techniques are cumbersome, as variations in image quality and defect morphology require fine tuning for each data set and impede high-throughput processing of experimental data. Here, we use a Convolutional Neural Network (CNN) to efficiently and precisely measure the locations of active defects. We construct a deep CNN to train a defect detector to automatically analyze videos of a microtubule-based active nematic. We labeled 8800 images, from which we selected 6600 as the training dataset and 1700 as the testing dataset. To obtain higher precision, we also consider the temporal relation between the locations of defects in consecutive frames and train our CNN correspondingly. We compare results obtained with the CNN to results generated by traditional image processing algorithms.
(Author Not Attending)

H52.00007: Connecting structure and dynamics in a model of confluent cell tissues using machine learning Tristan A Sharp, Andrea J Liu Cellular motion in dense tissues often consists of neighbor-swapping events or rearrangements. The rearrangements underlie the glassy dynamics and much of the collective dynamics of dense disordered cellular packings. Here we present a machine learning (ML) approach that links the local disordered structure surrounding a cell with the propensity of the cell to rearrange in a Voronoi cell vertex model. "Softness," S, an ML-derived quantity originally introduced to quantify the link between local structure and rearrangements in inert glassy liquids, provides an effective proxy for a cell's probability to rearrange, P_R. The local structural features that determine the softness of a cell are quantified. Decreasing temperature lowers P_R for a given value of S, but the distribution of S also shifts up, opposing the change, leading to previously observed sub-Arrhenius dynamics. This contrasts with the behavior of Lennard-Jones glassy liquids, where the distribution of S shifts down with decreasing temperature, leading to super-Arrhenius dynamics.
Tuesday, March 5, 2019 4:18PM–4:30PM
H52.00008: Design and learning in multistable mechanical networks Menachem Stern, Matthew Pinson, Arvind Murugan Systems with multiple stable states have proven essential in biological and engineering contexts to perform multiple functions, store memories, and so on. In this work we contrast two paradigms for creating systems with specific stable states in elastic mechanical networks: design and learning. In the design framework, all desired stable states are known in advance, and thus material parameters can be optimized on a computer. In contrast, our learning framework considers the sequential introduction of desired stable states, so that material parameters must be incrementally updated to stabilize each additional state. We show that designed states are optimally stable within elastic networks with Hookean springs. However, incremental learning requires springs with strong nonlinearity. We interpret such nonlinearity as biasing the distribution of strain in these elastic networks to be localized in a sparse subset of springs, much like a Bayesian prior in sparse regression. In this way, we identify principles for practical implementations of learned multistability.
Tuesday, March 5, 2019 4:30PM–4:42PM
H52.00009: DropNet: A neural network solution to flow instabilities Maxime Lavech du Bos, Joel Marthelot, Pierre-Thomas Brun Forecasting phenomena arising from nonlinear dynamical systems is a daunting task. Formal models are rarely analytically tractable and
Tuesday, March 5, 2019 4:42PM–4:54PM
H52.00010: Visualizing probabilistic models and data with Intensive Principal Component Analysis (InPCA) Katherine Quinn, Colin Clement, Francesco De Bernardis, Michael D Niemack, James Patarasp Sethna Unsupervised learning makes manifest the underlying structure of data without curated training and specific problem definitions. However, the inference of relationships between data points is frustrated by the "curse of dimensionality" in high dimensions. Inspired by replica theory from statistical mechanics, we consider replicas of the system to tune the dimensionality and take the limit as the number of replicas goes to zero. The result is the intensive embedding, which is not only isometric (preserving local distances) but allows global structure to be more transparently visualized. We develop the Intensive Principal Component Analysis (InPCA) and demonstrate clear improvements in visualizations of the Ising model of magnetic spins, a neural network, and the dark energy cold dark matter (ΛCDM) model as applied to the Cosmic Microwave Background.
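At a computational level, an embedding of this kind can be built like classical multidimensional scaling: double-center a matrix of squared pairwise statistical distances and eigendecompose, keeping directions with negative eigenvalues as the "time-like" axes of an intensive (Minkowski-like) embedding. The sketch below applies that machinery to a generic squared-distance matrix; it does not reproduce the specific replica-limit distance of the talk, so treat it as schematic scaffolding, not the authors' method.

```python
import numpy as np

def mds_like_embedding(D2, k=2):
    """Embed points from a matrix D2 of squared pairwise distances.

    Double-center D2 (as in classical MDS) and eigendecompose. The sign
    of each retained eigenvalue distinguishes space-like from time-like
    directions; here we simply scale coordinates by sqrt(|lambda|).
    """
    n = D2.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    W = -0.5 * J @ D2 @ J                    # centered Gram-like matrix
    evals, evecs = np.linalg.eigh(W)
    order = np.argsort(-np.abs(evals))       # largest |eigenvalue| first
    evals, evecs = evals[order], evecs[:, order]
    coords = evecs[:, :k] * np.sqrt(np.abs(evals[:k]))
    return coords, evals[:k]

# Toy usage: 4 points on a line, with exact squared Euclidean distances.
x = np.array([0.0, 1.0, 2.0, 4.0])
D2 = (x[:, None] - x[None, :]) ** 2
coords, evals = mds_like_embedding(D2, k=2)
```

For Euclidean input, as here, all retained eigenvalues are non-negative and the pairwise distances are reproduced exactly; the intensive embedding's novelty appears when the distance matrix is non-Euclidean and negative eigenvalues survive.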
Tuesday, March 5, 2019 4:54PM–5:06PM
H52.00011: Physical Symmetries Embedded in Neural Networks Marios Mattheakis, David Sondak, Pavlos Protopapas Artificial neural networks (ANNs) have become indispensable tools in many machine learning applications, and in recent years ANNs have become an active area of research in the physical sciences. An important consideration for building ANNs for scientific applications is how to incorporate non-negotiable physical constraints. We propose ANNs with embedded physical symmetries including even-odd symmetry, time-reversibility, positivity, energy-momentum conservation, and Galilean invariance. We constrain the weights of the NN so that the physical properties are exactly satisfied by the neural network output. Furthermore, embedding constraints into the NN can drastically reduce the search space, thereby providing an efficient deep learning architecture.
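To make the idea of an exactly satisfied symmetry concrete, here is a minimal sketch for even-odd symmetry, one of the classes listed above. Note the hedge: the authors constrain the weights themselves, whereas the wrapper below symmetrizes the network's output, a simpler alternative route that also holds exactly for any weights. The tiny hand-rolled network is purely illustrative.

```python
import math
import random

rng = random.Random(0)
# A toy one-hidden-layer tanh network R -> R with random (untrained) weights.
W1 = [rng.uniform(-1, 1) for _ in range(8)]
b1 = [rng.uniform(-1, 1) for _ in range(8)]
W2 = [rng.uniform(-1, 1) for _ in range(8)]

def net(x):
    """Unconstrained network: no symmetry guaranteed."""
    hidden = [math.tanh(w * x + b) for w, b in zip(W1, b1)]
    return sum(w * h for w, h in zip(W2, hidden))

def even_net(x):
    """Even-symmetric wrapper: f(-x) = f(x) holds exactly, by construction."""
    return 0.5 * (net(x) + net(-x))

def odd_net(x):
    """Odd-symmetric wrapper: f(-x) = -f(x) holds exactly, by construction."""
    return 0.5 * (net(x) - net(-x))
```

Because the constraint is structural rather than learned, no amount of training error can violate it, which is the sense in which the search space shrinks.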
Tuesday, March 5, 2019 5:06PM–5:18PM
H52.00012: Maximizing thermal efficiency of heat engines using neuroevolutionary strategies for reinforcement learning Christopher Beeler, Uladzimir Yahorau, Rory Coles, Kyle Mills, Isaac Tamblyn Classic control problems such as Mountain Car [1] and Acrobot [2] are based on simple Newtonian physics. Both have been solved previously with reinforcement learning algorithms. Here, we show that reinforcement learning can also be used to solve classical problems in thermodynamics. Using a reinforcement learning method based on genetic algorithms, our software agent can learn to reproduce thermodynamic cycles without prior knowledge of physical laws. We have created a simulated learning environment which models a simple piston, where an agent can activate thermodynamic processes. With this method, we were able to optimize an artificial neural network based policy to maximize the thermal efficiency for several different cases. Depending on the actions available to the agent, different known cycles emerged, including the Carnot, Stirling, and Otto cycles. Importantly, we show an example of how reinforcement learning can be used to aid scientists in finding solutions to problems that have yet to be fully explored. In one of the heat engine environments, we introduced a non-adiabatic process which caused the engine to lose energy. In this case, the agent produced what is, to the best of our knowledge, the best solution for the problem.
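The neuroevolutionary loop described above, mutate a population of policies and keep the fittest, can be sketched generically. In this hypothetical toy, a short parameter vector stands in for the policy network's weights and a simple quadratic score stands in for the thermal efficiency of a simulated engine cycle; none of the numbers come from the talk.

```python
import random

rng = random.Random(42)
TARGET = [0.7, -0.2, 0.5]  # hypothetical optimal "policy weights"

def fitness(params):
    """Toy stand-in for thermal efficiency: peaks at 1.0 when params hit TARGET."""
    return 1.0 - sum((p - t) ** 2 for p, t in zip(params, TARGET))

def evolve(pop_size=30, generations=200, sigma=0.1):
    """Plain genetic algorithm: rank, keep the elite, refill by mutation."""
    pop = [[rng.uniform(-1, 1) for _ in range(3)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 5]  # keep the top 20% unchanged
        pop = elite + [
            [p + rng.gauss(0, sigma) for p in rng.choice(elite)]
            for _ in range(pop_size - len(elite))
        ]
    return max(pop, key=fitness)

best = evolve()
```

No gradient information is used, which is why such methods can optimize through non-differentiable environment dynamics like discrete choices of thermodynamic process.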
© 2019 American Physical Society