Bulletin of the American Physical Society
APS March Meeting 2022
Volume 67, Number 3
Monday–Friday, March 14–18, 2022; Chicago
Session N09: Predicting Nonlinear and Complex Systems with Machine Learning II (Focus Session; Recordings Available)
Sponsoring Units: GSNP, DSOFT, DCOMP
Chair: Ying-Cheng Lai, Arizona State University
Room: McCormick Place W-180
Wednesday, March 16, 2022 11:30AM - 12:06PM
N09.00001: Choosing Optimal Reservoir Computers Invited Speaker: Thomas L Carroll A reservoir computer is a high dimensional dynamical system used for computation. Typically a reservoir computer is created by connecting a large number of nonlinear nodes in a network. There can be hundreds to thousands of nodes, so optimizing the structure of the reservoir computer is difficult. There are a number of conventional rules for optimizing a reservoir computer based on experience with simulations, but these rules are based on observations of a limited number of node nonlinearities. One feature of reservoir computers is that they may be built by connecting together analog nodes such as lasers, quantum dots, memristors, or other devices. There is a great range of possible nonlinear functions describing these nodes, so design rules beyond the conventional wisdom are required.
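The abstract's basic object, a large random network of nonlinear nodes in which only a linear readout is trained, can be sketched in a few lines. The node count, spectral radius, regularization, and toy task below are illustrative assumptions, not values from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative choices (not from the talk): 200 tanh nodes,
# spectral radius 0.9, ridge regularization 1e-6.
N, rho, ridge = 200, 0.9, 1e-6

# Random recurrent weights, rescaled to the target spectral radius.
A = rng.normal(size=(N, N))
A *= rho / max(abs(np.linalg.eigvals(A)))
W_in = rng.uniform(-0.5, 0.5, size=N)

def run_reservoir(u):
    """Drive the reservoir with the scalar input sequence u."""
    r = np.zeros(N)
    states = np.empty((len(u), N))
    for i, ui in enumerate(u):
        r = np.tanh(A @ r + W_in * ui)
        states[i] = r
    return states

# Toy task: predict sin(t + dt) from sin(t).
t = np.linspace(0, 40, 2000)
u, y = np.sin(t[:-1]), np.sin(t[1:])
R = run_reservoir(u)

# Only the linear readout is trained, by ridge regression, after
# discarding a short washout transient.
W_out = np.linalg.solve(R[50:].T @ R[50:] + ridge * np.eye(N),
                        R[50:].T @ y[50:])
err = np.max(np.abs(R[50:] @ W_out - y[50:]))
```

Choosing N, rho, the input scaling, and the node nonlinearity itself is exactly the design problem the talk addresses; the conventional rules of thumb (spectral radius near 1, sparse random coupling) were developed for tanh-like nodes and need not transfer to lasers or memristors.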
Wednesday, March 16, 2022 12:06PM - 12:18PM
N09.00002: Physical Reservoir Computing with Over-Moded Complex Systems Shukai Ma, Thomas M Antonsen, Steven M Anlage, Edward Ott The rapid development of machine learning (ML) technologies has enabled high-performance computation across a wide range of tasks, albeit at ever greater computational cost. The execution of ML algorithms depends largely on the computing `substrate', which is often not optimized for ML workloads. The investigation of alternative computing methods with tailored physical structures has therefore attracted a great deal of research interest.
Wednesday, March 16, 2022 12:18PM - 12:30PM
N09.00003: Data-driven Surrogate Modeling for Nonlinear Material Systems in Unconventional Computing Philip Buskohl, Benjamin Grossmann, Daniel Nelson, Amanda Criner, Timothy J Vincent, Andrew Gillman Input-driven physical systems that exhibit nonlinear behavior are prime candidates for demonstrating unconventional computing concepts such as physical implementations of reservoir computing. Reservoir computing is a class of recurrent neural networks that utilize a simple readout layer training approach, opening the door to emulating this form of signal processing in atypical materials and physics. However, the large parameter space of these systems and the high computational expense of characterizing their performance present a challenge to efficiently matching the nonlinear dynamics with optimal computing tasks. To address this challenge, we leverage a data-driven, physics-agnostic modeling technique based on Koopman theory to produce a low-cost surrogate of the reservoir dynamics for benchmarking. This approach combines an underlying linear time invariant system with a nonlinear mapping. Preliminary results indicate a 150x speed-up in time series data generation of the network dynamics compared to Runge-Kutta integration, which is critical for efficient benchmarking of reservoir performance (since benchmarking requires repeated simulation). Moreover, as we will demonstrate, the model parameters can be directly interpreted to yield insight into the behavior of the system of interest.
Wednesday, March 16, 2022 12:30PM - 12:42PM
N09.00004: Koopman Theory and Predictive Equivalence: Learning Implicit Models of Complex Systems from Partial Observations Adam Rupe, Velimir V Vesselinov, James P Crutchfield Only a subset of degrees of freedom is typically accessible in the real world, so the proper setting for empirical modeling is that of partially observed systems. To predict the future behavior with a physics simulation model, the missing degrees of freedom must be explicitly accounted for using data assimilation and model parameterization. Recently, data-driven models have consistently outperformed physics simulations for systems with few observable degrees of freedom (e.g. hydrological systems). Here we provide an operator-theoretic explanation for this empirical success. Delay-coordinate embeddings and their evolution under the Koopman operator implicitly model the effects of the missing degrees of freedom. For complex and high-dimensional systems, data-driven models must accommodate noise, and this naturally leads to the concept of predictive equivalence. We employ a far-from-equilibrium Mori-Zwanzig formalism to show how predictive equivalence and the Koopman operator create a physically-consistent stochastic model for the observed degrees of freedom. This work clarifies the relationship between implicit data-driven models and explicit physics simulations. Our causal state analog forecasting algorithm demonstrates these results in practice on real and synthetic data.
Wednesday, March 16, 2022 12:42PM - 12:54PM
N09.00005: Local Flow Environment as Information Processing Medium Timothy J Vincent, Philip Buskohl, Benjamin Grossmann, Daniel Nelson, Benjamin Dickinson, Jeffery Baur, Alexander Pankonien Recent advances in sensor technology have opened the door to applying distributed sensing to bodies in complex flow environments. However, the increased computational burden of processing the resulting information is not trivial. Using the mathematical framework of reservoir computing, we demonstrate that it is possible to offload some of this burden to the dynamics of the environment. The local flow environment may be treated as a non-linear operator which maps an input signal, a perturbation of the mean external flow, into a high dimensional latent space. A complex non-linear output function, such as a control signal, may then be computed by strategically sampling this space with a distributed sensor array and combining the readouts using a simple weighted sum. In this work, we demonstrate this concept using computational simulations of flow in an open cavity and explore how sensor placement and cavity geometry affect the information processing capability. We find that placing an array of sensors at the bottom of the cavity gives better computational performance on a benchmark non-linear task than uniform sampling throughout the volume.
Wednesday, March 16, 2022 12:54PM - 1:06PM
N09.00006: Reservoir Computing: Structure analysis and dynamics predictability Rosangela Follmann, Cassie Mcginnis, Gangadhar Katuri, Epaminondas Rosa Reservoir computing offers accurate and fast prediction at low computational cost. In this work we investigate the effects of reservoir network topology structures on temporal predictability. We employ reservoir computing to predict the time evolution of neuronal activity produced by the Hindmarsh-Rose neuronal model. Our results show accurate short and long-term predictions for periodic neuronal behaviors, but only short-term accurate predictions for chaotic neuronal states. However, after the accuracy of the short-term predictability deteriorates in the chaotic regime, the predicted output continues to display similarities with the actual neuronal behavior. This is reinforced by a striking resemblance between the bifurcation diagrams of the actual and of the predicted outputs. Distinct network topologies are tested, and error analyses of the reservoir's performance are consistent with standard results previously obtained. Given the relevance of early detection of troubling neuronal activity, particularly in the case of individuals with neurological disorders, we envision devices capable of anticipating trends toward undesirable neuronal states early enough for effective preventive intervention.
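For reference, the Hindmarsh-Rose model that serves as the prediction target can be integrated directly. The parameter values below are the common textbook choices (external current I = 3.25 giving bursting); the talk's exact settings are not stated, so these are assumptions.

```python
import numpy as np

# Hindmarsh-Rose neuron with standard parameter values (an
# assumption). x is the membrane potential, y the fast and z the
# slow gating variable.
def hr(s, I=3.25):
    x, y, z = s
    return np.array([y - x**3 + 3 * x**2 - z + I,
                     1 - 5 * x**2 - y,
                     0.006 * (4 * (x + 1.6) - z)])

def rk4(f, s, dt):
    """One classical fourth-order Runge-Kutta step."""
    k1 = f(s); k2 = f(s + dt / 2 * k1)
    k3 = f(s + dt / 2 * k2); k4 = f(s + dt * k3)
    return s + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

dt, s = 0.02, np.array([-1.0, 0.0, 2.0])
xs = np.empty(50000)
for k in range(50000):            # 1000 time units of dynamics
    s = rk4(hr, s, dt)
    xs[k] = s[0]
```

In the chaotic regime, any one-step predictor trained on x(t) loses pointwise accuracy on the Lyapunov time scale, which is why the long-term comparison in the abstract is made at the level of bifurcation diagrams rather than trajectories.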
Wednesday, March 16, 2022 1:06PM - 1:18PM
N09.00007: Learning Parametric Dynamical Systems from Videos with Integer Programming Kazem Meidani, Amir Barati Farimani Identification of dynamical systems from measurements using data-driven frameworks enables us to model, predict, and control nonlinear systems. Existing explicit and implicit parsimonious models can discover the state equations of a single system provided relatively clean sensor measurements. These models, however, are usually limited to the observed system and depend heavily on high-quality data from expensive sensors. We propose a novel framework based on Integer Programming (IP) that can robustly identify the parametric forms of dynamical systems which generalize well to other parameter choices. Also, by processing the data using a sequential filtering scheme, the proposed model identifies mechanical dynamical systems from visual inputs by transforming pixel-space videos into state-space data. We show that full equations of systems like an inverted pendulum on a cart can be identified robustly in the presence of artificial noise or noisy object trajectories extracted from videos.
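The core selection problem, deciding which candidate library terms enter the governing equations, can be sketched with sequential thresholded least squares. This is a simple stand-in for the authors' integer-programming formulation, shown here recovering an assumed toy logistic equation rather than their pendulum-on-a-cart example.

```python
import numpy as np

# Toy system (an assumption): dx/dt = 1.5*x - 0.5*x^2. Simulate it
# with a small Euler step to generate data.
a_true, b_true, dt = 1.5, -0.5, 0.001
x = [0.1]
for _ in range(5000):
    x.append(x[-1] + dt * (a_true * x[-1] + b_true * x[-1] ** 2))
x = np.array(x)
dx = np.gradient(x, dt)           # numerical derivative estimates

# Candidate library of terms: [1, x, x^2, x^3].
Theta = np.column_stack([np.ones_like(x), x, x ** 2, x ** 3])
xi, *_ = np.linalg.lstsq(Theta, dx, rcond=None)

# Sequentially zero out small coefficients and refit the rest,
# selecting a sparse parametric form.
for _ in range(10):
    small = np.abs(xi) < 0.1
    xi[small] = 0.0
    big = ~small
    xi[big], *_ = np.linalg.lstsq(Theta[:, big], dx, rcond=None)
```

The integer-programming approach makes the same term-selection decision combinatorially (with explicit binary inclusion variables), which is what gives it robustness to the noisy pixel-derived trajectories mentioned in the abstract.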
Wednesday, March 16, 2022 1:18PM - 1:30PM
N09.00008: Bayesian Modelling of Phase-Field Crystal Models for Targeted Crystalline Patterns Natsuhiko Yoshinaga, Satoru Tokuda Partial differential equations (PDEs) have been widely used to reproduce patterns in nature and to give insight into the mechanisms underlying pattern formation. Although many PDE models have been proposed, they rely on prior knowledge of physical laws and symmetries, and developing a model to reproduce a given desired pattern remains difficult. We propose a novel method to estimate the best dynamical PDE from a single snapshot of a stationary target pattern, without ground truth. We apply our method to nontrivial patterns, such as quasi-crystals (QCs), a double gyroid, and Frank-Kasper structures. Our method works for noisy patterns and for patterns synthesised without ground-truth parameters, a prerequisite for application to experimental data.
Wednesday, March 16, 2022 1:30PM - 1:42PM
N09.00009: Learning and predicting complex systems dynamics from single-variable observations George Stepaniants, Alasdair Hastewell, Dominic J Skinner, Jan F Totz, Jorn Dunkel Advances in model inference and data-driven science have enabled the accurate discovery of governing equations from observations alone, accelerating our understanding and control of dynamical systems. However, despite the ever-growing amount of experimental data collected, many physical and biological systems can only be partially observed. Here, building on recent progress in the inference and integration of nonlinear differential equations, we introduce an approach to learn a model using observations of just a single variable within a multi-variable dynamical system, and use this model to accurately predict future dynamics. Furthermore, we validate our approach on a variety of physical, chemical and biological systems which exhibit nonlinear dynamics and chaos.
Wednesday, March 16, 2022 1:42PM - 1:54PM
N09.00010: The information bottleneck powered by deep learning to illuminate micro to macro relationships in complex systems Kieran A Murphy, Danielle S Bassett Deep learning is quickly becoming a ubiquitous tool for studying relationships between microscopic details and macroscopic behavior in complex systems. After a predictive model is successfully trained, however, the route to further science is often unclear as insight remains locked inside the metaphorical black box. Here we show how to leverage the information bottleneck (IB) to provide crucial interpretability and illuminate the rich interplay between small-scale details and large-scale behavior in complex systems. Grounded in information theory, IB formalizes the fundamental tradeoff between frugality of detail and fidelity of representation when relating one variable to another. By varying the relative importance of the terms in the IB tradeoff, the nature of a relationship between variables is laid bare as information is sorted by order of relevance. As a practical case study, we apply a variational formulation of IB to simulated glasses and find the most relevant combinations of markers of local structure for determining future rearrangement. We study the sharp transitions which can occur in the IB and cast their significance in terms of physical features. Finally, we probe beyond the comparative relevance of variables by examining how variables are compressed along the IB tradeoff.
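The IB tradeoff described in the abstract above can be made concrete on a toy discrete joint distribution (assumed data, not the glass simulations): the Lagrangian I(X;T) - beta*I(T;Y) is scored for a copying encoder versus a fully compressing one, and which encoder wins flips with beta.

```python
import numpy as np

def mi(pxy):
    """Mutual information (in nats) of a joint distribution table."""
    px = pxy.sum(1, keepdims=True)
    py = pxy.sum(0, keepdims=True)
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

# Toy joint distribution p(x, y) with correlated X and Y.
pxy = np.array([[0.4, 0.1],
                [0.1, 0.4]])
px = pxy.sum(1)

# Two candidate encoders p(t|x): copy X exactly, or compress to a
# constant (all information discarded).
enc_copy = np.eye(2)
enc_const = np.array([[1.0, 0.0], [1.0, 0.0]])

def ib_objective(enc, beta):
    """IB Lagrangian I(X;T) - beta * I(T;Y), to be minimized."""
    pxt = px[:, None] * enc       # joint p(x, t)
    pty = enc.T @ pxy             # joint p(t, y)
    return mi(pxt) - beta * mi(pty)

# Large beta rewards fidelity (T = X wins); small beta rewards
# frugality (the constant encoder wins).
large = (ib_objective(enc_copy, 5.0), ib_objective(enc_const, 5.0))
small = (ib_objective(enc_copy, 0.5), ib_objective(enc_const, 0.5))
```

Sweeping beta between these regimes, and watching which structural markers survive compression, is the interpretability mechanism the abstract exploits; the sharp transitions it studies are discontinuous changes in the optimal encoder as beta varies.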
Wednesday, March 16, 2022 1:54PM - 2:06PM
N09.00011: Universality in Prediction Markets Keanu M Rock In a prediction market (PM), investors buy and sell contracts tied to the outcome of a real-world event such as “Will Donald Trump be Re-elected President in 2020?”. Those who guess correctly receive a fixed payout, while everyone else receives nothing. Contract prices are driven by supply and demand as investors react to new information. As such, a contract’s price captures the crowd’s estimation of the event’s likelihood over time. A large body of research has focused on the accuracy of PMs in predicting final results, with little attention given to the dynamics that drive markets to (in)accurate predictions. Here, we analyze 2859 contracts from a popular online PM – PredictIt – covering events such as elections, bills, and politicians’ career milestones. We find striking universal statistical laws governing the distribution of price fluctuations, the size of trade volume over time, and the likelihood of different price levels over a contract’s lifetime. Moreover, we find that, up to rescaling, PM time series cluster naturally into only a handful of characteristic shapes. Our findings suggest that the complex human interactions driving PM dynamics can be embedded in a low-dimensional space of variables, opening the door to the mechanistic modelling of these social systems.
Wednesday, March 16, 2022 2:06PM - 2:18PM
N09.00012: Lotka-Volterra predator-prey lattice model with a time-dependent carrying capacity Mohamed Swailem, Uwe C Tauber Environmental variability is crucial to understanding species coexistence. Traditional population dynamics models seldom consider coupling environmental variability to the intrinsic noise of the ecological system. This study aims to investigate the effect of environmental change on the behavior of the two-species Lotka-Volterra lattice model for predator-prey competition and coexistence. It is well-established that a predator extinction phase transition occurs in lattice models with on-site restriction (i.e., finite carrying capacity), and it is absent when the carrying capacity is infinite. We model an environment with a varying nutrient abundance through a temporally changing carrying capacity, which is here implemented as a square-wave signal with a constant frequency. The output of the stochastic Monte Carlo simulation is used to study the density oscillations in this periodically driven system and investigate resonance phenomena. Additionally, we compute temporal correlations and study the effect of changing the carrying capacity on the predator extinction phase transition. Preliminary results show a period-doubling effect for specific predation rates, where the density oscillation frequency is twice the carrying capacity frequency.
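A deterministic, mean-field caricature of the driven system can be sketched as follows. The rate values and drive period are illustrative assumptions; the talk's actual results come from stochastic lattice Monte Carlo simulations, whose intrinsic noise and extinction transition this sketch cannot capture.

```python
import numpy as np

# Mean-field Lotka-Volterra with a finite, square-wave carrying
# capacity K(t). Rates are illustrative, not the talk's values.
sigma, mu, lam = 1.0, 0.5, 1.0   # prey growth, predator death, predation
K_lo, K_hi, period = 2.0, 6.0, 40.0

def K(t):
    """Square-wave carrying capacity with constant frequency."""
    return K_hi if (t % period) < period / 2 else K_lo

def rhs(t, s):
    a, b = s                      # predator, prey densities
    da = lam * a * b - mu * a
    db = sigma * b * (1 - b / K(t)) - lam * a * b
    return np.array([da, db])

# Forward-Euler integration of the periodically driven system.
dt, t, s = 0.001, 0.0, np.array([0.2, 0.5])
traj = np.empty((100000, 2))
for k in range(100000):           # 100 time units
    s = s + dt * rhs(t, s)
    t += dt
    traj[k] = s
```

Whether the density oscillations entrain at the drive frequency or at a subharmonic, as in the period-doubling effect reported for specific predation rates, depends on the rates and on the lattice fluctuations absent from this mean-field picture.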
Wednesday, March 16, 2022 2:18PM - 2:30PM
N09.00013: Cyclic predator-prey models with time varying rates Michel Pleimling, Hana Z Mir, James Stidham Cyclic predator-prey models have been shown to establish stable spiral waves that allow for long term species coexistence. In this study, we use the May-Leonard model to analyze a system in which the characteristics of species interactions, namely their predation and reproduction rates, change over time, in order to replicate seasons and other environmental factors that occur in real ecologies. We compare the effects of changing the reproduction rates versus the predation rates and find that changing the reproduction rates has a much stronger effect on species densities and spiral size. We vary both rates periodically, using square and sinusoidal waves, and randomly, such that at every time step there is a set probability for the rates to change between two values. We perform discrete Fourier analysis of the auto-correlation function to find the characteristic frequencies and hence the spiral size of the system during different regimes. We also analyze the average densities numerically in the different regimes. We extend our analysis to other multi-species games as well.