Bulletin of the American Physical Society
APS March Meeting 2023
Volume 68, Number 3
Las Vegas, Nevada (March 5-10)
Virtual (March 20-22); Time Zone: Pacific Time
Session A01: Predicting Nonlinear and Complex Systems with Machine Learning |
Sponsoring Units: GSNP DSOFT
Chair: Yuhai Tu, IBM T. J. Watson Research Center
Room: Room 124 |
Monday, March 6, 2023 8:00AM - 8:12AM |
A01.00001: Learning dynamics of complex systems from partial observations George Stepaniants, Alasdair Hastewell, Dominic J Skinner, Jan F Totz, Jorn Dunkel Complex multi-component systems, from cells and tissues to biochemical reactors, often exhibit oscillatory and chaotic nonlinear dynamics that are essential to their signaling properties and functions. Despite the rapid advancement of sensor and imaging technology, many physical and biological systems can only be partially observed, leaving practitioners in need of model-fitting tools that account for this missing information. Here we develop an automated inference method that discovers predictive differential equation models from a few noisy partial observations of a system's state. We illustrate our method on a combination of simulated and experimental data from a variety of physical, chemical, and biological systems, showing that in many cases noisy partial observations are sufficient to infer predictive multivariate dynamical systems. |
Monday, March 6, 2023 8:12AM - 8:24AM |
A01.00002: Maximizing dynamical systems information embedded in experimental observables of molecules through statistical learning enabled Takens reconstruction Maximilian T Topel, Andrew L Ferguson Single-molecule Förster Resonance Energy Transfer (smFRET) allows experimentalists to track changes in molecular conformations over time by tagging the molecule with two or more fluorescent probes whose emissivities correspond to inter-probe distances. While such experiments are restricted to scalar time series, the choice of dye placement allows the recorded intramolecular distance observable to be tuned. Takens' Delay Embedding Theorem guarantees, under some minor technical conditions, that there exists a diffeomorphism (a smooth bijection with bijective Jacobian) between a manifold embedding the full-dimensional dynamics of a system and a time-delayed embedding of scalar observables of that system. The theorem is silent on the question of observable choice, which raises the question of which dye placements maximally embed the system dynamics. We combine Takens' Theorem, manifold learning, nonlinear mapping techniques, and statistical mechanics to obtain mappings from experimental observation space to reconstructions of atomic coordinates using molecular dynamics training data. In previous work, we reconstructed the mini-proteins Chignolin and Villin to accuracies better than 0.4 nm from synthetic noisy and time-averaged trajectories generated from molecular dynamics simulations, using univariate head-to-tail molecular distances. In this work, we expand this Single-molecule TAkens Reconstruction (STAR) technique to use synthetic multivariate data streams, corresponding to a three-dye multichannel smFRET system, generated from molecular dynamics simulations of the mini-protein Villin, to learn the dye placement that maximizes the information embedded in experiments.
This work not only shows how dye placement can optimize reconstructed trajectories via STAR but in principle also suggests correct dye placement to enhance performance of standard smFRET analysis protocols such as Hidden Markov Models for identifying conformational states. |
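For readers unfamiliar with the delay embedding at the heart of STAR, the reconstruction step licensed by Takens' theorem can be sketched in a few lines of NumPy; the sinusoidal observable, embedding dimension, and delay below are illustrative choices, not the authors' pipeline:

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Stack time-shifted copies of a scalar series x into delay vectors.

    Returns an array of shape (len(x) - (dim - 1) * tau, dim) whose rows
    are [x(t), x(t - tau), ..., x(t - (dim - 1) * tau)].
    """
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[(dim - 1 - k) * tau : (dim - 1 - k) * tau + n]
                            for k in range(dim)])

# Scalar observable of a planar oscillation (a stand-in for one
# inter-probe distance trace): only one coordinate is "measured".
t = np.linspace(0, 20 * np.pi, 2000)
x = np.sin(t)

emb = delay_embed(x, dim=2, tau=25)  # two delays suffice for a planar cycle
print(emb.shape)                     # (1975, 2)
```

In practice the delay and embedding dimension are usually chosen with standard heuristics such as the first minimum of the delayed mutual information and the false-nearest-neighbors test.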
Monday, March 6, 2023 8:24AM - 8:36AM |
A01.00003: Fractional neural networks for constitutive modeling of complex fluids Donya Dabiri, Milad Saadat, Deepak Mangal, Safa Jamali Constitutive modeling of complex fluids' rheological behavior has always been of great interest to academics and industrial researchers alike, as these complex fluids find applications in many diverse fields. In general, the goal is to describe the complex and nonlinear responses of the fluid to an applied deformation/stress with a limited number of model parameters. Fractional derivatives have been found very effective at representing such complex behavior in a compact mathematical format. These fractional derivatives, replacing the classical spring and dash-pot elements in a concise representation, can be used to alleviate the complexity of viscoelastic models and reduce the parameter count, thus preventing the unnecessary complications often encountered in rheological modeling. However, an automated platform for guiding experimental rheologists in determining the exact exponents of the fractional derivatives is lacking. Here, we will utilize rheology-informed neural networks (RhINNs) to quantify the quasi-properties and the fractional derivative orders. We will also show that the proposed fractional RhINN platform can accurately capture the material responses to transient flow protocols over the entire accessible input space, i.e., shear rate and time. |
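As background, the fractional derivative underlying such spring-pot elements can be approximated with the Grünwald–Letnikov construction; this is a generic numerical sketch following the textbook definition, not the RhINN implementation:

```python
import math

def gl_fracdiff(f, t, alpha, h=1e-3):
    """Grünwald–Letnikov approximation of the order-alpha derivative of f at t.

    Uses the recursion w_j = (1 - (alpha + 1) / j) * w_{j-1} for the signed
    binomial weights (-1)^j * C(alpha, j)."""
    n = int(round(t / h))
    w, acc = 1.0, f(t)
    for j in range(1, n + 1):
        w *= 1.0 - (alpha + 1.0) / j
        acc += w * f(t - j * h)
    return acc / h**alpha

f = lambda s: s                     # test function f(t) = t
print(gl_fracdiff(f, 1.0, 1.0))     # ordinary first derivative: ≈ 1.0
print(gl_fracdiff(f, 1.0, 0.5))     # half-derivative of t at t=1: ≈ 1/Γ(1.5) ≈ 1.128
```

Integer orders reduce to familiar finite differences, which makes a quick sanity check on any implementation; fractional orders interpolate smoothly between them, which is what lets one element replace chains of springs and dash-pots.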
Monday, March 6, 2023 8:36AM - 8:48AM |
A01.00004: Inferring force law in many-particle systems using physics-tailored machine learning Wentao Yu, Justin C Burton, Ilya M Nemenman, Eslam Abdelaleem Machine learning (ML) has the potential to revolutionize science by uncovering new physical laws governing complex systems. In recent years, ML has been successfully used to infer parsimonious equations describing simulated complex systems where the underlying physics is known. However, very few new laws of physics have been inferred from experimental data. Here we demonstrate an ML model capable of extracting physical forces from systems of many particles with complex interactions. The model is trained on experimental data: micron-sized particles immersed in weakly ionized plasma (dusty plasma, DP). A vast array of collective, emergent behaviors has been reported in DP, but the particle charge, interaction law, and plasma properties are difficult to measure in situ. Using laser-sheet tomography, we track the 3D positions of 10-20 particles in various plasma environments over minutes. The symmetries that govern our dusty plasma system are built into the ML model, and we use neural networks to represent the particle interaction and environmental forces. The model explains 99% of the variance when predicting the acceleration from experimental particle trajectories, and successfully identifies non-reciprocal interactions between particles, the particle mass, and the particle charge. We also demonstrate how the model fails in certain regimes where N-body interactions govern the force between particles. Finally, we expect the model can be easily tailored to describe other many-body systems where high-quality experimental data are available. |
Monday, March 6, 2023 8:48AM - 9:00AM |
A01.00005: Dynamics of long-term memory in recurrent neural networks Ling-Wei Kong, Junjie Jiang, Ying-Cheng Lai Recent developments in artificial neural networks have opened many possibilities for developing long-term memory devices. The dynamics of the memory retrieval process remain poorly understood, with open issues such as how different memory states compete and how a desired memory state can be recalled. We study memory devices based on reservoir computing, a general class of recurrent neural networks, under two distinct settings: with or without an explicit index/address channel, corresponding to the "location-addressable" and "context-addressable" scenarios, respectively. We demonstrate that, in the location-addressable scenario, a single reservoir computer can restore more than a dozen sophisticated memory states, such as chaotic attractors, which are sustained and can be successfully recalled. The dynamics of the memory are studied with a focus on the transition success rates when switching among different memory states, and control strategies to enhance the success rates are articulated. For the context-addressable setting without an index channel, we exploit multistability to recall memory states with the aid of cue signals. These memory states can be coexisting asymptotic attractors or transient states. A surprising transition phenomenon in the retrieval success rate emerges as the length of the cue signal varies. The dynamical behaviors associated with memory retrieval uncovered in this work provide foundational insights into developing long-term memory devices based on artificial neural networks. |
Monday, March 6, 2023 9:00AM - 9:12AM |
A01.00006: Searching for clog formation in hopper flow through comparative machine learning analyses Jesse M Hanlan, Sam J Dillavou, Douglas J Durian From salt in saltshakers to sheep in pens, the flow of particles through outlets is ubiquitous. In granular hoppers, gravity causes grains to emerge from the outlet at a constant rate until, suddenly, a stable arch or dome forms and arrests any further flow. The formation of such clogs is a Poisson process (Thomas and Durian, PRL 2015); intuitively, the discharge randomly brings new flow microstates into the outlet region until one arises that causes a clog. These clog-causing flow microstates seem to be primarily configurational, rather than momentum-based (Koivisto and Durian, PRE 2017), but their exact nature remains to be elucidated. The high-dimensional nature of the configuration states foils manual attempts to categorize them as flowing or clog-causing, so we instead employ machine learning techniques. We collect a large number of free-flowing, clog-causing, and totally clogged configurations using a quasi-2D automated hopper and a high-speed camera. We then compare a variety of machine learning algorithms to probe for a signature of incipient clog formation. |
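The Poisson-process picture of clogging has a directly testable consequence: with a fixed clog probability per discharged grain, avalanche sizes follow a memoryless geometric (discrete exponential) distribution. A toy simulation, with an invented per-grain clog probability:

```python
import random

def discharge_until_clog(p_clog, rng):
    """Number of grains discharged before a clog, with a fixed per-grain clog
    probability: the memoryless (Poisson-process) picture of clog formation."""
    n = 0
    while rng.random() > p_clog:
        n += 1
    return n

rng = random.Random(0)
p = 0.01                              # invented per-grain clog probability
sizes = [discharge_until_clog(p, rng) for _ in range(5000)]
mean = sum(sizes) / len(sizes)
print(mean)                           # ≈ (1 - p) / p = 99 for a geometric law
```

The same memorylessness is what makes the experimental signature hard to find: the discharge carries no running memory, so any predictor of an incipient clog must live in the instantaneous configuration, not the flow history.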
Monday, March 6, 2023 9:12AM - 9:24AM |
A01.00007: Variational Computation of the Committor for Reactive Events In and Out of Equilibrium Aditya N Singh, David T Limmer The identification of the committor, a function that encodes the exact reaction coordinate of rare events, is a quintessential many-body problem in chemical and statistical physics. Of particular interest are out-of-equilibrium reactive processes, where the evaluation of the committor becomes intractable for the majority of current methods, which rely on detailed balance. We present a method that employs a neural network ansatz with a variational optimization scheme to compute the exact committor function from a reactive trajectory ensemble. This function is related to the solution of the backward Kolmogorov equation and offers an optimal control policy for generating new uncorrelated reactive trajectories. Furthermore, we illustrate how this approach provides a novel way to decompose the rate and quantify the contributions to it from different degrees of freedom. We apply these methods to a variety of complex systems in and out of equilibrium, including systems of active Brownian particles and the isomerization of a peptide in solvent. Our results provide mechanistic insight into reactive events and address fundamental questions in kinetics regarding the choice of order parameters in molecular systems and the coupling of the rate to external driving. |
Monday, March 6, 2023 9:24AM - 9:36AM |
A01.00008: Predicting Microfluidic Droplet Diameters Using Machine Learning Serena Holte We have successfully generated a graphical user interface that predicts microfluidic droplet diameters from an artificial intelligence (AI) neural network. The neural network inputs are fluid properties and the geometries of 3D glass capillary devices. For single emulsions, the mean-squared error at the end of 100 epochs converged to 3.99% for training and 2.49% for validation. The deep machine learning model provides an alternative method of predicting droplet size without the need for rigorous theory. Moreover, the model can be altered to predict other microfluidic parameters or properties and could be extended to other fluids as well. |
Monday, March 6, 2023 9:36AM - 9:48AM |
A01.00009: Machine Learning for Metamaterial Design Ryan van Mastrigt, Marjolein Dijkstra, Martin van Hecke, Corentin Coulais There is no standard method to design mechanical metamaterials. Instead, one is limited by the designer's intuition and the tuning of a few predefined design parameters. Here we show that machine learning overcomes this limitation by learning the relation between design and mechanical property, which we use to inverse-design for desired mechanical properties. This allows us to consider and efficiently explore much larger design spaces, opening up new possibilities for the complex design of (meta)materials. |
Monday, March 6, 2023 9:48AM - 10:00AM |
A01.00010: Decomposing Long-Time Behavior of Dynamical Systems through Linear Regression Sam Quinn, Joshua L. Pughe-Sanford, Roman O Grigoriev Short trajectories of a dynamical system are a bridge between the governing equation and long-time behavior of the system. Trajectories are inherently one-dimensional objects. Despite this, the neighborhood of a trajectory extends to the full dimension of the system's state space. We use simple linear regression to determine the contribution of the individual trajectory neighborhoods to the long-time state-space probability distribution of the system. Constructing this probability distribution directly allows prediction of long-time averages of observable quantities. Better yet, the regression matrix and regression weights tell us which combinations of trajectories are relevant to the long-time behavior. |
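The regression idea can be illustrated on a toy problem: treat a few trajectory-neighborhood histograms as basis functions and solve ordinary least squares for their weights in the long-time distribution (all shapes and numbers below are invented for illustration, not the authors' data):

```python
import numpy as np

# Toy 1D state space: a long-time histogram target and the histograms of a
# few trajectory neighborhoods (Gaussian bumps at hypothetical locations).
bins = np.linspace(-3, 3, 61)
centers = 0.5 * (bins[1:] + bins[:-1])

def bump(mu, sig=0.5):
    h = np.exp(-0.5 * ((centers - mu) / sig) ** 2)
    return h / h.sum()

basis = np.column_stack([bump(mu) for mu in (-1.5, 0.0, 1.5)])
true_w = np.array([0.2, 0.5, 0.3])      # invented neighborhood contributions
target = basis @ true_w                 # stand-in for the long-time distribution

# Least squares recovers each neighborhood's weight; long-time averages of an
# observable then follow as the weight-sum of per-neighborhood averages.
w, *_ = np.linalg.lstsq(basis, target, rcond=None)
print(np.round(w, 3))   # ≈ [0.2 0.5 0.3]
```

The recovered weights are what carry the physical content: near-zero weights flag trajectories whose neighborhoods contribute little to the long-time measure.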
Monday, March 6, 2023 10:00AM - 10:12AM |
A01.00011: Learning hydrodynamic equations from microscopic Langevin simulations of self-propelled particle dynamics Bappaditya Roy, Natsuhiko Yoshinaga In nonequilibrium systems, the collective movement of microscopic active particles often displays several common emergent properties, such as swarming, motility-induced phase separation, disorder-order transitions, anomalous density fluctuations, spatiotemporal patterning, and unusual rheological properties. However, these universal aspects of collective behavior are hardly captured by microscopic particle-based simulation methods alone; the macroscopic properties obtained from nonlinear hydrodynamic equations are useful for understanding them. We therefore start from numerical Langevin simulations of the microscopic particle dynamics and present a data-driven strategy for developing hydrodynamic equations for a collection of self-propelled particles. In our method, microscopic particle data are the input: hydrodynamic fields are obtained by coarse-graining the discrete particle dynamics. For partial differential equation (PDE) learning, a spectral representation gives efficient and accurate computation of spatial and temporal derivatives of the density and polarization density fields. Using sparse regression on these fields, we generate hydrodynamic equations. The PDEs estimated from microscopic models are beneficial for understanding the universal features of the system, in comparison to standard supervised learning, since the macroscopic features are shared by both the microscopic models and the hydrodynamic equations.
|
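The sparse-regression step is in the spirit of sequentially thresholded least squares, as used in SINDy-style equation discovery; a minimal sketch on a toy polynomial library rather than the coarse-grained density and polarization fields:

```python
import numpy as np

def stlsq(theta, dxdt, lam=0.1, iters=10):
    """Sequentially thresholded least squares: fit dxdt ≈ theta @ xi, then
    repeatedly zero out small coefficients and refit on the survivors."""
    xi = np.linalg.lstsq(theta, dxdt, rcond=None)[0]
    for _ in range(iters):
        small = np.abs(xi) < lam
        xi[small] = 0.0
        big = ~small
        if big.any():
            xi[big] = np.linalg.lstsq(theta[:, big], dxdt, rcond=None)[0]
    return xi

# Toy data: dx/dt = -2x + 0.5x^3 evaluated on a grid, with a candidate
# library {1, x, x^2, x^3} playing the role of field derivatives and products.
x = np.linspace(-1, 1, 200)
dxdt = -2.0 * x + 0.5 * x**3
theta = np.column_stack([np.ones_like(x), x, x**2, x**3])

xi = stlsq(theta, dxdt, lam=0.1)
print(np.round(xi, 3))   # ≈ [ 0. -2.  0.  0.5]: only the true terms survive
```

The thresholding is what enforces parsimony: without it, ordinary least squares would spread small spurious coefficients across every library term.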
Monday, March 6, 2023 10:12AM - 10:24AM |
A01.00012: Unraveling the role of Hydrogen bonds via two machine learning methods Freddie R Salsbury, Dizhou Wu Hydrogen bonds are essential for the creation and stability of protein structures because of their strong directional nature, short distance ranges, and abundance in folded proteins. H-bonds between atoms can maintain a protein's secondary structure and overall 3D structure. Protein structural changes are associated with the creation and destruction of hydrogen bonds, so studying the hydrogen-bonding network can help us better understand the allosteric pathways of a protein. In this research, we used two machine learning models, a logistic regression model and a decision tree model, to study hydrogen-bonding networks. We used these two models to study the H-bonds of four thrombin variants: WT, ΔK9, E8K, and R4A. We discovered that each model has unique benefits. The logistic regression model assesses the overall significance of each hydrogen bond, whereas the decision tree is better at detecting the hydrogen-bonding motifs of each system. |
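The logistic-regression half of such a comparison can be sketched on synthetic data: binary per-frame H-bond indicators with one bond made informative by construction, and |weight| serving as a bond-importance score. Everything below is invented for illustration, not the thrombin dataset:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for an H-bond dataset: each row is a simulation frame,
# each column the presence (1) or absence (0) of one hydrogen bond. The label
# (e.g., which structural state a frame belongs to) depends only on bond 0.
n, n_bonds = 2000, 5
X = rng.integers(0, 2, size=(n, n_bonds)).astype(float)
y = (rng.random(n) < np.where(X[:, 0] == 1, 0.9, 0.1)).astype(float)

# Plain gradient-descent logistic regression; the magnitude of each learned
# weight ranks the overall significance of the corresponding bond.
w = np.zeros(n_bonds)
b = 0.0
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    w -= 0.1 * X.T @ (p - y) / n
    b -= 0.1 * np.mean(p - y)

print(np.argmax(np.abs(w)))   # index of the dominant bond (bond 0 by construction)
```

A decision tree fit to the same matrix would instead split on combinations of bonds, which is why the abstract finds it better suited to detecting bonding motifs than to ranking individual bonds.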
Monday, March 6, 2023 10:24AM - 10:36AM |
A01.00013: Statistical properties of empirical cross-covariance matrices of correlated large-dimensional datasets Arabind Swain, Eslam Abdelaleem, Ilya M Nemenman We study empirical cross-covariance matrices (ECCMs) between two large-dimensional variables that are correlated along a handful of latent dimensions. By analogy with recent work on empirical covariance matrices of data with latent linear structure, we define a generative model for such cross-correlations and then use Random Matrix Theory (RMT) to calculate the probability density of singular values of the ECCM as a function of the number of samples, the signal-to-noise ratio along shared and non-shared dimensions, and the ratio of shared to non-shared latent features. In various limits of this parameter space, we obtain the sought density function analytically and numerically. This opens up the possibility of identifying shared latent features in experimental datasets from the spectra of their ECCMs. |
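The setup can be mimicked numerically: draw two high-dimensional variables that share a few latent features, form the empirical cross-covariance, and inspect its singular values. All dimensions and noise levels below are arbitrary choices, not the paper's parameters:

```python
import numpy as np

rng = np.random.default_rng(1)
T, nx, ny, k = 5000, 50, 40, 2    # samples, dims of X and Y, shared latent dims

# Assumed generative model mirroring the abstract's setup: both variables
# load linearly on k shared latent features, plus independent noise.
z = rng.standard_normal((T, k))
X = z @ rng.standard_normal((k, nx)) + rng.standard_normal((T, nx))
Y = z @ rng.standard_normal((k, ny)) + rng.standard_normal((T, ny))

# Empirical cross-covariance matrix and its singular values: the top k
# values separate from the noise bulk, signaling shared latent structure.
C = (X - X.mean(0)).T @ (Y - Y.mean(0)) / T
s = np.linalg.svd(C, compute_uv=False)
print(s[:4])
```

The RMT calculation in the abstract is what turns this visual separation into a quantitative test: it predicts the bulk of the singular-value density under the null, so outliers can be attributed to shared features.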
Monday, March 6, 2023 10:36AM - 10:48AM |
A01.00014: Automated neuron tracking using deep learning and targeted augmentation allows fast collection of C. elegans whole-brain calcium activity during behavior Core Francisco Park, Sahand Rahi, Aravinthan Samuel, Mahsa Barzegar Keshteli, Kseniia Korchagina, Ariane Delrocq, Vladislav Susoy, Corinne Jones With advances in optical imaging and fluorescent proteins, it is now possible to record calcium activity from the whole brain of the roundworm C. elegans during freely moving behavior. However, tracking the position and shape of each neuron is a major analysis bottleneck limiting throughput. The data present numerous challenges: the animal moves, rotates, and deforms rapidly; the limited frame rate causes motion blur; and manually annotating 3D images is very difficult. While convolutional neural networks (CNNs) are highly effective for image analysis, they generally require a large training set. |
About APS: The American Physical Society (APS) is a non-profit membership organization working to advance the knowledge of physics.
© 2024 American Physical Society
| All rights reserved