Bulletin of the American Physical Society
APS March Meeting 2020
Volume 65, Number 1
Monday–Friday, March 2–6, 2020; Denver, Colorado
Session U68: Searching for Simplicity: Information Geometry and Model Reduction (Invited)
Sponsoring Unit: DBIO
Chair: James Sethna, Cornell University
Room: Four Seasons 4
Thursday, March 5, 2020 2:30PM - 3:06PM
U68.00001: Using Information Geometry to find simple models of complex processes
Invited Speaker: Mark Transtrum
Effective theories play a fundamental role in how we organize our knowledge about the world. Although reality is much more complicated than our models, we justify parsimonious representations by judiciously ignoring degrees of freedom that are irrelevant for the predictions of interest. Often, these models are related through a hierarchy of simplifying approximations that formally justify their respective domains of validity. I demonstrate how information geometry can be used to construct such effective theories for many complex systems, including systems beyond the reach of traditional methods. Embedded in the mathematical form of many model classes is a hierarchy of natural approximations, manifest as boundaries of high-dimensional manifolds. These approximations are not black boxes: they remain expressed in terms of the relevant combinations of mechanistic parameters and reflect the physical principles from which the complicated model was built. Furthermore, they can be identified in a data-driven way for models with thousands of parameters.
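A minimal sketch of the information-geometric starting point described above, using a hypothetical two-exponential toy model (a classic sloppy example; the model, parameter values, and finite-difference Jacobian are illustrative assumptions, not taken from the talk). The Fisher information metric is built from the Jacobian of model predictions with respect to parameters; its smallest eigenvector points along the manifold's thin, nearly irrelevant direction that boundary-based approximations remove.

```python
import numpy as np

# Hypothetical two-exponential toy model, a classic "sloppy" example.
# The model, parameter values, and finite-difference step are illustrative
# assumptions, not taken from the talk.
def model(theta, t):
    a1, a2 = np.exp(theta)          # log-parameters keep the rates positive
    return np.exp(-a1 * t) + np.exp(-a2 * t)

def jacobian(theta, t, eps=1e-6):
    # Central finite-difference Jacobian of predictions w.r.t. log-parameters
    J = np.empty((len(t), len(theta)))
    for i in range(len(theta)):
        tp = np.array(theta, dtype=float); tp[i] += eps
        tm = np.array(theta, dtype=float); tm[i] -= eps
        J[:, i] = (model(tp, t) - model(tm, t)) / (2 * eps)
    return J

t = np.linspace(0, 5, 50)
theta = np.log([1.0, 1.2])          # two nearly degenerate decay rates
J = jacobian(theta, t)
fim = J.T @ J                       # Fisher information (least-squares metric)
evals, evecs = np.linalg.eigh(fim)  # eigenvalues in ascending order
print("FIM eigenvalue ratio:", evals[-1] / evals[0])
print("sloppiest parameter combination:", evecs[:, 0])
```

The widely separated eigenvalues signal a sloppy direction; following that direction to the manifold boundary is what yields an interpretable reduced model.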
Thursday, March 5, 2020 3:06PM - 3:42PM
U68.00002: Finding and explaining structural hierarchies in models of complex systems
Invited Speaker: Katherine Quinn
Sloppy models form a universality class of complex, nonlinear models in which outcomes are significantly affected by only a small subset of parameter combinations, arising in disparate fields from systems biology to accelerator physics. By unifying information-geometric interpretations of sloppiness with Chebyshev approximation theory, I will derive a formal and systematic explanation of why sloppiness occurs. I will then extend this framework to general probabilistic models and data, to derive a widely applicable manifold-learning method called InPCA that ameliorates a canonical problem in machine learning: the "curse of dimensionality".
Thursday, March 5, 2020 3:42PM - 4:18PM
U68.00003: Invited Talk
Invited Speaker: Lucy Colwell
Thursday, March 5, 2020 4:18PM - 4:54PM
U68.00004: Identifiability, uncertainty, and parameter reduction in mathematical biology
Invited Speaker: Marisa Eisenberg
The interactions between parameters, model structure, and outputs can determine what inferences, predictions, and control strategies are possible for a given system. Identifiability, estimability, and parameter space reduction methods are thus essential for many questions in mathematical modeling and uncertainty quantification. These approaches can help to determine what inferences and predictions are possible from a given model and data set, and help guide control strategies and new data collection. In this talk, I will discuss some of the ideas and methods from identifiability, how they link to ideas of model reduction and model selection, and present public health applications to recent epidemics of polio and cholera. We will illustrate how reparameterization and alternative data collection may help resolve various types of unidentifiability and allow for successful intervention predictions.
Thursday, March 5, 2020 4:54PM - 5:30PM
U68.00005: Manifold Learning for Parameter Reduction
Invited Speaker: Yannis Kevrekidis
Large-scale dynamical systems (e.g. many nonlinear coupled differential equations) can often be summarized in terms of only a few state variables (a few equations), a trait that reduces complexity and facilitates exploration of behavioral aspects of otherwise intractable models. High model dimensionality and complexity makes symbolic, pen-and-paper model reduction tedious and impractical, a difficulty addressed by recently developed frameworks that computerize reduction. As the interest in mapping out and optimizing complex input-output relations keeps growing, it becomes clear that combating the curse of dimensionality also requires efficient schemes for input space exploration and reduction. Here, we explore systematic, data-driven parameter reduction by means of effective parameter identification, starting from current nonlinear manifold-learning techniques enabling state space reduction. Our approach aspires to extend the data-driven determination of effective state variables with the data-driven discovery of effective model parameters, and thus to accelerate the exploration of high-dimensional parameter spaces associated with complex models. We also discuss the data-driven exploration of conjugacies between different models/observations.
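A minimal data-driven sketch in the spirit of this abstract, with every detail assumed for illustration: model outputs that depend on two parameters (a, b) only through the combination a*b are embedded with a diffusion-map-style kernel; the leading nontrivial eigenvector of the resulting Markov matrix then serves as a single discovered effective parameter.

```python
import numpy as np

# Assumed toy setup: outputs depend on (a, b) only through the product a*b,
# so a diffusion-map-style embedding of the outputs should discover a single
# effective parameter.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 30)
params = rng.uniform(0.5, 2.0, size=(200, 2))               # sampled (a, b)
ab = params[:, 0] * params[:, 1]                            # hidden effective parameter
outputs = np.exp(-np.outer(ab, t))                          # model responses

d2 = np.sum((outputs[:, None, :] - outputs[None, :, :]) ** 2, axis=-1)
K = np.exp(-d2 / np.median(d2))                             # Gaussian kernel
P = K / K.sum(axis=1, keepdims=True)                        # Markov normalization
evals, evecs = np.linalg.eig(P)
order = np.argsort(-evals.real)
phi1 = evecs[:, order[1]].real          # first nontrivial diffusion coordinate
print("corr(phi1, a*b):", np.corrcoef(phi1, ab)[0, 1])
```

The first nontrivial diffusion coordinate varies monotonically along the one-dimensional output manifold, so it tracks a*b without that combination ever being supplied to the algorithm, which is the essence of data-driven effective-parameter discovery.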