77th Annual Meeting of the Division of Fluid Dynamics
Sunday–Tuesday, November 24–26, 2024
Salt Lake City, Utah
Session X01: Minisymposium: Bayesian Inference for Synthesis of Models and Data in Fluid Mechanics
8:00 AM–10:36 AM, Tuesday, November 26, 2024
Room: Ballroom A
Abstract: X01.00002 : The Elephant in the Room: Adjoint-accelerated Bayesian Inference into multi-parameter CFD
8:26 AM–8:52 AM
Presenter: Matthew P Juniper (Univ of Cambridge)
Author: Matthew P Juniper (Univ of Cambridge)
John von Neumann is often quoted as saying "with four parameters I can fit an elephant, and with five I can make him wiggle his trunk." The implication is that physical models should contain only a handful of parameters. A century later, we seem happy to use physics-agnostic neural networks containing millions of parameters. What would von Neumann say? How should physical modellers respond? I will show that von Neumann's quote is more nuanced than it sounds.

I will then frame a response within a probabilistic framework in which priors and data are both treated as sources of information. Some prior information, such as conservation of mass and momentum, is hard-wired into models and cannot be violated. Other prior information, such as the value of viscosity or the position of a boundary, can be soft-wired into model parameters as a prior probability distribution. Data then becomes a source of information for each candidate model. The information content of the data can be quantified for each model, and the likelihoods of different candidate models can be compared after the data arrives.

Crucially, adjoint methods can be used to accelerate Bayesian inference such that it becomes computationally tractable for many important multi-parameter fluids problems. I will demonstrate this through assimilation of 3D Flow-MRI data directly into 3D CFD (i) to segment and reconstruct the flow in an in vitro aorta, (ii) to infer a non-Newtonian viscosity model from one MR image, and (iii) to infer candidate turbulence models from data. This probabilistic framework shows the power of keeping physics in a model if you can, because such models cannot violate the imposed physics, can output their uncertainty, and have far fewer parameters than neural networks, and so require much less training data.
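The workflow sketched above (a Gaussian prior on a handful of model parameters, a data misfit whose gradient is supplied cheaply, gradient-based search for the most probable parameters, and a Laplace estimate of their uncertainty) can be illustrated with a toy example. This is not the speaker's code: the model here is a hypothetical linear one so the gradient can be written analytically, whereas in PDE-constrained CFD that gradient would be delivered by an adjoint solve at the cost of roughly one extra forward solve.

```python
import numpy as np

# Toy sketch of adjoint-accelerated Bayesian inference (assumed setup, for
# illustration only): linear model y = A @ theta + noise, Gaussian prior
# on theta, MAP estimation by gradient descent, Laplace uncertainty.

rng = np.random.default_rng(0)

n_params, n_data = 4, 50              # a handful of parameters, as in the talk
A = rng.normal(size=(n_data, n_params))
theta_true = np.array([1.0, -0.5, 2.0, 0.3])
sigma_d, sigma_p = 0.1, 1.0           # data-noise and prior standard deviations
y = A @ theta_true + sigma_d * rng.normal(size=n_data)

def neg_log_posterior(theta):
    """Negative log posterior: data misfit plus prior penalty (up to a constant)."""
    misfit = A @ theta - y
    return 0.5 * (misfit @ misfit) / sigma_d**2 + 0.5 * (theta @ theta) / sigma_p**2

def gradient(theta):
    # In a PDE-constrained setting, the first term is what an adjoint solve
    # provides; here the linear model makes it explicit.
    return (A.T @ (A @ theta - y)) / sigma_d**2 + theta / sigma_p**2

# Gauss-Newton Hessian of the negative log posterior (constant for a linear model).
H = (A.T @ A) / sigma_d**2 + np.eye(n_params) / sigma_p**2

# Gradient descent to the MAP point, with a step size safe for this Hessian.
lr = 1.0 / np.linalg.eigvalsh(H).max()
theta = np.zeros(n_params)
for _ in range(5000):
    theta -= lr * gradient(theta)

# Laplace approximation: posterior covariance is the inverse Hessian at the MAP.
cov = np.linalg.inv(H)

print("MAP estimate :", np.round(theta, 2))
print("posterior std:", np.round(np.sqrt(np.diag(cov)), 3))
```

The uncertainty output is the point of the exercise: because the model has few parameters and encodes the (here, linear) physics exactly, fifty noisy observations pin down all four parameters with quantified error bars, which is the behaviour the abstract contrasts with physics-agnostic networks of millions of parameters.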