
Applied Mathematics Graduate Student Association Seminar: William Anderson, "Fast and Scalable Computation of Reduced-Order Nonlinear Solutions for PDEs"; Abhijit Chowdhary, "Sensitivity Analysis of the Information Gain in Infinite-Dimensional Bayesian Linear Inverse Problems"

February 20, 2023 | 3:00 pm - 4:00 pm EST

– Presenter: William Anderson
– Title: Fast and Scalable Computation of Reduced-Order Nonlinear Solutions for PDEs
– Abstract: We develop a method for fast and scalable computation of reduced-order nonlinear solutions (RONS). RONS is a framework for building reduced-order models of time-dependent partial differential equations (PDEs) in which the reduced-order solution depends nonlinearly on time-varying parameters. With RONS we obtain an explicit set of ordinary differential equations (ODEs) to evolve the parameters. These ODEs minimize the instantaneous error between the dynamics of the governing PDE and the dynamics of the reduced-order solution (a schematic form of the RONS equations is sketched after the abstracts). Additionally, conserved quantities of the PDE are easily enforced in the reduced solution using the RONS framework. For a reduced-order model with n parameters, naive calculation of the ODEs produced by RONS requires evaluating on the order of n^2 integrals. By exploiting the structure of the RONS equations and using symbolic computation, we reduce the computational cost to order K^2, where K << n. With this approach we apply RONS to problems that require many parameters in the reduced solution, including simulation of vortex dynamics in turbulent fluid flow and of the Fokker-Planck equation in high dimensions.
– Presenter: Abhijit Chowdhary
– Title: Sensitivity Analysis of the Information Gain in Infinite-Dimensional Bayesian Linear Inverse Problems
– Abstract: We consider sensitivity analysis of Bayesian linear inverse problems with respect to modeling uncertainties. To this end, we study the sensitivity of the information gain, as measured by the Kullback-Leibler divergence from the posterior to the prior. This choice provides a principled approach that leverages key structures within the Bayesian inverse problem, and the information gain admits a closed-form expression in the case of linear Gaussian inverse problems (the standard finite-dimensional form is recalled after the abstracts). The derivatives of the information gain, however, are extremely challenging to compute. To address this challenge, we present accurate and efficient methods that combine eigenvalue sensitivities with hyper-differential sensitivity analysis and take advantage of adjoint-based gradient and Hessian computation. The result is a computational approach whose cost, in number of PDE solves, does not grow under mesh refinement. We demonstrate these results on an application-driven model problem: a simplified earthquake model in which fault slip is inferred from surface measurements.
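A schematic form of the RONS evolution equations referenced in the first abstract, written here in illustrative notation (the precise form used in the talk may differ): for a reduced-order solution \hat{u}(x; \theta(t)) with parameters \theta = (\theta_1, \dots, \theta_n) approximating a PDE \partial u / \partial t = F(u), the parameters evolve by

\[ M(\theta)\,\dot{\theta} = f(\theta), \qquad M_{ij}(\theta) = \Big\langle \tfrac{\partial \hat{u}}{\partial \theta_i}, \tfrac{\partial \hat{u}}{\partial \theta_j} \Big\rangle, \qquad f_i(\theta) = \Big\langle \tfrac{\partial \hat{u}}{\partial \theta_i}, F(\hat{u}) \Big\rangle, \]

which minimizes the instantaneous residual \| \partial \hat{u} / \partial t - F(\hat{u}) \| over all choices of \dot{\theta}. Each entry of M and f is an inner-product integral, which is the source of the order n^2 integrals in a naive implementation.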
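For reference, in the finite-dimensional linear Gaussian setting the information gain mentioned in the second abstract reduces to the standard closed-form Kullback-Leibler divergence between two Gaussians on R^k (a general formula, not specific to the infinite-dimensional setting of the talk):

\[ D_{\mathrm{KL}}\big(\mathcal{N}(\mu_{\mathrm{post}}, \Sigma_{\mathrm{post}}) \,\|\, \mathcal{N}(\mu_{\mathrm{pr}}, \Sigma_{\mathrm{pr}})\big) = \tfrac{1}{2}\Big[ \operatorname{tr}\big(\Sigma_{\mathrm{pr}}^{-1}\Sigma_{\mathrm{post}}\big) + (\mu_{\mathrm{pr}} - \mu_{\mathrm{post}})^{\top}\Sigma_{\mathrm{pr}}^{-1}(\mu_{\mathrm{pr}} - \mu_{\mathrm{post}}) - k + \ln\frac{\det \Sigma_{\mathrm{pr}}}{\det \Sigma_{\mathrm{post}}}\Big]. \]

For a linear forward operator with Gaussian prior and noise, \mu_{\mathrm{post}} and \Sigma_{\mathrm{post}} are themselves available in closed form, which is what makes the information gain, and its sensitivities, amenable to explicit analysis.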

Details

Date:
February 20, 2023
Time:
3:00 pm - 4:00 pm EST

Venue

SAS 4201