Events

Numerical Analysis Seminar: Nathan Kutz, University of Washington, The future of governing equations

SAS 4201

A major challenge in the study of dynamical systems is model discovery: turning data into reduced-order models that are not only predictive but also provide insight into the nature of the underlying dynamical system that generated the data. We introduce a number of data-driven strategies for discovering nonlinear multiscale dynamical systems and their…
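
One widely used strategy of this kind (sketched here only as background; the talk may present different or more general methods) poses model discovery as sparse regression over a library of candidate terms,

\[ \dot{x}(t) \approx \Theta\big(x(t)\big)\,\Xi , \]

where the columns of \Theta(x) are candidate nonlinear functions of the measured state (e.g., polynomials) and \Xi is a sparse coefficient matrix fit to the data; the few surviving nonzero entries indicate which terms actually govern the dynamics.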

Numerical Analysis: Weiqi Chu, UCLA, Non-Markovian opinion models inspired by random walks

SAS 4201

In social networks, nodes encode social entities, such as people or Twitter accounts, while edges encode relationships or events between entities. Opinion-dynamics models describe opinion evolution as dynamical processes on social networks. Traditional models of opinion dynamics consider how opinions evolve either on time-independent networks or on temporal networks with edges that follow Poisson statistics.…
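
A classical example of such a model on a time-independent network (included here only for orientation) is the DeGroot update, in which each node repeatedly averages its neighbors' opinions,

\[ x_i(t+1) = \sum_{j} w_{ij}\, x_j(t), \qquad w_{ij} \ge 0, \quad \sum_{j} w_{ij} = 1, \]

where x_i(t) is the opinion of node i and w_{ij} encodes the influence of node j on node i through the network's edges.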

Numerical Analysis Seminar: Shiying Li, UNC-Chapel Hill, Transport transforms for machine learning applications

SAS 4201

Data or patterns (e.g., signals and images) emanating from physical sensors often exhibit complicated nonlinear structures in high-dimensional spaces, which pose challenges in constructing effective models and interpretable machine learning algorithms. When data are generated through deformations of certain templates, transport transforms often linearize data clusters that are nonlinear in the original domain. We…
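
In one dimension, for instance, the transport map between a reference density r and a signal density s can be written in terms of their cumulative distribution functions as

\[ T = F_s^{-1} \circ F_r , \]

and transport transforms embed s through the displacement T(x) - x, up to a normalization that varies between transforms; this is only a rough sketch of the idea.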

Numerical Analysis Seminar: Boaz Nadler, Weizmann Institute, Completing Large Low-Rank Matrices with Only Few Observed Entries: A One-Line Algorithm with Provable Guarantees

SAS 4201

Suppose we observe very few entries from a large matrix. Can we predict the missing entries, say, assuming the matrix is (approximately) low rank? We describe a very simple method to solve this matrix completion problem. We show our method is able to recover matrices from very few entries and/or ill-conditioned matrices,…
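
As a point of reference, here is a minimal sketch of one simple estimator of this type (an illustration under assumptions, not necessarily the algorithm presented in the talk): fill unobserved entries with zeros, rescale by the inverse sampling rate, and keep a rank-r truncated SVD.

    import numpy as np

    def spectral_complete(X_obs, mask, rank):
        """X_obs: matrix with unobserved entries set to zero; mask: 1.0 where observed."""
        p = mask.mean()                                  # estimated fraction of observed entries
        U, s, Vt = np.linalg.svd(X_obs / p, full_matrices=False)
        return (U[:, :rank] * s[:rank]) @ Vt[:rank, :]   # best rank-r approximation of the rescaled matrix

    # Toy usage: a 100 x 80 rank-2 matrix with roughly 30% of entries observed.
    rng = np.random.default_rng(0)
    M = rng.standard_normal((100, 2)) @ rng.standard_normal((2, 80))
    mask = (rng.random(M.shape) < 0.3).astype(float)
    M_hat = spectral_complete(M * mask, mask, rank=2)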

Numerical Analysis Seminar: Yuehaw Khoo, University of Chicago, New approaches in simulation of transition paths

Zoom

Tensor methods can be used to compress high-dimensional functions arising from partial differential equations (PDEs). In this talk, we focus on using these methods for the simulation of transition processes between metastable states in chemistry applications, for example in molecular dynamics. To this end, we also propose a novel generative modeling procedure using tensor networks without…
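
For background, a widely used compressed representation in this setting (assumed here as an illustration; the talk's construction may differ) is the tensor-train format, which writes a discretized high-dimensional function as a chain of small cores,

\[ f(x_1,\dots,x_d) \approx \sum_{\alpha_1,\dots,\alpha_{d-1}} G_1^{\alpha_1}(x_1)\, G_2^{\alpha_1\alpha_2}(x_2) \cdots G_d^{\alpha_{d-1}}(x_d), \]

so that storage grows linearly in the dimension d for fixed ranks rather than exponentially.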

Numerical Analysis Seminar: Deep Ray, The University of Maryland, VarMiON: A variationally mimetic operator network

SAS 4201

Operator networks have emerged as promising deep learning tools for approximating the solution to partial differential equations (PDEs). These networks map input functions that describe material properties, forcing functions and boundary data to the solution of a PDE, i.e., they learn the solution operator of the PDE. In this talk, we consider a new type…
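
Schematically, such a network approximates the solution operator G: a \mapsto u in a separable form,

\[ u(x) \;=\; G(a)(x) \;\approx\; \sum_{k=1}^{p} b_k(a)\, \tau_k(x), \]

where the coefficients b_k are computed from the sampled input data a (material properties, forcing, boundary data) and the basis functions \tau_k depend on the evaluation point x. This is a generic operator-network sketch, not the specific VarMiON architecture discussed in the talk.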

Numerical Analysis Seminar: Li Wang, University of Minnesota, Neural network based solvers for kinetic equations

SAS 4201

Deep learning has emerged as a competitive mesh-free method for solving partial differential equations (PDEs). The idea is to represent solutions of PDEs by neural networks in order to take advantage of the rich expressiveness of neural network representations. In this talk, we will explore the applicability of this powerful framework to the kinetic equation, which…
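
A common instance of this framework (written generically here, not specialized to the kinetic setting of the talk) represents the unknown solution by a network u_\theta and minimizes the PDE residual together with boundary or data misfit terms,

\[ \min_{\theta} \; \sum_{i} \big| \mathcal{N}[u_\theta](x_i) \big|^2 \;+\; \sum_{j} \big| u_\theta(x_j) - g(x_j) \big|^2, \]

where \mathcal{N} denotes the differential operator of the equation and g the prescribed boundary or initial data, both evaluated at sampled collocation points.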

Numerical Analysis Seminar: Pejman Sanaei, Georgia State University, On mathematical modeling and simulation of flight stability of objects, tissue engineering and droplets

SAS 4201

In this talk, I will present three problems on fluid-structure interaction: 1) Flight stability of wedges: Recent experiments have shown that cones of intermediate apex angles display orientational stability with the apex leading in flight. Here we show in experiments and simulations that analogous results hold in the two-dimensional context of solid wedges or triangular prisms in planar…

Numerical Analysis Seminar: Koffi Enakoutsa and Xinghao Dong, UCLA, The Morrey Conjecture: Insights from Numerical Simulations on Quasi-Convexity and Rank-One Convexity

Zoom

The Morrey Conjecture concerns quasi-convexity and rank-one convexity of functions. While the former implies the latter, it is unclear whether the converse is true. Sverak proved the conjecture in 3D, but it remains unresolved in the planar case. Analyzing these properties analytically is difficult, especially for vector-valued functions; hence, we perform numerical simulations using example functions…
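
For orientation, the two notions being compared are the following: a function f on m-by-n matrices is quasi-convex if for every matrix A, every bounded domain \Omega, and every smooth, compactly supported \varphi,

\[ \int_{\Omega} f\big(A + \nabla\varphi(x)\big)\,dx \;\ge\; |\Omega|\, f(A), \]

and rank-one convex if t \mapsto f(A + t\, a\otimes b) is convex for every rank-one direction a\otimes b. Quasi-convexity implies rank-one convexity; the conjecture concerns the converse.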

Numerical Analysis Seminar: Elizabeth Newman, Emory University, How to Train Better: Exploiting the Separability of Deep Neural Networks

SAS 4201

Deep neural networks (DNNs) have gained undeniable success as high-dimensional function approximators in countless applications. However, there is a significant hidden cost behind these triumphs: the cost of training. Typically, DNN training is posed as a stochastic optimization problem with respect to the learnable DNN weights. With millions of weights, a non-convex and non-smooth objective…
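
In this notation, the training problem takes the form

\[ \min_{\theta} \; \mathbb{E}_{(x,y)}\Big[ \mathcal{L}\big(F(x;\theta),\, y\big) \Big], \]

where \theta collects the learnable weights, F is the network, and \mathcal{L} the loss; the separability in the title refers to exploiting structure among the weights (for example, weights that enter the model linearly) when solving this problem.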

Numerical Analysis Seminar: Elizabeth Newman, Emory University, Diving Deep Learning I

SAS 4201

Deep learning is one of the most universal techniques in the modern big data era, achieving remarkable success across imaging, healthcare, natural language processing, and more.  As applications begin to rely more heavily on deep learning, it is crucial that we understand how these algorithms make predictions and how we can make them better (e.g.,…

Numerical Analysis Seminar: Elizabeth Newman, Emory University, Diving Deep Learning II

SAS 4201

Deep learning is one of the most universal techniques in the modern big data era, achieving remarkable success across imaging, healthcare, natural language processing, and more.  As applications begin to rely more heavily on deep learning, it is crucial that we understand how these algorithms make predictions and how we can make them better (e.g.,…

Computational and Applied Mathematics Seminar: Shira Faigenbaum-Golovin, Duke University, Reconstruction, denoising, and studying the geometry of the base manifold in high-dimensional space

SAS 4201

It is common to assume that the data was sampled from a low-dimensional manifold in a high-dimensional space. In real life, neither the dimension of this manifold nor its geometry is known, and the data is often contaminated with noise and outliers. In this talk, we first present a method for denoising and reconstructing a…

Computational and Applied Mathematics Seminar: Dimitris Giannakis, Dartmouth, Quantum Information Science for Modeling Classical Dynamics

Zoom

Over the past three decades, a fruitful approach for analysis and data-driven modeling of dynamical systems has been to consider the action of (nonlinear) dynamics in state space on linear spaces of observables. These methods leverage the linearity of the associated evolution operators, namely the Koopman and transfer operators, to carry out tasks such as…
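
Concretely, for a flow x \mapsto \Phi^t(x) on state space, the Koopman operator acts linearly on observables g by composition with the dynamics,

\[ (U^t g)(x) = g\big(\Phi^t(x)\big), \]

while the transfer (Perron-Frobenius) operator is its adjoint, acting on densities; it is the linearity of these operators, even for nonlinear dynamics, that such methods exploit.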

Computational and Applied Mathematics Seminar: Tibor Illés, Corvinus University, Budapest, Hungary, Sufficient linear complementarity problems: pivot versus interior point algorithms

SAS 4201

Linear complementarity problems (LCPs) generalize some fundamental problems of mathematical optimization, such as the linear programming (LP) problem, the linearly constrained quadratic programming (LQP) problem, and others. They admit an enormous number of applications in economics, engineering, science, and many other fields. Given all this, it is not surprising that LCPs are usually NP-complete problems (S.J. Chung,…
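
For reference, the standard LCP asks, given a matrix M \in \mathbb{R}^{n\times n} and a vector q \in \mathbb{R}^n, for a pair (w, z) satisfying

\[ w = Mz + q, \qquad w \ge 0, \quad z \ge 0, \quad z^{\top} w = 0; \]

the Karush-Kuhn-Tucker conditions of an LP or of a linearly constrained QP have exactly this form, which is the sense in which LCPs generalize those problems.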

Computational and Applied Mathematics Seminar: Ke Chen, Maryland, Towards efficient deep operator learning for forward and inverse PDEs: theory and algorithms

Zoom

Deep neural networks (DNNs) have been a successful model across diverse machine learning tasks, increasingly capturing interest for their potential in engineering problems where PDEs have long been the dominant model. This talk delves into efficient training for PDE operator learning in both the forward and inverse problem settings. Firstly, we address the curse…

Computational and Applied Mathematics Seminar: Gregory Ongie, Marquette University, A function space view of infinite-width neural networks

Zoom

It is well known that nearly any function can be approximated arbitrarily well by a neural network with non-linear activations. However, one cannot guarantee that the weights in the neural network remain bounded in norm as the approximation error goes to zero, which is an important consideration when practically training neural networks. This raises the question: What…
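
The setting can be made concrete with a shallow network

\[ f(x) = \sum_{i=1}^{m} a_i\, \sigma\big(\langle w_i, x\rangle + b_i\big), \]

where universal approximation controls the error as the width m grows but says nothing about whether a weight norm such as \sum_i |a_i|\,\|w_i\| stays bounded; the function-space view asks which functions can be approximated while such a norm is kept under control. (The specific norm studied in the talk may differ; this is only an illustrative choice.)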

Computational and Applied Mathematics Seminar: Antoine Blanchard, Verisk, A Multi-Scale Deep Learning Framework for Projecting Weather Extremes

Zoom

Extreme weather events are of growing concern for societies because, under climate change, their frequency and intensity are expected to increase significantly. Unfortunately, general circulation models (GCMs), currently the primary tool for climate projections, cannot characterize weather extremes accurately. Here, we report on advances in the application of a multi-scale deep learning framework, trained on reanalysis data,…

Computational and Applied Mathematics Seminar: Gabriel P. Langlois, Courant Institute, An exact and efficient algorithm for the Lasso regression problem based on a Hamilton-Jacobi PDE formulation

Zoom

The Basis Pursuit Denoising problem, also known as the least absolute shrinkage and selection operator (Lasso) problem, is a cornerstone of compressive sensing, statistics and machine learning. In high-dimensional problems, recovering an exact sparse solution requires robust and efficient optimization algorithms. State-of-the-art algorithms for the Basis Pursuit Denoising problem, however, were not traditionally designed to…
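
In its standard form, the problem is

\[ \min_{x \in \mathbb{R}^n} \; \tfrac{1}{2}\,\|Ax - b\|_2^2 \;+\; \lambda\,\|x\|_1, \]

where A is the sensing or design matrix, b the observations, and \lambda > 0 trades data fidelity against the sparsity-promoting \ell_1 penalty.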