Events

Nonlinear Analysis Seminar and Differential Equation Seminar: Giuseppe Buttazzo, University of Pisa, Italy, Antagonistic cost functionals in shape optimization

Zoom

In several shape optimization problems one has to deal with cost functionals of the form ${\cal F}(\Omega)=F(\Omega)+kG(\Omega)$, where $F$ and $G$ are two shape functionals with different monotonicity behavior and $\Omega$ varies in the class of domains with prescribed measure. In particular, the cost functional ${\cal F}(\Omega)$ is not monotone with respect to $\Omega$…
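
In symbols, a generic formulation of such a problem (with $m$ denoting the prescribed measure; the specific functionals studied in the talk may differ) is

$$\min\big\{\,F(\Omega)+k\,G(\Omega)\ :\ \Omega\subset\mathbb{R}^d,\ |\Omega|=m\,\big\},$$

where, for instance, one of the two functionals may decrease and the other increase when $\Omega$ is enlarged, so that neither term by itself drives the optimal shape.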

Computational and Applied Mathematics Seminar:Tibor Illés, Corvinus University, Budapest, Hungary, Sufficient linear complementarity problems: pivot versus interior point algorithms

SAS 4201

Linear complementarity problems (LCPs) generalize some fundamental problems of mathematical optimization, such as the linear programming (LP) problem, the linearly constrained quadratic programming (LQP) problem, and others. They admit an enormous number of applications in economics, engineering, science, and many other fields. Given all this, it is not surprising that LCPs are usually NP-complete problems (S.J. Chung,…
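
For context, the standard LCP formulation (stated here as background, not as part of the talk) asks, for given $M\in\mathbb{R}^{n\times n}$ and $q\in\mathbb{R}^n$, to find vectors $w,z\in\mathbb{R}^n$ such that

$$w = Mz + q,\qquad w\ge 0,\qquad z\ge 0,\qquad z^{\top}w = 0.$$

Writing the Karush-Kuhn-Tucker conditions of an LP or a convex QP in this form is what makes the LCP a common generalization of both.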

Applied Math Graduate Student Seminar: Abhijit Chowdhary, NC State, Scalable Sensitivity Analysis and Optimal Design for Bayesian Inverse Problems

SAS 4201

Inverse problems are an expanding field with many practical applications in scientific computing and engineering. Their Bayesian formulation encodes prior knowledge and data uncertainties into a posterior distribution, an important tool in uncertainty quantification. However, performing uncertainty quantification tasks on top of this posterior is difficult to formulate and often computationally intractable. Hence, for…
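
Schematically, and assuming for illustration additive Gaussian noise with covariance $\Gamma_{\mathrm{noise}}$ (the talk's setting may be more general), Bayes' rule combines a prior on the parameter $m$ with the data likelihood:

$$\pi_{\mathrm{post}}(m\mid d)\ \propto\ \exp\!\Big(-\tfrac12\big\|\mathcal{G}(m)-d\big\|^{2}_{\Gamma_{\mathrm{noise}}^{-1}}\Big)\,\pi_{\mathrm{prior}}(m),$$

where $\mathcal{G}$ denotes the parameter-to-observable map. Tasks posed on top of this posterior typically require many evaluations of $\mathcal{G}$, which is the source of the computational cost mentioned above.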

Colloquium: Moody Chu, NC State, Optimal Hamiltonian Synthesis for Quantum Computing

SAS 4201

Simulating the time evolution of a Hamiltonian system on a classical computer is hard—the computational power required to even describe a quantum system scales exponentially with the number of its constituents, let alone integrating its equations of motion. Hamiltonian simulation on a quantum machine is a possible solution to this challenge. Assuming that a quantum…
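
Concretely, in the standard quantum-mechanics setup (units with $\hbar=1$), the state of $n$ constituents such as qubits is a vector in $\mathbb{C}^{2^n}$, and the time evolution generated by a Hamiltonian $H$ is

$$i\,\frac{d}{dt}\,|\psi(t)\rangle = H\,|\psi(t)\rangle, \qquad |\psi(t)\rangle = e^{-iHt}\,|\psi(0)\rangle,$$

so even storing $|\psi(t)\rangle$ classically requires memory exponential in $n$, which is the obstruction the abstract refers to.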

Biomathematics Seminar: Yutong Sha, University of California Irvine, Reconstructing transition dynamics from static single-cell genomic data

Cox 306

Recently, single-cell transcriptomics has provided a powerful approach to investigating cellular properties at unprecedented resolution. However, given a small number of temporal snapshots of single-cell transcriptomics, how to connect them to obtain their collective dynamical information remains an unexplored area. One major challenge in connecting temporal snapshots is that cells measured at one temporal point…

Algebra and Combinatorics Seminar: Joel Brewster Lewis, George Washington University, Bargain hunting in a Coxeter group

SAS 4201

Petersen and Tenner defined the depth statistic for Coxeter group elements which, in the symmetric group, can be described in terms of a cost-minimization problem over the factorizations of a permutation into transpositions. We generalize that cost function to the other classical (finite and affine) Weyl groups, letting the cost of an individual reflection t…
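
As background, one way to state the symmetric-group case (taking the cost of a transposition $(i\,j)$ with $i<j$ to be $j-i$, which is, as we understand it, the convention in Petersen and Tenner's definition) is

$$\mathrm{dp}(w)\;=\;\min\Big\{\sum_{\ell=1}^{k}\mathrm{cost}(t_\ell)\ :\ w=t_1t_2\cdots t_k,\ \text{each }t_\ell\text{ a transposition}\Big\},\qquad \mathrm{cost}\big((i\,j)\big)=j-i.$$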

Teaching and Learning Seminar: Kylan Schatz, Luke Castle, Tiancheng Xue, Lightning Talks

SAS 4201

Once per semester, the Teaching and Learning Seminar will host 10-minute "lightning talks" in which graduate students and/or faculty members will talk about some element of their teaching that they'd like to share. This may be a cool example you came up with for a class you are teaching, an innovative teaching technique or application…

Computational and Applied Mathematics Seminar: Ke Chen, Maryland, Towards efficient deep operator learning for forward and inverse PDEs: theory and algorithms

Zoom

Deep neural networks (DNNs) have been a successful model across diverse machine learning tasks and are increasingly capturing interest for their potential in engineering problems where PDEs have long been the dominant model. This talk delves into efficient training for PDE operator learning in both the forward and the inverse problem settings. Firstly, we address the curse…
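
As a minimal, generic sketch of what "operator learning" means here (not the speaker's specific method): a network is trained to map a discretized input function, such as a PDE coefficient, to a discretized solution, from paired samples. The data below is synthetic noise purely to keep the snippet self-contained and runnable.

import torch
import torch.nn as nn

n_grid = 64                        # number of spatial grid points
n_samples = 256                    # number of (coefficient, solution) pairs

# Hypothetical synthetic data standing in for actual PDE solves.
a = torch.randn(n_samples, n_grid)          # input functions, discretized
u = torch.randn(n_samples, n_grid)          # corresponding solutions

model = nn.Sequential(                      # simple MLP surrogate operator
    nn.Linear(n_grid, 128), nn.Tanh(),
    nn.Linear(128, 128), nn.Tanh(),
    nn.Linear(128, n_grid),
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

for epoch in range(200):                    # plain full-batch training loop
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(a), u)
    loss.backward()
    opt.step()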

Geometry and Topology Seminar: Kai-Wei Zhao, University of Notre Dame, On the blowup of regularized solutions to the Jang equation and constant expansion surfaces

SAS 1216

Schoen-Yau proved the spacetime positive energy theorem by reducing it to the time-symmetric (Riemannian) case using the Jang equation. To obtain solutions to the Jang equation, they introduced a family of regularized equations and took the limit of regularized solutions; however, a sequence of regularized solutions could blow up in some bounded regions enclosed by apparent horizons. They analyzed the blowup behavior near and outside the apparent horizons, but what happens inside…
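
For reference, the Jang equation for initial data $(g,k)$ is commonly written (conventions may differ from those used in the talk) as

$$\sum_{i,j}\left(g^{ij}-\frac{\nabla^i f\,\nabla^j f}{1+|\nabla f|^2}\right)\left(\frac{\nabla_i\nabla_j f}{\sqrt{1+|\nabla f|^2}}-k_{ij}\right)=0,$$

where $f$ is the unknown function whose graph is used in the reduction; the regularized family mentioned in the abstract perturbs this equation by a small parameter.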

Applied Math Graduate Student Seminar: Harley Hanes, NC State, Boundary Quantification and Optimal Sample Identification in Reduced-Order Models

SAS 4201

Reduced-order models (ROMs) are a critical tool for sensitivity analysis, parameter inference, and uncertainty quantification when high-fidelity models would be computationally intractable. Galerkin POD-ROMs are one class of ROMs that project the high-fidelity model equations onto a set of model solutions, retaining the original model parameters and physics and enabling accurate sensitivity analysis, parameter inference,…
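
A minimal POD-Galerkin sketch, assuming a generic linear full-order model $A u = f$ (an illustration only, not the construction used in the talk):

import numpy as np

n, n_snap, r = 200, 30, 5                 # full dimension, snapshots, ROM size

rng = np.random.default_rng(0)
A = np.eye(n) + 0.01 * rng.standard_normal((n, n))   # full-order operator
snapshots = rng.standard_normal((n, n_snap))          # columns: sampled solutions

# POD basis: leading left singular vectors of the snapshot matrix.
U, _, _ = np.linalg.svd(snapshots, full_matrices=False)
V = U[:, :r]                                          # n x r reduced basis

# Galerkin projection: reduced operator and right-hand side.
A_r = V.T @ A @ V                                     # r x r
f = rng.standard_normal(n)
f_r = V.T @ f

u_r = np.linalg.solve(A_r, f_r)                       # solve the ROM
u_approx = V @ u_r                                    # lift back to full space

Because the reduced system is only r x r, it can be solved repeatedly for sensitivity or inference tasks at a small fraction of the full-order cost.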

Algebra and Combinatorics Seminar: Kyle Celano, Wake Forest University, Chromatic Symmetric Functions and RSK for (3 + 1)-free Posets

SAS 4201

In 1995, Stanley introduced the chromatic symmetric function of a graph, a symmetric function analog of the classical chromatic polynomial of a graph. The Stanley-Stembridge e-positivity conjecture is a long-standing conjecture that states that the chromatic symmetric function of a certain class of graphs, called incomparability graphs of (3+1)-free posets, has nonnegative coefficients when expanded…
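
For background, Stanley's definition is as follows: the chromatic symmetric function of a graph $G=(V,E)$ is

$$X_G(x_1,x_2,\dots)\;=\;\sum_{\kappa}\ \prod_{v\in V} x_{\kappa(v)},$$

where the sum runs over all proper colorings $\kappa\colon V\to\{1,2,3,\dots\}$; specializing $x_1=\cdots=x_n=1$ and $x_i=0$ for $i>n$ recovers the chromatic polynomial $\chi_G(n)$.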

Nonlinear Analysis Seminar and Differential Equation Seminar: Leon Bungert, University of Würzburg, Adversarial robustness in machine learning: from worst-case to probabilistic

Zoom

In this talk I will first review recent results which characterize adversarial training (AT) of binary classifiers as nonlocal perimeter regularization. Then I will speak about a probabilistic generalization of AT which also admits such a geometric interpretation, albeit with a different nonlocal perimeter. Using suitable relaxations one can prove the existence of solutions for…
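
As a point of reference, the standard worst-case formulation of adversarial training (the probabilistic variant in the talk modifies the inner supremum) trains a classifier $u$ with adversarial budget $\varepsilon$ by solving

$$\min_{u}\ \mathbb{E}_{(x,y)\sim\mu}\Big[\ \sup_{\|\tilde{x}-x\|\le \varepsilon} \ell\big(u(\tilde{x}),\,y\big)\Big],$$

where $\mu$ is the data distribution and $\ell$ a classification loss.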

Teaching and Learning Seminar: Maria Meehan, University College Dublin, Video recordings to complement, or substitute for, the first-year mathematics lecture: One lecturer’s journey

SAS 4201

As part of a professional development project aimed at engaging in the Discipline of Noticing as conceptualised by John Mason, some colleagues and I wrote and shared brief-but-vivid accounts of our practice. Evident in these accounts is the catalyst for the subsequent change in my practice of introducing short, pre-recorded videos to complement, or substitute…

Computational and Applied Mathematics Seminar: Gregory Ongie, Marquette University, A function space view of infinite-width neural networks

Zoom

It is well known that nearly any function can be approximated arbitrarily well by a neural network with non-linear activations. However, one cannot guarantee that the weights in the neural network remain bounded in norm as the approximation error goes to zero, which is an important consideration when training neural networks in practice. This raises the question: What…
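
To make the norm-boundedness question concrete, consider a generic two-layer network of width $m$ together with a path-type weight norm (the precise norm studied in the talk may differ):

$$f_\theta(x)\;=\;\sum_{k=1}^{m} a_k\,\sigma\!\big(w_k^{\top}x+b_k\big), \qquad \|\theta\|\;=\;\sum_{k=1}^{m}|a_k|\,\|w_k\|_{2};$$

the question is then which functions can be approximated while such a norm stays bounded as the width $m$ grows.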

Biomathematics Seminar: TBA

Cox 306

All BMA seminars have a virtual option with the following Zoom Link: https://ncsu.zoom.us/j/93046132033?pwd=dkZiTjlKazgzK2Q3aXJra1g2R1Q0dz09 Meeting ID: 930 4613 2033 Passcode: 075251