# Numerical Analysis Seminar: Elizabeth Newman, Emory University, How to Train Better: Exploiting the Separability of Deep Neural Networks

## April 18 | 3:00 pm - 4:00 pm EDT

In this talk, we will make DNN training easier by exploiting the separability of common DNN architectures; that is, the weights of the final layer enter the network linearly. We will leverage this linearity in two ways. First, we will approximate the stochastic optimization problem deterministically via a sample average approximation. In this setting, we can eliminate the linear weights through variable projection (i.e., partial optimization over the linear weights). Second, in the stochastic optimization setting, we will consider a powerful iterative sampling approach to update the linear weights, which notably incorporates automatic regularization parameter selection methods. Throughout the talk, we will demonstrate the efficacy of these two approaches through numerical examples.
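To illustrate the separability idea the abstract describes, here is a minimal sketch (not the speaker's implementation) of variable projection on a hypothetical one-hidden-layer network `f(x) = W tanh(Kx)`: for fixed hidden weights `K`, the optimal final-layer weights `W` solve a linear least-squares problem, so `W` can be eliminated and the outer optimization runs over `K` alone. All names and dimensions below are illustrative assumptions.

```python
import numpy as np

# Hypothetical toy model: f(x) = W @ tanh(K @ x). The final weights W enter
# linearly, so for any fixed K they are eliminated via linear least squares
# (variable projection / partial optimization).

rng = np.random.default_rng(0)
n_samples, n_in, n_hidden, n_out = 200, 5, 20, 3

X = rng.standard_normal((n_samples, n_in))
K_true = rng.standard_normal((n_hidden, n_in))
W_true = rng.standard_normal((n_out, n_hidden))
Y = np.tanh(X @ K_true.T) @ W_true.T  # targets generated by a "true" network

def projected_loss(K):
    """Return the loss with W eliminated: solve the inner least-squares
    problem for the optimal linear weights at this K."""
    Z = np.tanh(X @ K.T)                       # hidden features, (n_samples, n_hidden)
    W, *_ = np.linalg.lstsq(Z, Y, rcond=None)  # optimal W for this fixed K
    residual = Z @ W - Y
    return 0.5 * np.sum(residual**2), W.T

# The outer (nonlinear) optimization now involves only K; a single
# evaluation at the true K gives a near-zero projected loss.
loss_at_truth, W_opt = projected_loss(K_true)
print(f"projected loss at true K: {loss_at_truth:.2e}")
```

An outer optimizer (e.g., Gauss-Newton or gradient descent) would then minimize `projected_loss` over `K` only, recovering the optimal `W` for free at each step.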