Numerical Analysis Seminar: Elizabeth Newman, Emory University, How to Train Better: Exploiting the Separability of Deep Neural Networks
SAS 4201

Deep neural networks (DNNs) have gained undeniable success as high-dimensional function approximators in countless applications. However, there is a significant hidden cost behind these triumphs - the cost of training. Typically, DNN training is posed as a stochastic optimization problem with respect to the learnable DNN weights. With millions of weights, a non-convex and non-smooth objective…
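As a minimal illustration of the setup described in the abstract - training posed as a stochastic optimization problem over the learnable weights - the sketch below fits a one-hidden-layer network with plain minibatch SGD on synthetic data. This is an assumption-laden toy example using only NumPy, not the speaker's method; all variable names and hyperparameters are illustrative.

```python
import numpy as np

# Toy sketch (not the speaker's method): DNN training as stochastic
# optimization over learnable weights theta = (W1, b1, W2, b2).
rng = np.random.default_rng(0)

# Synthetic regression data: y = sin(x) on [-2, 2].
X = rng.uniform(-2, 2, size=(256, 1))
y = np.sin(X)

# Learnable weights of a one-hidden-layer tanh network (illustrative sizes).
W1 = rng.standard_normal((1, 16)) * 0.5
b1 = np.zeros(16)
W2 = rng.standard_normal((16, 1)) * 0.5
b2 = np.zeros(1)

def forward(Xb):
    H = np.tanh(Xb @ W1 + b1)           # hidden features
    return H, H @ W2 + b2               # prediction

def mse(pred, yb):
    return float(np.mean((pred - yb) ** 2))

lr, losses = 0.1, []
for epoch in range(200):
    idx = rng.permutation(len(X))
    for start in range(0, len(X), 32):  # one stochastic step per minibatch
        b = idx[start:start + 32]
        Xb, yb = X[b], y[b]
        H, pred = forward(Xb)
        r = 2 * (pred - yb) / len(Xb)   # dLoss/dpred for mean-squared error
        # Backpropagate gradients with respect to each weight block.
        gW2 = H.T @ r
        gb2 = r.sum(axis=0)
        gH = (r @ W2.T) * (1 - H ** 2)  # tanh'(z) = 1 - tanh(z)^2
        gW1 = Xb.T @ gH
        gb1 = gH.sum(axis=0)
        # SGD update on all weights jointly.
        W2 -= lr * gW2; b2 -= lr * gb2
        W1 -= lr * gW1; b1 -= lr * gb1
    losses.append(mse(forward(X)[1], y))

print("initial loss:", losses[0])
print("final loss:", losses[-1])
```

Note that this toy loop updates all weights jointly; the separability exploited in the talk refers to structure in the weights (e.g., the final linear layer entering the objective linearly) that joint SGD ignores.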