Computational and Applied Mathematics Seminar: Gregory Ongie, Marquette University, A function space view of infinite-width neural networks
Zoom

It is well-known that nearly any function can be approximated arbitrarily well by a neural network with non-linear activations. However, one cannot guarantee that the weights of the network remain bounded in norm as the approximation error goes to zero, which is an important consideration when training neural networks in practice. This raises the question: What…
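The tension between approximation error and weight norm can be illustrated with a minimal NumPy sketch (not from the talk; an assumed toy example): a two-neuron ReLU network relu(w*x) - relu(w*x - 1) approximates the step function better as w grows, but its weight norm grows without bound.

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def step_approx(x, w):
    # Two-neuron ReLU network: 0 for x <= 0, 1 for x >= 1/w,
    # and linear in between; the slope (weight) is w.
    return relu(w * x) - relu(w * x - 1.0)

x = np.linspace(-1.0, 1.0, 4001)
target = (x > 0).astype(float)

for w in (10.0, 100.0, 1000.0):
    err = np.mean(np.abs(step_approx(x, w) - target))
    print(f"weight norm w = {w:7.1f}   mean abs error = {err:.5f}")
```

As w increases the approximation error shrinks, but only because the weight w (and hence the network's norm) blows up, which is exactly the phenomenon the abstract raises.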