
Approximation Rates and Metric Entropy of Shallow Neural Networks

Jonathan Siegel
Pennsylvania State University


We consider the problem of approximating high-dimensional functions by shallow neural networks and, more generally, by sparse linear combinations of elements of a dictionary. We begin by introducing natural spaces of functions that can be efficiently approximated in this way. We then derive the metric entropy of the unit balls in these spaces, which allows us to calculate optimal rates of approximation by shallow neural networks. This gives a precise measure of how large this class of functions is and how well shallow neural networks overcome the curse of dimensionality. Finally, we describe an algorithm that can be used to solve high-dimensional PDEs using this space of functions.
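The abstract's notion of sparse approximation from a dictionary can be illustrated with a small sketch. The code below is not from the talk; it is a hypothetical example that greedily (matching-pursuit style) selects ReLU ridge functions from a random dictionary and refits the coefficients by least squares, approximating a sample target function in one dimension.

```python
import numpy as np

# Illustrative sketch (not from the talk): approximate a 1-D target
# function by a sparse linear combination of ReLU dictionary elements
# sigma(w*x + b), selected greedily from a random dictionary.

rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 200)
target = np.sin(np.pi * x)          # hypothetical target function

# Dictionary: ReLU ridge functions with random weights and biases,
# normalized so correlations with the residual are comparable.
W = rng.uniform(-3.0, 3.0, size=500)
B = rng.uniform(-3.0, 3.0, size=500)
D = np.maximum(W[:, None] * x[None, :] + B[:, None], 0.0)  # (500, 200)
D /= np.linalg.norm(D, axis=1, keepdims=True) + 1e-12

# Greedy selection: at each step pick the dictionary element most
# correlated with the residual, then refit all coefficients.
residual = target.copy()
chosen = []
approx = np.zeros_like(x)
for _ in range(20):
    idx = int(np.argmax(np.abs(D @ residual)))
    chosen.append(idx)
    A = D[chosen].T                        # (200, k) design matrix
    coef, *_ = np.linalg.lstsq(A, target, rcond=None)
    approx = A @ coef
    residual = target - approx

err0 = np.linalg.norm(target)
err = np.linalg.norm(residual)
print(f"relative L2 error with 20 atoms: {err / err0:.4f}")
```

The sparsity level (here 20 atoms) plays the role of the network width; the rate at which the error decays as the number of atoms grows is exactly the kind of approximation rate the talk analyzes.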

Tuesday, May 11, 2021
11:00 AM, Zoom ID 939 3177 8552