Math 277: Topics in Approximation Theory and Deep Learning

TuTh 3:30pm-5:00pm, APM 7321


  • Welcome to the course!

Instructor: Alex Cloninger

  • Email: acloninger (at) ucsd (dot) edu

  • Phone: 534-4889

  • Office: AP&M 5747 (5th floor annex)

  • Office hours: by appointment


This course will cover applied approximation theory, specifically in the context of deep neural networks for learning. We will also discuss bounds on the amount of training data needed, connections to kernel methods, and scattering transforms.

This class will have multiple project presentations involving current research directions in the field. We will discuss the paper list, coding aspects, and other details as the quarter moves forward. Homework may also be assigned to clarify details of the material.

Paper list


Grades will be based on class participation and homework/projects.


  • September 23: Introduction and Approximation Theory Terms

  • September 28: Fourier Example and Simple Functions

  • September 30: Haar Approximations

  • October 5: Artificial Neural Networks and Universality

  • October 7: Dependence on Dimension, Discussion of Lottery Ticket Hypothesis

  • October 12: Bounding Complexity with Yarotsky Nets

  • October 14: Bounding Complexity with Dimension

  • October 19: Bounding Complexity by Function Complexity

  • October 21: Rademacher Complexity

  • October 26: Student Presentation: Deep vs. Shallow Networks; Rademacher Complexity

  • October 28: Guest Presentation (Xiuyuan Cheng): Neural Network Optimization and Neural Tangent Kernels

  • November 2: Bounding Network Rademacher Complexity and Generalization

  • November 4: Neural Tangent Kernels