
Neural Parametric Fokker-Planck equations

Shu Liu
Georgia Tech


We develop and analyze a numerical method for solving high-dimensional Fokker-Planck equations by leveraging generative models from deep learning. Our starting point is a formulation of the Fokker-Planck equation as a system of ordinary differential equations (ODEs) on a finite-dimensional parameter space, with the parameters inherited from generative models such as normalizing flows. We call such ODEs "neural parametric Fokker-Planck equations". Because the Fokker-Planck equation can be viewed as the 2-Wasserstein gradient flow of the relative entropy (also known as KL divergence), we derive the ODE as the 2-Wasserstein gradient flow of the relative entropy constrained to the manifold of probability densities generated by neural networks. For numerical computation, we design a bi-level minimization scheme for the time discretization of the proposed ODE. The algorithm is sampling-based and thus readily handles computations in high-dimensional spaces. Moreover, we establish asymptotic convergence guarantees and error bounds for both the continuous and discrete schemes of the neural parametric Fokker-Planck equation. Several numerical examples are provided to illustrate the performance of the proposed algorithms and analysis.
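To make the parametric gradient-flow idea concrete, here is a minimal toy sketch (not the speaker's neural-network scheme): restrict the density to a Gaussian family N(mu, sigma^2), realized as the pushforward of N(0, 1) by the map T_theta(z) = mu + sigma*z. For this family the pulled-back 2-Wasserstein metric is the identity in (mu, sigma), so the parametric Fokker-Planck ODE reduces to plain gradient descent on the relative entropy H(mu, sigma) = KL(N(mu, sigma^2) || N(0, 1)) = (mu^2 + sigma^2 - 1 - 2*log(sigma))/2, and it reproduces the Ornstein-Uhlenbeck Fokker-Planck dynamics exactly. All names below are illustrative, not from the talk.

```python
# Toy parametric Wasserstein gradient flow on a Gaussian family
# (illustrative sketch; the actual method uses neural-network pushforwards
# and a sampling-based bi-level minimization for the time discretization).

def grad_H(mu, sigma):
    """Gradient of H(mu, sigma) = (mu^2 + sigma^2 - 1 - 2*log(sigma)) / 2,
    the KL divergence from N(mu, sigma^2) to the target N(0, 1)."""
    return mu, sigma - 1.0 / sigma

def solve(mu0=2.0, sigma0=3.0, dt=1e-3, steps=10_000):
    """Forward-Euler time discretization of theta' = -grad H(theta)."""
    mu, sigma = mu0, sigma0
    for _ in range(steps):
        gmu, gsigma = grad_H(mu, sigma)
        mu -= dt * gmu
        sigma -= dt * gsigma
    return mu, sigma

mu, sigma = solve()
print(mu, sigma)  # flows toward the stationary density N(0, 1)
```

The iterates converge to (mu, sigma) = (0, 1), the stationary solution of the Ornstein-Uhlenbeck Fokker-Planck equation; in the talk's setting the simple Euler step is replaced by a bi-level minimization so that each update can be computed from samples in high dimensions.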

Tuesday, February 22, 2022
11:00 AM, Zoom ID 922 9012 0877