Qiang Liu UT Austin
Time: 2022-10-28, 10:00-11:00
Venue: FIT 1-222, FIT Building
We consider the problem of learning a transport map between two distributions that are observed only through unpaired data points. This problem provides a unified framework for a variety of fundamental tasks in machine learning: generative modeling transforms a Gaussian (or other elementary) random variable into realistic data points; domain transfer moves data points from one domain to another; optimal transport (OT) solves the more challenging problem of finding a "best" transport map that minimizes a given transport cost. Unfortunately, despite this unified view, no single algorithm solves the transport mapping problem efficiently in all settings. Existing algorithms must be developed case by case, and tend to be complicated or computationally expensive.
In this talk, I will show that the problem can be addressed with a simple and unified algorithm that learns neural ordinary differential equations (ODEs) and stochastic differential equations (SDEs). One of the key ideas is to learn dynamics that transfer efficiently between the two distributions by traveling along paths that are as straight as possible. These methods both simplify and improve on standard denoising diffusion models: the ODE models learned by our method can generate high-quality results with a single discretization step, a significant speedup over existing diffusion generative models. I will highlight applications in molecule generation and antibody design.
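To make the straight-path idea concrete, here is a minimal sketch of the kind of training setup the abstract describes: points are placed on straight lines between unpaired samples of the two distributions, a velocity field is regressed onto the line direction, and the learned ODE is integrated with Euler steps. The toy two-Gaussian data, function names, and the constant velocity field used below are illustrative assumptions for this note, not the speaker's actual code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy unpaired samples from two 2-D distributions (assumed for illustration)
x0 = rng.normal(size=(1024, 2))             # source: standard Gaussian
x1 = rng.normal(loc=4.0, size=(1024, 2))    # target: shifted Gaussian

def straight_path_batch(x0, x1, rng):
    """Build one regression batch on straight interpolation paths.

    Points x_t = t*x1 + (1-t)*x0 lie on the line between a source and a
    target sample; the regression target is the constant line direction
    x1 - x0, so a perfectly fit velocity field is straight in time.
    """
    t = rng.uniform(size=(len(x0), 1))
    xt = t * x1 + (1.0 - t) * x0
    return xt, t, x1 - x0

def euler_sample(v, x0, n_steps=1):
    """Integrate dx/dt = v(x, t) from t=0 to t=1 with Euler steps."""
    x, dt = x0.copy(), 1.0 / n_steps
    for k in range(n_steps):
        x = x + dt * v(x, k * dt)
    return x

# For this shifted-Gaussian toy pair, a constant velocity field equal to the
# mean displacement is already straight, so a SINGLE Euler step carries the
# source distribution onto the target mean -- the one-step behavior the
# abstract highlights.
shift = x1.mean(axis=0) - x0.mean(axis=0)
samples = euler_sample(lambda x, t: shift, x0, n_steps=1)
```

In practice the velocity field is a neural network trained by least-squares regression on `straight_path_batch` outputs; the straighter the learned paths, the fewer discretization steps the sampler needs.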
Dr. Liu is the chief AI scientist at Helixon and an assistant professor of Computer Science at UT Austin, where he leads the Statistical Learning & AI Group. His research interests include statistical machine learning, reinforcement learning, probabilistic graphical models, Bayesian inference, deep learning, human computation and crowdsourcing, and other data-driven applications. He is a recipient of an NSF CAREER award, among numerous other accolades.