The Anisotropic Noise in Stochastic Gradient Descent: Its Behavior of Escaping from Minima and Regularization Effects

Speaker: Dr. Zhanxing Zhu, Peking University
Time: 2018-07-04, 11:00-12:00
Venue: FIT 1-222
Abstract:

Understanding the behavior of stochastic gradient descent (SGD) in the context of deep neural networks has attracted significant attention recently. Along this line, we theoretically study a general form of gradient-based optimization dynamics with unbiased noise, which unifies SGD and standard Langevin dynamics. Through investigating this general optimization dynamics, we analyze the behavior of SGD in escaping from minima and its regularization effects. A novel indicator is derived to characterize the efficiency of escaping from minima by measuring the alignment between the noise covariance and the curvature of the loss function. Based on this indicator, two conditions are established that show which types of noise structure are superior to isotropic noise in terms of escaping efficiency. We further show that the anisotropic noise in SGD satisfies these two conditions, and thus helps SGD escape from sharp, poor minima effectively, towards more stable, flat minima that typically generalize well. We verify this understanding by comparing the anisotropic diffusion of SGD with full gradient descent plus isotropic diffusion (i.e., Langevin dynamics) and with other types of position-dependent noise.
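The abstract does not spell out the indicator, but a minimal toy instantiation consistent with "measuring the alignment between the noise covariance and the curvature of the loss function" is the trace Tr(HΣ), where H is the Hessian at the minimum and Σ is the noise covariance. The Python sketch below is an illustrative assumption, not the paper's construction: the 2-D quadratic loss, the covariances Sigma_sgd and Sigma_iso, and the helper names are all hypothetical. It compares curvature-aligned anisotropic noise against isotropic noise of the same total magnitude, both through this trace indicator and by simulating a few steps of noisy gradient descent started at the minimum.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-D quadratic loss L(w) = 0.5 * w^T H w with an ill-conditioned
# Hessian: one sharp direction (10.0) and one flat direction (0.1).
H = np.diag([10.0, 0.1])
d = H.shape[0]

# Hypothetical noise covariances, normalized to the same total magnitude:
# an anisotropic covariance aligned with the curvature (a stand-in for
# SGD gradient noise) and an isotropic one (as in Langevin dynamics).
Sigma_sgd = H / np.trace(H)
Sigma_iso = np.eye(d) / d

def alignment_indicator(H, Sigma):
    """Tr(H @ Sigma): a simple measure of how well the noise covariance
    aligns with the loss curvature; larger suggests faster escape."""
    return np.trace(H @ Sigma)

def mean_loss_after(Sigma, steps=10, eta=0.05, trials=5000):
    """Run noisy gradient descent, w <- w - eta * grad + sqrt(eta) * noise,
    for a few steps from the minimum; return the mean loss reached.
    For short horizons the expected loss grows roughly like Tr(H @ Sigma)."""
    chol = np.linalg.cholesky(Sigma + 1e-12 * np.eye(d))
    losses = []
    for _ in range(trials):
        w = np.zeros(d)
        for _ in range(steps):
            w = w - eta * (H @ w) + np.sqrt(eta) * (chol @ rng.standard_normal(d))
        losses.append(0.5 * w @ H @ w)
    return float(np.mean(losses))

for name, Sigma in [("anisotropic (SGD-like)", Sigma_sgd),
                    ("isotropic (Langevin)  ", Sigma_iso)]:
    print(name,
          "indicator =", round(alignment_indicator(H, Sigma), 3),
          "mean loss =", round(mean_loss_after(Sigma), 3))
```

On this toy loss, the curvature-aligned noise yields both a larger indicator (about 9.9 versus 5.05) and a higher mean loss after ten steps, i.e., it diffuses out of the sharp minimum faster, which mirrors the abstract's claim that anisotropic SGD noise escapes sharp minima more efficiently than isotropic noise of comparable magnitude.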


Biography:

Dr. Zhanxing Zhu is currently a research assistant professor at the Center for Data Science, Peking University, and at the Beijing Institute of Big Data Research. He obtained his Ph.D. in machine learning from the University of Edinburgh in 2016. His research interests cover machine learning and its applications in various domains. He currently focuses on deep learning theory and optimization algorithms, reinforcement learning, and applications in traffic, computer security, computer graphics, medical imaging, etc. He has published more than 20 AI papers in top journals and conferences, such as NIPS, ICML, ACL, IJCAI, AAAI, and ECML.