Optimizing neural networks is a highly nonconvex problem, and even optimizing a 2-layer neural network can be challenging. In recent years, many different approaches have been proposed for learning 2-layer neural networks under different assumptions. This talk will give a brief survey of these approaches and discuss some new results using spectral methods and analysis of the optimization landscape.
Rong Ge is an assistant professor in the computer science department at Duke University. He received his Ph.D. from Princeton University and worked as a post-doctoral researcher at Microsoft Research New England. He is broadly interested in theoretical computer science and machine learning. His research focuses on designing algorithms with provable guarantees for machine learning problems, using techniques including tensor decompositions and non-convex optimization. His research has received an NSF CAREER Award and a Sloan Fellowship.