Optimization methods have a longer history than digital computers. The theoretical framework of optimization has changed greatly over the past century, driven by rapidly increasing computational power and application scales. In this talk, we will take a closer look at the components of the modern oracle complexity framework in their historical context. Through a more detailed discussion of what we mean by "an algorithm converges at the rate of 1/epsilon," we will reveal the strengths of oracle complexity as well as its shortcomings in machine learning applications. We then show how the framework can be adjusted to better align with and guide machine learning experiments, and finally explain why much work remains to be done in this direction.
Jingzhao Zhang is a PhD student in the EECS department at MIT, affiliated with LIDS. He is co-advised by Prof. Ali Jadbabaie and Prof. Suvrit Sra. His research interests lie in experiment-driven theoretical analysis of optimization methods. Before starting his PhD program at MIT, he was an undergraduate at UC Berkeley, where he was advised by Prof. Laura Waller.