Speaker: Binhang Yuan
Time: 2023-03-01, 18:00-19:30
Location: FIT Building, Room 1-222 + Zoom Meeting: 895 3056 1898 (passcode: 962687)
Abstract:
The recent success of machine learning (ML) has benefited dramatically from the exponential growth of ML model capacity. However, this enormous capacity also comes at significantly higher cost. In practice, the high cost of ML stems from three sources: i) the cost of optimizing and deploying ML services over ever-changing hardware; ii) low hardware utilization due to parallel/distributed communication overhead; and iii) the high cost of accessing the hardware itself.
My work attempts to reduce cost in all three categories. First, developing and deploying ML workflows in ever-changing execution environments is a tedious and time-consuming job, and scaling out the computation requires significant engineering effort; my work proposes new abstractions for ML system design and implementation that combine expressivity, ease of optimization, and high performance. Second, in parallel/distributed ML training, communication is usually the main bottleneck limiting hardware efficiency; my work explores system relaxations of communication under different parallel training paradigms to increase hardware efficiency without compromising statistical efficiency. Third, building on these system optimizations and relaxations, my work investigates how to deploy ML services over a decentralized open collective environment consisting of much cheaper and underutilized GPUs. The result is promising: even when the decentralized interconnections are 100X slower than a data center network, with efficient scheduling the end-to-end training throughput is only 1.7~3.5X slower than state-of-the-art solutions inside a data center.
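To make the idea of a communication relaxation concrete (the abstract does not name a specific scheme, so this is an illustrative assumption, not the speaker's method): one representative relaxation in data-parallel training is lossy gradient compression, where each worker quantizes its gradient before aggregation, trading a little statistical precision for a much smaller communication volume. The NumPy sketch below, with hypothetical helpers quantize_8bit/dequantize_8bit, simulates 8-bit quantization ahead of an all-reduce-style averaging step:

```python
import numpy as np

def quantize_8bit(grad: np.ndarray):
    """Lossy 8-bit quantization of a gradient tensor (illustrative only).

    Returns int8 codes plus the scale needed to dequantize, shrinking
    the communicated payload roughly 4x versus float32.
    """
    scale = float(np.max(np.abs(grad))) / 127.0
    if scale == 0.0:
        return np.zeros(grad.shape, dtype=np.int8), 0.0
    codes = np.clip(np.round(grad / scale), -127, 127).astype(np.int8)
    return codes, scale

def dequantize_8bit(codes: np.ndarray, scale: float) -> np.ndarray:
    """Recover an approximate float32 gradient from the int8 codes."""
    return codes.astype(np.float32) * scale

# Each worker quantizes its local gradient before the (simulated) all-reduce;
# the aggregated update is computed from the dequantized codes.
rng = np.random.default_rng(0)
local_grads = [rng.standard_normal(1024).astype(np.float32) for _ in range(4)]
payloads = [quantize_8bit(g) for g in local_grads]  # what goes over the wire
avg_update = np.mean([dequantize_8bit(c, s) for c, s in payloads], axis=0)

# Compare against exact float32 averaging to see the relaxation's error.
exact = np.mean(local_grads, axis=0)
print("max quantization error:", np.max(np.abs(avg_update - exact)))
```

In a real deployment the int8 codes, not the float32 gradients, are what travel over the slow links, which is why such relaxations matter most when interconnects are orders of magnitude slower than a data center network.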
Biography:
Binhang Yuan is a postdoctoral researcher in the Department of Computer Science at ETH Zurich, under the supervision of Dr. Ce Zhang. He received his Bachelor of Science (2013) in Computer Science from Fudan University, and his Master of Science (2016) and Ph.D. (2020) in Computer Science from Rice University.