
Optimization of discrete representations in machine learning

Speaker: Dianbo Liu, National University of Singapore
Time: 2024-11-27, 10:00-11:00
Venue: FIT 1-222

Abstract:

Discrete representations play a crucial role in many deep learning architectures, yet their non-differentiable nature poses significant challenges for gradient-based optimization. To address this issue, various gradient estimators have been developed, including the Straight-Through Gumbel-Softmax (ST-GS) estimator, which combines the Straight-Through Estimator (STE) with the Gumbel-based reparameterization trick. In this talk, we share several strategies recently developed by our team to improve the efficiency of discrete optimization.
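For readers unfamiliar with the estimator mentioned above, the following is a minimal sketch of ST-GS in PyTorch, assuming the standard textbook formulation (Gumbel-perturbed logits, softmax relaxation, straight-through discretization); it illustrates the baseline technique only and does not reproduce the improvements presented in the talk.

```python
import torch
import torch.nn.functional as F

def st_gumbel_softmax(logits: torch.Tensor, tau: float = 1.0) -> torch.Tensor:
    """Sample a one-hot vector from `logits` while keeping a differentiable path."""
    # Gumbel reparameterization: perturb the logits with Gumbel(0, 1) noise.
    gumbel_noise = -torch.log(-torch.log(torch.rand_like(logits) + 1e-10) + 1e-10)
    y_soft = F.softmax((logits + gumbel_noise) / tau, dim=-1)

    # Discretize to a one-hot vector for the forward pass.
    index = y_soft.argmax(dim=-1, keepdim=True)
    y_hard = torch.zeros_like(y_soft).scatter_(-1, index, 1.0)

    # Straight-through trick: forward value is y_hard, gradient is that of y_soft.
    return y_hard + (y_soft - y_soft.detach())

# Example: a batch of two categorical latents with 4 classes each.
logits = torch.randn(2, 4, requires_grad=True)
sample = st_gumbel_softmax(logits, tau=0.5)
sample.sum().backward()  # gradients reach `logits` via the soft relaxation
print(sample, logits.grad)
```

PyTorch also ships this behavior as `F.gumbel_softmax(logits, tau=tau, hard=True)`; the manual version above only makes the two ingredients (Gumbel reparameterization and the straight-through pass) explicit.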

Short Bio:

Dianbo Liu is the leader of the Cognitive AI for Science team (CogAI4SCI.com) and an assistant professor at the National University of Singapore. Before starting the CogAI4Sci team, he was a group leader at the Broad Institute of MIT and Harvard. Prior to the Broad Institute, he worked as a postdoctoral researcher with Prof. Yoshua Bengio (a Turing Award winner) and led the Humanitarian AI team at the Mila-Quebec AI Institute. This followed his fellowship training and studies in medical informatics at Harvard University. Dianbo earned his PhD from the University of Dundee, Scotland, under the supervision of Prof. Timothea Newman. During his doctoral studies, he received the Vest Scholarship from the Massachusetts Institute of Technology (MIT) and was a special graduate student at the MIT Computer Science and Artificial Intelligence Lab. During his training, Dianbo also co-founded two start-ups, "GeneTank" and "SecureAILabs", to advance AI applications in biomedical sciences.