Advanced Theoretical Computer Science
This course is designed for junior graduate students interested in theoretical computer science and related disciplines. Students registering for this course are expected to have a sound foundation in Discrete Mathematics and Algorithm Design, as well as basic knowledge of machine learning.
Lectures, all delivered in English, introduce current research directions, the latest developments, and hot topics in computer science, and carry out in-depth discussions on issues of common interest. Featured discussions will help students determine their future research interests and goals.
The course quickly recalls the basics of convex optimization and machine learning: linear/logistic regression, regularization, Newton's method, stochastic gradient descent (asynchronous and variance-reduced variants), generative vs. discriminative models, and the bias-variance tradeoff. It then surveys off-the-shelf machine learning and prediction algorithms: k-NN, SVM, the kernel trick, clustering, AdaBoost, gradient boosting, and random forests. Next come online learning and sequential prediction: multi-armed bandits, universal portfolios, the multiplicative weights method, online convex optimization, and basic time series. Finally, it covers linear algebra-based learning algorithms: SVD, principal component analysis (PCA), independent component analysis (ICA), nonnegative matrix factorization (NMF), topic modeling, matrix completion, dictionary learning, tensor methods, and spectral clustering.
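To make the optimization portion of the syllabus concrete, here is a minimal sketch of stochastic gradient descent applied to least-squares linear regression. The dataset, step size, and epoch count are illustrative choices, not part of the course materials.

```python
import numpy as np

def sgd_linear_regression(X, y, lr=0.01, epochs=100, seed=0):
    """Fit w minimizing (1/2n) * ||Xw - y||^2 with one-sample SGD steps."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        for i in rng.permutation(n):       # shuffle sample order each epoch
            grad = (X[i] @ w - y[i]) * X[i]  # gradient of the i-th loss term
            w -= lr * grad
    return w

# Synthetic check: recover a known weight vector from noiseless data.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))
w_true = np.array([2.0, -1.0, 0.5])
y = X @ w_true
w_hat = sgd_linear_regression(X, y, lr=0.05, epochs=200)
```

In this noiseless setting SGD with a constant step size converges to the exact minimizer, since every per-sample gradient vanishes there; with noisy data one would decay the step size or use a variance-reduction method such as SVRG, as discussed in the course.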
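For the online learning topics, the multiplicative weights method can be sketched in a few lines as the Hedge algorithm for prediction with expert advice. The loss sequence and the learning rate eta below are made-up illustrations, not course-specified values.

```python
import numpy as np

def hedge(losses, eta=0.5):
    """Run Hedge on losses: a (T, K) array of per-round expert losses in [0, 1].

    Returns the learner's cumulative expected loss and the final weight vector.
    """
    T, K = losses.shape
    w = np.ones(K)                        # start with uniform weights
    total = 0.0
    for t in range(T):
        p = w / w.sum()                   # play the normalized weights
        total += p @ losses[t]            # expected loss this round
        w *= np.exp(-eta * losses[t])     # exponentially downweight bad experts
    return total, w / w.sum()

# Two experts: expert 0 is always right (loss 0), expert 1 always wrong.
losses = np.tile([0.0, 1.0], (50, 1))
total, p = hedge(losses)
```

The learner's weight concentrates on the better expert exponentially fast, so its cumulative loss stays within an additive O(sqrt(T log K)) regret of the best expert in hindsight.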
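Among the linear algebra-based methods listed above, PCA via the SVD is the canonical example. The following is a small self-contained sketch on synthetic data; the dataset and component count are illustrative assumptions.

```python
import numpy as np

def pca(X, k):
    """Return the top-k principal directions and the projected data."""
    Xc = X - X.mean(axis=0)               # center the data first
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    components = Vt[:k]                   # rows are unit-norm principal directions
    return components, Xc @ components.T  # directions and low-dim projections

rng = np.random.default_rng(0)
# Data stretched along the first axis so the top component is predictable.
X = rng.normal(size=(500, 2)) @ np.array([[3.0, 0.0], [0.0, 0.3]])
comps, Z = pca(X, k=1)
```

Because the right singular vectors of the centered data matrix are the eigenvectors of the sample covariance, the first row of `comps` aligns (up to sign) with the high-variance axis.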
The course is conducted mainly through lectures and a seminar series, supplemented by featured discussions. Students are required to complete paper-reading exercises and give summary reports, with a view to helping them identify their future research interests.