A Brief Introduction from Optimization to Markov Chain Monte Carlo to Diffusion
Abstract

Modern machine learning heavily relies on optimization algorithms for training parameterized models. However, the impact of these algorithms extends well beyond routine training, guiding essential statistical methodologies for high-dimensional data analysis and inspiring cutting-edge generative approaches such as denoising diffusion models. In this talk, I will begin by reviewing the fundamental principles of classical optimization algorithms. I will then expand the discussion by examining optimization within the framework of the Wasserstein space, revealing its theoretical connections to innovative Markov Chain Monte Carlo (MCMC) samplers. Finally, I will explore the interplay between MCMC and diffusion processes, offering insights into diffusion models from both a sampling and optimization standpoint and highlighting recent developments in this rapidly evolving field.
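To give a concrete sense of the connection between Wasserstein-space optimization and MCMC samplers alluded to above (a standard example, not necessarily the formulation used in the talk): the overdamped Langevin dynamics

\[ dX_t = -\nabla f(X_t)\, dt + \sqrt{2}\, dB_t \]

has stationary distribution \(\pi \propto e^{-f}\), and the evolution of its marginal law \(\rho_t\) can be viewed as the gradient flow of the KL divergence \(\mathrm{KL}(\rho \,\|\, \pi)\) in Wasserstein-2 space (Jordan, Kinderlehrer, and Otto, 1998). Discretizing either picture yields the Langevin Monte Carlo sampler, so optimization over distributions and sampling become two views of the same algorithm.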


Speaker: Mr. Xunpeng HUANG
Date: 16 April 2025 (Wednesday)
Time: 9:30am – 10:30am
Zoom: Link
Poster: Click here


Biography

Xunpeng Huang is a final-year Ph.D. student at the Hong Kong University of Science and Technology (HKUST), co-advised by Prof. Tong Zhang and Prof. Yang Xiang. He is currently a visiting student at the University of California, San Diego, working closely with Prof. Yi-an Ma, and he also collaborates with Prof. Difan Zou (HKU). Previously, he earned his M.Sc. in Computer Science and Technology from the University of Science and Technology of China under the supervision of Prof. Enhong Chen, and he spent time at ByteDance AI Lab under the guidance of Prof. Lei Li. His research focuses on machine learning algorithms and theory, spanning sampling algorithms, stochastic/nonconvex optimization, mean-field analysis, and diffusion theory.