Academic Seminar
Dr. 金睿楠: A Comprehensive Framework for Analyzing the Convergence of Adam: Bridging the Gap with SGD

Speaker: Dr. 金睿楠, Hong Kong Institute of Science & Innovation, Chinese Academy of Sciences

Inviters: 张波, 程涤非

Title: A Comprehensive Framework for Analyzing the Convergence of Adam: Bridging the Gap with SGD

Language: Chinese

Time & Venue: 2025.01.07, 10:00-11:00, Room N208; Tencent Meeting: 766-569-933

Abstract: Adaptive Moment Estimation (Adam) is an important optimization algorithm in deep learning. However, despite its practical success, the theoretical understanding of Adam's convergence has been constrained by stringent assumptions, such as almost surely bounded stochastic gradients or uniformly bounded gradients, which are more restrictive than those typically required for analyzing Stochastic Gradient Descent (SGD). In this talk, we introduce a novel and comprehensive framework for analyzing the convergence properties of Adam.
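For readers unfamiliar with the algorithm under discussion, the standard Adam update (Kingma & Ba, 2015) combines exponential moving averages of the gradient and its elementwise square with bias correction; the talk concerns the convergence of this scheme. A minimal NumPy sketch (hyperparameter defaults follow the original paper, not this talk):

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One step of the standard Adam update.

    m, v: running first/second moment estimates; t: 1-based step counter.
    """
    m = beta1 * m + (1 - beta1) * grad        # EMA of gradients (first moment)
    v = beta2 * v + (1 - beta2) * grad**2     # EMA of squared gradients (second moment)
    m_hat = m / (1 - beta1**t)                # bias correction for initialization at zero
    v_hat = v / (1 - beta2**t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)  # per-coordinate adaptive step
    return theta, m, v

# Toy usage: minimize f(x) = x^2, whose gradient is 2x.
theta = np.array([1.0])
m, v = np.zeros(1), np.zeros(1)
for t in range(1, 3001):
    theta, m, v = adam_step(theta, 2 * theta, m, v, t)
```

Note the per-coordinate division by `sqrt(v_hat)`: it is precisely this data-dependent step size that makes Adam's analysis harder than SGD's and motivates the bounded-gradient assumptions the abstract mentions.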