Machine Learning and Data Science PhD Student Seminar Series (Session 91) — Shifted Composition for Bounding Information-Theoretic Divergences
Speaker: 忻宇辰 (Peking University)
Time: 2025-09-25, 16:00–17:00
Venue: Tencent Meeting, ID 989-3593-2097
Abstract:
Bounding the divergence between the laws of two stochastic processes is a classical topic with many applications in sampling and other fields. Standard approaches, such as the Girsanov method and the interpolation method, can control the KL-divergence error of some basic sampling algorithms. However, it remains unclear how to extend these methods to more general algorithms or settings.
In this talk, we introduce the shifted composition rule, based on a series of works by Altschuler et al. This information-theoretic principle yields a user-friendly framework for bounding the long-time discretization error of sampling algorithms. We also discuss its application to proving reverse transport inequalities for diffusions.
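As background (a standard inequality, not part of the original announcement): the classical composition rule for KL divergence, which the shifted composition rule refines, combines the chain rule with data processing. For Markov kernels \(P, Q\) and initial laws \(\mu, \nu\),
\[
\mathrm{KL}\big(\mu P \,\|\, \nu Q\big) \;\le\; \mathrm{KL}(\mu \,\|\, \nu) \;+\; \mathbb{E}_{x \sim \mu}\!\left[\mathrm{KL}\big(P(x,\cdot) \,\|\, Q(x,\cdot)\big)\right].
\]
Iterating this over the steps of two discretized processes gives a one-step-error decomposition of the total KL divergence; roughly speaking, the shifted variant additionally allows spatial shifts between the two compared trajectories, trading some KL terms for transport-type (Wasserstein) terms.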
About the Speaker:
About the Forum: This online forum is held biweekly. Each session invites a PhD student to give a systematic and in-depth introduction to a frontier topic, in areas including but not limited to machine learning, high-dimensional statistics, operations research and optimization, and theoretical computer science.
Your participation is warmly welcomed!

Scan the QR code to follow the official WeChat account of the Center for Statistical Science, Peking University, for more seminar information!