# Generalization Theory

## WeChat Group

To make discussion easier, we have created a WeChat group. Add me on WeChat first (WeChat ID: adoutengjiaye) and I will invite you to the group.

## Video and Notes

### Chapter 0: Preliminary

- 0.1.1: Preface [video] [slides]
- 0.2.1: The ERM Model [video] [slides]
- 0.3.1: No-Free-Lunch Theorem [video] [slides]
- 0.3.2: PAC Learning [video] [slides]
- 0.3.3: Finite Hypothesis Classes [video] [slides]

### Chapter 1: Traditional Statistics

- 1.1.1: Parameter Consistency [video] [slides]
- 1.1.2: Ridge Regression [video] [slides]
- 1.2.1: Generalized Linear Models [video] [slides]

### Chapter 2: Uniform Convergence

- 2.1.1: Uniform Convergence [video] [slides]
- 2.2.1: VC Dimension [video] [slides]
- 2.3.1: Rademacher Complexity [video] [slides]

### Chapter 3: Algorithmic Stability

- 3.1.1: Algorithmic Stability [video] [slides]
- 3.1.2: Proof of Algorithmic Stability [video] [slides]

### Chapter 4: PAC-Bayesian

- 4.1.1: PAC-Bayesian [video] [slides]
- 4.1.2: PAC-Bayesian Proof [video] [slides]

### Chapter 5: Information-based

- 5.1.1: Generalization and Information Theory (Information-based Generalization) [video] [slides]

### Chapter 6: Implicit Bias

- 6.1.1: Implicit Bias [video] [slides]

## Reference

- [book] Understanding Machine Learning: From Theory to Algorithms, Shai Shalev-Shwartz and Shai Ben-David (2014)