Theory Seminar: Abhishek Shetty
The speaker for this week’s theory seminar is Abhishek Shetty from UC Berkeley. The talk will take place on Friday 4/21 at 1PM in Levine 307.
Title: Optimal PAC Bounds without Uniform Convergence
Abstract: In statistical learning theory, determining the sample complexity of realizable binary classification for VC classes was a long-standing open problem. The results of Simon and Hanneke established sharp upper bounds in this setting. However, the reliance of their argument on the uniform convergence principle limits its applicability to more general learning settings such as multiclass classification. In this talk, we will discuss a simple technique that addresses this issue. We will present optimal high-probability risk bounds through a framework that circumvents the limitations of uniform convergence arguments.
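For context (this gloss is an editorial addition, not part of the speaker's abstract), the sharp realizable bound referred to above states that for a class of VC dimension $d$, an optimal learner trained on $n$ i.i.d. samples achieves, with probability at least $1-\delta$:

```latex
% Sharp realizable PAC risk bound (Hanneke, building on Simon):
\mathrm{er}(h_n) \;\le\; O\!\left(\frac{d + \log(1/\delta)}{n}\right),
% whereas classical uniform-convergence analyses of ERM only guarantee
\mathrm{er}(\hat{h}_n) \;\le\; O\!\left(\frac{d \log(n/d) + \log(1/\delta)}{n}\right).
```

The extra $\log(n/d)$ factor in the uniform convergence rate is exactly the gap the techniques discussed in the talk aim to remove in settings beyond binary classification.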
In addition to binary classification, we will see applications in three settings where uniform convergence is provably sub-optimal. For multiclass classification, we prove an optimal risk bound scaling with the one-inclusion hypergraph density of the class, addressing the suboptimality of the analysis by Daniely and Shalev-Shwartz. In partial concept classification, we determine the optimal sample complexity bound, resolving a question posed by Alon, Hanneke, Holzman, and Moran. In the context of realizable bounded regression with absolute loss, we derive an optimal risk bound that relies on a modified version of the scale-sensitive dimension, refining the results of Bartlett and Long. Our rates surpass standard uniform convergence-based results due to the smaller complexity measure in our risk bound.
Based on joint work with Ishaq Aden-Ali, Yeshwanth Cherapanamjeri, and Nikita Zhivotovskiy.