Article
Localization of VC classes: Beyond local Rademacher complexities
In statistical learning, the excess risk of empirical risk minimization (ERM) is controlled by $\left(\frac{\mathrm{COMP}_n(\mathcal{F})}{n}\right)^{\alpha}$, where $n$ is the size of the learning sample, $\mathrm{COMP}_n(\mathcal{F})$ is a complexity term associated with a given class $\mathcal{F}$, and $\alpha \in [\tfrac{1}{2}, 1]$ interpolates between slow and fast learning rates. In this paper we introduce an alternative localization approach for binary classification that leads to a novel complexity measure: fixed points of the local empirical entropy. We show that this complexity measure gives tight control over $\mathrm{COMP}_n(\mathcal{F})$ in the upper bounds under bounded noise. Our results are accompanied by a minimax lower bound that involves the same quantity. In particular, we essentially answer the question of the optimality of ERM under bounded noise for general VC classes.
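For concreteness, the endpoints of this interpolation correspond to the familiar slow and fast rates; the display below is an illustrative reading of the exponent $\alpha$ in the bound above (a sketch, not an additional claim of the paper):

\[
\alpha = \tfrac{1}{2}:\ \left(\frac{\mathrm{COMP}_n(\mathcal{F})}{n}\right)^{1/2}\ \text{(slow rate)},
\qquad
\alpha = 1:\ \frac{\mathrm{COMP}_n(\mathcal{F})}{n}\ \text{(fast rate)}.
\]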