
Article

Rates of convergence for density estimation with GANs

We undertake a precise study of the non-asymptotic properties of vanilla generative adversarial networks (GANs) and derive theoretical guarantees for the problem of estimating an unknown $d$-dimensional density $p$ under a proper choice of the class of generators and discriminators. We prove that the resulting density estimate converges to $p$ in terms of the Jensen-Shannon (JS) divergence at the rate $n^{-2\beta/(2\beta+d)}$, where $n$ is the sample size and $\beta$ determines the smoothness of $p$. This is the first result in the literature on density estimation using vanilla GANs with JS rates faster than $n^{-1/2}$ in the regime $\beta>d/2$.
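
The claim that this rate beats $n^{-1/2}$ precisely when $\beta>d/2$ follows from a one-line comparison of exponents; the short check below is an illustration added here, not part of the original abstract:
$$
n^{-2\beta/(2\beta+d)} < n^{-1/2}
\;\iff\; \frac{2\beta}{2\beta+d} > \frac{1}{2}
\;\iff\; 4\beta > 2\beta + d
\;\iff\; \beta > \frac{d}{2}.
$$
For example, with $d=2$ and $\beta=3$ the exponent equals $2\cdot 3/(2\cdot 3+2)=3/4$, giving the rate $n^{-3/4}$, which is indeed faster than $n^{-1/2}$.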