Minimax adaptive dimension reduction for regression
In this paper, we address the problem of regression estimation in the context of a d-dimensional predictor when d is large. We propose a general model in which the regression function is a composite function; this model is a nonlinear extension of the usual sufficient dimension reduction setting. Our strategy for estimating the regression function is based on the estimation of a new parameter, called the reduced dimension. We adopt a minimax point of view and provide both lower and upper bounds on the optimal rates of convergence for the estimation of the regression function within our model. We prove that our estimator adapts, in the minimax sense, to the unknown value of the reduced dimension and therefore achieves fast rates of convergence when the reduced dimension is small compared to d.