This paper discusses a universal approach to the construction of confidence regions for level sets {h(x) ≥ 0} ⊂ R^q of a function h of interest. The proposed construction is based on a plug-in estimate of the level sets using an appropriate estimator hn of h. The approach provides finite-sample upper and lower confidence limits, which lead to generic conditions under which the constructed confidence regions asymptotically achieve a prescribed coverage level. The construction requires an estimate of quantiles of the distribution of sup_{∆n} |hn(x) − h(x)| for appropriate sets ∆n ⊂ R^q. In contrast to related work in the literature, the existence of a weak limit for an appropriately normalized process {hn(x), x ∈ D} is not required. This adds significantly to the challenge of deriving asymptotic results for the corresponding coverage level. Our approach is exemplified in the case of a density level set, utilizing a kernel density estimator and a bootstrap procedure.
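The plug-in construction described above can be sketched in one dimension: estimate the density with a kernel density estimator, bootstrap the sup-norm deviation to obtain a quantile q, and threshold the plug-in estimate at level ± q to get inner and outer confidence regions. This is a minimal illustration under assumed choices (Gaussian kernel, fixed bandwidth, grid evaluation, nonparametric bootstrap), not the paper's exact procedure.

```python
import numpy as np

def kde(x_grid, sample, h):
    """Gaussian kernel density estimate evaluated on a 1-D grid."""
    u = (x_grid[:, None] - sample[None, :]) / h
    return np.exp(-0.5 * u**2).mean(axis=1) / (h * np.sqrt(2 * np.pi))

def level_set_confidence(sample, level, h=0.3, n_boot=500, alpha=0.05, rng=None):
    """Inner/outer plug-in confidence regions for the level set {f >= level},
    using a bootstrap estimate of the (1 - alpha)-quantile of
    sup_x |f*_n(x) - f_n(x)|.  All tuning choices here are placeholders."""
    rng = np.random.default_rng(rng)
    grid = np.linspace(sample.min() - 1.0, sample.max() + 1.0, 400)
    f_hat = kde(grid, sample, h)
    sups = np.empty(n_boot)
    for b in range(n_boot):
        resample = rng.choice(sample, size=sample.size, replace=True)
        sups[b] = np.abs(kde(grid, resample, h) - f_hat).max()
    q = np.quantile(sups, 1 - alpha)
    inner = grid[f_hat >= level + q]   # lower confidence limit for the set
    outer = grid[f_hat >= level - q]   # upper confidence limit for the set
    return inner, outer, q
```

By construction the inner region is contained in the outer region, and the true level set is sandwiched between them with asymptotic probability at least 1 − alpha when the bootstrap quantile is consistent.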
We study Gaussian and robust covariance estimation, assuming the true covariance matrix to be a Kronecker product of two lower-dimensional square matrices. In both settings we define the estimators as solutions to constrained maximum likelihood programs. In the robust case, we consider Tyler's estimator, defined as the maximum likelihood estimator of a certain distribution on a sphere. We develop tight sufficient conditions for the existence and uniqueness of the estimates and show that in the Gaussian scenario with unknown mean, p/q + q/p + 2 samples are almost surely enough to guarantee existence and uniqueness, where p and q are the dimensions of the Kronecker product factors. In the robust case with known mean, the corresponding sufficient number of samples is max[p/q, q/p] + 1.
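The abstract concerns existence and uniqueness rather than computation, but a Kronecker-structured Gaussian MLE is commonly computed by alternating ("flip-flop") updates of the two factors. The sketch below illustrates that standard approach, not the paper's own contribution; the conventions (each sample reshaped to a p × q matrix, identity initialization, fixed iteration count, Σ = A ⊗ B up to scale) are assumptions.

```python
import numpy as np

def kron_gaussian_mle(X, p, q, n_iter=50):
    """Flip-flop iterations for a Kronecker-structured Gaussian MLE.

    X is an (n, p*q) data matrix; each row is treated as the
    vectorization of a p x q matrix.  The factors (A, B) are only
    identified up to a scale swap A -> cA, B -> B/c."""
    n = X.shape[0]
    Xc = X - X.mean(axis=0)        # unknown mean: center the data
    M = Xc.reshape(n, p, q)        # each centered sample as a p x q matrix
    A, B = np.eye(p), np.eye(q)
    for _ in range(n_iter):
        Binv = np.linalg.inv(B)
        # A-update: (1/(n q)) * sum_i  M_i B^{-1} M_i^T
        A = np.einsum('nij,jk,nlk->il', M, Binv, M) / (n * q)
        Ainv = np.linalg.inv(A)
        # B-update: (1/(n p)) * sum_i  M_i^T A^{-1} M_i
        B = np.einsum('nji,jk,nkl->il', M, Ainv, M) / (n * p)
    return A, B
```

Each update is the exact maximizer of the likelihood in one factor with the other held fixed, so the likelihood is nondecreasing along the iterations; the sample-size thresholds in the abstract are precisely the conditions under which this program has a unique solution.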
In this paper, we address the problem of regression estimation in the context of a high-dimensional predictor. We propose a general model in which the regression function is a composite function. Our model is a nonlinear extension of the usual sufficient dimension reduction setting. The strategy followed for estimating the regression function is based on the estimation of a new parameter, called the reduced dimension. We adopt a minimax point of view and provide both lower and upper bounds for the optimal rates of convergence for the estimation of the regression function in the context of our model. We prove that our estimator adapts, in the minimax sense, to the unknown value of the reduced dimension and therefore achieves fast rates of convergence.
Recently, Lao and Mayer (2008) considered U-max-statistics, in which the maximum of a kernel over the set of indices is studied instead of the usual sums. Such statistics emerge frequently in stochastic geometry; examples include the largest distance between random points in a ball, the maximal diameter of a random polygon, and the largest scalar product within a sample of points. Their limit distributions are related to the distributions of extreme values. Among the results obtained by Lao and Mayer, the limit theorems for the maximal perimeter and the maximal area of random triangles inscribed in a circle are of particular interest. In the present paper, we generalize these theorems to convex m-polygons, m ≥ 3, with random vertices on the circle. In addition, we solve the analogous problem for the minimal perimeter and the minimal area of circumscribed m-polygons, which has not been studied in the literature so far.
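The inscribed-triangle U-max-statistic is easy to simulate by brute force over all vertex triples, which makes the extreme-value behavior visible numerically. The sketch below is illustrative only and assumes vertices uniform on the unit circle; for m = 3 the perimeter of any inscribed polygon is bounded by 3√3, the perimeter of the equilateral triangle, which is the value the maximum approaches as the sample grows.

```python
import numpy as np
from itertools import combinations

def max_inscribed_perimeter(n_points, m=3, rng=None):
    """U-max-statistic: largest perimeter over all m-vertex polygons
    formed by n random points on the unit circle (m = 3: triangles)."""
    rng = np.random.default_rng(rng)
    theta = np.sort(rng.uniform(0.0, 2.0 * np.pi, n_points))
    pts = np.column_stack([np.cos(theta), np.sin(theta)])
    best = 0.0
    for idx in combinations(range(n_points), m):
        poly = pts[list(idx)]            # vertices already in angular order
        edges = poly - np.roll(poly, -1, axis=0)
        best = max(best, np.linalg.norm(edges, axis=1).sum())
    return best
```

Because the points are sorted by angle, every selected subset is automatically a convex m-polygon with vertices in order, so summing consecutive edge lengths gives the correct perimeter. The O(n^m) enumeration is only feasible for small samples, which suffices for illustration.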