Total publications in this section: 7
Article
Gushchin A. Mathematical Methods of Statistics. 2015. Vol. 24. No. 2. P. 110-121.

We consider the problem of testing two composite hypotheses in the minimax setting. To find maximin tests, we propose a new dual optimization problem which has a solution under a mild additional assumption. This allows us to characterize maximin tests in considerable generality. We give a simple example where the null hypothesis and the alternative are strictly separated; nevertheless, a maximin test is purely randomized.

Added: Jun 18, 2015
Article
Nikitin Y. Y., Volkova K. Y. Mathematical Methods of Statistics. 2016. Vol. 25. No. 1. P. 54-66.

We build new tests of the composite hypothesis of exponentiality which are functionals of U-empirical measures and which are closely related to, and inspired by, a special property of the exponential law. We study limiting distributions, large deviations, and asymptotic efficiency of the new tests. Most favorable alternatives are described. Finally, using our tests, we reject the hypothesis of exponentiality of the lengths of reigns of Roman emperors, which has been actively discussed in recent years.

Added: Mar 23, 2016
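As an illustration of the U-empirical approach described in the abstract above, one can build a test on the characterization that for an Exp(lam) law, 2*min(X1, X2) has the same distribution as X1 (the minimum of two iid Exp(lam) variables is Exp(2*lam)). The sketch below is a minimal stdlib example of this idea, not the exact statistic from the paper; the sample sizes and distributions are illustrative assumptions.

```python
import bisect
import random
from itertools import combinations

def ecdf_sup_distance(sample_a, sample_b):
    """Sup-distance between the empirical CDFs of two samples."""
    sa, sb = sorted(sample_a), sorted(sample_b)
    na, nb = len(sa), len(sb)
    d = 0.0
    for t in sa + sb:
        fa = bisect.bisect_right(sa, t) / na
        fb = bisect.bisect_right(sb, t) / nb
        d = max(d, abs(fa - fb))
    return d

def exponentiality_statistic(xs):
    """Distance between the U-empirical law of 2*min(X_i, X_j) over all
    pairs and the empirical law of the X_i themselves; for exponential
    data the two laws coincide, so a large value signals a departure."""
    doubled_mins = [2.0 * min(a, b) for a, b in combinations(xs, 2)]
    return ecdf_sup_distance(doubled_mins, xs)

random.seed(0)
exp_sample = [random.expovariate(1.0) for _ in range(200)]
uni_sample = [random.uniform(0.0, 2.0) for _ in range(200)]
stat_exp = exponentiality_statistic(exp_sample)  # close to 0 under exponentiality
stat_uni = exponentiality_statistic(uni_sample)  # bounded away from 0
print(stat_exp, stat_uni)
```

The paper's actual statistics are functionals of U-empirical measures with studied large-deviation asymptotics and efficiencies; the Kolmogorov-type distance above is only one simple choice.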
Article
Molchanov S., Zhang Z., Zheng L. Mathematical Methods of Statistics. 2018. Vol. 27. No. 1. P. 60-70.

Modern information theory is largely developed in connection with random elements residing in large, complex, and discrete data spaces, or alphabets. Lacking natural metrization and hence moments, the associated probability and statistics theory must rely on information measures in the form of various entropies, for example, Shannon's entropy, mutual information and Kullback–Leibler divergence, which are functions of an entropic basis in the form of a sequence of entropic moments of varying order. The entropic moments collectively characterize the underlying probability distribution on the alphabet, and hence provide an opportunity to develop statistical procedures for their estimation. As such statistical development becomes an increasingly important line of research in modern data science, the relationship between the underlying distribution and the asymptotic behavior of the entropic moments, as the order increases, becomes a technical issue of fundamental importance. This paper offers a general methodology to capture the relationship between the rates of divergence of the entropic moments and the types of underlying distributions, for a special class of distributions. As an application of the established results, it is demonstrated that the asymptotic normality of the remarkable Turing's formula for missing probabilities holds under distributions with much thinner tails than those previously known.

Added: Nov 15, 2019
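Turing's formula mentioned in the abstract above is the classical Good–Turing estimator of the missing probability mass: N1/n, where N1 is the number of alphabet letters observed exactly once in a sample of size n. A minimal stdlib sketch, under the assumption of a geometric (thin-tailed) distribution on a countable alphabet:

```python
import random
from collections import Counter

def turing_missing_mass(sample):
    """Turing's (Good-Turing) estimate N1/n of the total probability of
    the letters not seen in the sample, where N1 counts letters that
    were observed exactly once."""
    counts = Counter(sample)
    n1 = sum(1 for c in counts.values() if c == 1)
    return n1 / len(sample)

# Geometric distribution on a countable alphabet: letter k has
# probability 2^{-(k+1)}, so the tail is thin.
random.seed(1)
def draw():
    k = 0
    while random.random() < 0.5:
        k += 1
    return k

sample = [draw() for _ in range(1000)]
estimate = turing_missing_mass(sample)
# True missing mass: total probability of letters absent from the sample
# (truncated at k = 60; the remainder is negligible).
seen = set(sample)
true_missing = sum(2.0 ** -(k + 1) for k in range(60) if k not in seen)
print(estimate, true_missing)
```

The paper concerns the asymptotic normality of this estimator; the sketch only shows that the plug-in quantity tracks the true missing mass for a thin-tailed alphabet.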
Article
Gushchin A. A., Küchler U. Mathematical Methods of Statistics. 2003. Vol. 12. No. 1. P. 31-61.
Added: Oct 7, 2013
Article
Kleptsyna M., Veretennikov A. Mathematical Methods of Statistics. 2016. Vol. 25. No. 3. P. 207-218.

A new result is established on the stability of an optimal nonlinear filter for a Markov chain with respect to small perturbations at every step. Exponential recurrence of the signal is assumed.

Added: Oct 16, 2016
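The flavor of such a stability result can be illustrated with a discrete two-state hidden Markov chain: run the optimal (Bayes) filter alongside a filter whose transition matrix is slightly perturbed at every step, and observe that the discrepancy stays of the order of the perturbation rather than accumulating. A minimal sketch under hypothetical model parameters (the transition and emission matrices below are illustrative assumptions, not from the paper):

```python
import random

def normalize(v):
    s = sum(v)
    return [x / s for x in v]

def filter_step(prior, trans, emit, obs):
    """One step of the Bayes filter for a two-state hidden Markov chain:
    predict with the transition matrix, then update with the likelihood
    of the current observation."""
    pred = [sum(prior[i] * trans[i][j] for i in range(2)) for j in range(2)]
    return normalize([pred[j] * emit[j][obs] for j in range(2)])

TRANS = [[0.7, 0.3], [0.2, 0.8]]   # well-mixing signal chain
EMIT = [[0.7, 0.3], [0.3, 0.7]]    # emit[state][observation]
EPS = 1e-3                         # size of the per-step model perturbation

# Simulate the signal and its binary observations.
random.seed(2)
state, obs_seq = 0, []
for _ in range(200):
    state = 0 if random.random() < TRANS[state][0] else 1
    obs_seq.append(0 if random.random() < EMIT[state][0] else 1)

exact, perturbed = [0.5, 0.5], [0.5, 0.5]
for obs in obs_seq:
    exact = filter_step(exact, TRANS, EMIT, obs)
    # The perturbed filter uses a slightly wrong transition matrix at
    # every step (rows still sum to one).
    bad_trans = [[TRANS[i][0] + (EPS if i == 0 else -EPS),
                  TRANS[i][1] - (EPS if i == 0 else -EPS)] for i in range(2)]
    perturbed = filter_step(perturbed, bad_trans, EMIT, obs)

tv = 0.5 * sum(abs(a - b) for a, b in zip(exact, perturbed))
print(tv)  # remains small, of the order of EPS
```

The paper proves a statement of this kind rigorously, under exponential recurrence of the signal; the simulation is only a numerical illustration.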
Article
Esaulov D. Mathematical Methods of Statistics. 2013. Vol. 22. No. 4. P. 333-349.

In this paper, a new nonparametric generalized M-test for hypotheses about the order of a linear autoregression AR(p) is constructed. We also establish robustness of this test in a model of data contamination by independent additive outliers with intensity O(n^{-1/2}). Robustness is formulated in terms of limiting power equicontinuity. The test statistics are constructed with the help of residual empirical processes, and we establish the asymptotic uniform linearity of these processes in the defined contamination model.

Added: Oct 19, 2016
Article
Gushchin A. A. Mathematical Methods of Statistics. 2016. Vol. 25. No. 4. P. 304-312.
Let (P_i, Q_i), i = 0, 1, be two pairs of probability measures defined on measurable spaces (Ω_i, F_i) respectively. Assume that the pair (P_1, Q_1) is more informative than (P_0, Q_0) for testing problems. This amounts to saying that I_f(P_1, Q_1) ≥ I_f(P_0, Q_0), where I_f(·, ·) is an arbitrary f-divergence. We find a precise lower bound for the increment of f-divergences I_f(P_1, Q_1) − I_f(P_0, Q_0) provided that the total variation distances ||Q_1 − P_1|| and ||Q_0 − P_0|| are given. This optimization problem can be reduced to the case where P_1 and Q_1 are defined on a space consisting of four points, and P_0 and Q_0 are obtained from P_1 and Q_1, respectively, by merging two of these four points. The result includes the well-known lower and upper bounds for I_f(P, Q) given ||Q − P||.
Added: Dec 29, 2016
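The merging construction in the abstract above, and the classical bounds it generalizes, can be checked numerically. Below is a sketch using the Kullback–Leibler divergence (the f-divergence with f(t) = t log t) on a hypothetical four-point example: merging two points cannot increase an f-divergence (data processing), and Pinsker's inequality gives the familiar lower bound in terms of ||Q − P||. The specific measures are illustrative assumptions.

```python
import math

def kl(p, q):
    """Kullback-Leibler divergence, the f-divergence with f(t) = t*log(t)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def total_variation(p, q):
    """Total variation distance ||Q - P|| in the L1 normalization used in
    the abstract, taking values in [0, 2]."""
    return sum(abs(pi - qi) for pi, qi in zip(p, q))

# A four-point space, as in the reduction described in the abstract.
P1 = [0.4, 0.3, 0.2, 0.1]
Q1 = [0.1, 0.2, 0.3, 0.4]
# Merge the last two points to obtain a coarser, less informative pair.
P0 = [0.4, 0.3, 0.3]
Q0 = [0.1, 0.2, 0.7]

v1, v0 = total_variation(P1, Q1), total_variation(P0, Q0)
d1, d0 = kl(P1, Q1), kl(P0, Q0)
print(d1 >= d0)            # merging points cannot increase the divergence
print(d1 >= v1 ** 2 / 2)   # Pinsker: KL >= ||Q - P||^2 / 2
```

Here both total variation distances equal 0.8 (the merged points have differences of the same sign), so the example fits the setting of the abstract, where ||Q_1 − P_1|| and ||Q_0 − P_0|| are held fixed.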