Publications in this section: 7
Article
Gushchin A. Mathematical Methods of Statistics. 2015. Vol. 24. No. 2. P. 110-121.

We consider the problem of testing two composite hypotheses in the minimax setting. To find maximin tests, we propose a new dual optimization problem, which has a solution under a mild additional assumption. This allows us to characterize maximin tests in considerable generality. We give a simple example in which the null hypothesis and the alternative are strictly separated, and yet the maximin test is purely randomized.
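The need for randomization in optimal tests can be illustrated with a classical textbook example (not the paper's construction): a Neyman–Pearson test between two simple binomial hypotheses, where the discreteness of the statistic forces randomization at the boundary to attain an exact level. A minimal sketch; the function name and parameters are illustrative:

```python
from math import comb

def np_randomized_test(n, p0, alpha):
    """Neyman-Pearson test of Binomial(n, p0) against a larger p:
    reject when X > c, and reject with probability gamma when X == c,
    so that the size equals alpha exactly.  Randomization is needed
    because the binomial distribution is discrete."""
    pmf = [comb(n, k) * p0**k * (1 - p0)**(n - k) for k in range(n + 1)]
    tail = 0.0          # P(X > c) under the null, built from the top down
    c = n
    while c >= 0 and tail + pmf[c] <= alpha:
        tail += pmf[c]
        c -= 1
    gamma = (alpha - tail) / pmf[c]   # boundary rejection probability
    return c, gamma
```

For example, with n = 10, p0 = 0.5, and alpha = 0.05, the test rejects outright for X > 8 and rejects with probability roughly 0.893 when X = 8, yielding size exactly 0.05.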

Article
Nikitin Y. Y., Volkova K. Y. Mathematical Methods of Statistics. 2016. Vol. 25. No. 1. P. 54-66.

We construct new tests of the composite hypothesis of exponentiality that are functionals of U-empirical measures, closely related to and inspired by a special property of the exponential law. We study the limiting distributions, large deviations, and asymptotic efficiency of the new tests, and describe the most favorable alternatives. Finally, using our tests, we reject the hypothesis of exponentiality for the lengths of reigns of Roman emperors, which has been actively discussed in recent years.
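Characterization-based tests of this kind compare an empirical functional against the value it must take under the exponential law. As a crude illustration only (not the paper's U-statistic), one can exploit the lack-of-memory property P(X > s + t) = P(X > s) P(X > t), which holds exactly for the exponential distribution, by measuring its worst violation by the empirical survival function; the function name and grid choice are assumptions:

```python
import numpy as np

def memorylessness_statistic(x, grid_size=20):
    """Crude characterization-based discrepancy for exponentiality:
    the maximum over a grid of |S_n(s + t) - S_n(s) * S_n(t)|, where
    S_n is the empirical survival function.  Under the exponential law
    the population version of this quantity is identically zero."""
    x = np.asarray(x, dtype=float)

    def surv(t):
        return np.mean(x > t)  # empirical survival function at t

    # Evaluate on a grid of sample quantiles to stay in the data range.
    grid = np.quantile(x, np.linspace(0.05, 0.95, grid_size))
    stat = 0.0
    for s in grid:
        for t in grid:
            stat = max(stat, abs(surv(s + t) - surv(s) * surv(t)))
    return stat
```

For exponential data the statistic is small (vanishing as the sample grows), while for, say, uniform data the memorylessness identity fails and the statistic stays bounded away from zero; critical values would in practice be calibrated by Monte Carlo.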

Article
Molchanov S., Zhang Z., Zheng L. Mathematical Methods of Statistics. 2018. Vol. 27. No. 1. P. 60-70.

Modern information theory is largely developed in connection with random elements residing in large, complex, and discrete data spaces, or alphabets. Lacking natural metrization and hence moments, the associated probability and statistics theory must rely on information measures in the form of various entropies, for example, Shannon’s entropy, mutual information and Kullback–Leibler divergence, which are functions of an entropic basis in the form of a sequence of entropic moments of varying order. The entropic moments collectively characterize the underlying probability distribution on the alphabet, and hence provide an opportunity to develop statistical procedures for their estimation. As such statistical development becomes an increasingly important line of research in modern data science, the relationship between the underlying distribution and the asymptotic behavior of the entropic moments, as the order increases, becomes a technical issue of fundamental importance. This paper offers a general methodology to capture the relationship between the rates of divergence of the entropic moments and the types of underlying distributions, for a special class of distributions. As an application of the established results, it is demonstrated that the asymptotic normality of the remarkable Turing’s formula for missing probabilities holds under distributions with much thinner tails than those previously known.
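The objects in this abstract are all elementary to compute from a distribution or a sample. A minimal sketch of the standard definitions, assuming the usual conventions (entropic moment of order k as the sum of k-th powers of the probabilities, and Turing's formula estimating the total missing probability by the proportion of singletons):

```python
from collections import Counter
import math

def shannon_entropy(probs):
    """Shannon entropy, in bits, of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def entropic_moment(probs, k):
    """Entropic moment of order k: the sum of p_i**k over the alphabet."""
    return sum(p ** k for p in probs)

def turing_missing_mass(sample):
    """Turing's formula: estimate the total probability of symbols
    never observed in the sample by (number of symbols seen exactly
    once) / (sample size)."""
    counts = Counter(sample)
    n1 = sum(1 for c in counts.values() if c == 1)
    return n1 / len(sample)
```

For the uniform distribution on four symbols, for instance, the entropy is 2 bits and the entropic moment of order 2 is 0.25; for the sample ['a', 'a', 'b', 'c'] Turing's formula gives 2/4 = 0.5.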

Article
Gushchin A. A., Küchler U. Mathematical Methods of Statistics. 2003. Vol. 12. No. 1. P. 31-61.
Article
Kleptsyna M., Veretennikov A. Mathematical Methods of Statistics. 2016. Vol. 25. No. 3. P. 207-218.

A new result on the stability of an optimal nonlinear filter for a Markov chain with respect to small perturbations at every step is established. Exponential recurrence of the signal is assumed.
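For a finite-state chain, one step of the optimal nonlinear filter in question is the standard Bayes recursion: propagate the current posterior through the transition matrix, reweight by the likelihoods of the new observation, and renormalize. A minimal sketch (the stability result concerns how errors injected into this recursion at each step stay controlled; the function name is an assumption):

```python
import numpy as np

def filter_step(pi, P, g):
    """One step of the optimal nonlinear (Bayes) filter for a finite
    Markov chain.

    pi : current posterior over states (length-m probability vector)
    P  : m x m transition matrix, P[i, j] = P(next = j | current = i)
    g  : likelihoods g[j] = p(observation | state j) for the new observation
    """
    pred = P.T @ pi          # prediction: push the posterior one step forward
    post = pred * g          # correction: reweight by observation likelihoods
    return post / post.sum() # renormalize to a probability vector
```

With an uninformative observation (equal likelihoods) the step reduces to pure prediction, while an informative observation tilts the predicted distribution toward the states that explain it best.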