### Article

## Generalization of Cramér-Rao and Bhattacharyya inequalities for the weighted covariance matrix

The paper considers a family of probability distributions depending on a parameter. The goal is to derive generalized versions of the Cramér-Rao and Bhattacharyya inequalities for the weighted covariance matrix, and of the Kullback inequality for the weighted Kullback distance, which are important objects in themselves [9, 23, 28]. The asymptotic forms of these inequalities are given for a particular family of probability distributions and for a particular class of continuous weight functions.
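As a point of reference for the generalization above, the classical (unweighted) Cramér-Rao inequality can be checked numerically in the simplest setting. The sketch below uses a Bernoulli family, where the bound is attained by the sample mean; it is an illustration of the standard inequality, not of the weighted version derived in the paper.

```python
import numpy as np

# Classical (unweighted) special case of the inequality discussed above:
# for a Bernoulli(p) family the Fisher information is I(p) = 1/(p(1-p)),
# and the sample mean is an unbiased estimator whose variance attains the
# Cramer-Rao lower bound 1/(n I(p)). The weighted covariance matrix of the
# paper generalizes the plain variance appearing here.

def fisher_information_bernoulli(p):
    """Fisher information of a single Bernoulli(p) observation."""
    return 1.0 / (p * (1.0 - p))

def cramer_rao_bound(p, n):
    """Lower bound on the variance of any unbiased estimator of p from n i.i.d. trials."""
    return 1.0 / (n * fisher_information_bernoulli(p))

rng = np.random.default_rng(0)
p, n, reps = 0.3, 200, 20_000
estimates = rng.binomial(1, p, size=(reps, n)).mean(axis=1)  # sample mean per experiment
empirical_var = estimates.var()
bound = cramer_rao_bound(p, n)   # equals p(1-p)/n for this family

print(empirical_var, bound)      # the sample mean attains the bound, so these are close
```

Since the sample mean is efficient for the Bernoulli family, the simulated variance matches the bound up to Monte Carlo noise.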

Consider a Bayesian problem of estimating the probability of success in a series of trials with binary outcomes. We study the asymptotic behaviour of the weighted differential entropy for the posterior probability density function (PDF) conditional on x successes after n trials, as n → ∞. Suppose that one is interested in knowing whether the coin is fair or not and, for large n, in the true frequency. In other words, one wants to emphasize the parameter value p = 1/2. To this end, the concept of weighted differential entropy introduced in [1968] is used when the frequency γ needs to be emphasized. It was found that the weight in the suggested form does not change the asymptotic form of the Shannon, Rényi, Tsallis and Fisher entropies, but changes the constants. The leading term in the weighted Fisher information is changed by a constant which depends on the distance between the true frequency and the value we want to emphasize.
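To make the setting concrete: with a uniform prior, the posterior after x successes in n trials is Beta(x+1, n−x+1), and the weighted differential entropy can be evaluated numerically. The Gaussian-shaped weight centred at γ = 1/2 below is a hypothetical stand-in for the paper's weight function, chosen only to illustrate the emphasis on p = 1/2.

```python
import math
import numpy as np

# Numerical sketch of the weighted differential entropy
#   h_phi(f) = - \int phi(p) f(p) log f(p) dp
# for the Beta(x+1, n-x+1) posterior of the success probability under a
# uniform prior. The Gaussian-shaped weight centred at gamma = 1/2 is an
# illustrative choice only, not the exact weight function of the paper.

def trapezoid(y, x):
    """Composite trapezoidal rule."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

def beta_logpdf(p, a, b):
    log_beta = math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b)
    return (a - 1) * np.log(p) + (b - 1) * np.log(1 - p) - log_beta

def weighted_entropy(x, n, phi, grid=200_000):
    p = np.linspace(1e-9, 1.0 - 1e-9, grid)
    logf = beta_logpdf(p, x + 1, n - x + 1)  # posterior after x successes in n trials
    f = np.exp(logf)
    return trapezoid(-phi(p) * f * logf, p)

phi = lambda p: np.exp(-50.0 * (p - 0.5) ** 2)  # emphasizes p near gamma = 1/2 (assumed form)
unweighted = weighted_entropy(120, 200, lambda p: np.ones_like(p))  # ordinary differential entropy
weighted = weighted_entropy(120, 200, phi)
print(unweighted, weighted)
```

With the constant weight φ ≡ 1 the quantity reduces to the ordinary differential entropy of the posterior; a weight concentrated at 1/2 down-weights the contribution of the posterior mode when the observed frequency sits away from 1/2.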

We study Sobolev a priori estimates for the optimal transportation $T = \nabla \Phi$ between probability measures $\mu = e^{-V} \, dx$ and $\nu = e^{-W} \, dx$ on $\mathbb{R}^d$. Assuming uniform convexity of the potential $W$, we show that $\int \| D^2 \Phi\|^2_{HS} \, d\mu$, where $\|\cdot\|_{HS}$ is the Hilbert-Schmidt norm, is controlled by the Fisher information of $\mu$. In addition, we prove a similar estimate for the $L^p(\mu)$-norms of $\|D^2 \Phi\|$ and obtain some $L^p$-generalizations of the well-known Caffarelli contraction theorem. We establish a connection of our results with the Talagrand transportation inequality. We also prove a corresponding dimension-free version for the relative Fisher information with respect to a Gaussian measure.
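In one dimension with Gaussian endpoints, all the objects above are explicit, which gives a cheap sanity check. The constant $1/K$ used below (with $K$ the uniform convexity constant of $W$) is a guessed form for illustration, not necessarily the constant obtained in the paper; in this Gaussian example it happens to make the two sides equal.

```python
import numpy as np

# One-dimensional sanity check of the type of estimate described above.
# For mu = N(0, s1^2) and nu = N(0, s2^2), the optimal transport map is
# linear, T(x) = (s2/s1) x, so D^2 Phi is the constant s2/s1. The target
# potential W(x) = x^2 / (2 s2^2) is uniformly convex with W'' = 1/s2^2 =: K,
# and the Fisher information of mu is I(mu) = 1/s1^2. The constant 1/K is
# a guess for illustration; here (1/K) * I(mu) equals the left-hand side.

s1, s2 = 1.5, 0.7
rng = np.random.default_rng(1)
x = rng.normal(0.0, s1, size=1_000_000)     # samples from mu

d2phi = s2 / s1                             # D^2 Phi, constant in the Gaussian case
lhs = np.mean(np.full_like(x, d2phi) ** 2)  # int ||D^2 Phi||_HS^2 dmu
K = 1.0 / s2**2                             # uniform convexity constant of W
fisher_mu = np.mean((x / s1**2) ** 2)       # I(mu) = int |V'|^2 dmu, V = x^2/(2 s1^2)

print(lhs, fisher_mu / K)                   # both equal (s2/s1)^2 here
```

The Monte Carlo estimate of the Fisher information agrees with the closed form $1/s_1^2$ up to sampling error, so the two printed numbers coincide to a few decimal places.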

We establish an expansion by Gamma-convergence of the Fisher information relative to the reference measure $e^{-\beta V}\,dx$, where $V$ is a generic multiwell potential and $\beta \to \infty$. The expansion reveals a hierarchy of scales reflecting the metastable behavior of the underlying overdamped Langevin dynamics: distinct scales emerge and become relevant depending on whether one considers probability measures concentrated on local minima of $V$, probability measures concentrated on critical points of $V$, or generic probability measures on $\mathbb{R}^d$. We thus fully describe the asymptotic behavior of minima of the Fisher information over regular sets of probabilities. The analysis mostly relies on spectral properties of diffusion operators and the related semiclassical Witten Laplacian, and also covers the case of a compact smooth manifold as underlying space.
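The separation of scales can be seen already in a one-dimensional toy computation. The sketch below evaluates the relative Fisher information of a narrow Gaussian with respect to $e^{-\beta V}\,dx$ for the standard double-well $V(x) = (x^2-1)^2$; the potential, widths, and $\beta$ are illustrative choices, not taken from the paper.

```python
import numpy as np

# Sketch of the relative Fisher information
#   I(rho | mu_beta) = \int | d/dx log(rho / mu_beta) |^2 rho dx,
# with mu_beta proportional to exp(-beta V), for the double-well
# V(x) = (x^2 - 1)^2. The normalizing constant of mu_beta drops out of
# the score, so it never needs computing. A narrow Gaussian sitting at
# the minimum x = 1 has much smaller relative Fisher information than
# one at the non-critical point x = 0.5, where I grows like
# beta^2 |V'(0.5)|^2 -- the separation of scales mentioned above.

V_prime = lambda x: 4.0 * x * (x**2 - 1.0)

def relative_fisher(center, sigma, beta, grid=400_000, half_width=8.0):
    x = np.linspace(center - half_width, center + half_width, grid)
    rho = np.exp(-(x - center) ** 2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))
    score = -(x - center) / sigma**2 + beta * V_prime(x)  # d/dx log(rho / mu_beta)
    y = score**2 * rho
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

beta = 50.0
at_minimum = relative_fisher(1.0, 0.1, beta)    # centred at a critical point of V
off_critical = relative_fisher(0.5, 0.1, beta)  # V'(0.5) = -1.5 != 0
print(at_minimum, off_critical)
```

Measures concentrated on critical points pay no $\beta^2 |V'|^2$ price, which is the first step in the hierarchy the abstract describes.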

The concept of weighted entropy takes into account values of different outcomes, i.e., makes entropy context-dependent, through the weight function. We analyse analogs of the Fisher information inequality and entropy-power inequality for the weighted entropy and discuss connections with weighted Lieb’s splitting inequality. The concepts of rates of the weighted entropy and information are also discussed.
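In the discrete case the definition is a one-liner, which makes the context-dependence easy to see. The weight vector below is an arbitrary illustrative choice.

```python
import math

# Weighted (context-dependent) entropy of a discrete distribution:
#   h_phi(p) = - sum_x phi(x) p(x) log p(x).
# With phi identically 1 this is the ordinary Shannon entropy (in nats);
# a non-constant phi makes outcomes with larger weight contribute more.

def weighted_entropy(probs, weights):
    return -sum(w * p * math.log(p) for p, w in zip(probs, weights) if p > 0)

p = [0.5, 0.25, 0.25]
uniform_weight = [1.0, 1.0, 1.0]
value_weight = [0.2, 1.0, 2.0]  # arbitrary weights: the third outcome matters most

shannon = weighted_entropy(p, uniform_weight)  # ordinary Shannon entropy, 1.5 bits = 1.5 ln 2 nats
weighted = weighted_entropy(p, value_weight)
print(shannon, weighted)
```

Up-weighting the rarer outcomes raises the weighted entropy relative to the unweighted one, even though the underlying distribution is unchanged.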

Consider a Bayesian problem of success probability estimation in a series of conditionally independent trials with binary outcomes. We study the asymptotic behaviour of the weighted differential entropy for the posterior probability density function conditional on x successes after n conditionally independent trials, as n tends to infinity. Suppose that one is interested in knowing with high precision whether the coin is approximately fair and, for large n, in the true frequency. In other words, the statistical decision is particularly sensitive in a small neighbourhood of the particular value γ = 1/2. For this aim, the concept of weighted differential entropy is used when the frequency γ needs to be emphasized. It was found that the weight in the suggested form does not change the asymptotic form of the Shannon, Rényi, Tsallis and Fisher entropies, but changes the constants. The leading term in the weighted Fisher information is changed by a constant which depends on the distance between the true frequency and the value we want to emphasize.

This proceedings publication is a compilation of selected contributions from the “Third International Conference on the Dynamics of Information Systems,” which took place at the University of Florida, Gainesville, February 16–18, 2011. The purpose of this conference was to bring together scientists and engineers from industry, government, and academia in order to exchange new discoveries and results in a broad range of topics relevant to the theory and practice of dynamics of information systems. Dynamics of Information Systems: Mathematical Foundation presents state-of-the-art research and is intended for graduate students and researchers interested in some of the most recent discoveries in information theory and dynamical systems. Scientists in other disciplines may also benefit from the applications of new developments to their own area of study.