On joint conditional complexity (Entropy)
The conditional Kolmogorov complexity of a word $a$ given a word $b$ is the minimum length of a program that prints $a$ given $b$ as input. We generalize this notion to quadruples of strings $a,b,c,d$: their joint conditional complexity $\K((a\to c)\land(b\to d))$ is defined as the minimum length of a program that outputs $c$ given $a$ and outputs $d$ given $b$. In this paper, we prove that joint conditional complexity cannot be expressed in terms of the usual conditional (and unconditional) Kolmogorov complexity. This result provides a negative answer to the following question, asked by A. Shen at a session of the Kolmogorov seminar at Moscow State University in 1994: Is there a problem of information processing whose complexity is not expressible in terms of conditional (and unconditional) Kolmogorov complexity? We show that a similar result holds for classical Shannon entropy. We provide two proofs of both results: an effective one and a ``quasi-effective'' one. Finally, we present a quasi-effective proof of a strong version of the following statement: there are two strings whose mutual information cannot be extracted. Previously, only a non-effective proof of that statement was known. The results concerning Kolmogorov complexity appeared, in a preliminary form, in the Proceedings of the 16th Annual IEEE Conference on Computational Complexity in 2001. [A. Muchnik and N. Vereshchagin. ``Logical operations and Kolmogorov complexity. II''. Proc. of 16th Annual IEEE Conference on Computational Complexity, Chicago, June 2001, pp. 256--265.]
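For readers less familiar with the Shannon-entropy analogue: the mutual information mentioned above, $I(X;Y)=H(X)+H(Y)-H(X,Y)$, is computable for any finite joint distribution, unlike Kolmogorov complexity. A minimal Python sketch (the joint distribution here is an illustrative toy example, not taken from the paper):

```python
import math

def entropy(probs):
    """Shannon entropy in bits; zero-probability outcomes are skipped."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical correlated joint distribution p(x, y), for illustration only.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

# Marginal distributions p(x) and p(y).
px, py = {}, {}
for (x, y), p in joint.items():
    px[x] = px.get(x, 0.0) + p
    py[y] = py.get(y, 0.0) + p

h_x = entropy(px.values())
h_y = entropy(py.values())
h_xy = entropy(joint.values())

# Mutual information: I(X;Y) = H(X) + H(Y) - H(X,Y).
mi = h_x + h_y - h_xy
print(f"H(X)={h_x:.3f}  H(Y)={h_y:.3f}  H(X,Y)={h_xy:.3f}  I(X;Y)={mi:.3f}")
```

The non-extractability result for Kolmogorov complexity has no such elementary computation; the sketch only fixes the entropy-side notation.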