### Book

## Lecture Notes in Computer Science (Vol. 10495, Proceedings of the 5th International Castle Meeting on Coding Theory and Applications, 2017)

This book constitutes the refereed proceedings of the 5th International Castle Meeting on Coding Theory and Applications, ICMCTA 2017, held in Vihula, Estonia, in August 2017.

The 24 full papers presented were carefully reviewed and selected for inclusion in this volume. The papers cover relevant research areas in modern coding theory, including codes and combinatorial structures, algebraic geometric codes, group codes, convolutional codes, network coding, other applications to communications, and applications of coding theory in cryptography.

We compare two popular traitor tracing schemes: (1) schemes using non-binary codes with the identifiable parent property (IPP codes), and (2) schemes using families of sets with the identifiable parent property. We establish a natural basis for comparison and show that the second approach is stronger than IPP codes. We also establish a new lower bound on the cardinality of a family of sets with the identifiable parent property.
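The identifiable parent property for coalitions of size two can be made concrete. The sketch below is an illustration of the definition, not a construction from the paper: it enumerates all descendants of every coalition and checks that each descendant implicates at least one common suspect parent.

```python
from itertools import combinations, product

def descendants(x, y):
    """All words obtainable by picking each coordinate from x or y."""
    return {tuple(z) for z in product(*zip(x, y))}

def has_ipp(code):
    """Check the identifiable parent property for coalitions of size <= 2:
    every descendant must implicate at least one common suspect parent."""
    code = [tuple(c) for c in code]
    coalitions = list(combinations(code, 2)) + [(c, c) for c in code]
    all_desc = {z for pair in coalitions for z in descendants(*pair)}
    for z in all_desc:
        suspects = [set(pair) for pair in coalitions if z in descendants(*pair)]
        if not set.intersection(*suspects):
            return False  # some descendant cannot be traced to any parent
    return True
```

For example, the ternary repetition code {00, 11, 22} has the IPP, while the full binary code of length 2 does not: the word 01 can be produced both by the codeword 01 alone and by the coalition {00, 11}, which share no member.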

We investigate one possible generalization of locally recoverable codes (LRC) with all-symbol locality and availability, in which recovering sets may intersect in a small number of coordinates. This feature allows us to increase the achievable code rate while still meeting load-balancing requirements. In this paper we derive an upper bound on the rate of such codes and give explicit constructions of codes with this property. These constructions build on the LRC codes developed by Wang et al.
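To make the locality idea concrete, here is a minimal toy sketch (my illustration, not the paper's construction): a code with one XOR parity per local group, so that every symbol, parity included, is recoverable from the r other symbols of its group. Availability with intersecting recovery sets, the subject of the paper, goes beyond this sketch.

```python
from functools import reduce
from operator import xor

def lrc_encode(data, r):
    """Toy LRC with all-symbol locality r: split the data into groups of
    r symbols and append one XOR parity symbol per group."""
    groups = [data[i:i + r] for i in range(0, len(data), r)]
    return [g + [reduce(xor, g)] for g in groups]

def lrc_recover(group, erased):
    """Recover the erased position as the XOR of the rest of its group."""
    return reduce(xor, (s for i, s in enumerate(group) if i != erased))
```

For instance, `lrc_encode([3, 5, 7, 2, 4, 6], 3)` yields the groups `[3, 5, 7, 1]` and `[2, 4, 6, 0]`, and any single erasure within a group is repaired locally by `lrc_recover`.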

Understanding the relation between (sensory) stimuli and the activity of neurons (i.e., "the neural code") lies at the heart of understanding the computational properties of the brain. However, quantifying the information between a stimulus and a spike train has proven to be challenging. We propose a new (in vitro) method to measure how much information a single neuron transfers from the input it receives to its output spike train. The input is generated by an artificial neural network that responds to a randomly appearing and disappearing "sensory stimulus": the hidden state. The sum of this network activity is injected as current input into the neuron under investigation. The mutual information between the hidden state on the one hand and spike trains of the artificial network or the recorded spike train on the other hand can easily be estimated due to the binary nature of the hidden state. The characteristics of the input current, such as the time constant resulting from the (dis)appearance rate of the hidden state or the amplitude of the input current (the firing frequency of the neurons in the artificial network), can be varied independently. As an example, we apply this method to pyramidal neurons in the CA1 region of the mouse hippocampus and compare the recorded spike trains to the optimal response of the "Bayesian neuron" (BN). We conclude that, as in the BN, information transfer in hippocampal pyramidal cells is non-linear and amplifying: the information loss between the artificial input and the output spike train is high if the input to the neuron (the firing of the artificial network) is not very informative about the hidden state. If the input to the neuron does contain a lot of information about the hidden state, the information loss is low. Moreover, neurons increase their firing rates when the (dis)appearance rate is high, so that the (relative) amount of transferred information stays constant.
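Because the hidden state is binary, the mutual information can be computed with a simple plug-in estimator on the joint histogram of (state, response) pairs. A minimal sketch of such an estimator (the function name and discretization are my own, not from the paper):

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """Plug-in estimate of I(X; Y) in bits for two equal-length discrete
    sequences (e.g., binary hidden state vs. binned spike counts)."""
    n = len(xs)
    pxy, px, py = Counter(zip(xs, ys)), Counter(xs), Counter(ys)
    # I(X;Y) = sum over joint outcomes of p(x,y) * log2(p(x,y) / (p(x) p(y)))
    return sum((c / n) * math.log2(c * n / (px[a] * py[b]))
               for (a, b), c in pxy.items())
```

A perfectly informative response recovers the full 1 bit carried by a balanced binary hidden state, while an independent response yields 0 bits; real spike trains fall in between, and the ratio of output to input information quantifies the loss discussed above.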

We address the problem of constructing coding schemes for channels with high-order modulations. It is known that non-binary LDPC codes are especially good for such channels and significantly outperform their binary counterparts. Unfortunately, their decoding complexity is still large. In order to reduce the decoding complexity, we consider multilevel coding schemes based on non-binary LDPC codes (NB-LDPC-MLC schemes) over smaller fields. The use of such schemes gives us a considerable reduction in complexity. At the same time, the performance of NB-LDPC-MLC schemes is practically the same as the performance of LDPC codes over the field matching the modulation order. In particular, by means of simulations, we show that the performance of NB-LDPC-MLC schemes over GF(16) is the same as the performance of non-binary LDPC codes over GF(64) and GF(256) in the AWGN channel with 64-QAM and 256-QAM, respectively. We also perform a comparison with bit-interleaved coded modulation based on binary LDPC codes.
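The multilevel principle itself is simple to state: each level of the modulation symbol label gets its own component code, and the coded level streams are merged bitwise into one label per symbol. The toy sketch below shows only the label-merging step (component NB-LDPC encoding, Gray labeling, and multistage decoding, as used in the paper, are omitted):

```python
def mlc_labels(coded_levels):
    """Multilevel coding, label-merging step: combine per-level coded bit
    streams (level 0 contributes the LSB) into 2^L-ary symbol labels."""
    n = len(coded_levels[0])
    assert all(len(c) == n for c in coded_levels), "levels must align"
    return [sum(c[i] << lvl for lvl, c in enumerate(coded_levels))
            for i in range(n)]
```

For example, three binary levels `[1, 0]`, `[0, 1]`, `[1, 1]` merge into the 8-ary labels `[5, 6]`; with a GF(16) component code, each code symbol would supply four consecutive label bits instead of one.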

Consider the Bayesian problem of estimating the probability of success in a series of trials with binary outcomes. We study the asymptotic behaviour of the weighted differential entropy of the posterior probability density function (PDF) conditioned on x successes after n trials, as n → ∞. Suppose that one is initially interested in whether the coin is fair and, for large n, is interested in the true frequency. In other words, one wants to emphasize the parameter value p = 1/2. To do so, we use the concept of weighted differential entropy, introduced in [1968], with a weight that emphasizes the frequency γ. We find that the suggested form of the weight does not change the asymptotic form of the Shannon, Rényi, Tsallis, and Fisher entropies, but changes the constants. The leading term of the weighted Fisher information changes by a constant that depends on the distance between the true frequency and the value we want to emphasize.
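With a uniform prior, the posterior after x successes in n trials is Beta(x+1, n−x+1), so the weighted differential entropy h_w(f) = −∫ w(p) f(p) ln f(p) dp can be checked numerically. A sketch under that assumption (any specific weight plugged in here is an illustrative choice, not necessarily the form analyzed in the paper):

```python
import math

def beta_log_pdf(p, a, b):
    """ln of the Beta(a, b) density at p, via log-gamma for stability."""
    return (math.lgamma(a + b) - math.lgamma(a) - math.lgamma(b)
            + (a - 1) * math.log(p) + (b - 1) * math.log(1 - p))

def weighted_entropy(a, b, weight, m=20000):
    """Midpoint-rule approximation of -int w(p) f(p) ln f(p) dp over (0, 1)
    for the Beta(a, b) posterior density f (midpoints avoid the endpoints)."""
    h, step = 0.0, 1.0 / m
    for i in range(m):
        p = (i + 0.5) * step
        lf = beta_log_pdf(p, a, b)
        h -= weight(p) * math.exp(lf) * lf * step
    return h
```

With the trivial weight w ≡ 1 this reduces to the ordinary differential entropy: it is 0 for the uniform Beta(1, 1) prior and about −0.562 nats for the concentrated Beta(6, 6) posterior (x = 5 successes in n = 10 trials), negative because the density exceeds 1 near p = 1/2.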

This collection constitutes the proceedings of the ninth international conference "Discrete Models in Control Systems Theory", held by Lomonosov Moscow State University and dedicated to the 90th anniversary of Sergey Vsevolodovich Yablonsky's birth. The conference topics include: discrete functional systems; properties of discrete functions; synthesis, complexity, reliability, and diagnostics of control systems; automata; graph theory; combinatorics; coding theory; mathematical methods of information security; pattern recognition theory; the mathematical theory of intelligent systems; and applied mathematical logic. The conference is sponsored by the Russian Foundation for Basic Research (project N 15-01-20193-г).

A phonetic word decoding method for automatic speech recognition is considered. The properties of the Kullback–Leibler divergence are used to derive an estimate of the distribution of the divergence between minimal speech units (e.g., single phonemes) within a single class. It is demonstrated that the minimum variance of the intraphonemic divergence is reached when the phonetic database is tuned to the voice of a single speaker. The estimates are supported by experimental results on the recognition of vowel sounds and isolated words of the Russian language.
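The decision rule underlying such phonetic decoding can be sketched as nearest-class matching under the Kullback–Leibler divergence. This is a generic illustration of the decision step, with hypothetical feature histograms; the paper's contribution concerns the distribution of this divergence within a phoneme class, which the sketch does not model:

```python
import math

def kl(p, q):
    """D(p || q) = sum p_i ln(p_i / q_i); assumes q_i > 0 wherever p_i > 0."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def nearest_phoneme(observed, models):
    """Pick the phoneme model whose reference feature distribution
    minimizes the KL divergence from the observed feature histogram."""
    return min(models, key=lambda name: kl(observed, models[name]))
```

For example, with two hypothetical vowel models `{"a": [0.7, 0.2, 0.1], "o": [0.2, 0.7, 0.1]}`, an observed histogram `[0.6, 0.3, 0.1]` is decoded as "a". Tuning the models to one speaker narrows the intraphonemic spread of these divergences, which is what the variance result above quantifies.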

We establish a new upper bound for the Kullback–Leibler divergence of two discrete probability distributions which are close in the sense that the ratio of their probabilities is typically near one and the number of outliers is small.
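This regime can be probed numerically. The classical chain of bounds D(P‖Q) ≤ log(1 + χ²(P, Q)) ≤ χ²(P, Q), shown below (not the new bound of the paper), already illustrates that the divergence of such close distributions is controlled by the squared relative deviations:

```python
import math

def kl(p, q):
    """D(p || q) = sum p_i ln(p_i / q_i), natural log."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def chi2(p, q):
    """Pearson chi-square distance: sum (p_i - q_i)^2 / q_i."""
    return sum((pi - qi) ** 2 / qi for pi, qi in zip(p, q))

# two close distributions: all probability ratios near one, no outliers
p = [0.26, 0.24, 0.25, 0.25]
q = [0.25, 0.25, 0.25, 0.25]
d, c = kl(p, q), chi2(p, q)
assert d <= math.log(1 + c) <= c  # classical bound holds
```

Here d ≈ 4.0e-4 while χ² = 8.0e-4, so the bound is within a factor of two of the true divergence in this near-one-ratio regime.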