We present the basic properties of a new pattern analysis method in parallel coordinates; the results of the method do not depend on the ordering of data in the original sample of objects being analyzed. We prove that clusters obtained with this method do not overlap. We also show the possibility of representing the objects of one cluster in the form of monotonically increasing/decreasing functions.

A prototype of isolated word recognition software based on the phonetic decoding method with the Kullback–Leibler divergence is presented. The architecture and basic algorithms of the software are described. Finally, an example of application to the problem of isolated word recognition is provided.
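The phonetic decoding method itself is not reproduced in the abstract; as a minimal illustration of its core ingredient, the discrete Kullback–Leibler divergence between two phoneme feature distributions can be computed as follows (the example distributions are hypothetical):

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """Kullback-Leibler divergence D(p || q) for discrete distributions.
    A small eps guards against zero probabilities before renormalizing."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))

# Hypothetical feature distributions of two phoneme frames
p = [0.1, 0.4, 0.3, 0.2]
q = [0.25, 0.25, 0.25, 0.25]
d = kl_divergence(p, q)  # strictly positive since p != q
```

Note that the divergence is asymmetric, D(p||q) ≠ D(q||p), which is why decoding methods must fix which argument plays the role of the reference model.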

This paper considers a voting problem in which the individual preferences of electors are defined by ranked lists of candidates. For single-winner elections, we apply the criterion of weak positional dominance (WPD, PD), which is closely related to the positional scoring rules. We also formulate the criterion of weak mutual majority (WMM), which is stronger than the majority criterion but weaker than the criterion of mutual majority (MM). Then we construct two modifications of the median voting rule that satisfy the Condorcet loser criterion. The first modification satisfies WPD and WMM, while the second satisfies PD and MM. We prove that no rule satisfies WPD and MM simultaneously. Finally, we check a list of 37 criteria for the constructed rules.
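The WPD criterion is tied to positional scoring rules. A minimal sketch of one such rule (Borda, with its usual weight vector; the ballots are hypothetical and not from the paper):

```python
from collections import Counter

def positional_score(rankings, weights):
    """Score each candidate under a positional scoring rule.
    rankings: list of ranked candidate lists (best first);
    weights: non-increasing score vector, one weight per position."""
    scores = Counter()
    for ranking in rankings:
        for pos, cand in enumerate(ranking):
            scores[cand] += weights[pos]
    return scores

# Borda weights for 3 candidates: (2, 1, 0)
ballots = [["a", "b", "c"], ["a", "c", "b"], ["b", "a", "c"]]
scores = positional_score(ballots, [2, 1, 0])
winner = max(scores, key=scores.get)  # "a" with score 5
```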

An axiomatics of power indices in voting with quota is proposed. It relies on the additivity and dictator axioms. An important property is established: a player's power index is representable as the sum of the contributions of the coalitions in which the player is a pivotal member. The coalition contributions are independent of the players' weights and the quota. The general theorem of power index representation and the representation theorem for a power index of anonymous players are formulated and proved.
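The representation theorems themselves are not reproduced here; the pivot-counting idea can be illustrated with the standard raw Banzhaf count for a weighted voting game (a textbook sketch, not the paper's axiomatic construction):

```python
from itertools import combinations

def banzhaf_counts(weights, quota):
    """Raw Banzhaf counts: for each player, the number of coalitions
    of the other players in which joining turns losing into winning
    (i.e., the player is pivotal)."""
    n = len(weights)
    counts = [0] * n
    for i in range(n):
        others = [j for j in range(n) if j != i]
        for r in range(len(others) + 1):
            for coal in combinations(others, r):
                s = sum(weights[j] for j in coal)
                if s < quota <= s + weights[i]:
                    counts[i] += 1
    return counts

# Classic example: weights (50, 49, 1), quota 51.
# Player 0 is pivotal in 3 coalitions; players 1 and 2 in 1 each,
# so the 49-vote player is no more powerful than the 1-vote player.
counts = banzhaf_counts([50, 49, 1], 51)
```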

We show the results of a statistical study of the complexity of the asymmetric traveling salesman problem (ATSP) obtained by processing a specially generated pool of matrices. We show that the normal distribution can serve as an approximation to the distribution of the logarithm of complexity for a fixed problem dimension. We construct a family of probability distributions that represent satisfactory approximations of the complexity distribution with a dimension of the cost matrix from 20 to 49. Our main objective is to make probabilistic predictions of the complexity of individual problems for larger values of the dimension of the cost matrix. We propose a representation of the complexity distribution that makes it possible to predict the complexity. We formulate the unification hypothesis and show directions for further study, in particular proposals on the task of clustering “complex” and “simple” ATSP problems and proposals on the task of directly predicting the complexity of a specific problem instance based on the initial cost matrix.
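As a sketch of how a normal approximation to log complexity could be used for probabilistic prediction (function names and the lognormal tail formula are illustrative assumptions, not the paper's construction):

```python
import math
import numpy as np

def fit_log_complexity(samples):
    """Fit a normal law to the logarithm of observed complexities
    (e.g., solver effort for fixed-dimension ATSP instances).
    Returns the sample mean and standard deviation of log complexity."""
    logs = np.log(np.asarray(samples, dtype=float))
    return float(logs.mean()), float(logs.std(ddof=1))

def prob_complexity_exceeds(t, mu, sigma):
    """P(complexity > t) under the fitted lognormal approximation,
    via the standard normal tail expressed with erf."""
    z = (math.log(t) - mu) / sigma
    return 0.5 * (1.0 - math.erf(z / math.sqrt(2.0)))

mu, sigma = fit_log_complexity([math.e, math.e ** 3])  # mu = 2
p_tail = prob_complexity_exceeds(math.exp(2.0), mu, sigma)  # median: 0.5
```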

We consider a special problem of controlling a formation of mobile agents: the uniform deployment of several identical agents on a segment of a straight line. For agents obeying a first-order dynamic model, this problem appears to have been first formulated in 1997 by I.A. Wagner and A.M. Bruckstein as “row straightening.” In the present paper, the straightening algorithm is generalized to the more interesting case where the agent dynamics obey second-order differential equations or, stated differently, where the control is the agent’s acceleration (or the force applied to it).
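A minimal simulation sketch of the second-order setting (the gains, step size, and fixed endpoints are illustrative assumptions, not the paper's algorithm): each interior agent's acceleration is a spring force toward the midpoint of its two neighbors plus velocity damping, which drives the formation to uniform spacing.

```python
import numpy as np

def straighten(x0, steps=20000, dt=1e-3, k=1.0, c=1.0):
    """Second-order 'row straightening' sketch on a line segment.
    Interior agents accelerate toward the midpoint of their neighbors
    (gain k) with velocity damping (gain c); the two endpoint agents
    stay fixed. Integrated with semi-implicit Euler."""
    x = np.array(x0, dtype=float)
    v = np.zeros_like(x)
    for _ in range(steps):
        u = np.zeros_like(x)
        u[1:-1] = k * (0.5 * (x[:-2] + x[2:]) - x[1:-1]) - c * v[1:-1]
        v += dt * u
        x += dt * v
    return x

# Five agents on [0, 1]; the interior agents converge to uniform spacing
x = straighten([0.0, 0.1, 0.2, 0.9, 1.0])
```

The damping term is what distinguishes the second-order case: without it, the spring forces alone would make the formation oscillate instead of settling.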

In Russia, chain stores have achieved considerable market power. In this work, we combine a Dixit–Stiglitz industry model with a monopolistic retailer in order to address the following questions: does the retailer always impair prices, variety of goods, and ultimately welfare? Which market structure is worse: Nash or Stackelberg? What should be the public policy in this area?

In this work, we study the optimal risk sharing problem between an insurer and a reinsurer in a dynamic insurance model known as the Cramér–Lundberg risk process; unlike known models, ours considers not per-claim reinsurance but periodic reinsurance of the damages accumulated over a given time interval. We take into account a natural upper bound on the risk taken by the reinsurer. We solve optimal control problems on an infinite time interval for mean-variance optimality criteria: a linear utility functional and a stationary variation coefficient. We show that optimal reinsurance belongs to the class of total risk reinsurances. We establish that the most profitable reinsurance is stop-loss reinsurance with an upper limit. We find equations for the parameter values of the optimal reinsurance strategies.
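The stop-loss contract with an upper limit has a standard payoff form (a generic actuarial sketch with illustrative parameter names, not the paper's notation): the reinsurer pays the aggregate damage above a deductible, capped by the bound on its risk.

```python
def stop_loss_with_limit(total_claims, deductible, limit):
    """Reinsurer's payment under stop-loss reinsurance with an upper
    limit: the part of aggregate damage above the deductible, capped
    at `limit` (the upper bound on the risk taken by the reinsurer)."""
    return min(max(total_claims - deductible, 0.0), limit)

# Aggregate damage 100 over the period, deductible 40, limit 30:
# the reinsurer pays the cap of 30, the insurer retains the rest.
payment = stop_loss_with_limit(100.0, 40.0, 30.0)
```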

To solve the robust PageRank problem, a saddle-point Mirror Descent algorithm for convex–concave optimization problems is enhanced and studied. The algorithm is based on two prox-functions chosen to match the geometry of the sets over which the min–max search is performed; in the robust PageRank case, these are an entropy-like function and the squared Euclidean norm. Applying the saddle-point Mirror Descent algorithm to robust PageRank yields concrete complexity bounds, which are discussed together with an illustrative numerical example.
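The entropy-like prox-function induces the familiar multiplicative-weights step on the probability simplex. A minimal sketch of that single ingredient (the toy objective and step size are illustrative; the paper's saddle-point scheme additionally runs a Euclidean prox step for the other variable, omitted here):

```python
import numpy as np

def mirror_descent_simplex(grad, x0, step=0.1, iters=500):
    """Entropic Mirror Descent on the probability simplex: the prox
    step induced by the entropy prox-function is the multiplicative
    update x <- x * exp(-step * grad(x)), followed by renormalization."""
    x = np.array(x0, dtype=float)
    for _ in range(iters):
        x = x * np.exp(-step * grad(x))
        x /= x.sum()
    return x

# Toy objective f(x) = 0.5 * ||x - c||^2 with minimizer c in the simplex
c = np.array([0.5, 0.3, 0.2])
grad = lambda x: x - c
x = mirror_descent_simplex(grad, np.ones(3) / 3)  # converges to c
```

The entropic update keeps the iterate strictly inside the simplex at no projection cost, which is exactly why this prox-function suits the stochastic-vector constraint in PageRank.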

The paper is concerned with scheduling two-way traffic between two stations connected by a single-track railway with a siding. It is shown that if, for each station, the order in which trains leave this station is known or can be found, then for various objective functions an optimal schedule can be constructed in polynomial time by dynamic programming. Based on this result, the paper also presents a polynomial-time algorithm minimizing the weighted number of late trains.

This paper defines the self-covariance property for solutions of transferable utility cooperative games (TU-games) as a weakening of their covariance. Self-covariant solutions are positively homogeneous and satisfy a “restricted” translation covariance, where the feasible shifts are only the solution vectors themselves and their multiples. A description is given of all non-empty, single-valued, efficient, anonymous, weakly covariant and self-covariant solutions in the class of two-player TU-games. We demonstrate that among them there exist exactly three solutions admitting consistent extensions in the Davis–Maschler sense: the equal share solution, the standard solution, and the constrained egalitarian solution for superadditive two-player TU-games. For the third solution, characterizations of some consistent extensions to the class of all TU-games are given.
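Two of the three solutions have simple closed forms on two-player games (standard textbook formulas, written here for the worths v({1}), v({2}), v({1,2}); the constrained egalitarian solution is omitted):

```python
def equal_share_solution(v1, v2, v12):
    """Equal share solution: the grand coalition's worth split evenly,
    ignoring the singleton worths."""
    return (v12 / 2.0, v12 / 2.0)

def standard_solution(v1, v2, v12):
    """Standard solution: each player receives its singleton worth
    plus half of the cooperation surplus v({1,2}) - v({1}) - v({2})."""
    surplus = v12 - v1 - v2
    return (v1 + surplus / 2.0, v2 + surplus / 2.0)

# Example game: v({1}) = 1, v({2}) = 3, v({1,2}) = 10
std = standard_solution(1.0, 3.0, 10.0)     # (4.0, 6.0)
eq = equal_share_solution(1.0, 3.0, 10.0)   # (5.0, 5.0)
```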

This work is devoted to problems of information transmission with frequency shift keying and fast frequency hopping in special channels with a low signal-to-noise ratio and a high-energy interfering signal. We propose a demodulation algorithm that is significantly more robust to a powerful interfering signal than other known algorithms. Under these conditions, we propose a statistical criterion that significantly reduces the error probability at the demodulator’s output. For the chosen criterion, we prove several lemmas that speed up the demodulation algorithm. Computer simulation results show that the proposed demodulation algorithm has better correcting ability under a powerful interfering signal than previously known ones.

We analyze the methods of stochastic and fuzzy comparison and ordering of random and fuzzy variables. We find simple formulas for computing a number of comparisons and establish the interrelations between various comparisons. We propose and study a new approach to comparing histograms of discrete random (fuzzy) variables based on computing a “directed” minimal transformation that maps one of the compared variables into another. We apply the method of minimal transformations to solving the problem of optimal reduction of discrete random (fuzzy) variables to unimodal form which is considered in the context of ranking the histograms of universities constructed by USE (Unified State Exam) results. We propose a model of “perfect” admission for high school graduates and show that the distribution of admitted graduates to a university in this model will be unimodal under sufficiently general assumptions on the preference function.
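The “directed” minimal transformation is defined in the paper itself; a simplified, undirected version of the transformation cost between two histograms over the same ordered bins (the 1-D earth mover's distance, computed from cumulative sums) can be sketched as:

```python
import numpy as np

def minimal_transformation_cost(h1, h2):
    """Cost of the minimal mass-moving transformation between two
    histograms with equal total mass over the same ordered bins:
    for the 1-D case, the total mass shifted between adjacent bins
    equals the sum of absolute differences of the cumulative sums."""
    h1 = np.asarray(h1, dtype=float)
    h2 = np.asarray(h2, dtype=float)
    assert len(h1) == len(h2) and abs(h1.sum() - h2.sum()) < 1e-9
    return float(np.abs(np.cumsum(h1 - h2)).sum())

# Moving 4 units of mass two bins to the right costs 4 * 2 = 8
cost = minimal_transformation_cost([4, 0, 0], [0, 0, 4])
```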

We consider an optimal portfolio selection problem of tracking a riskless reference portfolio. Portfolio management strategies are compared taking into account the investor’s temporal preferences. We investigate the stochastic optimality of the strategy that minimizes the expected long-run cost, deriving an asymptotic (almost sure) upper estimate for the difference between the values of the objective functional corresponding to the optimal strategy and to any admissible control.

Time consistency is one of the most important properties of solutions in cooperative differential games. This paper uses the core as the cooperative solution of the game. We design a strongly time-consistent subset of the core. The design method for this subset is based on a special class of imputation distribution procedures (IDPs).

We study the recognition problem for composite objects based on a probabilistic model of a piecewise regular object with thousands of alternative classes. Using the model’s asymptotic properties, we develop a new maximum likelihood enumeration method that is optimal (in the sense of choosing the most likely reference for testing at every step) in the class of “greedy” algorithms for approximate nearest neighbor search. We show experimental results for the face recognition problem on the FERET dataset. We demonstrate that the proposed approach reduces decision-making time severalfold, not only compared to exhaustive search but also compared to known approximate nearest neighbor techniques.