Sequential δ-optimal consumption and investment for stochastic volatility markets with unknown parameters
We consider an optimal investment and consumption problem for a Black-Scholes financial market with stochastic volatility and an unknown stock price appreciation rate. The volatility parameter is driven by an external economic factor modeled as a diffusion process of Ornstein-Uhlenbeck type with unknown drift. We use the dynamic programming approach and find an optimal financial strategy that depends on the drift parameter. To estimate the drift coefficient we observe the economic factor Y on an interval [0, T0] for fixed T0 > 0 and use sequential estimation. We show that the consumption and investment strategy calculated through this sequential procedure is δ-optimal.
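The sequential drift estimation step can be sketched as follows. For a factor of Ornstein-Uhlenbeck type, a classical sequential plan observes Y until the accumulated information ∫ Y² dt reaches a prescribed level H and then forms a least-squares estimate of the drift; this guarantees a prescribed estimation accuracy regardless of the unknown parameter. The specific model dY = θY dt + σ dW, the parameter values, and the stopping level H below are illustrative assumptions, not the paper's exact setting.

```python
import numpy as np

rng = np.random.default_rng(0)
theta_true = -0.5   # hypothetical true drift (mean-reverting factor)
sigma, dt = 1.0, 1e-3
H = 200.0           # prescribed information level for the stopping rule

y = 1.0             # initial value of the economic factor Y
score, info = 0.0, 0.0
while info < H:                      # stop when ∫ Y^2 dt >= H
    dW = rng.normal(0.0, np.sqrt(dt))
    dy = theta_true * y * dt + sigma * dW
    score += y * dy                  # accumulates ∫ Y dY
    info += y * y * dt               # accumulates ∫ Y^2 dt
    y += dy

theta_hat = score / info             # sequential least-squares estimate
```

By construction the estimation error has variance of order σ²/H at the stopping time, which is what makes the plug-in strategy δ-optimal for H large enough.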
This paper reviews the difficulties in developing a model of single-name CDS price (spread) dynamics for the purpose of determining margin requirements. It also discusses the possibility of constructing such a model using information about the respective equity prices and option-implied volatilities. Finally, it presents a basic step towards the latter idea, demonstrating results for a CDS written on Gazprom senior debt.
We apply a suboptimal sequential nonparametric hypothesis testing approach to improve the effectiveness of a statistical decision by reducing the sample space. Numerical examples of sample space reduction are given in which an appropriate reduction makes it possible to construct a robust sequential nonparametric hypothesis test with a smaller mean duration than the one on the total sample space. © 2014 IEEE.
We study the problem of testing composite hypotheses versus composite alternatives when there is a slight deviation between the model and the real distribution. Our approach, which we call sub-optimal testing, involves an extension of the initial model and a modification of a sequential statistical test for the new model. The sub-optimal test is proposed and a non-asymptotic bound for the loss function is obtained. We also investigate the relation between the sub-optimal test and the sequential probability ratio test for the initial model.
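For reference, the sequential probability ratio test against which the sub-optimal test is compared works by accumulating the log-likelihood ratio and stopping at Wald's thresholds. The sketch below uses a simple Gaussian-mean pair of hypotheses with illustrative error levels; the paper's composite setting is more general.

```python
import numpy as np

def sprt(samples, mu0=0.0, mu1=1.0, sigma=1.0, alpha=0.01, beta=0.01):
    """Wald's SPRT for a Gaussian mean; returns (decision, samples used)."""
    A = np.log((1 - beta) / alpha)    # accept H1 when llr >= A
    B = np.log(beta / (1 - alpha))    # accept H0 when llr <= B
    llr = 0.0
    for n, x in enumerate(samples, 1):
        llr += (mu1 - mu0) * (x - (mu0 + mu1) / 2.0) / sigma**2
        if llr >= A:
            return "H1", n
        if llr <= B:
            return "H0", n
    return "undecided", len(samples)

decision, n = sprt([1.0] * 50)   # constant evidence for H1 -> ("H1", 10)
```

The test stops as soon as either boundary is crossed, which is the source of the savings in mean duration that the sub-optimal test tries to preserve under model deviations.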
In this paper we explore an application of pyramid HOG (Histograms of Oriented Gradients) features to image recognition problems with small samples. Sequential analysis is used to improve the performance of hierarchical methods. We propose to process the next, more detailed pyramid level only if the decision at the current level is unreliable. Chow's reject option, which compares the posterior probability with a fixed threshold, is used to verify recognition reliability. The posterior probability is estimated for the homogeneity-testing probabilistic neural network classifier on the basis of its relation to the Bayesian decision. Experimental results in face recognition are presented. It is shown that the proposed approach increases recognition performance by a factor of 2–4 in comparison with conventional classification of pyramid HOGs.
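The coarse-to-fine decision rule described above can be sketched as follows: at each pyramid level the maximum-a-posteriori class is accepted only if its posterior exceeds a fixed threshold (Chow's reject option); otherwise the next, more detailed level is processed. The threshold value and the toy posteriors are illustrative assumptions.

```python
import numpy as np

def chow_decision(posteriors, threshold=0.8):
    """Chow's reject option: accept the MAP class only if its posterior
    exceeds the threshold; otherwise the decision is unreliable."""
    k = int(np.argmax(posteriors))
    return k, posteriors[k] >= threshold

def hierarchical_classify(posteriors_per_level, threshold=0.8):
    """Process pyramid levels coarse-to-fine; stop at the first reliable one.
    Returns (label, level at which the decision was made)."""
    for level, p in enumerate(posteriors_per_level):
        label, reliable = chow_decision(p, threshold)
        if reliable:
            return label, level
    return label, len(posteriors_per_level) - 1  # fall back to finest level

# Coarse level is ambiguous, the finer level is confident:
levels = [np.array([0.55, 0.45]), np.array([0.92, 0.08])]
result = hierarchical_classify(levels)           # -> (0, 1)
```

The speed-up comes from the fact that most inputs are accepted at a coarse level and the expensive finer levels are computed only for rejected ones.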
We guarantee the existence and uniqueness (in the almost everywhere sense) of the solution to a Hamilton-Jacobi-Bellman (HJB) equation with gradient constraint and an integro-differential operator whose Lévy measure has bounded variation. This type of equation arises in singular stochastic control problems where the state process is a jump-diffusion with infinite activity and finite variation jumps. By means of ε-penalized controls we show that the value function associated with this class of problems coincides with the solution to our HJB equation.
This paper focuses on still-to-video face recognition with a large number of subjects, based on the computation of distances between high-dimensional embeddings extracted using deep convolutional neural networks. We propose to utilize granular structures and sequentially process granular representations of all frames of the input video. The coarse-grained granules include only a small number of the first principal components of the deep embeddings. The representation of each frame at finer granularity levels is matched with the representations of photos of only those individuals for whom the decision at the previous levels was reliable. Reliability is checked by thresholding the ratio of the distance between a reference instance and the input frame to the minimal distance. As a result, the photos of all unreliable individuals are no longer examined for a particular frame at the next levels with finer granularity. Decisions for all frames are united into a candidate set of identities, and the maximum a posteriori final decision is chosen. An experimental study with the LFW, YTF and IJB-A datasets and state-of-the-art deep embeddings demonstrated that the proposed approach is 2–10 times faster than conventional methods.
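The distance-ratio reliability check can be sketched as follows: for a given frame, an individual survives to the next, finer granularity level only if their distance to the frame is within a fixed multiple of the minimal distance over all remaining candidates. The threshold rho and the toy distances below are illustrative assumptions, not the paper's tuned values.

```python
import numpy as np

def prune_candidates(distances, rho=1.2):
    """Keep indices of individuals whose distance to the input frame is
    within rho times the minimal distance; pruned individuals are not
    examined at finer granularity levels (rho is a hypothetical value)."""
    d_min = distances.min()
    return np.flatnonzero(distances <= rho * d_min)

# Distances from one frame's coarse granule to four enrolled individuals:
d = np.array([0.50, 0.55, 1.00, 2.00])
survivors = prune_candidates(d)   # indices 0 and 1 survive (<= 1.2 * 0.50)
```

Applying this pruning level by level shrinks the candidate set before the full-dimensional embeddings are ever compared, which is where the reported 2–10x speed-up originates.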