Asymptotically Optimal Method for Manifold Estimation Problem
P. 325–325.
Let X be an unknown nonlinear smooth q-dimensional data manifold (D-manifold) embedded in a p-dimensional space (p > q) and covered by a single coordinate chart. It is assumed that the manifold's condition number is positive, so X has no self-intersections. Let Xn = {X1, X2, ..., Xn} ⊂ X ⊂ Rp be a sample selected from the D-manifold X, with points drawn independently of each other according to an unknown probability measure on X with strictly positive density.
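This sampling model can be illustrated with a minimal sketch. Assumptions here (not from the abstract): q = 1, p = 3, and a concrete smooth curve, a helix segment, standing in for the unknown manifold X, covered by a single chart t → chart(t).

```python
import numpy as np

# Minimal sketch of the sampling model: points drawn i.i.d. from a
# q-dimensional manifold (q = 1 here) embedded in R^p (p = 3 here),
# with a strictly positive sampling density on the single chart.
rng = np.random.default_rng(0)

def chart(t):
    """One coordinate chart mapping [0, 1] into R^3 (a helix segment)."""
    return np.column_stack([np.cos(2 * np.pi * t),
                            np.sin(2 * np.pi * t),
                            t])

n = 200
t = rng.uniform(0.0, 1.0, size=n)  # strictly positive density on the chart
Xn = chart(t)                      # i.i.d. sample of n points on X
print(Xn.shape)                    # (200, 3)
```

In this toy setup the condition-number requirement is met because the helix segment never approaches self-intersection.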
Language: English
Kachan O., Yanovich Y., Abramov E., Uchenye Zapiski Kazanskogo Universiteta. Seriya Fiziko-Matematicheskie Nauki 2018 Vol. 160 No. 2 P. 300–308
According to the manifold hypothesis, high-dimensional data can be viewed and meaningfully represented as a lower-dimensional manifold embedded in a higher-dimensional feature space. Manifold learning is the part of machine learning in which an intrinsic data representation is uncovered based on the manifold hypothesis.
Many manifold learning algorithms have been developed. The one called Grassmann & Stiefel eigenmaps ...
Added: January 21, 2026
Nikita Puchkin, Vladimir Spokoiny, Eugene Stepanov et al., ESAIM - Control, Optimisation and Calculus of Variations 2024 Vol. 30 Article 3
We consider the problem of reconstructing an embedding of a compact connected Riemannian manifold in a Euclidean space up to an almost isometry, given the information on intrinsic distances between points from its “sufficiently large” subset. This is one of the classical manifold learning problems. It turns out that the most popular methods to deal with ...
Added: February 2, 2024
Popkov Y., Popkov A. Y., Dubnov Y. A., Mathematical Models and Computer Simulations 2021 Vol. 13 No. 3 P. 382–394
© 2021, Pleiades Publishing, Ltd. Abstract: We propose a method for reducing the dimension of a data matrix, based on its direct and inverse projection and on the calculation of projectors that minimize a cross-entropy functional. We introduce the concept of the information capacity of a matrix, which is used as a constraint in the optimal reduction problem, ...
Added: October 28, 2022
Puchkin N., Spokoiny V., Journal of Machine Learning Research 2022 Vol. 23 No. 40 P. 1–62
We consider a problem of manifold estimation from noisy observations. Many manifold learning procedures locally approximate a manifold by a weighted average over a small neighborhood. However, in the presence of large noise, the assigned weights become so corrupted that the averaged estimate shows very poor performance. We suggest a structure-adaptive procedure, which simultaneously reconstructs ...
Added: February 3, 2022
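The baseline whose breakdown under large noise motivates the paper, a weighted average over a small neighborhood, can be sketched as follows. This is a minimal illustration on a noisy circle, not the authors' structure-adaptive procedure; the manifold, the noise level, and the bandwidth are assumptions chosen for the demo.

```python
import numpy as np

# Plain kernel-weighted local averaging: each noisy observation is
# replaced by the Gaussian-kernel-weighted mean of all observations.
rng = np.random.default_rng(1)

t = rng.uniform(0.0, 2.0 * np.pi, 300)
clean = np.column_stack([np.cos(t), np.sin(t)])         # points on the circle
noisy = clean + 0.1 * rng.standard_normal(clean.shape)  # noisy observations

def local_average(points, bandwidth=0.2):
    """Kernel-weighted mean over a small neighborhood of each point."""
    d2 = ((points[:, None, :] - points[None, :, :]) ** 2).sum(axis=-1)
    w = np.exp(-d2 / (2.0 * bandwidth ** 2))
    return (w @ points) / w.sum(axis=1, keepdims=True)

denoised = local_average(noisy)

def err(p):
    """Mean absolute distance to the true manifold (the unit circle)."""
    return np.abs(np.linalg.norm(p, axis=1) - 1.0).mean()

print(round(err(noisy), 3), round(err(denoised), 3))
```

At this moderate noise level the averaged estimate already improves on the raw observations; the abstract's point is that this stops being true once the noise, and hence the weight corruption, grows large.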
Alanov A., Kochurov M., Volkhonskiy D. et al., in: Proceedings of the 15th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications (VISAPP 2020), Vol. 4. SciTePress, 2020. P. 214–221.
We propose a novel multi-texture synthesis model based on generative adversarial networks (GANs) with a user-controllable mechanism. The user control ability allows one to explicitly specify the texture that should be generated by the model. This property follows from the use of an encoder part which learns a latent representation for each texture in the dataset. To ensure ...
Added: November 8, 2020
Kachan O. N., Yanovich Y., Abramov E., Uchenye Zapiski Kazanskogo Universiteta. Seriya Fiziko-Matematicheskie Nauki 2018 Vol. 160 No. 2 P. 300–308
According to the manifold hypothesis, high-dimensional data can be viewed and meaningfully represented as a lower-dimensional manifold embedded in a higher-dimensional feature space. Manifold learning is the part of machine learning in which an intrinsic data representation is uncovered based on the manifold hypothesis.
Many manifold learning algorithms have been developed. The one called Grassmann & Stiefel eigenmaps (GSE) ...
Added: October 29, 2020
Abramov E., Yanovich Y., Uchenye Zapiski Kazanskogo Universiteta. Seriya Fiziko-Matematicheskie Nauki 2018 Vol. 160 No. 2 P. 220–228
Real data are usually characterized by high dimensionality. However, real data obtained from real sources, due to the presence of various dependencies between data points and limitations on their possible values, as a rule form a small part of the high-dimensional observation space. The most common model is based on the hypothesis that ...
Added: October 29, 2020
Kuleshov A. P., Bernstein A. V., Yanovich Y., Uchenye Zapiski Kazanskogo Universiteta. Seriya Fiziko-Matematicheskie Nauki 2018 Vol. 160 No. 2 P. 327–338
The problem of unknown high-dimensional density estimation has been considered. It has been suggested that the support of its measure is a low-dimensional data manifold. This problem arises in many data mining tasks. The paper proposes a new geometrically motivated solution to the problem in the framework of manifold learning, including estimation of an unknown ...
Added: October 28, 2020
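A hedged sketch of the setting above: when the measure is supported on a low-dimensional manifold (the unit circle inside R^2 here), the density is naturally estimated with respect to an intrinsic coordinate rather than the ambient one. One simplifying assumption is made for brevity: the chart angle is recovered exactly via atan2, whereas the paper estimates the unknown manifold itself.

```python
import numpy as np

# Density on a 1-D manifold: sample angles from a von Mises density,
# map them onto the circle in R^2, then estimate the intrinsic density
# with a plain Gaussian kernel density estimate on recovered angles.
rng = np.random.default_rng(3)
theta = rng.vonmises(mu=0.0, kappa=2.0, size=2000)   # intrinsic density
X = np.column_stack([np.cos(theta), np.sin(theta)])  # sample on the circle

angles = np.arctan2(X[:, 1], X[:, 0])  # recovered chart coordinate

def kde(x, samples, h=0.2):
    """Gaussian kernel density estimate on the line (wrap-around ignored)."""
    k = np.exp(-((x - samples) ** 2) / (2.0 * h * h))
    return k.mean() / (h * np.sqrt(2.0 * np.pi))

# The von Mises mode is at angle 0, so the estimate peaks there.
print(kde(0.0, angles) > kde(np.pi / 2.0, angles))  # True
```

A naive kernel estimate in the ambient R^2 would instead diverge as the bandwidth shrinks, since the support has Lebesgue measure zero; that is the core difficulty the geometrically motivated solution addresses.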
Panov V., Series: Discussion paper SFB 649 "Economic risk". 2010. No. 2010-050.
Let a high-dimensional random vector $X$ be represented as a sum of two components: a signal $S$ that belongs to some low-dimensional linear subspace $\mathcal{S}$, and a noise component $N$. This paper presents a new approach for estimating the subspace $\mathcal{S}$ based on the ideas of Non-Gaussian Component Analysis. Our approach avoids the ...
Added: September 3, 2015
Panov V., Series: Discussion paper SFB 649 "Economic risk". 2010. No. 2010-026.
In this article, we present new ideas concerning Non-Gaussian Component Analysis (NGCA). We use the structural assumption that a high-dimensional random vector $X$ can be represented as a sum of two components: a low-dimensional signal $S$ and a noise component $N$. We show that this assumption enables a special representation for the ...
Added: September 3, 2015
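The structural model $X = S + N$ used in the two discussion papers above can be illustrated with a toy example. This is not the authors' estimator: a one-dimensional non-Gaussian signal is placed inside R^5 with isotropic Gaussian noise, and excess kurtosis serves as a crude non-Gaussianity index; the dimensions and noise scale are assumptions chosen for the demo.

```python
import numpy as np

# Toy NGCA-style setting: X = S + N, where S lives in a 1-D subspace
# and is non-Gaussian (uniform), while N is isotropic Gaussian noise.
rng = np.random.default_rng(2)
p, n = 5, 20000
e1 = np.zeros(p); e1[0] = 1.0                # true signal subspace: span{e1}
S = rng.uniform(-1.0, 1.0, n)[:, None] * e1  # uniform, hence non-Gaussian
N = 0.1 * rng.standard_normal((n, p))        # Gaussian noise component
X = S + N

def excess_kurtosis(z):
    """Excess kurtosis of a 1-D sample; zero for Gaussian data."""
    z = (z - z.mean()) / z.std()
    return (z ** 4).mean() - 3.0

# Scan the coordinate axes: only the signal direction is non-Gaussian.
scores = [abs(excess_kurtosis(X[:, j])) for j in range(p)]
print(int(np.argmax(scores)))  # 0
```

Projections onto directions orthogonal to the signal subspace are purely Gaussian, so any non-Gaussianity index singles out the signal direction; NGCA builds a far more general estimator of $\mathcal{S}$ on this principle.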
Karasev M., Novikova E., Mathematical notes 2014 Vol. 96 No. 6 P. 965–970
We discuss the general opportunity to create (asymptotically) a completely integrable system from the original perturbed system by inserting additional perturbing terms. After such an artificial insertion, there appears an opportunity to perform a secondary averaging and a secondary reduction of the original system. In this way, the $3D$ system becomes $1$-dimensional. We demonstrate this approach ...
Added: November 25, 2014
Bernstein A., Kuleshov A. P., International Journal of Software and Informatics 2013 Vol. 7 No. 3 P. 359–390
One of the ultimate goals of Manifold Learning (ML) is to reconstruct an unknown nonlinear low-dimensional Data Manifold (DM) embedded in a high-dimensional observation space from a given set of data points sampled from the manifold. We derive asymptotic expansion and local lower and upper bounds for the maximum reconstruction error in a small neighborhood ...
Added: November 21, 2014