A Simple Method to Evaluate Support Size and Non-uniformity of a Decoder-Based Generative Model
Theoretical analysis in  suggested that adversarially trained generative models are naturally inclined to learn distributions with low support; in particular, this effect is caused by the limited capacity of the discriminator network. To verify this claim,  proposed a statistical test based on the birthday paradox that partially confirmed the analysis. In this paper, we continue this line of work and develop a straightforward, parameter-free method to estimate the support size of an arbitrary decoder-based generative model. Our approach considers the decoder network from a geometric viewpoint and evaluates the support size as the volume of the manifold containing the generative model's samples. Additionally, we propose a method to measure the non-uniformity of a generative model, which can provide additional insight into the model's behavior. We then apply these tools to perform a quantitative comparison of common generative models.
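The abstract does not spell out the estimator, but the volume of the image manifold of a decoder g: R^d → R^D can be written as the integral of the Riemannian volume element sqrt(det(JᵀJ)) over the latent space. A minimal Monte Carlo sketch of this idea, assuming a toy linear "decoder" and a finite-difference Jacobian (both stand-ins, not the paper's method):

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(5, 2))  # toy "decoder": linear map from 2-D latent to 5-D data space


def decoder(z):
    return W @ z


def jacobian(f, z, eps=1e-5):
    """Finite-difference Jacobian of f at z (numerical stand-in for autodiff)."""
    f0 = f(z)
    J = np.zeros((f0.size, z.size))
    for i in range(z.size):
        dz = np.zeros_like(z)
        dz[i] = eps
        J[:, i] = (f(z + dz) - f0) / eps
    return J


def manifold_volume(f, n_samples=500, cube=1.0, dim=2):
    """Monte Carlo estimate of vol = ∫ sqrt(det(Jᵀ J)) dz over [-cube, cube]^dim."""
    zs = rng.uniform(-cube, cube, size=(n_samples, dim))
    elems = [np.sqrt(np.linalg.det(jacobian(f, z).T @ jacobian(f, z))) for z in zs]
    return (2 * cube) ** dim * np.mean(elems)


vol = manifold_volume(decoder)
```

For the linear toy decoder the volume element is constant, so the estimate should match 4·sqrt(det(WᵀW)) almost exactly; for a trained neural decoder, the same integral would be evaluated with the network's actual Jacobian.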
The variational autoencoder (VAE) is a probabilistic unsupervised deep learning method. We propose a robust approach to training VAEs using a modified likelihood function, and we propose and analyze two variational lower-bound objectives. The effectiveness of the method is shown experimentally by artificially introducing noise objects into the data.
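The paper's two modified lower bounds are not reproduced in the abstract; for reference, the standard variational lower bound (ELBO) that such objectives modify can be sketched as follows, assuming a Gaussian encoder posterior, a standard normal prior, and a unit-variance Gaussian reconstruction term:

```python
import numpy as np


def gaussian_kl(mu, log_var):
    """Closed-form KL( N(mu, diag(exp(log_var))) || N(0, I) )."""
    return 0.5 * np.sum(np.exp(log_var) + mu**2 - 1.0 - log_var)


def elbo(x, x_recon, mu, log_var):
    """Standard VAE lower bound: reconstruction log-likelihood minus KL term."""
    recon = -0.5 * np.sum((x - x_recon) ** 2)  # log N(x | x_recon, I) up to a constant
    return recon - gaussian_kl(mu, log_var)
```

A robust variant would replace the reconstruction term with one less sensitive to outlying "noise objects"; the exact form used in the paper is not given here.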
Principal component analysis (PCA) offers one option for a more flexible approach to analyzing the reliability of supply chains. With a large number of variables describing a supply chain, analyzing the structure of the variables in two-dimensional space is a difficult task. When analyzing dependencies between variables, PCA makes it possible to pass from a multidimensional space to a low-dimensional one while retaining the most informative data in the array. Using a generated data set, this paper demonstrates the applicability of PCA to supply chain reliability analysis. The data set contains observations of 50 supply chains described by five variables. Based on this array, by maximizing the linear combination of parameters for each observation, we determined the loading coefficients and the scores of each of the principal components. Calculating these coefficients made it possible to move from the multidimensional space to a two-dimensional one. The two-dimensional representation of the data, whose axes are the first two principal components and which explains 84% of the variance, reveals the structure of all the supply chains and makes it possible to identify laggards and leaders in this set.
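The described pipeline (50 observations, 5 variables, projection onto the first two principal components) can be sketched with plain NumPy; the synthetic data below is an illustrative stand-in, not the paper's generated set:

```python
import numpy as np

rng = np.random.default_rng(42)
# synthetic stand-in: 50 supply chains described by 5 correlated variables
latent = rng.normal(size=(50, 2))
X = latent @ rng.normal(size=(2, 5)) + 0.3 * rng.normal(size=(50, 5))

Xc = X - X.mean(axis=0)                    # center each variable
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T                     # coordinates of each chain on the first two PCs
loadings = Vt[:2]                          # loading coefficients of the five variables
explained = (s**2 / np.sum(s**2))[:2].sum()  # variance explained by the first two PCs
```

Plotting `scores[:, 0]` against `scores[:, 1]` gives the two-dimensional representation in which leaders and laggards can be identified visually.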
The progress of deep learning models in image and video processing is leading to new artificial intelligence applications in the fashion industry. We consider the application of Generative Adversarial Networks and Neural Style Transfer to digital fashion, presented as virtual fashion for trying on new clothes. Our model generates humans in clothes with respect to different fashion preferences, color layouts, and fashion styles. We argue that the virtual fashion industry will be strongly affected by the accuracy of generating personalized human models that take into account different aspects of the product and human preferences. We compare our model with the state-of-the-art VITON model and show that using a new perceptual loss in the deep neural network architecture leads to better qualitative results in generating humans in clothes.
We propose a novel multi-texture synthesis model based on generative adversarial networks (GANs) with a user-controllable mechanism. The user control allows one to explicitly specify the texture that the model should generate. This property follows from using an encoder that learns a latent representation for each texture in the dataset. To ensure dataset coverage, we use an adversarial loss function that penalizes incorrect reproductions of a given texture. In experiments, we show that our model can learn descriptive texture manifolds for large datasets, including raw data such as collections of high-resolution photos. We also show that our unsupervised learning pipeline may help segmentation models. Moreover, we apply our method to produce 3D textures and show that it outperforms existing baselines.
Pattern mining is an important task in AI for eliciting hypotheses from data. With spatial data, the geo-coordinates are often treated independently as two separate attributes, so rectangular regions are searched for. Such an arbitrary shape cannot capture interesting regions in general. We thus introduce convex polygons, a good trade-off between expressivity and algorithmic complexity. Our contribution is threefold: (i) we formally introduce such patterns in Formal Concept Analysis (FCA), (ii) we give all the basic building blocks for mining convex polygons with exhaustive search and pattern sampling, and (iii) we design several algorithms, which we compare experimentally.
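The basic geometric primitive behind convex-polygon patterns is the convex hull of a set of 2-D points (geo-coordinates). A self-contained sketch using Andrew's monotone chain algorithm, not taken from the paper:

```python
def convex_hull(points):
    """Andrew's monotone chain: convex hull of 2-D points in O(n log n).

    Returns the hull vertices in counter-clockwise order, collinear
    and interior points excluded.
    """
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        # z-component of (a - o) x (b - o); > 0 means a counter-clockwise turn
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]
```

For example, the hull of a unit square plus its center point has four vertices; a mining algorithm would enumerate or sample such hulls over subsets of the spatial objects.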
This book constitutes the proceedings of the 16th International Conference on Formal Concept Analysis, ICFCA 2021, held in Strasbourg, France, in June/July 2021.
The 14 full papers and 5 short papers presented in this volume were carefully reviewed and selected from 32 submissions. The book also contains four invited contributions in full paper length.
The research part of this volume is divided into five sections. The first, "Theory," contains works that discuss advances on theoretical aspects of FCA. The second section, "Rules," consists of contributions devoted to implications and association rules. The third section, "Methods and Applications," comprises results concerned with new algorithms and their applications. "Exploration and Visualization" introduces different approaches to data exploration.
The improvement of English language teaching has as its goal the development of communicative competence within integration processes. Collocations are essential for developing communicative competence, and collocations together with different forms of unsupervised acquisition are compulsory components of IELTS preparation.
A model is considered for organizing cargo transportation between two node stations connected by a railway line containing a certain number of intermediate stations. The cargo moves in one direction. Such a situation may occur, for example, if one of the node stations is located in a region that produces raw materials for a manufacturing industry located in another region served by the other node station. Freight traffic is organized by means of a number of technologies. These technologies determine the rules for accepting cargo at the initial node station, the rules of interaction between neighboring stations, and the rule for distributing cargo to the final node stations. The cargo transportation process follows a given control rule. For such a model, one must determine the possible modes of cargo transportation and describe their properties. The model is described by a finite-dimensional system of differential equations with nonlocal linear restrictions. The class of solutions satisfying the nonlocal linear restrictions is extremely narrow, which results in the need for a "correct" extension of the solutions of the system of differential equations to a class of quasi-solutions whose distinctive feature is gaps at a countable number of points. Using the fourth-order Runge–Kutta method, we were able to construct these quasi-solutions numerically and determine their rate of growth. We note that the main technical difficulty consisted in obtaining quasi-solutions that satisfy the nonlocal linear restrictions. Furthermore, we investigated the dependence of the quasi-solutions, and in particular the sizes of the gaps (jumps) of the solutions, on a number of model parameters characterizing the control rule, the cargo transportation technologies, and the intensity of cargo supply at a node station.
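The numerical building block mentioned in the abstract is the classical fourth-order Runge–Kutta scheme. A minimal self-contained sketch of that scheme (the paper's specific system with nonlocal linear restrictions is not reproduced here):

```python
def rk4_step(f, t, y, h):
    """One classical fourth-order Runge-Kutta step for y' = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)


def integrate(f, t0, y0, t1, n):
    """Integrate y' = f(t, y) from t0 to t1 in n equal RK4 steps."""
    h = (t1 - t0) / n
    t, y = t0, y0
    for _ in range(n):
        y = rk4_step(f, t, y, h)
        t += h
    return y
```

For instance, integrating y' = y from y(0) = 1 over [0, 1] in 100 steps approximates e to within about 1e-9; constructing the quasi-solutions additionally requires restarting the integration at the jump points imposed by the nonlocal restrictions.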
Event logs collected by modern information and technical systems usually contain enough data for automated discovery of process models. A variety of algorithms has been developed for process model discovery, conformance checking, log-to-model alignment, comparison of process models, etc.; nevertheless, quick analysis of ad hoc selected parts of a log still lacks a full-fledged implementation. This paper describes a ROLAP-based method of multidimensional event log storage for process mining. The result of the log analysis is visualized as a directed graph representing the union of all possible event sequences, ranked by their occurrence probability. Our implementation allows the analyst to discover process models for sublogs defined by an ad hoc selection of criteria and a value of occurrence probability.
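The graph described in the abstract resembles a directly-follows graph with edges weighted by occurrence probability. A minimal sketch of that construction on an in-memory log (the ROLAP storage layer and sublog selection are not modeled here; the activity names are made up):

```python
from collections import Counter


def dfg(traces):
    """Directly-follows graph: maps each edge (a, b) to the probability
    that activity b directly follows activity a in the log."""
    pair_counts = Counter()
    src_counts = Counter()
    for trace in traces:
        for a, b in zip(trace, trace[1:]):
            pair_counts[(a, b)] += 1
            src_counts[a] += 1
    return {(a, b): n / src_counts[a] for (a, b), n in pair_counts.items()}


log = [["register", "check", "pay"],
       ["register", "check", "reject"],
       ["register", "pay"]]
graph = dfg(log)
```

Filtering this dictionary by a probability threshold would keep only the most frequent transitions, mirroring the ranking by occurrence probability described above.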
Existing approaches suggest that IT strategy should be a reflection of business strategy. In practice, however, organisations often do not follow their business strategy even when it is formally declared. Under these conditions, IT strategy can be viewed not as a plan but as an organisational shared view of the role of information systems. This approach reflects only a top-down perspective on IT strategy, so it can be supplemented by a strategic behaviour pattern (i.e., a more or less standard response to changes, formed as a result of previous experience) to implement a bottom-up approach. Two components that can help establish an effective reaction to new IT initiatives are proposed here: a model of IT-related decision making, and an efficiency measurement metric to estimate the maturity of business processes and the corresponding IT. The use of the proposed tools is demonstrated in practical cases.