Computationally effective mathematical model of airplane flight
An approach to modeling is proposed in which the airplane is considered as a system with a minimal set of basic material points, and its mathematical description is given. We construct a mathematical model of a conventional airplane with accuracy adequate for evaluation calculations when designing the control system. Flight simulation software for the Yak-52 and NG-4 airplanes is developed in MATLAB, and the modeling results are analyzed.
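The point-mass view of an airplane mentioned above can be illustrated with the standard longitudinal equations of motion in the vertical plane. This is a generic sketch, not the paper's specific Yak-52/NG-4 model; the mass value and force inputs are assumptions for illustration.

```python
import math

# Hedged sketch of a point-mass longitudinal flight model (standard
# textbook equations, not the paper's model): state is airspeed V,
# flight-path angle gamma, and altitude h; inputs are thrust T,
# lift L, and drag D.

g = 9.81      # gravitational acceleration, m/s^2
m = 1315.0    # aircraft mass, kg (illustrative assumption)

def derivatives(V, gamma, T, L, D):
    """Point-mass equations of motion in the vertical plane."""
    dV = (T - D) / m - g * math.sin(gamma)          # speed change
    dgamma = (L - m * g * math.cos(gamma)) / (m * V)  # path-angle change
    dh = V * math.sin(gamma)                         # climb rate
    return dV, dgamma, dh

# Trimmed level flight: lift balances weight, thrust balances drag,
# so all derivatives vanish.
dV, dgamma, dh = derivatives(V=60.0, gamma=0.0, T=2000.0, L=m * g, D=2000.0)
print(dV, dgamma, dh)  # prints 0.0 0.0 0.0
```

Integrating these derivatives with any ODE solver (e.g. MATLAB's `ode45`, as in the paper's environment) yields a minimal flight trajectory simulation.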
This collection of articles contains materials of the talks presented at the International Conference "Systems Analysis: Modeling and Control" in memory of Academician A.V. Kryazhimskiy, Moscow, May 31 - June 1, 2018.
High performance querying and ad-hoc querying are commonly viewed as mutually exclusive goals in massively parallel processing databases. At one extreme, a database can be set up to provide the results of a single known query so that the use of available resources is maximized and response time minimized, but at the cost of all other queries being suboptimally executed. At the other extreme, when no query is known in advance, the database must provide the information without such optimization, normally resulting in inefficient execution of all queries. This paper introduces a novel technique, highly normalized Big Data using Anchor modeling, that provides a very efficient way to store information and utilize resources, thereby providing ad-hoc querying with high performance for the first time in massively parallel processing databases. A case study of how this approach is used for a Data Warehouse at Avito over two years' time, with estimates for and results of real data experiments carried out in HP Vertica, an MPP RDBMS, is also presented.
Ways to improve the sensitivity and signal-to-noise ratio of primary converters each have their own fundamental and practical limitations. To determine the optimal frequency of electromechanical conversion and to ensure the specified accuracy and stability of metrological characteristics, it is proposed to use the method of probabilistic stability studies of the converter's output characteristic. This method provides the most comprehensive account of the random nature of structural deviations and electrical parameters under the influence of destabilizing factors in serial production. The underlying method-of-moments technique allows the necessary accuracy to be obtained with a small amount of calculation.
The paper discloses a method for estimating the deformation of fibroreticulate materials under spatial multiaxial cyclic tension. The relevance of applying the method to estimating the reliability performance of materials used for upholstering and finishing the interior of aircraft cabins has been justified. The equations of the elastic state of flexible fibroreticulate materials under spatial tension are obtained in terms of stresses and deformations. The geometric parameters of elastic deformations in the material are determined for the design, processing treatment (molding process), and operation (shape stability) of the products. The advantage of the developed method of spatial cyclic tension of flexible fibroreticulate materials is the ability to model test conditions that simulate operating conditions. This capability of the test method makes it possible to examine the dynamics of changes in the markers of the molding properties and shape stability of materials and to predict their behavior during operation.
High performance querying and ad-hoc querying are commonly viewed as mutually exclusive goals in massively parallel processing databases. Furthermore, there is a contradiction between ease of extending the data model and ease of analysis. The modern 'Data Lake' approach promises extreme ease of adding new data to a data model; however, it is prone to eventually becoming a Data Swamp: an unstructured, ungoverned, out-of-control Data Lake where, due to a lack of process, standards, and governance, data is hard to find, hard to use, and is consumed out of context. This paper introduces a novel technique, highly normalized Big Data using Anchor modeling, that provides a very efficient way to store information and utilize resources, thereby providing ad-hoc querying with high performance for the first time in massively parallel processing databases. This technique is almost as convenient for expanding the data model as a Data Lake, while being internally protected from turning into a Data Swamp. A case study of how this approach is used for a Data Warehouse at Avito over a three-year period, with estimates for and results of real data experiments carried out in HP Vertica, an MPP RDBMS, is also presented. This paper is an extension of theses from the 34th International Conference on Conceptual Modeling (ER 2015) (Golov and Rönnbäck 2015); it is complemented with numerical results on key operating areas of a highly normalized big data warehouse, collected over several (1-3) years of commercial operation. The limitations imposed by using a single MPP database cluster are also described, and a cluster fragmentation approach is proposed.
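The "highly normalized" storage that Anchor modeling prescribes can be sketched in miniature: each entity is reduced to an anchor (surrogate identities only) plus one narrow table per attribute, so a new attribute is a new table rather than an ALTER of existing ones. The table and attribute names below are illustrative assumptions, not schema from the Avito case study.

```python
# Minimal sketch of Anchor modeling's decomposition (illustrative names):
# an "item" entity split into an anchor and per-attribute history tables.

# Anchor: surrogate identities only, nothing else.
anchor_item = [1, 2]

# Attribute tables: (anchor_id, value, recorded_at).
# History is kept by appending rows, never updating in place.
attr_item_title = [
    (1, "Bicycle", "2015-01-10"),
    (2, "Sofa", "2015-02-03"),
    (1, "Mountain bicycle", "2015-03-01"),  # newer version; old row retained
]
attr_item_price = [
    (1, 120, "2015-01-10"),
    (2, 300, "2015-02-03"),
]

def latest(attribute, anchor_id):
    """Return the most recent value of one attribute for one anchor."""
    rows = [r for r in attribute if r[0] == anchor_id]
    return max(rows, key=lambda r: r[2])[1] if rows else None

print(latest(attr_item_title, 1))  # prints Mountain bicycle
```

In an MPP database such as Vertica, each of these narrow tables can be segmented across nodes independently, which is what lets ad-hoc queries touch only the attributes they actually need.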