This book constitutes the proceedings of the 16th International Conference on Formal Concept Analysis, ICFCA 2021, held in Strasbourg, France, in June/July 2021.
The 14 full papers and 5 short papers presented in this volume were carefully reviewed and selected from 32 submissions. The book also contains four invited contributions in full paper length.
The research part of this volume is divided into five sections. First, "Theory" collects works that discuss advances on theoretical aspects of FCA. Second, the section "Rules" consists of contributions devoted to implications and association rules. The third section, "Methods and Applications", comprises results concerned with new algorithms and their applications. "Exploration and Visualization" introduces different approaches to data exploration.
This book constitutes the proceedings of the 20th International Conference on Mathematical Optimization Theory and Operations Research, MOTOR 2021, held in Irkutsk, Russia, in July 2021.
The 29 full papers and 1 short paper presented in this volume were carefully reviewed and selected from 102 submissions. Additionally, 2 full invited papers are presented in the volume. The papers are grouped into the following topical sections: combinatorial optimization; mathematical programming; bilevel optimization; scheduling problems; game theory and optimal control; operational research and mathematical economics; data analysis.
The book begins with a discussion of what a performant system is and progresses to measuring performance and setting performance goals. It introduces different classes of queries and optimization techniques suitable to each, such as the use of indexes and specific join algorithms. You will learn to read and understand query execution plans along with techniques for influencing those plans for better performance. The book also covers advanced topics such as the use of functions and procedures, dynamic SQL, and generated queries. All of these techniques are then used together to produce performant applications, avoiding the pitfalls of object-relational mappers.
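The plan-reading workflow described above can be illustrated with a minimal, hypothetical sketch using SQLite from Python's standard library (the book's target database and schema are not specified here; the table, column, and index names below are invented for the example):

```python
import sqlite3

# Hypothetical example: inspect an execution plan before and after
# adding an index. Table and column names are invented for illustration.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)"
)

def plan(sql):
    """Return the plan-detail strings produced by EXPLAIN QUERY PLAN."""
    return [row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql)]

query = "SELECT total FROM orders WHERE customer_id = 42"
before = plan(query)   # without an index: typically a full table scan

conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
after = plan(query)    # now the plan searches via the index
```

Comparing the two plans side by side shows the optimizer switching from a scan to an index search, which is the kind of before/after comparison the book teaches for real workloads.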
The materials of the International Scientific and Practical Conference are presented below. The conference reflects the current state of innovation in education, science, industry, and the socio-economic sphere from the standpoint of introducing new information technologies. It will be of interest to a wide range of researchers, teachers, graduate students, and professionals in the field of innovation and information technologies.
This special conference starts a new series and thus launches a tradition to follow, as well as an opportunity for rapid growth. The ICCQ conference was organized by the HSE and leading innovative IT companies such as Huawei and Yandex. ICCQ 2021 attracted a number of renowned experts, including Jens Palsberg, Anders Møller, and David West. Papers were submitted from all over the world, and the conference drew speakers and attendees from the USA, Europe, and Asia, making it a truly international event. Although ICCQ started as a relatively small single-day conference, it immediately gained IEEE support. The plan for the coming years is to expand worldwide while keeping high quality standards.
We discuss the applicability of the multiphase lattice Boltzmann method to the simulation of drop oscillation. We demonstrate that the simulation of a single drop excited in its first eigenmode does follow the Rayleigh formula. Simulations show no sensitivity to the number of discrete velocities for the D3Q19 and D3Q27 representations of the distribution function in three dimensions. The boundaries do influence the motion of the drop: whether the computational domain is divided into an even or an odd number of cells turns out to be important and can lead to symmetry violation. The second part of the chapter describes the oscillations of an ensemble of three drops caused by exciting the central drop in its first eigenmode. The motion of the outer drops strongly depends on the viscosity of the fluid. We provide further details of the simulations.
The relevance of studying the regulation of protein-ligand interactions stems from new views on the role of metabolites and their key importance in vital processes. To study protein-ligand interaction, the AB0 antigen-antibody blood system and the enzyme-substrate system of dehydrogenases were used as test systems, with ethanol as the influencing factor. In experiments performed with A and B erythrocyte blood antigens, natural AB0 system antibodies and monoclonal antibodies under the influence of ethanol showed changes in the degree of agglutination and in the time to onset of erythrocyte agglutination. It was found that ethanol can regulate the enzyme-substrate interactions of dehydrogenases: lactate dehydrogenase (EC 1.1.1.27), glyceraldehyde phosphate dehydrogenase (EC 1.2.1.12), and α-glycerol phosphate dehydrogenase (EC 1.1.1.8). The increase in the activity of the studied enzymes under the influence of ethanol was 2.5-3 times higher in whole blood hemolysate than in the isolated medium (with pure enzyme preparations).
This textbook on political geography is devoted to a discipline concerned with the spatial dimensions of politics. The course is an introduction to the study of political science, international relations, and area studies, providing a systemic approach to the spatial dimension of political processes at all levels. It covers their basic elements, including states, supranational unions, geopolitical systems, regions, borders, capitals, and dependent and internationally administered territories. Political geography develops fundamental theoretical approaches that give insight into the peculiarities of foreign and domestic policies. The ability to use spatial analysis techniques makes it possible to determine patterns and regularities of political phenomena at the global, regional, and local levels.
This book focuses on crisis management in software development, including forecasting, responding, and adaptive engineering models, methods, patterns, and practices. It helps stakeholders understand and identify the key technology, business, and human factors that may result in a software production crisis. These factors are particularly important for enterprise-scale applications, which are typically considered very complex in managerial and technological aspects and are therefore specifically addressed by the discipline of software engineering. The book sheds light on crisis-responsive, resilient methodologies and practices, and also examines their evolutionary changes and the resulting benefits.
The International Conference "Linguistic Forum 2020: Language and Artificial Intelligence" took place on November 12-14, 2020, in Moscow, Russia. The conference was organized by the Institute of Linguistics, Russian Academy of Sciences, and is part of a series of annual forums initiated by the Institute in 2019. The aim of the 2020 forum was to foster dialogue among researchers working at the interface of linguistics and artificial intelligence, including those engaged in computational linguistics and natural language processing. Developments in AI have been responsible for recent advances in natural language generation and comprehension; they have also expanded the boundaries of these technologies' applicability. Neural networks and dense embeddings have replaced models based on feature engineering and traditional discrete categories of linguistic analysis. As a result, the boundary between fundamental and applied linguistic research is being eroded. Empirical linguistics is taking on board these new technologies, in part, to enable better modelling of language and documentation of data. AI is also increasingly becoming a part of the everyday life of language users. Can fundamental linguistics currently offer technologically viable ideas or methods? These and similar conceptual and methodological problems were the focus of the forum.
In this paper, a statistical game is defined and solved. Its solution consists of the optimal randomized decision rule, the probability of a correct decision under this rule, and the worst-case a priori distribution of the test subjects' knowledge levels. We have developed a method for assessing the accuracy and reliability of decision making based on test results. The proposed program makes it possible to assess the reliability of the solution for a test containing 10 items of different difficulty and 11 different knowledge levels.
The present paper proposes a model for evaluating geo-ecological protection technologies based on multi-criteria optimization and a weighted convolution of criteria. On this basis, a calculation method is developed that makes it possible to determine the PQ factor for different objects according to the selected technologies using the Matlab environment. The work demonstrates the application of the technique to materials made of ash foam concrete with different densities and ash content from the incineration of sewage sludge. Determining the optimum composition of ash foam concrete is relevant for the design of noise shields in railway transport. The proposed simulation algorithm in the Matlab environment supports processing of the raw data with several input options: tables in .csv format or manual input.
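As a rough illustration of the weighted-convolution idea (not the paper's actual PQ-factor model; the criteria, weights, and values below are invented assumptions), alternatives can be ranked by a weighted sum of min-max normalized criteria:

```python
# Hypothetical sketch of a weighted-sum ("weighted convolution") score
# over several criteria. All numbers are illustrative, not from the paper.

def weighted_convolution(alternatives, weights, maximize):
    """Score each alternative by a weighted sum of min-max normalized
    criteria; criteria flagged True in `maximize` are better when larger."""
    scores = []
    for alt in alternatives:
        s = 0.0
        for j, w in enumerate(weights):
            col = [a[j] for a in alternatives]
            lo, hi = min(col), max(col)
            norm = (alt[j] - lo) / (hi - lo) if hi > lo else 0.0
            if not maximize[j]:
                norm = 1.0 - norm  # for cost-like criteria, smaller is better
            s += w * norm
        scores.append(s)
    return scores

# Three hypothetical material variants; criteria: (cost, noise reduction)
alts = [(100.0, 20.0), (120.0, 30.0), (150.0, 32.0)]
scores = weighted_convolution(alts, weights=[0.4, 0.6], maximize=[False, True])
best = alts[scores.index(max(scores))]
```

Here the middle variant wins because it balances low cost against good noise reduction; changing the weights shifts the trade-off, which is the point of the multi-criteria formulation.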
This is a companion book to Asymptotic Analysis of Random Walks: Heavy-Tailed Distributions by A.A. Borovkov and K.A. Borovkov. Its self-contained systematic exposition provides a highly useful resource for academic researchers and professionals interested in applications of probability in statistics, ruin theory, and queuing theory. The large deviation principle for random walks was first established by the author in 1967, under the restrictive condition that the distribution tails decay faster than exponentially. (A close assertion was proved by S.R.S. Varadhan in 1966, but only in a rather special case.) Since then, the principle has always been treated in the literature only under this condition. Recently, the author jointly with A.A. Mogul'skii removed this restriction, finding a natural metric for which the large deviation principle for random walks holds without any conditions. This new version is presented in the book, as well as a new approach to studying large deviations in boundary crossing problems. Many results presented in the book, obtained by the author himself or jointly with co-authors, are appearing in a monograph for the first time.
The volume includes a special section on "Analytical and Computational Methods in Probability".
The materials of the 5th International Conference on Stochastic Methods are presented, covering the following directions: probability and statistics (analytic modelling, asymptotic methods and limit theorems, stochastic analysis, Markov processes and martingales, actuarial and financial mathematics, etc.); applications of stochastic methods (queueing theory and stochastic networks, reliability theory and risk analysis, probability in industry, economics and other areas, computer science and computer networks, machine learning and data analysis, etc.).
We construct a mirabolic analogue of the geometric Satake equivalence. We also prove an equivalence that relates representations of a supergroup to the category of GL(N − 1, C[[t]])-equivariant perverse sheaves on the affine Grassmannian of GL(N). We explain how our equivalences fit into a more general framework of conjectures due to Gaiotto and to Ben-Zvi, Sakellaridis and Venkatesh.
MiRNA isoforms (isomiRs) are single-stranded small RNAs originating from the same pri-miRNA hairpin as a result of cleavage by the Drosha and Dicer enzymes. Variations at the 5ʹ-end of a miRNA alter the seed region of the molecule, thus affecting the targetome of the miRNA. In this manuscript, we analysed the distribution of miRNA cleavage positions across 31 different cancers using miRNA sequencing data from the TCGA project. We found that the processing positions are not tissue-specific and that all miRNAs can be correctly classified as exhibiting homogeneous or heterogeneous cleavage at one of the four cleavage sites. In 42% of cases (42 out of 100 miRNAs), we observed imprecise 5ʹ-end Dicer cleavage, while this fraction was only 14% for Drosha (14 out of 99). On the contrary, almost all 3ʹ-end cleavage sites (for either Drosha or Dicer) were heterogeneous. Using only the four nucleotides surrounding a 5ʹ-end Dicer cleavage position, we built a model that distinguishes between homogeneous and heterogeneous cleavage with reliable quality (ROC AUC = 0.68). Finally, we showed a possible application of the study by analysing two 5ʹ-end isoforms originating from the same exogenous shRNA hairpin. It turned out that the less expressed shRNA variant was functionally active, which led to increased off-targeting. Thus, the obtained results can be applied to the design of shRNAs whose processing results in a single 5ʹ-variant.
There are two different modal logics: the logic T assuming contingency and the logic K = assuming logical determinism. In the paper, I show that the Aristotelian treatise On Interpretation (Περί ερμηνείας, De Interpretatione) has introduced some modal-logical relationships which correspond to T. In this logic, it is supposed that there are contingent events. The Nāgārjunian treatise Īśvara-kartṛtva-nirākṛtiḥ-viṣṇoḥ-ekakartṛtva-nirākaraṇa has introduced some modal-logical relationships which correspond to K =. In this logic, it is supposed that there is a logical determinism: each event happens necessarily (siddha) or it does not happen necessarily (asiddha). The Nāgārjunian approach was inherited by the Yogācārins who developed, first, the doctrine of causality of all real entities (arthakriyātva) and, second, the doctrine of momentariness of all real entities (kṣaṇikavāda). Both doctrines were a philosophical ground of the Yogācārins for the logical determinism. Hence, Aristotle implicitly used the logic T in his modal reasoning. The Madhyamaka and Yogācāra schools implicitly used the logic K = in their modal reasoning.
The goal of the paper is to develop a new algorithm for predicting whether a company will go bankrupt on the basis of unbalanced data. To do this, we propose to treat the classification as a multi-objective optimization problem and construct the prediction model as an ensemble while simultaneously minimizing the FPR (False Positive Rate) and FNR (False Negative Rate). To create the ensemble, the proposed Multi-Objective Classifier Selection (MOCS) algorithm selects only classifiers that belong to the Pareto-optimal set in FPR/FNR space; that is, there is no dominance between them, and they satisfy some additional conditions. In the general case, MOCS is determined by three parameters: two threshold values that limit the false rates (FNR and FPR), and the crowding distance, which defines the uniqueness of a classifier's results. We tested the proposed algorithm on data collected from 2457 Russian companies, 456 of which went bankrupt, and 5910 Polish companies, 410 of which received bankruptcy status. The datasets contain features such as financial ratios and business environment factors. In testing, we used more than 70 combinations of under-sampling, over-sampling, and no-sampling methods with static and dynamic classification models. The final ensembles include seven classifiers for the Russian dataset and four classifiers for the Polish dataset, combined by a soft voting rule. In both cases, the proposed algorithm yields a significant improvement in prediction results, both in terms of standard metrics (geometric mean, area under the ROC curve) and in the visual representation in FNR/FPR space, namely a shift of the Pareto-optimal set of classifiers.
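The non-dominance filtering at the core of such a selection can be sketched as follows; this is a minimal illustration of Pareto-optimal selection in FPR/FNR space with threshold parameters, not the authors' MOCS implementation, and it omits the crowding-distance condition (the candidate rates and thresholds are invented):

```python
# Minimal sketch of Pareto-optimal selection of classifiers by their
# (FPR, FNR) pairs. Thresholds and candidate values are illustrative.

def is_dominated(a, b):
    """True if classifier b dominates a: b is no worse on both false
    rates and strictly better on at least one."""
    return (b[0] <= a[0] and b[1] <= a[1]) and (b[0] < a[0] or b[1] < a[1])

def pareto_select(classifiers, max_fpr=0.5, max_fnr=0.5):
    """Keep classifiers within the false-rate thresholds that no other
    feasible candidate dominates. Each classifier is an (fpr, fnr) pair."""
    feasible = [c for c in classifiers if c[0] <= max_fpr and c[1] <= max_fnr]
    return [a for a in feasible
            if not any(is_dominated(a, b) for b in feasible if b != a)]

candidates = [(0.10, 0.40), (0.20, 0.20), (0.40, 0.10), (0.30, 0.30), (0.60, 0.05)]
selected = pareto_select(candidates)
```

In this toy run, (0.60, 0.05) is rejected by the FPR threshold and (0.30, 0.30) is dominated by (0.20, 0.20); the three survivors form the Pareto front from which an ensemble would be built.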
Recent statistics report that more than 3.7 million new cases of cancer occur in Europe yearly, and the disease accounts for approximately 20% of all deaths. High-throughput screening of cancer cell cultures has dominated the search for novel, effective anticancer therapies in the past decades. Recently, functional assays with patient-derived ex vivo 3D cell cultures have gained importance for drug discovery and precision medicine. We recently evaluated the major advancements and needs of 3D cell culture screening and concluded that strictly standardized and robust sample preparation is the most desired development. Here we propose an artificial intelligence-guided low-cost 3D cell culture delivery system. It consists of a light microscope, a micromanipulator, a syringe pump, and a controller computer. The system performs morphology-based feature analysis on spheroids and can select uniformly sized or shaped spheroids to transfer them between various sample holders. It can pick samples from standard sample holders, including Petri dishes and microwell plates, and then transfer them to a variety of holders up to 384-well plates. The device performs reliable semi- and fully automated spheroid transfer. This results in highly controlled experimental conditions and eliminates non-trivial side effects of sample variability, which is a key aspect of next-generation precision medicine.
Kinematic dynamo in incompressible isotropic turbulent flows with high magnetic Prandtl number is considered. The approach interpreting an arbitrary magnetic field distribution as a superposition of localized perturbations (blobs) is developed. We derive a general relation between the stochastic properties of an isolated blob and a stochastically homogeneous distribution of magnetic field advected by the same stochastic flow. This relation allows us to investigate the evolution of a localized blob at a late stage, when its size exceeds the viscous scale. It is shown that in three-dimensional flows, the average magnetic field of the blob increases exponentially in the inertial range of turbulence, as opposed to the late Batchelor stage, when it decreases. Our approach reveals the mechanism of dynamo generation in the inertial range both for blobs and for homogeneous contributions. It explains the absence of dynamo in the two-dimensional case and its efficiency in three dimensions. We propose a way to observe this mechanism in numerical simulations.
We consider a linear-quadratic control problem where a time parameter evolves according to a stochastic time scale. The stochastic time scale is defined via a stochastic process with continuously differentiable paths. We obtain an optimal infinite-time control law under criteria similar to the long-run averages. Some examples of stochastic time scales from various applications have been examined.
We consider the nanoscale electronic phase separation in a wide class of different materials, mostly in strongly correlated electron systems. The phase separation turns out to be quite ubiquitous, manifesting itself in different situations where the itineracy of charge carriers competes with their tendency toward localization. The latter is often related to some specific type of magnetic ordering, e.g., antiferromagnetic in manganites and low-spin states in cobaltites. The interplay between the localization-induced lowering of potential energy and metallicity (which provides the gain in kinetic energy) favors an inhomogeneous ground state such as nanoscale ferromagnetic droplets in an antiferromagnetic insulating background. The present review article deals with advances in the subject of electronic phase separation and the formation of different types of nanoscale ferromagnetic (FM) metallic droplets (FM polarons or ferrons) in antiferromagnetically ordered (AFM), charge-ordered (CO), or orbitally-ordered (OO) insulating matrices, as well as the colossal magnetoresistance (CMR) effect and tunneling electron transport in the nonmetallic phase-separated state of complex magnetic oxides. It also touches upon compounds with spin-state transitions, the inhomogeneous phase-separated state in strongly correlated multiband systems, and the electron polaron effect. Special attention is paid to systems with imperfect Fermi surface nesting, such as chromium alloys, iron-based pnictides, and AA-stacked graphene bilayers.
Solid-state reaction of CaHPO4 with CaCO3 in >2:1 ratio at 1350 °C resulted in α-tricalcium phosphate (α-TCP) formation; subsequent sintering at 850 °C produced a homogeneous β-TCP phase that does not contain crystalline impurities.