Towards automation of data quality system for CERN CMS experiment
Daily operation of a large-scale experiment is a challenging task, particularly from the perspective of routine monitoring of the quality of the data being taken. We describe an approach that uses Machine Learning to automate the data quality monitoring system, based on partial use of data qualified manually by detector experts. The system automatically classifies the unambiguous cases, both good and bad data, and defers the remaining "grey area" cases to human expert decision. This study uses collision data collected by the CMS experiment at the LHC in 2010. We demonstrate that the proposed workflow is able to automatically process at least 20% of samples without noticeable degradation of the result.
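The triage described above can be sketched with two probability thresholds: samples the classifier is confident about are labelled automatically, and everything in between is deferred to experts. The features, model, and threshold values below are illustrative assumptions, not the actual CMS monitoring quantities.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
# synthetic stand-in for per-sample monitoring features (hypothetical)
X_good = rng.normal(0.0, 1.0, size=(200, 5))
X_bad = rng.normal(2.0, 1.0, size=(200, 5))
X = np.vstack([X_good, X_bad])
y = np.array([1] * 200 + [0] * 200)  # 1 = good data, 0 = bad data

clf = LogisticRegression().fit(X, y)
proba = clf.predict_proba(X)[:, 1]

# two thresholds delimit the automatically handled cases;
# samples in between form the "grey area" left to detector experts
t_bad, t_good = 0.2, 0.8
auto_good = proba >= t_good
auto_bad = proba <= t_bad
grey = ~(auto_good | auto_bad)

auto_fraction = (auto_good.sum() + auto_bad.sum()) / len(y)
print(f"auto-classified: {auto_fraction:.0%}, deferred to experts: {grey.mean():.0%}")
```

Tightening the thresholds raises the purity of the automatic decisions at the cost of sending a larger grey area to the experts.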
This book constitutes the refereed proceedings of the 10th International Conference on Formal Concept Analysis, ICFCA 2012, held in Leuven, Belgium in May 2012. The 20 revised full papers presented together with 6 invited talks were carefully reviewed and selected from 68 submissions. The topics covered in this volume range from recent advances in machine learning and data mining; mining terrorist networks and revealing criminals; concept-based process mining; to scalability issues in FCA and rough sets.
This book constitutes the second part of the refereed proceedings of the 10th International Conference on Formal Concept Analysis, ICFCA 2012, held in Leuven, Belgium in May 2012. The topics covered in this volume range from recent advances in machine learning and data mining; mining terrorist networks and revealing criminals; concept-based process mining; to scalability issues in FCA and rough sets.
Pattern structures, an extension of FCA to data with complex descriptions, offer an alternative to conceptual scaling (binarization) by giving a direct way to knowledge discovery in complex data such as logical formulas, graphs, strings, tuples of numerical intervals, etc. Whereas the approach to classification with pattern structures based on the prior generation of classifiers can lead to doubly exponential complexity, the combination of lazy evaluation with projection approximations of the initial data, randomization, and parallelization reduces the algorithmic complexity to a low-degree polynomial, and is thus feasible for big data.
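For tuples of numerical intervals, the key operation of a pattern structure is the similarity (meet) of two descriptions, the componentwise convex hull, which lazy classification applies pairwise without ever enumerating all classifiers. The following is a minimal sketch of that operation; the example descriptions are made up for illustration.

```python
from typing import List, Tuple

Interval = Tuple[float, float]

def meet(a: List[Interval], b: List[Interval]) -> List[Interval]:
    # similarity of two interval-tuple descriptions:
    # componentwise convex hull of the paired intervals
    return [(min(x1, y1), max(x2, y2)) for (x1, x2), (y1, y2) in zip(a, b)]

def subsumes(general: List[Interval], specific: List[Interval]) -> bool:
    # general ⊑ specific iff every interval of `general`
    # contains the corresponding interval of `specific`
    return all(g1 <= s1 and s2 <= g2
               for (g1, g2), (s1, s2) in zip(general, specific))

d1 = [(1.0, 2.0), (5.0, 5.0)]
d2 = [(1.5, 3.0), (4.0, 6.0)]
m = meet(d1, d2)
print(m)  # [(1.0, 3.0), (4.0, 6.0)]
```

A projection approximation simply coarsens the intervals (e.g. rounding the bounds to a grid) before taking meets, which shrinks the pattern lattice that lazy evaluation has to explore.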
This is a textbook in data analysis. Its contents are heavily influenced by the idea that data analysis should help in enhancing and augmenting knowledge of the domain as represented by the concepts and statements of relation between them. According to this view, two main pathways for data analysis are summarization, for developing and augmenting concepts, and correlation, for enhancing and establishing relations. Visualization, in this context, is a way of presenting results in a cognitively comfortable way. The term summarization is understood quite broadly here to embrace not only simple summaries like totals and means, but also more complex summaries such as the principal components of a set of features or cluster structures in a set of entities.
The material presented from this perspective constitutes a unique mix of subjects from the fields of statistical data analysis, data mining, and computational intelligence, which otherwise follow different systems of presentation.
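Summarization in the broad sense described above includes replacing a set of correlated features with their principal components. A minimal sketch, on synthetic data chosen only for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
# hypothetical data: 100 entities described by 3 strongly correlated features
base = rng.normal(size=(100, 1))
X = np.hstack([base + 0.1 * rng.normal(size=(100, 1)) for _ in range(3)])

# summarization: the first principal component as a one-dimensional
# summary of all three features
Xc = X - X.mean(axis=0)
_, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / (s**2).sum()
print(f"share of variance summarized by PC1: {explained[0]:.2f}")
```

Here one derived concept (the first component) captures almost all the variation, which is exactly the sense in which summarization augments the concepts describing the domain.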
Formal Concept Analysis Research Toolbox (FCART) is an integrated environment for knowledge and data engineers with a set of research tools based on Formal Concept Analysis. FCART allows a user to load structured and unstructured data (including texts with various metadata) from heterogeneous data sources into local data storage, compose scaling queries for data snapshots, and then research classical and some innovative FCA artifacts in analytic sessions.
In 2015-2016 the Department of Communication, Media and Design of the National Research University “Higher School of Economics”, in collaboration with the non-profit organization ROCIT, conducted research aimed at constructing the Index of Digital Literacy in Russian Regions. This research was pioneering and remains unmatched at the moment.
The article is dedicated to the analysis of the prospects of Big Data in jurisprudence. It is argued that Big Data has to be used as an explanatory and predictive tool. The author describes issues concerning the application of Big Data in legal research. The problems are technical (data access, technical imperfections, data verification) and informative (interpretation of data and correlations). It is concluded that Big Data investigations need to be extended, taking the above-mentioned limits into account.
Event logs collected by modern information and technical systems usually contain enough data for automated process model discovery. A variety of algorithms has been developed for process model discovery, conformance checking, log-to-model alignment, comparison of process models, etc.; nevertheless, quick analysis of ad-hoc selected parts of a log still lacks a full-fledged implementation. This paper describes a ROLAP-based method of multidimensional event log storage for process mining. The result of the log analysis is visualized as a directed graph representing the union of all possible event sequences, ranked by their occurrence probability. Our implementation allows the analyst to discover process models for sublogs defined by an ad-hoc selection of criteria and a threshold on occurrence probability.
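The graph described above can be sketched as a directly-follows graph whose edges carry occurrence probabilities, with a probability cut-off selecting the displayed model. The toy log and the cut-off value below are illustrative assumptions, not the paper's data or storage scheme.

```python
from collections import Counter, defaultdict

# hypothetical event log: one trace of activities per case
log = [
    ["register", "check", "approve"],
    ["register", "check", "reject"],
    ["register", "check", "approve"],
    ["register", "approve"],
]

# count directly-follows pairs over all traces
edges = Counter()
for trace in log:
    for a, b in zip(trace, trace[1:]):
        edges[(a, b)] += 1

out_totals = defaultdict(int)
for (a, _), n in edges.items():
    out_totals[a] += n

# rank each edge by its occurrence probability; a cut-off on this
# probability filters rare behaviour out of the discovered model
prob = {e: n / out_totals[e[0]] for e, n in edges.items()}
cutoff = 0.3
model = {e: p for e, p in prob.items() if p >= cutoff}
for (a, b), p in sorted(model.items()):
    print(f"{a} -> {b}: {p:.2f}")
```

In the ROLAP setting, the same counting would be expressed as an aggregation query over the selected slice of the multidimensional log, so the analyst can rebuild the graph for any ad-hoc sublog.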
The Handbook of CO₂ in Power Systems' objective is to include the state-of-the-art developments that occurred in power systems taking CO₂ emission into account. The book includes power systems operation modeling with CO₂ emissions considerations, CO₂ market mechanism modeling, CO₂ regulation policy modeling, carbon price forecasting, and carbon capture modeling. For each of the subjects, at least one article authored by a world specialist on the specific domain is included.
By using superconducting quantum interference device (SQUID) magnetometry, we investigated the anisotropic high-field (H ≲ 7 T) low-temperature (10 K) magnetization response of inhomogeneous nanoisland FeNi films grown by rf sputtering deposition on Sitall (TiO2) glass substrates. In the grown FeNi films, the FeNi layer nominal thickness varied from 0.6 to 2.5 nm, across the percolation transition at d_c ≃ 1.8 nm. We discovered that, beyond the conventional spin magnetism of Fe21Ni79 permalloy, the extracted out-of-plane magnetization response of the nanoisland FeNi films is not saturated in the range of investigated magnetic fields and exhibits paramagnetic-like behavior. We found that the anomalous out-of-plane magnetization response exhibits an escalating slope with an increase in the nominal film thickness from 0.6 to 1.1 nm; however, it decreases with a further increase in the film thickness, and then practically vanishes on approaching the FeNi film percolation threshold. At the same time, the in-plane response demonstrates saturation behavior above 1.5-2 T, competing with an anomalously large diamagnetic-like response, which becomes pronounced at high magnetic fields. It is possible that the metal-support interaction leads to the creation of a thin charge-transfer (CT) layer and a Schottky barrier at the FeNi film/Sitall (TiO2) interface. Then, in the system with nanoscale circular domains, the observed anomalous paramagnetic-like magnetization response can be associated with a large orbital moment of the localized electrons. In addition, the inhomogeneous nanoisland FeNi films can possess spontaneous ordering of toroidal moments, which can be either of orbital or spin origin. The system with toroidal inhomogeneity can lead to an anomalously strong diamagnetic-like response. The observed magnetization response is determined by the interplay between the paramagnetic- and diamagnetic-like contributions.
Many electronic devices operate in a cyclic mode. This should be considered when forecasting reliability indicators at the design stage. The accuracy of the prediction, and the planning of measures to ensure reliability, depend on the correct evaluation and accounting of the greatest possible number of factors. This, in turn, affects the overall progress of the design and, in the end, the quality and competitiveness of the product.
Let G be a semisimple algebraic group whose decomposition into the product of simple components does not contain simple groups of type A, and let P ⊆ G be a parabolic subgroup. Extending the results of Popov, we enumerate all triples (G, P, n) such that (a) there exists an open G-orbit on the multiple flag variety G/P × G/P × ⋯ × G/P (n factors), and (b) the number of G-orbits on the multiple flag variety is finite.
I give an explicit formula for the (set-theoretic) system of resultants of m+1 homogeneous polynomials in n+1 variables.