Big Data Refining on the Base of Cognitive Modeling
Under rapid external change, the demands on the quality of control over the purposeful development of complex systems (states, regions, corporations, etc.) increase dramatically. Automating the key stages of the decision-making process is one way to cope with these challenges. This paper presents an approach based on Big Data refining during cognitive modeling, which supports the correctness of both modeling and decision making. The approach uses requests to Big Data to verify the components of the cognitive model; the requests are created by intelligent agents with feedback from decision makers. Practical results confirm the adequacy of the proposed approach.
The article describes the difficulty of analyzing large numbers of innovation projects and ways of solving this problem. It points out the disadvantages of existing information-retrieval systems and the advantages of multi-agent systems, as well as the classifications, tasks, and mechanisms of intelligent agents.
Pattern structures, an extension of FCA to data with complex descriptions, offer an alternative to conceptual scaling (binarization) by giving a direct way to discover knowledge in complex data such as logical formulas, graphs, strings, tuples of numerical intervals, etc. Whereas classification with pattern structures based on prior generation of classifiers can lead to doubly exponential complexity, combining lazy evaluation with projection approximations of the initial data, randomization, and parallelization reduces the algorithmic complexity to a low-degree polynomial, making the approach feasible for big data.
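The lazy-evaluation idea can be sketched for one common kind of pattern structure, tuples of numerical intervals: instead of generating all classifiers in advance, the meet of the test object with each positive example is computed on demand and kept only if no negative example falsifies it. This is a minimal sketch under those assumptions; the function names and the positive/negative setup are illustrative, not taken from the paper.

```python
# Sketch: lazy classification with interval pattern structures.
# An object is a tuple of numbers; its pattern is a tuple of degenerate
# intervals. The meet (similarity) of two patterns is the component-wise
# convex hull; a more general pattern has wider intervals.

def as_pattern(x):
    """Turn a numeric tuple into a tuple of degenerate intervals."""
    return tuple((v, v) for v in x)

def meet(p, q):
    """Component-wise convex hull of two interval tuples."""
    return tuple((min(a_lo, b_lo), max(a_hi, b_hi))
                 for (a_lo, a_hi), (b_lo, b_hi) in zip(p, q))

def subsumes(general, specific):
    """True if every interval of `specific` lies inside `general`."""
    return all(g_lo <= s_lo and s_hi <= g_hi
               for (g_lo, g_hi), (s_lo, s_hi) in zip(general, specific))

def lazy_classify(x, positives, negatives):
    """Classify x as positive if the meet with some positive example
    is not falsified by (does not cover) any negative example."""
    target = as_pattern(x)
    for pos in positives:
        hypothesis = meet(target, as_pattern(pos))
        if not any(subsumes(hypothesis, as_pattern(neg)) for neg in negatives):
            return True
    return False
```

Each test object costs one pass over the training set instead of an exponential enumeration of classifiers, which is the source of the complexity reduction the abstract refers to; projections and randomization would further shrink the patterns and the training sample, respectively.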
The proceedings of the 11th International Conference on Service-Oriented Computing (ICSOC 2013), held in Berlin, Germany, December 2–5, 2013, contain high-quality research papers that represent the latest results, ideas, and positions in the field of service-oriented computing. Since the first meeting more than ten years ago, ICSOC has grown to become the premier international forum for academics, industry researchers, and practitioners to share, report, and discuss their ground-breaking work. ICSOC 2013 continued this tradition, focusing in particular on emerging trends at the intersection of service-oriented computing, cloud computing, and big data.
The paper focuses on an explicit cognitive model of the translation process, stressing its didactic potential, which makes it possible to apply the model empirically to the professional activity of a translator by following the steps claimed in the research. The detailed introspective scheme of a translator's conscious mental acts has been designed with the creative part of his/her professional work in mind. The first stage is devoted to shaping the background of the cognitive translation process, which comprises a complex of necessary multifaceted knowledge of linguistic, meta-linguistic, and extra-linguistic kinds. The second stage is aimed at combining the elements of that accumulated initial knowledge base, where comparison and selection lead to the search for a mental program image of all possible translation variants. The third stage marks the final choice and the making of the most adequate translation decision. The novelty of the presented research lies in its attempt to describe the intuitive component that characterizes any individual translation decision-making process. With this in view, the author starts by critically analyzing both foreign and domestic approaches to creativity and intuition in terms of philosophy and creative psychology. Their related achievements are then compared to those of world translation theory. On that basis, the paper suggests the productivity of applying semiotic and interpretive methodology to explaining the cognitive mechanisms of understanding the original and subsequently producing the translation. The interdisciplinary research methodology includes logical meditation, analytical modeling, cognitive and comparative analyses, the synthesis of learned information with its further critical evaluation, reflexive thinking, and deductive/inductive reasoning.
The paper comprises a review of the relevant academic literature, the theoretical and methodological framework, the results obtained, and the author's didactic recommendations. The bibliography covers 50 sources by both domestic and foreign scholars, ranging from classical to modern publications, all of which are referred to in the paper.
Full texts of the Third International Conference on Data Analytics are presented.
In 2015–2016 the Department of Communication, Media and Design of the National Research University “Higher School of Economics”, in collaboration with the non-profit organization ROCIT, conducted research aimed at constructing an Index of Digital Literacy for Russian regions. This research was a priority and remains unmatched at the moment.
The article is dedicated to analyzing the prospects of Big Data in jurisprudence. It is argued that Big Data should be used as an explanatory and predictive tool. The author describes issues concerning the application of Big Data in legal research. The problems are technical (data access, technical imperfections, data verification) and interpretive (interpretation of data and correlations). It is concluded that Big Data investigations need to be advanced with the abovementioned limits taken into account.
I give an explicit formula for the (set-theoretic) system of resultants of m+1 homogeneous polynomials in n+1 variables.