In modern society, skills in working with information play a significant role. The influence of information on our everyday lives is rapidly increasing, while methods of data processing remain largely the same.
Research is increasingly directed at problems of data processing and information visualization, which grow in relevance every year. This paper describes infological models, a new method of data visualization and information processing based on technologies of information presentation, as well as on the principles of semantic networks, open data and data banks.
The technology of infological models represents a new approach to data storage and exchange which enables us to look at information processing in a new way. Based on the principles of open data, semantic networks and data banks, the concept seeks to define a set of entities and relations from which an independent information block is displayed as a block diagram that is easy for an average computer user to understand.
This work provides a brief overview of the information overload problem, describes the technology of infological models and its general principles, and presents an application of these methods in e-commerce using the examples of knowledge bases, news portals, online shops, the smart home and the Internet of Things, with a description of their features and advantages, ending with an overall conclusion.
Data integration in enterprises is a crucial but at the same time costly and challenging problem. While business-critical information is stored in ERP, CRM, SCM and content management systems, its integration becomes even more critical when integrating with the growing information space on the Web. Over the last decade, the IT industry has developed integration solutions based on Master Data Management, Business Intelligence and Service Oriented Architecture. However, we are becoming increasingly aware that such technologies are not sufficient to ultimately solve all data integration challenges. Under the vision of context-aware services and integration, we propose to apply the Linked Data paradigm. This approach seems promising, as it has been used by researchers throughout the evolution of the Semantic Web. We discuss Linked Data approaches in relation to the value chain and the integration of heterogeneous content, and present an example of a CRM business process applying the Linked Data principles.
This article is devoted to the problems of formalizing decision-making in the choice of a development strategy in an organization and the ways to implement this strategy. A prioritization method is proposed that allows converting qualitative indicators into quantitative variables by means of pairwise comparison of the objects. As opposed to a simple summation of preference estimates, this computational algorithm allows one to take into account indirect advantages of all the objects under consideration. Approaches to the ranking of experts and the challenges an organization faces at various stages of its development are set forth. The algorithm is validated on the example of a particular company. Estimates of the priorities of experts are provided; the tree of tasks for which the comprehensive priorities are designed (taking into account the importance and relevance of the tasks for each expert) is constructed; an analysis of the results for different conditions of the external and internal environment of the organization is made. Recommendations are given on the choice of the deviation values for the matrix of pairwise comparisons of objects, as well as a reasonable number of iterations for the calculation of the integrated power of these objects. The practical significance of the work lies in the fact that the proposed algorithm and methodology for ranking experts, tasks and subtasks may be used to prepare management accounting regulations to improve decision-making methods, taking into account the strategic and tactical objectives of the organization.
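The abstract does not reproduce the computational algorithm itself; the following minimal sketch shows one common reading of the iterative calculation of the integrated power of objects, in which repeated multiplication of the pairwise-comparison matrix by the priority vector accumulates indirect advantages (all names and the sample matrix are illustrative, not taken from the paper):

```python
def integrated_power(A, iterations=20):
    """Iteratively rank objects from a pairwise-comparison matrix A,
    where A[i][j] > A[j][i] means object i is preferred to object j.
    Repeated multiplication accumulates indirect advantages:
    if i beats j and j beats k, i also gains over k."""
    n = len(A)
    p = [1.0] * n  # start from equal priorities
    for _ in range(iterations):
        p = [sum(A[i][j] * p[j] for j in range(n)) for i in range(n)]
        s = sum(p)
        p = [x / s for x in p]  # normalize so priorities sum to 1
    return p
```

Unlike a simple row sum, the iteration rewards victories over strong objects more than victories over weak ones, which is the "indirect benefit" effect the abstract mentions.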
This paper analyses the possibility of using mobile technologies and applications in the Russian healthcare system and evaluates the opportunities for its further development. The research provides an overview of global trends in digital healthcare with some examples of the best solutions for eHealth (healthcare practice supported by electronic processes and communication). An analysis is made of the Russian medical system in order to identify the main stages of its formation, achievements and areas for improvement. The authors also conducted research into the current Russian medical healthcare system aimed at identifying gaps and concerns regarding security, reliability and service availability for on-line and mobile services and personal health records in Russia. Certain difficulties in the establishment of an up-to-date healthcare system in Russia with examples of barriers are also analyzed to get a better understanding of the prospects for mobile healthcare development. Starting from the premise that support for information technologies is essential to medical healthcare development, the paper gives an overview of the current IT initiatives of the Russian government in the field of medicine and provides examples of the independent applications of Russian software developers for digital and mobile healthcare. As a result of the research, three possible development scenarios of Russian mobile healthcare are described. The barriers identified as well as worldwide healthcare transformation aspects such as cost reduction and personalization are considered in the possible scenarios.
A model for organizing cargo transportation between two node stations connected by a railway line which contains a certain number of intermediate stations is considered. The movement of cargo is in one direction. Such a situation may occur, for example, if one of the node stations is located in a region which produces raw materials for a manufacturing industry located in the region of the other node station. The organization of freight traffic is performed by means of a number of technologies. These technologies determine the rules for taking on cargo at the initial node station, the rules of interaction between neighboring stations, as well as the rule of distribution of cargo to the final node stations. The process of cargo transportation follows a set rule of control. For such a model, one must determine possible modes of cargo transportation and describe their properties. The model is described by a finite-dimensional system of differential equations with nonlocal linear restrictions. The class of solutions satisfying the nonlocal linear restrictions is extremely narrow. This results in the need for a “correct” extension of solutions of the system of differential equations to a class of quasi-solutions whose distinctive feature is gaps at a countable number of points. Using the fourth-order Runge–Kutta method, we were able to numerically construct these quasi-solutions and determine their rate of growth. Let us note that the main technical difficulty consisted in obtaining quasi-solutions satisfying the nonlocal linear restrictions. Furthermore, we investigated the dependence of quasi-solutions and, in particular, the sizes of gaps (jumps) of solutions on a number of model parameters characterizing the rule of control, the technologies for cargo transportation and the intensity of cargo supply at a node station.
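The abstract does not include the numerical scheme itself; a generic classical fourth-order Runge–Kutta step for a finite-dimensional system y' = f(t, y), of the kind used between the jump points of the quasi-solutions, can be sketched as follows (names are illustrative; the handling of the nonlocal restrictions and of the gaps is not shown):

```python
def rk4_step(f, t, y, h):
    """One classical fourth-order Runge-Kutta step for y' = f(t, y),
    where y is a list representing a finite-dimensional system."""
    def shift(u, v, c):
        # u + c*v, componentwise
        return [ui + c * vi for ui, vi in zip(u, v)]
    k1 = f(t, y)
    k2 = f(t + h / 2, shift(y, k1, h / 2))
    k3 = f(t + h / 2, shift(y, k2, h / 2))
    k4 = f(t + h, shift(y, k3, h))
    # weighted average of the four slopes
    return [yi + h / 6 * (a + 2 * b + 2 * c + d)
            for yi, a, b, c, d in zip(y, k1, k2, k3, k4)]
```

Between two consecutive jump points the quasi-solution is smooth, so the step above applies; at each gap the state would be restarted from the post-jump value dictated by the nonlocal restrictions.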
This paper is devoted to mathematical modeling and optimization of business processes and process systems under conditions of uncertainty. At present, modeling of business processes is mainly descriptive, which does not allow quantitative modeling and optimization in the design of processes and process systems. In addition, the existing methods of decision-making in business processes are based on the assumption that the decisive factors are deterministic. Despite uncertainty of the real processes caused by the uncertainty of future costs of resources, the market environment, economy, finances, etc., the factors of an uncertain future are either not taken into account, or are believed to be the same as those observed currently. In this paper, a stochastic interval mathematical optimization model is developed. This model allows us to simulate in a quantitative way the business processes and process systems in which they take place, taking into account the uncertainties of the future state of the economy, finances, market environment, costs of resources, as well as future realization of chances and risks related to the productive, supporting, and service processes. The criterion for optimality of the model is the maximization of the smallest deviation of the projected chances and risks, which makes it possible to make the best decision in the case that the most unfavorable conditions for the business process occur in the future. The criterion of optimality adopted in the mathematical model takes into account not only the uncertainty of the future state of the economy, finance, and market environment, but also the psychology of decision-making and the subjective nature of judgments and estimates. We present a concept and method for estimating the inductive (logical, subjective) probabilities of the occurrence of uncertain projected business process factors.
The models and methods developed in the paper make it possible to carry out mathematical modeling and optimization of business processes in a variety of activities without restrictions on the complexity of the structural model of the business process, the qualitative and quantitative composition of the connections in the process systems. On their basis, a software package for the quantitative design of business processes and process systems under conditions of uncertainty can be developed.
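The optimality criterion described above, choosing the decision that remains best if the most unfavorable future conditions occur, follows the maximin logic; a minimal illustrative sketch (decision names and outcome numbers are invented for illustration):

```python
def maximin_decision(decisions):
    """Choose the decision whose worst-case outcome across uncertain
    future scenarios is the best (the maximin criterion).
    decisions: {name: list of outcomes, one per scenario}."""
    return max(decisions, key=lambda name: min(decisions[name]))
```

Here each outcome could, for instance, be the projected chances minus risks of a business process variant under one future scenario; the rule picks the variant with the best guaranteed result.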
Enterprises today operate in a rapidly changing macroeconomic environment, and this factor should be taken into account when forecasting the financial standing of a company. In developing a financial stability assessment model that takes macroeconomic factors into account, there is the problem of including in the model factors whose frequency of measurement differs from that of the internal financial indicators. This paper proposes an approach to the development of a financial stability assessment model taking into account macroeconomic factors and their aggregation. Based on the data of 291 enterprises of the Volga Federal District metallurgical industry for the period 2012-2014, a financial stability assessment model has been built using the decision tree method. The accuracy of the model is approximately 86%. Recommendations are given for optimization of the operating activities of the enterprise to achieve financial stability.
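The abstract does not detail the decision tree construction; its core step, choosing the threshold on an indicator that minimizes the weighted Gini impurity of the two resulting groups, can be sketched as follows (a simplified single-split illustration; names and sample data are invented, not the paper's dataset):

```python
def gini(labels):
    """Gini impurity of a list of class labels."""
    n = len(labels)
    if n == 0:
        return 0.0
    counts = {}
    for label in labels:
        counts[label] = counts.get(label, 0) + 1
    return 1.0 - sum((c / n) ** 2 for c in counts.values())

def best_split(values, labels):
    """Find the threshold on a single indicator that minimizes the
    weighted Gini impurity of the two resulting groups."""
    pairs = sorted(zip(values, labels))
    best_t, best_g = None, float("inf")
    for k in range(1, len(pairs)):
        left = [label for _, label in pairs[:k]]
        right = [label for _, label in pairs[k:]]
        g = (len(left) * gini(left) + len(right) * gini(right)) / len(pairs)
        if g < best_g:
            best_g = g
            # split halfway between the two adjacent values
            best_t = (pairs[k - 1][0] + pairs[k][0]) / 2
    return best_t, best_g
```

A full decision tree applies this search recursively over all indicators (internal financial and aggregated macroeconomic alike) until the leaves are sufficiently pure.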
Datacenter modularity is a new term in the datacenter world. The article distinguishes between modular, mobile, modern and traditional datacenters and gives some data on the solutions available on the market. The present work aims to systematize the existing data center technology solutions worldwide, as well as the preconditions for the emergence of modular solutions for data centers. Classification and identification of key differences and features allows precise positioning of the applicability of existing technologies. The key features and the limits of applicability of existing technologies are set out. Technological solutions from different vendors are evaluated for the completeness of their systems engineering. Estimates are presented in a convenient and comparable tabular form. The result is a summary table that allows comparison of the capabilities of each solution in several aspects: form factor, completeness of the solution, modularity, flexibility, and growth in several key systems included in each solution.
In contemporary society, teachers often have to deal with a multicultural student audience, both in a traditional format and in online training. In general, the culture of each country has an impact on the educational process and largely determines it. This, in turn, implies uniqueness of the educational content, objectives, values and tasks of education, teaching methods, pedagogical discourse, specifics of building an educational path, etc. This paper traces the relationship between cultural influence and educational practices expressed in target, value and communication formats. Many teachers call attention to the problem of constructive knowledge transfer in a multicultural teaching environment as the main problem in this context, in addition to the specifics of cognitive, communication and psycho-pedagogical factors. However, the multicultural environment is taken to mean not only national differences, but also a different previous professional “background” (this refers to students of master’s programs, etc.). In this paper, we share the experience of selecting criteria for building a cultural cognitive model of communication with students (tactical and strategic methods of developing various types of discourse) in order to optimize the teaching process in the multicultural environment.
The criteria on the basis of which a new-generation multicultural educational environment capable of providing constructive knowledge transfer is to be built are as follows: a communication criterion (change of traditional communication forms in the “teacher – student” system), a methodological criterion (emergence of cultural and adaptive methods of work with educational information), a content criterion (differentiation and possible inhomogeneity of the educational content in the educational process) and an information criterion (development and use of educational resources taking into account cultural specifics of information perception and handling). The aforementioned points, in turn, cannot but affect the educational process as a whole.
The decision-making system in international organizations is still very conservative. The composition of the international forums that can generate significant international instruments has not changed for centuries. Only diplomats and representatives of international organizations whose credentials are confirmed in a certain way are admitted to international decision-making. The only exception to this rule is the International Labour Organization (ILO), which works on the principle of tripartism. The ILO involves in its work not only representatives of the states, but also representatives of employers and workers from each of the member states of the ILO.
The Internet Governance Forum, established in 2006 under the auspices of the UN, UNESCO and the International Telecommunication Union on the basis of the World Summit on the Information Society, is today the world's most authoritative international discussion forum on Internet governance; yet it does not fully use its potential for the best possible regulation of international Internet governance processes. The basis for this regulation is the multistakeholder approach, which consists in a multiplicity of categories of participants in the decision-making mechanism, including, in addition to the traditional representatives of states and international organizations, civil society, business, the academic and technical community, the media, and other interested stakeholders.
This research is expected to provide guidance in improving the global Internet governance arrangements, taking into account the interests of all categories of participants, as well as in establishing rules of procedure for decision-making based on the multistakeholder approach in Internet governance, so as to give the Internet Governance Forum the opportunity to adopt international “soft law” instruments. An example of this is the Draft Charter of Rights and Principles on the Internet, developed by the Dynamic Coalition on Internet Rights and Principles of the Internet Governance Forum, a kind of analogue of the Universal Declaration of Human Rights with regard to the Internet. The need to adopt human rights instruments for the Internet determines the direction of the development of programs and policies in global Internet governance and the role of the Internet Governance Forum in these processes.
In this paper we consider the problem of finding an optimal excess of loss reinsurance which maximizes the reliability (probability of no ruin) of the insurance company. We apply two approximate approaches to calculate the distribution of total payments. The first approach is based on a normal approximation of the payments distribution. Using this approximation, we have derived an integral equation for the optimal retention limit. The second approach is based on simulation techniques. To test the precision of our approaches, we use an exact formula for the distribution of total payments known for the case when losses in one insured event are distributed uniformly.
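A minimal sketch of the second, simulation-based approach, assuming a one-period model with a fixed reinsurance premium (the function signature and parameter names are illustrative, not the paper's notation):

```python
import random

def ruin_probability(capital, net_premium, reins_premium, retention,
                     n_claims, claim_sampler, trials=20000, seed=1):
    """Monte Carlo estimate of the one-period ruin probability under
    excess-of-loss reinsurance: the insurer retains min(X, M) of each
    claim X, where M is the retention limit, and pays a fixed
    reinsurance premium for the ceded excess."""
    rng = random.Random(seed)
    ruins = 0
    for _ in range(trials):
        retained = sum(min(claim_sampler(rng), retention)
                       for _ in range(n_claims))
        if capital + net_premium - reins_premium - retained < 0:
            ruins += 1
    return ruins / trials
```

Evaluating this estimate over a grid of retention limits M (with the reinsurance premium priced accordingly) yields the retention that maximizes reliability, i.e. minimizes the estimated ruin probability.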
Nowadays peer assessment is recognized as a crucial part of a wide range of active learning routines. Nevertheless, practitioners and educators speak of the complexity and high resource consumption of implementing this type of assessment. Undoubtedly, convenient software that supports peer assessment processes may substantially raise the productivity of its participants. A review of educational literature and free software shows there are several bottlenecks in the business processes of peer assessment and key user roles. First, most of the programs examined are web-based and extend the set of tools for teachers and learners with extra interfaces. Moreover, this logically creates a new branch in the learning business process. Second, there is probably no peer assessment system which allows users to attach something other than the text to be reviewed. There is a gap in the market of free peer assessment software. This paper offers a peer assessment system specification that attempts to eliminate these disadvantages in order to improve user experience and thus increase the use of information technologies in peer assessment. The specification is based on a thorough description of the peer assessment process involving complex artifacts and double-blind peer review. Software called PASCA (peer assessment system for complex artifacts) is introduced to illustrate the specification achieved. PASCA uses habitual e-mail services and does not affect other business processes. It supports standard features like blinding and randomization, and it provides a set of original features. These include evaluation of arbitrary artifacts, creation of complex peer review forms with their validation and scoring, and easy analysis of data from peer assessment sessions.
Requirements prioritization is performed by business analysts in order to analyze stated requirements and to define the required capabilities of a potential solution that will fulfill stakeholder needs. During the analysis, the business analyst transforms the needs and informal concerns of stakeholders into formal solution requirements which describe the behavior of solution components in sufficient detail. Furthermore, requirements analysis may be performed to develop models of the current state of an organization. These models can be used to validate the solution scope with the business and stakeholders, to analyze the current state of an organization to identify opportunities for improvement, or to assist stakeholders in understanding that current state. The requirements prioritization task includes the following elements. First, there are business cases which state key goals and measures of success for a project or organization; priorities should be aligned with those goals and objectives. Business needs can be used as an alternative to the business case if no business case has been defined. Second, prioritization requires that the requirements have been stated by stakeholders. Third, the list of stakeholders, annotated with their levels of authority and influence, is used to determine which stakeholders need to participate in prioritization. As a result, several techniques and recommendations stated in the BABOK® Guide have been applied for requirements prioritization in a case study of a conventional commercial bank. The business needs of the organization have been identified. The main problems of the communication management process have been formulated. The underlying sources of the problems have been illustrated on a fishbone diagram (also known as an Ishikawa or cause-and-effect diagram). The list of stakeholders and the requirements have been compiled.
The MoSCoW technique has been applied in order to identify four groups of requirements, which differ from each other by the impact the results of their implementation have on the solution of the identified problems. The list of prioritized requirements should be used in the next stages of the project. It may be useful for the project manager when planning work on the solution implementation. The results of this work should also help the stakeholders develop a common point of view on the strategic goals of the project.
The resource efficiency of different implementations of the branch-and-bound method for the classical traveling salesman problem depends, inter alia, on the way the search decision tree generated by this method is organized. The classic «time-memory» dilemma is realized here either by storing the reduced matrices at the nodes of the decision tree, which reduces time complexity at an additional memory cost, or by recalculating the matrix for the current node, which increases time complexity while saving memory. The subject of this paper is an experimental study of the temporal characteristics of solving the traveling salesman problem by the branch-and-bound method, in order to identify the real reduction in running time gained by using additional memory for a selected structure of the decision tree. The ultimate objective of the research is to formulate recommendations for implementing the method in practical problems encountered in logistics and business informatics. On the basis of experimental data, this paper shows that both considered variants of the classic branch-and-bound algorithm for the traveling salesman problem generate software implementations whose execution time depends exponentially on the input length. The experimental results suggest that the use of an additional memory capacity of no more than 1 GB results in a significant (up to fivefold) reduction of the running time. The estimate of the resulting trend makes it possible to recommend the practical application of the software implementation of the branch-and-bound algorithm with storage of matrices: with a really available 16 GB of random-access memory and with a limitation of the expected average computation time to about one minute on modern personal computers, problems with a dimension of no more than 70 can be solved exactly.
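The branch-and-bound scheme discussed above can be sketched compactly; this illustration uses a simple admissible lower bound based on the cheapest outgoing edges rather than the full reduced-matrix bound analyzed in the paper (names are illustrative):

```python
import math

def tsp_branch_and_bound(dist):
    """Exact traveling salesman tour via depth-first branch and bound.
    Lower bound: current path cost plus the cheapest outgoing edge of
    the current city and of every city not yet visited. Reduced-matrix
    bounds prune more, at a higher memory cost per tree node."""
    n = len(dist)
    min_out = [min(dist[i][j] for j in range(n) if j != i)
               for i in range(n)]
    best = {"cost": math.inf, "tour": None}

    def search(path, visited, cost):
        last = path[-1]
        remaining = [i for i in range(n) if i not in visited]
        bound = cost + min_out[last] + sum(min_out[i] for i in remaining)
        if bound >= best["cost"]:
            return  # prune: this subtree cannot beat the best tour found
        if not remaining:
            total = cost + dist[last][path[0]]  # close the tour
            if total < best["cost"]:
                best["cost"], best["tour"] = total, path + [path[0]]
            return
        for nxt in remaining:
            search(path + [nxt], visited | {nxt}, cost + dist[last][nxt])

    search([0], {0}, 0)
    return best["cost"], best["tour"]
```

The «time-memory» trade-off enters at the `bound` computation: storing a reduced matrix per node makes the bound tighter and cheaper to update but multiplies the memory footprint of the search tree.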
This paper overviews the core technologies implemented by a comparatively new class of products on the information security market: web application firewalls. Web applications are a widely used and convenient way of giving remote users access to corporate information resources. A web application can, however, become a single point of failure, rendering all the information infrastructure unreachable for legitimate clients. To prevent malicious access attempts to endpoint information resources and, intermediately, to the web server, a new class of information security solutions has been created. Web application firewalls function at the highest, seventh layer of the ISO/OSI model and serve as a controlling tunnel for all the traffic heading to and from a company’s web application server(s). To ensure decent levels of traffic monitoring and intrusion prevention, web application firewalls are equipped with various mechanisms for checking the “normalness” of a data exchange session. These mechanisms include protocol check routines, machine learning techniques, traffic signature analysis and more dedicated means like denial of service, XSS injection and CSRF attack prevention. The ability to research and add user rules to be processed along with vendor-provided ones is important, since every company has its own security policy, and therefore the web application firewall should provide security engineers with ways to tweak its rules to reflect the security policy more precisely. This research is based on wide practical experience of integrating web application firewalls into the security landscape of various organizations, their administration and customization. We illustrate our review of the available filtering mechanisms and their implementations with example product features from market leaders, schemes and screenshots from real web application firewall systems.
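Signature analysis, one of the filtering mechanisms listed above, can be illustrated with a deliberately minimal sketch; the rule names and patterns below are hypothetical toy examples, not rules from any real web application firewall:

```python
import re

# Hypothetical minimal signature set: each rule is a compiled regular
# expression that flags a known attack pattern in a request payload.
SIGNATURES = {
    "sql_injection": re.compile(r"(?i)\bunion\b.+\bselect\b"),
    "xss": re.compile(r"(?i)<script\b"),
    "path_traversal": re.compile(r"\.\./"),
}

def inspect(payload: str):
    """Return the names of all signatures matched by the payload."""
    return [name for name, rx in SIGNATURES.items() if rx.search(payload)]
```

Real products combine thousands of such vendor-provided rules with the user-defined rules mentioned above, plus protocol normalization to defeat encoding-based evasion, which this sketch omits.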
This paper describes an approach for fast ad-hoc analysis of Big Data inside a relational data model. The approach strives to achieve maximal utilization of highly normalized temporary tables through the merge join algorithm. It is designed for the Anchor modeling technique, which requires a very high level of table normalization. Anchor modeling is a novel data warehouse modeling technique, designed for classical databases and adapted by the authors of the article for a Big Data environment and an MPP database. Anchor modeling provides flexibility and high speed of data loading, where the presented approach adds support for fast ad-hoc analysis of Big Data sets (tens of terabytes). Different approaches to query plan optimization are described and estimated, for row-based and column-based databases. Theoretical estimations and results of real data experiments carried out in a column-based MPP environment (HP Vertica) are presented and compared. The results show that the approach is particularly favorable when the available RAM resources are scarce, so that a switch is made from pure in-memory processing to spilling over to hard disk while executing ad-hoc queries. Scaling is also investigated by running the same analysis on different numbers of nodes in the MPP cluster. Configurations of 5, 10 and 12 nodes were tested, using click stream data from Avito, the biggest classified ads site in Russia.
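The merge join algorithm at the core of the approach relies on both inputs being sorted on the join key, which lets two highly normalized tables be joined in a single linear pass; a minimal sketch (table contents are illustrative):

```python
def merge_join(left, right, key=lambda row: row[0]):
    """Join two lists of rows, each sorted on the join key, in O(n + m),
    as a database merge join does. Duplicate keys on either side are
    handled by pairing every matching left row with every matching
    right row."""
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        kl, kr = key(left[i]), key(right[j])
        if kl < kr:
            i += 1
        elif kl > kr:
            j += 1
        else:
            # gather the full run of equal keys on both sides
            i2 = i
            while i2 < len(left) and key(left[i2]) == kl:
                i2 += 1
            j2 = j
            while j2 < len(right) and key(right[j2]) == kl:
                j2 += 1
            for a in left[i:i2]:
                for b in right[j:j2]:
                    out.append(a + b[1:])  # drop the duplicated key column
            i, j = i2, j2
    return out
```

In Anchor modeling the anchor and its attribute tables are all keyed by the same identity, so sorting (or pre-sorting via projections in a column store) makes every anchor-attribute join eligible for this pattern with no hash table in RAM, which is why the approach degrades gracefully when memory is scarce.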
The paper explores the genesis of the socio-economic processes of strategic development of social transformations in a transition economy. The author studies the system patterns and the structural and dynamic aspects of developing socio-economic systems, defining the need for adequate modeling methods. Restrictive features of traditional mathematical modeling are analyzed. Synergetics is applied as the theoretical and methodological foundation of the research and of the modeling of developing socio-economic systems. Modeling social behavior, individual choice, various forms of social interaction and self-organization processes are recognized to be significant aspects. The search for methods of model description is based on interdisciplinary research in the field of economic and social sciences, using modern paradigms (system dynamics, agent-based computer simulation) and simulation technologies, and on an analysis of their possibilities in the study of the dynamic aspects of the development processes in socio-economic systems. A general concept of simulation of developing socio-economic systems based on the principles of stratification is proposed: a micro-level reproduces the individual decisions of social and economic agents and collective organizational forms, and a macro-level describes the processes of evolution. The interpretation of the interactions between socio-economic configurations is based on the analysis of causal dependencies and the dynamic manifestations of the interpenetration of phenomena occurring in different strata of the social system. The paper offers methods of combining composite system-dynamic and agent-based models, allowing us to investigate the dynamics of socio-economic processes through a cyclic interaction of processes of individual and group behavior of economic and social agents at the micro level with the basic processes of the socio-economic system at the macro level.
The basic directions for the development of simulation technologies in procedures and strategic decision support systems are defined as follows: computer dynamic scenario analysis based on a generalized simulation model of the control object, a model of the balance of interests in the procedures for coordinating the scenarios and interests of the participants in the process of social design, a stratified description of the model set with the use of ontologies, and methods for the parameterization of models of socio-economic systems and the specification of agents.
This article is devoted to the problem of controlling the implementation of multiscenario projects, when it is necessary to provide not one, but several scenarios of performance differing from each other in the structure of works. As a control method, we propose carrying out intermediate checks. The task is to determine after which works the checks have to be carried out. A heuristic method of solving this task is offered on the basis of an information approach. The places of performance of checks (control points) are defined step by step. Each check is chosen so that it gives maximum information (in the Shannon sense) about which of the already completed works has been performed incorrectly. In the calculations, we consider not only previously established control points, but also the probabilities of implementation of the various scenarios of the project under examination.
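A simplified sketch of the information-based heuristic: treating each check as a pass/fail outcome, the next control point is placed where that outcome carries the most Shannon information (this collapses the per-work diagnosis described above into a binary outcome, and ignores scenario probabilities; names and probabilities are illustrative):

```python
import math

def entropy(p):
    """Shannon entropy (in bits) of a Bernoulli outcome with probability p."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

def next_check_point(fail_probs, checked=()):
    """Greedy heuristic: place the next intermediate check after the work
    whose pass/fail outcome carries the most Shannon information, i.e.
    whose cumulative failure probability since the last check is closest
    to 1/2. fail_probs[k] is the probability that work k is performed
    incorrectly; `checked` lists works already followed by a check."""
    best_k, best_h = None, -1.0
    p_ok = 1.0  # probability that everything since the last check is correct
    for k, q in enumerate(fail_probs):
        p_ok *= (1.0 - q)
        if k in checked:
            p_ok = 1.0  # a passed check resets the accumulated uncertainty
            continue
        h = entropy(1.0 - p_ok)  # information gained by a check after work k
        if h > best_h:
            best_k, best_h = k, h
    return best_k, best_h
```

Repeating the selection, with each chosen point added to `checked`, mirrors the step-by-step placement of control points described in the abstract, either until the admissible number of checks is exhausted or until the accumulated information reaches the required completeness level.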
Solutions are proposed for two very important practical cases: when the number of admissible intermediate checks is set, and when their number is not set but achievement of a certain level of information completeness of control is required. In practice, the number of intermediate checks is limited from above by the budget for control costs, which is set by the sponsor of the project. Information completeness of the diagnosis, in turn, is inversely proportional to the risk that incorrect implementation of the project will be revealed only after it ends. In this regard, the project manager demands that the information completeness of control be no lower than a certain level.
The results obtained are sought, first of all, by heads of design offices of large construction companies implementing standard projects in various natural and climatic conditions (in their practice, practically all projects are multiscenario). The results can also be applied in the practice of the Ministry of Emergency Situations.