This book constitutes the refereed proceedings of the 14th International Workshop on Enterprise and Organizational Modeling and Simulation, EOMAS 2018, held in Tallinn, Estonia, in June 2018. The main focus of EOMAS is on the role, importance, and application of modeling and simulation within the extended organizational and enterprise context. The 11 full papers presented in this volume were carefully reviewed and selected from 22 submissions. They were organized in topical sections on conceptual modeling, enterprise engineering, and formal methods.
The contemporary marketing practices (CMP) methodology attracts the attention of a substantial number of researchers in the field of strategic marketing. Over the past two decades, more than fifty papers have been published in peer-reviewed outlets analyzing the usage of contemporary marketing practices in a variety of countries and industries. In this note we discuss the reliability of these studies with respect to the use of specific analytic tools. First, we demonstrate that standard cluster analysis is relatively sensitive to small changes in the datasets, with companies frequently being assigned to different clusters. Second, the national project teams make use of different, often incompatible settings. Therefore, to make comparisons between countries and across industries possible, researchers must agree on a generic setup and procedures. We conclude the note by sketching the basics of this common ground.
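The sensitivity claim above can be illustrated with a minimal sketch: a plain Lloyd's-algorithm k-means on made-up one-dimensional "company" scores, where perturbing a single borderline observation can change which cluster it (and hence the partition) falls into. All data and names here are hypothetical, and co-assignment of index pairs is compared so that label permutations between runs do not matter.

```python
import random

def kmeans(points, k=2, iters=20, seed=0):
    """Plain Lloyd's algorithm for 1-D data; returns one cluster label per point."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    labels = [0] * len(points)
    for _ in range(iters):
        labels = [min(range(k), key=lambda j: abs(p - centers[j])) for p in points]
        for j in range(k):
            members = [p for p, l in zip(points, labels) if l == j]
            if members:
                centers[j] = sum(members) / len(members)
    return labels

def same_cluster_pairs(labels):
    """Label-permutation-invariant view: which index pairs share a cluster."""
    n = len(labels)
    return {(i, j): labels[i] == labels[j]
            for i in range(n) for j in range(i + 1, n)}

# Hypothetical "companies" scored on a single marketing-practice index;
# the last one sits between the two natural clusters.
data = [1.0, 1.2, 1.4, 4.6, 4.8, 5.0, 2.9]
perturbed = data[:-1] + [3.1]          # a small change to one observation

a, b = kmeans(data), kmeans(perturbed)
changed = sum(same_cluster_pairs(a)[p] != same_cluster_pairs(b)[p]
              for p in same_cluster_pairs(a))
print(f"{changed} of {len(same_cluster_pairs(a))} co-assignment pairs changed")
```

With a borderline point, which partition Lloyd's algorithm converges to depends on the initialization and the small perturbation, which is exactly the kind of instability the note discusses.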
This state-of-the-art survey is dedicated to the memory of Emmanuil Markovich Braverman (1931-1977), a pioneer in developing machine learning theory. The 12 revised full papers and 4 short papers included in this volume were presented at the conference "Braverman Readings in Machine Learning: Key Ideas from Inception to Current State" held in Boston, MA, USA, in April 2017, commemorating the 40th anniversary of Emmanuil Braverman's death. The papers present an overview of some of Braverman's ideas and approaches. The collection is divided into three parts. The first part bridges the past and the present. Its main contents relate to the concept of kernel function and its application to signal and image analysis as well as clustering. The second part presents a set of extensions of Braverman's work to issues of current interest both in theory and applications of machine learning. The third part includes short essays by a friend, a student, and a colleague.
Originally published in 1995. In securing the future of any democracy, it is vital that the education service should provide an effective introduction to citizenship by means of a high quality and empowering curriculum in educational institutions organized and administered according to democratic principles. In this volume, educators with a variety of backgrounds and experience gained in educational institutions in both Russia and western countries address the question of the conception, justification and implementation of the idea of 'education for democracy'. This is the first publication to emerge from a collaboration of Russian and Western educators in recent times and is an enthralling account of education in countries with wide social, political and historical differences yet having common ground to share over the creation and management of their school systems.
Higher Education in Federal Countries: A Comparative Study is a unique study of higher education in nine federal countries—the United States, Canada, Australia, Germany, Mexico, Brazil, Russia, China and India. In this book, leading international scholars discuss the role of federalism and how it shapes higher education in major nation-state actors on the world stage. The editors develop an overarching comparative analysis of the dynamics of central and regional power in higher education, and the national case studies explain how each federal and federal-like higher education system has evolved and how it functions in what are highly varied contexts.
The book makes a major contribution to higher education studies and defines a new field of comparative analysis. It also provides important insights into comparative governance and the study of federalism and federal arrangements, with their particular historical, political, legal and economic dimensions.
Since 2008 the neoliberal mainstream, which once seemed steadfast, has been suffering both economic and political crises. This makes people seek alternatives on the right or the left. In such circumstances, alternative political forces attempt to satisfy civil society's needs by suggesting new ideas that challenge neoliberalism. Both the leftist and the rightist revolts have thus been a search for new growth drivers and a new balance within societies, one that takes all classes and their interests into consideration.
This process is closely tied to several major shifts. The privatization of state functions has been called into question, putting the return of public capital on the agenda. The voice of people demanding more equal access to public goods, such as education, is getting louder.
New politicians have emerged who are able to grasp this message and win broad popular support, for instance Jeremy Corbyn in the UK and Jean-Luc Mélenchon in France. But what is behind the leftist revolt? Does it have any chance of succeeding?
Contributions in this volume focus on computationally efficient algorithms and rigorous mathematical theories for analyzing large-scale networks. Researchers and students in mathematics, economics, statistics, computer science and engineering will find this collection a valuable resource filled with the latest research in network analysis. Computational aspects and applications of large-scale networks in market models, neural networks, social networks, power transmission grids, the maximum clique problem, telecommunication networks, and complexity graphs are included, together with new tools for efficient analysis of large-scale networks.
These proceedings are the result of the 7th International Conference in Network Analysis, held at the Higher School of Economics, Nizhny Novgorod, in June 2017. The conference brought together scientists, engineers, and researchers from academia, industry, and government.
Igor Pellicciari is a tenured professor at the State University of Urbino (Italy) and a senior fellow at the Higher School of Economics (Moscow). He is also a contract professor at Moscow State University and LUISS University (Rome). From 2005 to 2013 he was a Senior Expert of the European Union for institution-building programs, carried out in cooperation with the Russian Presidential Administration and the Duma of the Russian Federation.
To understand modern Russia without falling into the most common current stereotypes (the first and most common being the image of its current president as a modern Tsar), it is a prerequisite to analyze the substantial failure of the first Russian constitutionalization, the period that preceded the Soviet government and the entire Soviet era.
This book analyzes this period (1905–1907), a short but intense liberal era in Russia at the start of the 20th century. Thanks to it, Russia experienced one of the latest and shortest liberal periods in Europe, in which, however, the seeds were sown for the later modern political and institutional development of the country.
It is important to observe the revolution of 1905 and the subsequent convocation of the First Russian Duma in 1906, which became a lost opportunity for Russian constitutionalization and ultimately ended up being a forgotten liberal revolution. Instead, over the decades Lenin's narrative of the 1905 events became predominant: a general rehearsal for the hailed and inevitable Glorious Bolshevik Revolution of October 1917, which, by contrast, was considered the start of a new era and of a strong, newly legitimate political regime.
Thus, the liberal and constitutional potential of the 1905 revolution was for almost a century banished from the official political history of Soviet Russia.
Nonetheless, today all these events, and especially those generated by the parliamentary institutions, have been reevaluated in the light of their role in inspiring the constitutional transformation of the current post-Soviet political system, and have acquired new practical significance for the modern institutional development of Russia.
From this perspective, the current effort of the Russian Federation to consolidate liberalism and rule-of-law reforms first, before dealing with the issue of a full and true procedural democratization of the country, becomes more historically understandable.
Miscommunicating Social Change analyzes the discourses of three social movements and the alternative media associated with them, revealing that the Enlightenment narrative, though widely critiqued in academia, remains the dominant way of conceptualizing social change in the name of democratization in the post-Soviet terrain. The main argument of this book is that the “progressive” imaginary, which envisages progress in the unidirectional terms of catching up with the “more advanced” Western condition, is inherently anti-democratic and deeply antagonistic. Instead of fostering an inclusive democratic process in which all strata of populations holding different views are involved, it draws solid dividing frontiers between “progressive” and “retrograde” forces, deepening existing antagonisms and provoking new ones; it also naturalizes the hierarchies of the global neocolonial/neoliberal power of the West. Using case studies of the “White Ribbons” social movement for fair elections in Russia (2012), the Ukrainian Euromaidan (2013–2014), and anti-corruption protests in Russia organized by Alexei Navalny (2017) and drawing on the theories of Ernesto Laclau, Chantal Mouffe, and Nico Carpentier, this book shows how “progressive” articulations by the social movements under consideration ended up undermining the basis of the democratic public sphere through the closure of democratic space.
Over the past several decades, a number of “highly-resourced, accelerated research universities” have been established around the world to pursue—and achieve—academic and research excellence. These institutions are entirely new, not existing universities that were reconfigured. Accelerated Universities provides case studies of eight such universities and highlights the lessons to be learned from these examples. Each of the cases is written by someone involved with leadership at the early developmental stages of each university, and provides insights that only senior executives can illustrate. Accelerated Universities shows that visionary leadership and generous funding combined with innovative ideas can yield impressive results in a short time. Universities aspiring to recognition among the top tier of global institutions will find this book indispensable.
The coursebook is aimed at developing university students' foreign-language competence and their interlingual and intercultural communication in the professional sphere. The book offers an opportunity to master phonetic, lexical, and grammatical skills as well as listening, writing, and speaking on the basis of the documentary series "The History of the Kings and Queens of England". Students are provided with various task types which help develop language, communicative, and cultural competences.
This book concludes The Industrialisation of Soviet Russia, an authoritative account of the Soviet Union’s industrial transformation between 1929 and 1939. The volume before this one covered the ‘good years’ (in economic terms) of 1934 to 1936. The present volume has a darker tone: beginning from the Great Terror, it ends with the Hitler-Stalin pact and the outbreak of World War II in Europe. During that time, Soviet society was repeatedly mobilised against internal and external enemies, and the economy provided one of the main arenas for the struggle. This was expressed in waves of repression, intensive rearmament, the increased regimentation of the workforce and the widespread use of forced labour.
What is it to be a work of art? Renowned author and critic Arthur C. Danto addresses this fundamental, complex question. Part philosophical monograph and part memoiristic meditation, What Art Is challenges the popular interpretation that art is an indefinable concept, instead bringing to light the properties that constitute universal meaning. Danto argues that despite varied approaches, a work of art is always defined by two essential criteria: meaning and embodiment, as well as one additional criterion contributed by the viewer: interpretation. Danto crafts his argument in an accessible manner that engages with both philosophy and art across genres and eras, beginning with Plato’s definition of art in The Republic, and continuing through the progress of art as a series of discoveries, including such innovations as perspective, chiaroscuro, and physiognomy. Danto concludes with a fascinating discussion of Andy Warhol’s famous shipping cartons, which are visually indistinguishable from the everyday objects they represent.
This book is the first to trace the origins and significance of positivism on a global scale. Taking their cues from Auguste Comte and John Stuart Mill, positivists pioneered a universal, experience-based culture of scientific inquiry for studying nature and society—a new science that would enlighten all of humankind. Positivists envisaged one world united by science, but their efforts spawned many. Uncovering these worlds of positivism, the volume ranges from India, the Ottoman Empire, and the Iberian Peninsula to Central Europe, Russia, and Brazil, examining positivism’s impact as one of the most far-reaching intellectual movements of the modern world. Positivists reinvented science, claiming it to be distinct from and superior to the humanities. They predicated political governance on their refashioned science of society, and as political activists, they sought and often failed to reconcile their universalism with the values of multiculturalism. Providing a genealogy of scientific governance that is sorely needed in an age of post-truth politics, this volume breaks new ground in the fields of intellectual and global history, the history of science, and philosophy.
Combining history of science and a history of universities with the new imperial history, Universities in Imperial Austria 1848–1918: A Social History of a Multilingual Space by Jan Surman analyzes the practice of scholarly migration and its lasting influence on the intellectual output in the Austrian part of the Habsburg Empire.
The Habsburg Empire and its successor states were home to developments that shaped Central Europe's scholarship well into the twentieth century. Universities became centers of both state- and nation-building, as well as of confessional resistance, placing scholars if not in conflict, then certainly at odds with the neutral international orientation of academe.
By going beyond national narratives, Surman reveals the Empire as a state with institutions divided by language but united by legislation, practices, and other influences. Such an approach allows readers a better view of how scholars gradually turned away from state-centric discourse to form distinct language communities after 1867; these influences affected scholarship, and by examining the scholarly record, Surman tracks the turn.
Drawing on archives in Austria, the Czech Republic, Poland, and Ukraine, Surman analyzes the careers of several thousand scholars from the faculties of philosophy and medicine of a number of Habsburg universities, thus covering various moments in the history of the Empire for the widest view. Universities in Imperial Austria 1848–1918 focuses on the tension between the political and linguistic spaces scholars occupied and shows that this tension did not lead to a gradual dissolution of the monarchy’s academia, but rather to an ongoing development of new strategies to cope with the cultural and linguistic multitude.
The interaction between elements of the secondary structure is the key process determining the spatial structure and activity of a membrane protein. The transmembrane (TM) helix-helix interaction is known to be especially important for the function of so-called type I, or bitopic, membrane proteins, which have small TM domains consisting of a single α-helix. In turn, the parameters of the membrane environment are an important factor influencing the free energy and mode of TM protein-protein and helix-helix contacts. However, to date, studies of lipid-related effects on the free energy and structural mode of TM helix-helix interactions are represented mainly by computer simulations, performed mostly in the coarse-grained regime, which definitely need to be verified experimentally. In the present work, we provide an approach to studying the helix-helix interactions in the TM domains of membrane proteins in various lipid environments using solution NMR spectroscopy and phospholipid bicelles. The technique is based on the property of bicelles to form particles whose size depends on the lipid/detergent ratio. To implement the approach, we report the experimental parameters of "ideal bicelle" models for four kinds of zwitterionic phospholipids, which can also be used in other structural studies. We show that the size of the bicelles and the type of the rim-forming detergent do not substantially affect the spatial structure and stability of the model TM dimer. On the other hand, the effect of bilayer thickness on the free energy of the dimer is dramatic, while the structure of the protein is unchanged in various lipids with fatty chains of 12 to 18 carbon atoms. The obtained data are analyzed from the viewpoint of hydrophobic mismatch and lipophobic effects and shed light on the folding determinants of α-helical membrane proteins.
We consider a signal-code construction for a special class of multiple-access systems over a vector-disjunctive channel. This construction is based on interleaved Reed-Solomon codes with collaborative decoding. The considered methods of encoding and decoding have, on the one hand, acceptable complexity for a wide range of parameters, while on the other hand the decoding algorithm is able to correct significantly more collisions than a bounded-distance decoder. We estimate the maximum achievable relative sum-rate of this construction for a fixed number of active users and a given total number of frequencies to transmit.
The main objective of the paper is to examine whether 'Mode 3' universities represent a new and advanced type of entrepreneurial university, perhaps transcending the entrepreneurial university, and to identify the specific characteristics of 'Mode 3' universities. According to its definition, a 'Mode 3' university represents a type of organisation capable of higher-order learning and, in this regard, a type of open, highly complex, and non-linear knowledge production system that seeks and realises creative ways of combining, recombining, and integrating different principles of knowledge production and knowledge application (e.g., 'Mode 1' and 'Mode 2'). Thus, 'Mode 3' universities clearly encourage diversity and heterogeneity, while they emphasise and engender creative and innovative organisational contexts for research, education, and innovation. Several examples are offered in this context in order to demonstrate how and why the concept of 'Mode 3' universities is better equipped to address current and future challenges compared to a simple 'entrepreneurial university' approach. The full exploration of 'Mode 3' universities furthermore demands a strong linkage and contextualisation with (entrepreneurial) ecosystems.
This special issue examines corporate foresight and innovation management in contemporary organising. Contributing to a growing body of research on the other-centeredness and interconnectedness of foresight and innovation, the papers in the issue examine the practice of corporate foresight, how it may lead to the identification of opportunities for innovation, and the complex processes and conditions that enable (or impede) the capture of value from corporate foresight. Representing an interesting mix of empirical, conceptual, qualitative, and quantitative methodologies, the papers offer innovative theorising to extend our understanding of the logics of corporate foresight, their interactive effects and contribution to innovation management.
The relationships between online social networking (OSN) behaviour and users’ self-esteem are as important as they are ambiguous: both positive and negative self-esteem can encourage users to engage in OSNs. This work examined whether personality traits and attitudes toward those traits can explain this controversy. Data from 830 users of a local OSN were analysed. I hypothesised that extraversion and attitudes toward extraversion would eliminate correlations between positive self-esteem and users’ popularity (the number of friends and likes). In contrast, neuroticism and attitudes toward neuroticism failed to eliminate a negative correlation between self-esteem and an indicator of users’ self-validation (the number of impersonal avatars). This association also remained significant when conscientiousness as well as negative attitudes toward conscientiousness and agreeableness were controlled. However, self-esteem did not correlate with the two other self-validation indicators: the number of posts and of portraits. This study casts doubt on the possibility of direct associations between positive self-esteem and users’ popularity beyond such factors as extraversion. Nevertheless, it lends partial support to the association between negative self-esteem and users’ self-validation, such as the use of impersonal avatars, even when other personality characteristics are considered.
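"Eliminating" a correlation by controlling for a third variable, as discussed above, is commonly checked with a first-order partial correlation. The sketch below is a generic illustration of that technique, not the paper's actual analysis; the variable names and data are invented.

```python
import math

def pearson(x, y):
    """Sample Pearson correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def partial_corr(x, y, z):
    """First-order partial correlation of x and y, controlling for z."""
    rxy, rxz, ryz = pearson(x, y), pearson(x, z), pearson(y, z)
    return (rxy - rxz * ryz) / math.sqrt((1 - rxz ** 2) * (1 - ryz ** 2))

# Hypothetical scores (names and numbers are illustrative only).
esteem       = [3, 4, 2, 5, 3, 4, 2, 5]
friends      = [30, 45, 18, 60, 32, 44, 20, 58]
extraversion = [3, 5, 2, 5, 3, 4, 2, 4]

r_raw = pearson(esteem, friends)
r_partial = partial_corr(esteem, friends, extraversion)
print(f"raw r = {r_raw:.3f}, controlling for extraversion: {r_partial:.3f}")
```

If the raw self-esteem/popularity correlation shrinks substantially once extraversion is partialled out, the association is attributable to the personality trait, which is the logic of the study's popularity hypothesis.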
On a large dataset of Italian municipalities for the period 2003–2014, we investigate unexplored effects of fiscal consolidation in decentralized public finance. Based on a simple, realistic theoretical model, we show that municipalities increase arrears on committed public investment expenditure as a response to intergovernmental transfer cuts. Then, we test our predictions controlling for potential sources of endogeneity, and find that a reduction in central government transfers causes a significant increase in arrears, besides other usual adjustments to local fiscal policy (e.g., tax revenues). Our results highlight a perverse effect of fiscal consolidation packages implemented by centrally imposed fiscal restraints.
JEL classification: H30; H72; H77; C33; C36.
Regions are increasingly being viewed as eco‐systemic agglomerations of organizational and institutional entities or stakeholders with socio‐technical, socio‐economic, and socio‐political conflicting as well as converging (co‐opetitive) goals, priorities, expectations, and behaviors that they pursue via entrepreneurial development, exploration, exploitation, and deployment actions, reactions and interactions. In this context, our paper aims to explore and profile the nature and dynamics of the Quadruple/Quintuple Helix Innovation System Model or Framework (government, university, industry, civil society, environment) as an enabler and enactor of regional co‐opetitive entrepreneurial ecosystems which we conceptualize as fractal, multi‐level, multi‐modal, multi‐nodal, and multi‐lateral configurations of dynamic tangible and intangible assets within the resource‐based view and the new theory of the growth of the firm. Co‐opetitive fractal innovation and entrepreneurship ecosystems are defined and discussed, and examples of regional innovation policies and programs are presented. Furthermore, the concept of multi‐level innovation systems is analyzed, taking into account the existence of knowledge clusters and innovation networks, while alternative aggregations of multi‐level innovation systems are proposed based on their spatial (geographical) and non‐spatial (research‐based) functional properties.
We prove an isoperimetric inequality for the second non-zero eigenvalue of the Laplace–Beltrami operator on the real projective plane. For a metric of unit area this eigenvalue is not greater than 20π. This value is attained in the limit by a sequence of metrics of area one on the projective plane. The limiting metric is singular and can be realized as a union of the projective plane and the sphere touching at a point, with standard metrics and areas in the ratio 3:2. It is also proven that the multiplicity of the second non-zero eigenvalue on the projective plane is at most 6.
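In symbols, the scale-invariant form of the inequality stated above reads (a restatement in standard notation, not taken verbatim from the source; g is a Riemannian metric on the projective plane and λ₂ its second non-zero Laplace eigenvalue):

```latex
\lambda_2(\mathbb{RP}^2, g)\,\operatorname{Area}(\mathbb{RP}^2, g) \;\le\; 20\pi ,
```

with equality attained only in the limit, along a sequence of metrics degenerating to the singular union described above.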
This review is an attempt to read the main ideas of Catherine Malabou’s book Before Tomorrow: Epigenesis and Rationality, with a particular emphasis upon the problem of the modifiability of the transcendental and the rejection of the a priori dimension of subjectivity within scientific and philosophical thought of a materialist orientation. Malabou’s thesis of the epigenesis of pure reason evinces the dynamical dimension of the transcendental, integrating structural and evolutionary conceptions of reason. Epigenesis secures the stability of the phenomenal world and allows for the possibility of a contingent metamorphosis of reason, thereby establishing an economy of transcendental contingency. In general, Malabou’s work has many affinities with recent phenomenological thought, although it makes few explicit references to phenomenological philosophers as such.
The conflict measures induced by the conjunctive and disjunctive combining rules are studied in this paper within the framework of evidence theory. The coherence of conflict measures with combining rules is introduced and examined. In addition, the structure of conjunctive and disjunctive conflict measures is analyzed. In particular, it is shown that metric and entropy components can be distinguished in such measures; moreover, these components change differently after the combination of bodies of evidence.
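As a point of reference for the conjunctive case, the classical conflict quantity in Dempster–Shafer theory is the total mass that the (unnormalized) conjunctive rule assigns to the empty set. The sketch below illustrates that standard quantity only, not the specific measures studied in the paper; the masses are invented.

```python
from itertools import product

def conjunctive_conflict(m1, m2):
    """Mass assigned to the empty set by the (unnormalized) conjunctive rule.

    m1, m2: dicts mapping frozenset focal elements to masses summing to 1.
    """
    return sum(p * q
               for (a, p), (b, q) in product(m1.items(), m2.items())
               if not (a & b))

# Two bodies of evidence over the frame {'x', 'y'}.
m1 = {frozenset({'x'}): 0.6, frozenset({'y'}): 0.4}
m2 = {frozenset({'x'}): 0.3, frozenset({'y'}): 0.7}

k = conjunctive_conflict(m1, m2)
print(k)  # 0.6*0.7 + 0.4*0.3 = 0.54
```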
The structure of natural language can be viewed as a weighted semantic network. Such a representation makes it possible to investigate a text corpus as a model of the subject domain. In this paper we propose a mechanism for semantic network identification and construction. We apply this methodological instrument to social media text analysis and trace the dynamics of discussions about the year 1917 within internet communities. Network changes illustrate shifts of interest between different topics. The proposed mechanism can be used for monitoring various social processes and phenomena in online social networks and media.
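A common way to build such a weighted semantic network is to weight edges by how often two words co-occur within a sliding window. The sketch below is a generic illustration of that idea, not the authors' exact construction mechanism; the texts and window size are invented.

```python
from collections import Counter

def cooccurrence_network(texts, window=3):
    """Weighted word-pair graph: edge weight = within-window co-occurrence count."""
    edges = Counter()
    for text in texts:
        words = text.lower().split()
        for i in range(len(words)):
            for j in range(i + 1, min(i + window, len(words))):
                if words[i] != words[j]:
                    edges[tuple(sorted((words[i], words[j])))] += 1
    return edges

posts = ["revolution of 1917 in russia",
         "discussion of the 1917 revolution"]
net = cooccurrence_network(posts)
print(net[("1917", "revolution")])  # co-occurs in both posts -> 2
```

Comparing such networks built over successive time slices of a corpus is one way to trace how interest in a topic changes over time.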
Online social networks (OSNs) play an increasingly important role in news dissemination and consumption, attracting such traditional media outlets as TV channels with growing online audiences. Online news streams require appropriate instruments for analysis. One such tool is topic modeling (TM). However, TM has a set of limitations (the problem of choosing the number of topics and algorithm instability, among others) that must be addressed specifically for the task of sociological online news analysis. In this paper, we propose a full-cycle methodology for such a study: from choosing the optimal number of topics to the extraction of stable topics and the analysis of TM results. We illustrate it with an analysis of an online news stream of 164,426 messages produced by twelve national TV channels over a one-year period in a leading Russian OSN. We show that our method can easily reveal associations between news topics and user feedback, including sharing behavior. Additionally, we show how an uneven distribution of document quantities and lengths over classes (TV channels) can affect TM results.
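Stable-topic extraction of the kind mentioned above is often implemented by matching topics across independent TM runs via the similarity of their top-word lists. The sketch below shows one generic variant (Jaccard similarity with a threshold); it is an illustration under assumed names and data, not the paper's actual procedure.

```python
def jaccard(a, b):
    """Jaccard similarity of two top-word lists."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)

def stable_topics(run1, run2, threshold=0.5):
    """Topics from run1 that have a close match (top-word Jaccard) in run2."""
    return [t1 for t1 in run1
            if max(jaccard(t1, t2) for t2 in run2) >= threshold]

# Hypothetical top-word lists from two independent TM runs.
run1 = [["election", "vote", "party"], ["film", "festival", "actor"]]
run2 = [["vote", "election", "campaign"], ["weather", "storm", "rain"]]

print(stable_topics(run1, run2))  # only the election topic recurs
```

Topics that reappear across runs are treated as stable and carried forward to the interpretation stage; the rest are discarded as artifacts of algorithm instability.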
Market network analysis has attracted growing attention over the last decade. An important component of the market network is the model of the stock returns distribution. Elliptically contoured distributions are popular as a probability model of stock returns. The question of the adequacy of this model for real market data remains open: some known results reject the model, while others support it. The existing results concern testing particular properties of the elliptical model. In this paper another property of the elliptical model is considered, namely the symmetry condition on the tails of the two-dimensional distribution. A multiple statistical procedure for testing the elliptical model of the stock returns distribution is proposed. Sign symmetry conditions on the tail distributions are chosen as individual hypotheses for multiple testing. Uniformly most powerful tests of Neyman structure are constructed for testing the individual hypotheses. The associated stepwise multiple testing procedure is applied to real market data. To visualize the results, a rejection graph is constructed. The main result is that, under some conditions, the tail symmetry hypothesis is not rejected once a small number of hubs are removed from the rejection graph.
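Stepwise multiple testing procedures of the general family mentioned above can be illustrated by the classical Holm step-down correction; the sketch below shows that generic correction only (the paper's own procedure is built on Neyman-structure tests, and the p-values here are invented).

```python
def holm_reject(pvalues, alpha=0.05):
    """Holm step-down procedure: returns a reject/accept flag per hypothesis."""
    order = sorted(range(len(pvalues)), key=lambda i: pvalues[i])
    m = len(pvalues)
    reject = [False] * m
    for rank, i in enumerate(order):
        if pvalues[i] <= alpha / (m - rank):
            reject[i] = True
        else:
            break  # step-down: stop at the first acceptance
    return reject

# Hypothetical p-values for pairwise tail-symmetry hypotheses.
pvals = [0.001, 0.04, 0.03, 0.20]
print(holm_reject(pvals))  # [True, False, False, False]
```

In the paper's setting, each individual hypothesis corresponds to a pair of stocks, and the rejected pairs form the edges of the rejection graph.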
The paper studies two-dimensional modal logics with additional connectives (so-called Segerberg squares) and can be regarded as a continuation of the author's earlier paper (2012). It gives a new, simpler proof of the finite model property of minimal Segerberg squares using bisimulation games. It proves the square finite model property for Segerberg squares of polymodal T and D. It also constructs a faithful embedding of Segerberg squares into the equational theory of relation algebras.
We prove completeness for some normal modal predicate logics in the standard Kripke semantics with expanding domains. We consider quantified versions of propositional logics with the axiom of density plus some others (transitivity, confluence). The method of proof modifies the technique developed for other cases (without density) by S. Ghilardi, G. Corsi, and D. Skvortsov; here, however, we arrange the whole construction in a game-theoretic style.
Information technologies have evolved from their traditional back-office role into a strategic resource, able not only to support but also to shape business strategies. For over a decade, IT-business alignment has been ranked as a top-priority management concern and is widely covered in the literature. However, conceptual studies dominate the field, while there is little research on practical ways to achieve alignment. The aim of this paper is to formalize and verify the alignment assessment model developed in our previous research by integrating the traditional Strategic Alignment Model and the EA framework TOGAF, in an attempt to provide a practical approach to alignment evaluation and implementation. The Alloy language and Analyzer are used as the means of model formalization and verification.
This research aims to use an architectural approach to align IT and business in a chosen industrial enterprise, based on the Zachman model. All development processes existing within the enterprise, together with their relations and interactions needed to fulfill the enterprise mission, are represented in a chosen Enterprise Architecture framework. The framework does not just capture the main attributes and components of the organization; it also gives the company an opportunity to understand and analyze crucial weaknesses and inconsistencies that need to be identified and rectified. Nowadays enterprises use a wide range of established Enterprise Architecture frameworks, some of which were developed for specific fields, while others can be applied broadly. One framework with such broad applicability is the Zachman Enterprise Architecture Framework, a unique tool for creating an architectural description and applying solutions to the challenges identified for the enterprise under consideration. So, the main goal of the study is to provide practical guidance that enables alignment between business and IT, based on the Zachman Enterprise Architecture Framework.
This paper continues the author's previous study of velocity measurement methods in navigation receivers and is devoted to their comparative analysis.
Each method of measuring velocity has a few parameters. Let us fix all parameters except one (the main one) and vary that parameter. Each value of the varied parameter corresponds to some noise error of the velocity measurements, which can be characterized by its standard deviation, or SD (cm/s). A dynamic model of GNSS receiver motion determines the dynamic errors; the maximal dynamic error (MDE) (cm/s) is of interest in this case. This error depends on the “maneuver phase”, i.e., the shift of the maneuver start time from the starting point of the PLL control period and also from the starting point of the secondary processing period. The maximal value of the MDE over these shifts is of interest.
So, for each value of the varied parameter there is a pair of numbers: SD and MDE. Let us place these pairs in the plane of a coordinate system: the x-axis is MDE and the y-axis is SD. Connecting the nearest points, we obtain a curve called an exchange diagram. Since SD and MDE vary within a wide range, the diagrams should be built on a logarithmic scale, that is, in dB relative to 1 cm/s; let us call them logarithmic exchange diagrams (LEDs). Different LEDs were plotted for tough and soft dynamic scenarios and for different methods of velocity measurement, including the conventional one frequently discussed in the literature.
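The construction of an LED point set can be sketched as follows. The (MDE, SD) pairs are made up, and the amplitude convention 20·log10 (relative to 1 cm/s) is an assumption, since the text does not specify which dB convention is used.

```python
import math

def to_db(v_cm_s):
    """Express a velocity error in dB relative to 1 cm/s (amplitude convention)."""
    return 20 * math.log10(v_cm_s)

# Hypothetical (MDE, SD) pairs in cm/s, one per value of the varied parameter.
points = [(10.0, 0.1), (3.0, 0.5), (1.0, 2.0)]

# Logarithmic exchange diagram: x = MDE in dB, y = SD in dB.
led = [(to_db(mde), to_db(sd)) for mde, sd in points]
for x, y in led:
    print(f"MDE = {x:6.1f} dB, SD = {y:6.1f} dB")
```

Connecting consecutive points of `led` then yields the exchange-diagram curve described above, with the noise/dynamics trade-off visible as movement along the curve.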
As a result of the analysis, a method of generating frequency estimates of the input signal with their further filtering by an after-satellite second-order tracking filter, and a method based on quasi-optimal estimates of the input signal phase with further after-satellite filtration by a third-order tracking filter, have been recommended for tougher dynamic conditions. Under more favorable conditions, in addition to the two above, a method of generating coordinate increments over one period with further after-coordinate filtration by a second-order tracking filter, and a method of generating local coordinates with further after-coordinate filtration by a third-order tracking filter, have also been recommended. In conclusion, the law of variation of the velocity estimate SD for one of the best recommended methods was investigated as the method parameters were varied.