Book chapter
Development of the Mechanism of Assessing Cyber Risks in the Internet of Things Projects
We develop a mechanism for assessing cyber risks in Internet of Things (IoT) projects. The relevance of this topic stems from the growing sophistication of cyber-attacks, the speed at which new threats emerge, and the increasing damage caused by attacks. The paper addresses the decreasing efficiency of existing cyber risk assessment mechanisms and fills research gaps in this area. The results include the development of the mechanism's concept, its block diagram, and the specification and description of its constituent tools, together with a case study. Unlike its peers, the mechanism provides a holistic approach to cyber risk assessment, integrating and coordinating all related activities and tools. It simulates the confidence interval of the project's return on investment (ROI) and shows the chance of going beyond the risk appetite. It makes cyber risk assessment dynamic, iterative, and responsive to changes in the cyber environment. These advantages lead us to conclude that the mechanism should be of significant scientific and practical use.
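As a minimal sketch of the simulation component, assuming a Monte Carlo approach (the chapter does not publish code; all distributions, parameter values, and the risk-appetite threshold below are illustrative assumptions), the confidence interval of ROI and the chance of breaching the risk appetite could be estimated as follows:

    import numpy as np

    rng = np.random.default_rng(seed=42)
    N = 100_000  # number of Monte Carlo trials (assumed)

    # Illustrative, assumed distributions for an IoT project (values in $):
    benefit = rng.normal(loc=1_200_000, scale=150_000, size=N)  # project benefit
    cost = rng.normal(loc=800_000, scale=60_000, size=N)        # project cost

    # Crude, assumed cyber-loss model: Poisson incident counts with
    # lognormal per-incident severity.
    incidents = rng.poisson(lam=1.5, size=N)
    severity = rng.lognormal(mean=10.5, sigma=1.0, size=N)
    cyber_loss = incidents * severity

    roi = (benefit - cyber_loss - cost) / cost

    lo, hi = np.percentile(roi, [5, 95])     # 90% confidence interval for ROI
    risk_appetite = 0.0                      # assumed threshold: ROI below 0 is unacceptable
    p_breach = np.mean(roi < risk_appetite)  # estimated chance of breaching the appetite

    print(f"90% CI for ROI: [{lo:.2%}, {hi:.2%}]")
    print(f"P(ROI below risk appetite): {p_breach:.1%}")

The 5th and 95th percentiles of the sampled ROI give the confidence interval, and the share of trials falling below the threshold estimates the chance of going beyond the risk appetite; rerunning the simulation as threat parameters change is what makes the assessment dynamic and iterative.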
In book

This book constitutes the joint refereed proceedings of the 20th International Conference on Next Generation Teletraffic and Wired/Wireless Advanced Networks and Systems, NEW2AN 2020, and the 13th Conference on Internet of Things and Smart Spaces, ruSMART 2020. The conferences were held virtually due to the COVID-19 pandemic.
The 79 revised full papers presented were carefully reviewed and selected from 225 submissions. The NEW2AN papers address various aspects of next-generation data networks, with special attention to advanced wireless networking and applications. In particular, they deal with novel and innovative approaches to the performance and efficiency analysis of 5G and beyond systems, employing game-theoretical formulations, advanced queuing theory, and stochastic geometry, while also covering the Internet of Things, cyber security, optics, signal processing, and business aspects. ruSMART 2020 provides a forum for academic and industrial researchers to discuss new ideas and trends in these emerging areas.
The development of information and communication technologies (ICT) is a key aspect of modernising the Russian economy. Russia is gradually approaching developed countries in terms of ICT infrastructure and Internet access. Along with the opportunities opened by the rapid development and proliferation of ICT, systemic threats to the security of critical components of public and private infrastructures are becoming increasingly serious. Accordingly, achieving an acceptable level of supply chain cyber security, and managing the relevant risks, are becoming top priorities of national policy.
The article reviews the main events of the Third International Summer School on Cyber Law, organized by the Laboratory of Information Law (National Research University Higher School of Economics, Russia).
This year, applications for participation in the summer school were submitted from the UK, Italy, Germany, Slovakia, Armenia, India, Belarus, Kyrgyzstan, and various cities of the Russian Federation. Within the framework of the summer school, the most current research trends in the fields of information law and intellectual property law were touched upon, new problems and issues were raised, and solutions were suggested. Guests of the summer school included representatives of IBM, Yandex, Google, MegaFon, Wargaming.net, and Kaspersky Lab, as well as professors from foreign universities.
The intense program of the summer school included discussions on the legal aspects of developing and introducing cognitive systems, legal regulation of computer games, novelties in Russian information legislation, relevant issues of telecommunications law and copyright, legal aspects of cyber security, and other important legal issues in the IT/IP sphere.
Participants paid considerable attention to the problems of enforcing the new Russian legislation on requirements for organizers of information dissemination on the Internet and for popular bloggers. In light of the enactment of this legislation, questions of website blocking were raised again. In the field of telecommunications law, the issues of legal regulation of OTT services were the most disputed. The legal aspects of the computer games industry discussed at the summer school included the legal protection of computer games as objects of intellectual property, as well as issues of e-commerce in online computer games.
A special event within the summer school program was a master class by foreign professors on how to write articles in English for international peer-reviewed journals.
It is well known that the Dolev-Yao adversary is a powerful one. Besides acting as the network (intercepting, sending, and composing messages), he can remember as much information as he needs. That is, his memory is unbounded.
We recently proposed a weaker Dolev-Yao-like adversary, which also acts as the network, but whose memory is bounded. We showed that this Bounded Memory Dolev-Yao adversary, when given enough memory, can carry out many existing protocol anomalies. In particular, the known anomalies arise for bounded-memory protocols, where there is only a bounded number of concurrent sessions and the honest participants of the protocol cannot remember an unbounded number of facts or nonces at a time. This led us to the question of whether it is possible to infer an upper bound on the memory required by the Dolev-Yao adversary to carry out an anomaly from the memory restrictions of the bounded protocol. This paper answers this question negatively (Theorem 2).
The second contribution of this paper is the formalization of Progressing Collaborative Systems that may create fresh values, such as nonces. In this setting there is no unbounded adversary, although bounded memory adversaries may be present. We prove the NP-completeness of the reachability problem for Progressing Collaborative Systems that may create fresh values.
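To make the bounded-memory restriction concrete, here is an illustrative sketch (the class name, fact representation, and eviction interface are assumptions for exposition, not the paper's formal multiset rewriting semantics) in which the adversary's store has a fixed number of slots, so learning a new fact may require forgetting an old one:

    class BoundedMemoryAdversary:
        """A Dolev-Yao-style adversary whose store holds at most `capacity` facts."""

        def __init__(self, capacity: int):
            self.capacity = capacity
            self.facts: set[str] = set()

        def learn(self, fact: str, forget: str | None = None) -> bool:
            """Remember an intercepted fact, evicting `forget` first if memory is full."""
            if fact in self.facts:
                return True
            if len(self.facts) >= self.capacity:
                if forget is None or forget not in self.facts:
                    return False  # no free slot: the new fact cannot be stored
                self.facts.discard(forget)
            self.facts.add(fact)
            return True

    adv = BoundedMemoryAdversary(capacity=2)
    adv.learn("nonce_A")
    adv.learn("key_AB")
    assert not adv.learn("nonce_B")                # memory full, nothing evicted
    assert adv.learn("nonce_B", forget="nonce_A")  # must forget in order to learn

In the rewriting view, "forgetting" corresponds to overwriting a remembered fact with an empty memory slot, which keeps the total number of facts in a configuration constant (the "balanced" condition that also appears in the related results below).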
Recent work on structure-preserving signatures studies the optimality of these schemes in terms of the number of group elements needed in the verification key and the signature, and the number of pairing-product equations in the verification algorithm. While the size of keys and signatures is crucial for many applications, another important aspect of performance is the time it takes to verify a given signature. By far the most expensive operation during verification is the computation of pairings. However, the concrete number of pairings that one needs to compute is not captured by the number of pairing-product equations considered in earlier work. To fill this gap, we consider the question of the minimal number of pairings that must be computed when verifying structure-preserving signatures. First, we prove lower bounds for schemes in the Type II setting that are secure under chosen message attacks in the generic group model: we show that three pairings are necessary and that at most one of these pairings can be precomputed. We also extend our lower bound proof to schemes secure under random message attacks and show that in this case two pairings are still necessary. Second, we build an automated tool to search for schemes matching our lower bounds. The tool can automatically and exhaustively generate all valid structure-preserving signature schemes within a user-specified search space and analyze their (bounded) security in the generic group model. Interestingly, using this tool, we find a new randomizable structure-preserving signature scheme in the Type II setting that is optimal with respect to the lower bound on the number of pairings, and also minimal with respect to the number of group operations that have to be computed during verification.
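For orientation, a pairing-product equation over group elements has the following general form (standard notation from the structure-preserving signature literature, not taken from this specific paper); in the Type II setting there is an efficiently computable homomorphism from G_2 to G_1 but not vice versa:

    \prod_{i} \prod_{j} e(X_i, Y_j)^{a_{ij}} = T,
    \qquad X_i \in \mathbb{G}_1,\; Y_j \in \mathbb{G}_2,\; a_{ij} \in \mathbb{Z}_p,\; T \in \mathbb{G}_T.

Each distinct argument pair with a nonzero exponent costs one pairing evaluation, so the number of pairings, not the number of equations, drives verification time; moreover, a pairing whose arguments are both fixed (for example, two public-key elements) can be evaluated once and cached, which is why it matters that at most one of the three necessary pairings is precomputable.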
Many security protocols rely on assumptions about the physical properties of the environment in which their sessions are carried out. For instance, distance bounding protocols take into account the round-trip time of messages and the transmission velocity to infer an upper bound on the distance between two agents. We classify such security protocols as Cyber-Physical. Time plays a key role in the design and analysis of many of these protocols. This paper investigates the foundational differences, and the impact on the analysis, of using models with discrete time versus models with dense time. We show that there are attacks that can be found by models using dense time but not by models using discrete time. We illustrate this with a novel attack that can be carried out on most distance bounding protocols. In this attack, one exploits the execution delay of instructions during one clock cycle to convince a verifier that one is in a location different from one's actual position. We propose a multiset rewriting model with dense time suitable for specifying cyber-physical security protocols. We introduce Circle-Configurations and show that they can be used to symbolically solve the reachability problem for our model. Finally, we show that for the important class of balanced theories the reachability problem is PSPACE-complete.
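As a worked instance of the timing assumption (the symbols here are chosen for illustration), if the verifier measures a challenge-response round-trip time \Delta t, the signal propagates at speed c, and the prover's processing delay is t_p, then the inferred distance bound is

    d \le \frac{c \cdot (\Delta t - t_p)}{2}.

In a discrete-time model, t_p can only be counted in whole clock cycles, whereas the actual execution delay of the prover's instructions falls strictly inside a cycle; the novel attack mentioned above exploits precisely this sub-cycle slack, which is representable with dense time but invisible to discrete-time analysis.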
Activities such as clinical investigations (CIs) or financial processes are subject to regulations that ensure the quality of results and avoid negative consequences. Regulations may be imposed by multiple governmental agencies as well as by institutional policies and protocols. Due to the complexity of both the regulations and the activities, there is great potential for violations caused by human error, misunderstanding, or even intent. Executable formal models of regulations, protocols, and activities can form the foundation for automated assistants that aid planning, monitoring, and compliance checking. We propose a model based on multiset rewriting where time is discrete and is specified by timestamps attached to facts. Actions, as well as initial, goal, and critical states, may be constrained by means of relative time constraints. Moreover, actions may have non-deterministic effects, i.e., they may have different outcomes whenever applied. We present a formal semantics of our model based on focused proofs of linear logic with definitions. We also determine the computational complexity of various planning problems. The plan compliance problem, for example, is the problem of finding a plan that leads from an initial state to a desired goal state without reaching any undesired critical state. We consider all actions to be balanced, i.e., their pre- and post-conditions have the same number of facts. Under this assumption on actions, we show that the plan compliance problem is PSPACE-complete when all actions have only deterministic effects and EXPTIME-complete when actions may have non-deterministic effects. Finally, we show that the restrictions on the form of actions and time constraints in the specification of our model are necessary for the decidability of the planning problems.
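To convey the flavor of the model (the concrete syntax here is adapted for illustration and may differ from the paper's), a timestamped fact F@t records that fact F holds at time t, and a balanced action with a relative time constraint and a non-deterministic effect, e.g. drawn from a clinical-investigation protocol, can be written as

    \mathsf{Collected}(x)@t,\ \mathsf{Lab}@t' \mid t' - t \le 72
        \longrightarrow \mathsf{Analyzed}(x)@(t'+24),\ \mathsf{Lab}@(t'+24)
        \quad\text{or}\quad \mathsf{Spoiled}(x)@(t'+24),\ \mathsf{Lab}@(t'+24).

Both outcomes consume two facts and produce two facts, so the action is balanced; a critical state could then be any configuration that still contains Collected(x)@t more than 72 hours after t, and a compliant plan is one that reaches the goal state while avoiding every such critical state.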