Ch. 8. Theoretical Foundations of Information Systems Design. 8.5. Database Design
In this article, the problem of Big Data is examined from the standpoint of civil law, in the context of whether the existing mechanisms are sufficient for the purposes of civil regulation of Big Data or whether a qualitative revision of the system of objects of civil rights, including intellectual property, is required. Within the framework of the civil-law discussion, it is proposed to consider Big Data in close connection with the formation of new knowledge, including knowledge derived from its analysis, for the purposes of using it in one’s own activity or selling it on the market, and, as a result, to qualify Big Data as a special service based on Big Data technology. The emphasis on “service” focuses attention on the “dynamics” of the relations and on the subject of regulation. Equally, the inclusion in the concept of the indications of an “information and analytical” nature and of “Big Data technology” highlights the relevant specific features. Commenting on the characteristics of various objects of civil rights, the authors note the impossibility of extending the existing legal regimes to Big Data and argue for the expediency of recognising Big Data as a new, non-traditional object of intellectual property. The proposed approach, according to the authors, makes it possible to take into account not only the differentiation of objects of intellectual property in the broadest sense, but also their inherent unity, which is manifested in the granting of special — exclusive — rights to intangible objects that are the results of the activity in question.
An array DBMS streamlines the management of large N-d arrays. A large portion of such arrays originates from the geospatial domain, where they often natively come as raster files, and standalone command-line tools are among the most popular ways of processing these files. Decades of development and feedback have produced numerous feature-rich, elaborate, free, and quality-assured tools, optimized mostly for a single machine. ChronosDB partially delegates in situ data processing to such tools and offers a formal N-d array data model that abstracts away from the files and the tools. ChronosDB readily provides a rich collection of array operations at scale and outperforms SciDB by up to 75× on average.
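The delegation scheme described above — partitioning a large array into per-file chunks and handing each chunk to an external, single-machine tool — can be illustrated with a minimal sketch. This is not ChronosDB's actual API or data model; all names are hypothetical, the "tool" is simulated by a local function rather than a real command-line invocation, and the raster is a toy nested-list array:

```python
# Hedged sketch: partition a 2-d raster into tiles and process each tile
# independently, in the spirit of an array DBMS delegating per-chunk work
# to standalone tools. All names here are illustrative, not ChronosDB's API.
from concurrent.futures import ThreadPoolExecutor

def make_raster(rows, cols):
    """Build a toy 2-d raster as nested lists: cell value = row * cols + col."""
    return [[r * cols + c for c in range(cols)] for r in range(rows)]

def tiles(raster, tile_rows, tile_cols):
    """Yield (origin, sub-array) tiles covering the raster, like per-file chunks."""
    for r0 in range(0, len(raster), tile_rows):
        for c0 in range(0, len(raster[0]), tile_cols):
            tile = [row[c0:c0 + tile_cols] for row in raster[r0:r0 + tile_rows]]
            yield (r0, c0), tile

def tile_max(origin_and_tile):
    """Per-tile aggregate; in a real system this step could shell out to a CLI tool."""
    _, tile = origin_and_tile
    return max(max(row) for row in tile)

raster = make_raster(4, 6)              # 4x6 raster, values 0..23
with ThreadPoolExecutor(max_workers=4) as pool:
    partial_maxima = list(pool.map(tile_max, tiles(raster, 2, 3)))
global_max = max(partial_maxima)        # combine per-chunk results
print(global_max)  # 23: the largest cell value across all tiles
```

The point of the sketch is the split between a chunk-local step (which a mature external tool can perform without knowing about the rest of the array) and a cheap combine step owned by the DBMS; aggregates such as max, min, and sum decompose this way naturally.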