Second International Conference on Formal Structures for Computation and Deduction, FSCD 2017
Lambek calculus is a logical foundation of categorial grammar, a linguistic paradigm of grammar as logic and parsing as deduction. Pentus (2010) gave a polynomial-time algorithm for determining provability of bounded-depth formulas in L*, the Lambek calculus with empty antecedents allowed. Pentus’ algorithm is based on tabularisation of proof nets. The Lambek calculus with brackets is a conservative extension of the Lambek calculus with bracket modalities, suitable for the modeling of syntactical domains. In this paper we give an algorithm for provability in Lb*, the Lambek calculus with brackets allowing empty antecedents. Our algorithm runs in polynomial time when both the formula depth and the bracket nesting depth are bounded. It combines a Pentus-style tabularisation of proof nets with an automata-theoretic treatment of bracketing.
Heaps are well-studied fundamental data structures with myriad applications, both theoretical and practical. We consider the problem of designing a heap with an “optimal” extract-min operation. Assuming an arbitrary linear ordering of keys, a heap with n elements typically takes O(log n) time to extract the minimum. Extracting all elements faster is impossible, as this would violate the Ω(n log n) bound for comparison-based sorting. It is known, however, that it takes only O(n + k log k) time to sort just the k smallest elements out of n given, which suggests that there might be a faster heap whose extract-min performance depends on the number of elements extracted so far. In this paper we show that this is indeed the case. We present a version of the heap that performs insert in O(1) time and takes only O(log* n + log k) time to carry out the k-th extraction (where log* denotes the iterated logarithm). All the above bounds are worst-case.
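For contrast, the classical baseline the abstract improves upon can be sketched with a standard binary heap, where both insert and extract-min cost O(log n). This is a minimal illustration using Python's `heapq` (the class name `BinaryHeap` is illustrative, not the paper's construction):

```python
import heapq

class BinaryHeap:
    """Classical binary heap: O(log n) insert, O(log n) extract-min."""

    def __init__(self):
        self._items = []

    def insert(self, key):
        heapq.heappush(self._items, key)   # sift-up: O(log n)

    def extract_min(self):
        return heapq.heappop(self._items)  # sift-down: O(log n)

h = BinaryHeap()
for x in [5, 1, 4, 2, 3]:
    h.insert(x)

# Extracting the k = 2 smallest keys costs O(k log n) here, whereas the
# structure described above does the k-th extraction in O(log* n + log k).
smallest_two = [h.extract_min(), h.extract_min()]
```

Extracting k elements from this baseline costs O(k log n) in total; the point of the abstract is that the dependence on n can be reduced to the nearly constant log* n term.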
The article examines the problems of defining the term “computer simulations of scientific experiments”. The first part analyzes Duran's original method for classifying variants of the term, taken as the most successful way of demonstrating the significant contradictions that exist among philosophers regarding the place and role of computer simulations in the philosophy of science. In the second part of the article, the author formulates the term itself by identifying the main features of computer simulations, as a result of studying the nature of experimental data as the transfer of traces of an experiment from a graphematical space to a representative one. Following the concept of transposition, the author derives a relevant term from the essence of computer simulations thus revealed, claiming a new epistemological significance of this kind of scientific experiment for the philosophy of science.
This book constitutes the refereed proceedings of the Applied Informatics and Cybernetics in Intelligent Systems section of the 9th Computer Science On-line Conference 2020 (CSOC 2020), held on-line in April 2020. It is Vol. 3 of the CSOC 2020 proceedings; the papers in this part discuss modern cybernetics, computer engineering, and applied informatics in technical systems. Across all sections, CSOC 2020 received more than 270 submissions from more than 35 countries. More than 65% of accepted submissions were received from Europe, 21% from Asia, 8% from Africa, 4% from America and 2% from Australia. The CSOC 2020 conference intends to provide an international forum for the discussion of the latest high-quality research results in all areas related to Computer Science. The Computer Science On-line Conference is held on-line, and the modern communication technology that is broadly used improves the traditional concept of scientific conferences, bringing an equal opportunity to participate to all researchers around the world.
Data management and analysis is one of the fastest-growing and most challenging areas of research and development in both academia and industry. Numerous types of applications and services have been studied and re-examined in this field, resulting in this edited volume, which includes chapters on effective approaches for dealing with the inherent complexity of data management and analysis. This edited volume contains practical case studies and will appeal to students, researchers and professionals working in data management and analysis in the business, education, healthcare and bioinformatics areas.
We re-examine the problem of existential import by using classical predicate logic. Our problem is: how should existential import be distributed among the quantified propositions in order for all the relations of the logical square to be valid? After defining existential import and scrutinizing the available solutions, we distinguish between three possible cases: explicit import, implicit non-import, and explicit negative import, and formalize the propositions accordingly. Then we examine the 16 combinations between the 8 propositions having the first two kinds of import, the third one being trivial, and rule out the squares in which at least one relation does not hold. This leads to the following results: (1) three squares are valid when the domain is non-empty; (2) one of them is valid even in the empty domain, so the square can be saved in arbitrary domains; and (3) the aforementioned eight propositions give rise to a cube, which contains two more (non-classical) valid squares and several hexagons. A classical solution to the problem of existential import is thus possible, without resorting to deviant systems and merely relying upon the symbolism of First-order Logic (FOL). Aristotle's system then appears as a fragment of a broader system which can be developed by using FOL.
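The kind of check the abstract describes can be carried out mechanically over finite domains. The sketch below formalizes one combination (an assumption for illustration, not necessarily one of the paper's three winning squares): the universals A and E carry explicit existential import, while the particulars I and O carry implicit non-import (they hold vacuously for an empty subject term). It then brute-forces all interpretations of the subject and predicate terms over a small domain and verifies contrariety, subcontrariety, contradiction, and subalternation:

```python
from itertools import combinations

def powerset(xs):
    xs = list(xs)
    return [set(c) for r in range(len(xs) + 1) for c in combinations(xs, r)]

# Universals with explicit existential import; particulars as their negations
# (implicit non-import: true whenever the subject term S is empty).
def A(S, P): return bool(S) and S <= P        # some S exists, and all S are P
def E(S, P): return bool(S) and not (S & P)   # some S exists, and no S is P
def I(S, P): return not S or bool(S & P)      # no S, or some S is P
def O(S, P): return not S or not (S <= P)     # no S, or some S is not P

def square_valid(domain):
    """Check all four relations of the square for every S, P over the domain."""
    terms = powerset(domain)
    for S in terms:
        for P in terms:
            a, e, i, o = A(S, P), E(S, P), I(S, P), O(S, P)
            if a and e:               return False  # contraries: never both true
            if not (i or o):          return False  # subcontraries: never both false
            if a == o or e == i:      return False  # contradictories
            if (a and not i) or (e and not o):
                return False                        # subalternation
    return True

print(square_valid(set()), square_valid({0, 1, 2}))  # True True
```

Under this formalization the square survives even over the empty domain, illustrating result (2) of the abstract; a finite check is of course only a sanity test of the formalization, not a proof for arbitrary domains.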
Relativisation involves dependencies which, although unbounded, are constrained with respect to certain island domains. The Lambek calculus L can provide a very rudimentary account of relativisation limited to unbounded peripheral extraction; the Lambek calculus with bracket modalities Lb can further condition this account according to island domains. However, in naïve parsing/theorem-proving by backward-chaining sequent proof search for Lb, the bracketed island domains, which can be indefinitely nested, have to be specified in the linguistic input. In realistic parsing, word order is given, but such hierarchical bracketing structure cannot be assumed to be given. In this paper we show how parsing can be realised in which the bracketing structure is induced during backward-chaining sequent proof search with Lb.
Logical frameworks allow the specification of deductive systems using the same logical machinery. Linear logical frameworks have been successfully used for the specification of a number of computational systems, logics, and proof systems. Their success relies on the fact that formulas can be distinguished as linear, which behave intuitively as resources, and unbounded, which behave intuitionistically. Commutative subexponentials enhance the expressiveness of linear logic frameworks by allowing the distinction of multiple contexts. These contexts may behave as multisets or sets of formulas. Motivated by applications in distributed systems and in type-logical grammar, we propose a linear logical framework containing both commutative and non-commutative subexponentials. Non-commutative subexponentials can be used to specify contexts which behave as lists, not multisets, of formulas. In addition, motivated by our applications in type-logical grammar, where the weakening rule is disallowed, we investigate the proof theory of formulas that can only contract, but not weaken. In fact, our contraction is non-local. We demonstrate that under some conditions such formulas may be treated as unbounded formulas, which behave intuitionistically.
The following paper considers the debate on the disciplinary boundaries of logic in German philosophy of the early 19th century. Four competing views on the understanding of logical knowledge are distinguished. The analysis of this controversy makes it possible to situate the Hegelian idea of the "Science of Logic" project and to clarify the historical context of the emergence of formal logic as a discipline.