Total publications in this section: 16
Article
Beaudou L., Nourine L., Mary A. Theoretical Computer Science. 2017. Vol. 658. P. 391-398.

In this paper we consider k-meet-semidistributive lattices and are interested in computing the set-colored poset associated with an implicational base. The parameter k is of interest since for any finite lattice there exists an integer k for which the lattice is k-meet-semidistributive. When k = 1, such lattices are known simply as meet-semidistributive lattices.

We first give a polynomial-time algorithm to compute an implicational base of a k-meet-semidistributive lattice from its associated colored poset. In other words, for a fixed k, finding a minimal implicational base of a k-meet-semidistributive lattice L from a context (in FCA terms) of L can be done not just in output-polynomial time (which is open in the general case) but in time polynomial in the size of the input; this generalizes a previously known result. Second, we derive an algorithm to compute a set-colored poset from an implicational base; it is based on the enumeration of minimal transversals of a hypergraph and turns out to run in polynomial time for k-meet-semidistributive lattices. Finally, we show that checking whether a given implicational base describes a k-meet-semidistributive lattice can be done in polynomial time.
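
As a minimal illustration of the objects involved, the sketch below computes the closure of an attribute set under a hypothetical implicational base by naive forward chaining. It only shows the basic closure operator that an implicational base defines; it is not the paper's polynomial-time algorithm.

```python
def closure(attrs, implications):
    """Close a set of attributes under a list of implications (A -> B).

    Naive forward chaining; illustrative only, not the paper's
    algorithm for k-meet-semidistributive lattices."""
    closed = set(attrs)
    changed = True
    while changed:
        changed = False
        for premise, conclusion in implications:
            if premise <= closed and not conclusion <= closed:
                closed |= conclusion
                changed = True
    return closed

# Hypothetical implicational base over attributes {a, b, c, d}.
base = [(frozenset("a"), frozenset("b")),
        (frozenset("bc"), frozenset("d"))]

print(sorted(closure({"a"}, base)))        # ['a', 'b']
print(sorted(closure({"a", "c"}, base)))   # ['a', 'b', 'c', 'd']
```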

Article
Belova T., Bliznets I. Theoretical Computer Science. 2020. Vol. 803. P. 222-233.

In this paper, we consider the (n,3)-MAXSAT problem, a special case of the Maximum Satisfiability problem with the additional requirement that each variable appears at most three times in the input formula. We improve previous upper bounds for (n,3)-MAXSAT both in terms of n (the number of variables) and in terms of k (the number of clauses we are required to satisfy). Moreover, we prove that deciding whether more clauses can be satisfied than by the simple all-true assignment is NP-hard.
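
To make the statement concrete, here is a brute-force sketch (with a toy encoding of literals as signed integers, an assumption of this example) that compares the all-true assignment with the true optimum on a small formula in which each variable occurs at most three times. It is exponential and purely illustrative.

```python
from itertools import product

def max_sat(n_vars, clauses):
    """Brute-force MAXSAT: a clause is a list of literals, where
    literal +i / -i means variable i is true / false (1-indexed)."""
    best = 0
    for bits in product([False, True], repeat=n_vars):
        sat = sum(any((lit > 0) == bits[abs(lit) - 1] for lit in cl)
                  for cl in clauses)
        best = max(best, sat)
    return best

# Toy (n,3) instance: each of the 3 variables occurs at most 3 times.
clauses = [[1, -2], [-1, 3], [2, -3], [-1, -3]]

# The all-true assignment satisfies exactly the clauses containing
# a positive literal; here the optimum does strictly better.
all_true = sum(any(lit > 0 for lit in cl) for cl in clauses)
print(all_true, max_sat(3, clauses))  # 3 4
```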

Article
Durand B., Gamard G., Grandjean A. Theoretical Computer Science. 2017. Vol. 666. P. 36-47.
Article
Shitov Y. Theoretical Computer Science. 2017. Vol. 671. P. 116-118.

For any integer k > 2, we consider the following decision problem. Given a simple graph G, does there exist a partition of its vertices into two disjoint sets such that every simple k-cycle of G contains vertices from both of these sets? This problem is NP-hard because it admits a polynomial reduction from NAE 3-SAT. We construct a reduction that is polynomial both in the length of the instance and in k, which answers a recent question of Karpinski.
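
The decision problem can be stated concretely by brute force on tiny graphs (the graph and the encoding below are illustrative assumptions, not the paper's reduction):

```python
from itertools import combinations, permutations

def k_cycle_sets(n, edges, k):
    """Vertex sets of all simple k-cycles in a graph on 0..n-1
    (brute force over vertex sequences; tiny graphs only)."""
    adj = {v: set() for v in range(n)}
    for u, v in edges:
        adj[u].add(v); adj[v].add(u)
    cycles = set()
    for perm in permutations(range(n), k):
        if (all(perm[i + 1] in adj[perm[i]] for i in range(k - 1))
                and perm[0] in adj[perm[-1]]):
            cycles.add(frozenset(perm))
    return cycles

def has_k_cycle_bipartition(n, edges, k):
    """Is there a 2-partition meeting every simple k-cycle on both sides?"""
    cycles = k_cycle_sets(n, edges, k)
    for r in range(n + 1):
        for side in combinations(range(n), r):
            s = set(side)
            if all(c & s and c - s for c in cycles):
                return True
    return False

# K4 has four triangles; splitting the vertices into two pairs meets
# every 3-cycle on both sides, so the answer for k = 3 is True.
k4 = [(i, j) for i in range(4) for j in range(i + 1, 4)]
print(has_k_cycle_bipartition(4, k4, 3))  # True
```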

Article
Korpelainen N., Lozin V. V., Malyshev D. et al. Theoretical Computer Science. 2011. Vol. 412. P. 3545-3554.

The notion of a boundary graph property was recently introduced as a relaxation of that of a minimal property and was applied to several problems of both algorithmic and combinatorial nature. In the present paper, we first survey recent results related to this notion and then apply it to two algorithmic graph problems: Hamiltonian cycle and Vertex k-colorability. In particular, we discover the first two boundary classes for the Hamiltonian cycle problem and prove that for any k > 3 there is a continuum of boundary classes for Vertex k-colorability.

Article
Babenko M., Gawrychowski P., Kociumaka T. et al. Theoretical Computer Science. 2016. Vol. 638. P. 112-121.

We consider the problems of computing the maximal and the minimal non-empty suffixes of substrings of a longer text of length n. For the minimal suffix problem we show that for every τ, 1 ≤ τ ≤ log n, there exists a linear-space data structure with O(τ) query time and O(n log n / τ) preprocessing time. As a sample application, we show that this data structure can be used to compute the Lyndon decomposition of any substring of the text in O(kτ) time, where k is the number of distinct factors in the decomposition. For the maximal suffix problem, we give a linear-space structure with O(1) query time and O(n) preprocessing time; in other words, we simultaneously achieve both the optimal query time and the optimal construction time. © 2015 Elsevier B.V.
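
For a whole string, the Lyndon decomposition can be computed by Duval's classical linear-time algorithm; the sketch below shows that baseline, not the paper's O(kτ)-per-substring data structure.

```python
def lyndon_decomposition(s):
    """Duval's algorithm: factor s into a non-increasing sequence of
    Lyndon words in O(|s|) time."""
    i, n, factors = 0, len(s), []
    while i < n:
        j, k = i + 1, i
        while j < n and s[k] <= s[j]:
            # Extend the current run; restart comparison on strict increase.
            k = i if s[k] < s[j] else k + 1
            j += 1
        while i <= k:
            factors.append(s[i:i + j - k])
            i += j - k
    return factors

print(lyndon_decomposition("banana"))  # ['b', 'an', 'an', 'a']
```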

Article
Vereshchagin N. Theoretical Computer Science. 2020. Vol. 809. P. 531-537.

The purpose of this paper is to answer two questions left open in [B. Durand, A. Shen, and N. Vereshchagin, Descriptive Complexity of Computable Sequences, Theoretical Computer Science 171 (2001), pp. 47-58]. Namely, we consider the following two complexities of an infinite computable 0-1-sequence α: C^0'(α), defined as the minimal length of a program with oracle 0' that prints α, and M(α), defined as lim sup C(α_{1:n} | n), where α_{1:n} denotes the length-n prefix of α and C(x|y) stands for conditional Kolmogorov complexity. We show that C^0'(α) ≤ M(α) + O(1), and that M(α) is not bounded by any computable function of C^0'(α), even on the domain of computable sequences.

Article
Babin M. A., Kuznetsov S. Theoretical Computer Science. 2017. Vol. 658, Part B. P. 316-326.

Dualization of a monotone Boolean function on a finite lattice can be represented by transforming the set of its minimal 1-values into the set of its maximal 0-values. In this paper we consider finite lattices given by ordered sets of their meet and join irreducibles (i.e., as a concept lattice of a formal context). We show that in this case dualization is equivalent to the enumeration of so-called minimal hypotheses. In contrast to the usual dualization setting, where a lattice is given by the ordered set of its elements, dualization in this case is shown to be impossible in output-polynomial time unless P = NP. However, if the lattice is distributive, dualization is shown to be possible in subexponential time.
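
On the simplest lattice, the Boolean lattice of all subsets, dualization can be sketched by brute force: turn the minimal 1-sets of a monotone function into its maximal 0-sets. This exponential sketch only fixes the notions; it is not the paper's setting of lattices given by their irreducibles.

```python
from itertools import combinations

def maximal_false_sets(universe, min_true_sets):
    """Brute-force dualization of a monotone Boolean function on the
    Boolean lattice 2^universe: transform its minimal 1-sets into its
    maximal 0-sets.  Exponential; for illustration only."""
    u = set(universe)

    def is_true(s):
        # Monotone function defined by its minimal true sets.
        return any(t <= s for t in min_true_sets)

    false_sets = [set(c) for r in range(len(u) + 1)
                  for c in combinations(sorted(u), r)
                  if not is_true(set(c))]
    # Keep only the maximal false sets.
    return [s for s in false_sets
            if not any(s < t for t in false_sets)]

mins = [{'a', 'b'}, {'b', 'c'}]
print(maximal_false_sets('abc', mins))  # [{'b'}, {'a', 'c'}]
```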

Article
Slavnov S. A. Theoretical Computer Science. 2006. Vol. 357. No. 1-3. P. 215-229.
Article
Zhivotovskiy N., Hanneke S. Theoretical Computer Science. 2018. Vol. 742. P. 27-49.

In statistical learning, the excess risk of empirical risk minimization (ERM) is controlled by (COMP_n(F)/n)^α, where n is the size of the learning sample, COMP_n(F) is a complexity term associated with a given class F, and α ∈ [1/2, 1] interpolates between slow and fast learning rates. In this paper we introduce an alternative localization approach for binary classification that leads to a novel complexity measure: fixed points of the local empirical entropy. We show that this complexity measure gives tight control over COMP_n(F) in the upper bounds under bounded noise. Our results are accompanied by a minimax lower bound that involves the same quantity. In particular, we practically answer the question of optimality of ERM under bounded noise for general VC classes.

Article
Lozin V. V., Malyshev D., Mosca R. et al. Theoretical Computer Science. 2017. Vol. 700. P. 63-74.

Weighted independent domination is an NP-hard graph problem, which remains computationally intractable in many restricted graph classes. In particular, the problem is NP-hard in the classes of sat-graphs and chordal graphs. We strengthen these results by showing that the problem is NP-hard in a proper subclass of the intersection of sat-graphs and chordal graphs. On the other hand, we identify two new classes of graphs where the problem admits polynomial-time solutions.
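
For intuition about the problem itself, weighted independent domination can be solved by exhaustive search on small graphs. The sketch below (with an illustrative weighted path) enumerates all independent dominating sets and returns the minimum weight; the paper's polynomial-time cases of course avoid such enumeration.

```python
from itertools import combinations

def min_weight_ind_dom(n, edges, weight):
    """Minimum weight of an independent dominating set, by exhaustive
    search over all vertex subsets (exponential; tiny graphs only)."""
    adj = {v: set() for v in range(n)}
    for u, v in edges:
        adj[u].add(v); adj[v].add(u)
    best = None
    for r in range(n + 1):
        for cand in combinations(range(n), r):
            s = set(cand)
            independent = all(adj[u].isdisjoint(s - {u}) for u in s)
            dominating = all(v in s or adj[v] & s for v in range(n))
            if independent and dominating:
                w = sum(weight[v] for v in s)
                if best is None or w < best:
                    best = w
    return best

# Path 0-1-2-3 with weights favouring the endpoints: {0, 3} is
# independent and dominating, with total weight 2.
print(min_weight_ind_dom(4, [(0, 1), (1, 2), (2, 3)],
                         weight=[1, 10, 10, 1]))  # 2
```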

Article
Beaudou L., Coupechoux P., Dailly A. et al. Theoretical Computer Science. 2018. Vol. 746. P. 19-35.

Octal games are a well-defined family of two-player games played on heaps of counters, in which the players alternately remove a certain number of counters from a heap, sometimes being allowed to split a heap into two nonempty heaps, until no counter can be removed anymore.

We extend the definition of octal games to play them on graphs: heaps are replaced by connected components and counters by vertices. Thus, playing an octal game on a path on n vertices is equivalent to playing the same octal game on a heap of n counters.

We study one of the simplest octal games, called 0.33, in which the players can remove one vertex or two adjacent vertices without disconnecting the graph. We study this game on trees and give a complete resolution of this game on subdivided stars and bistars.
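
On a heap (equivalently on a path, where only moves at the ends keep the graph connected), game 0.33 lets a player remove one or two counters without splitting, and the Grundy sequence follows from the standard mex rule; a small sketch, assuming that heap formulation:

```python
def grundy_033(n):
    """Grundy value of a heap of n counters in octal game 0.33:
    a move removes one or two counters and never splits the heap."""
    g = [0] * (n + 1)
    for m in range(1, n + 1):
        moves = {g[m - 1]} | ({g[m - 2]} if m >= 2 else set())
        # mex: smallest non-negative integer not among the options
        g[m] = min(x for x in range(3) if x not in moves)
    return g[n]

# The sequence is periodic with period 3: 0, 1, 2, 0, 1, 2, ...
print([grundy_033(n) for n in range(8)])  # [0, 1, 2, 0, 1, 2, 0, 1]
```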

Article
Shitov Y. Theoretical Computer Science. 2017. Vol. 660. P. 102-104.

Let G be a simple graph whose vertices are partitioned into two subsets, called ‘filled’ vertices and ‘empty’ vertices. A vertex v is said to be forced by a filled vertex u if v is the unique empty neighbor of u. If we can fill all the vertices of G by repeatedly filling the forced ones, then we call the initial set of filled vertices a forcing set. We discuss the so-called failed forcing number of a graph, which is the largest cardinality of a set that is not forcing. Answering a recent question of Ansill, Jacob, Penzellna, and Saavedra, we prove that this quantity is NP-hard to compute. Our proof also works for a related graph invariant called the skew failed forcing number.
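
The forcing process itself is easy to simulate. The sketch below (on an illustrative path and star) repeatedly fills each vertex that is the unique empty neighbour of some filled vertex and reports whether the initial set is forcing.

```python
def is_forcing(n, edges, filled):
    """Run the forcing process to a fixed point: a filled vertex u
    forces its unique empty neighbour.  Returns True iff the initial
    filled set fills the whole graph."""
    adj = {v: set() for v in range(n)}
    for u, v in edges:
        adj[u].add(v); adj[v].add(u)
    filled = set(filled)
    changed = True
    while changed:
        changed = False
        for u in list(filled):
            empty = adj[u] - filled
            if len(empty) == 1:      # u forces its only empty neighbour
                filled |= empty
                changed = True
    return len(filled) == n

# Path 0-1-2-3: {0} forces the whole path.  On the star with centre 0,
# the centre alone forces nothing (it has three empty neighbours).
path = [(0, 1), (1, 2), (2, 3)]
star = [(0, 1), (0, 2), (0, 3)]
print(is_forcing(4, path, {0}), is_forcing(4, star, {0}))  # True False
```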

Article
Obiedkov S. Theoretical Computer Science. 2017. Vol. 658, Part B. P. 375-390.

We consider a propositional language for describing parameterized ceteris paribus preferences over atomic conjunctions. Such preferences are only required to hold when the alternatives being compared agree on a specified subset of propositional variables. Regarding the expressivity of the language in question, we show that a parameterized preference statement is equivalent to a conjunction of an exponential number of classical, non-parameterized, ceteris paribus statements. Next, we present an inference system for parameterized statements and prove that the problem of checking the semantic consequence relation for such statements is coNP-complete. We propose an approach based on formal concept analysis to learning preferences from data by showing that ceteris paribus preferences valid in a particular model correspond to implications of a special formal context derived from this model. The computation of a complete preference set is then reducible to the computation of minimal hypergraph transversals. Finally, we adapt a polynomial-time algorithm for abduction using Horn clauses represented by their characteristic models to the problem of determining preferences over new alternatives from preferences over given alternatives (with ceteris paribus preferences as the underlying model).