A Linear-Time Simulation of Deterministic d-Limited Automata
A d-limited automaton is a nondeterministic Turing machine that uses only the cells containing the input word (and the end-markers) and may rewrite the contents of a cell only during the first d visits to it. This model was introduced by T. Hibbard in 1967, who showed that, for each d ⩾ 2, d-limited automata recognize exactly the context-free languages. He also proved that the languages recognizable by deterministic d-limited automata form a hierarchy, and Pighizzini and Pisoni later showed that this hierarchy begins, at d = 2, with the deterministic context-free languages (DCFLs).
As is well known, DCFLs are widely used in practice, especially in compilers, since they are recognizable in linear time and have a corresponding subclass of context-free grammars (the LR(1) grammars). In this paper we present a linear-time recognition algorithm for deterministic d-limited automata (in the RAM model), which opens an opportunity for their practical application.
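To make the model concrete, the following is a minimal illustrative sketch (not the paper's algorithm, and not a linear-time simulation) of a deterministic machine simulator that enforces the defining d-limited constraint: a cell may be rewritten only during its first d visits. The transition encoding, the end-marker symbols `<` and `>`, and the example machine are our own assumptions for illustration.

```python
LEFT, RIGHT = -1, 1

def run_d_limited(delta, start, accept, d, word):
    """Simulate a deterministic d-limited automaton on `word`.

    delta: maps (state, symbol) -> (new_state, written_symbol, move),
    or is undefined (the machine halts and rejects).
    A rewrite (written_symbol != read symbol) is legal only while the
    cell's visit count is at most d; otherwise we flag a violation.
    """
    tape = ['<'] + list(word) + ['>']   # input word between end-markers
    visits = [0] * len(tape)
    state, pos = start, 0
    for _ in range(100000):             # crude step bound to stop runaways
        visits[pos] += 1
        step = delta.get((state, tape[pos]))
        if step is None:
            return False                # no transition: reject
        state, sym, move = step
        if sym != tape[pos] and visits[pos] > d:
            raise ValueError("illegal rewrite after the first %d visits" % d)
        tape[pos] = sym
        pos += move
        if state in accept:
            return True
    return False

# Toy example machine (our own construction): a single left-to-right sweep
# that marks each 'a' as 'A' on its first visit and tracks the parity of
# the number of a's, accepting words over {a, b} with an even count.
delta = {
    ('even', '<'): ('even', '<', RIGHT),
    ('even', 'a'): ('odd',  'A', RIGHT),
    ('odd',  'a'): ('even', 'A', RIGHT),
    ('even', 'b'): ('even', 'b', RIGHT),
    ('odd',  'b'): ('odd',  'b', RIGHT),
    ('even', '>'): ('acc',  '>', RIGHT),
}
```

Here each cell is visited exactly once, so the machine is trivially 1-limited; for instance, `run_d_limited(delta, 'even', {'acc'}, 1, "aab")` accepts, while `run_d_limited(delta, 'even', {'acc'}, 1, "ab")` rejects.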