1 Introduction
The seminal paper of Karp Karp67 defines the state complexity of an (infinite) automaton as the function associating with each n the number of states reachable by reading a word of length at most n. For a function f, a language L has state complexity f if there exists an automaton recognising L of state complexity at most f.
For the case of deterministic automata, the state complexity is fully characterised by the celebrated Myhill-Nerode theorem Nerode58 , which states the existence of a canonical minimal (potentially infinite) automaton for a given language based on the notion of left quotients. Nevertheless, it is sometimes complicated to understand the structure of this automaton, as demonstrated by the case of the language of prime numbers written in binary: a series of papers culminates in a result of Hartmanis and Shank HartmanisShank69 showing that this language has asymptotically maximal (i.e., exponential) deterministic state complexity.
Our first aim is to investigate the deterministic state complexity of probabilistic automata, a simple probabilistic model of computation introduced by Rabin in his seminal paper Rabin63 . This study is motivated by the section “approximate calculation of matrix products” in this paper; at the end of this section, Rabin states a result, without proof; we substantiate this claim, i.e. formalise and prove the result.
We then initiate the study of alternating state complexity, which uses Karp’s definition instantiated with (infinite) alternating automata. We first motivate the model with some examples and later discuss its relevance. Formal definitions are given in the next section; we stick to intuitive explanations in this introduction.
Consider the language

Eq = { w ∈ {a,b,c}* : |w|_a = |w|_b = |w|_c },

consisting of words having the same number of a's, b's and c's. (We let |w|_a denote the number of letters a in w.) This language is not regular, but we claim that it is recognised by a deterministic automaton of quadratic state complexity. Indeed, we construct an automaton whose set of states is Z × Z, interpreted as two counters. They are initialised to 0 each and maintain the value (|w|_a − |w|_c, |w|_b − |w|_c). To this end, the letter a acts as (+1, 0), the letter b as (0, +1), and the letter c as (−1, −1). The only accepting state is (0, 0). This automaton is of quadratic state complexity: after reading the word a^k b^l the automaton is in the state (k, l), which means that the set of states reachable by words of length at most n has quadratically many elements.
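This counter automaton is easy to simulate directly. The following Python sketch (function and variable names are ours, and the concrete counter convention is one possible reading of the construction) mirrors it:

```python
def accepts_eq(word):
    """Deterministic counter automaton for Eq: the state is a pair of
    integers tracking the differences between letter counts, and the
    only accepting state is (0, 0)."""
    # Each letter acts on the pair of counters as a fixed translation.
    action = {"a": (1, 0), "b": (0, 1), "c": (-1, -1)}
    x, y = 0, 0  # initial state
    for letter in word:
        dx, dy = action[letter]
        x, y = x + dx, y + dy
    return (x, y) == (0, 0)
```

After reading a^k b^l the state is (k, l); words of length at most n only reach states whose coordinates are bounded by n, hence the quadratic state complexity.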
Consider now the language

NotEq = { u # v : u, v ∈ {a,b}*, u ≠ v },

consisting of two words over the alphabet {a,b} separated by the letter #, such that u is different from v. One can easily see that this language does not have subexponential deterministic state complexity: after reading two different words u and v, any deterministic automaton recognising NotEq must be in two different states.
However, it is recognised by a nondeterministic automaton of linear state complexity. Note that there are three ways to have u ≠ v: either u is longer than v, or u is shorter than v, or there exists a position at which they differ. At the beginning the automaton guesses which of these three situations occurs. We focus on the third possibility for the informal explanation. The automaton guesses a position in the first word, stores in the state this position together with the letter at this position, and checks whether the letter at the corresponding position in the second word indeed differs. To this end, after reading the letter #, it decrements the position until reaching 0, and checks whether the letter it then reads is indeed different from the letter stored in the state.
Our third example is the language

Lexicographic = { u # v : u, v ∈ {a,b}*, u is lexicographically smaller than v },

consisting of two words over the alphabet {a,b} separated by the letter #, such that u is lexicographically smaller than v. One can see that this language does not have subexponential nondeterministic state complexity (we do not substantiate this claim here). However, we will now explain that it is recognised by an alternating automaton of linear state complexity.
The notion of alternating (Turing) machines was introduced by Chandra, Kozen and Stockmeyer
ChandraStockmeyer76 ; Kozen76 ; ChandraKozenStockmeyer81 . A nondeterministic automaton makes guesses about the word, and the computation is accepting if there exists a sequence of correct guesses. In other words, these guesses are disjunctive choices; the alternating model restores the symmetry by introducing both disjunctive and conjunctive choices. Whenever the automaton makes a choice, we say that it creates independent copies of itself, one for each alternative; if the choice was disjunctive, the computation is accepted if some copy accepts, and if the choice was conjunctive, the computation is accepted if all copies accept. We illustrate this notion by constructing an alternating automaton for the language Lexicographic. We unravel the inductive definition of the lexicographic order: u < v if and only if

u_1 < v_1, or (u_1 = v_1 and u' < v').

Here u_1 is the first letter of u, and u' is the word u stripped of its first letter. Upon reading the first letters, the automaton makes a disjunctive guess corresponding to the disjunction in the definition: either u_1 < v_1, or both u_1 = v_1 and u' < v'. In the latter case, the automaton makes a further choice, conjunctive this time, checking with one copy that u_1 = v_1 and with another that u' < v'.
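The unravelling above can be phrased as a short recursive predicate, where the disjunction corresponds to Eve's choice and the conjunction to Adam's copies. A Python sketch for the order on {a,b}* (treating a proper prefix as smaller, a convention we fix here):

```python
def lex_smaller(u, v):
    """Strict lexicographic order, unravelled as in the text:
    u < v iff u is a proper prefix of v, or u1 < v1,
    or (u1 = v1 and u' < v')."""
    if not v:
        return False  # no word is smaller than the empty word
    if not u:
        return True   # the empty word is smaller than any non-empty word
    # Disjunctive (Eve) choice; the second disjunct is the conjunctive part,
    # one comparison and one recursive call standing for Adam's two copies.
    return u[0] < v[0] or (u[0] == v[0] and lex_smaller(u[1:], v[1:]))
```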
Alternating automata are succinct. It is well-known that finite deterministic, nondeterministic and alternating automata are equivalent. As hinted by the examples discussed above, for infinite automata we do not have such an equivalence. Some classical constructions still apply, for instance the powerset construction to determinise automata, which increases the state complexity exponentially. Similarly, one can transform alternating automata into deterministic ones at the cost of a doubly exponential increase in state complexity. Hence one can see alternating automata as a class of succinctly represented deterministic automata, whose inner boolean structure is made explicit.
Alternating automata are distributed. Another appeal of alternating automata is as a model of distributed computation. Indeed, in the course of its computation, an alternating automaton produces copies of itself that can be run independently on a distributed architecture. The final output is then computed by boolean combinations of the answers of each copy. This point of view echoes the recent work of Reiter Reiter15 , which combines ideas from distributed algorithms and alternating automata.
Applications. The notion of state complexity is used as a complexity measure to evaluate how complicated some operations on languages are. We refer to the surveys Yu01 ; Yu02 ; GMRY17 for more details on this long line of work. The other natural use of state complexity is as a tool for separating models of computation. For instance, the paper of Dawar and Kreutzer DK07 generalises the notion of automaticity (see related works) to relational structures and uses it for separating several modal and non-modal fixed-point logics.
Contributions of the paper. We devise a generic lower bound technique for alternating state complexity based on boundedly generated lattices of languages.
We give the basic definitions and show some examples in Section 2. Section 3 is devoted to substantiating Rabin’s claim about the deterministic state complexity of probabilistic languages. We discuss related works in Section 4. We describe our lower bound technique in Section 5, and give two applications:

Hierarchy theorem: in Section 6, we prove a hierarchy theorem: for each natural number k greater than or equal to 1, there exists a language having alternating state complexity n^k but not n^j for any j < k.

Prime numbers: in Section 7, we look at the language of prime numbers written in binary. The works of Hartmanis and Shank culminated in showing that it does not have subexponential deterministic state complexity HartmanisShank69 . We consider the stronger model of alternating automata, and first observe that Hartmanis and Shank’s techniques imply a logarithmic lower bound on the alternating state complexity. Our contribution is to strengthen this result by showing a linear lower bound, which is thus an exponential improvement.
2 Definitions
2.1 State Complexity
We fix an alphabet A, which is a finite set of letters. A word is a finite sequence of letters w = a_1 a_2 ⋯ a_n, where the a_i are letters from the alphabet, i.e., a_i ∈ A. We say that w has length n, and write |w| for the length of w. The empty word is ε. We let A* denote the set of all words and A^(≤n) the set of words of length at most n. A language, typically denoted L, is a set of words.
For a set X, we let B+(X) denote the set of boolean formulae over X, i.e., using conjunctions and disjunctions. Throughout the paper we only consider positive boolean combinations. For instance, if X = {x, y, z}, an element of B+(X) is (x ∧ y) ∨ z. A conjunctive formula uses only conjunctions, and a disjunctive formula only disjunctions. For φ in B+(X) and Y ⊆ X, we write Y ⊨ φ if φ is true when setting the elements of Y to true and the others to false.
Definition 1 (Alternating Automata ChandraStockmeyer76 ; Kozen76 ; ChandraKozenStockmeyer81 ).
An alternating automaton is given by a (potentially infinite) set Q of states, an initial state q_0 ∈ Q, a transition function δ : Q × A → B+(Q) and a set of accepting states F ⊆ Q.
We use acceptance games to define the semantics of alternating automata. Consider an alternating automaton M and a word w; we define the acceptance game as follows: it has two players, Eve and Adam. Eve claims that the word should be accepted, and Adam challenges this claim.
The game starts from the initial state q_0, and with each letter of w read from left to right, a state is chosen through the interaction of the two players. If in a state q and reading a letter a, Eve and Adam look at the boolean formula δ(q, a); Eve chooses which clause is satisfied in a disjunction, and Adam does the same for conjunctions. This leads to a new state q', from which the computation continues. A play is won by Eve if it ends up in an accepting state.
The word w is accepted by M if Eve has a winning strategy in the acceptance game. The language recognised by M is the set of words accepted by M.
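When the automaton is finite, the acceptance game can be solved by a direct recursion: Eve's disjunctive choices become an existential branch and Adam's conjunctive choices a universal one. A minimal Python sketch (the encoding of formulas as nested tuples is our own):

```python
def accepts(delta, accepting, state, word):
    """Acceptance for an alternating automaton. A transition formula is
    either a state, ("or", f1, f2) for Eve, or ("and", f1, f2) for Adam."""
    if not word:
        return state in accepting
    def run(formula):
        if isinstance(formula, tuple):
            op, f1, f2 = formula
            if op == "or":              # Eve picks a disjunct
                return run(f1) or run(f2)
            return run(f1) and run(f2)  # Adam picks a conjunct
        return accepts(delta, accepting, formula, word[1:])
    return run(delta[state, word[0]])

# A nondeterministic automaton (disjunctions only) for words containing an a:
delta = {("q0", "a"): ("or", "q1", "q0"), ("q0", "b"): "q0",
         ("q1", "a"): "q1", ("q1", "b"): "q1"}
```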
As special cases, an automaton is

nondeterministic if for all q in Q, a in A, δ(q, a) is a disjunctive formula,

universal if for all q in Q, a in A, δ(q, a) is a conjunctive formula,

deterministic if for all q in Q, a in A, δ(q, a) is an atomic formula, i.e., if δ(q, a) ∈ Q.
Definition 2 (State Complexity Classes Karp67 ).
Fix a function f : N → N. The language L is in Alt(f) if there exists an alternating automaton recognising L and a constant c such that for all n in N, the number of states reachable by reading a word of length at most n is at most c · f(n).
Similarly, we define NonDet(f) for nondeterministic automata and Det(f) for deterministic automata.
We write n for the identity function, so for instance Alt(n) is the class of languages having linear alternating state complexity. We say that L has sublinear (respectively subexponential) alternating state complexity if it is recognised by an alternating automaton of state complexity at most f, where f(n) = o(n) (respectively f(n) = 2^(o(n))).
We let Reg denote the class of regular languages, i.e., those recognised by finite automata. Then

Reg = Det(1) = NonDet(1) = Alt(1),
i.e., a language has constant state complexity if and only if it is regular.
We remark that Det(2^n) is the class of all languages. Indeed, consider a language L; we construct a deterministic automaton recognising L of exponential state complexity. Its set of states is A*, the initial state is ε and the transition function is defined by δ(u, a) = ua. The set of accepting states is simply L itself. The number of different states reachable by words of length at most n is the number of words of length at most n, which is exponential in n.
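This "free" automaton, whose states are simply the words read so far, can be written down directly; a Python sketch, assuming membership in the language is decidable:

```python
def word_automaton(membership):
    """Deterministic automaton recognising an arbitrary language L:
    states are words, the initial state is the empty word, transitions
    append the letter read, and a state is accepting iff it belongs to L."""
    initial = ""
    delta = lambda state, letter: state + letter
    return initial, delta, membership

def run(automaton, word):
    state, delta, accepting = automaton
    for letter in word:
        state = delta(state, letter)
    return accepting(state)

# Example: the (non-regular) language of palindromes over {a, b}.
palindromes = word_automaton(lambda w: w == w[::-1])
```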
It follows that the asymptotically maximal state complexity of a language is exponential, and the state complexity classes are relevant for functions smaller than exponential.
2.2 The Myhill-Nerode Theorem
We present an equivalent point of view on the deterministic state complexity based on the Myhill-Nerode equivalence relation.
Let u be a finite word; define the left quotient of L with respect to u by

u⁻¹L = { w : uw ∈ L }.
A well-known result from automata theory states that for every regular language there exists a minimal deterministic finite automaton, called the syntactic automaton, whose set of states is the set of left quotients.
This construction extends mutatis mutandis when dropping the assumption that the automaton has finitely many states. The statement gives precise lower bounds on the deterministic state complexity of the language.
Formally, consider a language L; we define the syntactic automaton of L, denoted A_L, as follows. We define the set of states as the set of all left quotients: { u⁻¹L : u ∈ A* }. The initial state is ε⁻¹L = L, and the transition function is defined by δ(u⁻¹L, a) = (ua)⁻¹L. Finally, the set of accepting states is the set of left quotients containing the empty word, i.e., { u⁻¹L : u ∈ L }.
Let f_L : N → N be defined by f_L(n) = |{ u⁻¹L : |u| ≤ n }|, the number of left quotients of order n.
Theorem 1 (Reformulation of the Myhill-Nerode Theorem Nerode58 ).

A_L recognises L, so L is in Det(f_L),

for all f, if L is in Det(f), then f_L(n) = O(f(n)).
The first item is routinely proved. For the second item, we prove an even stronger property. Assume towards a contradiction that there exists an automaton of state complexity f recognising L and such that there exists n with f(n) < f_L(n). Since f_L(n) counts the left quotients of order n, there exist two words u and v of length at most n such that u⁻¹L ≠ v⁻¹L but in the automaton the words u and v lead to the same state. The left quotients being different, there exists a word w such that uw ∈ L and vw ∉ L, or the other way around. But since the words uw and vw lead to the same state and the automaton is deterministic, this state must be both accepting and rejecting, a contradiction.
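This distinguishing argument is easy to make concrete. For the language NotEq from the introduction, a separating suffix witnesses that two left quotients differ, so no deterministic automaton can merge the corresponding states (a Python sketch with names of our choosing):

```python
def in_noteq(word):
    """Membership in NotEq = { u#v : u, v over {a,b}, u != v }."""
    if word.count("#") != 1:
        return False
    u, v = word.split("#")
    return u != v

def separating_suffix(x, y, candidates):
    """Return a suffix w such that exactly one of xw, yw is in NotEq,
    witnessing that the left quotients of x and y differ."""
    for w in candidates:
        if in_noteq(x + w) != in_noteq(y + w):
            return w
    return None
```

For x = ab and y = ba, the suffix #ab separates them: ab#ab is not in NotEq while ba#ab is.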
2.3 Probabilistic Automata
Let Q be a finite set of states. A distribution over Q is a function d : Q → [0, 1] such that the values d(q) sum up to 1. We denote by D(Q) the set of distributions over Q.
Definition 3 (Probabilistic Automaton).
A probabilistic automaton is given by a finite set of states Q, a transition function φ : Q × A → D(Q), an initial state q_0, and a set of final states F.
In a transition function φ, the quantity φ(q, a)(q') is the probability to go from the state q to the state q' reading the letter a. A transition function naturally extends from letters to words. We denote by φ(w)(q, q') the probability to go from the state q to the state q' reading the word w on the automaton M. The acceptance probability of a word w by M is the probability to go from q_0 to some state in F reading w, which we denote P_M(w). The following threshold semantics was introduced by Rabin Rabin63 .
Definition 4 (Probabilistic Language).
Let M be a probabilistic automaton and x a threshold in [0, 1]; it induces the probabilistic language { w ∈ A* : P_M(w) > x }.
3 Substantiating the Claim of Rabin
In the section called “approximate calculation of matrix products” in the paper introducing probabilistic automata Rabin63 , Rabin asks the following question: is it possible, given a probabilistic automaton, to construct an algorithm which reads words and computes the acceptance probability in an online fashion?
He first shows that this is possible under some restrictions on the probabilistic automaton, and concludes the section by stating that “an example due to R. E. Stearns shows that without assumptions, a computational procedure need not exist”. The example is not given, and to the best of the author’s knowledge, has never been published anywhere.
In this section we substantiate this claim using the framework of deterministic state complexity. Whether this exactly fleshes out Rabin’s claim is subject to discussion, since Rabin asks whether the acceptance probability can be computed up to a given precision; in our setting, the acceptance probability is not actually computed, but only compared to a fixed threshold, following Rabin’s definition of probabilistic languages.
The following result shows that there exists a probabilistic automaton defining a language of asymptotically maximal (exponential) deterministic state complexity.
Theorem 2.
There exists a probabilistic automaton M and a threshold x such that the induced probabilistic language does not have subexponential deterministic state complexity.
In the original paper introducing probabilistic automata, Rabin Rabin63 gave an example of a probabilistic automaton computing the binary decomposition function (over the alphabet {0, 1}), denoted bin : {0,1}* → [0, 1], defined by

bin(a_1 a_2 ⋯ a_n) = a_n · 2^(-1) + a_(n-1) · 2^(-2) + ⋯ + a_1 · 2^(-n)

(i.e. 0.a_n a_(n-1) ⋯ a_1 in binary). We show that adding one letter and one transition to this probabilistic automaton induces a language which does not have subexponential deterministic state complexity.
The automaton M is represented in Figure 1. The alphabet is {0, 1} extended with one additional letter. The only difference between the automaton proposed by Rabin Rabin63 and this one is the transition over the additional letter. As observed by Rabin, a simple induction shows that for w in {0,1}*, we have P_M(w) = bin(w).
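Rabin's automaton can be simulated exactly with rational arithmetic. The sketch below follows our reading of the construction: the probability x of being in the accepting state obeys the update x → (x + a)/2 on reading the digit a, so after reading a_1 ⋯ a_n it equals 0.a_n ⋯ a_1 in binary:

```python
from fractions import Fraction

def acceptance_probability(word):
    """Acceptance probability of Rabin's two-state automaton."""
    x = Fraction(0)
    for digit in word:
        x = (x + int(digit)) / 2  # the two-state automaton's update
    return x

def bin_value(word):
    """bin(a1 ... an) = 0.an ... a1 in binary (least significant digit first)."""
    return sum(int(a) * Fraction(1, 2) ** (len(word) - i)
               for i, a in enumerate(word))
```

Checking acceptance_probability(w) == bin_value(w) on short words confirms the induction mentioned above.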
Let x be a number in [0, 1]; it decomposes uniquely into a binary expansion. Observe that the dyadic numbers, i.e., the numbers of the form bin(w), are dense in [0, 1].
Consider an automaton recognising the induced language and fix n. The binary decomposition function maps words of length n onto rationals of the form k · 2^(-n), for 0 ≤ k < 2^n. Consider two different words u and v in {0,1}* of length n; we show that their left quotients are different.
Without loss of generality assume bin(u) < bin(v); observe that bin(v) − bin(u) ≥ 2^(-n). There exists w in {0,1}* such that uw is rejected and vw is accepted: it suffices to choose w such that the threshold separates the acceptance probabilities of uw and vw, which exists by density of the dyadic numbers in [0, 1]. Thus the left quotients with respect to u and v differ, and we exhibited exponentially many words having pairwise distinct left quotients.
It follows from Theorem 1 that the language does not have subexponential deterministic state complexity.
We note that, expanding on these ideas, we gave a simple proof of the undecidability of the regularity problem for probabilistic languages FS15 , which can be easily adapted to show that deciding the deterministic state complexity of a probabilistic language is undecidable.
4 Related Works
The definition of state complexity is due to Karp Karp67 , and the first result proved in that paper is that nonregular languages have at least linear deterministic state complexity. Hartmanis and Shank considered the language of prime numbers written in binary, and showed in HartmanisShank69 that it does not have subexponential deterministic state complexity. We pursue this question in this paper by considering the alternating state complexity of the prime numbers.
Automaticity was defined by Shallit and Breitbart and studied in depth in a series of four papers ShallitBreitbart96 ; PomeranceRobsonShallit97 ; GlaisterShallit98 ; Shallit96 .
Definition 5.
The automaticity of a language L is the function which associates with n the size of the smallest deterministic automaton which agrees with L on all words of length at most n.
The conceptual difference is that automaticity is a nonuniform notion, since there is a finite automaton for each , whereas state complexity is uniform, since it considers one infinite automaton. For this reason, the two measures behave completely differently.
For instance, consider the language

L = { u # v : v is the prefix of u of length ⌊log |u|⌋ }.

In words: the prefix of u of logarithmic length repeats just after the unique letter #.
The automaticity of this language is linear, i.e., rather small. Indeed, given n, the automaton stores the prefix up to length ⌊log n⌋, waits for the letter #, and compares it to the word starting after #.
On the other hand, the deterministic state complexity of this language is asymptotically maximal, meaning exponential: indeed, since the automaton has no information on how long the prefix to be repeated may be, it has to store the whole word. More formally, for any two different words u and v, any deterministic automaton recognising the language must be in two different states after reading u and after reading v.
Note that replacing the logarithm by a very slowly growing function yields examples showing that the gap between automaticity and deterministic state complexity can be arbitrarily large.
Another interesting point to make here is the difference between finite and infinite automata. Indeed, studying the state complexity of finite alternating automata can be reduced to the state complexity of finite deterministic automata by reversing the words. The notation u^R stands for the reverse of u: if u = a_1 a_2 ⋯ a_n, then u^R = a_n ⋯ a_2 a_1.
We extend it to languages: L^R = { u^R : u ∈ L }. The following result is a variant of Brzozowski’s minimisation by reversal technique B63 , and a classical result in automata theory.
Lemma 1 (Rs97 ; Fjy90 ).

If L is recognised by an alternating automaton with n states, then L^R is recognised by a deterministic automaton with 2^n states.

If L is recognised by a deterministic automaton with 2^n states, then L^R is recognised by an alternating automaton with n states.
In other words, the number of states of the smallest finite deterministic automaton recognising L^R is (almost) exactly 2^n, where n is the number of states of the smallest finite alternating automaton recognising L.
This result does not extend to state complexity for infinite automata: indeed, since every language has exponential deterministic state complexity, this would imply that every language also has linear alternating state complexity. That does not hold: we exhibit in Subsection 5.3 a language which does not have subexponential alternating state complexity.
Two notions share some features with alternating state complexity.
The first is boolean circuits; the resemblance is only superficial, as circuits do not process the input from left to right. For instance, one can observe that the language Parity, which is hard to compute with a circuit (it is not in AC0, for instance), is actually a regular language, so trivial with respect to state complexity.
The second notion is alternating communication complexity, developed by Babai, Frankl and Simon BabaiFranklSimon86 . In this setting, Alice has an input x in X, Bob an input y in Y, and they want to determine f(x, y) for a given boolean function f : X × Y → {0, 1} known by all. Alice and Bob are referees in a discussion involving two individuals, Eve and Adam. Eve tries to convince Alice and Bob that f(x, y) = 1, and Adam aims at the opposite. A protocol of exchanging messages depending on the inputs is agreed upon by everyone beforehand. Then the input x is revealed to Alice and y to Bob. Eve and Adam both know the two inputs and exchange messages whose conformity to the inputs is checked by Alice and Bob. The cost of the protocol is the number of bits exchanged.
The main difference between alternating communication complexity and state complexity is that protocols do not have to extract information from the inputs sequentially as an automaton does. For instance, swapping the inputs of Alice and Bob does not make any difference for communication complexity but can completely change the state complexity.
As an example, consider the following language studied in Subsection 5.3.
Alice receives v of length n and Bob receives u_1 # ⋯ # u_m, and they want to check whether there exists i such that u_i = v^R. A simple protocol is for Eve to send i, and then for Adam to send a position j together with the letter of u_i at position j, to which Eve answers with the corresponding letter of v. If the two letters match the exchange is a success, otherwise it is a failure.
An alternating automaton cannot simulate this protocol, because it would need to choose i at the beginning, even before reading u_1 # ⋯ # u_m. The formal proof of this intuition is that this language does not have subexponential alternating state complexity, as proved in Subsection 5.3.
However, if we swap the two inputs, i.e., the automaton reads u_1 # ⋯ # u_m before v, then it can simulate the protocol: when reading u_i it nondeterministically decides that this is the matching word, and later checks using universal guesses that u_i = v^R.
This example shows that using alternating communication complexity would not yield strong lower bounds for alternating state complexity. Building on the ideas behind this language one can obtain arbitrary gaps between the two notions.
5 A Lower Bound Technique
In this section, we develop a generic lower bound technique for alternating state complexity. It is based on the size of generating families for some lattices of languages; we describe it in Subsection 5.1, and a concrete approach to use it, based on query tables, is developed in Subsection 5.2. We apply it to an example in Subsection 5.3.
5.1 Boundedly Generated Lattices of Languages
Let L be a language and u a word. Recall that the left quotient of L with respect to u is

u⁻¹L = { w : uw ∈ L }.

If u has length at most n, we say that u⁻¹L is a left quotient of L of order n.
A lattice of languages is a set of languages closed under union and intersection. Given a family of languages, the lattice it generates is the smallest lattice containing this family.
Theorem 3.
If L is in Alt(f), then there exists a constant c such that for all n, there exists a family of at most c · f(n) languages whose generated lattice contains all the left quotients of L of order n.
To some extent, Theorem 3 draws from the classical Myhill-Nerode theorem Nerode58 . However, since there is no notion of minimal alternating automaton, the situation is more complicated here. In particular, the converse of Theorem 3 may not hold.
Theorem 3 reduces the question of finding lower bounds for alternating state complexity to the following one: given a finite lattice of languages, what is the size of the smallest set of generators for this lattice?
Proof.
Let M be an alternating automaton recognising L of state complexity at most c · f for some constant c.
Fix n. Let Q_n denote the set of states reachable by some word of length at most n; by assumption Q_n has cardinality at most c · f(n). For q in Q_n, let L_q be the language recognised by M taking q as initial state, and F_n = { L_q : q ∈ Q_n } the family of these languages.
We prove by induction over k ≤ n that all left quotients of L of order k can be obtained as boolean combinations of languages in F_n.
The case k = 0 is clear, since ε⁻¹L = L = L_(q_0).
Consider a word of length k, written ua where u has length k − 1 and a is a letter. We are interested in (ua)⁻¹L, so let us start by considering u⁻¹L. By the induction hypothesis, u⁻¹L can be obtained as a boolean combination of languages in F_n: write u⁻¹L = φ, meaning that φ is a boolean formula whose atoms are languages in F_n.
Now consider (ua)⁻¹L = a⁻¹(u⁻¹L). Observe that the left quotient operation respects both unions and intersections, i.e.,

a⁻¹(K_1 ∪ K_2) = a⁻¹K_1 ∪ a⁻¹K_2

and

a⁻¹(K_1 ∩ K_2) = a⁻¹K_1 ∩ a⁻¹K_2.

It follows that (ua)⁻¹L = a⁻¹φ; this notation means that the atoms are languages of the form a⁻¹L_q for L_q appearing in φ, i.e., for q in Q_n.
To finish the proof, we remark that a⁻¹L_q can be obtained as a boolean combination of the languages L_(q'), where q' ranges over the states that appear in δ(q, a). To be more precise, we introduce the notation δ(q, a)[L] on an example: if δ(q, a) = (q_1 ∧ q_2) ∨ q_3, then δ(q, a)[L] = (L_(q_1) ∩ L_(q_2)) ∪ L_(q_3). With this notation, a⁻¹L_q = δ(q, a)[L]. Thus, for q in Q_n, we have that a⁻¹L_q can be obtained as a boolean combination of languages in F_n.
Putting everything together, it implies that (ua)⁻¹L can be obtained as a boolean combination of languages in F_n, finishing the inductive proof. ∎
5.2 The Query Table Method
Thanks to Theorem 3, we are now looking at the size of the smallest set of generators for a given finite lattice of languages. To study this quantity we define the notion of query tables.
Definition 6 (Query Table).
Consider a family of languages F. Given a word w, its profile with respect to F, or F-profile, is the boolean vector stating whether w belongs to K, for each K in F. The size of the query table of F is the number of different profiles, when considering all words. For a language L, its query table of order n is the query table of the family of left quotients of L of order n.
The name query table comes from the following image, illustrated in Figure 2: the query table of F is the infinite table whose columns are indexed by languages in F and rows by words (so, there are infinitely many rows). The cell corresponding to a word w and a language K in F is the boolean indicating whether w is in K. Thus the profile of w is the row corresponding to w in the query table of F.
Lemma 2.
Consider a lattice of languages generated by k languages. The query table of this lattice has size at most 2^k.
Indeed, there are at most 2^k different profiles with respect to the generators, and the profile of a word with respect to the generators determines its profile with respect to the whole lattice.
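For small examples the query table can be computed by brute force, restricting to words up to some length (the true table ranges over all words); this sketch illustrates the 2^k bound of Lemma 2:

```python
from itertools import product

def count_profiles(languages, alphabet, max_len):
    """Count the distinct profiles of words with respect to a family of
    languages, each given as a membership predicate."""
    profiles = set()
    for n in range(max_len + 1):
        for letters in product(alphabet, repeat=n):
            w = "".join(letters)
            profiles.add(tuple(member(w) for member in languages))
    return len(profiles)

# Two generators already yield the maximal number 2^2 = 4 of profiles here.
generators = [lambda w: "a" in w, lambda w: len(w) % 2 == 0]
```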
Theorem 4.
Let L be in Alt(f). There exists a constant c such that for all n, the query table of L of order n has size at most 2^(c · f(n)).
The proof of Theorem 4 relies on the following lemma.
Lemma 3.
Consider two families of languages F and F'. If F ⊆ F', then the size of the query table of F is smaller than or equal to the size of the query table of F'.
Proof.
It suffices to observe that the query table of F is “included” in the query table of F'. More formally, consider in the query table of F' the subtable which consists of the columns corresponding to languages in F: this is the query table of F. This implies the claim. ∎
We now prove Theorem 4. Thanks to Theorem 3, the family of left quotients of L of order n is contained in a lattice generated by a family of size at most c · f(n). It follows from Lemma 3 that the size of the query table of L of order n is smaller than or equal to the size of the query table of this lattice, which by Lemma 2 is at most 2^(c · f(n)).
Our lower bound apparatus is now complete: thanks to Theorem 4, to prove a lower bound on the alternating state complexity of a language L, it is sufficient to prove lower bounds on the size of the query tables of L.
5.3 A First Application of the Query Table Method
As a first application of our technique, we exhibit a language which has asymptotically maximal (i.e., exponential) alternating state complexity. Surprisingly, this language is simple in the sense that it is context-free and definable in Presburger arithmetic, i.e., in first-order logic with the addition predicate.
Recall that L has subexponential alternating state complexity if L is in Alt(f) for some f such that f(n) = 2^(o(n)). Thanks to Theorem 4, to prove that L does not have subexponential alternating state complexity, it is enough to exhibit a constant c > 0 such that for infinitely many n, the query table of the left quotients of L of order n has size at least 2^(2^(c · n)).
Theorem 5.
There exists a language L which does not have subexponential alternating state complexity, yet is both context-free and definable in Presburger arithmetic.
Proof.
Let

L = { v # u_1 # ⋯ # u_m : v, u_1, …, u_m ∈ {a,b}*, there exists i such that u_i = v^R }.

Recall that the notation u^R stands for the reverse of u. Note, and this is very important here, that the number of words is not bounded: m is arbitrary.
It is easy to see that L is both context-free and definable in Presburger arithmetic, i.e., in first-order logic with the addition predicate (the use of reversed words in the definition of L is only there to make L context-free).
We show that L does not have subexponential alternating state complexity. We prove that for all n, the query table of the left quotients of L of order n has size at least 2^(2^n). Thanks to Theorem 4, this implies the result.
Fix n. Let W be the set of all words in {a,b}* of length n. It has cardinality 2^n. Consider a subset S of W. We argue that there exists a word w_S such that if u is in W, then the following equivalence holds: u · w_S belongs to L if and only if u belongs to S.
This shows the existence of 2^(2^n) different profiles with respect to the left quotients of order n, as claimed.
Let s_1, …, s_ℓ be the words in S. Consider

w_S = # s_1^R # s_2^R # ⋯ # s_ℓ^R.

The word w_S clearly satisfies the claim above. ∎
6 A Hierarchy Theorem for Languages of Polynomial Alternating State Complexity
Theorem 6.
For each natural number k, there exists a language L_k such that:

L_k is in Alt(n^k),

L_k is not in Alt(n^j) for any j < k.
Consider the alphabet {a, b, #}.
Let k ≥ 1, and let L_k be defined as the language used for proving Theorem 5, with one difference.
We note that, unlike for the language used for proving Theorem 5, the value of m (the number of words) is here bounded.
Proof.
We construct an alternating automaton recognising L_k of the required state complexity. The automaton has three consecutive phases:

First, a nondeterministic guessing phase at the beginning of the word, which passes a number i on to the second phase.
Formally, the set of states for this phase is the set of candidate values for i, and the transitions implement the guess.
The state complexity of this phase is the bound on i.

Second, a universal phase while reading v. For each position p of v, the automaton launches one copy storing the position p, the letter v_p and the number i guessed in the first phase.
Formally, the set of states for this phase consists of triples.
The first component is the length of the word read so far (in this phase), the second component stores the letter read, where the letter ⊥ stands for undeclared, and the last component is the number i.
The initial state has first component 0 and letter ⊥. The transitions increment the first component, and one copy declares the letter it reads.
The automaton for this phase has quadratic state complexity.

Third, a deterministic phase while reading u_1 # ⋯ # u_m.
It starts from a state of the form (p, v_p, i). It checks whether the letter of u_i at the mirror position is equal to v_p. Localising u_i is achieved by decrementing the number i by one each time a letter # is read. In the corresponding block, localising the position is achieved by decrementing the first component by one at a time.
The automaton for this phase has quadratic state complexity.
We now prove the lower bound.
We prove that for all n, the size of the query table of L_k of order n is at least 2^(n^k). Thanks to Theorem 4, this implies that L_k is not in Alt(n^j) for any j < k.
Fix n. Let W be the set of all words in {a,b}* of length n. It has cardinality 2^n.
Observe that v # u_1 # ⋯ # u_m belongs to L_k if and only if there exists i in {1, …, m} such that u_i = v^R.
Consider any subset S of W respecting the bound on m; we argue that there exists a word w_S which satisfies that if u is in W, then the following equivalence holds: u · w_S belongs to L_k if and only if u belongs to S.
This shows the existence of the claimed number of different profiles with respect to the left quotients of order n.
Let s_1, …, s_ℓ be the words in S. Consider

w_S = # s_1^R # s_2^R # ⋯ # s_ℓ^R.

The word w_S clearly satisfies the claim above. ∎
7 The Alternating State Complexity of Prime Numbers
In this section, we give lower bounds on the alternating state complexity of the language of prime numbers written in binary:

Primes = { w ∈ {0,1}* : w is the binary representation of a prime number }.

By definition, the word a_1 a_2 ⋯ a_n represents the number a_1 + a_2 · 2 + ⋯ + a_n · 2^(n−1); note that the least significant digit is on the left.
The complexity of this language has long been investigated; many efforts have been put into finding upper and lower bounds. In 1976, Miller gave a first conditional polynomial-time algorithm, assuming the generalised Riemann hypothesis Miller76 . In 2002, Agrawal, Kayal and Saxena obtained the same result unconditionally, i.e., not predicated on unproven number-theoretic conjectures AKS02 .
The first lower bounds were obtained by Hartmanis and Shank in 1968, who proved that checking primality requires at least logarithmic deterministic space HartmanisShank68 , conditional on number-theoretic assumptions. It was shown by Hartmanis and Berman in 1976 that if the number is presented in unary, then logarithmic deterministic space is necessary and sufficient HartmanisBerman76 . The best lower bound from circuit complexity is due to Allender, Saks and Shparlinski: they proved unconditionally in 2001 that Primes is not in for any prime ASS01 .
The results above are incomparable to our setting, as we are here interested in state complexity. The first and only result to date about the state complexity of Primes is due to Hartmanis and Shank in 1969:
Theorem 7 (HartmanisShank69 ).
The set of prime numbers written in binary does not have subexponential deterministic state complexity.
Their result is unconditional, and makes use of Dirichlet’s theorem on arithmetic progressions of prime numbers. A related and stronger result has been proved by Shallit Shallit96 , which says that the deterministic automaticity of the prime numbers is not subexponential.
Hartmanis and Shank proved the following result.
Lemma 4 (HartmanisShank69 ).
Fix , and consider and two different words of length starting with a . Then the left quotients and are different.
Lemma 4 directly implies Theorem 7 HartmanisShank69 . It also yields a lower bound of on the size of the query table of Primes of order . Thus, together with Theorem 4, this proves that Primes does not have sublogarithmic alternating state complexity.
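The separation stated in Lemma 4 can be observed experimentally. The brute-force sketch below (our own, with hypothetical helper names) searches for a suffix v witnessing that the left quotients of two words differ, i.e., such that exactly one of u1·v and u2·v encodes a prime:

```python
from itertools import product

def is_prime(n: int) -> bool:
    # Trial division; fine for small illustrative values.
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

def value(word: str) -> int:
    # Least significant digit on the left, as in the paper.
    return sum(int(b) << i for i, b in enumerate(word))

def distinguishing_suffix(u1: str, u2: str, max_len: int = 10):
    # Return a suffix v such that exactly one of u1+v, u2+v is in Primes,
    # or None if no suffix of length at most max_len works.
    for length in range(max_len + 1):
        for bits in product("01", repeat=length):
            v = "".join(bits)
            if is_prime(value(u1 + v)) != is_prime(value(u2 + v)):
                return v
    return None
```

On every pair of distinct length-3 words starting with a 1, a short distinguishing suffix is found, in line with the lemma.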
Corollary 1.
The set of prime numbers written in binary does not have sublogarithmic alternating state complexity.
Our contribution in this section is to extend this result by showing that Primes does not have sublinear alternating state complexity, which is an exponential improvement.
Theorem 8.
The set of prime numbers written in binary does not have sublinear alternating state complexity.
Our result is unconditional, but it relies on the following advanced theorem from number theory, which can be derived from the results obtained by Maier and Pomerance MP90 . Note that their results are more general; we state a corollary fitting our needs. Simply put, this result says that in any (reasonable) arithmetic progression and for any , there exists a prime number in this progression at distance at least from all other prime numbers.
Theorem 9 (MP90 ).
For every arithmetic progression such that and are coprime, for every , there exists a number such that is the only prime number in .
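The statement can be explored by brute force on small parameters. The sketch below (ours, with an assumed search bound) scans a progression for a prime that is alone in a window of radius gap around it:

```python
def is_prime(n: int) -> bool:
    # Trial division; adequate for this small-scale search.
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

def isolated_prime(a: int, q: int, gap: int, limit: int = 100000):
    # First prime p = a + k*q (k >= 0) that is the only prime in
    # [p - gap, p + gap]. Theorem 9 guarantees existence for every gap
    # when gcd(a, q) = 1; this naive scan only works on small inputs.
    n = a
    while n <= limit:
        if is_prime(n) and all(not is_prime(m)
                               for m in range(n - gap, n + gap + 1)
                               if m != n):
            return n
        n += q
    return None
```

For instance, in the progression of odd numbers, the first prime at distance more than 10 from every other prime is 211: its neighbours 199 and 223 are both 12 away.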
We proceed to the proof of Theorem 8.
Proof.
We show that for all , the query table of Primes of order has size at least . Thanks to Theorem 4, this implies the result.
Fix . Let be the set of all words of length starting with a . Equivalently, we see as a set of numbers; it contains all the odd numbers smaller than . It has cardinality .

We argue that for all in , there exists a word such that for all in , is in if and only if . In other words, the profile of is everywhere but on the column .

Let in ; write . Consider the arithmetic progression ; note that and are coprime. Thanks to Theorem 9, for , there exists a number such that is the only prime number in . Let be a word such that . We show that for all in , we have the following equivalence: is in if and only if .
Indeed, . Observe that
Since is the only prime number in , the equivalence follows.
We constructed words each having a different profile, implying the claimed lower bound. ∎
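The profile-counting argument can be illustrated numerically. The sketch below (our own construction, not the paper's) builds a small query table whose rows are prefixes and whose columns are suffixes, and counts the distinct rows:

```python
from itertools import product

def is_prime(n: int) -> bool:
    # Trial division; fine for these small values.
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

def value(word: str) -> int:
    # Least significant digit on the left.
    return sum(int(b) << i for i, b in enumerate(word))

def distinct_profiles(prefix_len: int, suffix_len: int) -> int:
    # The row of a prefix u is the boolean vector (u+v in Primes) over
    # all suffixes v; the number of distinct rows lower-bounds the size
    # of the query table.
    suffixes = ["".join(s) for s in product("01", repeat=suffix_len)]
    rows = {tuple(is_prime(value("".join(p) + v)) for v in suffixes)
            for p in product("01", repeat=prefix_len)}
    return len(rows)
```

Already with two-bit prefixes and two-bit suffixes, all four prefixes have pairwise distinct profiles.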
Theorem 8 proves a linear lower bound on the alternating state complexity of Primes. We do not know of any non-trivial upper bound, and believe that there is none, meaning that Primes does not have subexponential alternating state complexity.
Evidence for this is given by the following probabilistic argument. Consider the distribution of languages over in which a word in is included in the language with probability . It is a common (yet flawed) heuristic that the prime numbers follow this distribution, as suggested for instance by the prime number theorem. One can show that, with high probability, such a language does not have subexponential alternating state complexity: two different words are very likely to induce different profiles in the query table. Thus it is reasonable to expect that Primes does not have subexponential alternating state complexity.
We dwell on the possibility of proving stronger lower bounds for the alternating state complexity of Primes. Theorem 9 fleshes out the sparsity of prime numbers: it constructs isolated prime numbers in any arithmetic progression, and allows us to show that the query table of Primes contains all profiles with all but one boolean value set to false.
To populate the query table of Primes further, one needs results witnessing the density of prime numbers, i.e., proving the existence of clusters of prime numbers. This is in essence the content of the twin prime conjecture, or more generally of Dickson's conjecture, both long-standing open problems in number theory; this suggests that proving better lower bounds is a very challenging objective. Dickson's conjecture reads (we use the equivalent statement given by Ribenboim in Ribenboim96 , called ):
Conjecture 1 (Dickson’s Conjecture).
Fix and such that there exists no prime number which divides for every in . Then there exists a number such that are consecutive prime numbers.
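The divisibility condition in the conjecture is the classical admissibility condition on the tuple of offsets: a prime p divides the product for every n exactly when the offsets cover all residue classes modulo p, which forces p to be at most the number of offsets. A minimal check (our sketch, not from the paper):

```python
def admissible(offsets) -> bool:
    # A prime p divides (n + a_0)...(n + a_k) for every n iff the offsets
    # hit every residue class mod p, which requires p <= k + 1; so it
    # suffices to inspect the primes up to len(offsets).
    def is_prime(n: int) -> bool:
        return n >= 2 and all(n % d for d in range(2, n))
    return all(len({a % p for a in offsets}) < p
               for p in range(2, len(offsets) + 1) if is_prime(p))
```

For instance, the twin-prime pattern (n, n+2) is admissible, while (n, n+2, n+4) is not, since 3 divides one of the three factors for every n.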
Theorem 10.
Assuming Conjecture 1 holds true, the set of prime numbers written in binary does not have subexponential alternating state complexity.
Proof.
We show that for infinitely many , the query table of Primes of order has size doubly exponential in . Thanks to Theorem 4, this implies the result.
Fix . As above, let be the set of all words of length starting with a , i.e., odd numbers. For a subset of , let denote the property that there exists no prime number which divides for every in .

Let be a subset of satisfying . Thanks to Conjecture 1, there exists a number such that for , the number is prime if and only if is in . Let be a word such that ; it clearly satisfies the condition above. In other words, the profile of for the columns between and is on the columns corresponding to , and everywhere else. For each subset satisfying and having the same extremal elements ( and ), we have thus constructed a word ; these words have pairwise different profiles.
To finish the proof, we need to explain why this induces doubly exponentially many different profiles. For any , the set of odd numbers such that is a prime number satisfies . This follows from the remark that no prime number can divide both and . Thanks to the prime number theorem, which estimates the proportion of prime numbers, we know that for infinitely many the set contains a number smaller than and a number larger than . Now, each subset of gives rise to a different profile, which yields doubly exponentially many of them. ∎

Conclusion
Our first result shows that probabilistic languages can have arbitrarily high deterministic state complexity, substantiating a claim by Rabin. Our main technical contribution concerns alternating state complexity, for which we have developed a generic lower bound technique and applied it to two problems. The first result gives languages of arbitrarily high polynomial alternating state complexity. The second gives lower bounds on the alternating state complexity of the language of prime numbers: we show that it is not sublinear, an exponential improvement over the previous result. However, the exact complexity is left open; we conjecture that it is not subexponential, but obtaining this result might require major advances in number theory.
We leave three questions open, motivating further research:

What is the alternating state complexity of probabilistic languages? We conjecture that the probabilistic language we introduced does not have subexponential alternating state complexity, but our lower bound technique does not suffice to prove this result.

Is the converse of Theorem 3 true, or in other words does the size of the query table completely characterise the alternating state complexity (as it does in the deterministic case)? We believe the answer is “no”, but proving it would require using a stronger lower bound technique to separate alternating state complexity from size of the query table.

Can we find a notion of reduction between languages which respects the alternating state complexity, inducing a definition of completeness for alternating state complexity classes? The sequence of languages for are good candidates for complete languages in the polynomial hierarchy.
References
 (1) N. Fijalkow, The online space complexity of probabilistic languages, in: LFCS’2016, 2016, pp. 1–12.
 (2) N. Fijalkow, The state complexity of alternating automata, in: LICS’18, 2018, pp. 414–421. doi:10.1145/3209108.3209167.
 (3) R. M. Karp, Some bounds on the storage requirements of sequential machines and Turing machines, Journal of the ACM 14 (3).
 (4) A. Nerode, Linear automaton transformations, Proceedings of the American Mathematical Society 9 (4) (1958) 541–544.
 (5) J. Hartmanis, H. Shank, Two memory bounds for the recognition of primes by automata, Mathematical Systems Theory 3 (2).
 (6) M. O. Rabin, Probabilistic automata, Information and Control 6 (3) (1963) 230–245.
 (7) A. K. Chandra, L. J. Stockmeyer, Alternation, in: FOCS’76, 1976, pp. 1–12.
 (8) D. Kozen, On parallelism in Turing machines, in: FOCS’76, 1976, pp. 89–97.
 (9) A. K. Chandra, D. Kozen, L. J. Stockmeyer, Alternation, Journal of the ACM 28 (1) (1981) 114–133.
 (10) F. Reiter, Distributed graph automata, in: LICS, 2015, pp. 1–12.
 (11) S. Yu, State complexity of regular languages, Journal of Automata, Languages and Combinatorics 6 (2) (2001) 221.
 (12) S. Yu, State complexity of finite and infinite regular languages, Bulletin of the EATCS 76 (2002) 142–152.
 (13) Y. Gao, N. Moreira, R. Reis, S. Yu, A survey on operational state complexity, Journal of Automata, Languages and Combinatorics 21 (4) (2017) 251–310.
 (14) A. Dawar, S. Kreutzer, Generalising automaticity to modal properties of finite structures, Theoretical Computer Science 379 (1–2) (2007) 266–285.
 (15) N. Fijalkow, M. Skrzypczak, Irregular behaviours for probabilistic automata, in: RP’2015, 2015, pp. 33–36.
 (16) J. Shallit, Y. Breitbart, Automaticity I: properties of a measure of descriptional complexity, Journal of Computer and System Sciences 53 (1) (1996) 10–25.
 (17) C. Pomerance, J. M. Robson, J. Shallit, Automaticity II: descriptional complexity in the unary case, Theoretical Computer Science 180 (1–2) (1997) 181–201.
 (18) I. Glaister, J. Shallit, Automaticity III: polynomial automaticity and contextfree languages, Computational Complexity 7 (4) (1998) 371–387.
 (19) J. Shallit, Automaticity IV: sequences, sets, and diversity, Journal de Théorie des Nombres de Bordeaux 8 (2) (1996) 347–367.
 (20) J. Brzozowski, Canonical regular expressions and minimal state graphs for definite events, Symposium on Mathematical Theory of Automata 12 (1963) 529–561.
 (21) G. Rozenberg, A. Salomaa, Handbook of Formal Languages, Springer-Verlag Berlin Heidelberg, 1997.
 (22) A. Fellah, H. Jürgensen, S. Yu, Constructions for alternating finite automata, International Journal of Computer Mathematics 35 (1) (1990) 117–132.
 (23) L. Babai, P. Frankl, J. Simon, Complexity classes in communication complexity theory (preliminary version), in: FOCS’86, 1986, pp. 1–12.
 (24) G. L. Miller, Riemann’s hypothesis and tests for primality, Journal of Computer and System Sciences 13 (3) (1976) 300–317.
 (25) M. Agrawal, N. Kayal, N. Saxena, Primes is in P, Annals of Mathematics 2 (2002) 781–793.
 (26) J. Hartmanis, H. Shank, On the recognition of primes by automata, Journal of the ACM 15 (3) (1968) 382–389.
 (27) J. Hartmanis, L. Berman, On tape bounds for single letter alphabet language processing, Theoretical Computer Science 3 (2) (1976) 213–224.
 (28) E. Allender, M. E. Saks, I. Shparlinski, A lower bound for primality, Journal of Computer and System Sciences 62 (2) (2001) 356–366.
 (29) H. Maier, C. Pomerance, Unusually large gaps between consecutive primes, Transactions of the American Mathematical Society 322 (1) (1990) 201–237.
 (30) P. Ribenboim, The Book of Prime Number Records, Discrete Mathematics, Springer-Verlag New York, 1996.