1 Introduction
As part of the long history of research in robotic minimalism, a recent thread has devised methods that aim to automatically reduce and reason about robots’ resource footprints. That work fits within the larger context of methodologies and formalisms for tackling robot design problems, being useful for designing robots subject to resource limits [censi17co, pervan2018low, saberifar18hardness]. But, more fundamentally, the associated algorithms also help identify the information requirements of certain robot tasks. The methods have the potential to provide insights about the interplay of sensing, state, and actuation within the context of particular tasks. One class of objects where the problem of resource minimization can be clearly posed is in the case of combinatorial filters [lavalle10sensing]. These are discrete variants of the probabilistic estimators and recursive Bayesian filters widely adopted for practical use in robots. Combinatorial filters process a stream of discrete sensor inputs and integrate information via transitions between states. The natural question, studied in [o2017concise], then is: How few states are needed to realize specified filter functionality? In this paper, we define a more general class of filters and ask the same question.
We start with a simple motivating scenario where the generalization we introduce is exactly what is needed. Figure 1 shows a driving drone patrolling a house. (Such bizarre chimera robots are not our invention, e.g., see the Syma X9 Flying Car.) The drone can either drive or fly, but its choice must satisfy navigability constraints. Its wheels can’t drive on grass (F+Y) nor in the pantry (P), owing to spills. Spinning propellers, on the other hand, will disturb the tranquil bedroom (B). Otherwise, either means may be chosen (see inset map pair marking regions in brown/blue for driving/flying). The robot is equipped with an ambient light sensor that is useful because the living room and kitchen are lighter than the bedroom and pantry, while the outdoors is lightest of all.
We wish to construct a filter for the drone to determine how to navigate, with the inputs being brightness changes, and the filter’s output providing some valid mode of locomotion. It is easy to give a valid filter by using one state for each location — this naïve filter is depicted in Figure 1(a). In the living room and kitchen, the filter lists two outputs since both modes are applicable there (both locations are covered by the brown and blue choices). Now consider the question of the smallest filter. If we opt to fly in both the living room and kitchen, then the smallest filter is shown in Figure 1(b) with states. But when choosing to fly in the living room but drive in the kitchen, the minimal filter requires only states (in Figure 1(c)).
This last filter is also the globally minimal filter. The crux is that states with multiple valid outputs introduce a new degree of freedom which influences the size of the minimal filter. These arise, for instance, whenever there are ‘don’t-care’ options. The flexibility of such states must be retained to truly minimize the number of states.
2 Preliminary Definitions and Problem Description
To begin, we define the filter minimization problem in the most general form, where the input is allowed to be nondeterministic and each state may have multiple outputs. This is captured by the procrustean filter (pfilter) formalism [setlabelrss].
2.1 Pfilters and their minimization
We first introduce the notion of a pfilter:
Definition 1 (procrustean filter [setlabelrss]).
A procrustean filter, pfilter or filter for short, is a tuple with:

a finite set of states , a nonempty initial set of states , and a set of possible observations ,

a transition function ,

a set , which we call the output space, and

an output function .
The states, initial states and observations for pfilter will be denoted , and . Without loss of generality, we will also treat a pfilter as a graph with states as its vertices and transitions as directed edges.
A sequence of observations can be traced on the pfilter:
Definition 2 (reached).
Given any pfilter , a sequence of observations , and states , we say that is a state reached by some sequence from in (or reaches from ), if there exists a sequence of states in , such that . We denote the set of all states reached by from state in as . For simplicity, we use , without the subscript, to denote the set of all vertices reached when starting from any state in , i.e., . Note that holds only when sequence crashes in starting from .
For convenience, we will denote the set of sequences reaching vertex from some initial state by .
Definition 3 (extensions, executions and interaction language).
An extension of a state on a pfilter is a finite sequence of events that does not crash when traced from , i.e., . An extension of any initial state is also called an execution or a string on . The set of all extensions of a state on is called the extensions of , written as . The extensions of all initial vertices on is also called the interaction language (or, briefly, just language) of , and is written .
Note in particular that the empty string belongs to the extensions of any state on the filter, and belongs to the language of the filter as well.
Definition 4 (filter output).
Given any pfilter , a string and an output , we say that is a filter output with input string , if is an output from the state reached by , i.e., . We denote the set of all filter outputs for string as .
Specifically, for the empty string , we have .
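To make Definitions 1–4 concrete, the following minimal Python sketch (all names are ours, purely illustrative — not from any released implementation) represents a pfilter as a transition dictionary, with `reached` tracing an observation sequence and `filter_outputs` collecting the outputs of the states reached:

```python
class PFilter:
    """Minimal p-filter sketch (Definitions 1-4); names are illustrative."""

    def __init__(self, initial, delta, outputs):
        self.initial = set(initial)  # nonempty set of initial states
        self.delta = delta           # (state, observation) -> set of successor states
        self.outputs = outputs       # state -> set of outputs

    def reached(self, seq, start=None):
        """States reached by observation sequence `seq` (Definition 2).
        An empty result means the sequence crashes."""
        current = set(self.initial if start is None else start)
        for y in seq:
            current = {w for v in current for w in self.delta.get((v, y), ())}
        return current

    def filter_outputs(self, seq):
        """Union of outputs over all states reached by `seq` (Definition 4)."""
        out = set()
        for v in self.reached(seq):
            out |= set(self.outputs[v])
        return out
```

Note that `reached([])` returns the initial states, so `filter_outputs([])` matches the empty-string case above.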
Definition 5 (output simulating).
Given any pfilter , a pfilter output simulates if , and .
Plainly in words: for one pfilter to output simulate another, it has to generate some of the outputs of the other, for every string the other admits.
We are interested in practicable pfilters with deterministic behavior:
Definition 6 (deterministic).
A pfilter is deterministic or state-determined, if , and for every with , .
Any nondeterministic pfilter can be state-determinized, denoted , following Algorithm in [saberifar18pgraph].
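State-determinization can be sketched with the generic power-set construction (we make no claim that this matches the cited algorithm's details; the function name and data layout are ours):

```python
from collections import deque

def determinize(initial, delta, observations):
    """Power-set construction sketch: the states of the result are the
    reachable subsets of the original states, so for each (state, observation)
    pair there is at most one successor, i.e., the result is state-determined."""
    start = frozenset(initial)
    dstates = {start}
    ddelta = {}
    queue = deque([start])
    while queue:
        S = queue.popleft()
        for y in observations:
            T = frozenset(w for v in S for w in delta.get((v, y), ()))
            if not T:
                continue  # y crashes from every state in S: no edge
            ddelta[(S, y)] = T
            if T not in dstates:
                dstates.add(T)
                queue.append(T)
    return start, dstates, ddelta
```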
Then the pfilter minimization problem can be formalized as follows:
Problem: Pfilter Minimization (pfm) Input: A deterministic pfilter . Output: A deterministic pfilter with fewest states, such that output simulates .
‘pf’ denotes that the input is a pfilter, and ‘m’ denotes that we are interested in finding a deterministic minimal filter as a solution.
2.2 Complexity of pfilter minimization problems
Definition 7 (state single-outputting and multi-outputting).
A pfilter is state single-outputting, or single-outputting for short, if only maps to singletons, i.e., . Otherwise, we say that is multi-outputting.
Depending on whether the state in the input pfilter (pf) is single-outputting (so) or multi-outputting (mo), we further categorize the problem pfm into the following problems: sofm and mofm.
Lemma 8.
The filter minimization problem fm of [o2017concise] is sofm.
Lemma 9.
sofm is NP-complete.
Theorem 10.
mofm is NP-complete.
Proof.
First, sofm is a special case of mofm, so mofm is at least as hard as sofm and is therefore NP-hard. On the other hand, a solution for mofm can be verified in polynomial time. (Change the equality check on line 7 of Algorithm 1 in [o2017concise] to a subset check.) Therefore, mofm is NP-complete. ∎
3 Related work: Prior filter minimization ideas (sofm)
Several elements come together in this section and Figure 3 attempts to show the interrelationships graphically. The original question of minimizing state in filtering was first alluded to by LaValle [lavalle10sensing] as an open problem; he suggested that it is ‘similar to Nerode equivalence classes’. The problem of filter reduction, i.e., sofm in our terms, was formalized and shown to differ in complexity class from the analogous automata problem in [o2017concise]. That paper also proposed a heuristic algorithm, which served as a starting point for subsequent work. That algorithm uses conflict graphs to capture the vertices that cannot be merged (are conflicting), then iteratively refines the conflict graphs and decides whether to merge two vertices via a graph coloring subroutine. A conjecture in [o2017concise] was that this algorithm is guaranteed to find a minimal filter if the graph coloring subroutine gives a minimal coloring. (Put another way: the inexactness in arriving at a minimal filter can be traced to the graph coloring giving a suboptimal result.) But this conjecture was later proved false by Saberifar et al. [saberifar2017combinatorial]. They showed there may exist multiple optimal solutions to the graph coloring subproblem, only some of which lead to the minimal filter (see Theorem in [saberifar2017combinatorial]). They refined the conjecture, giving the following statement of existence:
Idea 1 (see Theorem in [saberifar2017combinatorial]).
In O’Kane and Shell’s heuristic algorithm [o2017concise] for sofm, there always exists some optimal coloring for each conflict graph in the stepwise conflict refinement process, such that it generates a minimal filter.
Lemma 11.
Idea 1 is false.
Proof.
This is simply shown with a counterexample. Consider minimizing the input filter shown in Figure 3(a): the heuristic algorithm first initializes the colors of the vertices with their outputs. Next, it identifies the vertices that disagree on the outputs of extensions with length as shown in Figure 3(b), and then refines the colors of the vertices as shown in Figure 3(c) following a minimal graph coloring solution on the conflict graph. Then it further identifies the conflicts on extensions with length with the conflict graph shown in Figure 3(d), and the vertex colors are further refined as shown in Figure 3(e). Now, no further conflicts can be found. A filter, with states, is then obtained by merging the states with the same color. However, there exists a minimal filter, with states, shown in Figure 3(f), that can be found by choosing a coloring solution for the conflict graph shown in Figure 3(b). That coloring is suboptimal. ∎
This appears to indicate a sort of ‘local optima’ arising via subproblems associated with incremental (or stepwise) reduction. In order to avoid this, we introduce a notion of compatibility that can be computed more ‘globally’ before making any decisions to reduce the filter:
Definition 12 (compatibility relation).
Let be a deterministic pfilter. We say a pair of vertices are compatible, denoted , if they agree on the outputs of all their extensions, i.e., .
Via this compatibility relation, we get an undirected compatibility graph:
Definition 13 (compatibility graph).
Given a deterministic filter , its compatibility graph is an unlabeled undirected graph constructed by creating a vertex associated with each state in , and building an edge between the pair of vertices associated with two compatible states.
This compatibility graph can be constructed in polynomial time. As filter states and compatibility graph vertices are in one-to-one correspondence, to simplify notation we’ll use the same symbol for both, relying on context to resolve any ambiguity.
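One polynomial-time construction computes the complement relation as a fixpoint: seed the conflicting pairs with those that disagree on outputs, then propagate conflicts backward along transitions. The sketch below (our own reading of Definitions 12–13, assuming a deterministic single-outputting filter, and treating a crashing observation as imposing no constraint) returns the compatibility edges:

```python
def compatibility_edges(states, delta, out, observations):
    """Compatibility graph sketch (Definitions 12-13): u and v conflict iff
    they disagree on the output of some common extension.  `delta` maps
    (state, observation) to a single successor; an absent entry means the
    observation crashes there, which we read as imposing no constraint."""
    conflict = {frozenset((u, v)) for u in states for v in states
                if u != v and out[u] != out[v]}
    changed = True
    while changed:
        changed = False
        for u in states:
            for v in states:
                pair = frozenset((u, v))
                if u == v or pair in conflict:
                    continue
                for y in observations:
                    su, sv = delta.get((u, y)), delta.get((v, y))
                    if su is not None and sv is not None \
                            and frozenset((su, sv)) in conflict:
                        conflict.add(pair)  # a common extension disagrees
                        changed = True
                        break
    # compatible pairs are exactly the non-conflicting distinct pairs
    return {frozenset((u, v)) for u in states for v in states
            if u != v and frozenset((u, v)) not in conflict}
```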
The second idea relates to the type of the output one obtains after merging states that are compatible or not in conflict. Importantly, the filter minimization problem sofm requires that give a minimal filter which is deterministic.
Idea 2.
By merging the states that are compatible, the heuristic algorithm always produces a deterministic pfilter.
The definitions of the reduction problems within [o2017concise, saberifar2017combinatorial, rahmani2018relationship] are specified so as to require that the output obtained be deterministic. But this postcondition is never formally established. In fact, it does not always hold.
Lemma 14.
Idea 2 is false.
Proof.
We show that the existing algorithm may produce a nondeterministic filter, which does not output simulate the input filter, and is thus not a valid solution. Consider the filter shown in Figure 4(a) as an input. The vertices with the same color are compatible with each other, with the following exception for , and . Vertex is compatible with , vertex is compatible with , but is not compatible with . The minimal filter found by the existing algorithm is shown in Figure 4(b). The string suffices to show the nondeterminism, reaching both orange and cyan vertices. It fails to output simulate the input because cyan should never be produced. ∎
If determinism can’t be taken for granted, we might constrain the output to ensure the result will be a deterministic filter. To do this, we introduce an auxiliary constraint when merging compatible states:
Definition 15 (auxiliary constraint).
In the compatibility graph of filter , if there exists a set of mutually compatible states , where every pair has , then they can only be selected to be merged if they always transition to a set of states that are also selected to be merged. For any set of mutually compatible states and some observation , we will create an auxiliary constraint expressed as a pair if . We denote the set of all auxiliary constraints on the compatibility graph with the symbol .
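The generation of these constraints can be sketched as follows. Definition 15 ranges over arbitrary sets of mutually compatible states; to keep the sketch short we restrict it to pairs (an illustrative simplification, not the full construction), where a constraint records that if two compatible states are merged, their successors under each observation must be merged as well:

```python
def auxiliary_constraints(edges, delta, observations):
    """Pairwise sketch of Definition 15: for each compatible pair {u, v} and
    observation y with distinct successors, record the constraint that
    merging u and v forces merging their y-successors.  `delta` maps
    (state, observation) to a single successor (deterministic filter)."""
    aux = set()
    for pair in edges:
        u, v = tuple(pair)
        for y in observations:
            su, sv = delta.get((u, y)), delta.get((v, y))
            if su is not None and sv is not None and su != sv:
                aux.add((pair, frozenset((su, sv))))
    return aux
```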
A third idea is used by O’Kane and Shell’s heuristic algorithm and is also stated, rather more explicitly, by Saberifar et al. (see Lemma 5 in [saberifar2017combinatorial] and Lemma in [rahmani2018relationship]). It indicates that we can obtain a minimal filter via merging operations on the compatible states.
Idea 3.
Some equivalence relation induces a minimal filter.
Before examining this, we rigorously define the notion of an induced relation:
Definition 16 (induced relation).
Given a filter and another filter , if output simulates , then induces a relation , where if and only if there exists a vertex such that and . We also say that and corresponds to state .
Lemma 17.
Idea 3 is false.
Proof.
It is enough to scrutinize the previous counterexample closely. A minimal filter for the sofm problem with the input filter of Figure 4(a) is shown in Figure 5(a). It is obtained by (i) splitting vertex into an upper part reached by and a lower part reached by , and (ii) merging the upper part of with , the lower part of with , and the other vertices with those of the same color. This does not induce an equivalence relation, since corresponds to two different vertices in the minimal filter. ∎
In light of this, for some filter minimization problems there may be no quotient operation that produces a minimal filter, and an exact algorithm for minimizing filters requires that we look beyond equivalence relations.
Some strings that reach a single state in an input filter may reach multiple states in a minimal pfilter (e.g., and on Figure 4(a) and 5(a)). On the other hand, strings that reach different states in the input pfilter may reach the same state in the minimal filter (e.g., and on those same filters). We say that a state from the input filter corresponds to a state in the minimal filter if there exists some string reaching both of them and, hence, this correspondence is many-to-many. An important observation is this: for each state in some hypothetical minimal filter, suppose we collect all those states in the input filter that correspond with . When we examine the associated states in the compatibility graph for that collection, they must all form a clique. Were it not so, the minimal filter could have more than one output associated with some strings owing to nondeterminism. But this causes it to fail to output simulate the input pfilter.
After firming up and developing these intuitions, the next section introduces the concept of a clique cover which enables representation of a search space that includes relations more general than equivalence relations. Based on this new representation, we propose a graph problem that makes use of auxiliary constraints, and prove it to be equivalent to filter minimization.
4 A new graph problem that is equivalent to sofm
By building the correspondence between the input pfilter in Figure 4(a) and the minimal result in Figure 5(a), one obtains the set of cliques in the compatibility graph shown visually in Figure 5(b). Like previous approaches that make state merges by analyzing the compatibility graph, we interpret each clique as a set of states to be merged into one state in the minimal filter. The clique containing and in Figure 5(b) gives rise to in the minimal filter in Figure 5(a) (and and yields , and so on). However, states may further be shared across multiple cliques. We observe that was merged with in the minimal filter to give , and also merged with to give . The former has an incoming edge labeled with an , while the latter has an incoming edge labeled . The vertex , being shared by multiple cliques, is split into different copies and each copy merged separately.
Generalizing this observation, we turn to searching for the smallest set of cliques that cover all vertices in the compatibility graph. Further, to guarantee that the set of cliques induces a deterministic filter, we must ensure they respect the auxiliary constraints. It will turn out that a solution of this new constrained minimum clique cover problem always induces a minimal filter for sofm, and a minimal filter for sofm always induces a solution for this new problem. The final step is to reduce any mcca problem to a SAT instance, and leverage SAT solvers to find a minimal filter for sofm.
4.1 A new minimum clique cover problem
To begin, we extend the preceding argument from the compatibility clique associated with a single state , over to all the states in the minimal filter. This leads one to observe that the collection of all cliques for each state in the minimal pfilter forms a clique cover:
Definition 18 (induced clique cover).
Given a pfilter and another pfilter , we say that a vertex in corresponds to a vertex in if . Then, denoting the subset of vertices of corresponding to in with , we form the collection of all such sets, , for where . When output simulates , then the form cliques in the compatibility graph . Further, when this collection of sets covers all vertices in , i.e., , we say that is an induced clique cover.
It is worth repeating: the size of filter (in terms of number of vertices) and the size of the induced clique cover (number of sets) are equal.
Without loss of generality, here and henceforth we only consider pfilters with all vertices reachable from the initial states, since those that can never be reached will be deleted during filter minimization anyway.
Each clique of the clique cover represents the states that can be potentially merged. But the auxiliary constraint, to enforce determinism, requires that the set of vertices to be merged should always transition under the same observation to the ones that can also be merged. Hence, the auxiliary constraints (of Definition 15) can be evaluated across whole covers:
Definition 19.
A clique cover satisfies the set of auxiliary constraints , when for every auxiliary constraint , if there exists a clique , such that , then there exists another clique such that .
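Definition 19 amounts to a simple check over the cover. A sketch (function name ours), where each auxiliary constraint is a pair of a source set and a target set of states:

```python
def cover_satisfies(cover, aux):
    """Definition 19 sketch: a clique cover satisfies the auxiliary
    constraints if, whenever some clique contains the source set U of a
    constraint (U, T), some clique also contains the target set T."""
    cliques = [frozenset(c) for c in cover]
    for U, T in aux:
        if any(U <= c for c in cliques) and not any(T <= c for c in cliques):
            return False
    return True
```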
Now, we have our new graph problem, mcca.
Problem: Minimum clique cover with auxiliary constraints (mcca) Input: A compatibility graph , a set of auxiliary constraints Aux. Output: A minimal cardinality clique cover of satisfying Aux.
4.2 From minimal clique covers to filters
Given a minimal cover that solves mcca, we construct a filter by merging the states in the same clique and choosing edges between these cliques appropriately:
Definition 20 (induced filter).
Given a clique cover on the compatibility graph of deterministic pfilter , if satisfies all the auxiliary constraints in , then it induces a filter by treating cliques as vertices:

Create a new filter with vertices, where each vertex is associated with a clique in ;

Add each vertex in to iff the associated clique contains an initial state in ;

The output of every in , with associated clique , is the set of common outputs for all states in , i.e., .

For any pair of and in , inherit all transitions between states in the cliques of and , i.e., .

For each vertex in with multiple outgoing edges labeled , keep only the single edge to the vertex , such that all vertices transition to under are included in . This edge must exist since satisfies all .
The size of the cover (in terms of number of sets) and size of the induced filter (number of vertices) are equal.
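The five construction steps of Definition 20 can be sketched directly (our own rendering, with illustrative names; `delta` is the deterministic transition map of the input filter and `out` its output function):

```python
def induced_filter(cover, initial, delta, out, observations):
    """Sketch of Definition 20: each clique becomes a vertex; a clique is
    initial if it contains an initial state; its output is the set of common
    outputs of its members; and for each observation we keep the single edge
    to a clique containing every member's successor (step 5)."""
    cliques = [frozenset(c) for c in cover]
    new_initial = {i for i, c in enumerate(cliques) if c & set(initial)}
    new_out = {i: set.intersection(*(set(out[v]) for v in c))
               for i, c in enumerate(cliques)}
    new_delta = {}
    for i, c in enumerate(cliques):
        for y in observations:
            targets = {delta[(v, y)] for v in c if (v, y) in delta}
            if not targets:
                continue  # y crashes from every member of the clique
            # the auxiliary constraints guarantee such a clique exists
            for j, d in enumerate(cliques):
                if targets <= d:
                    new_delta[(i, y)] = j
                    break
    return new_initial, new_delta, new_out
```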
Notice that the earlier intuition is mirrored by this formal construction: states belonging to the same clique are merged when constructing the induced filter; states in multiple cliques are split when we make the edge choice in step 5. Next, we establish that the induced filter indeed supplies the goods:
Lemma. Given any clique cover on the compatibility graph of a deterministic pfilter , if satisfies the auxiliary constraints and covers all vertices of , then the induced filter is deterministic and output simulates . Proofs appear in the supplementary material, Section A.
A surprising aspect of the preceding is how the auxiliary constraints —which are imposed to ensure that a deterministic filter is produced— enforce outputsimulating behavior, albeit indirectly, too. One might have expected that this separate property would demand a second type of constraint, but this is not so.
On the other hand, needing to satisfy the auxiliary constraints of the input filter does not entail the imposition of any gratuitous requirements:
Lemma. Given any deterministic pfilters and , if output simulates , then the induced clique cover on the compatibility graph of satisfies all auxiliary constraints in . The proofs appear in the supplementary material, Section A.
4.3 A proof of equivalence between mcca and sofm
To establish the equivalence between mcca and sofm, we will show that the induced filter from the solution of mcca is a minimal filter for sofm, and the induced clique cover from a minimal filter is a solution for mcca.
Lemma 21.
Minimal clique covers for mcca induce minimal filters for sofm.
Proof.
Given any minimal clique cover as a solution for problem mcca with input pfilter , construct pfilter . Since satisfies the auxiliary constraints , is deterministic and output simulates according to Lemma 20. To show that is a minimal deterministic filter for sofm, suppose the contrary. Then there exists a minimal deterministic filter with fewer states, i.e., . Hence, induces a clique cover with fewer cliques than . Since is deterministic, satisfies all via Lemma 20. But then satisfies all the requirements to be a solution for mcca, and has fewer cliques than , contradicting the assumption. ∎
Lemma 22.
A minimal filter for sofm with input induces a clique cover that solves mcca with compatibility graph and auxiliary constraints of .
Proof.
Given minimal filter as a solution for sofm with input filter , we can construct a clique cover from the minimal filter. For this cover to be a solution for mcca with compatibility graph and auxiliary constraints , first, it must satisfy all constraints in . Lemma 20 affirms this fact. Second, we must show it to be minimal among all the covers satisfying those constraints. Supposing is not minimal, there must exist a clique cover with satisfying . Then, consider the induced filter . Since satisfies all the auxiliary constraints , is deterministic and will output simulate (Lemma 20). But , contravening the fact that is a minimal filter. Hence is minimal. ∎
Together, they establish the theorem.
Theorem 23.
The solution for mcca with compatibility graph and auxiliary constraints of a filter induces a solution for sofm with input filter , and vice versa.
4.4 Reduction from mcca to SAT
Prior algorithms for filter minimization used multiple stages to find a set of vertices to merge, solving a graph coloring problem repeatedly as more constraints are identified. In contrast, an interesting aspect of mcca is that it tackles filter minimization as a constrained optimization problem with all constraints established upfront. Thus the clique perspective gives an optimization problem which is tangible and easy to visualize. Still, being a new invention, there are no solvers readily available for direct use. But reducing mcca to Boolean satisfiability (SAT) enables the use of state-of-the-art solvers to find minimum clique covers.
We follow the standard practice for treating optimization problems via a decision problem oracle, viz. define a mcca problem, asking for the existence of a clique cover with size satisfying the auxiliary constraints; one then decreases to find the minimum clique cover. Each mcca problem can be written as a logic formula in conjunctive normal form (CNF), polynomial in the size of the mcca instance, and solved. Detailed explanation of the CNF generation from the mcca problem must be deferred to the supplementary material, Section B.
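The decision-oracle loop can be sketched as follows, with a hypothetical `solve_with_k` standing in for the SAT-backed decision procedure (returning a clique cover with at most k cliques satisfying the auxiliary constraints, or None if unsatisfiable):

```python
def minimize_cover(n_states, solve_with_k):
    """Decision-oracle loop sketch: start at k = n_states (the trivial cover,
    one clique per vertex, is always feasible) and decrease k until the
    oracle reports UNSAT; the last satisfiable answer is a minimum cover."""
    best = None
    for k in range(n_states, 0, -1):
        cover = solve_with_k(k)
        if cover is None:
            break  # no cover with k cliques exists, so `best` is minimum
        best = cover
    return best
```

In practice one could also binary-search on k; the linear scan above mirrors the description in the text.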
5 Generalizing to mofm
Finally, we generalize the previous algorithm to multi-outputting filters. In mofm problems, the input pfilter is deterministic but states in the pfilter may have multiple outputs. One straightforward if unsophisticated approach is to enumerate all filters under different output choices for the states with multiple outputs, and then solve every one of the resulting deterministic single-outputting filters as an instance of sofm. The filter with the fewest states among all the minimizers could then be treated as a minimal one for the mofm problem.
Unfortunately, this is too simplistic. Prematurely committing to an output choice is detrimental. Consider the input filter shown in Figure 6(a): it has two multi-outputting states ( and ). If we choose to have both and give the same output, the sofm minimal filter, shown in Figure 6(b), has states. If we choose distinct outputs for and , the sofm minimal filter, shown in Figure 6(c), now has states. But neither is the minimal mofm filter. The true minimizer appears in Figure 6(d), with only states. It is obtained by splitting both and into two copies, each copy giving a different output.
The idea underlying a correct approach is that output choices should be made together with the splitting and merging operations during filter minimization. Multi-outputting vertices may introduce additional split operations, but these split operations can still be treated via clique covers on the compatibility graph. This requires that we define a new compatibility relation—it is only slightly more general than before:
Definition 24 (compatibility relation).
Let be a deterministic pfilter. We say a pair of vertices are compatible, denoted as , if they agree on the outputs of all their extensions, i.e., .
Using this definition, the minimization of a deterministic multioutputting filter can also be written and solved as an mcca problem.
6 Experimental results
The method described was implemented by leveraging a Python implementation of the 2018 SAT Competition winner, MapleLCMDistChronoBT [nadel2018maple, immssat18]. The solver returns a solution if it solves the mcca problem before timing out. If it finds a satisfying assignment, we decrease and try again. Once no satisfying assignment can be found, we construct a minimal filter from the solution with minimum .
First, as a sanity check, we examined the minimization problems for the inputs shown in Figure 1(a), Figure 4(a) and Figure 6(a). Our implementation takes , and , respectively, to reduce those filters. All filters found have exactly the minimum number of states, as reported above.
Next, we designed a test where we could examine scalability aspects of the method. We generalized the input filter shown in Figure 6(a) to produce a family of instances, each described by two parameters: the input filter has rows and states in each row. (Figure 6(a) is the version.) Just like the original filter, the states in the same row share the same color, but the states in different rows have different colors. The initial state outputs a single unique color; the last two states, and , output any of the colors. In this example, the states in the same row, together with and , are compatible with each other.
We recorded the time to construct the auxiliary constraints, the time to prepare the formulas, and the time used by the SAT solver. We also measured the number of auxiliary constraints found by our algorithm. Figure 8 summarizes the data for . The results show that about of the time is used in preparing the logical formula, with the SAT solver and the construction of the auxiliary constraints accounting for only a very small fraction of the time.
In light of this, to further dissect the computational costs of the different phases, we tested a robot in the square grid environment shown in Figure 8(a). The robot starts from the bottom left cell, and moves to some adjacent cell at each time step. The robot only receives observations indicating its row number at each step. We are interested in a small filter allowing the robot to recognize whether it has reached a cell with an exit (on the inner side or outer side). States with both inner and outer exits have multiple outputs. To search for a minimal filter, we start with deterministic input filters for grid worlds of size , , , , , , and then minimize these filters. We collected the total time spent in the different stages of filter minimization, including the construction of auxiliary constraints, SAT formula generation, and the solving of the SAT formulas by the SAT solver. The results are summarized visually in Figure 8(b).
In this problem, the number of states in the input filter scales linearly with the size of the square, as does the minimal filter. But this particular problem has an important additional property: it represents a worst case in a certain sense because there are no auxiliary constraints. We do not indicate this fact to the algorithm, so the construction of auxiliary constraints examines many cliques, determining that none apply. The results highlight that the construction of the auxiliary constraints quickly grows to overtake the time to generate the logical formula — even though, in this case, the auxiliary constraint set is empty.
The preceding hints toward our current research direction: the construction of by naïvely following Definition 15 is costly. And, though the SAT formula is polynomial in the size of the mcca instance, that instance can be very large. On the other hand, the need for an auxiliary constraint can be detected when the output produced fails to be deterministic. Hence, our future work will look at how to generate these constraints lazily.
7 Conclusion
With an eye toward generalizing combinatorial filters, we introduced a new class of filter, the cover filters. Then, in order to reduce the state complexity of such filters, we reexamined earlier treatments of traditional filter minimization; this paper has shown some prior ideas to be mistaken. Building on these insights, we formulate the minimization problem via compatibility graphs, examining covers comprised of cliques formed thereon. We present an exact algorithm that generalizes from the traditional filter minimization problem to cover filters elegantly.
References
Appendix A Lemmas and proofs
Lemma (restated).
Proof.
Suppose that does not satisfy all auxiliary constraints in . Specifically, let be the auxiliary constraint that is violated, where each vertex in transitions from some vertex in under observation . Then there exists a clique , such that , but there is no clique such that . According to the construction of the induced cover, there exists a vertex , such that corresponds to . For any vertex , let . Then is also a string in both and since transitions to some vertex in under observation in and is output simulating . Let . (It is a singleton set as is deterministic.) Hence corresponds to on common string . Similarly, each vertex corresponds to on some string ending with . Let the clique corresponding to be , and we have . But that is a contradiction. ∎
Lemma (restated).
Proof.
For any string , let the vertex reached by string in be . Then must belong to at least one clique in , where all vertices in this clique can be viewed as merged into a new vertex in . Hence, reaches at least one vertex in and this vertex yields the same output . Since satisfies the auxiliary constraints , the induced filter must be deterministic, since no vertex has nondeterministic outgoing edges bearing the same label. Because is deterministic, , reaches a single vertex in . In addition, this vertex in shares the same output . Therefore, also output simulates . ∎
Appendix B Reduction from mcca to SAT
To find the minimum clique cover for problem mcca, we first introduce the mcca problem, which is to find a clique cover with no more than cliques. Each mcca problem is encoded as a SAT problem and solved by an off-the-shelf SAT solver. Next, we initialize as the number of states in the input filter and decrease it until no solution can be found for the mcca problem.
Firstly, the mcca problem is formalized as follows:
Problem: Minimum clique cover with auxiliary constraints (mcca) Input: A compatibility graph , a set of auxiliary constraints Aux, maximum number of cliques Output: A clique cover with no more than cliques on that satisfies all auxiliary constraints in Aux
Next, we will represent the clique cover as choices that assign each vertex in the compatibility graph to a clique , with . To represent these choices, we create a Boolean variable to represent the fact that is assigned to clique , and its negation to represent its inverse. The clique cover is captured by such variables.
A clique cover for problem mcca should guarantee that each vertex in is assigned to at least one clique, i.e.,
(1) 
For simplicity, we will use “” and “” for logic or, “” and “” for logic and, “” for logic equivalence.
Let the edges of the compatibility graph be . Then for any pair of disconnected vertices in , i.e., and , they should never be assigned to the same clique, i.e.,
(2) 
To satisfy each auxiliary constraint (with and ), if is assigned to a clique (), then there must exist another clique (), such that all vertices in are assigned to clique , i.e.,
Let and . Then
For each , we have
Therefore,
(3)  
To solve mcca, we need to leverage off-the-shelf SAT solvers to find an assignment of the variables such that the formulas in (1), (2) and (3) are satisfied. For any vertex in the compatibility graph, we create variables. For each auxiliary constraint, we need to create variables. Suppose that there are vertices in the input filter and auxiliary constraints. Then, we need variables. Similarly, (1) gives us one clause for each vertex, (2) gives us at most clauses in total, and (3) gives for each auxiliary constraint. Hence, the number of clauses to solve a mcca is .
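Clause families (1) and (2) can be emitted as follows (a sketch with our own names; family (3), which needs the auxiliary variables described above, is omitted for brevity). Variables are numbered DIMACS style, so a negative literal is the negation of the corresponding positive one:

```python
def mcca_cnf_core(vertices, edges, k):
    """CNF sketch for clause families (1) and (2) of the reduction.
    The positive integer var[(v, j)] encodes 'vertex v is assigned to
    clique j'; a negated literal is its additive inverse (DIMACS style)."""
    vs = sorted(vertices)
    var = {(v, j): i * k + j + 1 for i, v in enumerate(vs) for j in range(k)}
    clauses = []
    for v in vs:
        # (1): each vertex must be assigned to at least one clique
        clauses.append([var[(v, j)] for j in range(k)])
    for a in vs:
        for b in vs:
            if a < b and frozenset((a, b)) not in edges:
                # (2): disconnected (incompatible) vertices never share a clique
                for j in range(k):
                    clauses.append([-var[(a, j)], -var[(b, j)]])
    return var, clauses
```

The clause counts match the text: one clause per vertex for (1), and one clause per incompatible pair per clique for (2).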
To find the minimum solution for mcca, we solve mcca with equal to the number of states in the input filter, and then decrease until we cannot find a solution for mcca (or the SAT solver times out).