1 Introduction
We describe a compilation language for representing boolean functions which can be viewed as a generalization of several concepts appearing in the literature. A boolean function is represented using a structure consisting of a monotone circuit satisfying the decomposability property, whose inputs, called leaves, are associated with propagation complete (PC) or unit refutation complete (URC) encodings of some simpler functions. We call this structure a backdoor decomposable monotone circuit (BDMC), because it is a generalization of backdoor trees introduced in [25]. This structure also generalizes other concepts appearing in the literature. A DNNF [9] and a disjunction of URC encodings [3] are both special cases of BDMCs. We distinguish two versions: a PCBDMC, which has PC encodings in the leaves, and a URCBDMC, which has URC encodings in the leaves. If we consider circuits with only one node, we obtain that PCBDMC sentences generalize PC formulas introduced in [4] and URCBDMC sentences generalize URC formulas introduced in [11]. Since the sizes of URC and PC encodings are polynomially related by Theorem 1 in [2], the same is true for URCBDMCs and PCBDMCs. The main result of this paper is that we can compile a PCBDMC or URCBDMC into a PC or URC encoding of size polynomial with respect to the total size of the input BDMC.
Combining the results of [5] or [7] with the fact that both DNNFs and PC encodings are special cases of PCBDMCs, we obtain that the language of PCBDMCs is strictly more succinct than the language of DNNFs. We also present an example of a CNF formula such that every backdoor tree with respect to the base class of renamable Horn formulas has exponential size, although the function can be represented by a DNNF sentence, and hence also by a URCBDMC or PCBDMC sentence, of linear size.
A smooth DNNF can be compiled into a propagation complete encoding of linear size with respect to the size of the DNNF by techniques described in [16, 12]. We generalize this result to a more general structure, where the leaves contain URC or PC encodings instead of single literals and smoothness is not required. On the other hand, the method of the transformation is different from the method used in [16, 12] and the size of the output is not bounded by a linear function of the size of the input, although it is still polynomial.
The authors of [3] studied properties of unit refutation complete encodings and proved, in particular, that the disjunction closure can be computed in polynomial time for unit refutation complete encodings. Our result generalizes this in two directions. We describe a polynomial time transformation of an arbitrary URCBDMC sentence, which is a more general structure built on top of a collection of URC encodings than a disjunction, into a single URC encoding. Moreover, our approach generalizes to PCBDMCs and propagation complete encodings instead of unit refutation complete encodings. Similarly to [3], our construction uses a Tseitin transformation of the BDMC in the first step and then simulates the unit propagation under conditions represented by additional literals. In particular, for a URCBDMC sentence that is a disjunction of URC encodings, the construction is essentially the same as in [3] up to the naming of the variables.
Consider a boolean function which is represented by a CNF formula. Since backdoor trees are a special case of BDMCs, our result implies that the size of a PC encoding of the function can be parameterized by the size of a smallest backdoor tree with base classes of Horn, renamable Horn, or 2CNF formulas.
Let us consider a CNF formula representing a boolean function. It is known that the size of a DNNF representing the function can be parameterized by the incidence treewidth of the formula ([6, 23]). It follows from the construction described in [12] that the size of a PC encoding of the function can be parameterized by the incidence treewidth as well. The previous paragraph implies that the size of a PC encoding can also be parameterized by the size of a backdoor tree with some of the base classes listed above.
2 Definitions and Notation
In this section we recall definitions and notation used throughout the text.
2.1 CNF Encoding
We work with formulas in conjunctive normal form (CNF formulas). Namely, a literal is a variable x (positive literal) or its negation ¬x (negative literal). If x is a variable, then we denote lit(x) = {x, ¬x}. If x = (x₁, …, xₙ) is a vector of variables, then we denote by lit(x) the union of lit(xᵢ) over i = 1, …, n. For simplicity, we write xᵢ ∈ x if xᵢ is a variable that occurs in x, so x is considered as a set here, although the order of the variables in x is important. Given a literal l, the term var(l) denotes the variable in the literal l, that is, var(l) = x for l ∈ {x, ¬x}. A clause is a disjunction of a set of literals which does not contain a complementary pair of literals. A formula is in conjunctive normal form (CNF) if it is a conjunction of a set of clauses. A k-CNF formula consists only of clauses of length at most k. We treat a clause as a set of literals and a CNF formula as a set of clauses. In particular, |C| denotes the number of literals in a clause C and |φ| denotes the number of clauses in a CNF formula φ. We denote the length of a CNF formula φ by ‖φ‖ = Σ_{C ∈ φ} |C|.
A clause is Horn if it contains at most one positive literal; it is definite Horn if it contains exactly one positive literal. A definite Horn clause ¬x₁ ∨ ⋯ ∨ ¬xₖ ∨ y represents the implication x₁ ∧ ⋯ ∧ xₖ → y and we use both kinds of notation interchangeably. The set of variables in the assumption of a definite Horn clause is called its source set, and the variable y is called its target. A (definite) Horn CNF formula consists only of (definite) Horn clauses. Consider a CNF formula φ on variables x and a boolean vector a ∈ {0, 1}ⁿ. The renaming of φ according to a is defined as the formula obtained from φ by replacing each occurrence of a literal on a variable xᵢ, such that aᵢ = 1, by the complementary literal. We say that φ is renamable Horn if there is a renaming a such that the renamed formula is a Horn formula. Such a renaming can be found in linear time [14, 18] by reducing the problem to the satisfiability of a specific 2CNF formula.
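The renaming operation and the Horn test above can be illustrated by a minimal Python sketch. The integer encoding of literals (a negative integer is the complemented variable) and all function names are our own conventions, not the paper's notation; note that this only checks a *given* renaming, whereas the linear-time algorithms of [14, 18] find one via a 2CNF satisfiability test.

```python
def rename(clauses, flipped):
    """Apply a renaming: complement every literal whose variable is in `flipped`.

    Literals are nonzero integers; a negative integer is the complemented variable."""
    return [frozenset(-l if abs(l) in flipped else l for l in c) for c in clauses]

def is_horn(clauses):
    """A CNF formula is Horn if every clause has at most one positive literal."""
    return all(sum(1 for l in c if l > 0) <= 1 for c in clauses)

def is_renamable_horn_under(clauses, flipped):
    """Check whether the given renaming turns the formula into a Horn formula."""
    return is_horn(rename(clauses, flipped))
```

For example, the clause x₁ ∨ x₂ is not Horn, but flipping x₁ yields the Horn clause ¬x₁ ∨ x₂.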
A partial assignment of variables is a subset of that does not contain a complementary pair of literals, so we have for each . By we denote the formula obtained from by the partial setting of the variables defined by . We identify a set of literals (in particular a partial assignment) with the conjunction of these literals if is used in a formula such as . If is a vector of variables, a mapping is called a full assignment of values to . We identify a full assignment with the vector of its values and it can be viewed as a special case of a partial assignment, however, in some cases we need to differentiate between these two notions.
In this paper we consider encodings of boolean functions defined as follows.
Definition 2.1 (Encoding).
Let f(x) be a boolean function on variables x = (x₁, …, xₙ). Let φ(x, y) be a CNF formula on variables x and y = (y₁, …, yₘ), where x ∩ y = ∅. We call φ(x, y) a CNF encoding of f if for every a ∈ {0, 1}ⁿ we have
(1)  f(a) = 1 ⟺ (∃b ∈ {0, 1}ᵐ) φ(a, b) = 1.
The variables in x and y are called input variables and auxiliary variables, respectively.
2.2 Propagation and Unit Refutation Complete Encodings
We are interested in encodings which are propagation complete or at least unit refutation complete. These notions rely on unit resolution, which is a special case of general resolution. We say that two clauses C₁ and C₂ are resolvable if there is exactly one literal l such that l ∈ C₁ and ¬l ∈ C₂. The resolvent of these clauses is then defined as (C₁ ∖ {l}) ∪ (C₂ ∖ {¬l}). If one of C₁ and C₂ is a unit clause, we say that the resolvent is derived by unit resolution from C₁ and C₂. We say that a clause C can be derived from φ by unit resolution (or unit propagation) if C can be derived from φ by a series of unit resolutions. We denote this fact by φ ⊢₁ C. The notion of propagation complete CNF formulas was introduced in [4] as a generalization of unit refutation complete CNF formulas introduced in [11]. We use the following more general notions of propagation complete and unit refutation complete encodings. Let us point out that unit refutation complete encodings are denoted by URCC in [3].
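As an illustration of the relation ⊢₁ used throughout, the following minimal Python sketch repeatedly applies unit resolution to a clause set and reports the derived unit literals and whether the empty clause (a contradiction) was reached. The integer-literal clause representation is our own convention, not the paper's.

```python
def unit_propagate(clauses):
    """Repeatedly apply unit resolution to a set of clauses.

    Clauses are frozensets of integer literals (negation = sign flip).
    Returns (derived_literals, contradiction_flag)."""
    derived = set()
    changed = True
    while changed:
        changed = False
        for clause in clauses:
            # Remove literals falsified by already derived unit clauses.
            reduced = [l for l in clause if -l not in derived]
            if not reduced:
                return derived, True      # empty clause: contradiction derived
            if len(reduced) == 1 and reduced[0] not in derived:
                derived.add(reduced[0])   # new unit consequence
                changed = True
    return derived, False
```

On the formula x₁ ∧ (¬x₁ ∨ x₂) ∧ (¬x₂ ∨ x₃), for instance, this derives x₂ and x₃ without a contradiction.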
Definition 2.2.
Let f(x) be a boolean function on variables x = (x₁, …, xₙ). Let φ(x, y) be a CNF encoding of f with input variables x and auxiliary variables y.

We say that φ(x, y) is a unit refutation complete encoding (URC encoding) of f if the following equivalence holds for every partial assignment α ⊆ lit(x):
(2)  φ(x, y) ∧ α ⊨ ⊥ ⟺ φ(x, y) ∧ α ⊢₁ ⊥.
We say that φ(x, y) is a propagation complete encoding (PC encoding) of f if for every partial assignment α ⊆ lit(x) and for each l ∈ lit(x), such that
(3)  φ(x, y) ∧ α ⊨ l, we have
(4)  φ(x, y) ∧ α ⊢₁ l.
Note that the definition of a propagation complete encoding is less restrictive than requiring that formula is propagation complete as defined in [4]. The difference is that in a PC encoding we only consider literals on input variables as assumptions and consequences in (4). The definition of a propagation complete formula [4] assumes that is the function represented by , so we do not distinguish input and auxiliary variables and the implication from (3) to (4) is required for the literals on all the variables.
It was shown in [1] that a prime 2CNF formula is always propagation complete, thus the same holds for 2CNF encodings. On the other hand, Horn and renamable Horn formulas are unit refutation complete [11].
In some cases it is advantageous to have a PC or URC encoding in k-CNF for a fixed constant k. Given a CNF encoding of a boolean function, we obtain a k-CNF encoding of the same function by the standard technique used to transform a CNF formula into a k-CNF formula. Namely, we split the long clauses and use new variables to link the parts together. It is not hard to see that if this technique is applied to a PC or URC encoding, we obtain a PC or URC encoding, respectively. In order to refer to this property later, we formulate it as a lemma.
Lemma 2.3.
Let φ be a CNF encoding of a function f and let k ≥ 3 be a constant. Then there is a k-CNF encoding ψ of f whose number of variables, number of clauses, and length are linear in those of φ. Moreover, if φ is a PC (or URC, resp.) encoding of f, then so is ψ.
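The splitting technique behind the lemma can be sketched as follows. This is a hypothetical Python rendering (integer literals, fresh auxiliary variables allocated from `next_var`); the paper does not fix any concrete representation.

```python
def split_clauses(clauses, k, next_var):
    """Split clauses longer than k into a chain of clauses of length <= k,
    linking the parts with fresh auxiliary variables.

    `clauses` is a list of lists of integer literals; `next_var` is the first
    unused variable index. Returns (new_clauses, next_var)."""
    out = []
    for clause in clauses:
        lits = list(clause)
        while len(lits) > k:
            head, rest = lits[:k - 1], lits[k - 1:]
            aux = next_var
            next_var += 1
            out.append(head + [aux])   # (l1 v ... v l_{k-1} v z)
            lits = [-aux] + rest       # the literal -z links to the remainder
        out.append(lits)
    return out, next_var
```

For example, a clause of length 5 is split into three clauses of length at most 3 using two fresh linking variables; the result encodes the same function over the original variables.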
2.3 DNNF
Let us briefly recall the notion of DNNF [9].
Definition 2.4.
A sentence in NNF is a rooted, directed acyclic graph (DAG) where each leaf node is labeled with a literal xᵢ or ¬xᵢ, or with a constant 0 or 1, where x = (x₁, …, xₙ) is a set of input variables. Each internal node is labeled with ∧ or ∨ and can have arbitrarily many children.
Assume D is an NNF with input variables x and nodes N₁, …, Nₛ. We always assume that the inputs of a gate precede it in the list of nodes. Hence, if Nᵢ is an input to Nⱼ, then i < j. For every i = 1, …, s, let var(Nᵢ) denote the set of input variables from which the node Nᵢ is reachable by a directed path. Each node Nᵢ represents a function on the variables var(Nᵢ). Given this notation we can now define the language of DNNF sentences as follows.
Definition 2.5.
We say that an NNF is decomposable (a DNNF) if every AND gate with inputs N_{i₁}, …, N_{iₖ} satisfies that var(N_{i₁}), …, var(N_{iₖ}) are pairwise disjoint. In other words, the inputs to each AND gate have pairwise disjoint sets of variables on which they depend syntactically.
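Decomposability is a purely syntactic condition and can be verified in one bottom-up pass over the DAG. The following Python sketch assumes nodes are given in topological order (children preceding parents), as in the text; the tuple-based node representation is our own.

```python
def check_decomposable(nodes):
    """Check the decomposability property of an NNF given in topological order.

    Each node is ('leaf', variable) or ('and'|'or', [child indices]),
    with children preceding their parents in the list."""
    varsets = []
    for kind, data in nodes:
        if kind == 'leaf':
            varsets.append({data})
        else:
            children = [varsets[i] for i in data]
            if kind == 'and':
                # Inputs of an AND gate must have pairwise disjoint variable sets:
                # the union is as large as the sum of sizes iff no variable repeats.
                total = sum(len(s) for s in children)
                if len(set().union(*children)) != total:
                    return False
            varsets.append(set().union(*children) if children else set())
    return True
```

An AND of leaves x₁ and x₂ passes the check, while an AND of two leaves labeled with the same variable fails it.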
2.4 Backdoor Trees
We first recall the concept of backdoor sets introduced in [27, 26]. As a base class for a backdoor set we consider a class of CNF formulas for which the satisfiability and the membership problems can be solved in polynomial time. Let C be a base class and let φ be a CNF formula on variables x. Then a set of variables B ⊆ x is a strong C-backdoor set of φ if for every full assignment of the variables in B the restricted formula is a formula in C. Finding smallest strong Horn backdoor sets and strong 2CNF backdoor sets is fixed-parameter tractable with respect to the size of a smallest backdoor set [21]. Other classes of CNF formulas were considered as base classes in the literature; let us mention backdoors to heterogeneous classes of SAT [13].
Backdoor sets were generalized in [25] to backdoor trees. A splitting tree is a rooted binary tree where every node which is not a leaf has exactly two child nodes. Each non-leaf vertex is labeled with a variable and the two edges leaving it are labeled with the two values of this variable. No variable appears more than once on a single path from the root to a leaf. The notion of a splitting tree is closely related to the notion of a decision tree, which is obtained by assigning constants to the leaves of a splitting tree and represents a boolean function in a natural way. A splitting tree T represents a set of restrictions of a formula φ in such a way that each leaf u of T is assigned the formula φ|α, where α is the partial assignment defined by the labels of the edges on the path from the root to u. Assume φ is a CNF formula on variables x. A backdoor tree of φ is a splitting tree T on a set of variables which satisfies that for every leaf u the formula φ|α belongs to the base class, where α is the partial assignment associated with the leaf u. We denote by leaves(T) the number of leaves in T. The size of T is defined so that it is comparable with the sizes of backdoor sets. In particular, it was observed in [25] that if s is the size of a smallest strong backdoor set of a CNF formula φ, then the number of leaves and the size of a smallest backdoor tree of φ can be bounded in terms of s. It was shown in [25] that finding a backdoor tree of a given size is fixed-parameter tractable for the base classes of Horn and 2CNF formulas.
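The formulas φ|α attached to the leaves are obtained by the usual restriction operation: drop clauses satisfied by α and delete falsified literals from the rest. A minimal Python sketch (integer literals, a partial assignment given as a set of literals; both conventions are ours):

```python
def restrict(clauses, assignment):
    """Restriction phi|alpha: drop satisfied clauses, delete falsified literals.

    `clauses` is a list of frozensets of integer literals; `assignment` is a
    set of literals forming a partial assignment."""
    out = []
    for clause in clauses:
        if any(l in assignment for l in clause):
            continue                     # clause satisfied by alpha
        out.append(frozenset(l for l in clause if -l not in assignment))
    return out
```

For example, restricting (x₁ ∨ x₂) ∧ (¬x₁ ∨ x₃) by the assignment x₁ = 1 drops the first clause and shrinks the second to the unit clause x₃.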
3 Backdoor Decomposable Monotone Circuits
In this section we introduce a language of backdoor decomposable monotone circuits (BDMC) which consists of sentences formed by a combination of a decomposable monotone circuit with CNF formulas from a suitable base class at the leaves. For presenting the transformation of a BDMC into a PC or URC encoding, we use the base classes of PC or URC encodings, which are the largest possible for the presented proofs. These classes admit a polynomial time satisfiability test. However, the corresponding membership tests are coNP-complete, since it is coNP-complete to check if a formula is URC [8, 17] or PC [1]. For this reason, when the complexity of algorithms searching for a BDMC for a given function is under consideration, we assume that a suitable subclass with a polynomial time membership test is used. However, this is outside the scope of this paper, although we prove some results concerning renamable Horn BDMCs in this section.
Definition 3.1 (Backdoor Decomposable Monotone Circuit).
Let be a base class of CNF encodings. A sentence in the language of backdoor decomposable monotone circuits with respect to base class (BDMC) is a triple where

is a rooted, directed acyclic graph with the set of nodes .

Each internal node of the circuit is labeled with ∧ or ∨ and can have arbitrarily many children.

is a function assigning each leaf of a CNF encoding from class of a function , .

For each node , let be the union of for all leaves , such that there is a path from to . In particular, if is a gate with inputs , then .

Each node represents a function . For inner nodes, the function is given by first evaluating the leaves and then the circuit rooted at . Since the function depends only on the variables in , it can also be written as , if needed.

The function represented by is the function represented by its root.

Nodes labeled by satisfy the decomposability property: If for a set of indices , then the sets of variables are pairwise disjoint.
Given a BDMC with nodes, we always assume that the children of a node precede it in the list. In particular, if a node is a child of another node, then it has a smaller index. The last node is then the root of the circuit. Given two different leaves with associated CNF encodings, we always assume that the sets of auxiliary variables of the encodings in different leaves are pairwise disjoint. We can make this assumption without loss of generality, as it can be achieved by renaming the auxiliary variables if this is not the case.
In Section 4 we consider the language BDMC with equal to the class of PC encodings and in Section 5 we consider the language BDMC with equal to the class of URC encodings.
Note that a decision node in a splitting tree can be rewritten as a disjunction of two decomposable conjunctions. Consequently, backdoor trees with respect to any base class form a special case of BDMCs. On the other hand, Theorem 3.5 implies that if is the class of renamable Horn formulas, then the size of BDMC can be exponentially smaller than the size of a backdoor tree.
By the results of [5] and [7], there are classes of monotone CNF formulas such that the DNNF size of every formula in the class is superpolynomial in the size of the formula. In particular, [7] presents a class with the above property consisting of monotone 3CNF formulas and [5] presents a class consisting of monotone 2CNF formulas. In both cases, the proof of existence of the corresponding class is non-constructive. Every irredundant monotone CNF formula is in prime implicate form, which means that it is formed by all the prime implicates of the represented function. Such a formula is clearly propagation complete, see [1] for more detail. Together with the known fact that PC encodings are at least as succinct as DNNFs, the lower bounds on DNNF size from [5] and [7] imply the following.
Corollary 3.2.
The language of PC encodings is strictly more succinct than the language of DNNF sentences.
The language of PCBDMCs and also the language of 2CNFBDMCs contains the language of DNNFs as a subset consisting of BDMCs with the literals in the leaves. Hence, the lower bound on DNNF size from [5] implies also the following.
Corollary 3.3.
The language of PCBDMCs and even the language of 2CNFBDMCs is strictly more succinct than the language of DNNF sentences.
Let us also point out that Theorem 4.6 proven below implies the following.
Proposition 3.4.
The languages of PC encodings and of PCBDMC sentences are equally succinct.
Proof.
PC encodings are a special case of PCBDMC with one node. The opposite direction follows from Theorem 4.6. ∎
One of the reasons for introducing the language of URCBDMC sentences is that it provides an alternative way of compilation of a CNF, if the splitting process used for the compilation into a DNNF leads to too large a structure. If the target structure is a URCBDMC, then a branch can be closed not only if it leads to a literal or a constant, but also if it leads to a Horn or renamable Horn formula. This can be recognized in polynomial time and, if all the leaves of the obtained structure satisfy this, we have an instance of a URCBDMC which can be compiled into a URC or PC encoding instead of a DNNF by the results of Sections 4 and 5.
Let us consider BDMCs where the base class is the class of renamable Horn formulas, and let us compare their succinctness with that of backdoor trees with respect to the same base class. When using a backdoor tree as a representation of a function, the whole structure consists of the backdoor tree itself and the original formula. However, since we prove a lower bound on the size of the representation and the original formula has polynomial size in the number of the variables, it is sufficient to formulate the bound in terms of the number of the leaves of the backdoor tree.
Theorem 3.5 below can be viewed as a stronger version of the second part of Proposition 9 in [25] reformulated for comparing renamable Horn BDMCs to renamable Horn backdoor trees. In the proof, we use the same construction as the authors of [25], however, the obtained lower bound is larger.
Theorem 3.5.
For every divisible by , there is a boolean function of variables with the following properties:

is expressible by a CNF formula of size ,

is expressible by a renamable Horn BDMC and even a DNNF of size ,

for every CNF formula representing , every backdoor tree for with respect to the base class of renamable Horn formulas has at least leaves.
Proof.
We use the same construction as the one which is used in the proof of Proposition 9 in [25]. Given , define for each
and let us consider the function on variables defined by
For any , it can be easily checked that the function represented by is not renamable Horn, however, it can be expressed by a DNF of size . If is replaced by this DNF for each , the formula becomes a DNNF of size for and it can be interpreted also as a renamable Horn BDMC for of size with the literals in the leaves.
Let us prove that any renamable Horn backdoor tree of any CNF formula equivalent to contains at least nodes. Consider a backdoor tree with respect to which has renamable Horn formulas in the leaves. We prove that every leaf of is visited by at most satisfying assignments of . Since has satisfying assignments, the tree has at least leaves. Consider a leaf with an associated partial assignment . One can prove that either changes to the zero function for at least one index or fixes at least one variable in for every . In the first case, the leaf is not visited by any satisfying assignment of . In the second case, the leaf is visited by a set of satisfying assignments each of which is a combination of satisfying assignments of for each . Moreover, all the elements of can be obtained by selecting for each at most different satisfying assignments of consistent with and considering all of the combinations of these assignments. It follows that as required. ∎
4 PC Encoding of a PCBDMC
In this section we describe a construction of a PC encoding of a function which is represented by a PCBDMC. The construction uses the following two elements: formulas in the leaves are encoded using a variant of the well-known dual rail encoding, and a Tseitin encoding is then used to propagate values of literals from the leaves to the root.
Consider a PCBDMC representing a function f. We also use the additional notation introduced in Definition 3.1. In Section 4.1, we introduce the metavariables used in the construction. In Section 4.2, we describe the dual rail encoding in the form which we use. In Section 4.3, we describe the construction of a PC encoding of a given PCBDMC. Finally, in Section 4.4, we estimate the size of a PC encoding obtained by the construction.
4.1 Metavariables
The well-known dual rail encoding uses new variables representing the literals on the variables of the original encoding. In addition to this, we associate a special variable with the contradiction. These new variables will be called metavariables and denoted as follows. The metavariable associated with a literal will be denoted , the metavariable associated with the contradiction will be denoted , and the set of the metavariables corresponding to a vector of variables will be denoted
In the next subsection, we describe the dual rail encoding using the metavariables in this form. For notational convenience, we extend the above notation also to sets of literals that are meant as a conjunction, especially to partial assignments. If α is a set of literals, then we denote the set of metavariables associated with the literals in α, thus
If this set of metavariables is used in a formula, we identify it with the conjunction of its elements, similarly as a partial assignment is interpreted in a formula.
In order to construct a PC encoding from a PCBDMC in Section 4.3, we first construct a definite Horn formula representing derivations of literals on the input and auxiliary variables. These derivations have to be done separately in each node of the circuit. Hence, besides the metavariables described above, we also use copies of the metavariables in each of the nodes, denoted as follows. For every node and every literal, we denote the metavariable associated with the literal in the node. For every leaf, we moreover consider metavariables associated with the literals on its auxiliary variables. Using this notation, the set of auxiliary variables used in the construction is as follows:
(5) 
4.2 Dual rail encoding
The construction of the formula in Section 4.3 starts with forming the well-known dual rail encoding [2, 3, 15, 19] for the formulas in the leaves. The dual rail encoding transforms an encoding of a function into a Horn formula simulating the unit propagation in the original encoding.
The dual rail encoding presented in (6) below represents unit resolution in a general formula using definite Horn clauses on the metavariables. More precisely, the first type of Horn clauses represents, as a single step, the derivation of a literal from a clause and the negations of all the remaining literals in this clause. The second type of Horn clauses represents the derivation of a contradiction from two complementary literals. Unit propagation can also derive a contradiction using a clause and the negations of all the literals in it. We do not include Horn clauses representing this, since the formula (6) is used only if all clauses are nonempty. In this case, the direct derivation of the contradiction can be replaced by deriving one of the literals in the clause; together with the metavariable of the complementary literal, we obtain a contradiction in the next step.
Definition 4.1 (Dual rail encoding).
Let be an arbitrary CNF formula. If contains the empty clause, then . Otherwise, the dual rail encoding is the definite Horn formula on metavariables defined as follows.
(6) 
Written as a CNF we get
(7) 
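A possible concrete rendering of the two groups of definite Horn clauses described above is the following Python sketch. Representing the metavariable of a literal by the literal itself and the contradiction metavariable by 0, and representing each definite Horn clause as a (source set, target) pair, are our own conventions, not the paper's notation.

```python
def dual_rail(clauses, variables):
    """Definite Horn implications of the dual rail encoding of a CNF formula.

    The metavariable of a literal l is represented by l itself and the
    contradiction metavariable by 0. Each implication is a pair
    (frozenset_of_sources, target). Returns None if the formula contains
    the empty clause (the encoding is then defined differently)."""
    if frozenset() in clauses:
        return None
    horn = []
    for clause in clauses:
        for l in clause:
            # Negations of the remaining literals of the clause derive l.
            horn.append((frozenset(-e for e in clause if e != l), l))
    for v in variables:
        # A complementary pair of literals derives the contradiction.
        horn.append((frozenset({v, -v}), 0))
    return horn
```

For the single clause x₁ ∨ x₂, this produces the implications "¬x₂ derives x₁", "¬x₁ derives x₂", and the contradiction rules for both variables.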
The following lemma captures the basic property of the dual rail encoding. We omit the proof, since it is well known, although different authors use different notation for the variables representing the literals, and the contradiction is frequently represented by an empty set and not by a specific literal. An application of the dual rail encoding with an explicit representation of the contradiction can be found, for example, in the first part of the proof of Theorem 1 in [2]. The notation in [2] relates to the notation in this paper for a variable by the identities , , and for the contradiction by the identity .
Lemma 4.2.
Let be a CNF not containing the empty clause and let . Then for every we have
(8) 
We use the dual rail encoding of the PC encoding associated with a leaf of a PCBDMC using the metavariables specific to the node . To differentiate between dual rail encodings associated with different leaves, we introduce the following notation: We denote the dual rail encoding of formula which uses metavariables in place of for .
4.3 Constructing the Encoding
Table 1 describes a set of Horn clauses which together form a Horn formula . To simplify the presentation of clauses of group 1, we use shortcuts as described in the table.
group  clause  condition 

Clauses for a leaf node ,  
(g1)  
(g2)  
(g3)  
Clauses for node  
(g4)  
(g5)  
(g6)  
Clauses for node  
(g7)  
(g8)  
Additional clauses for the root node  
(g9) 
The formula is a definite Horn formula. We use this formula to derive positive literals by unit propagation when presented with only positive literals in the assumption. Such a form of unit propagation is also called forward chaining and we sometimes use this notion when we want to express that unit propagation is used in the above sense. By the following theorem, proven at the end of this subsection, forward chaining in the formula derives exactly the implied literals.
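Forward chaining in a definite Horn formula can be sketched as follows: starting from a set of given positive literals, repeatedly fire any implication whose whole source set has already been derived. The (source set, target) representation of definite Horn clauses is our own convention.

```python
def forward_chaining(implications, facts):
    """Forward chaining in a definite Horn formula.

    `implications` is a list of (frozenset_of_sources, target) pairs and
    `facts` the initially given positive literals. Returns the set of all
    derived literals (a least fixed point)."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for sources, target in implications:
            if target not in derived and sources <= derived:
                derived.add(target)      # fire the rule: all sources derived
                changed = True
    return derived
```

For example, from the facts a and c and the rules a → b and b ∧ c → d, forward chaining derives b and then d.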
Theorem 4.3.
For every and , we have
(9) 
Given a Horn formula satisfying the equivalence (9), we can form a PC encoding of by simply substituting metavariables in with the respective literals or based on the following proposition.
Lemma 4.4.
Let be obtained from by substituting metavariable with for all . Then is a PC encoding of .
Proof.
By Theorem 4.3, satisfies the equivalence (9). First, assume a full assignment , such that , and let us prove that the formula is satisfiable. Let be the set of literals on variables from satisfied by . Since , we have by (9) that
It follows that does not derive for any . Indeed, assume for some . By (9) we get . Since , clearly and thus together we have which is a contradiction.
Consider the assignment of values to the variables obtained by setting 1 to all the variables derived by forward chaining in the formula and setting 0 to all the remaining variables. Clearly, we have . Moreover, for every , we have and . It follows that we can construct an assignment of the variables that agrees with on the variables and satisfies for all . In particular, extends . Every clause of is satisfied by . Let us consider the following cases.

If is satisfied by a literal on a variable from , this literal is unchanged by the substitution and satisfies also the corresponding clause of using .

If is satisfied by the literal , the clause is removed from by the substitution.

If is satisfied by a literal , where , then and contains satisfied by and .

If is satisfied by a literal , where , then and contains satisfied by and .
It follows that is satisfiable. In order to prove that is an encoding of , it remains to prove that it is unsatisfiable, if . This is a consequence of propagation completeness proven below.
Let and , such that . By (9) we have
(10) 
We prove that either
(11) 
or
(12) 
by the following argument. Let us fix a minimal forward chaining derivation of either , or from . Let , , be the sequence of positive literals on the metavariables in the order given by the fixed derivation. In particular and for . Let , , be obtained from by the substitution given by the assumption, i.e. if for some , then , otherwise . In particular is either or . Let us prove by induction over that either for all
(13) 
Let be the Horn clause of used to derive . Note that by the choice of the derivation, does not contain in its tail. Let be the set of literals obtained from by the substitution from the assumption. If contains in its head, then the head becomes and is skipped in . If contains complementary literals and , then one can verify by case inspection that , contains negative literals and , and both of the literals and occur in the sequence , . In this case, the corresponding literals are and and we obtain (12) by unit propagation from them. If does not contain complementary literals, it is a clause of . The clause is a Horn clause with the head and if its tail is nonempty, it contains negations of some literals with indices . By induction hypothesis (13), the literals can be derived from the formula before and unit propagation using derives . Note that is either a literal included in or and besides , contains only negations of previously derived literals. Altogether, we obtain (13) or (12) implying (11) or (12). It follows that is a PC encoding of . ∎
The main step of the proof of Theorem 4.3 is the following lemma proven by induction.
Lemma 4.5.
Proof.
Let us first assume that is a leaf, i.e. . Assume first (14). Since is a PC encoding of , we have by (4) that or . By Lemma 4.2 and clauses in groups 1 to 1 we get that .
Assume now (15). Due to acyclicity of we have that (15) implies that only clauses in groups 1 to 1 for leaf are needed in the derivation of . Clauses of group 1 can only be used to propagate to for each literal in . Thus, we have
It follows that or . By Lemma 4.2 we get that or . Since is a PC encoding of , we get that as required.
Let us now assume that and let us assume that the equivalence between (14) and (15) holds for nodes . Assume first (14) holds for . Since is decomposable, it follows that for some . If actually , we get by induction hypothesis that . Using clauses of groups 1 and 1 we get (15). Otherwise we have that and . By induction hypothesis we get that . Using the appropriate clause of group 1 we get (15).
Let us now assume (15). The only clauses which can be used to derive from are in groups 1 to 1. By inspecting these clauses we get that there is satisfying that or . By induction hypothesis we get that . Thus as well and we get (14).
Finally, let us assume that and let us assume that the equivalence between (14) and (15) holds for nodes . Assume first (14) holds for . It follows that for every . By induction hypothesis and using the convention that in case we get that for every . Using clause 1 (if ) or the appropriate clause from group 1 (if ) we get (15).
We are now ready to show Theorem 4.3.
Proof of Theorem 4.3.
Let us first assume that . Since we get by Lemma 4.5 that . Using the appropriate clause from group 1 we get that .
Let us on the other hand assume that and let us show that . For this purpose, consider the following set of literals and :
Considering the fact that we get by Lemma 4.5 that
In particular, and if , then . Denote by the set of literals derived by forward chaining from . Using the fact that is not the target of any of the clauses in , we have that if and only if which is equivalent to . Clearly, for every we have that if and only if . Together with Lemma 4.5 we thus have for every that the following four conditions are equivalent
This implies that if the source set of a clause of group 1 is contained in , then also its target is in . It follows that is the set of literals derived from . Together with the assumption , we obtain . By the equivalences above, as required. ∎
4.4 Size Estimate
The main result of this section is contained in the following theorem.
Theorem 4.6.
Let be a PCBDMC sentence representing function with input variables . Assume that has nodes with leaves and edges. Let us denote the PC encoding of function associated with a leaf . Let us further denote the total number of auxiliary variables, the total length of all PC encodings associated with the leaves of , and the maximum length of a clause in any of the encodings associated with the leaves of . Then has a PC encoding satisfying
(16)  
(17)  
(18) 