History-Preserving Bisimulations on Reversible Calculus of Communicating Systems

04/27/2018, by Clément Aubert et al.

History- and hereditary history-preserving bisimulation (HPB and HHPB) are equivalence relations for denotational models of concurrency. Finding their counterpart in process algebras is an open problem, with some partial successes: there exists in the calculus of communicating systems (CCS) an equivalence based on causal trees that corresponds to HPB. In Reversible CCS (RCCS), there is a bisimulation that corresponds to HHPB, but it considers only processes without auto-concurrency. We propose equivalences on CCS with auto-concurrency that correspond to HPB and HHPB, and their so-called "weak" variants. The equivalences exploit not only reversibility but also the memory mechanism of RCCS.


1 Introduction

Reversing Concurrent Computation

Implementing reversibility in a programming language often requires recording the history of the execution. Ideally, this history should be complete, so that every forward step can be unrolled, and minimal, so that only the relevant information is saved. Concurrent programming languages have a third requirement: the history should be distributed, to avoid centralization of information. To fulfill those requirements, Reversible CCS [6, 7] uses memories attached to the threads of a process.
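To make the three requirements concrete, here is a minimal sketch (in Python, not RCCS syntax) of the underlying idea: each thread keeps its own undo log, so the recorded history is complete (every step can be undone), minimal (one record per step) and distributed (no shared, global log). All names are illustrative.

```python
class ReversibleThread:
    """Illustrative only: a thread that records, locally, how to undo each of its steps."""

    def __init__(self, state):
        self.state = state
        self.memory = []                      # local history, most recent record first

    def do(self, label, update, undo):
        self.memory.insert(0, (label, undo))  # remember how to reverse this step
        self.state = update(self.state)

    def undo_last(self):
        label, undo = self.memory.pop(0)      # the most recent step is undone first
        self.state = undo(self.state)
        return label

# A thread evolves and rolls back using only its own memory, with no centralized log.
t = ReversibleThread(0)
t.do("inc", lambda s: s + 1, lambda s: s - 1)
assert t.state == 1 and t.undo_last() == "inc" and t.state == 0
```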

Equivalences for Reversible Processes

A theory of reversible concurrent computation relies not only on a syntax, but also on “meaningful” behavioral equivalences. In this paper we study behavioral equivalences defined on configuration structures [14], which are denotational models of concurrency. In configuration structures, an event represents an execution step, and a configuration—a set of events that have occurred—represents a state. A forward transition is then represented as moving from a configuration to one of its supersets, and backward transitions have a “built-in” representation: it suffices to move from a configuration to one of its subsets. Many behavioral equivalences have been defined for configuration structures; some of them, like history- and hereditary history-preserving bisimulations (HPB and HHPB), use that “built-in” notion of reversibility.
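As a concrete illustration (a hypothetical finite example, with events written as their labels), forward and backward transitions are simply moves to supersets and subsets within the set of configurations:

```python
# Configuration structure of two independent events 'a' and 'b' (the process a | b).
configurations = {frozenset(), frozenset({"a"}), frozenset({"b"}), frozenset({"a", "b"})}

def forward_moves(x):
    """States reachable by going forward from x: strict supersets of x in the structure."""
    return {y for y in configurations if x < y}

def backward_moves(x):
    """States reachable by backtracking from x: strict subsets of x in the structure."""
    return {y for y in configurations if y < x}

assert forward_moves(frozenset({"a"})) == {frozenset({"a", "b"})}
assert backward_moves(frozenset({"a"})) == {frozenset()}
```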

Encoding Reversible Processes in Configuration Structures

Figure 1: The encoding of , of the future and of the past of a reversible .

An ongoing research effort [1, 10] is to transfer equivalences defined in denotational models, which are by construction adapted for reversibility, back into the reversible process algebra. Of course, showing that an equivalence on configuration structures corresponds to one on RCCS processes depends on the encoding of RCCS terms into configuration structures. One such encoding uses the fact that we are typically interested only in reachable reversible processes—processes that can backtrack to a process with an empty memory, called their origin. Then, a natural choice is to consider —the encoding of the origin of , using the common mapping for CCS processes [14]—and to identify in it the configuration corresponding to the current state of the reversible process. In this set-up, the encoding of is one configuration, , in the configuration structure : every configuration “below” is the “past” of , every configuration “above”, its “future” (Fig. 1).
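The sketch below (hypothetical names; origin a.b, having already performed a) illustrates that reading: the current state is a single configuration inside the encoding of the origin, its past is everything below it and its future everything above it.

```python
origin_encoding = {frozenset(), frozenset({"a"}), frozenset({"a", "b"})}  # encodes a.b
current = frozenset({"a"})        # the reversible process has already performed a

past   = {y for y in origin_encoding if y <= current}   # states it can backtrack to
future = {y for y in origin_encoding if current <= y}   # states it can still reach

assert past   == {frozenset(), frozenset({"a"})}
assert future == {frozenset({"a"}), frozenset({"a", "b"})}
```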

Contribution

This paper improves on previous results by defining relations on CCS processes that correspond to HPB, HHPB, and their “weak-function” variants. The result does not require restricting to a particular class of processes. We introduce an encoding of memories independent of the rest of the process and show that, as expected, the “past” of a process corresponds to the encoding of its memory. The memories attached to a process are no longer only a syntactic layer used to implement reversibility, but become essential for defining equivalences. This result gives insight into the expressiveness of reversibility, as the back-and-forth moves of a process are not enough to capture HHPB.

Related work

The correspondence between HHPB and back-and-forth bisimulations for processes without auto-concurrency [1, 10] motivated some of the work presented here. Our approach shares similarities with causal trees—in the sense that we encode only part of the execution in a denotational representation—for which some bisimulations correspond to HPB [8].

Outline

We start by recalling the definitions of configuration structures (Sect. 2.1), of the encoding of CCS in configuration structures (Sect. 2.2), of (hereditary) history-preserving bisimulations (Sect. 2.3), of RCCS (Sect. 2.4) and of related notions. We also recall a previous result on HHPB (Theorem 2.1). We assume the reader is familiar with CCS, in particular with its congruence relations and reduction rules.

Sect. 3 starts by defining a structure slightly richer than configuration structures, which we call “identified configuration structures” (Sect. 3.1), and defines basic operations on them. Sect. 3.2 defines, and illustrates with numerous examples, how identified configuration structures can encode memories. Finally, Sect. 3.3 uses this encoding to define relations on RCCS and CCS processes that are then stated to correspond to HPB and HHPB on configuration structures.

Sect. 4 concludes, and Appendix 0.A gathers the proofs and establishes the robustness of the tools introduced.

2 Preliminary Definitions

We recall the definitions of configuration structures, auto-concurrency (Sect. 2.1), how to encode CCS processes into configuration structures (Sect. 2.2) and the history-preserving bisimulations (Sect. 2.3).

We write the set inclusion, the power set, the set difference, the cardinal, the composition of functions, the set of functions between and , the partial functions and the restriction of to .

Let be a set of names and its co-names. The complement of a (co-)name is given by a bijection , whose inverse is also denoted by . We write for a list of names . We define the sets of labels , let, and use (resp. ) to range over (resp. ).

2.1 Configuration Structures

Definition 1 (Configuration structures)

A configuration structure is a tuple where is a set of events, is a set of labels, is a labeling function and is a set of subsets satisfying:

(Finiteness)
(Coincidence Freeness)
(Finite Completeness)
(Stability)

We denote the configuration structure with , and write and for such that .

For the rest of this paper, we often omit , let range over configurations, and assume that we are always given and , for .
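Since the formal statements of the four conditions are not reproduced above, the following sketch checks them in the standard form they take for finite stable configuration structures in the literature (finiteness is immediate for finite sets); treat it as an illustration rather than the paper's exact definition.

```python
from itertools import combinations

def is_configuration_structure(C):
    """C is a finite family of frozensets of events; the checks below are the
    standard conditions, assumed here since the displayed formulas are missing."""
    for x in C:
        # Coincidence freeness: distinct events of x are separated by some
        # sub-configuration of x containing exactly one of them.
        for e, f in combinations(x, 2):
            if not any(z <= x and ((e in z) != (f in z)) for z in C):
                return False
    for x, y in combinations(C, 2):
        bounded = any(x <= z and y <= z for z in C)
        # Finite completeness (checked here for pairs): configurations bounded
        # above are closed under union.
        if bounded and (x | y) not in C:
            return False
        # Stability: configurations bounded above are closed under intersection.
        if (x | y) in C and (x & y) not in C:
            return False
    return True

assert is_configuration_structure(
    {frozenset(), frozenset({"a"}), frozenset({"b"}), frozenset({"a", "b"})})
```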

Definition 2 (Causality, Concurrency, and Maximality)

For and , the causality relation on is given by iff and , where iff for all with , we have . The concurrency relation on  [9, Definition 5.6] is given by iff . Finally, is maximal if , or .
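The formulas of Definition 2 did not survive this rendering; the sketch below implements the standard reading (an event causes another in a configuration when every sub-configuration containing the latter also contains the former), which is enough for the examples that follow. Names are illustrative.

```python
def causes(C, x, e, f):
    """e is a cause of f in configuration x: every z in C with z ⊆ x that
    contains f also contains e (standard reading, not the paper's notation)."""
    return e in x and f in x and all(e in z for z in C if z <= x and f in z)

def concurrent(C, x, e, f):
    """e and f are concurrent in x: distinct and causally unrelated."""
    return e != f and not causes(C, x, e, f) and not causes(C, x, f, e)

C = {frozenset(), frozenset({"a"}), frozenset({"a", "b"})}      # encodes a.b
assert causes(C, frozenset({"a", "b"}), "a", "b")               # a causes b
assert not concurrent(C, frozenset({"a", "b"}), "a", "b")
```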

Figure 2: Examples of configuration structures (three diagrams, labeled (a), (b) and (c))
Example 1

Consider the configuration structures of Fig. 2, where the sets of events and of configurations can be read from the diagrams, and where we abuse notation by writing the events as their labels (with a subscript if multiple events have the same label). Note that two events with complementary names can happen at the same time (Fig. 2), in which case they are labeled with τ and called a silent transition, as is usual in CCS (Sect. 2.2).

Definition 3 (Category of configuration structures)

We define the category of configuration structures, where objects are configuration structures, and a morphism from to is a triple such that

  • preserves labels: , for ;

  • is defined as .

If there exists an isomorphism , then we write .

We omit the part of the morphisms when it is the identity morphism.

We now recall how process algebra constructors are defined on configuration structures [13]. The definition below may seem technical, but Definition 6 should make it clear that these operations capture the right notion.

This definition uses the product of the category of sets and partial functions [13, Appendix A]: letting denote undefined for a partial function, for a set , we define, for two sets and ,

with and .

Definition 4 (Operations on configuration structures [1, 12])


The product

of and is . Define the projections and the configurations such that:

The labeling function is

The relabeling

of along is .

The restriction

of to is , where and . The restriction of to a name is where . For a list of names, we define similarly for .

The parallel composition

of and is , with

  • is the product;

  • with defined as follows:

  • , where .

The coproduct

of and is , where and . The labeling function is defined as when .

The prefixing

of by the name is , for where , ; and , .
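As an illustration of the two simplest operations (with events written as their labels, and a hypothetical "a~" standing for the co-name of a, since the paper's symbols are not rendered above), prefixing adds a fresh minimal event and restriction keeps only the configurations that avoid a name and its co-name.

```python
def prefix(C, a):
    """a.C : a fresh event (named after its label a) is put below every configuration."""
    return {frozenset()} | {x | {a} for x in C}

def restrict_name(C, label_of, a):
    """C\\a : keep only the configurations whose events use neither a nor its co-name."""
    forbidden = {a, a + "~"}
    return {x for x in C if all(label_of(e) not in forbidden for e in x)}

C = {frozenset(), frozenset({"b"})}                       # encodes the process b
assert prefix(C, "a") == {frozenset(), frozenset({"a"}), frozenset({"a", "b"})}
assert restrict_name(C, lambda e: e, "b") == {frozenset()}
```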

Definition 5 (Auto-concurrency [9, Definition 9.5])

If , , and implies , then is without auto-concurrency.

Any configuration structure whose configurations have at most one event (like Fig. 2) is without auto-concurrency. Fig. 2, on the other hand, is a configuration structure with auto-concurrency: for we have that , , and yet .
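Concretely, auto-concurrency can be detected by looking for a configuration that contains two distinct concurrent events carrying the same label; the sketch below reuses the causality reading given after Definition 2 (event names are illustrative).

```python
from itertools import combinations

def has_auto_concurrency(C, label_of):
    """Negation of Definition 5: some configuration holds two distinct,
    concurrent, equally-labelled events."""
    def causes(x, e, f):
        return all(e in z for z in C if z <= x and f in z)
    return any(
        label_of(e) == label_of(f) and not causes(x, e, f) and not causes(x, f, e)
        for x in C for e, f in combinations(x, 2))

# a | a : two independent events labelled 'a' (the events "a1", "a2" are illustrative).
C = {frozenset(), frozenset({"a1"}), frozenset({"a2"}), frozenset({"a1", "a2"})}
assert has_auto_concurrency(C, lambda e: e[0])

# a.b : a single causal chain exhibits no auto-concurrency.
assert not has_auto_concurrency({frozenset(), frozenset({"a"}), frozenset({"a", "b"})},
                                lambda e: e)
```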

2.2 CCS and its Encoding in Configuration Structures

The set of CCS processes is inductively defined:

(CCS Processes)

In the category of configuration structures (Definition 3) one can “match” the process constructors of CCS with the categorical operations of Definition 4.

Definition 6 (Encoding a CCS process [15, p. 57])

Given a CCS process , its encoding as a configuration structure is built inductively:

From now on we assume that all structures use the same set of labels .

Definition 7 (Auto-concurrency in CCS)

A process is without auto-concurrency if is.

2.3 (Hereditary) History-Preserving Bisimulations

HPB [11, 10], [2, Theorem 4] and HHPB [3, Definition 1.4], [2, Theorem 1] are equivalences on configuration structures that use label- and order-preserving bijections between the events of the two configuration structures.

Definition 8 (Label- and order-preserving functions)

A function , for , is label-preserving if for all . It is order-preserving if , for all .
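A possible implementation of Definition 8 for finite configurations, with the bijection given as a dictionary and causality supplied by the caller (for instance the `causes` sketch above); it checks order preservation in both directions, as needed for the bisimulations below. Illustrative only.

```python
def is_label_order_preserving(f, x, y, label1, label2, causes1, causes2):
    """f maps the events of x to the events of y; labels and the causal order
    (restricted to x and y respectively) must be preserved."""
    if len(x) != len(y) or set(f) != set(x) or set(f.values()) != set(y):
        return False                                   # f is not a bijection from x to y
    if any(label1(e) != label2(f[e]) for e in x):
        return False                                   # f is not label-preserving
    return all(causes1(d, e) == causes2(f[d], f[e])    # f preserves and reflects causality
               for d in x for e in x)
```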

Definition 9 (Hpb and Hhpb)

A relation such that , and if , then is a label- and order-preserving bijection between and and (1) and (2) (resp. (1)–(4)) hold is called a history- (resp. hereditary history-) preserving bisimulation between and .

(1)
(2)
(3)
(4)

Note that the bijection on events is preserved from one step to the next. This condition can be weakened, and we call the corresponding relations the weak-function HPB and weak-function HHPB [3, Definition 1.4], [10, Definition 3.11]. (The names weak-HPB and weak-HHPB are more common [3, 10], but can be confused with the weak equivalences of process algebra, which refer to ignoring τ-transitions.)

Definition 10 (wfHpb)

A weak-function history-preserving bisimulation between and is a relation such that and if , then is a label- and order-preserving bijection between and and

Similarly one defines wfHHPB. If there is an HPB between and , we just write that and are HPB, and similarly for HHPB, wfHPB and wfHHPB.
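For finite structures, these clauses can be checked directly on a candidate relation given as triples (x, y, f), with f a dictionary. The sketch below assumes the usual formulation of clauses (1)–(4), since the displayed clauses are not reproduced above: forward one-event steps are matched by extending f, backward steps by restricting it; it also takes for granted that each f has already been checked to be a label- and order-preserving bijection (e.g. with the previous sketch). Dropping the last two checks yields HPB instead of HHPB.

```python
def check_hhpb(R, C1, C2):
    """R is a set of triples (x, y, f); True when R satisfies the (assumed
    standard) HHPB clauses between the finite structures C1 and C2."""
    if (frozenset(), frozenset()) not in {(x, y) for x, y, _ in R}:
        return False            # the pair of empty configurations must be related
    for x, y, f in R:

        def agrees(x2, f2):
            # f2 coincides with f on the events that x2 and x have in common.
            return all(f2[e] == f[e] for e in x2 & x)

        # (1) every one-event forward step from x is matched from y, extending f.
        for x2 in C1:
            if x < x2 and len(x2) == len(x) + 1 and not any(
                    y < y2 and agrees(x2, f2) for x3, y2, f2 in R if x3 == x2):
                return False
        # (2) symmetrically, every forward step from y is matched from x.
        for y2 in C2:
            if y < y2 and len(y2) == len(y) + 1 and not any(
                    x < x2 and agrees(x2, f2) for x2, y3, f2 in R if y3 == y2):
                return False
        # (3) and (4), the hereditary clauses: backward steps are matched,
        # keeping the restriction of f.
        for x2 in C1:
            if x2 < x and len(x2) == len(x) - 1 and not any(
                    y2 < y and agrees(x2, f2) for x3, y2, f2 in R if x3 == x2):
                return False
        for y2 in C2:
            if y2 < y and len(y2) == len(y) - 1 and not any(
                    x2 < x and agrees(x2, f2) for x2, y3, f2 in R if y3 == y2):
                return False
    return True
```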

Figure 3: There is no HHPB relation between and .
Example 2

Observe that and , presented in Fig. 3, are HPB, but not HHPB. Any HHPB relation would have to associate the maximal configurations of the two structures and to construct a bijection: taking wouldn’t work, since can backtrack on and cannot. Taking the other bijection, , fails too, since can backtrack on and cannot.

Example 3 ([9])

The processes and are another example of processes whose encodings are HPB but not HHPB.

Example 4

Finally, observe that and are not structurally congruent in CCS, but the encodings of the two processes are HHPB.

2.4 Reversible CCS and Coherent Memories

Let be a set of identifiers, and range over elements of . The set of RCCS processes is built on top of the set of CCS processes (Sect. 2.2):

(Memory Events)
(Memory Stacks)
(Reversible Thread)
(RCCS Processes)

We denote (resp. , ) the set of identifiers occurring in (resp. , ), and always take . A structural congruence can be defined on RCCS terms [1, Definition 5]; the only rule we will use here is the distribution of memory: . We also note that , for any reversible process and for some names , memories and CCS processes , writing the -ary parallel composition.
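Since the grammar above lost its symbols in this rendering, here is an illustrative data representation of memories: a memory stack as a list (most recent event first), memory events ⟨i, λ, Q⟩ as records, ⋎ as a fork marker, together with the distribution rule just quoted. This is a sketch, not the authors' syntax.

```python
from dataclasses import dataclass
from typing import List, Tuple, Union

@dataclass(frozen=True)
class MemEvent:            # a memory event ⟨i, λ, Q⟩
    ident: int             # the identifier i of the transition
    label: str             # the label λ that was consumed
    alternative: str       # the discarded branch Q of the sum

@dataclass(frozen=True)
class Fork:                # the fork marker ⋎ added when a memory is distributed
    pass

Memory = List[Union[MemEvent, Fork]]

def distribute(m: Memory) -> Tuple[Memory, Memory]:
    """Distribution of memory: when m ⊳ (P ∣ Q) splits into two threads,
    both threads receive the memory m guarded by a fork marker."""
    return [Fork()] + m, [Fork()] + m

left, right = distribute([MemEvent(1, "a", "0")])
assert left == right == [Fork(), MemEvent(1, "a", "0")]
```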

Forward rules:
  (act.)  m ⊳ (λ.P + Q) → ⟨i, λ, Q⟩.m ⊳ P
  (syn.)  from R → R′ and S → S′, derive R ∣ S → R′ ∣ S′

Backward rules:
  (act.)  ⟨i, λ, Q⟩.m ⊳ P ⇝ m ⊳ (λ.P + Q)
  (syn.)  from R ⇝ R′ and S ⇝ S′, derive R ∣ S ⇝ R′ ∣ S′

Rules for both directions (the arrow standing for either a forward or a backward transition):
  (par.)  from R → R′, derive R ∣ S → R′ ∣ S
  (res.)  from R → R′ with a ∉ α, derive R\a → R′\a
  (≡)     from R₁ ≡ R, R → R′ and R′ ≡ R₁′, derive R₁ → R₁′

Figure 4: Rules of the labeled transition system

The labeled transition system for RCCS is given by the rules of Fig. 4. We use as a wildcard for (forward) or (backward) transitions, and if there are indices and labels such that , then we write . If there is a CCS term such that , we say that is reachable, that is the origin of  [1, Lemma 1] and write . Similarly to what we did in Definition 7, we will write that is without auto-concurrency if is. An example of the execution of a reversible process is given at the beginning of 7.

Note that we cannot work up to α-renaming of the identifiers: concurrent, distributed computation is about splitting threads between independent units of computation. If one of the units were to re-tag a memory event as , and another were to try to backtrack on the memory event , then the trace of the synchronization would be lost, and backtracking made impossible. Since we don’t want to keep a “global index”, which would go against the benefits of distributed computation, the only option is to forbid α-renaming of identifiers.

Memory coherence [6, Definition 1] was defined for RCCS processes with less structured memory events (i.e., without identifiers), but can be adapted.

  (em.)   ∅ ⌢ ∅
  (ev.)   from m ⌢ m′, derive e.m ⌢ m′
  (syn.)  from m ⌢ m′, derive ⟨i, λ, P⟩.m ⌢ ⟨i, λ̄, Q⟩.m′
  (fo.)   from m ⌢ ∅, derive ⋎.m ⌢ ⋎.m

Figure 5: Rules for the coherence relation
Definition 11 (Coherence relation)

Coherence, written , is the smallest symmetric relation on memory stacks such that the rules of Fig. 5 hold.

Note that is not reflexive, and hence not an equivalence, nor anti-reflexive. For the rest of this paper, we will just write “memory” for “memory stack”.

Definition 12 (Coherent processes, [6, Definition 2])

An RCCS process is coherent if all of its memories are pairwise coherent, or if its only memory is coherent with .

We require the memory to be coherent with to make it impossible to have as a memory in a coherent process.

Lemma 1 ([7, Lemma 5])

If and is coherent then so is .

Corollary 1

For every reachable and , occurs once in .

Note that the property above holds for reversible threads, and not for RCCS processes in general: indeed, we actually want memory events to have the same identifiers if they result from a synchronization or a fork.

Definition 13 (Back-and-forth bisimulation)

A back-and-forth bisimulation in RCCS is a relation such that if

Example 5

The processes and are in a back-and-forth bisimulation.

Theorem 2.1 ([1, Theorem 2],[10])

Back-and-forth bisimulation on RCCS processes without auto-concurrency corresponds to HHPB on their encoding.

Note that this result does not hold for the processes in Example 5 (see also Example 2). This does not contradict the theorem, as the processes in Example 5 are with auto-concurrency.

3 Lifting the Restrictions

To define a bisimulation on RCCS (with auto-concurrency) that corresponds to HHPB we first have to encode the memories of a reversible process into a structure similar to the configuration structures, called identified configuration structures (Sect. 3.1). We can then define the encoding (Sect. 3.2), and the equivalences in RCCS that use this encoding of memories (Sect. 3.3).

3.1 Identified Configuration Structures

Definition 14 (Identified configuration structure)

An identified configuration structure, or -structure, is a configuration structure endowed with a set of identifiers and a function such that,

(Collision Freeness)

We call the underlying configuration structure of and write . We write for the identified configuration structure with .

For the rest of this paper, we omit , and assume that we are always given and , for .

Example 6

Fig. 2, with , , and , is a -structure. Note that it is possible to have fewer identifiers than events: take and , and .
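An illustrative representation: an identified configuration structure is a configuration structure together with a map from events to identifiers, and collision freeness is read here as requiring distinct identifiers for distinct events of a common configuration. Since the formal condition above is not rendered, this reading is an assumption, consistent with the possibility of having fewer identifiers than events.

```python
from itertools import combinations

def is_collision_free(C, ident_of):
    """Assumed reading of collision freeness: within any single configuration,
    two distinct events never carry the same identifier."""
    return all(ident_of(e) != ident_of(f)
               for x in C for e, f in combinations(x, 2))

# Two mutually exclusive events may reuse an identifier: fewer identifiers than events.
C = {frozenset(), frozenset({"a"}), frozenset({"b"})}
assert is_collision_free(C, {"a": 1, "b": 1}.get)

# Reusing an identifier inside one configuration is a collision.
C2 = {frozenset(), frozenset({"a"}), frozenset({"a", "b"})}
assert not is_collision_free(C2, {"a": 1, "b": 1}.get)
```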

For the following remark, we need to suppose that every configuration structure is endowed with a total ordering on its events.

Remark 1

Every configuration structure can be mapped to a -structure.

The mapping is trivial: take to be , and define to follow the ordering on . Note that, in this case, is a bijection.

Definition 15 (Category of -structures)

We define the category of identified configuration structures, where objects are -structures, and a morphism from to is a triple such that

  • is a morphism in from to ;

  • preserves identifiers: .

We denote the forgetful functor.

Definition 16 (Operations on -structures)


The product

of and is :

  • is the product in the category of configuration structures with projections ;

  • , for , is defined as

    with the projections .

Define the projections as the pair .

The relabeling

of along is