Interactive coding resilient to an unknown number of erasures

We consider distributed computations between two parties carried out over a noisy channel that may erase messages. Following a noise model proposed by Dani et al. (2018), the noise level observed by the parties during the computation in our setting is arbitrary and a priori unknown to the parties. We develop interactive coding schemes that adapt to the actual level of noise and correctly execute any two-party computation. Namely, in case the channel erases T transmissions, the coding scheme will take N+2T transmissions (using an alphabet of size 4) to correctly simulate any binary protocol that takes N transmissions assuming a noiseless channel. We can further reduce the communication to N+T if we relax the communication model in a way similar to the adaptive setting of Agrawal et al. (2016), and allow the parties to remain silent rather than transmitting a message in each and every round of the coding scheme. Our coding schemes are efficient, deterministic, have linear overhead both in their communication and round complexity, and succeed (with probability 1) regardless of the number of erasures T.





1 Introduction

Consider two remote parties that use a communication channel in order to perform some distributed computation. One main obstacle they are facing is possible noise added by the channel. In the early 90’s, Schulman [Sch92, Sch96] initiated the field of interactive coding where the parties use coding techniques in order to complete their computation despite possible noise.

The channel may introduce different types of noise. Among the common noise types are substitution noise, where the channel changes the content of messages (e.g., it flips communicated bits); insertions and deletions, where the channel may introduce a new message or completely remove a transmission; and erasures, where the channel erases transmissions, i.e., replaces them with an erasure mark. Throughout the last several years many interactive coding schemes were developed, allowing parties to perform computations over the various channels and noise types, e.g., [BR14, EGH16, SW17]; see the related work below and [Gel17] for a survey.

Naturally, some bound on the noise must be given. For instance, in the case of substitution noise, a noise fraction of 1/4 is maximal [BR14]. That is, as long as the noise rate is below 1/4 of the communication, the coding scheme will succeed. However, if the noise exceeds this level, any coding scheme for a general computation is bound to fail. Similarly, a noise rate of 1/4 is maximal (and achievable) for insertion and deletion noise [BGMO17, SW17], and a noise rate of 1/2 is maximal (and achievable) for erasure noise [FGOS15, EGH16].

In the above coding schemes, the noise level is limited in advance to be below the allowed noise rate. That is, the coding scheme is given as input a noise threshold, say, a fraction ε of the communication for some small ε > 0. Given this parameter and the length N of the computation to be performed, the coding scheme determines how many transmissions it should take in order to perform the computation, say, N′ transmissions. It is then guaranteed that as long as the noise corrupts at most a fraction ε of the N′ transmissions, the computation succeeds. In a recent work, Dani et al. [DHM18] considered the case where the noise amount may be arbitrary and a priori unknown to the scheme. That is, the channel may corrupt up to some fixed amount T of noise, where T is independent of the other parameters of the scheme and the computation to be performed. The work of Dani et al. [DHM18] considered substitution noise and showed a scheme that succeeds with high probability and, if the channel corrupts T transmissions during the execution of the scheme, completes within a number of transmissions that exceeds N by an additive term depending on T.

In this work we focus on erasure noise. Such noise naturally appears in many practical situations, e.g., when an Ethernet packet gets corrupted but the corruption is detected by the CRC checksum mechanism [IEE16]. In this case the packet is considered invalid and is dropped. This situation is equivalent to an erasure of that transmission.

Our main result is an efficient and deterministic coding scheme that is resilient to an arbitrary and a priori unknown number of erasures T, succeeds with probability 1, and takes N + 2T transmissions to complete. The result is summarized in the following theorem.

Theorem 1.1 (main, informal).

Given any two-party binary interactive protocol π of length N, there exists a coding scheme Π that simulates π over a 4-ary channel that introduces an arbitrary and a priori unknown number T of erasures. Furthermore, it holds that CC(Π) ≤ 2N + 4T bits.

We emphasize that the above communication complexity assumes that π communicates bits, while Π communicates symbols from an alphabet of size 4. Note that, symbol-wise, our scheme communicates N + 2T symbols.

Since the amount of noise, T, is unknown to begin with, the length of the coding scheme must adapt to the actual noise that occurs during the simulation. Such coding schemes are called adaptive [AGS16]. Adaptivity raises several issues that must be dealt with appropriately. The main issue is termination. Since the coding scheme adapts its length to the observed noise, and since the different parties observe different noise patterns, their termination is not necessarily synchronized. In particular, it is possible that one party terminates while the other party continues to send (and receive) transmissions as dictated by the coding scheme. See Appendix A for an elaborated discussion of unsynchronized termination. In this case, the model should clearly specify what happens in those rounds where only one party is active and the other has terminated. Specifically, it should specify what messages the active party receives in this case.

Following [AGS16], we define a special symbol we call silence. Our setting assumes that silence is (implicitly) communicated by a terminated party. That is, the still active party hears silence until it terminates. We note that silence is corruptible—the channel may erase transmissions after a party terminates and the active party will see an erasure mark instead of a silence. On the other hand, these implicit silent transmissions are not considered part of the communication of the protocol.

Another subtlety in adaptive protocols regards this silence symbol: are the parties allowed to keep silent during the coding scheme? In the setting of Theorem 1.1 we do not allow the parties to keep silent—in each round in which a party is set to speak, it must communicate a valid message. However, in today’s networks, especially in networks with multiple parties, it is very common that parties send messages only if they have information to send, and keep silent otherwise. A similar approach was taken in [AGS16].

Our second result extends Theorem 1.1 to the setting of [AGS16], namely, where parties are allowed to either speak or remain silent at every round. In this case, termination becomes even more tricky. Recall that the coding scheme of Theorem 1.1 uses silence as an indicator for termination. In the AGS setting, we could avoid using silence throughout the protocol and use it towards termination only; however, this would effectively reduce the AGS setting to the one of Theorem 1.1, and lead to a suboptimal scheme.

Instead, we take a different approach that reduces the communication at the expense of not being able to identify termination at times. In particular, one of the parties may remain active indefinitely. However, that party will remain silent after the other party has terminated, and moreover, it will hold the correct output of the computation. We call this situation semi-termination (see also Definition 4.1). Our result for this setting is as follows.

Theorem 1.2 (AGS setting, informal).

Given any two-party interactive protocol π of length N, there exists a coding scheme Π in the AGS setting with semi-termination that simulates π over a channel that introduces an arbitrary and a priori unknown number T of erasures. Furthermore, the parties communicate at most N + T (non-silent) symbols.

Moreover, the round complexity of Π is O(N + T).

In this setting, it is important to also bound the round complexity. Although the parties may remain silent, so the communication is low, we do not want the protocol to run for a long time. As stated in the above theorem, the round complexity of the coding scheme Π is linear in the length of the noiseless protocol π and the noise level, namely, O(N + T).

1.1 Coding Scheme Overview

Erasure noise has two interesting properties we utilize towards our scheme. The first is that, if there was a corruption, the receiver is aware of this event; the second property is that, if there was no corruption, the received message is the correct one. Our scheme follows a technique by Efremenko et al. [EGH16], where the parties simulate the noiseless protocol π bit by bit. As long as there is no noise, they can carry out the computation identically to π. However, in the case of an erasure, the receiver needs to signal the other side that it did not receive the last message and request a retransmission. The main problem is that this request message may get erased as well, leaving both sides confused regarding what should be sent next.

In [EGH16] this issue is solved by adding to each message two bits that indicate the round number that is currently being simulated. It is proven that, although the parties may be simulating different rounds, the discrepancy in their round numbers is bounded by 1. Hence, a round number modulo 3 suffices in [EGH16] to indicate whether Alice is ahead, Bob is ahead, or they are at the same round.

Our scheme combines the above technique with a challenge-response technique employed by Dani et al. in [DHM18]. Our coding scheme is not symmetric, but rather Alice always begins a round by sending a “challenge” message, followed by Bob replying with a response. Alice then determines whether the challenge-response round was successful: if both messages were not erased, Alice would see the correct response from Bob and would deduce that both messages were received correctly. If Alice received an erasure, or if she received the wrong response, she would deduce that an erasure has occurred during this round and the round should be re-simulated.

Bob, similarly but not identically, gets the challenge message from Alice and verifies that it belongs to a new round (that is, the challenge differs from the previous round). If this is the case, he replies with the next bit. Otherwise, i.e., if Alice’s challenge was erased or if Bob receives the challenge of the previous round, he replies with the response of the previous round. Note that if Alice did send a new challenge and it was erased, she will now get the response of the previous round and realize there is a mismatch.

It is not too difficult to see that a single bit suffices for the “challenge”—the current simulated round number in π, modulo 2 (which we call the parity from now on). Since the scheme is not symmetric, it can never happen that Alice has advanced to the next round while Bob has not. The other case, where Bob is ahead of Alice by a single round, is still possible. Therefore, one bit of information suffices to distinguish between these two cases.

In more detail, Alice begins a round by sending Bob the next bit of the simulation of π, along with the parity of the round number (in π) of that bit. If Bob receives this message and the parity matches the round number he expects, he records the bit sent by Alice and replies with the next bit of π using the same parity. If this message reaches Alice correctly, she knows the round is over and advances to the next round. If Bob does not receive Alice’s message (i.e., it gets erased), or if the parity is incorrect, he replies with the bit of the previous round along with the parity of that round. Similarly, if Alice receives a message with a wrong parity or an erased message, she keeps re-simulating the same round until she gets the proper reply from Bob.

Note that a single erasure delays the progress of the simulation by a single round (2 transmissions). However, once there is a round in which both messages are not erased, the simulation correctly continues and the succeeding two bits of  are correctly simulated. That is, as long as there is noise, the simulation just hangs, and when the noise ceases, the simulation continues from exactly the same place it stopped.

Once Alice has completed simulating the last round of π, she quits. Recall that Bob is always ahead of Alice; thus, if Alice completed the simulation, so has Bob. After Alice terminates, Bob receives silence, unless it is erased by the channel. When Bob hears a silence, he learns that Alice has terminated and quits the scheme as well. The noise may delay Bob’s termination by corrupting the silence; however, once the noise is over, Bob will learn that the simulation has completed and will quit as well.
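To make the overview concrete, here is a minimal Python sketch of the challenge-response simulation described above. All names are ours and purely illustrative; the noiseless protocol is abstracted as a fixed bit-string transcript, the erasure pattern is a set of corrupted timesteps, and Bob’s final wait-for-silence phase is omitted:

```python
def simulate(pi_bits, erasures):
    """Simulate a noiseless alternating protocol whose transcript is
    `pi_bits` (Alice's bits at even indices, Bob's at odd indices)
    over a channel erasing the timesteps listed in `erasures`."""
    n_rounds = len(pi_bits) // 2          # one round = Alice's bit + Bob's reply
    TA, TB = [], []                       # partial transcripts
    kA = kB = 0                           # rounds each party has simulated
    sent = t = 0                          # transmissions and timestep counter
    while kA < n_rounds:
        # odd timestep: Alice sends (next bit, parity of her round)
        kA += 1
        a = pi_bits[2 * (kA - 1)]
        TA.append(a)
        msg = None if t in erasures else (a, kA % 2)   # None models an erasure
        sent += 1; t += 1
        # Bob accepts only a fresh round carrying the expected parity
        if msg is not None and msg[1] == (kB + 1) % 2:
            kB += 1
            TB.append(msg[0])
            b = pi_bits[2 * kB - 1]
            TB.append(b)
        else:
            b = TB[-1] if TB else 0       # retransmit his previous reply
        # even timestep: Bob replies (bit, parity of his round)
        reply = None if t in erasures else (b, kB % 2)
        sent += 1; t += 1
        # Alice: the round succeeded iff the reply parity matches her round
        if reply is not None and reply[1] == kA % 2:
            TA.append(reply[0])
        else:
            kA -= 1
            TA.pop()                      # re-simulate the same round
    return TA, TB, sent
```

Each erased transmission costs exactly one wasted round, i.e., two extra transmissions, matching the N + 2T bound of Theorem 1.1.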

Simulation in the AGS Scheme.

Our second scheme works in the AGS model, where parties are allowed to remain silent if they wish, and they can take advantage of this fact in order to communicate information in an optimized manner, and reduce communication complexity. Specifically, consider the above idea of challenge-response, where a party replies with the wrong response in order to indicate there was an error and a round should be re-simulated. In the AGS setting, we can use silence to signal this retransmission request.

The idea is as follows. Similar to the scheme above, Alice begins a round by sending her bit to Bob (along with the parity). If the transmission is received correctly, Bob replies with his next bit. If Bob receives an erasure instead, he remains silent. This signals Alice that an error has happened and that she should re-simulate the last round. Similarly, if Alice sees an erasure she keeps silent. In all other cases, i.e., receiving a silence or receiving a wrong parity, the parties re-transmit their last message as before.

The effect of a party keeping silent to request a retransmission is a reduction in the communication complexity. Note that each erasure causes the recipient to remain silent for one single round, instead of sending a message that indicates an erasure. Then, the simulation continues from the point it stopped. On the other hand, the analysis becomes slightly more difficult in this case, since silence may be erased as well, causing the other side to remain silent and signal there was an erasure. This may cause the first party to repeat its message while the other side should actually have re-sent its own message in order to advance the simulation. Luckily, this issue does not falsify the correctness of the simulation—as a result of sending the parity, this extra retransmission is simply ignored. Furthermore, it does not increase the communication, since this extra transmission happens only when multiple erasures have occurred.

When silence has a meaning of requesting a retransmission, we cannot use it anymore to indicate termination. Note that whatever Alice sends Bob to inform him she is going to terminate may get erased, so if Alice terminates, Bob will not be aware of this fact and will remain active. If Alice waits to hear a confirmation from Bob that he received the indication and learned that Alice is about to terminate—this confirmation may get erased. Bob never learns whether Alice has received his acknowledgment or not; then, if Bob assumes that Alice has terminated and terminates himself, it will be Alice who hangs in the protocol waiting for Bob’s confirmation, and so on. Our approach to this conundrum is to allow the parties not to terminate, as long as there exists a point in time beyond which the parties remain silent and both hold the correct output. In our scheme, Alice will actually terminate once she learns that the simulation is done² (recall that if Alice completed the simulation then Bob completed it as well, but the other direction does not necessarily hold). Bob’s actions in the final part of the protocol are slightly different from his normal behaviour. Once Bob has completed simulating π, but is unaware whether Alice has completed simulating π or not, he keeps silent, unless he hears a message from Alice that re-simulates the final round. In this case, he replies with his final bit. In all other cases he remains silent.

²It is possible to let Alice send Bob a “termination message” right before she quits in order to signal Bob she is about to terminate. In case this message is not erased, Bob will terminate as soon as he hears this special message. Otherwise, he will keep running the scheme but remain silent. In either case, the protocol satisfies the semi-termination requirements.
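The following Python sketch (again with our own, illustrative names) adapts the previous idea to the AGS setting: a party that sees an erasure keeps silent in its next slot, silence is interpreted as a retransmission request, and only non-silent transmissions are counted. Semi-termination is not modeled:

```python
ERASED, SILENCE = "erased", "silence"

def simulate_ags(pi_bits, erasures):
    """AGS-style variant over transcript `pi_bits`; `erasures` is the set
    of timesteps the adversary erases. `None` marks a silent slot."""
    n_rounds = len(pi_bits) // 2
    a = pi_bits[0]
    TA, TB, kA, kB = [a], [], 0, 0
    msgA = (a, 1)                      # Alice's current challenge: (bit, round parity)
    alice_out, last_reply = msgA, None
    sent = t = 0
    while kA < n_rounds:
        # odd timestep: Alice's slot
        rx = ERASED if t in erasures else (alice_out if alice_out is not None else SILENCE)
        sent += alice_out is not None  # count only non-silent transmissions
        t += 1
        if rx == ERASED:
            bob_out = None                       # silence asks Alice to retransmit
        elif rx == SILENCE or rx[1] != (kB + 1) % 2:
            bob_out = last_reply                 # retransmit last reply (or stay silent)
        else:                                    # fresh round: advance and reply
            kB += 1
            TB.append(rx[0])
            b = pi_bits[2 * kB - 1]
            TB.append(b)
            bob_out = last_reply = (b, kB % 2)
        # even timestep: Bob's slot
        ry = ERASED if t in erasures else (bob_out if bob_out is not None else SILENCE)
        sent += bob_out is not None
        t += 1
        if ry == ERASED:
            alice_out = None                     # keep silent in the next slot
        elif ry == SILENCE or ry[1] != (kA + 1) % 2:
            alice_out = msgA                     # retransmit the current challenge
        else:                                    # round kA + 1 completed
            kA += 1
            TA.append(ry[0])
            if kA < n_rounds:
                a = pi_bits[2 * kA]
                TA.append(a)
                alice_out = msgA = (a, (kA + 1) % 2)
    return TA, TB, sent
```

Here each erasure triggers at most one extra non-silent transmission, matching the N + T communication of Theorem 1.2.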

1.2 Related Work

The field of interactive coding was initiated by the seminal work of Schulman [Sch92, Sch96], focusing on two parties that communicate over a binary channel with either substitution noise (random or adversarial) or with erasure noise. Follow-up work (for substitution noise) developed coding schemes with optimal resilience [BR14, BE17, GH14], efficiency [Sch03, Bra12, GMS14, BKN14, GH14, GHK18], or good rate [KR13, Hae14, GHK18]. Coding schemes for different channels and noise types were developed in [Pan13, EGH16, GH17] for channels with feedback, in [BGMO17, SW17, HSV18, EHK18] for insertion and deletion noise, and in [BNT14, LNS18] for quantum channels. Interactive coding over channels that introduce erasure noise was explored in [Sch96, Pan13, FGOS15, GH17, EGH16, AGS16]. In particular, Efremenko et al. [EGH16] developed efficient coding schemes for optimal erasure rates, and Gelles and Haeupler [GH17] developed efficient coding schemes with optimal rate assuming a small fraction of erasures. Adaptive models were considered in [AGS16, GHS14, GH14]. See [Gel17] for a survey on the field of interactive coding.

Closest to our work is the work of Dani et al. [DHM18], who considered the case of an arbitrary noise amount that is unknown to the scheme. Their coding scheme assumes substitution noise which is oblivious to the randomness used by the parties as well as to the bits communicated through the channel. An AMD code [CDF08] is used to fingerprint each transmitted message, allowing the other side to detect corruptions with high probability. Aggarwal et al. [ADHS18] use similar techniques to develop a robust protocol for message transfer, assuming a bi-directional channel that suffers from an arbitrary (yet finite) and unknown number of bit flips. Aggarwal et al. [ADHS17] extended this setting to the multiparty case, where n parties, rather than two, perform a computation over a noisy network with an arbitrary and unknown amount of noise.

2 Model Definition

Standard notations.

For an integer n, we denote by [n] the set {1, 2, …, n}. All logarithms are taken to base 2. The concatenation of two strings s and t is denoted s ∘ t. We let ε denote the empty string.

Interactive computation.

In our setting, two parties, Alice and Bob, possess private inputs x and y, respectively, and wish to compute some predefined function f(x, y). The computation is performed by exchanging messages over a channel with a fixed alphabet Σ. The computation is specified by a synchronous interactive protocol π. An interactive protocol is a pair of algorithms that work in rounds and specify, for each round and each one of the parties, the following details: (1) which party speaks and which party listens in this round; (2) if the party is set to speak, which symbol to communicate; (3) whether the protocol terminates and, if so, what the output is.

Without loss of generality, we assume that the (noiseless) protocol π is alternating, i.e., Alice and Bob speak in an alternating manner: Alice sends the odd-indexed messages and Bob the even-indexed ones. If this is not the case, it can be made so while increasing the communication by at most a factor of two. We define a round of the noiseless protocol to be a sequence of two consecutive messages such that the first is sent by Alice and the second by Bob. For example, the first round consists of the first message sent by Alice and the subsequent message from Bob. In general, for i > 1, after i − 1 rounds have elapsed, the i-th round consists of the next message sent by Alice and Bob’s subsequent message. For the sake of convenience, we assume that the last message of the protocol is sent by Bob; this can be ensured by padding the protocol by at most one bit. Since the protocol is alternating, we can also think of rounds as being associated with timesteps. More formally, in the i-th round, Alice and Bob send messages corresponding to the (2i − 1)-st and the 2i-th timestep, respectively. We say that a protocol has length N if Alice and Bob exchange N messages in the protocol.

Given a specific pair of inputs (x, y), the transcript is the concatenation of all the messages received during the execution of π on (x, y).


The communication channel connecting Alice and Bob is subject to erasure noise. In each timestep, the channel accepts a symbol σ ∈ Σ and either outputs σ or a special erasure mark ⊥. The noise is assumed to be worst-case (adversarial), where up to T symbols may be replaced with erasure marks. The value of T is unbounded and unknown to the parties—their protocol should be resilient to any possible T. A noise pattern E is a bit-string that indicates the erasures in a given execution instance of the protocol; specifically, there is an erasure in the i-th round if and only if the i-th bit of E is 1. Given a specific instance where both parties terminate by round r, the number of erasures induced by E on that instance is the Hamming weight of E’s length-r prefix, i.e., the number of 1’s in that bit-string. We sometimes allow E to be of infinite length; however, our protocols will be resilient only against noise patterns with a bounded amount of noise (i.e., if E is infinite, then it is required to have an all-zero suffix; we call such noise finite noise).
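As a small illustration of this definition, the induced noise of a pattern on a given instance is simply the Hamming weight of the corresponding prefix (the function name below is ours, for illustration):

```python
def induced_noise(pattern, last_round):
    """Number of erasures induced by a 0/1 noise pattern on an instance in
    which both parties have terminated by round `last_round`: the Hamming
    weight of the pattern's length-`last_round` prefix."""
    return sum(pattern[:last_round])

# Example: a pattern with erasures in rounds 2 and 5.
E = [0, 1, 0, 0, 1, 0, 0, 0]
assert induced_noise(E, 4) == 1   # only the round-2 erasure counts
assert induced_noise(E, 8) == 2
```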

Coding scheme: Order of speaking, silence and termination.

We say that a coding scheme Π simulates the protocol π over an erasure channel with T corruptions if, for every pair of inputs (x, y), both parties output the transcript of π on (x, y) when executing Π on (x, y) in the presence of up to T erasures.

We assume that, at a given timestep, exactly one party is the sender and the other party is the receiver; that is, it is never the case that both parties send a symbol or both listen during the same timestep. On the other hand, we do not assume that the parties terminate together, and it is possible that one party terminates while the other does not. In this case, whenever the party that has not yet terminated is set to listen, it hears some default symbol, which we call silence.

There are several ways to treat silence. One option, taken in [AGS16], is to treat silence similarly to any other symbol of Σ. That is, as long as a party has not terminated and is set to speak, it can either send a symbol or remain silent at that round. We take this approach in the scheme of Section 4. A different approach is to require the parties to send a valid symbol from Σ as long as they have not terminated. That is, a party cannot remain silent if it is set to speak; this prevents the parties from using silence as a means of communicating information during the protocol. Once a party terminates, and only then, silence is heard by the other party. This mechanism makes it easier for the parties to coordinate their termination. In particular, the event of termination of one party transfers (limited) information to the other party. We take this approach in the scheme of Section 3.

Another subtlety stems from the fact that the length of the protocol is not predetermined. That is, the length of the protocol depends on the actual noise in the specific instance. Such protocols are called adaptive [AGS16]. In this case it makes sense to measure properties of the protocol with respect to a specific instance. For example, given a specific instance of the protocol on inputs (x, y) with some given noise pattern, the communication complexity is the number of symbols sent by both parties in that instance. The communication is usually measured in bits, by multiplying the number of symbols by log |Σ|. The noise in a given instance is defined to be the number of corrupted transmissions until both parties have terminated, including corruptions that occur after one party has terminated and the other party has not. Corruptions made after both parties have terminated cannot affect the protocol, and we may assume such corruptions never happen.

3 A coding scheme for an unbounded number of erasures

In this section we provide a coding scheme that takes any noiseless protocol π and simulates it over a channel that suffers from an unbounded and unknown number of erasures T. The coding scheme uses an alphabet Σ of size 4; in this setting parties are not allowed to be silent (unless they quit the protocol) and in every round they must send a symbol from Σ.

3.1 The Coding Scheme

The coding scheme for Alice and Bob, respectively, is depicted in Algorithms 1 and 2.

Inspired by the simulation technique of [EGH16], our simulation basically follows the behavior of the noiseless protocol π step by step, where Alice and Bob speak in alternating timesteps. In each timestep, the sending party tries to extend the simulated transcript by a single bit. To this end, the parties maintain partial transcripts TA for Alice and TB for Bob, each being the concatenation of the information bits of π that the respective party has simulated so far and is certain of. Then, if Alice is to send a message to Bob, she generates the next bit of π given her partial simulated transcript TA, and sends this information to Bob.

In addition to the information bit, Alice also sends a parity of the round number she is currently simulating. That is, Alice holds a variable kA which indicates the round number she is simulating. Recall that each round contains two timesteps, where Alice communicates in the first timestep and Bob in the second. Alice sends Bob the next bit according to her kA and waits for Bob’s reply to see if this round was successfully simulated. If Bob’s reply indicates the same parity (i.e., the same round), Alice knows her message arrived at Bob correctly and hence the round was correctly simulated. In this case Alice increases kA. Otherwise, she assumes there was a corruption and she keeps kA as is; this causes the same round of π to be re-simulated in the next round of the simulation protocol.

Bob holds a variable kB which, again, holds the (parity of the) latest round in π he has simulated. In a somewhat symmetric manner (but not identical to Alice!), he expects to receive from Alice the bit of the next round of π, i.e., of round kB + 1. If this is the case, he responds with his bit of that same round; otherwise, he re-transmits his bit of round kB.

We prove that the difference between kA and kB is at most one. Moreover, we prove that it is always the case that Bob is ahead of Alice. Hence, sending the parity kA (mod 2) suffices to regain synchronization and perfectly distinguishes the case where the parties simulate a given round from the case where an erasure happened and the same round must be re-simulated. We note that, contrary to [EGH16], our coding scheme is not symmetric, and Alice can never advance unless she gets a correct reply from Bob. On the other hand, Bob may receive a correct message from Alice and make progress, while his reply to Alice gets erased, causing Alice to re-simulate the same round.³

³This is also the reason why the simulation of [EGH16] requires a parity that distinguishes 3 cases (Alice is ahead, Bob is ahead, and the parties are synchronized) and requires 2 bits of parity, while our simulation requires only a single parity bit.

Given the above, our coding scheme assumes a channel with alphabet Σ = {0, 1} × {0, 1}, where every non-silent message can be interpreted as a pair (b, p), where b is the information bit (simulating π) and p is the parity of the number of rounds seen so far by the sender.

The above continues until Alice has simulated the entire transcript of π, i.e., when kA reaches the number of rounds in π. Since Bob is always ahead of Alice, if kA indicates that the simulation is over for Alice, then kB is at least the same and the simulation is over for Bob as well. Bob, however, cannot tell whether Alice has completed the simulation or not. Bob waits until he hears a silence, which indicates that Alice has terminated; only then does he exit the protocol.

Data: An alternating binary protocol π of length N and an input x.
A.1   Initialize TA ← ε, kA ← 0.
A.2   while kA < N/2 do
A.3         // Send a Message (odd time-step)
A.4         kA ← kA + 1
A.5         a ← next bit of π on input x given the transcript TA;  TA ← TA ∘ a
A.6         send (a, kA mod 2)
A.7         // Receive a Message (even time-step)
A.8         Obtain (b, p)
A.9         if (b, p) is erased or p ≠ kA mod 2 then
A.10              kA ← kA − 1
A.11              Delete last symbol of TA   // re-simulate the same round
A.12        else
A.13              TA ← TA ∘ b
A.14        end if
A.15  end while
Algorithm 1 Simulation of Noiseless Protocol for Alice
Data: An alternating binary protocol π of length N and an input y.
B.1   Initialize TB ← ε, kB ← 0, b ← 0, rcv ← ⊥.
B.2   while rcv ≠ silence do   // while Silence isn’t heard
B.3         // Receive a Message (odd time-step)
B.4         Obtain rcv
B.5         if rcv is erased or silent, or rcv = (a, p) with p ≠ (kB + 1) mod 2 then
B.6               err ← true   // error detected
B.7         else
B.8               err ← false
B.9         end if
B.10        // Send a Message (even time-step)
B.11        if err = false then
B.12              kB ← kB + 1;  TB ← TB ∘ a
B.13              b ← next bit of π on input y given the transcript TB;  TB ← TB ∘ b
B.14        end if   // on error: keep kB and TB unchanged
B.15        send (b, kB mod 2)
B.16  end while
Algorithm 2 Simulation of Noiseless Protocol for Bob

3.2 Analysis


Recall that in our terminology a round consists of two timesteps, where at every timestep one party sends one symbol. Alice sends symbols in odd timesteps and Bob in even ones. The above applies both to the noiseless protocol π and to the coding scheme given by Algorithms 1 and 2.

We think of the communication transcript as a string obtained by the concatenation of symbols sent during the course of the protocol run. Given a noiseless protocol π and inputs (x, y), we denote by a_i (respectively, b_i) the message sent by Alice (respectively, Bob) in the i-th round of the noiseless protocol π. Let T_i = a_1 b_1 ⋯ a_i b_i be the transcript of the players after i rounds in π. For the transcript in the noiseless protocol, all transmissions are received correctly and the players are always synchronized, so we need not distinguish between the transcripts of Alice and Bob separately. However, this need not be the case in the coding scheme Π.

We start our analysis by fixing a run of the coding scheme Π, specified by a given erasure pattern and inputs x and y that Alice and Bob receive. Let M be the number of timesteps in the given run. We note that Bob always terminates last, triggered by hearing a silence symbol, hence M is odd. We define rounds in the coding scheme in the same way as we define rounds in the noiseless protocol. That is, for r ≥ 1, round r (in the coding scheme) corresponds to the timesteps 2r − 1 and 2r. Throughout this section, by round r we refer to the round in the coding scheme, unless specified otherwise. For the sake of clarity, we maintain that round r begins at the start of the (2r − 1)-st timestep and ends at the ending of the 2r-th timestep. Since Π is alternating, Alice begins a round by sending a message and the round ends after Alice receives Bob’s response.

We define R_A (respectively, R_B) to be the termination round of Alice (respectively, Bob). Observe that R_A < R_B, since Bob terminates only once he hears silence, which can happen only if Alice exits first. Since the protocol ends when Bob exits, the run consists of R_B rounds. For any round r and variable v, we denote by v(r) the value of v at the beginning of round r. In particular, r_A(r) and T_A(r) denote the values of r_A and T_A at the beginning of round r. Since r_A is not defined after Alice exits in Algorithm 1, we set r_A(r), for all r > R_A, to the value r_A had when Alice terminated; similarly for T_A(r). Let a(r) and b(r) be the messages sent by Alice and Bob (see Algorithms 1 and 2), respectively, in round r.

Technical Lemmas and Proof of correctness.

We begin with the following simple observation: in every round, the parties' transcripts (and the respective round numbers the parties believe they are simulating) either increase by exactly the messages exchanged during the last round, or they remain unchanged.

Lemma 3.1.

For any r ≥ 1, the following holds.

  1. Either T_A(r+1) = T_A(r) and r_A(r+1) = r_A(r), or T_A(r+1) extends T_A(r) by exactly the two bits exchanged in round r and r_A(r+1) = r_A(r) + 1 and,

  2. either T_B(r+1) = T_B(r) and r_B(r+1) = r_B(r), or T_B(r+1) extends T_B(r) by exactly the two bits exchanged in round r and r_B(r+1) = r_B(r) + 1.

Furthermore, T_A (T_B) changes if and only if r_A (r_B) changes.


Consider Algorithm 2. It is immediate that r_B can increase by at most 1 in every round. In round r, Bob starts by either appending the received bit a(r) to T_B, or keeping T_B unchanged. In the former case, Bob also appends his reply b(r) to T_B and increases r_B. Otherwise, he keeps T_B unchanged, and r_B remains the same as well.

In Algorithm 1, Alice may decrease her r_A, but note that she always begins round r by increasing it and appending her next bit a(r) to T_A. She then either decreases r_A back to what it was, in which case she erases a(r) from T_A, or she keeps it incremented and appends Bob's bit b(r) to T_A. For r > R_A, i.e., after Alice terminates, the above trivially holds. ∎

Corollary 3.2.

For any two rounds r ≤ r', T_A(r) = T_A(r') if and only if r_A(r) = r_A(r'). Similarly, T_B(r) = T_B(r') if and only if r_B(r) = r_B(r').


Note that r_A(r) and T_A(r) are non-decreasing in r. Lemma 3.1 proves that these two variables increase simultaneously, which proves the corollary for Alice. The same holds for r_B(r) and T_B(r). ∎

Lemma 3.3.

For r ≥ 1, one of the following conditions holds:

  1. r_A(r) = r_B(r) and T_A(r) = T_B(r) or,

  2. r_B(r) = r_A(r) + 1 and T_B(r) extends T_A(r) by exactly the two bits of round r_B(r).

Furthermore, in either case, T_A(r) and T_B(r) are prefixes of the noiseless transcript π(x, y).


We prove the lemma by induction on the round r. In the first round, r_A(1) = r_B(1) = 0 and T_A(1) = T_B(1) = ∅ (the initialization of Algorithms 1 and 2), and Item 1 is satisfied. Note that T_A and T_B are trivially prefixes of π(x, y). Now assume that the conditions hold at (the beginning of) round r and consider what happens during this round.

Case 1: r_A(r) = r_B(r).

Suppose first that Alice receives an (uncorrupted) message from Bob that carries the parity (r_A(r) + 1) mod 2, i.e., Alice executes the if part and not the else part of her receiving step. This means that Bob sent a message with parity (r_B(r) + 1) mod 2. Note that Bob can either send the message he has in memory or a newly generated message. In the former case, the parity must be his saved parity, i.e., r_B(r) mod 2. Since Bob sends parity (r_B(r) + 1) mod 2, we know that he generated a new message, that is, he executed the if part of his receiving step. In particular, he increased r_B. Hence, r_A(r+1) = r_B(r+1). In the case where Alice receives a corrupted message, she decrements r_A (which she had increased at the beginning of the round) and also deletes the last message from her transcript. Bob, however, may have received Alice's message correctly, and in that event he will increment his value of r_B and update his transcript.

Note that if r_A(r+1) = r_B(r+1), then by the induction hypothesis and Lemma 3.1 both transcripts either remained the same in round r (so they are still equal at the beginning of round r+1), or both increased by appending the two bits of round r to each, so they are still equal. Similarly, if r_B(r+1) = r_A(r+1) + 1, Lemma 3.1 establishes that T_A hasn't changed in round r, while T_B was extended by the two bits of the round, which by the induction hypothesis gives Item 2. Lastly, we observe that in the event that Bob receives correctly, he appends Alice's bit a(r) and his reply b(r), which are the correct continuations of the simulated transcript. It is therefore clear that the resulting strings are prefixes of π(x, y). This, together with the above discussion, proves that T_A and T_B continue to remain prefixes of π(x, y).

Case 2: r_B(r) = r_A(r) + 1.

In this case, whether Bob receives an erasure or an uncorrupted message, he re-sends the message he has in memory. Indeed, if Alice's message is not erased, then the parity Bob receives equals his saved parity r_B(r) mod 2 (since Alice holds r_A(r) = r_B(r) − 1 and she increases it by one before sending it to Bob). In both cases Bob does not change r_B, i.e., r_B(r+1) = r_B(r), and he sends the message from his memory.

If Alice receives an erasure, she sets r_A back to its value at the beginning of the round. Since r_B(r+1) = r_B(r) = r_A(r) + 1 = r_A(r+1) + 1, the claim holds. However, if Alice receives an uncorrupted message, she notices that the received parity matches r_A mod 2 and she does not decrement r_A. In this case, r_A(r+1) = r_A(r) + 1 = r_B(r+1).

From the above, we notice that there are two options: either r_A increases or it does not. In the former case it is immediate that r_A(r+1) = r_B(r+1), while in the latter r_A does not change and r_B(r+1) = r_A(r+1) + 1. In both cases Bob's transcript doesn't change, hence T_B(r+1) = T_B(r).

We now prove that T_A(r+1) and T_B(r+1) relate as required, so that Item 1 holds in the former case and Item 2 in the latter.

Since in round r Bob does not change his state, we have T_B(r+1) = T_B(r), whence T_B(r+1) is a prefix of π(x, y) by the induction hypothesis. To prove that the same holds for T_A we will need the following claim.

Claim 3.4.

Suppose round r − 1 satisfies Item 1 of Lemma 3.3. Then r_B must have increased in round r − 1; if this were not true, then T_B would have changed while r_B did not, a contradiction to Lemma 3.1. On the other hand, if round r − 1 satisfies Item 2, then we know r_B cannot increase in round r − 1, whence r_B(r) = r_B(r − 1). ∎

Following the above claim, r_B did not increase in round r − 1, and we get that T_B did not change either (Lemma 3.1). Therefore, by the induction hypothesis, T_B(r) is a prefix of π(x, y). From the above discussion, we know that either T_A(r+1) = T_B(r+1) or T_B(r+1) extends T_A(r+1) by the two bits of the ongoing round. In the former case, it is clear that T_A(r+1) is a prefix of π(x, y), whereas in the latter case, since T_B(r+1) is a prefix of π(x, y) (by the induction hypothesis), so is T_A(r+1).

Claim 3.5.

For any round r such that r > R_A, round R_A satisfies Item 1 of Lemma 3.3 and T_B(r) = T_A(r).


Recall that r_A = N/2 when Alice terminates. Since Alice exits in round R_A, it must hold that r_A increased in round R_A and that r_A(R_A) < N/2, for otherwise Alice would have terminated in an earlier round. Via Lemma 3.3 we know that either r_B(R_A) = r_A(R_A) or r_B(R_A) = r_A(R_A) + 1. In the first case, we know from the proof of Case 1 in Lemma 3.3 that since r_A increases in round R_A, r_B also increases. In the other case, namely when r_B(R_A) = r_A(R_A) + 1 and r_A increases, we know from the proof of Lemma 3.3 (Case 2) that r_B(R_A + 1) = r_A(R_A + 1).

After Alice exits, Bob can only hear silence or an erasure, and therefore, for all rounds r > R_A, Bob keeps his state unchanged and consequently r_B is never incremented. It follows that for all r > R_A, r_B(r) = r_B(R_A + 1). The second part of the claim follows from Corollary 3.2 and Lemma 3.3 and the fact that r_B(R_A + 1) = r_A(R_A + 1). ∎

We are now ready to prove the main theorem and show that we can correctly simulate the noiseless protocol π under the specifications of Theorem 1.1. Before doing so, we prove Claim 3.6, which will help us bound the communication and also shows that Alice eventually terminates.

Claim 3.6.

In any round r ≤ R_A in which there are no erasures at all, r_A(r+1) = r_A(r) + 1.


Consider the two cases of Lemma 3.3. If r_A(r) = r_B(r) and both messages of round r are not erased, we showed that r_A increases (Case 1). Similarly, if r_B(r) = r_A(r) + 1 and no erasures occur, Alice extends her transcript and r_A increases again (Case 2). ∎

Theorem 3.7.

Let π be an alternating binary protocol of length N and let T be an arbitrary integer. There exists a coding scheme Π over a 4-ary alphabet such that, for any instance of Π that suffers at most T erasures overall, Alice and Bob both output π(x, y). Moreover, CC(Π) ≤ N + 2T.


Lemma 3.3 guarantees that at every given round, Alice and Bob hold a correct prefix of π(x, y). Moreover, we know that by the time Alice terminates, her transcript (and hence Bob's transcript) is of length at least N. This follows from Claim 3.5 and Lemma 3.1, i.e., from the fact that at termination r_A = N/2, and that every time r_A increases by one, the length of T_A increases by two. Finally, note that if the number of erasures is bounded by T, then Alice will eventually terminate. This follows since after all T erasures have happened, r_A increases by one in every round (Claim 3.6), until it eventually reaches N/2 and Alice terminates.

Finally, we need to prove that the communication behaves as stated. Assume T1 erasures happen up to round R_A and T2 erasures happen in rounds R_A + 1, …, R_B, where T1 + T2 ≤ T. Since every round without erasures advances r_A by one (again from Claim 3.6), and since when Alice terminates we have r_A = N/2 (Claim 3.5), we get R_A ≤ N/2 + T1. Furthermore, after Alice terminates it takes one unerased (odd) timestep to make Bob terminate as well. Hence, in every round after R_A and until Bob terminates, Alice's silence must be erased. It follows that R_B ≤ R_A + T2 + 1. Putting these together, we get R_B ≤ N/2 + T + 1.

Every round amounts to up to two communicated symbols; accounting for the final rounds, in which Alice transmits only silence, we obtain CC(Π) ≤ N + 2T. ∎
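To make the accounting concrete, the following is a toy Python simulation of the rewind mechanism described above. It is a simplified sketch, not a transcription of Algorithms 1 and 2: the message format `(bit, parity)`, the default stored message, and the termination condition are our own modeling assumptions, and the silence/termination handshake is omitted.

```python
def simulate(target, erased):
    """Toy run of the rewind mechanism: `target` is the noiseless
    transcript (Alice's bits at even indices, Bob's at odd), and
    `erased` is the set of (1-based) timesteps the channel erases."""
    N = len(target)
    rA, TA = 0, []            # Alice's round counter / partial transcript
    rB, TB = 0, []            # Bob's round counter / partial transcript
    msgB = (0, 0)             # Bob's stored last message (bit, parity)
    rounds = 0
    while len(TA) < N:        # Alice exits once TA holds all N bits
        rounds += 1
        t_odd, t_even = 2 * rounds - 1, 2 * rounds
        # Odd timestep: Alice optimistically advances and sends.
        rA += 1
        TA.append(target[2 * (rA - 1)])
        msgA = (TA[-1], rA % 2)
        # Bob reacts: a fresh parity means a new round; otherwise
        # (erasure or stale parity) he re-sends his stored message.
        if t_odd not in erased and msgA[1] == (rB + 1) % 2:
            rB += 1
            TB.append(msgA[0])              # Alice's bit
            TB.append(target[2 * rB - 1])   # Bob's reply bit
            msgB = (TB[-1], rB % 2)
        # Even timestep: Alice processes Bob's (possibly erased) reply.
        if t_even not in erased and msgB[1] == rA % 2:
            TA.append(msgB[0])              # round completed
        else:
            rA -= 1                         # rewind the failed round
            TA.pop()
    return TA, TB, rounds
```

On an erasure-free run the simulation finishes in exactly N/2 rounds, and each erased timestep delays it by at most one round, matching the N + 2T bound.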

3.3 Noise Resilience and Code Rate

We can compare the above result to the case where the noise is bounded as a fraction of the symbols communicated in the protocol [Sch96, FGOS15, EGH16]. In our setting, the noise amount can be arbitrary. In order to compare it to the bounded-noise model, we ask the following question: assume an instance of Π with a large amount of noise T (T ≫ N). What fraction does the noise make out of the entire communication?

As a corollary of Theorem 3.7, it is easy to see that the fraction of noise is lower bounded by

T / (N + 2T),

whose limit, as T tends to infinity, is 1/2. Indeed, 1/2 is an upper bound on the noise fraction in the bounded-noise setting [FGOS15].

Now suppose that the noise is bounded to an ε-fraction of the total communication, for some ε < 1/2. Then,

T ≤ ε · CC(Π) ≤ ε (N + 2T),   i.e.,   T ≤ εN / (1 − 2ε).

This implies a maximal asymptotic code rate of 1 − 2ε. Indeed, CC(π)/CC(Π) ≥ N/(N + 2T) ≥ 1 − 2ε.

Yet, the above is somewhat meaningless since T is unbounded relative to N, which can potentially make CC(Π) grow disproportionately to N and effectively give a zero rate. A somewhat more interesting measure is the "waste" factor, i.e., how much the communication of Π increases per single corruption, for large N. In our scheme it is easy to see that each corruption delays the simulation by one round, that is, it wastes two symbols (4 bits). This implies a waste rate of 4 bits per corruption.
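The quantities in this subsection are simple enough to check numerically. The sketch below (function names are ours) evaluates the noise fraction T/(N + 2T), the bounded-noise communication N/(1 − 2ε), and the 4-bit waste per corruption:

```python
def noise_fraction(N, T):
    # T erased transmissions out of at most N + 2T total (Theorem 3.7)
    return T / (N + 2 * T)

def bounded_noise_cc(N, eps):
    # If T <= eps * CC and CC <= N + 2T, then CC <= N / (1 - 2*eps)
    return N / (1 - 2 * eps)

def waste_bits_per_erasure(bits_per_symbol=2):
    # Each erasure delays the simulation by one round: two extra
    # symbols of the 4-ary alphabet, i.e. 4 bits
    return 2 * bits_per_symbol
```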

Finally, we mention that our result extends to the binary alphabet by naively encoding each symbol as two bits. However, this results in a reduced tolerable noise rate of 1/4. Similar to the scheme in [EGH16], the noise resilience can be improved to 1/3 by encoding each symbol via an error correcting code of cardinality 4 and distance 2, e.g., {000, 011, 110, 101}. In this case, two bits must be erased in order to invalidate a round. Each round (two timesteps) consists of the transmission of six bits. Hence, the obtained resilience is 2/6 = 1/3.
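The distance property of the 4-word code {000, 011, 110, 101} is easy to verify mechanically. The following sketch checks that every pair of codewords differs in at least two positions, so a single erased bit (marked '?') always leaves a unique consistent codeword (the decoder name is ours):

```python
CODE = ["000", "011", "110", "101"]

def hamming(u, v):
    """Number of positions in which u and v differ."""
    return sum(a != b for a, b in zip(u, v))

def decode_with_erasures(word):
    """Return the unique codeword consistent with `word`, where '?'
    marks an erased bit, or None if the word is ambiguous."""
    matches = [c for c in CODE
               if all(w in ('?', c[i]) for i, w in enumerate(word))]
    return matches[0] if len(matches) == 1 else None
```

With distance 2, one erasure is always correctable, while two erasures (e.g. "??0") may leave two consistent codewords and thus invalidate the round, as stated above.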

4 A coding scheme in the AGS adaptive setting

In this section we provide an adaptive coding scheme in the AGS setting that simulates any noiseless protocol π and is resilient to an unbounded and unknown number of erasures T.

The setting of this section is based on the adaptive setting described in Section 2, with the following main difference: at any given round, parties are allowed to either remain silent or send a symbol from the channel alphabet Σ. The other party is assumed to listen, and it either hears silence, a symbol from Σ, or an erasure mark in case the channel corrupted the transmission.

The (symbol) communication complexity of a protocol in this setting, CC(Π), is defined to be the number of symbols the parties communicate, i.e., the number of timesteps in which the sender did not remain silent and transmitted a symbol from Σ. We begin with an alphabet of size 4, and then in Section 4.3 we discuss how to reduce the alphabet to being unary—either a party speaks (sends energy) or it remains silent (no energy). The (symbol) communication complexity then portrays the very practical quantity of "how much energy" the protocol costs. In this case, we define CC(Π) to be the number of timesteps in which energy is sent.
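Under this definition, counting the communication of a run reduces to counting non-silent timesteps; a one-line sketch (with silence modeled as None, an assumption of ours):

```python
def symbol_cc(timesteps):
    """Symbol communication complexity in the adaptive setting: the
    number of timesteps in which the sender did not remain silent."""
    return sum(1 for msg in timesteps if msg is not None)
```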

Another difference from the previous setting regards termination. In this section, we do not require the parties to terminate and give output. Instead we only require that at some point the parties compute the correct output and that the communication is bounded. Formally, this property of semi-termination is defined as follows.

Definition 4.1.

We say that an adaptive protocol has semi-termination if there exists a round after which both of the following conditions hold:

  1. Both parties have computed the correct output and,

  2. Both parties remain silent indefinitely (whether they terminate or not).

The round complexity of an instance is the number of timesteps the instance takes until the parties terminate. In the case of semi-termination, it is the number of timesteps until the point in time where the semi-termination conditions of Definition 4.1 hold.

4.1 The coding scheme

The adaptive coding scheme for this setting is described in Algorithms 3 and 4. We now give an overview, and in Section 4.2 we prove the correctness of the scheme and analyze its properties, such as its communication complexity. As mentioned earlier, Algorithms 3 and 4 assume a channel alphabet of size 4, whose size we reduce in Section 4.3.

The coding scheme is based on the scheme of Section 3. As there, we maintain that Alice and Bob communicate only in odd and even timesteps, respectively. Therefore, in odd (even) timesteps, Alice (Bob) may either send a message or remain silent. Whenever a player chooses to speak, they will send a message of the form (Info, Parity). Here, Info is the information bit specified by the noiseless protocol π, based on the input and the information received by the player so far. Formally, Alice and Bob maintain the "received information" in the form of a partial transcript—T_A or T_B—which is the concatenation of the information bits exchanged until that point in the protocol. Parity is the parity of the round number (of π) that the player is currently simulating. Recall that a round corresponds to two timesteps, where Alice transmits the first message and Bob the second. As before, Alice maintains her round number as r_A and Bob his as r_B.

The main difference between the coding scheme in this section and the previous one regards the way the parties signal that an erasure has happened. Here, whenever Alice or Bob receive an erasure, they simply remain silent. This signals the other side that there was an erasure and that the last message should be re-transmitted. Otherwise, they behave similarly to the scheme in Section 3. Namely, when they receive a non-silent, non-erased message (Info, Parity), they check the received parity: if it corresponds to the next expected round, they update their partial transcript by appending the new information bit and increase their round number by one. If the parity indicates a mismatch with their current state, they just re-transmit their last message (as they do in case they hear silence).
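The per-timestep decision rule just described can be sketched as a small Python function. This is an illustrative simplification of Algorithms 3 and 4, not the algorithms themselves: the state dictionary, the 'ERASED' marker, and the next_bit callback are our own modeling choices, and Bob's termination phase (answering silence with silence) is omitted.

```python
def respond(received, state):
    """React to one incoming transmission in the adaptive scheme:
    silence on erasure, retransmit on silence or a stale parity,
    advance the simulation on a fresh parity."""
    if received == 'ERASED':
        return None                        # stay silent: request a resend
    if received is None:
        return state['last_msg']           # peer saw an erasure: resend
    info, parity = received
    if parity == (state['r'] + 1) % 2:     # fresh round: advance
        state['r'] += 1
        state['transcript'].append(info)
        reply = state['next_bit'](state)   # bit dictated by the noiseless protocol
        state['transcript'].append(reply)
        state['last_msg'] = (reply, state['r'] % 2)
        return state['last_msg']
    return state['last_msg']               # stale parity: resend
```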

The scheme in this section has a semi-termination property, where Bob never terminates the protocol while Alice terminates once her partial transcript has reached the required length. For Bob, it is possible that his partial transcript reaches the maximal length while Alice has not yet completed the simulation. Thus, Bob never exits; instead, he remains silent, waiting for a message from Alice. If indeed Alice hasn't completed the simulation, such a message is bound to arrive at some point, and Bob will reply to it with the last message of the protocol. Otherwise, Bob will keep waiting. To achieve this behavior, once Bob's transcript reaches the maximal length he transitions to what we call the termination phase. In the termination phase, Bob no longer responds to silence by re-transmitting his message; instead, he responds to silence with silence.

Data: An alternating binary protocol π of length N and an input x.
A.2A.1 Initialize , and while  do
        // Odd timesteps
      A.4A.3 if  then
       end if
      A.7send  // Even timesteps
      A.9A.8 receive if  then
       end if
end while
Algorithm 3 Alice’s Simulation of in the AGS setting
Data: An alternating binary protocol π of length N and an input y.
B.2B.1 Initialize , , , and while  do
        // Even timesteps
      B.3 if  then
       end if
      B.6send  // Odd timesteps
      B.8B.7 receive if  then
       end if
end while
 // Once the maximal length is reached, switch to the termination phase
B.12 while  do
        // Even timesteps
      B.13 if  then
       end if
      B.16send  // Odd timesteps
      B.17 receive
end while
B.18The output is
Algorithm 4 Bob’s Simulation of in the AGS setting

4.2 Analysis

We now analyze the coding scheme Π described in Algorithms 3 and 4 and prove that (1) it simulates any π even in the presence of T erasures, for any a priori unknown T, and (2) the simulation communicates at most N + T non-silent transmissions. The above properties are formulated as Theorem 4.11 at the end of this subsection.

The approach of the analysis largely parallels the analysis of Algorithms 1 and 2 in Section 3.2. First we show that at any given round, the difference between the round numbers maintained by Alice and Bob is at most one. Furthermore, Bob's partial transcript is always at least as long as Alice's, but no longer than Alice's partial transcript plus the information bits corresponding to the ongoing round. Then, we prove that the parties' partial transcripts are always correct and that in any round in which there are no erasures, the length of the partial transcripts increases. It then follows that when Alice terminates the protocol, we are guaranteed that her partial transcript is the complete transcript of the simulation. Since Bob's partial transcript can only be longer, this means he also correctly simulated π.

We start our analysis by fixing a run of the coding scheme Π. We recall that round r (of Π) corresponds to timesteps 2r − 1 and 2r, and that Alice and Bob send symbols in odd and even timesteps, respectively. Since π is alternating, Alice begins a round by sending a message and the round ends after Alice receives Bob's response. We define R_A to be the termination round of Alice. Observe that Bob never terminates: once Bob enters the second while loop of Algorithm 4, he never changes his state and continues to execute this loop. We borrow the definitions of r_A, r_B, T_A and T_B from Section 3.2. Also, let N be the length of the transcript π(x, y). Since r_A is not defined after Alice exits in Algorithm 3, we set r_A(r), for all r > R_A, to the value r_A had when Alice terminated; similarly for T_A(r). If in round r Alice or Bob do not remain silent, then a(r) and b(r) are the messages sent by Alice or Bob, respectively. In case Alice (or Bob) remains silent, we set a(r) (or b(r)) to be the silence symbol and the corresponding information bit to be the empty word ε.

Lemma 4.2 (Analog of Lemma 3.1).

For any r ≥ 1, the following holds.

  1. Either T_A(r+1) = T_A(r) and r_A(r+1) = r_A(r), or T_A(r+1) extends T_A(r) by exactly the two bits exchanged in round r and r_A(r+1) = r_A(r) + 1 and,

  2. either T_B(r+1) = T_B(r) and r_B(r+1) = r_B(r), or T_B(r+1) extends T_B(r) by exactly the two bits exchanged in round r and r_B(r+1) = r_B(r) + 1.

Furthermore, T_A (T_B) changes if and only if r_A (r_B) changes.


In Algorithms 3 and 4, note that r_A and r_B, respectively, can increase by at most one in every round. From Algorithm 4 it is clear that T_B changes (according to Item 2) if and only if r_B changes. We note that Alice begins every round by incrementing r_A in the odd timestep. From Algorithm 3 we see that T_A changes (according to Item 1) if and only if Alice does not decrement r_A in the even timestep. Lastly, whenever r_A or r_B changes, it is clear that T_A and T_B, respectively, are extended by the bits of the corresponding round.

Once Alice exits (or Bob enters the termination phase), the values of r_A and T_A (respectively, r_B and T_B) never change. ∎

Lemma 4.3 (Analog of Lemma 3.3).

For r ≥ 1, one of the following conditions holds:

  1. r_A(r) = r_B(r) and T_A(r) = T_B(r) or,

  2. r_B(r) = r_A(r) + 1 and T_B(r) extends T_A(r) by exactly the two bits of round r_B(r).

Furthermore, in either case, T_A(r) and T_B(r) are prefixes of the noiseless transcript π(x, y).


We prove the lemma by induction on the round r. In the first round, r_A(1) = r_B(1) = 0 and T_A(1) = T_B(1) = ∅ (the initialization of Algorithms 3 and 4), thus Item 1 is satisfied, and moreover the transcripts are trivially correct prefixes of π(x, y). Now assume that the conditions hold at (the beginning of) round r' for all r' ≤ r, and consider what happens during this round.

Case 1: r_A(r) = r_B(r).

If Alice does not increase r_A in this round, then either Item 1 or Item 2 holds at round r + 1 via Lemma 4.2. Note that T_A is a correct prefix of π(x, y) by induction, and Alice sends the message she has in memory.

Now assume that Alice increases r_A at the end of round r, i.e., she executes the if part rather than the else part of her receiving step. Therefore, Bob must have sent a message with parity (r_A(r) + 1) mod 2. Since r_A(r) = r_B(r), the above can only happen if Bob has executed the if part of his receiving step and increased his own r_B. Hence, r_A(r+1) = r_B(r+1).

Lemma 4.2 indicates that both partial transcripts, T_A and T_B, have increased during round r. We are left to show that they are identical and correct prefixes of π(x, y). Indeed, both transcripts increase by appending the same round's bits, so they remain identical. Moreover, via Lemma 4.2 we have that