Cooperation in interference-limited wireless networks has the potential to significantly improve system performance [1]. Additionally, variational techniques for Bayesian inference have proven extremely useful for the design of iterative receiver architectures in non-cooperative scenarios [2]. Hence, using such inference methods to design iterative algorithms for receiver cooperation could be beneficial.
Algorithms based on belief propagation (BP) are proposed in [3, 4] for distributed decoding in the uplink of cellular networks with base-station cooperation, assuming simple network models, uncoded transmissions and perfect channel knowledge at the receivers; it is shown that the performance of optimal joint decoding can be achieved with decentralized algorithms. In [5, 6], the authors discuss strategies for base-station cooperation and study the effect of quantizing the exchanged values, still assuming perfect channel knowledge.
In this paper, we study cooperative receiver processing in an interference channel and formulate it as probabilistic inference in factor graphs. We state a probabilistic model that explicitly incorporates the ability of the receivers to exchange a certain type of information. To infer the information bits, we apply a recently proposed inference framework that combines BP and the mean-field (MF) approximation [7]. We obtain a distributed iterative algorithm in which all receivers iteratively perform channel and noise precision estimation, detection, and decoding, and also pass messages along the edges of the factor graph that connect them. The rate at which these messages are updated and passed determines the amount of communication over the cooperation links.
Notation: The relative complement of a set $\mathcal{A}$ in a set $\mathcal{B}$ is written as $\mathcal{B}\setminus\mathcal{A}$. The set $\{1,2,\ldots,N\}$ is denoted by $[1:N]$. Boldface lowercase and uppercase letters represent vectors and matrices, respectively; superscripts $(\cdot)^{\mathrm{T}}$ and $(\cdot)^{\mathrm{H}}$ denote transposition and Hermitian transposition, respectively. The Hadamard product of two vectors is denoted by $\odot$; the pdf of a circularly symmetric complex Gaussian distribution with mean $\boldsymbol{\mu}$ and covariance matrix $\boldsymbol{\Sigma}$ is denoted by $\mathrm{CN}(\cdot;\boldsymbol{\mu},\boldsymbol{\Sigma})$; the pdf of a Gamma distribution with shape $a$ and rate $b$ is denoted by $\mathrm{Ga}(\cdot;a,b)$. We write $f(x)\propto g(x)$ when $f(x)=c\,g(x)$ for some positive constant $c$. The Dirac delta function is denoted by $\delta(\cdot)$. Finally, $\mathrm{E}[\cdot]$ stands for the expectation of a random variable.
II System Model
We consider a system with parallel point-to-point links where each user sends information to its corresponding receiver and interferes with the others by doing so. To decode the desired messages, the receivers are able to cooperate by exchanging information over dedicated error-free links.
A message sent by user is represented by a vector of information bits and is conveyed by sending data and pilot channel symbols having the sets of indices and , respectively, such that and ; the sets and are identical for all users. The bits in are encoded and interleaved into a vector of bits which are then mapped to data symbols , where is a (user specific) discrete complex modulation alphabet of size . Symbols are multiplexed with pilot symbols which are randomly drawn from a QPSK modulation alphabet.
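To make the transmitter chain concrete, the following minimal NumPy sketch mimics the encode–interleave–map–multiplex pipeline. It is only an illustration: the repetition "code" (standing in for the convolutional code), the constant pilots (the paper draws them randomly from a QPSK alphabet), the function name, and all sizes are assumptions made for the sketch, not the system parameters of Table I.

```python
import numpy as np

rng = np.random.default_rng(0)

def transmit_chain(info_bits, pilot_pos, n_symbols):
    """Illustrative transmitter: encode, interleave, map to QPSK, add pilots."""
    coded = np.repeat(info_bits, 2)             # placeholder rate-1/2 encoder
    interleaved = coded[rng.permutation(len(coded))]
    # Map bit pairs to unit-energy QPSK symbols
    b = interleaved.reshape(-1, 2)
    data = ((1 - 2 * b[:, 0]) + 1j * (1 - 2 * b[:, 1])) / np.sqrt(2)
    # Multiplex data symbols with (here constant) QPSK pilot symbols
    x = np.empty(n_symbols, dtype=complex)
    x[pilot_pos] = (1 + 1j) / np.sqrt(2)
    data_pos = np.setdiff1d(np.arange(n_symbols), pilot_pos)
    x[data_pos] = data[:len(data_pos)]
    return x

x = transmit_chain(rng.integers(0, 2, size=64),
                   pilot_pos=np.arange(0, 64, 8), n_symbols=64)
```

The same structure applies per user; only the modulation alphabet is user specific in the paper's model.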
The users synchronously transmit their aggregate vectors of channel symbols over an interference channel with input-output relationship
$$\mathbf{y}_i = \sum_{j} \mathbf{h}_{ij} \odot \mathbf{x}_j + \mathbf{n}_i. \qquad (1)$$
The vector $\mathbf{y}_i$ contains the signal received by receiver $i$, $\mathbf{h}_{ij}$ is the vector of complex weights of the channel between transmitter $j$ and receiver $i$, and $\mathbf{n}_i$ contains the samples of additive noise at receiver $i$ with pdf $\mathrm{CN}(\mathbf{n}_i;\mathbf{0},\gamma_i^{-1}\mathbf{I})$ for some positive precision $\gamma_i$. For all $i \neq j$, we define the signal-to-noise ratio (SNR) and interference-to-noise ratio (INR) at receiver $i$ as the average received powers of the desired and of the interfering signal, respectively, scaled by the noise precision $\gamma_i$.
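The input-output relationship of the interference channel can be simulated in a few lines of NumPy. The symbol names (`h`, `x`, `gamma`), the QPSK alphabet, and all dimensions below are assumptions made for this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
K, N = 2, 64          # number of links and of channel symbols (illustrative)
gamma = 100.0         # noise precision at each receiver (assumed)

# Frequency-domain channel weights h[i, j]: transmitter j -> receiver i
h = rng.normal(size=(K, K, N)) + 1j * rng.normal(size=(K, K, N))

# Unit-energy QPSK symbol vectors for each user
bits = rng.integers(0, 2, size=(K, 2 * N))
x = ((1 - 2 * bits[:, ::2]) + 1j * (1 - 2 * bits[:, 1::2])) / np.sqrt(2)

# Received signal: elementwise (Hadamard) product channel plus complex noise
n = (rng.normal(size=(K, N)) + 1j * rng.normal(size=(K, N))) / np.sqrt(2 * gamma)
y = np.einsum('ijn,jn->in', h, x) + n

# Empirical SNR and INR at receiver 0 (desired user 0, interferer 1)
snr = np.mean(np.abs(h[0, 0] * x[0]) ** 2) * gamma
inr = np.mean(np.abs(h[0, 1] * x[1]) ** 2) * gamma
```

The `einsum` call sums the per-subcarrier products over the transmitter index, which matches the Hadamard-product channel model used for the OFDM system of Section V.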
III The Combined BP-MF Inference Framework
In this section, we consider a generic probabilistic model and briefly describe the unified message-passing algorithm that combines the BP and MF approaches [7].
Let $p(\mathbf{z})$ be an arbitrary pdf of a random vector $\mathbf{z} = (z_1,\ldots,z_I)^{\mathrm{T}}$ which factorizes as
$$p(\mathbf{z}) = \prod_{a\in\mathcal{A}} f_a(\mathbf{z}_a), \qquad (2)$$
where $\mathbf{z}_a$ is the vector of all variables that are arguments of the function $f_a$, for all $a\in\mathcal{A}$. We have grouped the factors into two sets that partition $\mathcal{A}$: $\mathcal{A}_{\mathrm{MF}}$ and $\mathcal{A}_{\mathrm{BP}}$. The factorization in (2) can be visualized by means of a factor graph [8]. We define $\mathcal{N}(a)$ to be the set of indices of all variables that are arguments of function $f_a$; similarly, $\mathcal{N}(i)$ denotes the set of indices of all functions that have variable $z_i$ as an argument. The parts of the graph that correspond to $\mathcal{A}_{\mathrm{BP}}$ and to $\mathcal{A}_{\mathrm{MF}}$ are referred to as the “BP part” and the “MF part”, respectively.
The combined BP-MF inference algorithm approximates the marginals $p(z_i)$, $i\in[1:I]$, by auxiliary pdfs $q_i(z_i)$ called beliefs. They are computed as
$$q_i(z_i) = c_i \prod_{a\in\mathcal{A}_{\mathrm{BP}}\cap\mathcal{N}(i)} m^{\mathrm{BP}}_{a\to i}(z_i) \prod_{a\in\mathcal{A}_{\mathrm{MF}}\cap\mathcal{N}(i)} m^{\mathrm{MF}}_{a\to i}(z_i), \qquad (3)$$
with messages
$$m^{\mathrm{BP}}_{a\to i}(z_i) = c_a \sum_{\mathbf{z}_a\setminus z_i} f_a(\mathbf{z}_a) \prod_{j\in\mathcal{N}(a)\setminus i} n_{j\to a}(z_j),$$
$$m^{\mathrm{MF}}_{a\to i}(z_i) = \exp\Big( \sum_{\mathbf{z}_a\setminus z_i} \prod_{j\in\mathcal{N}(a)\setminus i} n_{j\to a}(z_j)\, \ln f_a(\mathbf{z}_a) \Big), \qquad (4)$$
$$n_{i\to a}(z_i) = \prod_{c\in\mathcal{A}_{\mathrm{BP}}\cap\mathcal{N}(i)\setminus a} m^{\mathrm{BP}}_{c\to i}(z_i) \prod_{c\in\mathcal{A}_{\mathrm{MF}}\cap\mathcal{N}(i)} m^{\mathrm{MF}}_{c\to i}(z_i),$$
where $c_i$ and $c_a$ are positive constants that ensure normalized beliefs.
IV Distributed Inference Algorithm
In this section, we state a probabilistic formulation of cooperative receiver processing and use the combined BP-MF framework to obtain the message updates in the corresponding factor graph; finally, we define a parametric iterative algorithm for distributed receiver processing.
IV-A Probabilistic system model
The probabilistic system function can be obtained by factorizing the joint pdf of all unknown variables in the signal model. Collecting the unknown variables in vector , we have:
To include in the probabilistic model the ability of the different receivers to exchange information of a certain type, we define an augmented pdf. Depending on the type of shared information, several cooperative strategies can be devised: the receivers could exchange their current local knowledge about the modulated data symbols, the coded and interleaved bits, or the information bits. We focus on one of these strategies; the other alternatives can be implemented with straightforward modifications to the model presented in this section. To construct the augmented pdf for this cooperation scenario, we replace each relevant vector variable with “alias” variables, which are constrained to be equal to the corresponding original variable. Keeping in mind that each receiver is interested in decoding its own message, the factorization of the augmented pdf reads
incorporate the observation vector and they form the set ; the factors are the prior pdfs of the parameters and they form the set ; the factors , , represent the prior pdfs of the vectors and they form the set ; denoting by the subvector of containing the bits mapped on and by the mapping function, for all , the factors
account for the modulation mapping and they form the set ; the factors stand for the coding and interleaving operations performed at transmitter and they form the set ; the factors
are the uniform prior probability mass functions of the information bits and they form the set; finally, for all , the factors
constrain the alias variables , to be equal, and they form the set . Note that, due to these additional constraints, marginalizing (6) over all alias variables , leads to the original probabilistic model (5).
The factorization in (6) can be visualized in a factor graph, which is partially depicted in Fig. 1. The graphs corresponding to the channel codes and interleavers are not given explicitly, their structures being captured by . We coin “receiver ” the subgraph containing the factor nodes , , , , , and the variable nodes connected to them. The factor nodes and model the cooperative link between receivers and .
We can now recast the problem of cooperative receiver processing as an inference problem on the augmented probabilistic model (6): each receiver needs to infer the beliefs of its information bits using its observation vector and prior knowledge, i.e., the pilot symbols of all users (since the pseudo-random pilot sequences can be generated deterministically based on information available to all receivers, each receiver is able to reconstruct all the pilot symbols without the need to exchange them) and their set of indices, the channel statistics, the modulation mappings of all users, the structure of its user's channel code and interleaver, and the external information provided by the other receivers. The inference problem is solved by applying the method described in Section III, which leads to iteratively passing messages in the factor graph. We can control the communication overhead between receivers by adjusting the rate at which messages are passed through the coupling factor nodes.
IV-B Message computations
To make the connection with the generic model in Section III, we define and to be the sets of all factors and variables, respectively, introduced in the previous subsection (with a slight abuse of notation, from this point on we use the names of functions and variables as indices in these sets). We choose to split the set of factors into the following two sets that yield the “MF part” and the “BP part”:
In the following, we use (4) to derive the messages in our setup, focusing on their final expressions. More detailed message computations using the combined BP-MF method can be found in [7] and [9] for non-cooperative scenarios.
First, for all we define the statistics
for and we set and , for . We also define , , (these quantities are the parameters determining the corresponding beliefs, because the factors involved lie in the MF part and therefore the beliefs are equal to the corresponding messages; see (3), (4))
and we denote by the th entry of .
Channel estimation: Using (4), we obtain the messages
for all , where is a diagonal covariance matrix and
We have ; so, using (4), we obtain
Noise precision estimation: Using (4), we obtain
, with and
We select the conjugate prior pdf, . Using (4), we obtain
Setting the prior pdfs to be non-informative, i.e., , we obtain the estimate , .
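As a toy illustration of the conjugate Gamma mechanics behind this step (not the paper's exact message computation, which involves the beliefs of the channel weights and data symbols), the following Python snippet updates a noise-precision estimate from a residual energy. The function name, the near non-informative prior values, and the zero-signal residual are assumptions made for the example.

```python
import numpy as np

def noise_precision_update(y, mean_res_energy, a0=1e-10, b0=1e-10):
    """Mean-field (conjugate Gamma) update of a complex-noise precision.

    y               : received vector (length N)
    mean_res_energy : expected residual energy under the current beliefs
    a0, b0          : shape and rate of the Gamma prior (near non-informative)
    Returns the posterior shape, rate, and the point estimate a/b.
    """
    N = len(y)
    a = a0 + N               # one count per complex Gaussian sample
    b = b0 + mean_res_energy
    return a, b, a / b

# Toy usage: true precision 4.0 (noise variance 0.25), zero "signal",
# so the residual is the noise itself
rng = np.random.default_rng(1)
N, gamma_true = 1000, 4.0
y = (rng.normal(size=N) + 1j * rng.normal(size=N)) / np.sqrt(2 * gamma_true)
a, b, gamma_hat = noise_precision_update(y, np.sum(np.abs(y) ** 2))
```

With the non-informative limit $a_0, b_0 \to 0$, the point estimate reduces to the number of samples divided by the residual energy, matching the form of the estimate stated above.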
Symbol detection: Using (4), we obtain
Assume that in the BP part of the graph we have obtained
where is the extrinsic value of for . According to (4), the discrete messages (APP values)
are sent to the MF part, while are sent to the BP part as extrinsic values.
The message updates in the BP part reduce to the standard BP computation rules. Messages from and to binary variable nodes are of the form , with . Computing is equivalent to MAP demapping, , . The messages , , and represent the inputs to the de-interleaving and decoding BP operations, which output and . Due to the equality constraints (7), messages pass transparently through the factor nodes , . Therefore, the following messages are received by receiver from receiver , and :
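The MAP demapping and extrinsic-message step can be illustrated with a toy soft demapper for a single symbol. The interface, the QPSK labeling, and the Gaussian symbol observation below are assumptions of this sketch rather than the paper's exact messages.

```python
import numpy as np

def soft_demap(z, var, const, bit_map, p_ext):
    """Toy MAP demapper for one soft symbol observation.

    z, var  : Gaussian "observation" of the symbol and its variance
    const   : complex constellation points, shape (M,)
    bit_map : bit label of each constellation point, shape (M, log2(M))
    p_ext   : extrinsic probabilities P(b_k = 1) from the decoder
    Returns (app, bit_ext): symbol APPs and extrinsic P(b_k = 1) per bit.
    """
    # Symbol prior from the extrinsic bit probabilities
    prior = np.prod(np.where(bit_map == 1, p_ext, 1 - p_ext), axis=1)
    like = np.exp(-np.abs(z - const) ** 2 / var)
    app = like * prior
    app /= app.sum()
    # Extrinsic bit messages: marginalize, excluding bit k's own prior
    bit_ext = np.empty(len(p_ext))
    for k in range(len(p_ext)):
        pk = np.delete(p_ext, k)
        prior_k = np.prod(np.where(np.delete(bit_map, k, axis=1) == 1,
                                   pk, 1 - pk), axis=1)
        bit_ext[k] = ((like * prior_k)[bit_map[:, k] == 1].sum()
                      / (like * prior_k).sum())
    return app, bit_ext

# QPSK with an (assumed) natural bit labeling
const = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)
bit_map = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
app, bit_ext = soft_demap(0.6 + 0.6j, 0.5, const, bit_map,
                          np.array([0.5, 0.5]))
```

With an observation near the first constellation point and uniform decoder feedback, the APP concentrates on that point and both extrinsic bit messages favor the bit value 0.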
IV-C Algorithm outline
We define the cooperative processing algorithm by specifying the order in which the messages in Section IV-B are computed and passed in the factor graph. The algorithm consists of three main stages:
IV-C1 Initialization

Receiver obtains initial estimates of its variables. First, estimates of with are obtained for all by using an iterative estimator based on the signals at pilot positions only, similar to the one described in [9, Sec. V.A]. Specifically, we restrict (9), (10), (11) to include only the subvectors and submatrices corresponding to pilot indices, and we initialize and , . We compute (9) and (10) successively for all , and then (11) and (12); this process is repeated times. The initial estimates of are obtained by applying (10) to the whole vectors and matrices, with , . Then, we set and , . Estimation of is performed using (11) and (12), followed by symbol detection (13), applied successively for all ; this process is repeated times. Finally, soft demapping and decoding are performed in the BP part, with and initialized to have equal bit weights.
IV-C2 Information exchange
IV-C3 Local iteration
Receiver computes (18), followed by (14) and (15), for all . Next, , , are successively estimated using (9) and (10), and is estimated using (12). Then, (13) is successively computed for all , repeating this process times. Finally, soft demapping and decoding are performed in the BP part.
To define the distributed iterative algorithm, we use three parameters: describes the total number of receiver iterations, including the Initialization stage as first iteration; denotes the number of Information exchange stages; for , the vector with strictly increasing elements contains the iteration indices after which an Information exchange stage takes place. For we set .
The steps of the algorithm are:
1) Initialization for all ; set and ;
2) If , then go to step 5);
3) If , then Information exchange ; ;
4) Local iteration for all ; ; go to step 2);
5) Take hard decisions using the beliefs .
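The scheduling of the three stages can be sketched as a small control loop. The receiver interface (`initialize`/`share`/`local_iteration`/`decide`) and all data types are assumptions made for illustration; the actual message computations are those of Section IV-B.

```python
def distributed_receiver_processing(receivers, T, exchange_after):
    """Skeleton of the iteration/exchange schedule of Sec. IV-C.

    receivers      : objects with initialize(), share(), local_iteration(msgs)
                     and decide() methods (assumed interface)
    T              : total number of receiver iterations, the Initialization
                     stage counting as the first
    exchange_after : iteration indices after which an information exchange occurs
    """
    for rx in receivers:
        rx.initialize()                      # stage IV-C1, iteration t = 1
    shared = {}
    for t in range(1, T):
        if t in exchange_after:              # stage IV-C2: messages over the links
            shared = {i: rx.share() for i, rx in enumerate(receivers)}
        for i, rx in enumerate(receivers):   # stage IV-C3: local estimation/decoding
            rx.local_iteration({j: m for j, m in shared.items() if j != i})
    return [rx.decide() for rx in receivers] # hard decisions from final beliefs

# Minimal stub receivers that just log the schedule they experience
class _Stub:
    def __init__(self): self.log = []
    def initialize(self): self.log.append('init')
    def share(self): return 'msg'
    def local_iteration(self, msgs): self.log.append(('local', len(msgs)))
    def decide(self): return self.log

logs = distributed_receiver_processing([_Stub(), _Stub()],
                                       T=4, exchange_after={1, 3})
```

Messages shared at the last exchange remain in use for the local iterations that follow it, so the amount of cooperation traffic is set entirely by `exchange_after`.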
V Simulation Results
We consider an OFDM system consisting of links with symmetric channel powers, the same noise levels at the receivers, and strong interference, i.e., . The detailed assumptions are listed in Table I. The performance of Algorithm 1 is evaluated through Monte Carlo simulations. The BER dependence on the SNR is illustrated in Fig. 2, while the BER convergence is given in Fig. 3. Receiver collaboration provides significantly improved performance compared to a non-cooperative setting (). When , an error floor occurs at BER , but the cooperation scheme with only two exchanges almost achieves the performance of “full” cooperation (); the improvement brought by the second exchange is clearly visible in Fig. 3. All schemes need about – receiver iterations to converge. The benefits of cooperation are also observed in the improved estimation of the channel weights and noise precisions (results not presented here), which in turn leads to improved detection and decoding, and vice versa.
|Parameters of the OFDM system|Value|
|Number of users||
|Number of active subcarriers||
|Number of pilot symbols|evenly spaced pilots|
|Modulation scheme for data symbols||
|Convolutional code (of both users)||
|Multipath channel model|3GPP ETU|
|Parameters of the algorithm|Value|
|Number of receiver iterations||
|Number of exchanges||
|Number of sub-iterations||
VI Conclusion

We proposed a message-passing design of a distributed algorithm for receiver cooperation in interference-limited wireless systems. Capitalizing on a unified inference method that combines BP and the MF approximation, we obtained an iterative algorithm that jointly performs estimation of the channel weights and noise powers, detection, and decoding in each receiver, together with information sharing between receivers. Simulation results showed a remarkable improvement compared to a non-cooperative system, even with only 1–2 exchanges between receivers; as expected, a trade-off between performance and the amount of shared information could be observed.
In general, our approach provides several degrees of freedom in the design of distributed algorithms, such as the type of shared information and the parameters of the algorithm (number of receiver iterations, rate and schedule of information exchange). The proposed approach can be extended to other cooperation setups and it can accommodate the exchange of quantized values – the quantization resolution thus becoming another implementation choice – by quantizing the parameters of the messages passed between the receivers.
References

[1] D. Gesbert, S. Hanly, H. Huang, S. Shamai Shitz, O. Simeone, and W. Yu, “Multi-cell MIMO cooperative networks: A new look at interference,” IEEE J. Sel. Areas Commun., vol. 28, no. 9, pp. 1380–1408, Dec. 2010.
[2] M. J. Wainwright and M. I. Jordan, “Graphical models, exponential families, and variational inference,” Foundations and Trends in Machine Learning, vol. 1, pp. 1–305, 2008.
[3] A. Grant, S. Hanly, J. Evans, and R. Müller, “Distributed decoding for Wyner cellular systems,” in Proc. ACTW, 2004.
[4] E. Aktas, J. Evans, and S. Hanly, “Distributed decoding in a cellular multiple-access channel,” IEEE Trans. Wireless Commun., vol. 7, no. 1, pp. 241–250, Jan. 2008.
[5] S. Khattak, W. Rave, and G. Fettweis, “Distributed iterative multiuser detection through base station cooperation,” EURASIP J. Wireless Commun. Netw., vol. 2008, pp. 17:1–17:15, Jan. 2008.
[6] T. Mayer, H. Jenkac, and J. Hagenauer, “Turbo base-station cooperation for intercell interference cancellation,” in Proc. IEEE ICC, 2006.
[7] E. Riegler, G. E. Kirkelund, C. N. Manchón, M.-A. Badiu, and B. H. Fleury, “Merging belief propagation and the mean field approximation: A free energy approach,” submitted to IEEE Trans. Inf. Theory, 2012, arXiv:1112.0467v2 [cs.IT].
[8] F. Kschischang, B. Frey, and H.-A. Loeliger, “Factor graphs and the sum-product algorithm,” IEEE Trans. Inf. Theory, vol. 47, no. 2, pp. 498–519, Feb. 2001.
[9] C. N. Manchón, G. E. Kirkelund, E. Riegler, L. Christensen, and B. H. Fleury, “Receiver architectures for MIMO-OFDM based on a combined VMP-SP algorithm,” submitted to IEEE Trans. Inf. Theory, 2011, arXiv:1111.5848 [stat.ML].