I Introduction
Cooperation in interference-limited wireless networks has the potential to significantly improve system performance [1]. Additionally, variational techniques for Bayesian inference [2] have proven extremely useful for the design of iterative receiver architectures in non-cooperative scenarios. Hence, using such inference methods to design iterative algorithms for receiver cooperation could be beneficial. Algorithms based on belief propagation (BP) are proposed in [3, 4] for distributed decoding in the uplink of cellular networks with base-station cooperation, assuming simple network models, uncoded transmissions and perfect channel knowledge at the receivers; it is shown that the performance of optimal joint decoding can be achieved with decentralized algorithms. In [5, 6], the authors discuss strategies for base-station cooperation and study the effect of quantizing the exchanged values, still assuming perfect channel knowledge.
In this paper, we study cooperative receiver processing in an interference channel and formulate it as probabilistic inference in factor graphs. We state a probabilistic model that explicitly incorporates the ability of the receivers to exchange a certain type of information. To infer the information bits, we apply a recently proposed inference framework that combines BP and the mean-field (MF) approximation [7]. We obtain a distributed iterative algorithm in which all receivers iteratively perform channel and noise-precision estimation, detection and decoding, and also pass messages along the edges of the factor graph that connect them. The rate at which these messages are updated and passed determines the amount of communication over the cooperation links.
Notation: The relative complement of in a set is written as . The set is denoted by . Boldface lowercase and uppercase letters represent vectors and matrices, respectively; superscripts and denote transposition and Hermitian transposition, respectively. The Hadamard product of two vectors is denoted by . The probability density function (pdf) of a multivariate complex Gaussian distribution with mean and covariance matrix is denoted by ; the pdf of a Gamma distribution with scale and rate is denoted by . We write when for some positive constant . The Dirac delta function is denoted by . Finally, stands for the expectation of a random variable.
II System Model
We consider a system with parallel point-to-point links, where each user sends information to its corresponding receiver and, by doing so, interferes with the others. To decode the desired messages, the receivers are able to cooperate by exchanging information over dedicated error-free links.
A message sent by user is represented by a vector of information bits and is conveyed by sending data and pilot channel symbols having the sets of indices and , respectively, such that and ; the sets and are identical for all users. The bits in are encoded and interleaved into a vector of bits which are then mapped to data symbols , where is a (user-specific) discrete complex modulation alphabet of size . Symbols are multiplexed with pilot symbols which are randomly drawn from a QPSK modulation alphabet.
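As a toy illustration of this transmitter chain, the sketch below encodes, interleaves, maps and multiplexes the symbols of one user. The rate-1/2 repetition "code", the random interleaver, and all sizes are illustrative stand-ins, not the paper's actual convolutional code or parameters:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sizes (illustrative): N channel symbols, evenly spaced pilots.
N = 16
pilot_idx = np.arange(0, N, 4)                     # pilot positions
data_idx = np.setdiff1d(np.arange(N), pilot_idx)   # data positions

def qpsk_map(bits):
    """Map bit pairs to unit-energy QPSK symbols."""
    b = bits.reshape(-1, 2)
    return ((1 - 2 * b[:, 0]) + 1j * (1 - 2 * b[:, 1])) / np.sqrt(2)

# Information bits -> (toy) rate-1/2 code -> interleave -> QPSK map.
info_bits = rng.integers(0, 2, size=len(data_idx))
coded = np.repeat(info_bits, 2)                    # stand-in for the conv. code
interleaved = coded[rng.permutation(coded.size)]   # random interleaver
data_syms = qpsk_map(interleaved)

# Multiplex data symbols with randomly drawn QPSK pilot symbols.
pilots = qpsk_map(rng.integers(0, 2, size=2 * len(pilot_idx)))
x = np.empty(N, dtype=complex)
x[data_idx] = data_syms
x[pilot_idx] = pilots
```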
The users synchronously transmit their aggregate vectors of channel symbols over an interference channel with input–output relationship
$$\mathbf{y}_n = \sum_{k} \mathbf{h}_{kn} \odot \mathbf{x}_k + \mathbf{w}_n \qquad (1)$$
The vector contains the signal received by receiver , is the vector of complex weights of the channel between transmitter and receiver , and contains the samples of additive noise at receiver with pdf for some positive precision . For all , we define the signal-to-noise ratio (SNR) and interference-to-noise ratio (INR) at receiver as
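The superposition in the input–output relationship above can be sketched numerically as follows; the names (`h`, `x`, `gamma`) and the toy dimensions are illustrative, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(1)
N, K = 64, 2   # symbols per block, number of links (toy sizes)

# Channel weights h[k, i] between transmitter k and receiver i, and a common
# noise precision gamma (noise variance = 1/gamma) for illustration.
h = (rng.standard_normal((K, K, N)) + 1j * rng.standard_normal((K, K, N))) / np.sqrt(2)
x = np.exp(2j * np.pi * rng.random((K, N)))   # unit-modulus channel symbols
gamma = 10.0

def receive(i):
    """Signal at receiver i: per-user Hadamard products plus CSCG noise."""
    w = (rng.standard_normal(N) + 1j * rng.standard_normal(N)) * np.sqrt(0.5 / gamma)
    return sum(h[k, i] * x[k] for k in range(K)) + w

y = [receive(i) for i in range(K)]
```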
III The Combined BP-MF Inference Framework
In this section, we consider a generic probabilistic model and briefly describe the unified message-passing algorithm that combines the BP and MF approaches [7].
Let $p(\mathbf{z})$ be an arbitrary pdf of a random vector $\mathbf{z}$ which factorizes as
$$p(\mathbf{z}) = \prod_{a \in \mathcal{A}} f_a(\mathbf{z}_a) \qquad (2)$$
where $\mathbf{z}_a$ is the vector of all variables that are arguments of the function $f_a$, for all $a \in \mathcal{A}$. We have grouped the factors into two sets that partition $\mathcal{A}$: $\mathcal{A}_{\mathrm{BP}}$ and $\mathcal{A}_{\mathrm{MF}}$. The factorization in (2) can be visualized by means of a factor graph [8]. We define $\mathcal{N}(a)$ to be the set of indices of all variables that are arguments of function $f_a$; similarly, $\mathcal{N}(i)$ denotes the set of indices of all functions that have variable $z_i$ as an argument. The parts of the graph that correspond to $\mathcal{A}_{\mathrm{BP}}$ and to $\mathcal{A}_{\mathrm{MF}}$ are referred to as the “BP part” and the “MF part”, respectively.
The combined BP-MF inference algorithm approximates the marginals $p(z_i)$ by auxiliary pdfs $b_i(z_i)$ called beliefs. They are computed as [7]
$$b_i(z_i) = \omega_i \prod_{a \in \mathcal{A}_{\mathrm{BP}} \cap \mathcal{N}(i)} m^{\mathrm{BP}}_{a \to i}(z_i) \prod_{a \in \mathcal{A}_{\mathrm{MF}} \cap \mathcal{N}(i)} m^{\mathrm{MF}}_{a \to i}(z_i) \qquad (3)$$
with
$$m^{\mathrm{BP}}_{a \to i}(z_i) = \omega_a \sum_{\mathbf{z}_a \setminus z_i} f_a(\mathbf{z}_a) \prod_{j \in \mathcal{N}(a) \setminus i} n_{j \to a}(z_j), \quad a \in \mathcal{A}_{\mathrm{BP}},$$
$$m^{\mathrm{MF}}_{a \to i}(z_i) = \exp\!\Big( \sum_{\mathbf{z}_a \setminus z_i} \prod_{j \in \mathcal{N}(a) \setminus i} n_{j \to a}(z_j) \ln f_a(\mathbf{z}_a) \Big), \quad a \in \mathcal{A}_{\mathrm{MF}},$$
$$n_{i \to a}(z_i) = \omega_i \prod_{c \in \mathcal{A}_{\mathrm{BP}} \cap \mathcal{N}(i) \setminus a} m^{\mathrm{BP}}_{c \to i}(z_i) \prod_{c \in \mathcal{A}_{\mathrm{MF}} \cap \mathcal{N}(i)} m^{\mathrm{MF}}_{c \to i}(z_i) \qquad (4)$$
where $\omega_i$ and $\omega_a$ are constants that ensure normalized beliefs.
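For a single discrete variable with one BP-part neighbor and one MF-part neighbor, the belief combination in (3) reduces to a normalized product of the two incoming messages. The following toy numbers are purely illustrative:

```python
import numpy as np

# Toy binary variable x with two neighboring factors: f_bp assigned to the
# BP part and f_mf assigned to the MF part (values are illustrative).
f_bp = np.array([0.7, 0.3])
f_mf = np.array([0.4, 0.6])

# BP message: marginalization of the factor; here the factor depends only
# on x, so the message is the (normalized) factor itself.
m_bp = f_bp / f_bp.sum()

# MF message: exp of the expected log-factor; with no other neighbors the
# expectation is trivial, giving exp(ln f_mf) = f_mf.
m_mf = np.exp(np.log(f_mf))

# Belief: normalized product of all incoming BP and MF messages, cf. (3).
belief = m_bp * m_mf
belief /= belief.sum()
```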
IV Distributed Inference Algorithm
In this section, we state a probabilistic formulation of cooperative receiver processing and use the combined BPMF framework to obtain the message updates in the corresponding factor graph; finally, we define a parametric iterative algorithm for distributed receiver processing.
IV-A Probabilistic system model
The probabilistic system model is obtained by factorizing the joint pdf of all unknown variables in the signal model. Collecting the unknown variables in the vector , we have:
(5) 
To include in the probabilistic model the ability of the different receivers to exchange information of a certain type, we define an augmented pdf. Depending on the type of shared information, several cooperative strategies can be devised: the receivers could exchange their current local knowledge about the modulated data symbols , the coded and interleaved bits , or the information bits . We focus on the case in which the receivers share information on (the other alternatives can be implemented with straightforward modifications to the model presented in this section). To construct the augmented pdf for this cooperation scenario, we replace each vector variable and with “alias” variables and , , which are constrained to be equal to the corresponding original variable. Keeping in mind that receiver is interested in decoding message , the factorization of the augmented pdf reads
(6) 
where denotes the vector of all unknown variables in (6), including the alias variables. Next, we define the factors in (6) and group them into sets. For all , the factors
incorporate the observation vector and they form the set ; the factors are the prior pdfs of the parameters and they form the set ; the factors , , represent the prior pdfs of the vectors and they form the set ; denoting by the subvector of containing the bits mapped on and by the mapping function, for all , the factors
account for the modulation mapping and they form the set ; the factors stand for the coding and interleaving operations performed at transmitter and they form the set ; the factors
are the uniform prior probability mass functions of the information bits and they form the set
; finally, for all , the factors
(7)
constrain the alias variables , to be equal, and they form the set . Note that, due to these additional constraints, marginalizing (6) over all alias variables , leads to the original probabilistic model (5).
The factorization in (6) can be visualized in a factor graph, which is partially depicted in Fig. 1. The graphs corresponding to the channel codes and interleavers are not given explicitly, their structures being captured by . We refer to the subgraph containing the factor nodes , , , , , and the variable nodes connected to them as “receiver ”. The factor nodes and model the cooperative link between receivers and .
We can now recast the problem of cooperative receiver processing as an inference problem on the augmented probabilistic model (6): receiver needs to infer the beliefs of the information bits in using the observation vector and prior knowledge, i.e., the pilot symbols of all users and their set of indices (since the pseudo-random pilot sequences can be generated deterministically from information available to all receivers, each receiver can reconstruct all pilot symbols without exchanging them), the channel statistics, the modulation mappings of all users, the structure of the channel code and interleaver of user , and the external information provided by the other receivers. The inference problem is solved by applying the method described in Section III, which leads to iteratively passing messages in the factor graph. We can control the communication overhead between receivers by adjusting the rate at which messages are passed through the nodes and .
IV-B Message computations
To make the connection with the generic model of Section III, we define and to be the sets of all factors and variables, respectively, introduced in the previous subsection (with a slight abuse of notation, from this point on we use the names of functions and variables as indices in the sets and , respectively). We choose to split into the following two sets, which yield the “MF part” and the “BP part”:
(8) 
In the following, we use (4) to derive the messages in our setup, focusing on their final expressions. More detailed message computations using the combined BP-MF method can be found in [7] and [9] for non-cooperative scenarios.
First, for all we define the statistics
for and we set and , for . We also define , , and we denote by the th entry of ; the defined quantities are the parameters determining the corresponding beliefs, because is in the MF part and therefore the beliefs are equal to the “” messages (see (3), (4)).
Channel estimation: Using (4), we obtain the messages
(9) 
for all , where is a diagonal covariance matrix and
We have ; so, using (4), we obtain
(10) 
, with
Noise precision estimation: Using (4), we obtain
(11) 
, with and
We select the conjugate prior pdf
, . Using (4), we obtain
(12)
Setting the prior pdfs to be noninformative, i.e., , we obtain the estimate , .
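A noise-precision update of this kind, with a conjugate Gamma prior, can be sketched as follows; for illustration it assumes the signal part has been perfectly cancelled, so the residual is pure noise, and all names and values are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(2)
M = 256
true_gamma = 4.0   # true noise precision (noise variance 0.25)

# Residuals of the form y - sum_k h_k .* x_k; here idealized as pure
# circularly symmetric complex Gaussian noise with precision true_gamma.
noise = (rng.standard_normal(M) + 1j * rng.standard_normal(M)) * np.sqrt(0.5 / true_gamma)
resid_power = np.sum(np.abs(noise) ** 2)

# Conjugate Gamma(a0, b0) prior on the precision: the update adds the number
# of observations to the shape and the expected residual power to the rate.
a0, b0 = 1e-6, 1e-6            # (nearly) non-informative prior
a_post = a0 + M
b_post = b0 + resid_power
gamma_hat = a_post / b_post     # posterior mean used as point estimate
```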
Symbol detection: Using (4), we obtain
(13) 
with
Assume that in the BP part of the graph we have obtained
(14) 
where is the extrinsic value of for . According to (4), the discrete messages (APP values)
(15)  
are sent to the MF part, while are sent to the BP part as extrinsic values.
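A minimal BPSK sketch of this split into APP values (towards the MF part) and extrinsic values (towards the BP part); the alphabet, observation, and precision values are illustrative:

```python
import numpy as np

symbols = np.array([1.0, -1.0])    # BPSK alphabet (illustrative)
y, h, gamma = 0.6, 1.0, 2.0        # observation, channel weight, noise precision

# Channel (MF-part) message: Gaussian likelihood of each candidate symbol.
m_chan = np.exp(-gamma * np.abs(y - h * symbols) ** 2)
m_chan /= m_chan.sum()

# Extrinsic message from the decoder (here uninformative).
m_ext_in = np.array([0.5, 0.5])

# APP values: product of channel and extrinsic messages, sent to the MF part.
app = m_chan * m_ext_in
app /= app.sum()

# Extrinsic output: APP with the incoming extrinsic divided out, to the BP part.
ext_out = app / m_ext_in
ext_out /= ext_out.sum()
```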
(De)mapping, decoding, information exchange: These operations are obtained using (4), which, due to (8), reduce to the BP computation rules. Messages from and to binary variable nodes are of the form
, with . Computing is equivalent to MAP demapping, , . The messages , , and represent the input values to the deinterleaving and decoding BP operations, which output and . Due to the equality constraints (7), messages pass transparently through the factor nodes , . Therefore, the following messages are received by receiver from receiver , and :
(16)
(17) 
The messages
(18)  
, , are used in (4) to obtain the soft mapping updates (14).
IV-C Algorithm outline
We define the cooperative processing algorithm by specifying the order in which the messages in Section IV-B are computed and passed in the factor graph. The algorithm consists of three main stages:
IV-C1 Initialization
Receiver obtains initial estimates of its variables. First, estimates of with are obtained for all by using an iterative estimator based on the signals at pilot positions only, similar to the one described in [9, Sec. V.A]. Specifically, we restrict (9), (10), (11) to include only subvectors and submatrices corresponding to pilot indices and we initialize and , . We compute (9) and (10) successively for all , and then (11) and (12); repeat this process times. The initial estimates of are obtained by applying (10) for whole vectors and matrices, with , . Then, we set and , . Estimation of is performed using (11) and (12), followed by symbol detection (13), applied successively for all ; this process is repeated times. Finally, soft demapping and decoding are performed in the BP part, with and initialized to have equal bit weights.
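The pilot-based part of this initialization can be illustrated with a scalar MMSE channel estimate per pilot position; the CN(0,1) prior on each channel weight and the known noise precision are simplifying assumptions for this sketch, not the paper's iterative estimator:

```python
import numpy as np

rng = np.random.default_rng(3)
P = 8              # number of pilot positions (toy size)
gamma = 20.0       # noise precision, assumed known here for illustration

# QPSK pilots, true channel weights, and the received pilot observations.
x_p = np.exp(1j * (np.pi / 4 + np.pi / 2 * rng.integers(0, 4, P)))
h = (rng.standard_normal(P) + 1j * rng.standard_normal(P)) / np.sqrt(2)
y_p = h * x_p + (rng.standard_normal(P) + 1j * rng.standard_normal(P)) * np.sqrt(0.5 / gamma)

# Scalar MMSE estimate per pilot position with a CN(0, 1) prior on each
# weight: posterior precision 1 + gamma*|x|^2, posterior mean below.
post_prec = 1.0 + gamma * np.abs(x_p) ** 2
h_hat = gamma * np.conj(x_p) * y_p / post_prec
```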
IV-C2 Information exchange
IV-C3 Local iteration
Receiver computes (18), followed by (14) and (15), for all . Next, , , are successively estimated using (9) and (10), and is estimated using (12). Then, (13) is successively computed for all , repeating this process times. Finally, soft demapping and decoding are performed in the BP part.
To define the distributed iterative algorithm, we use three parameters: describes the total number of receiver iterations, including the Initialization stage as the first iteration; denotes the number of Information exchange stages; for , the vector with strictly increasing elements contains the iteration indices after which an Information exchange stage takes place. For we set .
Algorithm 1
The steps of the algorithm are:
1) Initialization for all ; set and ;
2) If , go to step 5);
3) If , perform an Information exchange ; ;
4) Local iteration for all ; ; go to step 2);
5) Take hard decisions using the beliefs .
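The control flow of the steps above can be sketched as follows; the parameter names `n_iter` and `exchange_after` are illustrative stand-ins for the quantities defined in the text, and the stage bodies are placeholders:

```python
# Control-flow sketch of Algorithm 1. The Initialization stage counts as the
# first receiver iteration; exchanges happen after the listed iteration indices.
def run(n_iter=5, exchange_after=(1, 3)):
    log = ["init"]            # step 1: Initialization at all receivers
    it = 1
    while it < n_iter:        # step 2: stop when the iteration budget is spent
        if it in exchange_after:
            log.append("exchange")   # step 3: Information exchange stage
        log.append("local")          # step 4: Local iteration at every receiver
        it += 1
    log.append("decide")      # step 5: hard decisions from the final beliefs
    return log
```

With the default parameters, two exchange stages are interleaved with the local iterations, mirroring the schedule controlled by the exchange-index vector.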
V Simulation Results
We consider an OFDM system consisting of links with symmetric channel powers, the same noise level at each receiver, and strong interference, i.e., . The detailed assumptions are listed in Table I. The performance of Algorithm 1 is evaluated through Monte Carlo simulations. The dependence of the BER on the SNR is illustrated in Fig. 2, while the BER convergence is given in Fig. 3. Receiver collaboration provides significantly improved performance compared to a non-cooperative setting (). When , an error floor occurs at BER , but the cooperation scheme with only two exchanges almost achieves the performance of “full” cooperation (); the improvement brought by the second exchange is clearly visible in Fig. 3. All schemes need about – receiver iterations to converge. The benefits of cooperation are also observed in improved estimation of the channel weights and noise precision (results not shown here), which in turn leads to improved detection and decoding, and vice versa.
Parameters of the OFDM system  Value 

Number of users  
Subcarrier spacing  
Number of active subcarriers  
Number of pilot symbols  evenly spaced pilots 
Modulation scheme for data symbols  
Convolutional code (of both users)  
Multipath channel model  3GPP ETU 
Parameters of the algorithm  Value 
Number of receiver iterations  
Number of exchanges  
Exchange indices  
Number of subiterations  , 
VI Conclusions
We proposed a message-passing design of a distributed algorithm for receiver cooperation in interference-limited wireless systems. Capitalizing on a unified inference method that combines BP and the MF approximation, we obtained an iterative algorithm that jointly performs estimation of channel weights and noise powers, detection, and decoding in each receiver, together with information sharing between receivers. Simulation results showed a remarkable improvement compared to a non-cooperative system, even with only 1–2 exchanges between receivers; as expected, a tradeoff between performance and the amount of shared information could be observed.
In general, our approach provides several degrees of freedom in the design of distributed algorithms, such as the type of shared information and the parameters of the algorithm (number of receiver iterations, rate and schedule of information exchange). The proposed approach can be extended to other cooperation setups and it can accommodate the exchange of quantized values – the quantization resolution thus becoming another implementation choice – by quantizing the parameters of the messages passed between the receivers.
References
 [1] D. Gesbert, S. Hanly, H. Huang, S. Shamai Shitz, O. Simeone, and W. Yu, “Multicell MIMO cooperative networks: A new look at interference,” IEEE J. Sel. Areas Comm., vol. 28, no. 9, pp. 1380–1408, Dec. 2010.

[2] M. J. Wainwright and M. I. Jordan, “Graphical models, exponential families, and variational inference,” Foundations and Trends in Machine Learning, vol. 1, pp. 1–305, 2008.
 [3] A. Grant, S. Hanly, J. Evans, and R. Müller, “Distributed decoding for Wyner cellular systems,” in Proc. ACTW, 2004.
 [4] E. Aktas, J. Evans, and S. Hanly, “Distributed decoding in a cellular multiple-access channel,” IEEE Trans. Wir. Comm., vol. 7, no. 1, pp. 241–250, Jan. 2008.
 [5] S. Khattak, W. Rave, and G. Fettweis, “Distributed iterative multiuser detection through base station cooperation,” EURASIP J. Wirel. Commun. Netw., vol. 2008, pp. 17:1–17:15, Jan. 2008.
 [6] T. Mayer, H. Jenkac, and J. Hagenauer, “Turbo base-station cooperation for inter-cell interference cancellation,” in Proc. IEEE ICC, 2006.
 [7] E. Riegler, G. E. Kirkelund, C. N. Manchón, M.-A. Badiu, and B. H. Fleury, “Merging belief propagation and the mean field approximation: A free energy approach,” submitted to IEEE Trans. Inform. Theory, 2012, arXiv:1112.0467v2 [cs.IT].
 [8] F. Kschischang, B. Frey, and H.-A. Loeliger, “Factor graphs and the sum-product algorithm,” IEEE Trans. Inform. Theory, vol. 47, no. 2, pp. 498–519, Feb. 2001.
 [9] C. N. Manchón, G. E. Kirkelund, E. Riegler, L. Christensen, and B. H. Fleury, “Receiver architectures for MIMO-OFDM based on a combined VMP-SP algorithm,” submitted to IEEE Trans. Inform. Theory, 2011, arXiv:1111.5848 [stat.ML].