Noisy Private Information Retrieval: On Separability of Channel Coding and Information Retrieval

07/16/2018
by Karim Banawan, et al.

We consider the problem of noisy private information retrieval (NPIR) from N non-communicating databases, each storing the same set of M messages. In this model, the answer strings are not returned through noiseless bit pipes, but rather through noisy memoryless channels. We aim to characterize the PIR capacity for this model as a function of the statistical information measures of the noisy channels, such as entropy and mutual information. We derive a general upper bound for the retrieval rate in the form of a max-min optimization. We use the achievable schemes for the PIR problem under asymmetric traffic constraints together with random coding arguments to derive a general lower bound for the retrieval rate. The upper and lower bounds match for M=2 and M=3, for any N and any noisy channel. The results imply that separation between channel coding and retrieval is optimal except for adapting the traffic ratio among the databases; we refer to this as almost separation. Next, we consider the private information retrieval problem from multiple access channels (MAC-PIR). In MAC-PIR, the database responses reach the user through a multiple access channel (MAC) that mixes the responses together in a stochastic way. We show that for the additive MAC and the conjunction/disjunction MAC, channel coding and the retrieval scheme are inseparable, unlike in NPIR. We show that the retrieval scheme depends on the properties of the MAC, in particular on its linearity. For both cases, we provide schemes that achieve the full capacity without any loss due to the privacy constraint, which implies that the user can exploit the nature of the channel to improve privacy. Finally, by determining the capacity of the selection channel, we show that the full unconstrained capacity is not always attainable.
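The claim that the user can exploit the channel itself for privacy is easiest to picture for a binary additive (XOR) MAC. The Python sketch below illustrates the flavor of an XOR-based retrieval idea under simplifying assumptions (replicated bit-string databases, a noiseless XOR MAC, one bit retrieved); it is an illustrative example, not necessarily the paper's exact construction. Each database's query is marginally uniform, so no individual database learns which bit is requested, yet the XOR of the answers delivered by the channel is exactly the desired bit.

```python
import random

def xor_mac_pir_bit(databases, desired_index, total_bits):
    """Hypothetical illustration: privately retrieve one bit when the N
    database answers are combined by a binary additive (XOR) MAC.

    `databases` is a list of N >= 2 identical replicated bit lists.
    """
    N = len(databases)
    # N-1 queries are independent, uniformly random bit vectors.
    queries = [[random.randint(0, 1) for _ in range(total_bits)]
               for _ in range(N - 1)]
    # The last query is chosen so that the XOR of all N queries equals the
    # unit vector with a 1 at the desired position; it is still marginally uniform.
    last = [0] * total_bits
    last[desired_index] = 1
    for q in queries:
        last = [a ^ b for a, b in zip(last, q)]
    queries.append(last)

    # Each database answers with the inner product (mod 2) of its contents
    # and its own query; a single query reveals nothing about desired_index.
    answers = [sum(w & q for w, q in zip(db, query)) % 2
               for db, query in zip(databases, queries)]

    # The additive MAC XORs the answers together; the user observes only this
    # sum, which equals the desired bit.
    received = 0
    for a in answers:
        received ^= a
    return received

# Example: N = 3 replicated databases storing the same 8 bits; retrieve bit 5.
contents = [1, 0, 1, 1, 0, 1, 0, 0]
print(xor_mac_pir_bit([contents] * 3, desired_index=5, total_bits=len(contents)))  # -> 1
```

In this toy setting the channel performs the combining "for free": the user observes a single channel output per desired bit, which is consistent with the abstract's statement that such schemes can achieve the full capacity with no loss due to the privacy constraint.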
