# Sampling an Edge Uniformly in Sublinear Time

The area of sublinear algorithms has recently received a lot of attention. In this setting, one has to choose a specific access model for the input, as the algorithm does not have time to pre-process, or even to read, the whole input. A fundamental question has remained open about the relationship between the two common access models for graphs – with and without the "random edge" query – namely, whether it is possible to sample an edge uniformly at random in the model without random edge queries. In this paper, we answer this question positively. Specifically, we give an algorithm for this problem that runs in expected time O((n/√m) log n). This is only a logarithmic factor slower than the lower bound given in [5]. Our algorithm builds on the algorithm from [7], which we analyze more carefully, leading to better bounds for general graphs. We also show how to sample edges ϵ-close to uniform in expected time O((n/√m) log(1/ϵ)), improving upon the best previously known algorithm. We further note that sampling edges from a distribution sufficiently close to uniform suffices to simulate sublinear algorithms that use random edge queries, while decreasing the success probability of the algorithm only by o(1). This allows for a much simpler algorithm for emulating random edge queries.
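To make the problem concrete, the following is a minimal rejection-sampling sketch of uniform edge sampling in the adjacency-list model. It is not the paper's algorithm (which is faster); it only illustrates why uniform edge sampling is possible without random edge queries, assuming a known upper bound `d_max` on the maximum degree and an `adj` dictionary standing in for degree and neighbor queries.

```python
import random

def sample_edge_uniform(adj, d_max):
    """Return a uniformly random oriented edge (u, v) of the graph.

    adj   : dict mapping each vertex to a list of its neighbors
            (stands in for degree and i-th neighbor queries).
    d_max : any upper bound on the maximum degree.

    Each oriented edge (u, v) is output with probability
    (1/n) * (d(u)/d_max) * (1/d(u)) = 1/(n * d_max),
    so the output distribution is exactly uniform. The expected
    number of rounds is n * d_max / (2m), which is why the paper's
    more careful approach is needed for a good running time.
    """
    vertices = list(adj)
    while True:
        u = random.choice(vertices)                 # uniform vertex, prob 1/n
        if random.random() < len(adj[u]) / d_max:   # accept with prob d(u)/d_max
            return (u, random.choice(adj[u]))       # uniform neighbor, prob 1/d(u)
```

Orienting edges in both directions is what removes the bias toward high-degree endpoints: the acceptance step exactly cancels the factor 1/d(u) from the neighbor choice.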
