Edge server placement with capacitated location allocation
Edge computing in the Internet of Things brings applications and content closer to the users by introducing an additional computational layer in the network infrastructure, between the cloud and the resource-constrained, data-producing devices and user equipment. In this way, the opportunistic nature of the operational environment is addressed by providing computational power in locations with low latency and high bandwidth. However, location-aware deployment of edge computing infrastructure requires a careful placement scheme for edge servers. To provide the best possible Quality of Service for user applications, the proximity of servers to users needs to be optimized. Moreover, the deployment faces practical constraints in budget, server hardware requirements, and online load balancing between servers. To address these challenges, we formulate edge server placement as a capacitated location-allocation problem that minimizes the distance between servers and the access points of a real city-wide Wi-Fi network deployment. In our algorithm, we utilize both upper and lower server capacity constraints for load balancing. Furthermore, we enable sharing of workload between servers to facilitate deployments with low-capacity servers. The performance of the algorithm is demonstrated in placement scenarios with different parameters on a real-world data set, exemplified by high-capacity servers for edge computing and low-capacity servers for fog computing. The data set includes both dense deployment of access points in central areas and sparse deployment in suburban areas within the same network infrastructure. In comparison, we show that previous approaches do not sufficiently address such deployments. The presented algorithm provides optimal placements that minimize distances and balance workloads, with sharing, while respecting the capacity constraints.
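To make the formulation concrete, the following is a minimal sketch of a capacitated location-allocation model in the spirit described above: place K servers at candidate access-point sites, minimize workload-weighted distance, enforce lower and upper capacity bounds, and allow an access point's workload to be shared across servers via fractional assignment. This is not the paper's exact model; all data (coordinates, workloads, K, L, U) and the use of the PuLP solver library are illustrative assumptions.

```python
# Hedged sketch of a capacitated location-allocation MILP with workload sharing.
# All numbers below are hypothetical, not values from the paper.
import math
import pulp

# Hypothetical access points: (x, y, workload)
aps = [(0, 0, 4), (1, 0, 3), (5, 5, 6), (6, 5, 2), (10, 1, 5)]
sites = [(x, y) for x, y, _ in aps]        # candidate server sites = AP locations
K, L, U = 2, 5.0, 12.0                     # number of servers, lower/upper capacity

dist = [[math.dist((ax, ay), s) for s in sites] for ax, ay, _ in aps]

prob = pulp.LpProblem("edge_server_placement", pulp.LpMinimize)
y = [pulp.LpVariable(f"y_{j}", cat="Binary") for j in range(len(sites))]
x = [[pulp.LpVariable(f"x_{i}_{j}", lowBound=0, upBound=1)   # fraction of AP i served by site j
      for j in range(len(sites))] for i in range(len(aps))]

# Objective: total workload-weighted distance between APs and their servers
prob += pulp.lpSum(aps[i][2] * dist[i][j] * x[i][j]
                   for i in range(len(aps)) for j in range(len(sites)))

for i in range(len(aps)):                  # each AP's workload fully assigned (possibly shared)
    prob += pulp.lpSum(x[i]) == 1
for j in range(len(sites)):                # capacity bounds tied to the opening decision
    load = pulp.lpSum(aps[i][2] * x[i][j] for i in range(len(aps)))
    prob += load <= U * y[j]
    prob += load >= L * y[j]
prob += pulp.lpSum(y) == K                 # place exactly K servers

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print("open sites:", [j for j in range(len(sites)) if y[j].value() > 0.5])
```

The continuous assignment variables are what permit workload sharing between servers; restricting them to binary values would recover a single-assignment variant without sharing.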