Serverless Platforms on the Edge: A Performance Analysis
The exponential growth of the Internet of Things (IoT) has given rise to a new wave of edge computing, driven by the need to process data at the edge, closer to where it is produced, and to move away from a cloud-centric architecture. This shift offers the opportunity to decrease latency, address data privacy concerns, and reduce public cloud costs. The serverless computing model provides a potential solution: its event-driven architecture removes the need for always-running servers and converts backend services to a usage-based model. This makes it an attractive prospect in edge computing environments with varying workloads and limited resources. Furthermore, deploying it at the edge of the network promises reduced latency for the edge devices communicating with it and eliminates the need to manage the underlying infrastructure. In this book chapter, we first introduce the novel concept of serverless edge computing and then analyze the performance of multiple serverless platforms, namely OpenFaaS, AWS Greengrass, and Apache OpenWhisk, when set up on single-board computers (SBCs) at the edge, and compare them with public cloud serverless offerings, namely AWS Lambda and Azure Functions, to assess the suitability of serverless architectures at the network edge. The edge-hosted platforms are deployed on a cluster of Raspberry Pis, and we evaluate their performance by simulating different types of edge workloads. The evaluation results show that OpenFaaS achieves the lowest response time on the SBC edge infrastructure, while the cloud serverless offerings are the most reliable, with the highest success rate.
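To make the evaluation metrics concrete, the following is a minimal, hypothetical sketch of the kind of load generator that could measure the two quantities reported above, per-request response time and success rate, against a deployed function endpoint (for example, an OpenFaaS gateway running on a Raspberry Pi cluster). The endpoint URL, request count, and payload are illustrative assumptions and do not reflect the authors' actual measurement harness.

```python
import time
import urllib.request
import urllib.error

# Assumed endpoint of a function exposed by an edge serverless gateway.
FUNCTION_URL = "http://192.168.1.10:8080/function/resize-image"
REQUESTS = 100  # number of invocations to simulate

latencies = []
successes = 0

for _ in range(REQUESTS):
    start = time.perf_counter()
    try:
        # Invoke the function with a small payload and a generous timeout.
        with urllib.request.urlopen(FUNCTION_URL, data=b"sample payload", timeout=30) as resp:
            if resp.status == 200:
                successes += 1
    except (urllib.error.URLError, TimeoutError):
        pass  # a failed or timed-out invocation counts against the success rate
    latencies.append(time.perf_counter() - start)

# Report the two metrics used to compare platforms: mean response time and success rate.
print(f"mean response time: {sum(latencies) / len(latencies) * 1000:.1f} ms")
print(f"success rate: {successes / REQUESTS:.1%}")
```

In practice, such a script would be parameterized by workload type (e.g., CPU-bound versus I/O-bound payloads) and run against each platform's endpoint under identical conditions so the results are comparable.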