- Could digital service providers (DSPs) benefit from the rise in enthusiasm for serverless cloud computing?
- Offering cloud-native serverless services, apps and functions from the telco edge might help network operators as they develop enterprise service strategies and make use of their high-speed, low-latency access networks
First there was network functions virtualisation (NFV), which detached network functions such as load balancing, routing, firewalling or encryption from hardwired appliances and jammed them onto increasingly powerful server hardware. It worked, but only just, and NFV consequently reached a certain level of adoption in telco-land before the initial enthusiasm dimmed and the industry pinned its hopes instead on cloud native. That approach fixed the major NFV drawbacks by adding orchestration, microservices, containers and continuous delivery, offering a well-rounded and flexible virtualisation framework for the further development of network services and cloud applications generally. It also marked the moment when it began to look likely that the hyperscalers would eventually impinge on the telecoms market.
One of those further developments has been the idea of serverless computing, which can be seen as a logical progression for the cloud-native brigade. Serverless essentially ‘abstracts away’ the lower levels of the stack, in theory freeing the developer to concentrate on the creative, value-added part of an application rather than spend time organising the underlying infrastructure to handle an application’s data storage, scheduling, scaling and other requirements. In this sense, a serverless environment doesn’t actually do away with servers and all their tasks, but rather assigns the low-level work to a third party (usually the cloud provider), which supplies the storage and processing required. Little wonder, then, that various studies show the serverless approach appeals to application developers, but it is also chiming with corporate users more broadly for its potential cost and flexibility advantages, especially in relation to edge computing developments.
According to IDC’s IaaSView buyer survey, some 25% of cloud infrastructure as a service (IaaS) buyers plan to adopt serverless functions in the next year, while a recent report from Datadog claims that serverless computing is “entering the mainstream”, with more than half of all organisations already using serverless via AWS Lambda, Azure Functions or Google Cloud Functions. Of particular interest for telcos is the high serverless adoption rate clocked for edge developers, which should be encouraging for operators looking to explore business opportunities aligned to their edge strategies: more relevant applications will drive enterprise interest in what the telcos have to offer.
The Cloud Native Computing Foundation (CNCF), in its early 2021 report on The State of Cloud Native Development, which minutely charts the software usage trends of the developer community, calculates that some 48% of edge developers currently go serverless compared with only 33% of all back-end developers, and it claims that the “lightweight nature of a serverless architecture is particularly appealing to edge developers since they don’t have to manage the underlying infrastructure.”
However, the report also notes that serverless usage has grown at a slower pace than Kubernetes adoption and recently even experienced a small, unexpected drop, the cause of which is not fully understood, it admits. That drop may be the result of shifting developer allegiances within a serverless category that continues to grow, just not as quickly as Kubernetes.
AWS Lambda, the first cloud serverless product, continues to lead the segment, with 53% of serverless developers using it. More recently, Google’s serverless compute platform with built-in Kubernetes containers has made notable advances, says the CNCF, “gaining 8 percentage points in the last six months, whereas Google Cloud Functions (its original serverless offering) experienced a drop of 3%,” presumably partly as a result of cannibalisation. In any case, the CNCF claims that Google’s serverless-plus-Kubernetes offering combines “the best of both worlds”.
Serverless advantages
With serverless, the cloud provider can enable all the necessary provisioning, scheduling, scaling and back-end operations tasks from within the cloud and, as a result, developers are freed up to get on with front-end development – all the grunt work is taken care of. Needless to say, users pay for the processing and storage tasks assigned by the serverless application, but the point is that usage is minutely metered and users pay only what they use. Among other things, this provides an ongoing incentive to develop applications that use fewer datacentre resources (good for those sustainability goals), but mostly, serverless helps keep an edge application ‘light’ in terms of its ongoing complexity.
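As a rough illustration of how that fine-grained metering translates into a bill, the Python sketch below estimates a monthly charge from invocation count, execution time and memory size. The per-GB-second and per-request rates are placeholder figures, not any provider’s actual pricing, so treat it as a back-of-the-envelope model rather than a quote.

```python
# Back-of-the-envelope model of pay-per-use serverless billing.
# NOTE: the rates below are illustrative placeholders, not real provider pricing.
PRICE_PER_GB_SECOND = 0.0000167        # hypothetical compute rate
PRICE_PER_MILLION_REQUESTS = 0.20      # hypothetical per-request rate

def estimated_monthly_cost(invocations: int, avg_duration_ms: float, memory_mb: int) -> float:
    """Estimate a month's bill from how often, how long and how big each function run is."""
    gb_seconds = invocations * (avg_duration_ms / 1000) * (memory_mb / 1024)
    compute_charge = gb_seconds * PRICE_PER_GB_SECOND
    request_charge = (invocations / 1_000_000) * PRICE_PER_MILLION_REQUESTS
    return compute_charge + request_charge

# Example: 3 million invocations a month, 120 ms each, at 256 MB of memory
print(f"Estimated bill: ${estimated_monthly_cost(3_000_000, 120, 256):.2f}")
```

Under a model like this, a function that runs rarely or finishes quickly costs next to nothing, which is exactly the incentive to keep edge applications light that the article describes.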
An added attraction for developers is the ability to use any of the popular programming languages, so there is no requirement to learn what Anshu Agarwal, vice president and general manager of serverless at DigitalOcean, describes as “complex infrastructure-oriented concepts, such as containers and Kubernetes”. That brings allied advantages along for the ride, such as shorter time to market and, because functions only run in response to events, lower associated costs. Ditto with the scaling of infrastructure components, which is handled automatically by the serverless platform provider.
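To make the ‘functions only run in response to events’ point concrete, here is a minimal sketch of what such a function can look like in Python, loosely following the handler convention used by AWS Lambda’s Python runtime. The event fields are hypothetical, and the point is that the platform, not the developer, decides when, where and how often the code runs.

```python
import json

# Minimal event-driven function sketch (Lambda-style handler signature).
# Everything inside the function is the developer's code; provisioning,
# scheduling and scaling are left to the serverless platform.
def handler(event, context):
    # 'event' carries the triggering payload (an HTTP request, a queue message,
    # a file-upload notification, etc.); 'context' carries runtime metadata.
    name = event.get("name", "edge")   # hypothetical field, for illustration only
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello from the telco edge, {name}"}),
    }

# Quick local check of the handler logic (the platform would normally invoke it)
print(handler({"name": "developer"}, None))
```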
The serverless landscape
As things currently stand, serverless use has grown significantly, with Datadog reporting that serverless options have been adopted by more than half of users on all three of the major hyperscale platforms. Each platform has a roster of serverless options:
- AWS: AWS Lambda, AWS App Runner, ECS Fargate, EKS Fargate
- Azure: Azure Functions, AKS running on Azure Container Instances
- Google Cloud: Google Cloud Functions, Google App Engine, Google Cloud Run
Such is the pace of technical change – and sentiment – that enthusiasm for a particular networking scheme or approach can peak and drop back faster than a digital currency during a financial crisis. This volatility is greatly assisted by the technology itself. Instead of compiling carefully worded survey questionnaires probing companies’ future plans and current deployments, some research companies, such as Datadog, prefer to rely on network telemetry. That produces data straight from the source about who’s adopted what and how much they’re using it. There’s no room to hide or obfuscate, and it means that tiny changes in adoption become visible almost immediately.