The network computing chain faces a changing reality over the coming years. According to the latest version of the Ericsson Mobility Report, a surge in consumer device capabilities and data-intensive content will see global data traffic rise fourfold by 2025. Industries, entering the fray for the first time, will place ultra-low-latency and high-privacy demands on the network through time-critical and massive-data use cases such as autonomous transport, remote robotics, smart factories and cross reality.
In all of these cases, the place at which the data is processed – whether in a centralized data center or at the edge of the cloud – will have an immense impact on both latency and how the massive amounts of data are managed. Not only do these factors impact enterprise and network efficiency, but they will soon be critical in safeguarding human life.
For Azimeh Sefidcon, Ericsson’s Research Director for Cloud Systems and Platforms, that’s why edge computing represents a critical technological step forward, not only for 5G, but also for 6G and beyond: “The new demands of today’s industrial use cases have led us into this new paradigm of computing. Edge computing allows us to reduce latency, while also providing improved flexibility, privacy and efficiency in managing the network.”
Future use cases such as the Internet of Senses and autonomous transport systems illustrate why edge computing and a fully integrated network computing chain will be so critical moving forward.
“For use cases which demand ultra-low latency and high data privacy, such as manufacturing, autonomous vehicles and haptic response, only computing which is embedded within the network can provide the necessary real-time translation and processing of the data,” Azimeh says. “It also enables the enterprise to keep the data in-house, which increases data protection.”
The move to edge is also part of a broader evolution of 5G toward more niche, tailor-made business models. As the network platform moves deeper into industries, service providers have an opportunity to tailor a service model unique to the enterprise profile, use case and vertical being addressed. Ericsson solutions including NFVi, Edge Gravity, Dynamic Orchestration and network slicing provide opportunities to tailor such models and be first to tomorrow’s business.
“We have overcome the technical challenges. What now makes this area interesting is the many different variants of business model that are possible,” Azimeh says. “The question is, which role does each provider want to play in this value chain? And which partnerships will they choose to make that happen?”
The cloud ecosystem
One challenge in today’s cloud ecosystem is fragmentation. Closing this computing gap – from data center to device – will depend on the collaboration of network service providers and cloud ecosystem actors, such as hyperscale cloud providers, operational technology companies and system integrators.
A similar collaboration is already evident with today’s LTE use cases. We can make a phone call or access a server on the other side of the world and barely notice that the traffic travels through so many different networks.
As use cases become increasingly time-critical and demand increased privacy, this global connected network fabric must evolve to provide seamless support for applications. The introduction of cloud-native design in telecom is the first fundamental step to achieving this.
To close this gap and, importantly, maximize the value for service providers, Ericsson is already engaging other partners and players in the field.
“For more than ten years, we have been working hard to bring these pieces together,” Azimeh says. “When it comes to developing research for use cases, we never develop something in isolation. Instead, we really engage with the industry so that what we develop is a solution with longevity and value.”
Beyond the edge?
Azimeh believes that we are ultimately moving toward something much more integrated, which she refers to as the network compute fabric. This form of computing, taking place within the fabric of the network itself, could not only satisfy latency demands but also serve as the execution environment for the application.
“It offers benefits from the perspective of both the user and app developer,” Azimeh says. “It would allow us to deploy applications in one environment and have them be portable to any other environment without the need for developers to adapt the application. If, for example, the processing power or data storage in an autonomous car reaches its limit, there will be no need for a change in the application as that processing can seamlessly happen in the network.”
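To make the portability idea concrete, the minimal sketch below imagines a placement decision of the kind a network compute fabric might make: the application submits the same task regardless of where it runs, and an illustrative runtime chooses between the device and a nearby edge site based on local capacity and a latency deadline. All names, classes and thresholds here are assumptions for illustration only and do not reflect any Ericsson product or API.

```python
# Hypothetical illustration of "write once, run anywhere in the compute fabric":
# the application calls one interface, and an illustrative runtime decides whether
# to execute locally or offload to an edge site. Not a real Ericsson API.

from dataclasses import dataclass
from typing import Callable


@dataclass
class Placement:
    location: str   # "device" or "edge"
    reason: str


class ComputeFabric:
    def __init__(self, device_cpu_free: float, edge_latency_ms: float):
        self.device_cpu_free = device_cpu_free  # fraction of local CPU available
        self.edge_latency_ms = edge_latency_ms  # round-trip time to nearest edge site

    def place(self, cpu_demand: float, deadline_ms: float) -> Placement:
        # Run locally if the device has headroom; otherwise offload,
        # but only if the edge round trip still meets the deadline.
        if cpu_demand <= self.device_cpu_free:
            return Placement("device", "enough local capacity")
        if self.edge_latency_ms < deadline_ms:
            return Placement("edge", "local capacity exceeded, edge meets deadline")
        return Placement("device", "edge too slow for deadline, run degraded locally")

    def run(self, task: Callable[[], object], cpu_demand: float, deadline_ms: float) -> object:
        placement = self.place(cpu_demand, deadline_ms)
        # The application code (the task) is identical either way;
        # only the execution location differs.
        print(f"running on {placement.location}: {placement.reason}")
        return task()


# Example: an object-detection step in an autonomous vehicle with a 20 ms deadline.
fabric = ComputeFabric(device_cpu_free=0.1, edge_latency_ms=8.0)
result = fabric.run(lambda: "detections", cpu_demand=0.6, deadline_ms=20.0)
```

The point of the sketch is that the task itself never changes; only the runtime's placement decision does, which is the property the quote above describes for applications running across device, edge and network.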
No matter the use cases of the future, it’s clear that the world of today and tomorrow belongs firmly to the edge of the network. Distributed cloud computing is paving the way for the future of network communications and, rather than waiting for 5G, service providers should already be looking to build tomorrow's business models today.
Explore the latest trends and insights within edge computing and the industrial cloud.
Find out more about Ericsson’s use cases and solutions across the distributed cloud.