SUSECon 2021: Vendors Aim to Enhance Kubernetes Edge Computing

Linux distribution publisher SUSE is working closely with its partner ecosystem to better support Kubernetes edge computing.

Callum Cyrus

May 25, 2021


With noncloud networks projected to process the lion’s share of global data by 2025, developers have turned to container orchestration tools such as Kubernetes, which is poised to accelerate edge software deployment.

Containers for edge development were a focus at the SUSECON conference; they let developers create environments in which workloads can be moved from central clouds or data centers to the edge without conversion.

Containers are self-contained runtime environments that enable developers to minimize the operating system variations that would otherwise stall code performance. They include a piece of software’s dependencies, libraries and binaries, as well as the configuration files that govern how the software executes. Containers are increasingly important as compute activities move from centralized cloud and data center environments to the edge.

A major focus of this year’s virtual event was aligning cloud and edge environments so that functions can run natively across both.

While much compute-intensive processing has taken place in the cloud over the past decade, users’ demands for mobility and compute-heavy workloads such as video streaming are increasingly pulling that processing toward the edge.

As a result, the edge is expected to account for 75% of data processing by 2025, according to Gartner estimates.

By packaging all components in a container, developers can move software from one environment to another (from testing to production, for example) and keep performance stable across the various operating systems they might work from.

That might mean running a machine learning model on a developer’s laptop, as a specialized instance on cloud servers, or in remotely hosted virtual machines (VMs).
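As a rough illustration of that portability, the sketch below uses the Docker SDK for Python to start the same container image on a local machine and then on a remote host; the image name and host address are hypothetical placeholders, not details from SUSE or the conference.

    # A minimal sketch using the Docker SDK for Python (docker-py).
    # The image name and remote host below are hypothetical placeholders.
    import docker

    IMAGE = "example.com/ml-inference:1.0"  # hypothetical containerized ML model

    # Run locally: from_env() talks to whatever Docker daemon the local
    # environment points at (a developer laptop, a cloud VM, an edge box).
    local = docker.from_env()
    local.containers.run(IMAGE, detach=True, name="inference-local")

    # Run the identical image on a remote machine by pointing a client at its
    # Docker daemon; the image and its bundled dependencies travel unchanged.
    remote = docker.DockerClient(base_url="ssh://ops@edge-gateway.example.com")
    remote.containers.run(IMAGE, detach=True, name="inference-edge")

The point of the sketch is simply that the software and its dependencies move as one unit; only the endpoint the client talks to changes.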

Over the past few years, open source container orchestration platforms such as Kubernetes have proliferated. They are popular with developers who use the major cloud computing platforms, providing greater interoperability between compute instances hosted by the likes of Amazon Web Services or IBM.

But while developers appreciate the flexibility of Kubernetes, its utility for IoT networks was historically limited by meager compute resources in lightweight endpoints such as sensors, as well as the security risks of deploying runtime environments in distributed network settings.

As computing and analytics functions have moved from the cloud closer to the edge (and, correspondingly, the users and devices that need these resources), the IoT industry has begun to embrace microservices that containerize specific software functions across internet-enabled networks.

What Do IoT Users Want From Containers?

One of the key benefits of container usage in IoT networks is scalability. In settings such as industrial factories, the number of IoT sensors will likely increase over time. Building containerization into the architecture could help DevOps teams (combined development and operations groups) debug and refine software as operational technology expands, without extensive disruption to users, as the sketch below suggests.
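A hedged example of what that scalability can look like in practice: using the official Kubernetes Python client, an operations team could add container replicas of a data-processing service as the sensor fleet grows. The deployment name, namespace and replica count here are hypothetical.

    # A minimal sketch using the official Kubernetes Python client.
    # Deployment name, namespace and replica count are hypothetical.
    from kubernetes import client, config

    config.load_kube_config()          # use the operator's local kubeconfig
    apps = client.AppsV1Api()

    # Scale the (hypothetical) sensor-ingest service to handle a larger fleet;
    # Kubernetes schedules the additional container replicas across available nodes.
    apps.patch_namespaced_deployment_scale(
        name="sensor-ingest",
        namespace="factory-edge",
        body={"spec": {"replicas": 12}},
    )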

However, containerization also brings its share of roadblocks at the edge compared with the cloud, so careful planning is needed. Larger IoT networks contain a heterogeneous set of devices and operating systems, all with varying capabilities.

While containerization can allow software functions to be quickly repurposed from one operating environment to another, IoT device specifications vary far more widely. Some devices are too resource-constrained to adequately support open source platforms such as Kubernetes, which can increase the complexity of implementation.

In addition, distributed IoT networks are likely to share a host’s local area network setup, which may create network instability as containerized microservices are spread across a multitude of devices. Latency then becomes a concern, potentially limiting one of the benefits of edge computing from an IoT perspective.
