Containers and Kubernetes are the driving force behind the industry's reinvention of how we build and run applications, fuelling enterprise IT efficiency.

By Daniel Teixera, systems engineer at Pure Storage

A container is a standard unit of software that packages up code and all its dependencies so that an application runs quickly and reliably from one computing environment to another. Containers make it easier to roll out cloud-based applications because they bundle everything needed to run them into manageable packages.

In September 2020, we announced the acquisition of Portworx, the industry’s leading Kubernetes data services platform, for approximately $370 million, so it’s safe to say we recognise the significance of the technology.

Adoption of cloud-native applications is driven by customers' need to deliver products to market quickly. Containers give businesses granularity in application development and speed of iteration, and they effectively decouple application development from the underlying infrastructure, so developers no longer need to factor it into their decisions.

Adoption of containers in the South African market is widespread – from large enterprises to SMEs, businesses see the agility this new paradigm brings and the flexibility to use any cloud-native platform, be it an on-premises private cloud, a public cloud, or a combination thereof (hybrid cloud). Let’s look at key industry developments.

Importance of data centricity

Data is at the heart of tomorrow’s businesses. Leading digital organisations are using a new cloud-native technology stack to process this data into value and insight. Cloud-native applications are specifically designed to operate in a cloud-like manner, whether in the public cloud or on-prem, from day one. They can be deployed and fixed faster and can be moved across different environments easily.

Cloud-native applications are typically made up of microservices and are packaged in containers. This new cloud-native stack includes a new set of applications – apps that analyse streaming data in real time, apps that index massive quantities of data for search, and apps that train machine learning algorithms on increasingly large data sets.

Undoubtedly, this cloud-native revolution is being powered by a combination of containers and Kubernetes.

Containers make it efficient to run disaggregated applications at high degrees of scale and fluidity with minimal overhead, and Kubernetes creates the machine-driven orchestration that can juggle all these application fragments and assemble them into a composite application as necessary.
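To make that orchestration concrete, here is a minimal sketch using the official Kubernetes Python client – the "catalog" name and image are hypothetical – that declares one such application fragment and asks Kubernetes to keep three copies of it running:

```python
# Minimal sketch: declare one application fragment as a Kubernetes Deployment and
# let the orchestrator keep the desired number of replicas running.
# Assumes the official "kubernetes" Python client and a reachable cluster;
# the "catalog" name and image are hypothetical.
from kubernetes import client, config

config.load_kube_config()  # use the local kubeconfig credentials

container = client.V1Container(
    name="catalog",
    image="registry.example.com/shop/catalog:1.4",  # hypothetical image
    ports=[client.V1ContainerPort(container_port=8080)],
)

deployment = client.V1Deployment(
    metadata=client.V1ObjectMeta(name="catalog"),
    spec=client.V1DeploymentSpec(
        replicas=3,  # Kubernetes keeps three copies of this fragment alive
        selector=client.V1LabelSelector(match_labels={"app": "catalog"}),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "catalog"}),
            spec=client.V1PodSpec(containers=[container]),
        ),
    ),
)

client.AppsV1Api().create_namespaced_deployment(namespace="default", body=deployment)
```

The developer only declares the desired state; scheduling, restarts, and scaling of the fragments are left to the orchestrator.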

Container adoption speaks for itself

Adoption rates of this new cloud-native stack have been staggering. According to 451 Research, 95% of new apps are developed in containers. Enterprises are evolving their cloud strategies to be multi-cloud, and containers are also key to this.

Gartner reports that 81% of enterprises are already multi-cloud and work with more than two cloud providers. Gartner also predicts that 85% of all global businesses will use containers in production by 2025, which is a huge rise from just 35% in 2019.

It’s still an early market with huge growth potential, so it’s inherently hard to forecast, but IDC predicts that the commercial market for container infrastructure software alone will top $1.5 billion by 2022, and enterprises are paying attention.

However, from a skills perspective, we still see a shortage of developer and infrastructure skills as the job market pivots towards an ‘infrastructure as code’ posture and away from traditional hardware- and virtualisation-based infrastructure.

Microservices and containers – a perfect match

Put simply, microservices are the individual functions within an application, and form the basis of a new architectural approach to building applications.

Microservices make it easier for IT teams to build and run the applications their users want and need, and to stay ahead of competitors. Many of today’s largest consumer and enterprise applications run on microservices, proving that it’s not just a trend for small organisations, but works for the largest and most complex as well.

Indeed, the larger the organisation is, the more benefits there are to gain from adopting microservices because teams are often spread out with limited direct communication.

When was the last time you got a maintenance notification from your favourite streaming service to let you know you won’t be able to access services? It doesn’t happen.

There’s never a good time to update these services because someone is always binge-watching a new show. The microservices principle says you should break an application into smaller pieces that communicate via APIs, so that each piece can be updated independently of the others.

As a result, if a streaming service needs to update its password-reset functionality, it doesn’t need to kick millions of users offline. This feature is a different microservice that can be updated independently. This results in happy developers and happy users.
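As a rough illustration – a hypothetical, stripped-down password-reset microservice built with nothing but Python's standard library – the whole feature is one small HTTP endpoint that can be redeployed on its own without touching the rest of the application:

```python
# Hypothetical "password reset" microservice: one narrowly scoped function behind an
# HTTP API, independently deployable and updatable. Standard library only.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class PasswordResetHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        if self.path != "/password-reset":
            self.send_error(404)
            return
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length) or b"{}")
        # Real token generation and e-mail delivery would happen here.
        body = json.dumps({"status": "reset-link-sent", "user": payload.get("user")}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # Other services call this endpoint over the network; updating it never
    # requires taking the rest of the application offline.
    HTTPServer(("", 8000), PasswordResetHandler).serve_forever()
```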

Microservices are here to stay and will underpin the applications of tomorrow. In what kind of environment should you run them? Containers are the perfect building block for microservices: they provide a lightweight, consistent environment that can follow the application from the developer’s desktop, through testing, to final deployment.

In addition, containers can run on physical or virtual machines, and they start up in seconds or even milliseconds, which is faster than VMs.

Packaging applications with their dependencies

Traditionally, software packages have included all the code needed to run the application on a particular operating system, such as Windows or Linux.

However, you need more than just application code to run an application – you also need supporting libraries and services. For instance, an application for looking up stock prices might use a library to convert company names to ticker symbols and vice versa. This functionality is generic and not value-added, but it’s still important so that a user can type ‘Apple’ and get the stock ‘AAPL’.

The library is an example of a dependency. Without IT knowing it, any application might have hundreds of these types of dependencies.

One of the main reasons containers became so popular is that they provide a mechanism and format to package application code – with its dependencies – in a way that makes it easy to run an application in different environments.

This solves a big problem for developers who are constantly fighting environment-compatibility issues between their development laptops, testing environments, and production.

By using containers to package their applications, they can code once and run anywhere, dramatically speeding up application delivery.
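As a sketch of that ‘code once, run anywhere’ workflow – assuming the Docker SDK for Python is installed, the Docker daemon is running, and the application directory contains a Dockerfile (the image name is made up) – the image built on a laptop is the same artefact that later runs in test and production:

```python
# Sketch: build an application image (code plus dependencies) and run it locally.
# Assumes the Docker SDK for Python ("pip install docker"), a running Docker daemon,
# and a Dockerfile in the current directory; "stock-lookup:0.1" is a made-up tag.
import docker

client = docker.from_env()

# Package the code and every dependency into one portable image.
image, build_log = client.images.build(path=".", tag="stock-lookup:0.1")

# The identical image can now run on the laptop, in CI, or in production.
container = client.containers.run(
    "stock-lookup:0.1",
    detach=True,
    ports={"8000/tcp": 8000},
)
print(container.short_id)
```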

Not all container services are created equal

In terms of challenges, the first generation of cloud-native applications was designed to be stateless – using containers that did application work but didn’t need to store any persistent data in associated volumes.

As container usage evolves, developers are increasingly building stateful apps inside containers – apps that need to store data in volumes that persist beyond the life of any single container. This is where the world of storage becomes challenging.

The flexibility and openness of containers turn into hurdles and bottlenecks at the storage layer, and simple storage capabilities that we’ve taken for granted for years in the traditional application stack – high availability, disaster recovery, backup, and encryption – become challenges in the container world.

Worse, each application often devises its own storage strategy, making it impossible to drive standards and data compliance across an organisation.

Therefore, as a best practice, we recommend choosing a solution that delivers the Kubernetes-native data services that both cloud-native and traditional apps require (since those traditional apps aren’t going away anytime soon).

This means delivering block, file, and object storage services, in multiple performance classes, provisioned on demand as Kubernetes requires. It also means providing instant data access, protection against all types of failures, the ability to mobilise data between clouds and even to or from the edge, and robust security no matter where an application travels.
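As a minimal sketch of what ‘provisioned on demand’ looks like from the application side – again using the official Kubernetes Python client, with a made-up ‘fast’ storage class standing in for whatever performance classes the platform exposes – a stateful app simply claims the storage it needs and lets the cluster deliver it:

```python
# Sketch: request a persistent volume for a stateful app and let Kubernetes provision
# it on demand. Assumes the official "kubernetes" Python client and a cluster whose
# storage layer exposes a storage class named "fast" (a placeholder name).
from kubernetes import client, config

config.load_kube_config()

pvc = client.V1PersistentVolumeClaim(
    metadata=client.V1ObjectMeta(name="orders-db-data"),  # hypothetical claim name
    spec=client.V1PersistentVolumeClaimSpec(
        access_modes=["ReadWriteOnce"],
        storage_class_name="fast",  # performance class chosen by the platform team
        resources=client.V1ResourceRequirements(requests={"storage": "20Gi"}),
    ),
)

client.CoreV1Api().create_namespaced_persistent_volume_claim(
    namespace="default", body=pvc
)
```

The data in that volume outlives any individual container, which is exactly what stateful apps need from the platform.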

If organisations do this, they will see for themselves why Kubernetes has become the not-so-secret special sauce for modern organisations.