How containers can become a vital part of the architect’s toolkit


Anil Kumar is director of product management at Couchbase.


Much has been made of the quickening pace of digital transformation globally, as IT departments feel increasingly empowered to boost organisational agility, efficiency and innovation. Yet less has been said of the role containerisation has played in supporting this acceleration. Containers enable greater flexibility and scalability where developer teams need it most. However, many teams still find them challenging. It’s easy to run into problems with containers because of their numerous complexities: from interoperability issues with legacy technology, such as centralised databases, to container sprawl. Often, these issues threaten to derail digital transformation, as container-based DevOps projects become difficult to roll out.

Essential building blocks

When used in the right way, containers can be a vital asset for any IT department and are increasingly thought of as fundamental to cloud computing. Acting much like a lightweight virtual machine, a container gives developers a self-contained package in which an application and its libraries can run while remaining isolated from everything else on the host. Offering a predictable environment, containers are easy to deploy at scale across hybrid, public and private cloud set-ups, and cost less to run than traditional virtual machines.
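To make that packaging model concrete, here is a minimal sketch using the Docker SDK for Python (the docker package; the image and command are illustrative assumptions, not drawn from this article) to launch a short-lived, isolated container:

    # Minimal sketch: run an isolated, throwaway container via the
    # Docker SDK for Python ("pip install docker"). The image and the
    # command are illustrative assumptions.
    import docker

    client = docker.from_env()  # connect to the local Docker daemon

    # The application and its libraries live entirely inside the image,
    # isolated from other processes on the host.
    output = client.containers.run(
        "python:3.12-slim",
        ["python", "-c", "print('hello from an isolated container')"],
        remove=True,  # delete the container once it exits
    )
    print(output.decode())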

It’s no wonder they have surged in popularity. In fact, a recent survey from the Cloud Native Computing Foundation found that 84% of developers were running containers in production, while Google says it starts several billion containers each week.

Making containers simple to use

For all its merits, containerisation carries several significant downsides. Chief among these is the architectural divide it creates between container environments running stateless application workloads and the stateful application workloads that remain on legacy infrastructure.

The development of orchestration tools such as Kubernetes has meant organisations have started to take greater control of their container environments and see the benefits of stateless applications. These include the ability to adopt an online-first approach to services, easier redeployment and scaling, and simpler connections between applications and services through APIs.
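As a rough illustration of that redeployment and scaling point, the sketch below uses the official Kubernetes Python client (the kubernetes package) to scale a stateless deployment with a single API call; the deployment name and namespace are hypothetical:

    # Rough sketch: scale a stateless deployment with the official
    # Kubernetes Python client ("pip install kubernetes"). The
    # deployment name and namespace are hypothetical.
    from kubernetes import client, config

    config.load_kube_config()  # authenticate with the local kubeconfig
    apps = client.AppsV1Api()

    # Ask for five replicas; Kubernetes schedules the extra pods and
    # rebalances traffic without the application changing at all.
    apps.patch_namespaced_deployment_scale(
        name="web-frontend",    # hypothetical stateless web tier
        namespace="default",
        body={"spec": {"replicas": 5}},
    )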

Yet tools such as Kubernetes only go so far. Many users find themselves facing the same divide from the other side: architecture originally built for stateful applications can’t provide the agility, innovation and flexibility now possible with more modern alternatives. For example, stateful applications often have their own independent network, policies and infrastructure, sitting in silos that make it hard to scale without directly altering that infrastructure, or to connect with other applications through APIs. Developers are left with a dilemma: how to bridge the gap between older legacy infrastructure and databases on one side and newer technologies on the other.

The longer this problem is left to fester, the harder it will be to overcome. Solving it isn’t a case of simply swapping a legacy database for a more modern alternative; a more careful approach is needed. Databases often support business-critical applications where downtime and interruptions must be prevented, and there’s no guarantee that a more innovative NoSQL database will automatically be able to cater for containers.

Ensuring DevOps success

Fortunately, there’s a way around this problem. Modern NoSQL databases are designed to work seamlessly with tools like Kubernetes, meaning developer teams can manage and oversee how containers connect with centralised databases in the cloud. It’s now possible to take a holistic approach to IT infrastructure and DevOps, ensuring that all components work in harmony with one another.
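As one hedged example of what that connection can look like (using Couchbase, the author’s employer, purely for illustration), an application pod might reach a database cluster through an ordinary in-cluster service DNS name; the connection string, credentials and bucket below are all invented:

    # Hypothetical sketch: an application container talking to a NoSQL
    # database inside the same Kubernetes cluster via the Couchbase
    # Python SDK ("pip install couchbase"). The service DNS name,
    # credentials and bucket name are invented for illustration.
    from couchbase.auth import PasswordAuthenticator
    from couchbase.cluster import Cluster
    from couchbase.options import ClusterOptions

    cluster = Cluster(
        "couchbase://cb-example.default.svc.cluster.local",  # in-cluster DNS
        ClusterOptions(PasswordAuthenticator("app-user", "app-password")),
    )

    bucket = cluster.bucket("orders")          # hypothetical bucket
    collection = bucket.default_collection()

    # The same key-value calls the application would make against a
    # database running anywhere else.
    collection.upsert("order::1001", {"status": "created"})
    print(collection.get("order::1001").content_as[dict])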

The issue for architects will be deciding which applications must be moved from stateful to stateless quickly, so they can keep up with the pace of containerisation, while knowing which applications can stay within a legacy environment. Payment functions are a good example: certain types of payment transaction, carried out the same way each time, are best left on a legacy database suited to handling them. Meanwhile, anything that directly impacts the customer could be moved to a more modern environment.

As time progresses, it’s likely that almost all business applications will be built and run within containers. When implemented correctly, the agility containers grant offers IT departments a way out of the ‘keeping the lights on’ dilemma, in which day-to-day maintenance limits the time available for more innovative work. This is why IT must emphasise interoperability across its mix of technologies as much as possible. Without it, container initiatives are likely to fall short of their aims.
