Transforming application delivery using Containers

Increasingly complicated applications and demands for faster development are putting even more pressure on infrastructures, IT teams, and processes. Containers appear to be an answer. Developers love containers because they promise to solve numerous pain points in application delivery, including delivering the same functionality to multiple deployment environments. This Quru Insight white paper explores the concept of containerizing an application, the potential benefits, the role of Docker, and a way to bring containers and Docker together through the capabilities of Red Hat Enterprise Linux 7.


Increasingly complicated applications - and demands for faster development - are putting even more pressure on your infrastructure, IT teams, and processes.

"I&O [IT infrastructure and operations] leaders must improve server deployment strategies or they will build infrastructure incapable of supporting the next generation of applications."
Source: Gartner, "IT Market Clock for Server Virtualization and Operating Environments," 16 September 2014, G00262842.

The industry is moving beyond self-contained, isolated, and monolithic applications. New workloads will be part of a connected application fabric—flexibly woven together to serve particular business needs, yet easily torn apart and restructured to meet changing requirements.

This requires a new approach to how applications are managed during development, so that they are set up for success in production.

What are containers?

Linux containers keep applications and their runtime components together by combining lightweight application isolation with an image-based deployment method. Containers introduce autonomy for applications by packaging apps with the libraries and other binaries on which they depend. This avoids conflicts between apps that otherwise rely on key components of the underlying host operating system. Containers do not contain an operating system (OS) kernel, which makes them faster and more agile than virtual machines. However, it does mean that all containers on a host must use the same kernel.
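
This is easy to verify: a container and its host report the same kernel version, because the container carries no kernel of its own. A minimal sketch, run with Docker (introduced below); the base image name is illustrative:

    # On the host:
    uname -r
    # e.g. 3.10.0-229.el7.x86_64
    # Inside a container on that host:
    docker run --rm registry.access.redhat.com/rhel7 uname -r
    # prints the same version: the container shares the host's kernel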

Containers are increasingly being seen as a better way to manage Linux-based applications:

[Figure: Container survey results]

Containers in context

Applications don't always work as expected. One way to avoid application issues in production is to maintain identical environments for development, testing, and production. Another is to create a Continuous Integration environment, where code is compiled and deployed to test machines and vetted with each and every code check-in, long before being pushed to production.

Enter containers. Linux containers depend on key capabilities in the kernel and the operating system to function. Resource management, isolation, abstraction, and security – all of these are fundamental building blocks for Linux containers.
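
These building blocks can be seen without any container tooling at all. A minimal sketch using the unshare utility from util-linux (run as root), which places a shell in its own PID namespace:

    # Start a shell in new PID and mount namespaces; --mount-proc remounts
    # /proc so that process listings reflect the new namespace
    sudo unshare --fork --pid --mount-proc /bin/bash
    ps aux     # only this shell and ps are visible, not the host's processes
    exit       # leave the namespace; the host is untouched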

Developers love containers because, with the help of the Docker CLI, containers promise to solve numerous pain points in application delivery, including providing the same functionality across multiple deployment environments. Containers introduce autonomy for applications by allowing them to be packaged with their dependencies, rather than relying on those dependencies being installed and configured on the host machine.

Docker, then, is the tooling built around containers, so what does it actually do? Docker is an open-source project that automates the deployment of applications inside software containers. It does this by providing a layer of abstraction and automation on top of operating-system-level virtualization on Linux. The Docker engine exposes an API through which lightweight containers run processes in isolation.
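
In practice, a developer describes the application's environment in a Dockerfile and lets the engine build and run it. A minimal sketch, assuming a running Docker daemon with access to Red Hat's RHEL 7 base image and yum repositories; the application and image names are hypothetical:

    # Describe the image: the app and its dependencies travel together
    cat > Dockerfile <<'EOF'
    FROM registry.access.redhat.com/rhel7
    RUN yum install -y python && yum clean all
    COPY app.py /opt/app/app.py
    CMD ["python", "/opt/app/app.py"]
    EOF

    docker build -t myorg/myapp:1.0 .    # package the app as an image
    docker run -d myorg/myapp:1.0        # run it; the same image runs
                                         # unchanged in dev, test, and production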

A Docker container builds on facilities provided by the Linux kernel, primarily cgroups and namespaces. Unlike a traditional virtual machine, this method does not require or include a separate operating system. Instead it relies on the functionality and resource isolation in the Linux kernel – CPU, memory, block I/O, network, and so on. Because each container gets its own set of namespaces, the application's view of the operating system is completely isolated, which makes it possible to run multiple containers on a single operating system.
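
Those per-container namespaces are visible from the host. A minimal sketch, assuming a container named my-container is already running; the names and resource limits are illustrative:

    # Find the container's main process and list its namespaces
    PID=$(docker inspect --format '{{.State.Pid}}' my-container)
    sudo ls -l /proc/$PID/ns     # ipc, mnt, net, pid, uts ... one set per container

    # cgroups cap the container's share of resources at start time
    docker run -d --memory 256m --cpu-shares 512 myorg/myapp:1.0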

Red Hat Enterprise Linux

For running Linux containers with Docker, the obvious place to go is Red Hat Enterprise Linux 7. It offers developers and system administrators a portfolio of tools for delivering containerized applications:

  • An integrated application delivery platform that spans from application container to deployment target, built on open standards.
  • Container portability, with deployment across physical hardware, hypervisors, private clouds, and public clouds.
  • Trusted access to digitally signed container images, verified to run on certified container hosts (see the sketch below).
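
For example, Red Hat's certified RHEL 7 base image can be pulled straight from Red Hat's public registry. A minimal sketch, using the registry URL documented for RHEL 7:

    # Fetch the certified base image and confirm what arrived
    docker pull registry.access.redhat.com/rhel7
    docker images registry.access.redhat.com/rhel7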

Red Hat was an early proponent of Docker technology and, thanks to its expertise in the kernel and operating system, became one of the leading contributors to the community project. This enables the company to standardize Linux containers across its own solutions, including Red Hat Enterprise Linux, OpenShift, Red Hat Enterprise Linux Atomic Host and more, with the aim of helping to drive industry-wide standards for Linux containers. Because Linux containers work in the same way across Red Hat solutions, containerized applications can be deployed anywhere and everywhere.


Standardize the components, reap the benefits

Shipping companies can easily exchange containers—whether they transport by boat, rail, or truck—because the container dimensions comply with international standards.

However, container technology is not yet enterprise-ready, so Red Hat is working to advance it and its supporting ecosystem, as it did with Linux. In particular, Red Hat advocates that similar standards now need to be established for software containers and the applications they carry, covering:

  • Container host
    • Isolating and securing applications on the host operating system.
  • Container image
    • Packaging an application with the information needed to run it in a container, including digital signatures and encryption for security.
  • Orchestration
    • Managing a cluster of Linux containers as a single system.
  • Registry and discovery
    • Finding and consuming trusted application container images from federated sources.

Red Hat is working with the open source community through Project Atomic to help create industry-wide Linux container standards.

Project Atomic helps make sure that common containers work with trusted operating system platforms. By working towards compatibility and coordinating standards, Project Atomic helps Red Hat and other vendors deliver a complete hosting architecture that's modern, reliable, and secure.

Quru has been working with containers and Docker since the technology came onto the scene. If you're looking at deploying your applications in a container, give us a call on 0207 160 2888.


About Quru

We are passionately committed to open source technologies and consider Red Hat technology a core part of the solutions we deliver. Quru is a Red Hat Premier Business Partner, and we play a strong role in assisting Red Hat with the development of new open source solutions. An example of our close partnership: Quru consultant Dhruv Ahuja was recognized as Red Hat Consultant of the Year 2012 for his innovative work on RHEV, Grid and Storage.

Quru is based in Somerset House on the banks of the Thames, right in the centre of London.

Quru :: inspired open source solutions