Jay Taylor's notes

Original source (www.cloudbees.com)
Tags: docker continuous-integration jenkins cloudbees www.cloudbees.com
Clipped on: 2016-03-07

Orchestrating Workflows with Jenkins and Docker

Submitted by jglick on 18 Jun 2015
Most real-world pipelines are more complex than the canonical BUILD→TEST→STAGE→PRODUCTION flow. These pipelines often have stages that should not be triggered unless certain conditions are met, while others should trigger only if those conditions fall through. Jenkins Workflow helps write these pipelines, allowing complex deployments to be better represented and served by Jenkins.
 
The CloudBees Docker Workflow plugin extends these workflows even further to provide first-class support for Docker images and containers. This plugin allows Jenkins to build and release Docker images and to leverage Docker containers for customized and reproducible slave environments.
 
What is Docker?
Docker is an open-source project that provides a platform for building and shipping applications using containers. This platform enables developers to easily create standardized environments, ensuring that the testing environment is the same as the production environment, while also providing a lightweight solution for virtualizing applications.
 
Docker containers are lightweight runtime environments that consist of an application and its dependencies. These containers run “on the metal” of a machine, allowing them to avoid the 1-5% CPU overhead and 5-10% memory overhead associated with traditional virtualization technologies. Containers are created from a read-only template called a Docker image.
 
Docker images can be created from an environment definition called a Dockerfile, or from a running Docker container which has been committed as an image. Once a Docker image exists, it can be pushed to a registry like Docker Hub, and a container can be created from that image, providing a runtime environment with a guaranteed set of tools and applications installed in it. Similarly, containers can be committed to images, which can then be pushed to Docker Hub.
 
What is Workflow?
Jenkins Workflow is a new plugin which allows Jenkins to treat continuous delivery as a first-class job type. Workflow allows users to define an entire delivery process in a single place, avoiding the need to coordinate flows across multiple build jobs. This can be particularly important in complex enterprise environments, where work, releases and dependencies must be coordinated across teams. A workflow is defined as a Groovy script, either written within a Workflow job or checked out into the workspace from an external repository like Git.
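 
To give a flavor of the syntax, here is a minimal sketch of a Workflow script using the same steps this article relies on later; the repository URL and build command are illustrative placeholders, not a prescribed setup:
 
node {
  // check out the project sources from version control
  git 'https://git.mycorp.com/myproject.git'
  // run the build on whatever slave the node step allocated
  sh 'ant dist-package'
  // save the resulting artifact in Jenkins
  archive 'app.zip'
}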
 
Docker for simplicity
In a nutshell, the CloudBees Docker Workflow plugin adds a special entry point named docker that can be used in any Workflow Groovy script. It offers a number of functions for creating and using Docker images and containers, which in turn can be used to package and deploy applications or to provide build environments for Jenkins.
 
Broadly speaking, there are two areas of functionality: using Docker images, whether your own or ones created by the worldwide community, to simplify build automation; and creating and testing new images. Some projects will need both aspects, and you can follow along with a complete project that uses both: see the demonstration guide.
 
Jenkins Build Environments and Workflow
Before getting into the details, it is helpful to know the history of configuring build environments in Jenkins. Most project builds have some kind of restrictions on the computer which can run the build. Even if a build script (e.g. an Ant build.xml) is theoretically self-contained and platform-independent, you have to start somewhere and say what tools you expect to use.
 
Since a lot of people needed to do this, back in 2009 I worked with Kohsuke Kawaguchi and Tom Huybrechts to add a facility for “tools” to Jenkins’ predecessor, Hudson. Now a Jenkins administrator can go to the system configuration page and say that Ant 1.9.0 and JDK 1.7.0_67 should be offered to projects which want them, and Jenkins will download and install them from public sites on demand.

From a traditional job, this becomes a pulldown option in the project configuration screen, and from a Workflow, you can use the tool step:

node('libqwerty') {
  withEnv(["PATH=${tool 'Ant 1.9.0'}/bin:${env.PATH}"]) {
    sh 'ant dist-package'
  }
  archive 'app.zip'
}

While this is a little better, it still leaves a lot of room for error. What if you need Ant 1.9.3? Do you wait for a Jenkins administrator? If you want to scale up to hundreds of builds a day, who is going to maintain all those machines?

Clear, reproducible build environments with Docker
Docker makes it very easy for the project developer to try a stock development-oriented image on Docker Hub or write a customized one with a short Dockerfile:
FROM webratio/ant:1.9.4

RUN apt-get install -y libqwerty-devel=1.4.0
Now the project developer is in full control of the build environment. Gone are the days of “huh, that change compiled on my machine”; anyone can run the Docker image on their laptop to get an environment identical to what Jenkins uses to run the build.

Unfortunately, if other projects need different images, the Jenkins administrator will have to get involved again to set up additional clouds. There is also the annoyance that before using an image you will need to tweak it a bit to make sure it runs an SSH daemon with a predictable user login and has a version of Java new enough to run Jenkins slaves.

 
What if all this hassle just went away? Let us say the Jenkins administrators guaranteed one thing only:
If you ask to build on a slave with the label docker, then Docker will be installed.
and proceeded to attach a few dozen beefy but plain-vanilla Linux cloud slaves. With CloudBees Docker Workflow, you can use these build servers as they come.
 
// OK, here we come
node('docker') {
  // My project sources include both build.xml and a Dockerfile to run it in.
  git 'https://git.mycorp.com/myproject.git'
  // Ready?
  docker.build('mycorp/ant-qwerty:latest').inside {
    sh 'ant dist-package'
  }
  archive 'app.zip'
}
Embedded in a few lines of Groovy instructions is a lot of power. First we used docker.build to create a fresh image from a Dockerfile definition. If you are happy with a stock image, there is no need for even this:
 
node('docker') {
  git 'https://git.mycorp.com/myproject.git'
  docker.image('webratio/ant:1.9.4').inside {
    sh 'ant dist-package'
  }
  archive 'app.zip'
}
docker.image just asks to load a named image from a registry, in this case the public Hub. .inside asks to start the image in a new throwaway container, then run other build steps inside it. So Jenkins is really running docker exec abc123 ant dist-package behind the scenes. The neat bit is that your single project workspace directory is transparently available inside or outside the container, so you do not need to copy in sources, nor copy out build products. The container does not need to run a Jenkins slave agent, so it need not be “contaminated” with a Java installation or a jenkins user account.
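 
To make the shared workspace concrete, here is a small sketch reusing the stock image and repository from the example above; the extra ls step is purely illustrative. A file produced by the build inside the container is immediately visible to steps running outside it:
 
node('docker') {
  git 'https://git.mycorp.com/myproject.git'
  docker.image('webratio/ant:1.9.4').inside {
    // runs inside the container; the project workspace is mounted into it
    sh 'ant dist-package'
  }
  // runs directly on the slave, outside the container, yet sees the same app.zip
  sh 'ls -l app.zip'
  archive 'app.zip'
}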
 
Easily adaptable CD pipelines
The power of Workflow is that structural changes to your build are just a few lines of script away. Need to try building the same sources twice, at the same time, in different environments?
 
def buildIn(env) {
  node('docker') {
    git 'https://git.mycorp.com/myproject.git'
    docker.image(env).inside {
      sh 'ant dist-package'
    }
  }
}
parallel older: {
  buildIn 'webratio/ant:1.9.3'
}, newer: {
  buildIn 'webratio/ant:1.9.4'
}
Simplified application deployments
So far everything I have talked about assumes that Docker is “just” the best way to set up a clear, reproducible, fast build environment, but the main use for Docker is to simplify deployment of applications to production. We already saw docker.build creating images, but you will want to test them from Jenkins, too. To that end, you can .run an image while you perform some tests against it. And you can .push an image to the public or an internal, password-protected Docker registry, where it is ready for production systems to deploy it.
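 
As a rough sketch of how those pieces fit together (the application image name, registry URL and credentials ID below are placeholders, not values from this article), a workflow might build an image, smoke-test it in a throwaway container, and push it only if the tests pass:
 
node('docker') {
  git 'https://git.mycorp.com/myproject.git'
  def image = docker.build('mycorp/myapp:latest')
  // start a throwaway container, run the body against it, then stop and remove it
  image.withRun('-p 8080:8080') { container ->
    // a very basic smoke test: the container must have started and produced logs
    sh "docker logs ${container.id}"
  }
  // the push only happens if the test block completed without error
  docker.withRegistry('https://docker.mycorp.com/', 'docker-registry-login') {
    image.push('latest')
  }
}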
 
Try it yourself
Look at the demo script to see all of the above-mentioned use-cases in action. This demo highlights that you can use multiple containers running concurrently to test the interaction between systems.
 
In the future, we may want to build on Docker Compose to make it even easier to set up and tear down complex assemblies of software, all from a simple Jenkins workflow script making use of freestanding technologies. You can keep that flow script in source control, too, so everything interesting about how the project is built is controlled by a handful of small text files.
 
Closing thoughts
By this point you should see how Jenkins and Docker can work together to empower developers to define their exact build environment and reliably reproduce application binaries ready for operations to use, all with minimal configuration of Jenkins itself.
 
Download CJE 15.05 or install CloudBees Docker Workflow on any Jenkins 1.596+ server and get started today!
 
Where do I start?
  1. The CloudBees Docker Workflow plugin is an open-source plugin, so it is available for download from the open-source update center or packaged as part of the CloudBees Jenkins Platform.
  2. Documentation on this plugin is available in the CloudBees Jenkins Platform documentation.
  3. A more technical version of this blog is available on the CloudBees Developer Blog.
  4. More information on this feature is available in the new Jenkins Cookbook.
  5. Other plugins complement and enhance the ways Docker can be used with Jenkins. Read more about their use cases in these blogs:
    1. Docker Build and Publish plugin
    2. Docker Slaves with the CloudBees Jenkins Platform
    3. Docker Traceability
    4. Docker Hub Trigger Plugin
    5. Docker Custom Build Environment plugin
 
 

Jesse Glick
Developer Extraordinaire
CloudBees

Jesse Glick is a developer for CloudBees and is based in Boston. He works with Jenkins every single day. Read more about Jesse on the Meet the Bees blog post about him and follow him on Twitter.