BoxBoat Blog

Service updates, customer stories, and tips and tricks for effective DevOps



Jenkins Blue Ocean Pipeline

by Brandon Adams | Tuesday, May 30, 2017 | Docker


Next in our series on Blue Ocean is a quick walkthrough of creating and executing pipelines. This involves setting up the Jenkins server and a Git repository with our project's code. I'll be using a sample project built for this demo, keeping the pipeline simple for clarity. The pipeline begins by securing a node from the swarm and selecting the build tools needed by our application, then runs the build through a simple node command. It finishes by simulating a deployment: pushing to a local registry and updating a docker-compose.yml file.

Project Setup

I set up a repo with the code used for the example; a simple local GitLab installation was enough. Our repo contains the files needed for NodeJS to launch the hello world server, along with several Docker-related files used within the build. The Dockerfile builds the container for our app, and the docker-compose.yml deploys our stack to our local Swarm cluster. The other important file is the Jenkinsfile, containing the specification of the pipeline. To keep things simple, I have chosen to write the file using the Declarative syntax. This gives our pipeline a structure within which to run: it has clearly laid-out stages and steps, allowing for a predictable and repeatable process. As we walk through the build, we will discuss the sections of the file and how they translate into actions.
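The Dockerfile itself isn't reproduced in this post; here is a minimal sketch of what one might look like for a NodeJS hello world server. The base image, entry point file, and port are all assumptions for illustration, not the demo repo's actual contents:

```dockerfile
# Hypothetical Dockerfile for the sample NodeJS app
FROM node:alpine

WORKDIR /usr/src/app

# Install dependencies first so this layer caches independently of app code
COPY package.json .
RUN npm install --production

# Copy the rest of the application source
COPY . .

# Assumed port for the hello world server
EXPOSE 3000

# Entry point name is an assumption
CMD ["node", "server.js"]
```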

#!/usr/bin/env groovy
pipeline { 
  agent { 
    node { 
      label 'docker'
    }
  }
  tools {
    nodejs 'nodejs'
  }
 
  stages {
    stage ('Checkout Code') {
      steps {
        checkout scm
      }
    }
    stage ('Verify Tools') {
      steps {
        parallel (
          node: { sh "npm -v" },
          docker: { sh "docker -v" }
        )
      }
    }
    stage ('Build app') {
      steps {
        sh "npm prune"
        sh "npm install"
      }
    }
    stage ('Test') {
      steps {
        sh "npm test"
      }
    }
 
    stage ('Build container') {
      steps {
        sh "docker build -t badamsbb/node-example:latest ."
        sh "docker tag badamsbb/node-example:latest badamsbb/node-example:v${env.BUILD_ID}"
      }
    }
    stage ('Deploy') {
      steps {
        input "Ready to deploy?"
        sh "docker stack rm node-example"
        sh "docker stack deploy node-example --compose-file docker-compose.yml"
        sh "docker service update node-example_server --image badamsbb/node-example:v${env.BUILD_ID}"
      }
    }
    stage ('Verify') {
      steps {
        input "Everything good?"
      }
    }
    stage ('Clean') {
      steps {
        sh "npm prune"
        sh "rm -rf node_modules"
      }
    }
  }
}

I then launched the Jenkins server, using the jenkinsci/jenkins:latest Docker image. This got me Jenkins core 2.62, and after installing all necessary Blue Ocean dependencies, Blue Ocean 1.0.1. As our sample app is a NodeJS app, I configured the automatic installer for the latest version of Node. I also turned on the Swarm plugin, which allows me to dynamically attach slave nodes to my Jenkins master. The slave nodes are custom built, containing the Swarm agent that automatically adds itself to the cluster. Due to the simplicity of the project, only one node is needed.
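Launching the master took a single Docker command; a sketch of the invocation is below. The named volume for /var/jenkins_home is an assumption — any named volume or host path works:

```shell
# Run the Jenkins master from the official image used in this post.
# 8080 is the web UI; 50000 is the port the swarm agents dial in to.
docker run -d --name jenkins \
  -p 8080:8080 -p 50000:50000 \
  -v jenkins_home:/var/jenkins_home \
  jenkinsci/jenkins:latest
```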

Creating the pipeline takes only a few button presses from the Blue Ocean launch screen. Pressing the new pipeline button prompts for an SCM repo, either from GitHub or a general Git URL; we enter the repo URL for our test project. The Jenkinsfile is located inside the repo and is read automatically. The project is created and the first build begins.

Configuration

The build begins by specifying a node from the swarm with the appropriate tag. Our Jenkinsfile calls for the tag ‘docker’, so Jenkins will secure the build agent with that tag. Our small cluster only has one node that satisfies the requirements, so this step is easy. The entirety of our pipeline will be run on this node.

The tools section enables the NodeJS installation that we have configured for this demo. In the setup of our master, I enabled the NodeJS plugin, configured an installation of Node, and named it ‘nodejs’; the tools clause references that name, making the installation available within our build.

Build Pipeline

Stage 1 – Checkout Code

The first stage is the code checkout. This is an extremely simple one-liner, “checkout scm”. Because we configured our pipeline to connect to our repo in the first step, Jenkins already knows the location of our code. It pulls it down onto our node into our workspace.
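The "checkout scm" step works because the pipeline is tied to an SCM source at creation time. A standalone Jenkinsfile could instead name the repo explicitly; a sketch, where the URL and branch are placeholders rather than the demo repo's real address:

```groovy
stage ('Checkout Code') {
  steps {
    // Equivalent explicit checkout; the URL is a hypothetical GitLab repo
    git url: 'https://gitlab.example.com/demo/node-example.git', branch: 'master'
  }
}
```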

Stage 2 – Verify Tools

The second stage is a parallel stage. It simulates the installation and configuration of any external tools needed by the pipeline. In our sample, it is just verifying that Node and Docker are both installed. Because they are simple checks, I have placed them in parallel steps.
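The map-style parallel step shown in the Jenkinsfile was the Declarative idiom at the time; later versions of Declarative Pipeline (1.2 and up) also accept nested parallel stages, which Blue Ocean renders as separate named branches. A sketch of the equivalent:

```groovy
stage ('Verify Tools') {
  parallel {
    stage ('node') {
      steps { sh 'npm -v' }
    }
    stage ('docker') {
      steps { sh 'docker -v' }
    }
  }
}
```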

Stage 3 – Build App

The third stage is the building of the artifacts. Our app is simple; all it takes is a few npm commands to clean the workspace, and pull dependencies. These are shown in just two lines. Pulling dependencies is in preparation for the next stage, testing our app.

Stage 4 – Test

The fourth stage is a simple npm test. This would be a more complex test stage in a larger project, but there are no defined tests for this simple app so it passes with no issue. It's only included to show the expected flow.
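For "npm test" to pass with no tests defined, the sample's package.json needs a test script that exits zero. A minimal sketch — the field values here are assumptions about the demo repo, not its actual contents:

```json
{
  "name": "node-example",
  "version": "1.0.0",
  "scripts": {
    "start": "node server.js",
    "test": "echo \"no tests defined\" && exit 0"
  }
}
```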

Stage 5 – Build Container

The fifth stage is building the Docker container. There is a Dockerfile included inside the repo; this stage builds the image from it and tags the result with the build ID, using the same straightforward shell steps as the last stage.

Stage 6 – Deploy

The sixth stage is deployment. My target for deployment is my base VM infrastructure. I am running my example as a part of a one node Swarm. This lets me deploy my sample app as a service within a stack. Because of this, I use the docker-compose.yml file pulled down with the repo, and run a few docker stack commands to update the image.
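The compose file isn't shown in the post, but since the pipeline updates a service named node-example_server, the stack must define a service called server. A minimal sketch consistent with the commands above — the port mapping and replica count are assumptions:

```yaml
# docker-compose.yml sketch for the node-example stack
version: "3"
services:
  server:
    image: badamsbb/node-example:latest
    ports:
      - "3000:3000"
    deploy:
      replicas: 1
```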

Stage 7 – Verify

I have built a verification step into the pipeline. This is the point where we can check our deployment to make sure it has run correctly. If everything checks out we may proceed.
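One caveat with input steps: they pause the build, and hold the agent, indefinitely until someone responds. Wrapping the prompt in a timeout is a common safeguard; a sketch, with the 30-minute limit chosen arbitrarily:

```groovy
stage ('Verify') {
  steps {
    // Abort the build automatically if nobody confirms within 30 minutes
    timeout(time: 30, unit: 'MINUTES') {
      input 'Everything good?'
    }
  }
}
```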

Stage 8 – Clean

Once the verification check has passed, the build cleans up behind itself by running an npm command and deleting the dependencies downloaded to the Jenkins slave. If this passes our build is complete!

And We're Done!

This was a very simple example showing off some of the features of declarative Jenkinsfiles and the Blue Ocean Pipeline Plugin. As we have seen, creating a streamlined and robust pipeline is very easy using the Declarative syntax. It provides a strong framework by which we can quickly enable a pipeline in conjunction with Docker.


Here at BoxBoat, we are Jenkins experts. As a Gold CloudBees Partner, we offer a wide variety of services to help your organization adopt and improve your Jenkins CI system. Find out more today.