
Jenkins Pipelines, or Five Instructions Yielding Utility

Jurica Žigić, DevOps Engineer
26.01.2021.

When we started the internal project that will be announced soon, there were many decisions we had to make regarding architecture, infrastructure, and development processes. Naturally, we pooled our knowledge and experience together, taking what worked best for us in the past and balancing it with the industry's contemporary best practices.

One such decision was to use Jenkins as an automation server. Another one was to treat configuration as code. These two translated into a singular task: learning Jenkins pipelines. At this point in the journey, I’d like to share some of the stops along the way.

There’s a concept in Ancient Greek philosophy called kairos – it means the right time. The right time to read a blog post about things you wished you knew before even knowing you wished to know them is probably just after you know what you wish to know, but don’t know it yet.

Here are five such things.

1. Access files produced by one agent from another agent

Imagine you have a neat little setup: you use a Docker container to build your project, then you build an image and deploy it via the Docker plugin. There’s just one problem: those are two separate agents, and separate agents use separate workspaces.

There are quite a few solutions here: stashing, archiving, reusing nodes, shared workspaces (via a plugin), uploading to an artifact repository… Some are less attractive than others, though. Archiving would arguably be misused here, plus it has some limitations you’d need to keep in mind. Installing a third-party plugin is fine if there’s no other way, but now you have another liability in your pipeline. You might already have set up Nexus or Artifactory, but now you have to create another repository, decide which security policies you want, create another clean-up policy, and wait for upload/download to finish. That’s a lot of overhead for something that ought to be easy.

And it can be – if it suits your scenario, enable node reuse:

stage('Build') {
    agent {
        docker {
            image 'your image'
            registryUrl 'your url'
            registryCredentialsId 'your credentials id'
            reuseNode true
        }
    }
    // ...
}
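For context, here is a minimal end-to-end sketch of how that sharing works; the Maven image, build command, and jar path are placeholders of mine, not part of the original setup. The point is that reuseNode runs the container on the node declared at the top of the pipeline, inside that node’s workspace, so a later stage on the same node sees the build output:

pipeline {
    agent any    // the node (and workspace) the docker agent below will reuse
    stages {
        stage('Build') {
            agent {
                docker {
                    image 'maven:3-jdk-11'    // placeholder build image
                    reuseNode true            // run on the node above, in its workspace
                }
            }
            steps {
                sh 'mvn -B package'           // leaves target/my-app.jar in the shared workspace
            }
        }
        stage('Deploy') {
            steps {
                sh 'ls target/my-app.jar'     // same node, same workspace: the jar is right here
            }
        }
    }
}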

Otherwise, you might want to use stashing.
In the build step, save the file(s):

stash includes: 'target/my-app.jar', name: 'my-app'

and then later, prior to building an image or deploying:

unstash 'my-app'

Note the paths are relative to the workspace, and globbing is supported.
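Put together, a stash-based version of the same idea might look something like this (again, the image, build command, and jar path are just placeholders); note that the two stages deliberately run on different agents:

pipeline {
    agent none
    stages {
        stage('Build') {
            agent {
                docker { image 'maven:3-jdk-11' }    // placeholder build image
            }
            steps {
                sh 'mvn -B package'
                stash includes: 'target/my-app.jar', name: 'my-app'    // save from this agent's workspace
            }
        }
        stage('Deploy') {
            agent any
            steps {
                unstash 'my-app'    // restore into this agent's (separate) workspace
                sh 'ls target/my-app.jar'
            }
        }
    }
}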

2. Spaces will break you

Well, OK, they’ll break your pipeline. Avoid spaces in project and pipeline names. It’s more trouble than it’s worth.

3. Environment variables

Now here’s a topic deserving of an entire blog post of its own. Indeed, you can find several blog posts about it scattered about the interwebs. I’d like to go over a few things I actually had some use for and save the arcane bits for another day.

First up, environment variables set by Jenkins:

echo "My workspace is: ${env.WORKSPACE}"

These are usable anywhere. You can see the entire list at your https://jenkins.example.com/env-vars.html page. 

Secondly, user-defined env vars:

environment {
    CHROME_BIN = '/usr/bin/chromium'
    GCC_BIN = sh(returnStdout: true, script: 'which gcc').trim()
}

This assumes a declarative pipeline; the syntax for the scripted approach is different. Declaring the block at the top level, right after the agent directive, makes the variables available everywhere. You can also place it inside a particular stage, in which case only that stage has access to them (there’s a sketch of that after the next snippet). The second line shows how to set an env var dynamically from a shell command.

When the env vars are set, you would use them like this:

echo "Path to Chrome: ${CHROME_BIN}"
echo "Path to GCC: ${GCC_BIN}"

Thirdly, you can use variables you have declared in a script block outside of it, including in a declarative block, at any stage.

stage('Deploy') {
   steps {
      script {
         env.TARGET = 'https://destination.example.com'
      }
      echo "My target is ${env.TARGET}"
   }
}
stage('Debrief') {
   steps {
      sh "Stuff was deployed to ${TARGET}"
   }
}

4. Mixing declarative and scripted

In short, yes, you can mix the two. You’ve just seen this in the above example – the script block inside of steps. But you’re not limited to setting variables. You can freely use the Jenkins variant of Groovy inside the script block to accomplish anything you might want to. For example, if you’re using Docker to not only build your project, but also to deploy it, you need to build and push the image first. You could do this by just calling the commands via Bash, or you can employ the scripted Docker pipeline:

script {
    docker.withRegistry('https://registry.example.com', 'credentials-id') {
        docker.build("example/app", "./docker").push("tag")
    }
}

While I do prefer the declarative approach, for some things you just need to use Groovy, which is a great excuse to learn it (just enough).

5. How to SSH

You might find the situation with pipelines and SSH a little confusing. After all, the most commonly used functionality comes wrapped in a plugin. You may have searched for such a plugin and found a few, only to realize they are not meant for pipelines. How, then, do you use SSH?

As it turns out, you just use it like you would outside of Jenkins. Sort of. You have to add your key to the Jenkins credentials store. Then you wrap the command(s) using the withCredentials block:

withCredentials([sshUserPrivateKey(credentialsId: 'ssh-credentials-id',
                                   keyFileVariable: 'key')]) {
    sh 'ssh -i $key user@my.example.com -o StrictHostKeyChecking=no'
}

There are two further things to note in this example. One is the quotes. In my previous point, I mentioned learning Groovy; however, there are differences between regular Groovy and the Jenkins pipeline DSL built on top of it, and this is one of them. Like Bash, Groovy takes single-quoted strings literally and interpolates variables only inside double quotes. Here that is exactly what you want: with single quotes, $key is not interpolated by Groovy but expanded by the shell at run time, so the path to the key never gets baked into the command string. Interpolating a credential directly into the string would be a security risk, which is why single quotes are the right thing here.

The second thing is the -o StrictHostKeyChecking=no parameter. This disables the prompt ssh gives you when you first try connecting to a host. Since Jenkins has no idea what to do with a prompt requiring user input, you get an error. You could also set it to accept-new, as explained in the man page for ssh_config.
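Putting both notes together, a slightly fuller sketch might look like the following; the host, the remote deploy script, and the usernameVariable binding are assumptions of mine, not part of the original setup:

withCredentials([sshUserPrivateKey(credentialsId: 'ssh-credentials-id',
                                   keyFileVariable: 'KEY_FILE',
                                   usernameVariable: 'SSH_USER')]) {
    // Single quotes again: the shell, not Groovy, expands the variables,
    // so the key path never gets baked into the command string.
    sh 'ssh -i "$KEY_FILE" -o StrictHostKeyChecking=accept-new "$SSH_USER"@my.example.com ./deploy.sh'
}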

With the addition of pipelines, Jenkins is surely in the top tier when it comes to the power and degree of control it gives you over any automation task. However, occasionally the documentation is all over the place, things have changed but the available resources don’t reflect it, and there are some sharp edges here and there. I hope this guide will help smooth out some of them.

The entire experience has been beneficial for our team and instrumental in optimizing our development process. The most valuable lesson here is to treat configuration as code whenever possible, regardless of what software you use for automation: it is simpler to manage, version, audit, and modify.

If you need help with Jenkins, feel free to contact us.
