Testing bash scripts with Jenkins
At Magnolia, we use Jenkins and AWS EC2 instances to run our builds. We mostly do Maven builds that leverage JUnit 4/5 for both unit and IT/UI testing. For the latter, we also use Docker and Selenium to spin up instances and run test scenarios.
Before we can use these instances, they need to download utilities, settings, and a couple of bash scripts. We call those scripts the ‘aws-build-scripts’. They are critical to our development workflow, so they must follow development best practices and be validated like any other code. In this blog post, we'll share how we wrote a test suite for them with a minimal setup.
Evaluating testing frameworks
There are, of course, dedicated libraries such as shUnit2 or Bats out there, and those were the solutions we came across first. But our use case was only about improving the stability of our builds. Investing in a third-party library, as fun as that could have been, would have complicated the setup. It would also have made it harder for a colleague to fix an issue without prior knowledge of these frameworks. We decided it wasn't worth it.
Defining a simple test convention
We didn’t do anything about it until we ran into issues that we couldn’t easily explain. That’s when we realized we needed some kind of test that runs on every change. We thought: “Wait, we're only one Jenkinsfile away from this!” Because Jenkins automatically scans our Bitbucket repositories, we can use it to solve our problem. We didn’t need Jenkins to build anything in this case, but we could use it to run tests.
We then came up with a Jenkinsfile convention to run simple tests:
stage('NAME_OF_THE_BASH_FILE') {
    stages {
        stage('FIRST_TEST_DESCRIPTION') {
            steps {
                script {
                    // GIVEN
                    // here it's possible to set up and assess the test's preliminary state, for instance:
                    sh "docker volume create test-volume"
                    def volumes = callSh("docker volume ls --filter 'dangling=true'")
                    assert volumes.contains("test-volume")
                    // WHEN
                    sh "./NAME_OF_THE_BASH_FILE"
                    // THEN
                    // wrap up the test, for instance:
                    volumes = callSh("docker volume ls --filter 'dangling=true'")
                    assert !volumes.contains("test-volume")
                }
            }
        }
        stage('SECOND_TEST_DESCRIPTION') {
            …
        }
    }
}
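Each of these per-script stages nests inside an ordinary declarative pipeline. The following is only a minimal sketch of the surrounding Jenkinsfile, with a placeholder agent and stage names, to show where the convention fits:

pipeline {
    agent any    // placeholder; in practice, a node with Docker available
    stages {
        stage('docker-clean-up.sh') {   // one top-level stage per bash script
            stages {
                stage('Removes dangling volumes') {
                    steps {
                        // GIVEN / WHEN / THEN steps as shown above
                        echo 'test body goes here'
                    }
                }
            }
        }
        // ... further top-level stages, one per script under test
    }
}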
callSh() comes from a file named callSh.groovy, which has the following content:
def call(command) {
    return sh(script: command, returnStdout: true).trim()
}
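How callSh becomes available to the Jenkinsfile isn't shown here. The call(command) signature matches the Jenkins shared-library vars/ convention, but if the file simply sits next to the Jenkinsfile in the repository (an assumption on our side), the built-in load step can pull it in from an early script block:

script {
    // 'load' evaluates callSh.groovy and returns its script object;
    // since the file defines call(command), the object can be invoked as callSh(...)
    callSh = load 'callSh.groovy'
    echo callSh('echo hello')   // prints "hello"
}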
For example, to check that dangling Docker volumes are removed from the machine before the next build, we run this test:
stage('Removes dangling volumes') {
    steps {
        script {
            // GIVEN
            sh "docker volume create test-volume"
            def volumes = callSh("docker volume ls --filter 'dangling=true'")
            assert volumes.contains("test-volume")
            // WHEN
            sh "./docker-clean-up.sh"
            // THEN
            volumes = callSh("docker volume ls --filter 'dangling=true'")
            assert !volumes.contains("test-volume")
        }
    }
}
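One caveat: if docker-clean-up.sh or an assertion fails, the test-volume created in the GIVEN step stays behind on the agent. A post section on the stage, not part of the suite shown here but a possible safeguard, can clean up regardless of the outcome:

stage('Removes dangling volumes') {
    steps {
        echo 'GIVEN / WHEN / THEN steps as above'
    }
    post {
        always {
            // remove the fixture volume even when the test fails,
            // so it never leaks into the next build
            sh "docker volume rm -f test-volume || true"
        }
    }
}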
The 'GIVEN, WHEN, THEN' convention and the Java-like Groovy syntax make it easy for all our developers to get familiar with the test suite.
The setup can even be extended by leveraging other Jenkins pipeline features. For instance, tests can be made conditional with the when directive, like this:
stage('Leaves already mounted volume') {
    when {
        expression {
            return callSh("df").contains("/home/ubuntu/ec2")
        }
    }
    …
That’s it: a successful example of bash script testing made simple. You can have a look at the full test suite here. Have fun replicating the setup or tweaking it to fit your needs!