Hello Newman - How to Build a CI/CD Pipeline that Executes API Tests

Ed LeGault - Jun 26 '20 - Dev Community

Hello Newman

Executing automated tests in a CI/CD pipeline can be made simple and easy when using containers. After an image is built, it can be started and tested to decide whether it is a viable candidate to push to the registry. In this article we will walk through a demo of a GitLab CI/CD pipeline. We will build a simple Express REST API application based on a Node image, run the application with docker-compose, and execute and verify automated tests against the API. The automated tests are written in JavaScript and are executed with the Postman CLI tool called Newman. Verifying that the running container passes automated tests as early in the pipeline as possible is a very good thing. Let's take a look at how to accomplish this.

Prerequisites

To follow along you will need Docker and docker-compose installed locally, the Postman application for editing and running the tests, and access to a GitLab instance with a Docker-capable CI/CD runner.

Building the Application

We need to start with an example application. For this exercise we will be making an API that returns members of boy bands. It will be a Node app using Express to keep it simple. Create a file named "index.js" that contains the following code:

var express = require("express");
var app = express();

app.listen(3000, () => {
    console.log("Server running on port 3000");
});

app.get("/newkids", (req, res, next) => {
    res.json(["Donnie","Joey","Jordan","Jonathan","Danny","David"]);
});

app.get("/nsync", (req, res, next) => {
    res.status(501);
    res.send("Not Implemented");

});

app.get("/health", (req, res, next) => {
    res.json("Active");
});

Also make a file named "package.json" that contains the following code:

{
  "name": "ci-node",
  "version": "1.0.0",
  "main": "index.js",
  "dependencies": {
    "express": "^4.17.1"
  }
}

We are going to package the application as a Docker image, so we need a Dockerfile. If you need an intro to Docker to better follow along you can get started here. Create a file named "Dockerfile" with the following contents:

FROM node:13.1.0

WORKDIR /usr/src/app

COPY package*.json ./

RUN npm install

COPY . .

EXPOSE 3000

CMD [ "node", "index.js" ]


You can now build the application locally with the following command:

docker build -t ci-demo .

To verify that the application works you can execute the following command:

docker run -d -p 3000:3000 ci-demo:latest

The application is now available at http://localhost:3000. The following commands (using curl as an example) should return the expected results or status codes for each endpoint:
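
curl http://localhost:3000/health    # expect a 200 with the body "Active"
curl http://localhost:3000/newkids   # expect a 200 with the JSON array of band members
curl -i http://localhost:3000/nsync  # expect a 501 Not Implemented status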

Writing and executing automated API tests

There are many frameworks available that will execute automated tests against a REST API. For the purposes of this demonstration we are going to use Postman. Most people are familiar with Postman, or have used it in the past, to test an API manually. Postman also has the capability of creating, editing, and executing automated tests against the same API. The tests are written in JavaScript and can be imported and exported in JSON format. More information and tutorials are available on the Postman Learning Center.

Create a new directory named api-tests. Within the api-tests directory create a file named "ci-demo.postman_collection.json" that contains the following JSON:

{
    "variables": [],
    "info": {
        "name": "ci-demo",
        "_postman_id": "eaba83b1-65f2-e995-adcf-e07477fe67dd",
        "description": "",
        "schema": "https://schema.getpostman.com/json/collection/v2.0.0/collection.json"
    },
    "item": [
        {
            "name": "Check Health",
            "event": [
                {
                    "listen": "test",
                    "script": {
                        "type": "text/javascript",
                        "exec": [
                            "tests[\"Body matches string\"] = responseBody.has(\"Active\");",
                            "tests[\"Status code is 200\"] = responseCode.code === 200;"
                        ]
                    }
                }
            ],
            "request": {
                "url": "http://{{host_name}}:{{host_port}}/health",
                "method": "GET",
                "header": [],
                "body": {},
                "description": ""
            },
            "response": []
        },
        {
            "name": "Check Not Implemented",
            "event": [
                {
                    "listen": "test",
                    "script": {
                        "type": "text/javascript",
                        "exec": [
                            "tests[\"Body matches string\"] = responseBody.has(\"Not Implemented\");",
                            "tests[\"Status code is 501\"] = responseCode.code === 501;"
                        ]
                    }
                }
            ],
            "request": {
                "url": "http://{{host_name}}:{{host_port}}/nsync",
                "method": "GET",
                "header": [],
                "body": {},
                "description": ""
            },
            "response": []
        },
        {
            "name": "Check New Kids",
            "event": [
                {
                    "listen": "test",
                    "script": {
                        "type": "text/javascript",
                        "exec": [
                            "tests[\"Status code is 200\"] = responseCode.code === 200;",
                            "",
                            "var jsonData = JSON.parse(responseBody);",
                            "tests[\"Make sure Donnie is in the list\"] = jsonData[0] === \"Donnie\";",
                            "tests[\"Make sure Mark is not in the list\"] = !responseBody.has(\"Mark\");"
                        ]
                    }
                }
            ],
            "request": {
                "url": "http://{{host_name}}:{{host_port}}/newkids",
                "method": "GET",
                "header": [],
                "body": {},
                "description": ""
            },
            "response": []
        }
    ]
}

Also in the api-tests directory create a file named "ci.postman_environment.json" that contains the following JSON:

{
  "id": "6a67c87a-36b4-126c-2696-b6256e7be90b",
  "name": "ci",
  "values": [
    {
      "enabled": true,
      "key": "host_name",
      "value": "app",
      "type": "text"
    },
    {
      "enabled": true,
      "key": "host_port",
      "value": "3000",
      "type": "text"
    }
  ]
}

The next thing we are going to do is import our tests into the Postman application. Importing the file will create a new collection: click the "Import" button in the top left, then either drag and drop the file or click in the window and select your "ci-demo.postman_collection.json" file. Your screen should now look like this:
[Screenshot: the Postman import dialog with ci-demo.postman_collection.json selected]

Click the "Import" button and you should see your collection on the left side of the screen.

Expand the collection and select one of the requests, then select the "Tests" tab. It should look like this:
[Screenshot: the imported collection with a request selected and the "Tests" tab open]

You can see that this particular test verifies that a GET call to /newkids returns a 200 response code and that the body returned contains the value "Donnie". It also verifies that the body returned does not contain the value "Mark". These are very simple examples, but there are many more to explore by clicking on the "SNIPPETS" section on the right side or by writing your own.
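
The collection above uses the older tests[...] script syntax, which Newman still runs. If you prefer, roughly the same checks could be written with Postman's newer pm.* API; this is only a sketch of what that might look like, not what the collection above contains:

// Rough pm.* equivalent of the "Check New Kids" test script
pm.test("Status code is 200", function () {
    pm.response.to.have.status(200);
});

pm.test("Make sure Donnie is in the list", function () {
    var jsonData = pm.response.json();
    pm.expect(jsonData[0]).to.eql("Donnie");
});

pm.test("Make sure Mark is not in the list", function () {
    pm.expect(pm.response.text()).to.not.include("Mark");
});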

You may have noticed that the URL we are testing has some variables in it. The URL in our tests is actually "http://{{host_name}}:{{host_port}}". This is because Postman allows variables to be defined per environment, and you can switch between environments in the top right of the screen. This is important for CI because, later, when these tests are executed in docker-compose, the hostname is going to be a different value than when you execute these tests locally.

The next thing we need to do, so that our tests can be executed locally, is to create a local environment with our local settings. Click the gear in the top right of the screen to get to the "Manage Environments" screen and click the "Add" button. For the name enter "local" and click the "Add a new variable" text. In the "Variable" column enter "host_name" and for the "Initial Value" enter "localhost"; the "Current Value" column should auto-fill with what you entered for the initial value. Do the same for a "host_port" variable with a value of "3000". Your screen should look like this:
[Screenshot: the "Manage Environments" screen with the "local" environment variables filled in]

You should now be able to execute your tests locally by selecting the "local" environment you just created and clicking "Send" on your various requests.
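
If you would rather not click through the UI, you could also import a local environment file alongside the ci one. A minimal sketch, mirroring the ci file above (the filename "local.postman_environment.json" is just a suggestion), might look like this:

{
  "name": "local",
  "values": [
    {
      "enabled": true,
      "key": "host_name",
      "value": "localhost",
      "type": "text"
    },
    {
      "enabled": true,
      "key": "host_port",
      "value": "3000",
      "type": "text"
    }
  ]
}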

Executing API Tests With Newman and docker-compose

We are now going to use the Postman CLI tool named "Newman" to execute the same API tests we just imported. We are going to use docker-compose to start our sample application, as well as the Newman CLI. If you need to get up to speed on docker-compose you can do so here.

Create a file named "docker-compose.yaml" in the base directory of your project with the following contents:

version: "3"

services:
  app:
    image: ci-demo:latest
    ports:
      - "3000"
  app-test:
    image: postman/newman_ubuntu1404:4.5.5
    depends_on:
      - app
    entrypoint: [""]
    volumes:
      - $PWD/api-tests:/etc/newman
    command: >
        newman run ci-demo.postman_collection.json 
        --environment="ci.postman_environment.json" 
        --reporters cli,junit --reporter-junit-export /etc/newman/api-results.xml

As you can see, the first service definition, "app", is our application. The "app-test" service is an image that contains the Newman CLI tool. We map a volume in the app-test service that mounts our api-tests directory at /etc/newman, which lets the tests be executed from the default location within the container. We then execute a "newman run" command, passing the collection to run, the environment file to use, the reporters to output the results with, and where to write those results. Notice that the name of our application's service is "app". This means that when the Newman CLI is executed it can run the API tests against the hostname "app", which is why we specify a separate environment file with a "host_name" value of "app". You can now execute the tests by running the following command:

docker-compose up --abort-on-container-exit --exit-code-from app-test

If everything worked correctly, a file named "api-results.xml" in JUnit test output format should be written to your api-tests folder.
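
If you want to run the same collection against a locally running container without docker-compose, Newman can also be installed with npm. This is a sketch, assuming the container was started with -p 3000:3000 and that the host and port are supplied on the command line instead of through an environment file:

npm install -g newman

# point the collection at the locally published port
newman run api-tests/ci-demo.postman_collection.json \
  --env-var "host_name=localhost" \
  --env-var "host_port=3000" \
  --reporters cli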

Creating a Pipeline

To put this all together we are going to combine all of these steps in a CI/CD pipeline. For example purposes I have a file defining a pipeline in GitLab CI/CD. If you don't have access to GitLab you can execute similar steps in other pipeline systems. This is an example of how you can build the application and test it when code is committed:

stages:
  - build
  - test

build:
  tags:
    - docker
  stage: build
  image: docker:latest
  services:
    - docker:dind
  script:
    - docker build -t ci-demo:latest .

test:
  tags:
    - docker
  stage: test
  image: tmaier/docker-compose:latest
  services:
    - docker:dind
  script:
    - docker-compose up --abort-on-container-exit --exit-code-from app-test
  after_script:
    - docker-compose down --rmi local --remove-orphans   
  artifacts:
    paths:
      - api-tests/api-results.xml
    reports:
      junit: api-tests/api-results.xml

Advanced Stuff I Left Out

  • In a more real-world version of a pipeline like this you would want to use a more dynamic tag than "latest" and would also want to pass that tag as an argument to the compose command (see the sketch after this list).
  • You would also push the image to a container registry and then pull it in subsequent steps.
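
As a rough illustration of both points, the build job could tag the image with GitLab's predefined $CI_COMMIT_SHORT_SHA variable and push it to the project's container registry. This is only a sketch under those assumptions, not part of the demo above:

build:
  tags:
    - docker
  stage: build
  image: docker:latest
  services:
    - docker:dind
  script:
    # log in to the project's registry using GitLab's predefined CI variables
    - docker login -u "$CI_REGISTRY_USER" -p "$CI_REGISTRY_PASSWORD" "$CI_REGISTRY"
    # tag with the short commit SHA instead of "latest"
    - docker build -t "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA" .
    - docker push "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA"

The docker-compose file could then reference that tag through variable substitution, for example image: "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA", so the test stage runs against the exact image that was just built.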

Conclusion

Being able to execute automated tests against your application as early as possible in a CI/CD pipeline leads to a great deal of time saved and a lower defect count. Allowing the same automated tests to be executed both locally and in the pipeline is also key to making this level of testing functional and useful.
