Series Introduction
Welcome to Part 6 of this blog series, which starts from the most basic example of a .NET 5 Web API in C# and follows the journey from development to production with a shift-left mindset. We will use Azure, Docker, GitHub, GitHub Actions for CI/CD, and Infrastructure as Code using Pulumi.
In this post we will be looking at:
- Unit Testing - with Docker and GitHub Actions
TL;DR
We optimised the image layering by ensuring we `COPY` our `.csproj` files separately so that the Docker build engine can use its build cache more effectively.
We add a basic unit test (in principle) and use the .NET Core Test Explorer extension for ease of use within VS Code.
We build two images - one for unit testing and our final image - by using the `--target` option on the `docker build` command. This enabled us to run our unit tests in Docker separately (they need the SDK to run).
We add the same capability to our GitHub Actions workflow, persist the unit test results as an artifact, and use a GitHub Action, GitHub Actions - Test Reporter, to publish a pretty test report against the build job.
GitHub Repository
peteking / Samples.WeatherForecast-Part-6
This repository is part of the blog post series, API's from Dev to Production - Part 6 on dev.to. Based on the standard .NET Weather API sample.
Introduction
In Part 5 we added a health check to our API and even to our Dockerfile. It's a natural progression to move on to another key capability we should have in our API engineering flow - unit tests!
We won't go in depth into good practices around unit testing per se, nor will I describe the different types of unit testing: sociable and solitary.
What we will cover is:
- How to add unit testing to the out-of-the-box weather API we have
- How that works with our Docker setup
- How to run unit tests locally
- How to run unit tests in GitHub Actions
- How to report the unit test results in GitHub Actions
Requirements
We will be picking-up where we left off in Part 5, which means you’ll need the end-result from GitHub Repo - Part 5 to start with.
I would encourage you to follow this series all the way through, but it isn't necessary if the previous posts cover ground you already know.
Additional Requirements
We will need one further VS Code extension that will prove useful:
- .NET Core Test Explorer - https://marketplace.visualstudio.com/items?itemName=formulahendry.dotnet-test-explorer
Add new unit test project
Open any terminal such as Windows Terminal.
Navigate to the root folder of the project.
In my case, I've put it into a GitHub folder under a folder called, 'Samples.WeatherForecast'.
Execute:
dotnet new xunit -o ./test/Samples.WeatherForecast.Api.UnitTest
Add reference
We now need to add a reference to our API to our unit test project.
Staying in the same terminal session; ensuring you are still in the root directory.
Execute:
dotnet add ./test/Samples.WeatherForecast.Api.UnitTest/Samples.WeatherForecast.Api.UnitTest.csproj reference ./src/Samples.WeatherForecast.Api/Samples.WeatherForecast.Api.csproj
Open VS Code
Let's get started and open VS Code.
TIP
In your terminal, execute:
code .
You should now have your API project as before, but with a new `test` folder containing your xUnit test project too.
Navigate to the test project and click `UnitTest1.cs`.
Unit test
- Let's rename the file from `UnitTest1.cs` to `WeatherForecastControllerTests.cs`.
- Rename the class from `UnitTest1` to `WeatherForecastControllerTests`, keeping the namespace `Samples.WeatherForecast.Api.UnitTest`.
- Overwrite the existing test with a new test called `ShouldReturnAListOfValues()`.
Full code
```csharp
using System;
using Xunit;
using Microsoft.Extensions.Logging.Abstractions;

namespace Samples.WeatherForecast.Api.UnitTest
{
    public class WeatherForecastControllerTests
    {
        [Fact]
        public void ShouldReturnAListOfValues()
        {
            // Arrange
            var logger = new NullLogger<Samples.WeatherForecast.Api.Controllers.WeatherForecastController>();
            var service = new Samples.WeatherForecast.Api.Controllers.WeatherForecastController(logger);

            // Act
            var result = service.Get();

            // Assert
            Assert.NotNull(result);
        }
    }
}
```
Your VS Code screen should look like mine below:
Now, this is not the best unit test in the world, but the principle is still the same.
This is to demonstrate how you can set it up and where to put your unit tests.
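If we wanted the test to say a little more, we could also assert on the shape of the result. Here's a hedged sketch of a stronger test - it assumes the unmodified template controller, which generates five forecasts via `Enumerable.Range(1, 5)`:

```csharp
using System.Linq;
using Xunit;
using Microsoft.Extensions.Logging.Abstractions;

namespace Samples.WeatherForecast.Api.UnitTest
{
    public class WeatherForecastControllerShapeTests
    {
        [Fact]
        public void ShouldReturnFiveForecasts()
        {
            // Arrange - NullLogger satisfies the controller's ILogger dependency
            var logger = new NullLogger<Samples.WeatherForecast.Api.Controllers.WeatherForecastController>();
            var controller = new Samples.WeatherForecast.Api.Controllers.WeatherForecastController(logger);

            // Act
            var result = controller.Get();

            // Assert - the template controller generates exactly 5 forecasts;
            // if you've changed the controller, adjust this expectation accordingly
            Assert.Equal(5, result.Count());
        }
    }
}
```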
Navigate to the .NET Core Test Explorer
You'll notice that it says "Test1" - this was the original test method name in our `UnitTest1.cs` file; the explorer needs to re-scan to find the tests.
Click Refresh
You should see our new test method popup like below:
Click the Play button
You should see a nice green tick in two places, on the test explorer, and in your code against the test method.
Dockerfile
Now we have the unit test(s) working, we need to modify our Dockerfile, and I'm sorry to say, but we are going to change it quite a lot...
- We need to add the unit test stage, which will introduce a new `ENTRYPOINT`.
- Whilst we are here, we can optimise the build process a little too, and I'll explain why this is important - it goes back to Part 2, where we optimised for size and went through how the Docker build cache works. However, we haven't taken full advantage of it here... and we should!
Let's start with the optimisations
We are going to start with the optimisations simply because for our unit tests to run, we will need some of those changes.
We will go from line 1 to the end of the file just so it's all nice & clear.
Feel free to clear your `Dockerfile`.
```dockerfile
ARG VERSION=5.0-alpine
FROM mcr.microsoft.com/dotnet/runtime-deps:${VERSION} AS base
WORKDIR /app
EXPOSE 8080
# HEALTHCHECK --interval=60s --timeout=3s --retries=3 \
#     CMD wget localhost:8080/health -q -O - > /dev/null 2>&1
```
Here we set a variable for the version, set up our base image, set our working directory to `/app`, `EXPOSE` port 8080, and keep our commented-out `HEALTHCHECK`; we don't really need it, but it's nice to have on hand if we do. This was at the end of our `Dockerfile` previously, but I've moved it to the top as it's a common thing to see in multi-stage builds.
```dockerfile
FROM mcr.microsoft.com/dotnet/sdk:${VERSION} AS build
WORKDIR /code

# Copy and restore as distinct layers
COPY ["src/Samples.WeatherForecast.Api/Samples.WeatherForecast.Api.csproj", "src/Samples.WeatherForecast.Api/Samples.WeatherForecast.Api.csproj"]
COPY ["test/Samples.WeatherForecast.Api.UnitTest/Samples.WeatherForecast.Api.UnitTest.csproj", "test/Samples.WeatherForecast.Api.UnitTest/"]
```
Here we are using the dotnet SDK image and label it as our `build` stage. We set the `WORKDIR` to `/code` - I didn't want to call it `/app` as I wanted a clear difference between the code to build and the final deploy folder. The most important thing to note is that we are now specifically using `COPY` to copy our API project file and our unit test project file.
Why copy each one instead of what we had before, which was `COPY . .`? Because if any file changes, the entire layer has to be re-created. If we copy each project file separately, Docker creates a new layer per `COPY` instruction and can use its build cache to decide whether each layer needs rebuilding - Docker will highly optimise the build process.
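To make the caching behaviour concrete, here's a sketch contrasting the two approaches using our project paths (the comments are my own annotations):

```dockerfile
# Before: one big layer - touching ANY file invalidates it,
# and every later layer (including the slow restore) is rebuilt.
# COPY . .

# After: the .csproj files get their own layers, so the restore
# layers below them are reused until a .csproj actually changes.
COPY ["src/Samples.WeatherForecast.Api/Samples.WeatherForecast.Api.csproj", "src/Samples.WeatherForecast.Api/Samples.WeatherForecast.Api.csproj"]
COPY ["test/Samples.WeatherForecast.Api.UnitTest/Samples.WeatherForecast.Api.UnitTest.csproj", "test/Samples.WeatherForecast.Api.UnitTest/"]
```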
```dockerfile
RUN dotnet restore "src/Samples.WeatherForecast.Api/Samples.WeatherForecast.Api.csproj" -r linux-musl-x64
RUN dotnet restore "test/Samples.WeatherForecast.Api.UnitTest/Samples.WeatherForecast.Api.UnitTest.csproj" -r linux-musl-x64

COPY . .
```
We then run `dotnet restore` for each project, similar to what we had before. Finally, we execute `COPY . .` to copy any other files that could have changed.
```dockerfile
# Build
RUN dotnet build \
    "src/Samples.WeatherForecast.Api/Samples.WeatherForecast.Api.csproj" \
    -c Release \
    --runtime linux-musl-x64 \
    --no-restore

RUN dotnet build \
    "test/Samples.WeatherForecast.Api.UnitTest/Samples.WeatherForecast.Api.UnitTest.csproj" \
    -c Release \
    -r linux-musl-x64 \
    --no-restore
```
Now we build each project using `dotnet build`, ensuring we set `--no-restore`, just like we did before.
```dockerfile
# Unit test runner
FROM build AS unit-test
WORKDIR /code/test/Samples.WeatherForecast.Api.UnitTest
ENTRYPOINT dotnet test \
    -c Release \
    --runtime linux-musl-x64 \
    --no-restore \
    --no-build \
    --logger "trx;LogFileName=test_results_unit_test.trx"
```
This is a big change to note here - this is our unit test runner.
- We use `FROM build` as we need the .NET SDK; we label this stage `unit-test`.
- We set the `WORKDIR`.
- We create a new `ENTRYPOINT` and execute `dotnet test`.
This is where the magic happens!
We have another `ENTRYPOINT` - how can that work, I hear you ask?
Well, it comes down to how Docker processes the `Dockerfile` - the last `ENTRYPOINT` instruction is the one it will run; kind of like a default, if you will.
So now I hear you ask, how can we make use of two `ENTRYPOINT` instructions then...?
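To illustrate the idea, here's a stripped-down sketch of the shape of the file - not our full Dockerfile, just the stages and their entry points:

```dockerfile
FROM mcr.microsoft.com/dotnet/sdk:5.0-alpine AS build
# ... restore and build both projects ...

FROM build AS unit-test
# This ENTRYPOINT only applies when the build stops at this stage
ENTRYPOINT ["dotnet", "test"]

FROM mcr.microsoft.com/dotnet/runtime-deps:5.0-alpine AS final
# In a full build, this last ENTRYPOINT is the one that wins
ENTRYPOINT ["./Samples.WeatherForecast.Api"]
```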
Part of the `docker build` command is an option called `--target`.
For more information about the `docker build` command, please see Docker Docs - docker build - target.
`--target` will build up to a specific build stage. The stage name we need is defined in the `FROM` statement - it's the `AS` label we're after. In our case we have `FROM build AS unit-test`; therefore, the build stage name is `unit-test`.
When we wish to execute our unit test runner using Docker, we can simply build our `Dockerfile` up to this build stage; the remaining instructions in the `Dockerfile` are ignored.
Build [unit-test]
To build our Docker image, execute the following command in your terminal of choice:

```shell
docker build --target unit-test -t samples-weatherforecast-unit-test:v6 .
```
Run [unit-test]
If we wish to execute our unit tests, we can run the following command:

```shell
docker run --rm samples-weatherforecast-unit-test:v6
```

This will run the instructions in that build stage - in our case, our unit tests via `dotnet test`.
Test Results
When you execute `docker run` on this unit-test image, our `dotnet test` command outputs test results. However, the test results that `dotnet test` creates are stored inside the container; we need to get them out to our host, otherwise we can't see them.
This is where the bind mount option comes into play - we can use the `--volume` (or `-v` for short) option to map a container directory to a host directory when we use `docker run`.
For more information about bind volume mount please see, Docker Docs - docker run - Bind Volume Mount
Our `docker run` command can be modified to something like the following:

```shell
docker run --rm -v "${pwd}\TestResults:/code/test/Samples.WeatherForecast.Api.UnitTest/TestResults/" samples-weatherforecast-unit-test:v6
```
Info: `pwd` = Print Working Directory. Note that `${pwd}` works here because it's PowerShell's automatic `$pwd` variable; in Bash you would use `$(pwd)` instead.
Now when we execute our `docker run` command, we should see the test results file, called `test_results_unit_test.trx`.
Please ensure this works before continuing.
Let's make execution easier
It is a little repetitive executing these commands; sure, you can use the test explorer quite a lot, but having to remember these commands and type them out without any typos is rather annoying.
The best approach is to script it. You can use pretty much anything you want; as an example, I'm going to use trusty old PowerShell, but feel free to use Bash or even a Makefile.
Script running of unit tests
In your root directory, create a new file called `unit-test.ps1`.
Paste in the following code:
```powershell
$IMAGE_NAME_AND_TAG="samples-weatherforecast-unit-test:v6"

Write-Output "Unit tests [build]"
docker build --target unit-test -t $IMAGE_NAME_AND_TAG .

Write-Output "Unit tests [run]"
docker run --rm -v "${pwd}\TestResults:/code/test/Samples.WeatherForecast.Api.UnitTest/TestResults/" $IMAGE_NAME_AND_TAG
```
As you can see, it's pretty simplistic, we are just trying to make things easier for ourselves.
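One optional refinement - this is my own suggestion rather than part of the series so far: `docker` returns a non-zero exit code when the build or the tests fail, so the script can propagate that via `$LASTEXITCODE` and be used as a pass/fail gate:

```powershell
$IMAGE_NAME_AND_TAG="samples-weatherforecast-unit-test:v6"

Write-Output "Unit tests [build]"
docker build --target unit-test -t $IMAGE_NAME_AND_TAG .
# Stop early if the image failed to build
if ($LASTEXITCODE -ne 0) { exit $LASTEXITCODE }

Write-Output "Unit tests [run]"
docker run --rm -v "${pwd}\TestResults:/code/test/Samples.WeatherForecast.Api.UnitTest/TestResults/" $IMAGE_NAME_AND_TAG
# Non-zero if any test failed
exit $LASTEXITCODE
```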
Script the running of the build
Let's do the same thing for our build so we don't have to type the `docker build` command all the time.
In your root directory, create a new file called `build.ps1`.
Paste in the following code:
```powershell
$IMAGE_NAME_AND_TAG="samples-weatherforecast:v6"

Write-Output "App [build]"
docker build -t $IMAGE_NAME_AND_TAG .
```
How do I use these scripts?
Unit tests
Open your terminal and execute: `.\unit-test.ps1`.
This will build your test container, and run your unit tests.
Run this now please...
If all goes well, you should see that the unit test(s) have passed.
Build
Open your terminal and execute: `.\build.ps1`.
Run this one now too please...
Again, if all is fine, you should have your container image built.
Please make sure your container image has been built by double-checking it's there - `docker image ls` is your friend here.
In addition, run the container and test with Postman or your preferred method:

```shell
docker run -it --rm -p 8080:8080 samples-weatherforecast:v6
```
GitHub Actions
We have everything working locally and we are happy, let's move onto our CI; lovely GitHub Actions.
We will need to add some steps to the workflow we had before.
In the `env` section of our `build-and-push.yaml` file, let's add a new variable called `image-name-unit-tests`, code below:
```yaml
env:
  image-name: ghcr.io/peterjking/samples-weatherforecast-part-6:${{ github.sha }}
  image-name-unit-tests: unit-tests:latest
```
Next up is adding a new step, just below the `checkout repo` step. We need to build our unit-test runner image like so:
```yaml
- name: Unit tests [build]
  run: docker build --target unit-test -t ${{ env.image-name-unit-tests }} .
```
Immediately after this step, we need to run the unit tests, please add the step below:
```yaml
- name: Unit tests [run]
  run: docker run --rm -v ${{ github.workspace }}/path/to/artifacts/testresults:/code/test/Samples.WeatherForecast.Api.UnitTest/TestResults ${{ env.image-name-unit-tests }}
```
You'll notice we are doing the bind mount with `-v`, but it's different from our local script. This is because of how GitHub Actions works - it seems we cannot make use of `pwd`, but thankfully GitHub provides built-in variables we can take advantage of; in our case, we will use `github.workspace`.
The GitHub workspace directory path. The workspace directory is a copy of your repository if your workflow uses the actions/checkout action. If you don't use the actions/checkout action, the directory will be empty. For example, /home/runner/work/my-repo-name/my-repo-name.
For more information, please see, GitHub Docs - Environment variables
We now have the test results file in `trx` format on the GitHub Actions runner host. It would be nice if we could persist it against this build job... and we can! We can use the upload-artifact v2 action.
```yaml
- name: Unit tests [results]
  uses: actions/upload-artifact@v2
  if: always()
  with:
    name: unit-test-results
    path: ${{ github.workspace }}/path/to/artifacts/testresults/test_results_unit_test.trx
```
Commit your changes now, and ensure everything works.
As soon as you commit your workflow file, the build should kick off: your unit test image should be built, the image run, the test results output, and the test results stored against the build job itself.
You should see a screen like this:
We can do better!
Can we? Yes, we sure can... It's great we have an artifact stored against the job, but it's rather painful and annoying to download it, and open the trx file, or look through the build log.
There are a bunch of GitHub Actions that can help us out here, for me I've chosen, GitHub Actions - Test Reporter
Immediately after our artifact upload step, let's create another one like so:
```yaml
- name: Unit tests [publish]
  uses: dorny/test-reporter@v1
  if: always()
  with:
    name: Unit tests
    path: ${{ github.workspace }}/path/to/artifacts/testresults/test_results_unit_test.trx
    reporter: dotnet-trx
    token: ${{ secrets.GITHUB_TOKEN }}
```
This action will carry out some magic for us. The `token: ${{ secrets.GITHUB_TOKEN }}` is a built-in, predefined secret that gives the action access to the build job.
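One assumption worth calling out: depending on your repository's default `GITHUB_TOKEN` permissions, the test reporter may need explicit permission to create check runs. If the publish step fails with a permissions error, granting something like the following at the job or workflow level should help (a sketch, not taken from the workflow above):

```yaml
permissions:
  checks: write      # dorny/test-reporter creates a check run for the report
  contents: read     # actions/checkout needs to read the repository
```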
Please commit this change; your build should kick off once again, and hopefully you'll see the test results nicely formatted.
What have we learned?
We have learned how to subtly change our image layering to ensure the Docker build cache is as optimized as possible; yes, it's more code to write, but you'll save build time in the end, especially if your solution grows.
We've added a new unit test project using the command line, hooked up the references, and built an incredibly basic unit test (in principle).
We have taken advantage of the `--target` option to build a container image up to a certain point in our Dockerfile - in our case, up to our unit test runner. We have even scripted it to make our lives a little easier with a few lines of good old PowerShell :)
On top of this we have modified our GitHub Actions' Workflow in a similar vein to how we are executing locally; we build the unit test image, run the unit test using Docker, and build our final image.
Finally, we have persisted the unit test results as an artifact and have a nicely formatted report against our build job.
Next up
Part 7 in this series will be about:
- Code coverage