💡 Automatic Deployment of your project's dependency updates on GCP: Efficiency vs. Cost?

Jean-Phi Baconnais - Apr 30 - Dev Community

Updating the dependencies of your project is not an easy task. If you have to do this for each side project you have, you're gonna need a lot of caffeine! 💪

More seriously, this can take a lot of time, especially if you have a lot of components in your scope (side projects, open source projects, or your company's applications).

This month, I gave a talk with my Zenika colleague Lise at the DevoxxFR conference about Renovate and Dependabot, two great tools that help you automate your dependency upgrades.

⁉️ How do they work?

These tools analyze your applications according to their type (Node, Maven, Gradle, Docker, etc.). They scan all the dependencies and query the registries to find new versions suited to the project. Then they create a Pull Request (PR) or Merge Request (MR), depending on the platform your project is hosted on.

Renovate MR
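For reference, enabling Renovate on a repository usually comes down to committing a small configuration file at the root of the project. A minimal renovate.json could look like this sketch (the recommended preset is just a starting point, to adapt to your needs):

{
    "$schema": "https://docs.renovatebot.com/renovate-schema.json",
    "extends": ["config:recommended"]
}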

⬆️ Send these changes to production?

Updating your dependencies is a very good practice, especially if you have many projects you don't work on every day. Renovate and Dependabot check whether new versions of your dependencies are available and propose a PR/MR for you. Very useful since, unfortunately, vulnerabilities can come from any version of any dependency.

People working on the project can see the new PRs/MRs. After verifying their impact and testing them, developers can approve and merge them.

The main branch of your project is now up to date. A great step for your project, but…

📦 Enjoy the auto merge/auto tag

Renovate and Dependabot provide an “auto merge” feature to automatically merge the PRs/MRs they create. Be careful: this behavior can be very interesting, but it is very dangerous if your project has no tests, or no tests you trust. If a PR/MR contains a big, breaking change, your tests should detect it and fail. Rest assured, the auto merge will not happen in this case, but let's say it again: testing is very important 😁.

The auto merge can be configured so that only selected PRs/MRs are merged automatically, for example only patch or minor updates, or only updates to certain packages, as in the sketch below.
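Here is a sketch of a Renovate configuration enabling auto merge only for minor and patch updates (adapt the rules to the packages you actually trust):

{
    "packageRules": [
        {
            "matchUpdateTypes": ["minor", "patch"],
            "automerge": true
        }
    ]
}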

A patch or a new feature is now on your main branch. But when will it go to production, and who will be the lucky (fortunate?) person with the honor of deploying it?

This “reflection” takes time, and during this time, maybe someone is trying to hack your application.

What about auto-tagging the project? On top of Renovate or Dependabot, a Git workflow and another tool like semantic-release let you do this. It requires a bit of “gymnastics” in the CI/CD of your project, but it's not complicated. For example, with GitLab CI, you can check that the pipeline is running on the default branch of your project:
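Here is a minimal sketch of such a rule in .gitlab-ci.yml (the release job name and the Node image are illustrative choices):

release:
  image: node:20
  rules:
    # Run this job only when the pipeline runs on the default branch
    - if: $CI_COMMIT_BRANCH == $CI_DEFAULT_BRANCH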

and run the semantic-release command:
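Continuing the same sketch, the job's script section simply calls semantic-release (assuming semantic-release and its configuration are already set up in the project):

  script:
    # semantic-release analyzes the commit messages and creates the tag and release
    - npx semantic-release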

And the release is created, with the version number determined by the commits 🎉:

A release is created

🚀 The last step: auto deploy on GCP?

So far, you have configured Renovate or Dependabot, set up the auto merge behavior, and created a release containing the new version of a dependency. Great, but your production environment still doesn't have this fix. The ultimate goal would be to automate everything, from the creation of a PR/MR to its deployment in production. Let's move on to “auto deploy”.

Most of my personal and open source projects are deployed on Google Cloud Platform (GCP) serverless services such as Cloud Run or Cloud Functions. For my GitLab projects, I enjoy using Cloud Seed - a GitLab project born from the partnership between Google and GitLab - to deploy quickly on GCP. You can find more information about Cloud Seed in this blog post.

By default, for each branch or tag, a new service is automatically deployed on GCP as you can see in this example:

Cloud Run services

With Renovate and a CI/CD pipeline using Cloud Seed, each release generated by the Renovate auto merge deploys a new version of my application. It's great, but is everything OK?

🌱 What about the impact?

Let's imagine 10 MRs opened by Renovate: that means 10 pipelines triggered. If these 10 MRs are eligible for auto merge, they produce 10 releases, generated by 10 more pipelines.

And these 10 new releases generate 10 deployments on GCP, so 10 new revisions of our application.

When working on multiple projects or within a microservices infrastructure, the execution time and cost of your CI/CD pipelines can become significant. This graph shows the evolution of CI/CD compute usage after integrating Renovate on one of my personal projects:

Compute usage of CI/CD Pipeline

Therefore, it is crucial to anticipate the potential impacts of implementing Renovate or Dependabot on your projects: environmental impacts, related to the machine resources consumed by CI/CD, as well as financial ones, related to the overall cost incurred.

If you auto merge, auto tag and auto deploy your application, you can quickly end up with a long list of builds and deployments, an impact on your infrastructure, and also a significant cost, as you can see with Cloud Build here:

Cloud Build history

With Cloud Run, you pay for what you use. For the open-source application where I've implemented auto-deployment, I'm the only one using the API directly, so there's no price impact to worry about.

If your application is hosted on Kubernetes, for example, adding a service for each tag of your project can affect the infrastructure and the performance of your cluster.

🤔 How can you reduce these impacts?

There are two directions for improvement: on the Renovate side and on the GCP side.

With Renovate, you can group dependencies in a single PR/MR. In this example, the Angular and Angular CLI dependencies are bundled together:

{
    "packageRules": [
        {
            "extends": ["monorepo:angular", "monorepo:angular-cli"],
            "groupName": "all angular"
        }
    ]
}


Reducing the number of Renovate PRs/MRs on your project can benefit both your infrastructure and your teams by minimizing noise. To achieve this, you can limit the number of PRs/MRs proposed by Renovate using the prConcurrentLimit attribute.
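For example, a sketch limiting Renovate to five concurrent PRs/MRs in renovate.json (the value is arbitrary):

{
    "prConcurrentLimit": 5
}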

Another approach is to schedule the execution of your Renovate instance. This can be done using the schedule attribute, allowing you to run Renovate outside of your teams' working hours, for example early in the morning, at lunchtime, or in the evening.
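For example, a sketch restricting Renovate to early weekday mornings (the time window is only an illustration):

{
    "schedule": ["before 5am every weekday"]
}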

On Cloud Run, you can limit the number of instances of a service with the --max-instances attribute. This caps how far Cloud Run can scale out and prevents an overwhelming number of instances from running at any given time.
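A sketch with the gcloud CLI (the service name, region and limit are placeholders):

gcloud run services update my-service --region=europe-west1 --max-instances=3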

From a financial perspective, Cloud Build bills you for the build time your project consumes. Therefore, the more you build, the more you pay. What's the point of paying for 10 releases in a given timeframe if your project can easily wait and deliver only once? In my case, I'm only talking about serverless architectures, but as I mentioned earlier, for larger infrastructures the impact can be more significant. Following my discussion with Julien Landuré, it would be beneficial to have the deployment cost information directly included in the MR. I didn't find a way to do this with the gcloud CLI, but it would make it possible to see the cost and deploy only when it's necessary.

👀 To be continued

As I mentioned at the beginning of this blog post, this is an introduction to the “risks” of integrating tools like Renovate or Dependabot into your projects. While these tools offer valuable benefits by keeping projects up to date, it is crucial to carefully consider and anticipate their potential impacts.

Thanks to Séverine and Jimmy for your help and review 🙏
