Terraform Ecosystem Pipelines
The ecosystem surrounding Terraform is growing every day. Some of these tools have become essential and should be integrated into your CI/CD pipelines. Here are some examples using Azure DevOps.
Introduction
This is a quick overview of some tools you should check out if you want to work properly with Terraform. Terraform has sparked an entire ecosystem of other tools to make your life better. Terraform itself has some testing integrated; even if it's not the greatest, it's better than nothing, right? terraform fmt formats your code, and terraform validate checks for invalid configurations. If you want to do more, you have come to the right place.
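Both built-in checks are easy to wire into a pipeline. A minimal sketch of a single script step (the working directory and an installed Terraform CLI are assumptions):

```yaml
# Minimal sketch: run Terraform's built-in checks in one script step.
# Assumes the Terraform CLI is available on the agent.
steps:
  - script: |
      set -euo pipefail
      terraform fmt -check -recursive   # fail if any file is not formatted
      terraform init -backend=false     # initialize providers without a backend
      terraform validate                # check for invalid configurations
    displayName: 'Terraform built-in checks'
    workingDirectory: '$(Build.SourcesDirectory)'
```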
TL;DR
You can find the pipeline code in the repo terraform-pipelines on my GitHub account. All pipelines are written for Azure Pipelines and make heavy use of templating.
Prerequisites
The pipelines are written for Azure Pipelines, and they have some prerequisites.
- secure files
  - authentication for Terraform is done using secure files - two files per environment (see the sketch after this list for how they are consumed):
    - env_<env name>_backend.sec.tfvars
      - credentials for the azurerm backend
    - env_<env name>.sec.tfvars
      - credentials for the azurerm provider to access resources
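For illustration, here is a rough sketch of how these secure files might be consumed in a job using the built-in DownloadSecureFile task. The environment name dev and the step names are assumptions; the actual templates in the repo may wire this up differently.

```yaml
# Sketch: download the per-environment secure tfvars files and pass them to Terraform.
# The environment name 'dev' and the step names are illustrative assumptions.
steps:
  - task: DownloadSecureFile@1
    name: backendVars
    inputs:
      secureFile: 'env_dev_backend.sec.tfvars'   # credentials for the azurerm backend

  - task: DownloadSecureFile@1
    name: envVars
    inputs:
      secureFile: 'env_dev.sec.tfvars'           # credentials for the azurerm provider

  - script: |
      set -euo pipefail
      terraform init -backend-config="$(backendVars.secureFilePath)"
      terraform plan -var-file="$(envVars.secureFilePath)" -out=tfplan
    displayName: 'terraform init & plan'
```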
Pipelines
All files related to pipelines are stored in the repo root in the folder .azuredevops. In there are two folders:
- pipelines
  - contains all pipeline definitions - these files are added in Azure Pipelines (a sketch of such a definition follows this list)
- templates
  - three subfolders - one for each kind of template:
    - steps
    - jobs (empty in this example)
    - stages
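To illustrate how these pieces fit together, a pipeline definition under .azuredevops/pipelines could reference a stage template roughly like this. The template file name, relative path and parameters are assumptions, not the repo's actual names.

```yaml
# Sketch of a pipeline definition referencing a stage template.
# The relative path, template name and parameters are illustrative.
trigger:
  branches:
    include:
      - main

stages:
  - template: ../templates/stages/terraform-deploy.yml
    parameters:
      environment: dev
```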
Infrastructure Deployment
The bread and butter - a pipeline to deploy your Terraform configuration. It consists of two stages: the first creates a terraform plan and checks whether changes were made. If there are changes, the second stage - terraform apply - will run; otherwise, it is skipped and the run is complete.
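One common way to implement the "skip apply when nothing changed" logic is terraform plan -detailed-exitcode combined with an output variable and a stage condition. A rough sketch follows; the stage, job, step and artifact names are chosen for illustration, and the secure-file handling from the prerequisites is omitted for brevity.

```yaml
# Sketch: two-stage deployment where apply only runs if the plan detected changes.
# Names are illustrative; backend/provider credentials are omitted for brevity.
stages:
  - stage: Plan
    jobs:
      - job: PlanJob
        steps:
          - script: |
              terraform init
              # -detailed-exitcode: 0 = no changes, 1 = error, 2 = changes present
              exitcode=0
              terraform plan -detailed-exitcode -out=tfplan || exitcode=$?
              if [ "$exitcode" -eq 1 ]; then exit 1; fi
              if [ "$exitcode" -eq 2 ]; then
                echo "##vso[task.setvariable variable=changes;isOutput=true]true"
              fi
            name: PlanStep
            displayName: 'terraform plan'
          - publish: tfplan
            artifact: tfplan

  - stage: Apply
    dependsOn: Plan
    condition: and(succeeded(), eq(dependencies.Plan.outputs['PlanJob.PlanStep.changes'], 'true'))
    jobs:
      - job: ApplyJob
        steps:
          - download: current
            artifact: tfplan
          - script: |
              terraform init
              terraform apply -auto-approve "$(Pipeline.Workspace)/tfplan/tfplan"
            displayName: 'terraform apply'
```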
Terraform Docs
To automatically create a docs.md for the root module as well as any other module, this pipeline uses terraform-docs and commits the generated documentation directly during the pipeline run. This pipeline is meant to be used as a build-validation pipeline for pull requests.
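At its core, such a pipeline is a script step that runs terraform-docs and pushes the result back to the pull request's source branch. A hedged sketch; the commit identity, commit message and branch handling are assumptions, and the real templates may differ.

```yaml
# Sketch: generate docs.md with terraform-docs and commit it back to the PR source branch.
# Requires the build service account to have permission to push to the repo.
steps:
  - checkout: self
    persistCredentials: true

  - script: |
      set -euo pipefail
      # PR validation builds check out a detached merge commit, so switch to the
      # actual source branch first (simplified; the repo's templates may differ).
      SOURCE_BRANCH="$(System.PullRequest.SourceBranch)"
      SOURCE_BRANCH="${SOURCE_BRANCH#refs/heads/}"
      git fetch origin "$SOURCE_BRANCH"
      git checkout -B "$SOURCE_BRANCH" FETCH_HEAD

      terraform-docs markdown table --output-file docs.md .

      git config user.email "pipeline@example.com"
      git config user.name "Azure Pipelines"
      git add docs.md
      # only commit and push if the documentation actually changed
      if ! git diff --cached --quiet; then
        git commit -m "docs: update module documentation [skip ci]"
        git push origin "$SOURCE_BRANCH"
      fi
    displayName: 'terraform-docs'
```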
terraform fmt
Formats the Terraform code according to best practices and commits the changes directly to the current branch. Meant to be used as a build-validation pipeline in your pull requests. Official docs here.
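The shape is the same as the terraform-docs pipeline above, just with the formatting step swapped in. Again a sketch with illustrative names, not the repo's exact template:

```yaml
# Sketch: run terraform fmt and push any resulting changes back to the PR source branch.
steps:
  - checkout: self
    persistCredentials: true

  - script: |
      set -euo pipefail
      SOURCE_BRANCH="$(System.PullRequest.SourceBranch)"
      SOURCE_BRANCH="${SOURCE_BRANCH#refs/heads/}"
      git fetch origin "$SOURCE_BRANCH"
      git checkout -B "$SOURCE_BRANCH" FETCH_HEAD

      terraform fmt -recursive

      git add -A
      if ! git diff --cached --quiet; then
        git config user.email "pipeline@example.com"
        git config user.name "Azure Pipelines"
        git commit -m "style: terraform fmt [skip ci]"
        git push origin "$SOURCE_BRANCH"
      fi
    displayName: 'terraform fmt'
```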
Infrastructure Validation
This is where the magic happens: this pipeline runs several tests against your Terraform code and publishes the results in JUnit format to the pipeline. If there are any errors, the run fails.
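The post does not prescribe a specific tool set, so purely as an illustration, here is a sketch using tflint and checkov, both of which can emit JUnit XML. The tool choice and file names are assumptions.

```yaml
# Sketch: run static analysis and publish the results as JUnit test results.
# tflint/checkov are illustrative choices; the actual pipeline may use other tools.
steps:
  - script: |
      # '|| true' so both tools always run and produce reports;
      # the publish step below fails the run if any findings were reported.
      tflint --format junit > tflint-results.xml || true
      checkov --directory . --output junitxml > checkov-results.xml || true
    displayName: 'Run Terraform static analysis'

  - task: PublishTestResults@2
    condition: always()
    inputs:
      testResultsFormat: 'JUnit'
      testResultsFiles: '**/*-results.xml'
      failTaskOnFailedTests: true
```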
Conclusion
This is not all - there are many more tools for Terraform - but these are some of my favorites nevertheless. They are a great way to get some level of automated testing as well as automated documentation.