Using Terraform workspaces for fun and profit – Part 2
In the first part of this series, we built a Terraform system that uses a single set of files to maintain three separate environments. The next step is to automate as much of this process as possible.
To do that, we’re going to leverage Azure DevOps to create a build pipeline for each environment, which will kick off when we merge code into the respective environment’s git branch.
Step 1: Create a git repo to store the .tf files
First you’ll need to sign in to Azure DevOps. Once signed in, select an existing organization or create a new one, and then select an existing project or create a new one.
Once you’ve got a project open, click the + symbol at the top of the left-hand panel and select ‘New repository’. Create a repository using your company’s naming convention (or, if you don’t have a particular convention, I prefer Terraform.DevOps.IT-CompanyName).
Finally, if you already have existing .tf files, create a branch called dev (git checkout -b dev), and then commit and push the files up to the repo.
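If you’re new to the git side of this, the whole sequence looks roughly like the following sketch; it assumes the repo you just created has already been added as the origin remote (or cloned locally), so adjust to suit your setup.

```
# Create the dev branch, then commit and push the existing .tf files.
# Assumes the Azure DevOps repo created above is already set as 'origin'.
git checkout -b dev
git add .
git commit -m "Initial Terraform configuration"
git push -u origin dev
```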
Step 2: Use Azure Blob Storage as the backend provider to store your state files
Since the alternative is hosting a state file in a single location (like your laptop), which may die, be stolen, or suffer other misfortunes, we want to tell Terraform to keep its state files in the cloud for easy, shared access.
Open up the Azure portal and browse to the Storage Accounts section. Create a new Storage Account and use the default options. Unless your Terraform environment is absolutely massive (e.g. greater than 1 gigabyte of .tf files), this account will use next to no storage, certainly less than the threshold to be charged $0.01 per month. Once the Storage Account is created, you’ll want to add the appropriate backend type to your Terraform configuration as defined here, and then run terraform init, which will initialize the backend so your state file is stored in the Storage Account.
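As a rough sketch, the backend block looks something like this. The resource group, storage account, container, and key names below are placeholders, so substitute whatever you actually created in the portal.

```
# backend.tf — minimal sketch of an azurerm backend configuration.
# All names below are placeholders; use your own Storage Account details.
terraform {
  backend "azurerm" {
    resource_group_name  = "rg-terraform-state"
    storage_account_name = "companyterraformstate"
    container_name       = "tfstate"
    key                  = "terraform.tfstate"
  }
}
```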
Step 3: Begin creating the Build pipeline
Now that we have our state stored in Azure Blob Storage, we can begin working on the actual pipeline. This is where the automation actually happens and you’ll see how much easier it makes things.
- Go back to Azure DevOps, under the Pipelines section on the left-hand panel, select ‘Builds’, and then click the ‘New pipeline’ button.
- When the window opens, choose the Classic editor link near the bottom.
- Select the correct project, repository, and branch (dev)
- Choose to start with an Empty job
Now that you’ve got a skeleton of a build, here’s where we actually start adding the steps to create a fully functional Terraform setup.
Next to ‘Agent job 1’, click the + sign. In the search field, enter Terraform, select the entry called ‘Terraform Build & Release Tasks’ by Charles Zipp, and then select ‘Get it free’. Follow the prompts and install it on DevOps (it doesn’t cost anything). Once it’s installed, refresh the build page, click the + sign, and search for Terraform again.
For each pipeline we build, we’ll first need to install the Terraform tools, so add a ‘Terraform Installer’ task. Select the new task and make sure the version of Terraform being installed is exactly the same as the version you’re using on your local machine: edit the ‘Version’ field and replace it with the correct value.
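If you’re not sure which version you’re running locally, a quick check on your own machine will tell you:

```
# Print the locally installed Terraform version, so the pipeline's
# 'Terraform Installer' task can be pinned to the same release.
terraform version
```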
Add another task, so search for Terraform again, but this time select ‘Terraform CLI’. It will probably default to ‘terraform validate’, so select it so we can update a few things. Since the first thing we need to do is initialize the environment by downloading all of the necessary providers, change the Display Name to something more descriptive like terraform init, and change the Command dropdown to ‘init’. Leave everything else the same.
Add another Terraform CLI task, this time for the validation step. This will make sure our .tf files are valid and the syntax is correct before we try to do anything with them. Since validate is the default option for the Terraform CLI task, we’re done with this one.
Add another task, but this time, since the Terraform add-on doesn’t support the workspace command (as of writing), we need to add a shellexec task, so search for that string and add it. Change the Display Name and Code to both read terraform workspace select dev.
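In effect, this task just runs the command below on the build agent. The || terraform workspace new dev fallback is an optional extra, not part of the task described above; it only matters if the dev workspace might not exist yet against a fresh backend.

```
# Switch to the dev workspace; optionally create it first if it doesn't
# exist yet (the fallback is an assumption, not part of the original task).
terraform workspace select dev || terraform workspace new dev
```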
You guessed it, add another Terraform CLI task for the plan. Update the Display Name to read terraform plan and then select plan from the Command dropdown. Here’s where we get to define the environment, so in the ‘Command Options’ text field, enter -var-file=environments/dev.tfvars -out=TFPlan -input=False. This tells Terraform to use the variables that you’ve specified for the DEV environment, write the output to a file called TFPlan, and not to expect any input.
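For reference, environments/dev.tfvars is just an ordinary variable-definitions file from the setup in part 1. The variable names and values below are purely illustrative; yours will be whatever your configuration actually declares.

```
# environments/dev.tfvars — illustrative values only; the variable names
# here are placeholders and must match the variables your .tf files declare.
environment = "dev"
location    = "australiaeast"
vm_count    = 1
```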
Almost there.
Add another task, this time searching for copy and selecting Microsoft’s Copy Files task. For the Source Folder, enter $(Build.SourcesDirectory), and for the Target Folder, enter $(Build.ArtifactStagingDirectory).
For the final task, search for publish and select Microsoft’s Publish Build Artifacts task. The defaults here are fine.
All you have to do to finish up is click the dropdown arrow next to ‘Save and queue’ and select ‘Save’, accept the defaults, and click ‘Save’ again.
You now have a working DevOps Build Pipeline!
In the next part, we’ll take the artifacts published from this Build pipeline and use them to create a Release pipeline that will actually tell Terraform to make your changes.