How to spam your co-workers with cat facts in 5 easy steps

Step 1 – Find a cat facts API

https://catfact.ninja/

Well that was easy.

Step 2 – Build a serverless Azure Logic App using Terraform that will connect to the API and spam your co-workers with a new fact every 5 minutes

https://github.com/nexxai/cat-facts/

Ok that part was easy too, but come on, it’s gotta be at least a little difficu–
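(If you don’t feel like opening the repo, the rough shape of the Terraform is below. This is a minimal sketch with made-up resource names, not the repo’s exact code; the email-sending step gets wired up manually in Step 3, since the Office 365 connection isn’t something Terraform can easily create.)

resource "azurerm_resource_group" "catfacts" {
  name     = "catfacts-rg"
  location = "canadacentral"
}

resource "azurerm_logic_app_workflow" "catfacts" {
  name                = "catfacts"
  location            = azurerm_resource_group.catfacts.location
  resource_group_name = azurerm_resource_group.catfacts.name
}

# Fire every 5 minutes...
resource "azurerm_logic_app_trigger_recurrence" "every_5_minutes" {
  name         = "every-5-minutes"
  logic_app_id = azurerm_logic_app_workflow.catfacts.id
  frequency    = "Minute"
  interval     = 5
}

# ...and grab a fresh fact. (The Office 365 'send an email' action comes in Step 3.)
resource "azurerm_logic_app_action_http" "get_fact" {
  name         = "get-cat-fact"
  logic_app_id = azurerm_logic_app_workflow.catfacts.id
  method       = "GET"
  uri          = "https://catfact.ninja/fact"
}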

Step 3 – Create an Office 365 connection that your Logic App can use

Open the Azure Logic Apps blade

You have 60 seconds to manually add a step that connects your Office 365 account to this app. ‘Get Calendars’ requires the least configuration.

Step 4 – Wait for your co-workers’ email clients to play their New Email alert sound

Start laughing, and keep laughing every 5 minutes from now until forever, asserting your feline dominance over your team.

“But that was only 4 steps, where’s number fi

Step 5 – Have the Senior PM of Microsoft Azure Functions see your stupid app and tweet about it

Sure, no prob–wait, what?

Add your AWS API key info in a Key Vault for Terraform

EDIT: Updated on July 10, 2019; modified the second- and third-to-last paragraphs to show the correct process of retrieving the AWS_SECRET_ACCESS_KEY from the Key Vault and setting it as a protected environment variable.

Our primary cloud is Azure, which makes it very easy to build DevOps pipelines with automation scoped to a particular subscription. But what happens when we want to deploy something in AWS, given that storing keys in source control is A Very Bad Idea™?

Simple: we use Azure Key Vault.

First, we created a Key Vault specifically for this purpose called company-terraform, which will be used to store the various secrets for Terraform-based deployments. When you tie Azure DevOps to an Azure subscription via a service connection, it creates an “application” in the Azure Enterprise Applications list, so give that application Get and List permissions on this vault.

Next, we created a secret called AmazonAPISecretKey and set its value to the actual secret access key that AWS presents when you enable programmatic access for a user in the IAM console.
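For the Terraform-inclined, here’s roughly what that manual setup looks like as code. Treat it as a sketch only: the resource group, location, and variable names are made up, and in reality we clicked through the portal.

variable "devops_app_object_id" {
  description = "Object ID of the Azure DevOps application from Enterprise Applications"
}

variable "aws_secret_access_key" {
  description = "The AWS secret access key; pass it in securely, never commit it"
}

data "azurerm_client_config" "current" {}

resource "azurerm_key_vault" "terraform" {
  name                = "company-terraform"
  location            = "canadacentral"
  resource_group_name = "company-secrets" # assumes this resource group already exists
  tenant_id           = data.azurerm_client_config.current.tenant_id
  sku_name            = "standard"
}

# The Get and List permissions described above
resource "azurerm_key_vault_access_policy" "devops" {
  key_vault_id       = azurerm_key_vault.terraform.id
  tenant_id          = data.azurerm_client_config.current.tenant_id
  object_id          = var.devops_app_object_id
  secret_permissions = ["Get", "List"]
}

resource "azurerm_key_vault_secret" "aws" {
  name         = "AmazonAPISecretKey"
  key_vault_id = azurerm_key_vault.terraform.id
  value        = var.aws_secret_access_key
}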

In our Azure DevOps Terraform build and release pipelines, we then added an Azure Key Vault step, selecting the appropriate subscription and Key Vault. Once selected, we added AmazonAPISecretKey to the Secrets filter, which means the step will only ever fetch that one secret on each run; if you’ll be adding multiple secrets that will all be used in this particular pipeline, add them all to this filter list.

Finally, we can now use the string $(AmazonAPISecretKey) in any shellexec or other pipeline task to authenticate against AWS, without ever having to commit the actual key anywhere it could be viewed.

Since the Terraform AWS provider can authenticate using the AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY environment variables, we will set those up so that DevOps can use them in its various tasks.
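For what it’s worth, this means the provider block in your .tf files can stay completely credential-free. A minimal example (the region is just a placeholder):

# No credentials here at all; the AWS provider picks up AWS_ACCESS_KEY_ID and
# AWS_SECRET_ACCESS_KEY from the environment at plan/apply time.
provider "aws" {
  region = "us-east-1"
}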

First, open your Build or Release pipeline and select the Variables tab. Create a new variable called AWS_ACCESS_KEY_ID and set the value to your access key ID (usually something like AK49FKF4034F42DZV2VRMD). Then create a second variable called AWS_SECRET_ACCESS_KEY, which you can leave blank; click the padlock icon next to it to tell DevOps that its contents are secret and shouldn’t be shared.

Now create a shellexec task and add the following command to it, which will set the AWS_SECRET_ACCESS_KEY environment variable to the contents of the Key Vault entry we created earlier:

echo "##vso[task.setvariable variable=AWS_SECRET_ACCESS_KEY;]$(AmazonAPISecretKey)"

And there you have it! You can now reference your AWS accounts from within your Terraform structure without ever actually exposing your keys to prying eyes!

Using Terraform workspaces for fun and profit – Part 2

In the first part of this series, we built a Terraform system that uses a single set of files to maintain three separate environments; the next step is to automate as much of this process as possible.

To do that, we’re going to leverage Azure DevOps to create a build pipeline for each environment, which will kick off when we merge code into the respective environment’s git branch.
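As a quick refresher on the Part 1 approach: because all three environments share one set of files, resources key their names off the active workspace so they never collide. Something like this (illustrative, not the exact code from Part 1):

# terraform.workspace resolves to dev, test, or prod depending on the
# currently selected workspace
resource "azurerm_resource_group" "app" {
  name     = "myapp-${terraform.workspace}"
  location = "canadacentral"
}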

Step 1: Create a git repo to store the .tf files

First you’ll need to sign into Azure DevOps. Once signed in, you’ll want to select an existing organization (or create a new one), and then select an existing project (or create a new one).

Once you’ve got a project open, click the + symbol at the top of the left-hand panel and select ‘New repository’. Create a repository using the naming convention of your company (or, if you don’t have a particular convention, I prefer Terraform.DevOps.IT-CompanyName).

Finally, if you already have existing .tf files, create a branch called dev (git checkout -b dev), and then commit and push the files up to the repo.

Step 2: Use Azure Blob Storage as the backend provider to store your state files

Since the alternative is hosting a state file in a single location (like your laptop), which may die, be stolen, or meet some other bad end, we want to tell Terraform to keep its state files in the cloud for easy, shared access.

Open up the Azure portal and browse to the Storage Accounts section. Create a new Storage Account using the default options. Unless your Terraform environment is absolutely massive (think gigabytes of state), this account will use next to no storage, certainly less than the threshold to be charged $0.01 per month. Once the Storage Account is created, add the azurerm backend configuration to your Terraform code (a sketch follows below), and then run terraform init, which will initialize the Storage Account as the host for your state file.
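Here’s roughly what that backend block looks like. All of the names are illustrative, and note that you’ll also need a blob container inside the Storage Account (e.g. one called tfstate) to actually hold the state:

terraform {
  backend "azurerm" {
    resource_group_name  = "terraform-state"
    storage_account_name = "companyterraformstate"
    container_name       = "tfstate"
    key                  = "terraform.tfstate" # each workspace gets its own state blob keyed off this name
  }
}

Run terraform init after adding it, and Terraform will start reading and writing state from the container instead of your local disk.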

Step 3: Begin creating the Build pipeline

Now that we have our state stored in Azure Blob Storage, we can begin working on the actual pipeline. This is where the automation actually happens and you’ll see how much easier it makes things.

  1. Go back to Azure DevOps, under the Pipelines section on the left-hand panel, select ‘Builds’, and then click the ‘New pipeline’ button.
  2. When the window opens, choose the Classic editor link near the bottom.
  3. Select the correct project, repository, and branch (dev)
  4. Choose to start with an Empty job

Now that you’ve got a skeleton of a build, here’s where we actually start adding the steps to create a fully functional Terraform setup.

Next to ‘Agent job 1’, click the + sign. In the search field, enter Terraform, select the entry called ‘Terraform Build & Release Tasks’ by Charles Zipp, and then select ‘Get it free’. Follow the prompts and install it on DevOps (it doesn’t cost anything). Once it’s installed, refresh the build page, click the + sign, and search for Terraform again.

For each pipeline we build, we’ll first need to install the Terraform tools, so add a ‘Terraform Installer’ task. Select the new task and make sure the version of Terraform being installed is exactly the same as the version you’re using on your local machine; edit the ‘Version’ field and replace it with the correct value.
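A related tip: you can pin the version in code too, so that if the Installer task and your laptop ever drift apart, Terraform refuses to run instead of doing something surprising. (The version number below is only an example.)

terraform {
  required_version = "= 0.12.6" # match the version in the Terraform Installer task
}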

Add another task by searching for Terraform again, but this time select ‘Terraform CLI’. It will probably default to ‘terraform validate’, so select the task so that we can update a few things. Since the first thing we need to do is initialize the environment by downloading all of the necessary providers, change the Display Name to something more descriptive like terraform init and change the Command dropdown to ‘init’. Leave everything else the same.

Add another Terraform CLI task, this time for the validation step. This makes sure our .tf files are valid and syntactically correct before we try to do anything with them. Since validate is the default option for the Terraform CLI task, we’re done with this one.

Add another task, but this time we need a shellexec task, since the Terraform add-on doesn’t support the workspace command (as of this writing); search for that string and add it. Change the Display Name and the Code to both read terraform workspace select dev.
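One optional bit of hardening: on a brand-new backend the dev workspace won’t exist yet, so terraform workspace select dev will fail. Assuming the shellexec task runs a bourne-style shell, a common guard is:

terraform workspace select dev || terraform workspace new dev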

You guessed it, add another Terraform CLI task for the plan. Update the Display Name to read terraform plan and then select plan from the Command dropdown. Here’s where we get to define the environment, so in the ‘Command Options’ text field, enter -var-file=environments/dev.tfvars -out=TFPlan -input=False. This tells Terraform to use the variables you’ve specified for the DEV environment, write the plan to a file called TFPlan, and not expect any interactive input.
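If you’re wondering what’s inside that file, it’s just the per-environment values for the variables declared in Part 1. The names below are made up for illustration; yours will match your own variables.tf:

# environments/dev.tfvars
location       = "canadacentral"
environment    = "dev"
instance_count = 1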

Almost there.

Add another task, this time searching for copy and selecting Microsoft’s Copy Files task. For the Source Folder, enter $(Build.SourcesDirectory) and for the Target Folder, enter $(Build.ArtifactStagingDirectory)

For the final task, search for publish and select Microsoft’s Publish Build Artifacts. The defaults here are fine.

All you have to do to finish up is click the dropdown arrow next to ‘Save and queue’ and select ‘Save’, accept the defaults, and click ‘Save’ again.

You now have a working DevOps Build Pipeline!

In the next part, we’ll take the artifacts published from this Build pipeline and use them to create a Release pipeline that will actually tell Terraform to make your changes.
