key-vault Archives » nexxai.dev (reminders for my future self)

Convert a CRT SSL certificate chain to PFX format
Mon, 22 Mar 2021 21:40:11 +0000

Many SSL certificate authorities (CAs) do not issue certificates in .PFX format, which means that if you plan on installing one on something like an Azure App Service (which expects .PFX), you may encounter issues. Today, let's figure out how to convert a CRT SSL certificate chain to PFX format.

First, let’s generate a private key and certificate signing request. Run the following command, and answer the questions as accurately as possible. The private key file (domain.key) should be kept secret and protected.

openssl req \
        -newkey rsa:2048 -nodes -keyout domain.key \
        -out domain.csr

Next, take the contents of domain.csr (it is just a plaintext file with your answers and some other non-secret information base64-encoded; it can be opened in any text editor) and request your certificate through your CA. This process varies per certificate authority, and so is out of scope for this article.

[Time passes]

Now, your CA provides you with a .ZIP file with the following files.

your_domain_com.crt
AAACertificateServices.crt
DomainValidationSecureServerCA.crt
USERTrustRSAAAACA.crt

(Here, your_domain_com.crt is the actual certificate file, and the other .CRT files are the intermediate certificates that allow a browser to chain up to the root. The filenames and number of files will almost certainly differ between certificate authorities; the point is that there will be some number of .CRT files and that they are all important.)

Extract those files into the same folder as the domain.key file from earlier.

Finally, let's combine our certificate with the rest of the chain into a single .PFX file by running the following command. Specify your site's certificate with the -in parameter, and add a -certfile entry for each certificate in the chain.

openssl pkcs12 -export -out certificate.pfx \
        -inkey domain.key \
        -in your_domain_com.crt \
        -certfile AAACertificateServices.crt \
        -certfile DomainValidationSecureServerCA.crt \
        -certfile USERTrustRSAAAACA.crt

NOTE: Azure App Services and Azure Key Vaults require a password-protected .PFX file, so ensure that you enter a password when prompted. When you later upload the certificate and are asked to select the .PFX file and provide a password, this is the password it's referring to.

And you’re done! You now have a file in that folder (certificate.pfx) that you can upload/install and ensure your site is protected against MITM attacks.
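If you want to sanity-check the bundle before uploading it, openssl can dump its contents back out. Here is a self-contained sketch using a throwaway self-signed certificate as a stand-in for your CA-issued one (the demo filenames and the changeit password are illustrative, not from the post):

```shell
# Generate a throwaway key + self-signed cert (stand-in for a CA-issued one)
openssl req -newkey rsa:2048 -nodes -keyout demo.key -out demo.csr \
        -subj "/CN=example.com"
openssl x509 -req -in demo.csr -signkey demo.key -days 1 -out demo.crt

# Bundle it as a password-protected .PFX, as above
openssl pkcs12 -export -out demo.pfx -inkey demo.key -in demo.crt \
        -passout pass:changeit

# Verify the bundle opens with the password and lists its contents
openssl pkcs12 -in demo.pfx -info -noout -passin pass:changeit && echo "PFX OK"
```

If the password is wrong or the bundle is corrupt, the final command fails instead of printing "PFX OK".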

Deploying an Azure App Service from scratch, including DNS and TLS
Fri, 11 Oct 2019 17:27:51 +0000


As many of you have probably gathered, over the past few weeks, I’ve been working on building a process for deploying an Azure App Service from scratch, including DNS and TLS in a single Terraform module.

Today, I write this post with success in my heart, and at the bottom, I provide copies of the necessary files for your own usage.

One of the biggest hurdles I faced was trying to integrate Cloudflare's CDN services with Azure's Custom Domain verification. Typically, I'll rely on the options available in the GUI as the exhaustive list of "things I can do", so up until now, if we wanted to stand up a multi-region App Service, we had to do the following:

  1. Build and deploy the App Service, using the azurewebsites.net hostname for HTTPS for each region (R1 and R2)

    e.g. example-app-eastus.azurewebsites.net (R1), example-app-westus.azurewebsites.net (R2)
  2. Create the CNAME record for the service at Cloudflare pointing at R1, turning off proxying (orange cloud off)

    e.g. example-app.domain.com -> example-app-eastus.azurewebsites.net
  3. Add the Custom Domain on R1, using the CNAME verification method
  4. Once the hostname is verified, go back to Cloudflare and update the CNAME record for the service to point to R2

    e.g. example-app.domain.com -> example-app-westus.azurewebsites.net
  5. Add the Custom Domain on R2, using the CNAME verification method
  6. Once the hostname is verified, go back to Cloudflare and update the CNAME record for the service to point to the Traffic Manager, and also turn on proxying (orange cloud on)

While this eventually accomplishes the task, the failure mode it introduces is that if you ever want to add a third (or fourth, or fifth…) region, you not only have to temporarily direct all traffic to your brand-new single instance to verify the domain, you also have to turn off proxying, exposing the fact that you are using Azure (bad OPSEC).

After doing some digging, however, I came across a Microsoft document explaining that there is a way to add a TXT record to verify ownership of the domain, without any messing around with the original record you're dealing with.

This is great because we can just add new awverify records for each region and Azure will trust that we own them. Terraform introduces a new wrinkle, though: it creates the record at Cloudflare so fast that Cloudflare's infrastructure often doesn't have time to replicate the new entry across their fleet before the verification is attempted, which means the lookup fails and Terraform dies.

To get around this, we added a null_resource that simply sleeps for 30 seconds, giving the record time to propagate through Cloudflare's network before the lookup is attempted.
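A minimal sketch of that workaround, in the 0.12-era syntax used elsewhere on this site (the resource names and the TXT record name/value are illustrative; Azure's Custom Domain blade tells you the exact record it expects):

```hcl
# TXT record Azure checks to verify domain ownership
resource "cloudflare_record" "awverify" {
  zone_id = var.cloudflare_zone_id
  name    = "awverify.example-app"
  type    = "TXT"
  value   = var.azure_domain_verification_id
}

# Give Cloudflare's fleet ~30 seconds to replicate the record before
# Azure attempts the DNS lookup.
resource "null_resource" "wait_for_dns" {
  depends_on = [cloudflare_record.awverify]

  provisioner "local-exec" {
    command = "sleep 30"
  }
}
```

Anything that performs the verification (e.g. the custom hostname binding) then takes a depends_on pointing at null_resource.wait_for_dns.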

I’ve put together a copy of our Terraform modules for your perusal and usage:

Using this module will allow you to easily deploy all of your microservices in a highly available, multi-region configuration.

Using a certificate stored in Key Vault in an Azure App Service
Fri, 04 Oct 2019 21:54:01 +0000


For the last two days, I’ve been trying to deploy some new microservices using a certificate stored in Key Vault in an Azure App Service. By now, you’ve probably figured out that we love them around here. I’ve also been slamming my head against the wall because of some not-well-documented functionality about granting permissions to the Key Vault.

As a quick primer, here’s the basics of what I was trying to do:

resource "azurerm_app_service" "centralus-app-service" {
  name                = "${var.service-name}-centralus-app-service-${var.environment_name}"
  location            = "${azurerm_resource_group.centralus-rg.location}"
  resource_group_name = "${azurerm_resource_group.centralus-rg.name}"
  app_service_plan_id = "${azurerm_app_service_plan.centralus-app-service-plan.id}"

  identity {
    type = "SystemAssigned"
  }
}

data "azurerm_key_vault" "cert" {
  name                = "${var.key-vault-name}"
  resource_group_name = "${var.key-vault-rg}"
}

resource "azurerm_key_vault_access_policy" "centralus" {
  key_vault_id = "${data.azurerm_key_vault.cert.id}"
  tenant_id    = "${azurerm_app_service.centralus-app-service.identity.0.tenant_id}"
  object_id    = "${azurerm_app_service.centralus-app-service.identity.0.principal_id}"

  secret_permissions = [
    "get"
  ]

  certificate_permissions = [
    "get"
  ]
}

resource "azurerm_app_service_certificate" "centralus" {
  name                = "${local.full_service_name}-cert"
  resource_group_name = "${azurerm_resource_group.centralus-rg.name}"
  location            = "${azurerm_resource_group.centralus-rg.location}"
  key_vault_secret_id = "${var.key-vault-secret-id}"
  depends_on          = [azurerm_key_vault_access_policy.centralus]
}

and these are the relevant values I was passing into the module:

  key-vault-secret-id       = "https://example-keyvault.vault.azure.net/secrets/cert/0d599f0ec05c3bda8c3b8a68c32a1b47"
  key-vault-rg              = "example-keyvault"
  key-vault-name            = "example-keyvault"

But no matter what I did, I kept bumping up against this error:

Error: Error creating/updating App Service Certificate "example-app-dev-cert" (Resource Group "example-app-centralus-rg-dev"): web.CertificatesClient#CreateOrUpdate: Failure responding to request: StatusCode=400 -- Original Error: autorest/azure: Service returned an error. Status=400 Code="BadRequest" Message="The service does not have access to '/subscriptions/[SUBSCRIPTIONID]/resourcegroups/example-keyvault/providers/microsoft.keyvault/vaults/example-keyvault' Key Vault. Please make sure that you have granted necessary permissions to the service to perform the request operation." Details=[{"Message":"The service does not have access to '/subscriptions/[SUBSCRIPTIONID]/resourcegroups/example-keyvault/providers/microsoft.keyvault/vaults/example-keyvault' Key Vault. Please make sure that you have granted necessary permissions to the service to perform the request operation."},{"Code":"BadRequest"},{"ErrorEntity":{"Code":"BadRequest","ExtendedCode":"59716","Message":"The service does not have access to '/subscriptions/[SUBSCRIPTIONID]/resourcegroups/example-keyvault/providers/microsoft.keyvault/vaults/example-keyvault' Key Vault. Please make sure that you have granted necessary permissions to the service to perform the request operation.","MessageTemplate":"The service does not have access to '{0}' Key Vault. Please make sure that you have granted necessary permissions to the service to perform the request operation.","Parameters":["/subscriptions/[SUBSCRIPTIONID]/resourcegroups/example-keyvault/providers/microsoft.keyvault/vaults/example-keyvault"]}}]

I checked and re-checked and triple-checked and had colleagues check, but no matter what I did, it kept puking with this permissions issue. I confirmed that the App Service’s identity was being provided and saved, but nothing seemed to work.

Then I found this blog post from 2016 talking about a magic Service Principal (or more specifically, a Resource Principal) that requires access to the Key Vault too. All I did was add the following resource with the magic SP, and everything worked perfectly.

resource "azurerm_key_vault_access_policy" "azure-app-service" {
  key_vault_id = "${data.azurerm_key_vault.cert.id}"
  tenant_id    = "${azurerm_app_service.centralus-app-service.identity.0.tenant_id}"

  # This object ID is the Microsoft Azure Web App Service magic SP,
  # as per https://azure.github.io/AppService/2016/05/24/Deploying-Azure-Web-App-Certificate-through-Key-Vault.html
  object_id = "abfa0a7c-a6b6-4736-8310-5855508787cd"

  secret_permissions = [
    "get"
  ]

  certificate_permissions = [
    "get"
  ]
}

It’s frustrating that Microsoft hasn’t documented this piece (at least officially), but hopefully with this knowledge, you’ll be able to automate using a certificate stored in Key Vault in your next Azure App Service.

Store a private key in Azure Key Vault for use in a Logic App
Thu, 12 Sep 2019 16:10:34 +0000


Today, I found myself in need of an automated SFTP connection that would reach out to one of our partners, download a file, and then dump it into a Data Lake for further processing. This meant that I would need to store a private key in Azure Key Vault for use in a Logic App. While this was mostly a straightforward process, there was a small hiccup that we encountered and wanted to pass along.

First, we went ahead and generated a public/private key pair using:

ssh-keygen -t rsa -b 4096

where rsa is the algorithm and 4096 is the length of the key in bits. We avoided the ed25519 and ecdsa algorithms as our partner does not support elliptic-curve cryptography. As this command was run on a Mac laptop which already has its own ~/.ssh/id_rsa[.pub] key pair, we chose a new filename and location, /tmp/sftp, to temporarily store this new pair.
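For reference, the same keygen can be done non-interactively with the custom path baked in (a sketch; the empty -N means no passphrase, so add one if your workflow requires it):

```shell
# Start clean so ssh-keygen doesn't prompt about overwriting
rm -f /tmp/sftp /tmp/sftp.pub

# Generate the 4096-bit RSA pair at /tmp/sftp without any prompts
ssh-keygen -t rsa -b 4096 -f /tmp/sftp -N "" -q

# The public half is what you hand to the partner
cut -d' ' -f1 /tmp/sftp.pub
```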

The problem arose when we tried to insert the private key data into Key Vault as a secret: the Azure portal does not support multi-line secret entry, resulting in a non-standard and ultimately broken key entry.

The solution was to use the Azure CLI to upload the contents of the private key by doing:

az keyvault secret set --vault-name sftp-keyvault -n private-key -f '/tmp/sftp'

This uploaded the file correctly to the secret titled private-key, which means that we can now add a Key Vault action in our Logic App to pull the secret, without ever leaving the key in plain view, and then use it as the data source for the private key field in the SFTP - Copy File action.

As an aside, we also created a new secret called public-key and uploaded a copy of sftp.pub just so that 6 months from now if we need to recall a copy of it to send to another partner, it’s there for us to grab.

How to import a publicly-issued certificate into Azure Key Vault
Thu, 05 Sep 2019 18:37:46 +0000


Today, after spending several hours swearing and researching how to import a publicly-issued certificate into Azure Key Vault, I thought I’d share the entire process of how we did it from start to finish so that you can save yourself a bunch of time and get back to working on fun stuff, like spamming your co-workers with Cat Facts. We learned a bunch about the different encoding formats of certificates and some of their restrictions, both within Azure Key Vault as well as with the certificate types themselves. Let’s get started!

Initially, we created an elliptic curve-derived (EC) private key (using elliptic curve prime256v1), and a CSR by doing:

openssl ecparam -out privatekey.key -name prime256v1 -genkey
openssl req -new -key privatekey.key -out request.csr -sha256

making sure to not include an email address or password. I am not actually clear on what the technical reasoning behind this is, but I saw it noted on several sites.

We submitted the CSR to our certificate authority (CA) and shortly thereafter got back a signed PEM file.

We next needed to create a single PFX/PKCS12-formatted, password-protected certificate, so we grabbed our signed certificate (ServerCertificate.crt) and our CA’s intermediate certificate chain (Chain.crt) and then did:

openssl pkcs12 -export -inkey privatekey.key -in ServerCertificate.crt -certfile Chain.crt -out Certificate.pfx

But when we went to import it into the Key Vault with the correct password, it threw a general “We don’t like this certificate” error. The first thing we did was check out the provided link and saw that we could import PEM-formatted certificates directly. I didn’t remember this being the case in the past, so maybe this is a new feature?

No problem. We concatenated the certificate and key files into a single text file (cat ServerCertificate.crt privatekey.key > concat.crt), creating a file called concat.crt consisting of the

-----BEGIN CERTIFICATE-----
-----END CERTIFICATE-----

section from the ServerCertificate.crt file as well as the

-----BEGIN EC PARAMETERS-----
-----END EC PARAMETERS-----

and

-----BEGIN EC PRIVATE KEY-----
-----END EC PRIVATE KEY-----

sections from the privatekey.key file.

We went to upload concat.crt to the Key Vault and again were given the same error as before. However, after re-reading the document, we were disappointed to find this quote:

We currently don’t support EC keys in PEM format.

Section: Formats of Import we support

It surprises me that Microsoft does not support elliptic curve-based keys in PEM format. I am not aware of any technical limitation on the part of the certificate itself, so this seems very much like a Microsoft-specific thing; if anyone is able to provide insight into this, I'd love to hear it.

OK, we'll generate a 2048-bit RSA key and CSR, and then try again.

openssl genrsa -des3 -out rsaprivate.key 2048
openssl req -new -key rsaprivate.key -out RSA.csr

We uploaded the CSR to the CA as a re-key request, and waited.

When the certificate was finally issued (as cert.pem), we could take the final steps to prepare it for upload to the Key Vault. We concatenated the key and certificate together (cat rsaprivate.key cert.pem > rsacert.crt) and went to upload it to the Key Vault.

And yet again, it failed. After a bunch of research on security blogs and StackOverflow, it turned out that the default output format of the private key is PKCS#1, while Key Vault expects PKCS#8. So: time to convert it.

openssl pkcs8 -topk8 -inform PEM -outform PEM -nocrypt -in rsaprivate.key -out rsaprivate8.key
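A quick way to confirm the conversion worked (not in the original post; filenames as above, and genrsa run without the -des3 passphrase for brevity) is to look at the PEM header, since a PKCS#8 key always begins with BEGIN PRIVATE KEY rather than BEGIN RSA PRIVATE KEY:

```shell
# Re-create the key and convert it to PKCS#8
openssl genrsa -out rsaprivate.key 2048
openssl pkcs8 -topk8 -inform PEM -outform PEM -nocrypt \
        -in rsaprivate.key -out rsaprivate8.key

# A PKCS#8 key always starts with this header
head -n 1 rsaprivate8.key   # -----BEGIN PRIVATE KEY-----
```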

Finally, we re-concatenated the rsaprivate8.key and cert.pem files into a single rsacert8.crt file (cat rsaprivate8.key cert.pem > rsacert8.crt), which we could import into Key Vault.

It worked!

We now have our SSL certificate in our HSM-backed Azure Key Vault that we can apply to our various web properties without having to store the actual certificate files anywhere, which makes our auditors very happy.

Terraform: “Error: insufficient items for attribute "sku"; must have at least 1”
Tue, 06 Aug 2019 16:30:42 +0000


Last week, we were attempting to deploy a new Terraform-owned resource, but every time we ran terraform plan or terraform apply, we got the error Error: insufficient items for attribute "sku"; must have at least 1. We keep our Terraform code in an Azure DevOps project, with approvals required for any new commits, even into our dev environment, so we were flummoxed.

Our first thought was the provider: we had upgraded the Terraform azurerm provider from 1.28.0 to 1.32.0, and we knew for a fact that the azurerm_key_vault resource had changed from accepting a sku {} block to simply requiring a sku_name property. We tried every combination of having either, both, and neither of them defined, and we still received the error. We even tried downgrading back to 1.28.0 as a fallback, but it made no difference. At this point we were relatively confident that it wasn't the provider.
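For context, the provider change we suspected looks roughly like this (abridged azurerm_key_vault definitions; all other arguments omitted):

```hcl
# azurerm 1.28-era syntax
resource "azurerm_key_vault" "example" {
  # ...other arguments...
  sku {
    name = "standard"
  }
}

# azurerm 1.32-era syntax
resource "azurerm_key_vault" "example" {
  # ...other arguments...
  sku_name = "standard"
}
```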

The next thing we looked for was any other resource that had a sku {} block defined. This included our azurerm_app_service_plans, our azurerm_virtual_machines, and our azurerm_vpn_gateway. We searched for and commented out all of the respective declarations in our .tf files, but still we received the error.

Now we were starting to get nervous. Nothing we tried would solve the problem, and we were starting to get a backlog of requests for new resources that we couldn’t deploy because no matter what we did, whether adding or removing potentially broken code, we couldn’t deploy any new changes. To say the tension on our team was palpable would be the understatement of the year.

At this point we needed to take a step back and analyze the problem logically, so we all took a break from Terraform to clear our minds and de-stress a bit. We started to suspect something in the state file was causing the problem, but we weren’t really sure what. We decided to take the sledgehammer approach and using terraform state rm, we removed every instance of those commented out resources we found above.

This worked. Now we could run terraform plan and terraform apply without issue, but we still weren't sure why. That didn't bode well should the problem recur; we couldn't just keep taking a sledgehammer to the environment, as it's simply too disruptive. We needed to figure out the root cause.

We opened an issue on the provider’s GitHub page for further investigation, and after some digging by other community members and Terraform employees themselves, it seems that Microsoft’s API returns a different response for App Service Plans than any other resource when it is found to be missing. An assumption was being made that it would be the same for all resources, but it turned out that this was a bad assumption to make.

This turned out to be the key for us. Someone had deleted several App Service Plans from the Azure portal (thinking they were not being used), so our assumption is that when the provider checks the status of a missing App Service Plan, the broken response makes Terraform think it actually exists, even though there is no sku {} data in it, causing Terraform to think that that specific data is missing.

Knowing the core problem, the error message Error: insufficient items for attribute "sku"; must have at least 1 kind of makes sense now: the sku attribute is missing at least 1 item, it just doesn’t make clear that the “insufficient items” are on the Azure side, not the Terraform / .tf side.

They’ve added a workaround in the provider until Microsoft updates the API to respond like all of the other resources.

Have you seen this error before? What did you do to solve it?

Add your AWS API key info in a Key Vault for Terraform
Thu, 20 Jun 2019 21:53:47 +0000


EDIT: Updated on July 10, 2019; modified second- and third-last paragraphs to show the correct process of retrieving the AWS_SECRET_ACCESS_KEY from the Key Vault and setting it as a protected environment variable

Our primary cloud is Azure, which makes building DevOps pipelines with automation scoped to a particular subscription very easy. But what happens when we want to deploy something in AWS, given that storing keys in source control is A Very Bad Idea™?

Simple, we use Azure Key Vault.

First, we created a Key Vault specifically for this purpose called company-terraform, which will be used to store the various secrets for Terraform-based deployments. When you tie Azure DevOps to an Azure subscription, it creates an “application” in the Azure Enterprise Applications list, so give that application Get and List permissions on this vault.

Next, we created a secret called AmazonAPISecretKey and set its content to the actual secret access key you are presented with when you enable programmatic access to an account in the AWS IAM console.

In our Azure DevOps Terraform build and release pipelines, we then added an Azure Key Vault step, selecting the appropriate subscription and Key Vault. Once selected, we added a Secrets filter of AmazonAPISecretKey, meaning it will only ever fetch that secret on a run; if you add multiple secrets that will all be used in this particular pipeline, add them to this filter list as well.

Finally, we can now use the string $(AmazonAPISecretKey) in any shellexec or other pipeline task to authenticate against AWS, while never having to commit the actual key to a viewable source.

Since one of the methods the Terraform AWS provider can use to authenticate is by using the AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY environment variables, we will set them up so that DevOps can use them in its various tasks.
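This means the provider block itself can stay completely free of credentials (a sketch; the region is illustrative):

```hcl
provider "aws" {
  region = "us-east-1"

  # No access_key/secret_key here: the provider picks up
  # AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY from the
  # pipeline's environment variables.
}
```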

First, open your Build or Release pipeline and select the Variables tab. Create a new variable called AWS_ACCESS_KEY_ID and set the value to your access key ID (which looks something like AKIAIOSFODNN7EXAMPLE). Then create a second variable called AWS_SECRET_ACCESS_KEY, which you can leave blank, but click the padlock icon next to it to tell DevOps that its contents are secret and shouldn't be shared.

Now create a shellexec task and add the following command to it, which will set the AWS_SECRET_ACCESS_KEY environment variable to the contents of the Key Vault entry we created earlier:

echo "##vso[task.setvariable variable=AWS_SECRET_ACCESS_KEY;]$(AmazonAPISecretKey)"

And there you have it! You can now reference your AWS accounts from within your Terraform structure without ever actually exposing your keys to prying eyes!

Azure password storage in a pinch
Tue, 18 Jun 2019 14:43:38 +0000


Yesterday, it was discovered that our developers had built a Docker container that was encrypted with a password that resided in a single location: the Azure App Service’s Application Settings (aka: an environment variable). Of course we discovered this when they pushed out a deployment of the container, something broke during the deployment, the Application Setting with the password disappeared, and no one knew what the password was.

It took nearly 30 minutes to rebuild the container with a new encryption password, which is entirely too long for a core piece of our company's booking system to be unavailable, so until we have a proper password management solution in place, we wanted to stand up something in the meantime.

Azure Key Vault to the rescue!

Azure Key Vault is a tool for securely storing and accessing secrets. A secret is anything that you want to tightly control access to, such as API keys, passwords, or certificates. A vault is logical group of secrets.

https://docs.microsoft.com/en-ca/azure/key-vault/key-vault-whatis

We created a new Key Vault, and set up secrets for each of our App Services because as it turns out, several – although not all – of our microservices have similar encryption setups. We then dumped in the encryption passwords for each service so that if in the future the password disappears, it’s as simple as grabbing it out of the Key Vault and re-creating it as the Application Setting.

In the future, the developers will update their applications to reference the Key Vault directly, but this stopgap will do for the time being, until they can change how their apps are architected.

Using a Client Certificate to authenticate via an Azure Logic App
Mon, 17 Jun 2019 14:00:19 +0000


Today we faced a problem where we needed to interface with a vendor’s SOAP API (*screams in old-person-ese*) which they protect using an internal PKI. They had provided us a certificate to use, but we found that actually using it in the Logic App we built was going to be a little more complicated than we originally expected.

Here’s what we did.

First, the vendor provided us the certificate in .pem format, while Logic Apps expect to use .pfx format in the HTTP actions, so we needed to convert it. Luckily, openssl makes this relatively easy:

openssl pkcs12 -export -out certificate.pfx -in certificate.pem -inkey key.pem -passin pass:examplepassword -passout pass:examplepassword

Next, we need to take the .pfx-formatted certificate and base64 encode it:

cat certificate.pfx | base64

After removing any line breaks to make the result one continuous line of text, we now have a certificate we can pass to the vendor, but we don’t want to store that in the Logic App. It’s not secure and we want secure. What do we do now?
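The two steps above can be strung together; this sketch substitutes a throwaway self-signed pair for the vendor-issued one, and uses tr to strip the line breaks portably (GNU base64 can also do this with -w 0, but tr works on macOS too):

```shell
# Throwaway cert + key standing in for the vendor-issued pair
openssl req -x509 -newkey rsa:2048 -nodes -keyout key.pem -out certificate.pem \
        -days 1 -subj "/CN=demo"

# Convert to .pfx as above
openssl pkcs12 -export -out certificate.pfx -in certificate.pem -inkey key.pem \
        -passout pass:examplepassword

# Base64-encode as one continuous line
base64 < certificate.pfx | tr -d '\n' > certificate.b64
```

certificate.b64 now holds the single-line payload to drop into the Key Vault secret.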

Within Azure, we create a Key Vault, and within that Key Vault we create a secret within which we place the base64-encoded, pfx-converted certificate.

Now we have everything we need to put this all together.

In the Logic App, we create an action that reaches out to the Key Vault, requests the secret, and sets the result as a variable called PFXKey. We then create an HTTP action that uses “Client Certificate” as the authentication method, with the contents of the PFXKey variable as the certificate value. We set the password field to the certificate's password (examplepassword in the example above), and we can now use a POST request to send data to the vendor using Client Certificate authentication, all while keeping the certificate contents and its password secure.
