Store a private key in Azure Key Vault for use in a Logic App

Today, I found myself in need of an automated SFTP connection that would reach out to one of our partners, download a file, and then dump it into a Data Lake for further processing. This meant that I would need to store a private key in Azure Key Vault for use in a Logic App. While this was mostly a straightforward process, we hit a small hiccup along the way that I wanted to pass along.

First, we went ahead and generated a public/private key pair using:

ssh-keygen -t rsa -b 4096

where rsa is the algorithm and 4096 is the length of the key in bits. We avoided the ed25519 and ecdsa algorithms as our partner does not support elliptic-curve cryptography. As this command was run on a Mac laptop which already has its own ~/.ssh/id_rsa[.pub] key pair, we chose a new filename and location, /tmp/sftp, to temporarily store this new pair.
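
If you want to skip the interactive filename prompt, the -f flag sets the output path in one shot; something like the following should do it (the -C comment here is just an example label of our choosing):

ssh-keygen -t rsa -b 4096 -f /tmp/sftp -C "partner-sftp"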

The problem arose when we tried to insert the private key data into Key Vault as a secret: the Azure portal does not support multi-line secret entry, resulting in a non-standard and ultimately broken key entry.

The solution was to use the Azure CLI to upload the contents of the private key by doing:

az keyvault secret set --vault-name sftp-keyvault -n private-key -f '/tmp/sftp'

This uploaded the file correctly to the secret named private-key, which means that we can now add a Key Vault action in our Logic App to pull the secret, without having to leave the key in plain view, and then use it as the data source for the private key field in the SFTP - Copy file action.
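
If you want to confirm the secret made it in intact, you can read it back with something like the following (note that this prints the private key to your terminal, so do it somewhere appropriate):

az keyvault secret show --vault-name sftp-keyvault -n private-key --query value -o tsv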

As an aside, we also created a new secret called public-key and uploaded a copy of sftp.pub just so that 6 months from now if we need to recall a copy of it to send to another partner, it’s there for us to grab.
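
That upload followed the same pattern as the private key, assuming the public half is still sitting at /tmp/sftp.pub:

az keyvault secret set --vault-name sftp-keyvault -n public-key -f '/tmp/sftp.pub'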

How to import a publicly-issued certificate into Azure Key Vault

Today, after spending several hours swearing and researching how to import a publicly-issued certificate into Azure Key Vault, I thought I’d share the entire process of how we did it from start to finish so that you can save yourself a bunch of time and get back to working on fun stuff, like spamming your co-workers with Cat Facts. We learned a bunch about the different encoding formats of certificates and some of their restrictions, both within Azure Key Vault as well as with the certificate types themselves. Let’s get started!

Initially, we created an elliptic curve-derived (EC) private key (using elliptic curve prime256v1), and a CSR by doing:

openssl ecparam -out privatekey.key -name prime256v1 -genkey
openssl req -new -key privatekey.key -out request.csr -sha256

making sure not to include an email address or challenge password. I’m not actually clear on the technical reasoning behind this, but I saw it noted on several sites.
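
If you want to double-check what actually ended up in the CSR before sending it off, openssl can decode it for you:

openssl req -in request.csr -noout -text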

We submitted the CSR to our certificate authority (CA) and shortly thereafter got back a signed PEM file.

We next needed to create a single PFX/PKCS12-formatted, password-protected certificate, so we grabbed our signed certificate (ServerCertificate.crt) and our CA’s intermediate certificate chain (Chain.crt) and then did:

openssl pkcs12 -export -inkey privatekey.key -in ServerCertificate.crt -certfile Chain.crt -out Certificate.pfx
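
As a quick sanity check, you can dump the contents of the bundle to make sure the certificate and the chain both made it in (it will prompt for the export password):

openssl pkcs12 -info -in Certificate.pfx -noout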

But when we went to import it into the Key Vault with the correct password, it threw a general “We don’t like this certificate” error. The first thing we did was check out the provided link and saw that we could import PEM-formatted certificates directly. I didn’t remember this being the case in the past, so maybe this is a new feature?

No problem. We concatenated the certificate and key files into a single, large text file (cat ServerCertificate.crt >> concat.crt ; cat privatekey.key >> concat.crt), which creates a file called concat.crt consisting of the

-----BEGIN CERTIFICATE-----
-----END CERTIFICATE-----

section from the ServerCertificate.crt file as well as the

-----BEGIN EC PARAMETERS-----
-----END EC PARAMETERS-----

and

-----BEGIN EC PRIVATE KEY-----
-----END EC PRIVATE KEY-----

sections from the privatekey.key file.

We went to upload concat.crt to the Key Vault and again were given the same error as before. However, after re-reading the document, we were disappointed to see this quote:

We currently don’t support EC keys in PEM format.

Section: Formats of Import we support

It surprises me that Microsoft does not support elliptic curve-based keys in PEM format. I am not aware of any technical limitation on the certificate side, so this seems very much like a Microsoft-specific thing; however, if anyone is able to provide insight into this, I’d love to hear it.

OK, we’ll generate a 2048-bit RSA-derived key and CSR, and then try again.

openssl genrsa -des3 -out rsaprivate.key 2048
openssl req -new -key rsaprivate.key -out RSA.csr

We uploaded the CSR to the CA as a re-key request, and waited.

When the certificate was finally issued (as cert.pem), we could take the final steps to prepare it for upload to the Key Vault. We concatenated the key and certificate together (cat rsaprivate.key >> rsacert.crt ; cat cert.pem >> rsacert.crt) and went to upload it to the Key Vault.

And yet again, it failed. After a bunch of research on security blogs and StackOverflow, it turns out that the default output format of the private key is PKCS1, and Key Vault expects it to be in PKCS8 format. So now it was time to convert it.

openssl pkcs8 -topk8 -inform PEM -outform PEM -nocrypt -in rsaprivate.key -out rsaprivate8.key
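
A quick way to confirm the conversion worked is to look at the first line of the output: a PKCS1 key starts with -----BEGIN RSA PRIVATE KEY-----, while an unencrypted PKCS8 key starts with -----BEGIN PRIVATE KEY-----:

head -n 1 rsaprivate8.key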

Finally, we re-concatenated the rsaprivate8.key and cert.pem files into a single rsacert8.crt file (cat rsaprivate8.key >> rsacert8.crt ; cat cert.pem >> rsacert8.crt), which we could import into Key Vault.

It worked!

We now have our SSL certificate in our HSM-backed Azure Key Vault that we can apply to our various web properties without having to store the actual certificate files anywhere, which makes our auditors very happy.
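
As an aside, the same import can also be scripted through the Azure CLI rather than the portal; something like the following should do it (the vault and certificate names here are just placeholders):

az keyvault certificate import --vault-name our-keyvault -n web-cert -f rsacert8.crt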

Using a Client Certificate to authenticate via an Azure Logic App

Today we faced a problem where we needed to interface with a vendor’s SOAP API (*screams in old-person-ese*) which they protect using an internal PKI. They had provided us a certificate to use, but we found that actually using it in the Logic App we built was going to be a little more complicated than we originally expected.

Here’s what we did.

First, the vendor provided us the certificate in .pem format, while Logic Apps expect to use .pfx format in the HTTP actions, so we needed to convert it. Luckily, openssl makes this relatively easy:

openssl pkcs12 -export -out certificate.pfx -in certificate.pem -inkey key.pem -passin pass:examplepassword -passout pass:examplepassword

Next, we need to take the .pfx-formatted certificate and base64 encode it:

cat certificate.pfx | base64

After removing any line breaks so that the result is one continuous line of text, we now have a certificate we can present to the vendor, but we don’t want to store it directly in the Logic App. That’s not secure, and we want secure. What do we do now?
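
Depending on your platform, you may be able to skip the manual cleanup: GNU base64 (most Linux distributions) accepts -w 0 to disable line wrapping, while the macOS base64 doesn’t wrap by default:

base64 -w 0 certificate.pfx > certificate.b64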

Within Azure, we create a Key Vault, and within that Key Vault we create a secret within which we place the base64-encoded, pfx-converted certificate.
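
If you’d rather script that step, the same Azure CLI approach from the earlier posts works here too (the vault and secret names are placeholders, and certificate.b64 is the file holding the single-line base64 string):

az keyvault secret set --vault-name vendor-keyvault -n PFXKey -f certificate.b64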

Now we have everything we need to put this all together.

In the Logic App we create an action that reaches out to the Key Vault we created, requests the secret, and sets the result as a variable called PFXKey. We then create an HTTP action that uses “Client Certificate” as the authentication method and the value of the PFXKey variable as the Pfx value. We set the password to the password of the certificate (examplepassword in the example above), and we can now use a POST request to send the data to the vendor, using Client Certificate authentication, all while keeping the certificate contents and its password secure.