Logic Apps Standard: Managed connections in Visual Studio Code

I had a talk at Integrate 2022 in London about the lifecycle of Logic Apps Standard, and I thought it would be a good idea to write down some key takeaways as blog posts. Today (June 2022), handling managed connections while developing Logic Apps Standard in Visual Studio Code is not straightforward. The Visual Studio Code extension for Logic Apps Standard expects you to create the connections from within the editor. This is not ideal, as more and more we strive towards creating our Azure resources with code (ARM|Bicep|Terraform); additionally, you cannot use existing connections. Another problem is that when the connection is created, a token (connectionKey) is added to the local.settings.json file and is valid for only 7 days. If you want to work with a Logic App for longer than 7 days, or have a peer continue the work, the connections need to be recreated. The main idea to solve these problems is to create managed connections with Bicep and save the connectionKeys in a Key Vault. When you have the connectionKeys saved in Key Vault, your app settings can reference the secrets there. The image below shows the steps and references.


Saving the connection key

I prefer to create my resources with Bicep; that said, what I describe here can also be achieved with ARM and most likely Terraform (not tested). To create a connection resource you use the Microsoft.Web/connections resource type. I will not dive into creating the connections here; the documentation for that can be found here.

Note: I have written a script to help you generate Bicep for the connections. It's not bulletproof, but it saves time getting the main parts in place. You can find the script here.

One of the things the script does is add an additional section to the generated Bicep that saves the connectionKey to Key Vault if it is a lab or dev environment. Below you can see the code that gets and saves the connectionKey. Each time you deploy, a new version of the secret is created with a new connectionKey, valid for the number of days you set in the validityTimeSpan variable.

// Handle connectionKey
param baseTime string = utcNow('u')

var validityTimeSpan = {
  validityTimeSpan: '30'
}

var validTo = dateTimeAdd(baseTime, 'P${validityTimeSpan.validityTimeSpan}D')

var key = environment == 'lab' || environment == 'dev' ? connection_resource.listConnectionKeys('2018-07-01-preview', validityTimeSpan).connectionKey : 'Skipped'

resource kv 'Microsoft.KeyVault/vaults@2021-11-01-preview' existing = {
  name: kv_name
}

resource kvConnectionKey 'Microsoft.KeyVault/vaults/secrets@2021-11-01-preview' = if (environment == 'lab' || environment == 'dev') {
  parent: kv
  name: '${DisplayName}-connectionKey'
  properties: {
    value: key
    attributes: {
      exp: dateTimeToEpoch(validTo)
    }
  }
}
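For clarity, the expiry arithmetic above (dateTimeAdd with a `P30D` ISO 8601 duration, then dateTimeToEpoch) can be sketched in Python. This is just an illustration of what the Bicep functions compute, with a fixed stand-in value for utcNow('u'):

```python
from datetime import datetime, timedelta, timezone

# Stand-in for utcNow('u') so the example is deterministic
base_time = datetime(2022, 6, 1, 12, 0, 0, tzinfo=timezone.utc)
validity_days = 30  # corresponds to the validityTimeSpan value '30'

# dateTimeAdd(baseTime, 'P30D')
valid_to = base_time + timedelta(days=validity_days)

# dateTimeToEpoch(validTo) - seconds since the Unix epoch, used as the secret's exp attribute
exp = int(valid_to.timestamp())

print(valid_to.isoformat())  # 2022-07-01T12:00:00+00:00
print(exp)
```

Key Vault uses the epoch value as the secret's expiry attribute, so an expired connectionKey is visibly flagged in the portal.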

Using existing connections in Visual Studio Code

To use existing connections in Visual Studio Code you must write the managedApiConnections element in the connections.json file. As you can see in the example below, the parameter element has a reference to an application setting.

  "managedApiConnections": {
    "bingmaps": {
      "api": {
        "id": "/subscriptions/---your subscription---/providers/Microsoft.Web/locations/westeurope/managedApis/bingmaps"
      "authentication": {
        "type": "Raw",
        "scheme": "Key",
        "parameter": "@appsetting('bingmaps-connectionKey')"
      "connection": {
        "id": "/subscriptions/---your subscription---/resourcegroups/---your resource group----/providers/microsoft.web/connections/bingmaps"
      "connectionRuntimeUrl": "https://3b829fe9f0975a92.14.common.logic-westeurope.azure-apihub.net/apim/bingmaps/788bd6f6ab024898a829cb4e9b463d1d"

Locally, the application setting lives in your local.settings.json file, which should not be saved to source control as it will contain secrets. In the example below you see the bingmaps-connectionKey element referring to the connectionKey we saved in Key Vault.

  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "WORKFLOWS_TENANT_ID": "---your tenant id---",
    "WORKFLOWS_SUBSCRIPTION_ID": "---your subscription---",
    "WORKFLOWS_RESOURCE_GROUP_NAME": "---your resource group---",
    "WORKFLOWS_LOCATION_NAME": "---your region---",
    "WORKFLOWS_MANAGEMENT_BASE_URI": "https://management.azure.com/",

    "bingmaps-connectionKey": "@Microsoft.KeyVault(VaultName=---your key vault---;SecretName=bingmaps-connectionKey)"

With these steps taken you should be able to use the existing connection.
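The same pattern applies in a deployed environment: the Logic App's app setting can hold a Key Vault reference instead of the raw key. A minimal Bicep sketch of that idea (the site and vault names are placeholders I made up, not from the deployment above; note that deploying the appsettings config resource replaces all existing settings, so in practice you would include your full settings list):

```bicep
resource logicApp 'Microsoft.Web/sites@2021-03-01' existing = {
  name: 'my-logicapp-standard' // hypothetical Logic App Standard site name
}

resource appSettings 'Microsoft.Web/sites/config@2021-03-01' = {
  parent: logicApp
  name: 'appsettings'
  properties: {
    // Same Key Vault reference shape as in local.settings.json
    'bingmaps-connectionKey': '@Microsoft.KeyVault(VaultName=my-keyvault;SecretName=bingmaps-connectionKey)'
  }
}
```

The Logic App's managed identity needs get permission on the vault's secrets for the reference to resolve.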

Helper script

I have written a script to extract all connections in a resource group; you can find it here. Get-UpdatedManagedConnectionFiles.ps1 generates files containing connection information, based on the connections in a provided resource group, for use in Visual Studio Code.

It assumes you have stored the connectionKeys in Key Vault.

The script calls Generate-ConnectionsRaw.ps1 and Generate-Connections.ps1 and saves the files in a folder you provide. See the table below for details on the saved files.

| File | Description |
| --- | --- |
| .connections.az.json | Shows how the managed connections should look in the deployment. |
| .connections.code.json | Shows how the managed connections should look in Visual Studio Code. Copy the contents of the managedApiConnections element to your connections.json. |
| .connectionKeys.txt | Lines you can use in your local.settings.json to match the connection information created in .connections.code.json. Here the connectionKeys are in plain text, as when they are created the normal way in VS Code. |
| .KvReference.txt | Lines you can use in your local.settings.json to match the connection information created in .connections.code.json. Here Key Vault references are used instead, which is the best solution as you don't need to update local.settings.json when the keys are updated. |


It is possible to work around the current limitations in handling managed connections when working with Logic Apps Standard in Visual Studio Code. I hope you find this handy.

Queueing Azure DevOps Pipelines from Logic Apps

Queueing pipeline executions in Azure DevOps is possible in several ways. The reason to do it is to orchestrate a couple of pipelines to automate manual steps in a larger context. In this post I focus on doing it from Logic Apps Consumption using the Azure DevOps connector. Just starting a pipeline with no parameters is straightforward: add the Queue a new build action, create the connection, fill in the required information, and you're ready to go.

Azure DevOps Queue a new build action in Logic Apps

If your pipeline has parameters that you would fill in when running it interactively, things get a bit more complicated and you need to understand how data flows in your pipeline. First of all, your parameters need to have a default value; without defaults your pipeline fails.

Example of parameters when you run the pipeline interactively.

From your Logic App you can send a JSON dictionary with the values you want to use in the pipeline. With the example in the image, three values will be available in your pipeline using the syntax $(keyName).

Input JSON from the Logic App.
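As a sketch, the dictionary sent in the action's Parameters field could look like this (the key names are made up for illustration; use your own):

```json
{
  "environment": "test",
  "releaseVersion": "1.2.3",
  "notifyTeam": "true"
}
```

Each key then becomes available in the pipeline as $(environment), $(releaseVersion), and so on.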

Nice, but to get things working properly, so that you can start your pipeline both interactively and from your Logic App, you need to bind the incoming values to the parameters you have in the pipeline. You can do this by setting the default value of the parameters.

Parameters section in the pipeline definition.
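Based on the approach described above, the binding in the pipeline definition could be sketched like this (the parameter names are illustrative, matching the hypothetical JSON keys sent from the Logic App):

```yaml
parameters:
  - name: environment
    type: string
    default: $(environment)       # picks up the value sent from the Logic App
  - name: releaseVersion
    type: string
    default: $(releaseVersion)
```

When run interactively you type the values in; when queued from the Logic App, the defaults resolve to the values in the JSON dictionary.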

In the rest of your pipeline, always use the normal syntax ${{ parameters.%name% }}; that way the pipeline uses the right values regardless of whether you start it from your Logic App or interactively. The picture below shows how data flows from your Logic App into the parameters and is then used in the tasks.

Data flow from the JSON into the parameters and then into the tasks.

I hope this gives you a better understanding of how you can start your pipelines with arguments and how the data can flow.

Additional reading:

Azure DevOps – Connectors | Microsoft Docs

Use runtime and type-safe parameters – Azure Pipelines | Microsoft Docs

Some more on using the connector: Automating Azure DevOps with Logic Apps – Simple Talk (red-gate.com)

Test your Liquid transformations without deployment

I work on a project doing integration with Azure Integration Services and Azure Functions. As always in integration projects, mapping is a key area. From BizTalk development I'm used to being able to test my maps without deployment, which makes it easy to develop in an iterative manner. While working with Logic Apps I started using Liquid transformations and did not find any tool to help with that. Logic Apps transforms with the DotLiquid library (C# naming conventions). With that information in hand I created a program to test with. I separated it into two parts: a library, LiquidTransformationLib.dll, and a program, LiquidTransform.exe. The library makes it easy to use in automated tests.

Parameters for LiquidTransform.exe:

| Parameter | Description | Required |
| --- | --- | --- |
| -t \| --template | Full path to the Liquid template to use. | Yes |
| -c \| --content | Full path to the content file. | Yes |
| -d \| --destination | Full path to the destination file. Will overwrite an existing file. | Yes |
| -r \| --rootelement | Root element to add before rendering. For Logic Apps you will need to use content. | No |
| -u \| --rubynaming | Use RubyNamingConvention. Logic Apps uses C# naming conventions, which is the default. | No |
| -? \| -h \| --help | Show help information. | No |
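A usage sketch (the file names are made up; the flags are those from the table above):

```shell
LiquidTransform.exe -t .\customer-map.liquid -c .\customer.json -d .\customer-out.json -r content
```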

You can download the source code from the GitHub repo https://github.com/skastberg/LiquidTransformation

If you just want the binaries: https://github.com/skastberg/LiquidTransformation/tree/master/Binaries

More information about Liquid Transformation in Logic Apps: