Transformation is a key part of integration: information in one format needs to be used in another format and eventually processed in some way. While working with a client a while ago we did a successful PoC to move their integrations to Azure, and I want to share our findings regarding mapping. We needed to transform data, and the alternatives, Liquid and the Enterprise Integration Pack (EIP), did not fit well. The transformations were far too complex for Liquid, and EIP maps did not offer a natural way to map between XML and JSON. Also important for the decision was that the customer already used MapForce from Altova in their current solution, so they had an existing investment in licensing and knowledge. This post shows how we solved the transformations with MapForce; it is not a detailed step-by-step description. We generated code from MapForce and performed the transformations in Azure Functions.
Example mappings
One of the things we needed in the PoC was a lookup changing some identifiers in one system to their counterparts in the other, so my examples are two flavors of lookups. All code can be found in the GitHub repository MappingFunctions. To give you a picture of the effort needed to get started: the maps in the example took me about two hours, including finding out how to do them.
Steps
These are, in short, the steps I took for the example.
Create a MapForce project
Create the transformations
Generate code (C#)
Create Azure Functions project
Add the generated code to the Functions solution
Write code to execute the mappings in the functions
Call the functions from Logic Apps to test (excluded in the repository)
Simple lookup example
In the simple lookup example, the data is included in the value-map shape, so to add new values we need a new release. This can be acceptable when the data changes very seldom.
Simple summary map
Advanced lookup example
Sometimes data changes more often, and then we don't want the data hard coded in the map. In the advanced lookup example, the data is provided as a second input that could come from a database or a blob, making it possible to change the data without a new release. Note that one input is XML and the other is JSON.
Advanced summary map
The lookup is performed in a user function that uses the second input.
Lookup User Function
Code generation in MapForce
Code can be generated in several languages, C#, Java, XSLT, etc.; choose the one that suits your project best. As different languages have different feature sets, not all constructs work in all languages. In the PoC I wrote about we used Java, as that was the language the team preferred and used. The example project uses C# (.NET Core 3.1), with the settings in the image.
Code generation settings
The projects MapForce generated could be added without changes to the Azure Functions solution. This is good: if you need to update your transformations, you can simply overwrite the generated projects and keep everything together in source control.
Highlighted projects that are generated with MapForce
The code to execute the transformation is quite straightforward. In the example the documents are small and string variants are used, for larger documents there are variants that use streams.
public static class SummaryAdvanced
{
    [FunctionName("SummaryAdvanced")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Anonymous, "post", Route = null)] HttpRequest req,
        ILogger log)
    {
        log.LogInformation("Starting SummaryAdvanced.");
        string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
        if (string.IsNullOrEmpty(requestBody))
        {
            return new BadRequestResult();
        }

        // TODO: Add more error handling

        // Create the mapper object
        SummaryLookupMapToCatalog_Summary_Schema mapper = new SummaryLookupMapToCatalog_Summary_Schema();

        // Sources
        Altova.IO.Input booksSource = new Altova.IO.StringInput(requestBody);
        Altova.IO.Input shelfsSource = new Altova.IO.StringInput(MappingFunctions.Properties.Resources.Shelfs);

        // Mapping result
        StringBuilder responseMessageBuilder = new StringBuilder();
        Altova.IO.Output Catalog_Summary_SchemaTarget = new Altova.IO.StringOutput(responseMessageBuilder);

        // Execute transformation
        mapper.Run(booksSource, shelfsSource, Catalog_Summary_SchemaTarget);

        return new OkObjectResult(responseMessageBuilder.ToString());
    }
}
Final thoughts
Having one tool that can assist in creating maps between different formats is valuable; it saves the time of switching tooling. I find this a good way to perform mappings, and it took a reasonable effort to get started. That said, the documentation could be better, and it can take some time to get up and running with complex transformations. I have not dived deep into all features, so see this post as a starting point. Generating code in different languages is good, but I assume a team will stick to what suits their environment best. Note: These are my own thoughts and I don't have any business contact with Altova. I used a 30-day trial version of MapForce that can be downloaded here.
Queueing pipeline execution in Azure DevOps is fully possible in several ways. The reason to do it is to orchestrate running a couple of pipelines, automating manual steps in a larger context. In this post I focus on doing it from Logic Apps (Consumption) using the Azure DevOps connector. Just starting a pipeline with no parameters is straightforward: add the Queue a new build action, create the connection, fill in the required information, and you're ready to go.
If your pipeline has parameters that you would fill in when running it interactively, things get a bit more complicated, and you need to understand how data flows through your pipeline. First of all, your parameters need to have a default value; without defaults your pipeline fails.
From your Logic App you can send a JSON dictionary with the values you want to use in the pipeline. With the example in the image, you will have three parameters available in your pipeline using the syntax $(keyName).
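As a sketch, such a dictionary could look like this (the key names are example values; each key becomes a queue-time variable in the pipeline):

```json
{
  "environment": "test",
  "releaseVersion": "1.2.3",
  "skipSmokeTests": "false"
}
```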
Nice, but to get things working nicely, being able to start your pipeline both interactively and from your Logic App, we need to bind the incoming values to the parameters defined in the pipeline. You can do this by setting the default value for the parameters.
In the rest of your pipeline, always use the normal syntax ${{parameters.%name%}}; that way the pipeline uses the right values regardless of whether you start it from your Logic App or interactively. The picture below shows how data flows from your Logic App into the parameters and then into the tasks.
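A sketch of the binding in YAML (the parameter name is an example): the default picks up the queue-time variable sent from the Logic App, while the rest of the pipeline only uses the parameter syntax:

```yaml
parameters:
  - name: environment
    type: string
    default: $(environment)   # bound to the value sent from the Logic App

steps:
  - script: echo "Deploying to ${{ parameters.environment }}"
```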
Hope this gives you a better understanding of how you can start your pipelines with arguments and how the data flows.
We have four ways to validate requests and responses: content, parameters, headers, and status code. Three actions can be taken: ignore, detect, or prevent. Ignore skips the validation, detect logs the validation error but does not interrupt execution, and prevent stops processing on the first error. Each validation has high-level settings that tell the policy what to do with specified and unspecified items.
Content
Validates the size of a request or response and ensures that the body coming in or out follows the specification in the API description. For schema validation we are limited to JSON payloads. The Content-Type is validated against the API specification. Size validation works with all content types, up to a 100 KiB payload size. This validation can be applied at all scopes in the inbound, outbound and on-error sections.
To ensure that the specified content-type is honoured, set unspecified-content-type-action to prevent and to limit the size of a request set max-size and size-exceeded-action to prevent.
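Such a policy can be sketched like this (the 100 KiB value and the errors variable name are example choices):

```xml
<inbound>
    <validate-content unspecified-content-type-action="prevent"
                      max-size="102400"
                      size-exceeded-action="prevent"
                      errors-variable-name="requestBodyValidation" />
</inbound>
```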
APIM will return a status code 400 for any request with a body that exceeds the max-size.
Resulting Status code 400 – max size exceeded
If the request doesn't set the correct content-type, status code 400 is returned. In this case the required type is application/xml but the provided one is text/xml.
Resulting Status Code 400 – unspecified content type
Preventions will generate exceptions that can be seen in Application Insights; you could also trace the errors. The image shows a query in Application Insights with the exceptions joined with the request information.
Preventions will generate exceptions, here seen in Application Insights
In this example the policy validates size and content-type as the previous one does, and in addition the content element specifies that the payload should be validated.
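A sketch of what that can look like; the content element below assumes a JSON body that should be validated against the schema in the API specification:

```xml
<inbound>
    <validate-content unspecified-content-type-action="prevent"
                      max-size="102400"
                      size-exceeded-action="prevent">
        <content type="application/json" validate-as="json" action="prevent" />
    </validate-content>
</inbound>
```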
If the payload doesn't conform to the JSON schema, status code 400 is returned with a descriptive message.
Resulting Status Code 400 – Invalid json
Parameters
Apart from the content (body), we receive data as parameters: header, query, or path. This validation checks the incoming parameters against the API specification. Each parameter type has its own element in the policy; depending on your needs, one or more are used. The API specification shows how parameters are expected, their types, and whether they are mandatory. This validation can be applied at all scopes in the inbound section.
With the following policy, where unspecified-parameter-action is set to prevent, any parameter of any kind in the request that is not in the API specification will stop the request.
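As a sketch, such a policy can look like this (the errors variable name is an example):

```xml
<inbound>
    <validate-parameters specified-parameter-action="ignore"
                         unspecified-parameter-action="prevent"
                         errors-variable-name="requestParameterValidation" />
</inbound>
```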
The result is a status code 400 response from APIM. In this case it was a header, but it could be something else depending on your specification.
Resulting Status Code 400 – Unspecified header
Path
In this example an operation has a required path parameter, format, that we want to validate before the request is sent to the backend. If a request with the wrong type for format is received, we get a 400 error from APIM. As we set specified-parameter-action and unspecified-parameter-action to ignore, other errors are disregarded.
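A sketch of such a policy, validating only the format path parameter while ignoring everything else:

```xml
<inbound>
    <validate-parameters specified-parameter-action="ignore"
                         unspecified-parameter-action="ignore">
        <path specified-parameter-action="ignore">
            <parameter name="format" action="prevent" />
        </path>
    </validate-parameters>
</inbound>
```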
In this case the response is more generic and does not specify the incorrect parameter; that said, the exception seen in Application Insights is more specific.
Resulting Status Code 400 – with a generic error message
Headers
In this example an operation has a required header, spoken-language, that we want to validate before the request is sent to the backend. The header is an enumeration, and we want to ensure that only the allowed values are used.
The required header in APIM
To prevent the request from being forwarded when the spoken-language header has an invalid value, we can use the following policy.
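A sketch of the policy, validating only the spoken-language header:

```xml
<inbound>
    <validate-parameters specified-parameter-action="ignore"
                         unspecified-parameter-action="ignore">
        <headers specified-parameter-action="ignore"
                 unspecified-parameter-action="ignore">
            <parameter name="spoken-language" action="prevent" />
        </headers>
    </validate-parameters>
</inbound>
```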
A request with an invalid value returns status code 400.
Returned Status Code 400 telling the value is invalid for spoken-language
Query
In this example an operation has a query parameter, dayno, that we want to validate before the request is sent to the backend. The parameter is an integer, and we want to ensure that the correct type is used.
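A sketch of the corresponding policy, validating only the dayno query parameter:

```xml
<inbound>
    <validate-parameters specified-parameter-action="ignore"
                         unspecified-parameter-action="ignore">
        <query specified-parameter-action="ignore"
               unspecified-parameter-action="ignore">
            <parameter name="dayno" action="prevent" />
        </query>
    </validate-parameters>
</inbound>
```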
A request with a non-integer value will be prevented, and status code 400 is returned by APIM.
Returned Status Code 400 telling the request could not be processed
Headers
Just as we validate incoming parameters, it might be necessary to validate that our outbound data adheres to the API specification. This validation checks that the response headers are of the type specified in the API description. It can be applied at all scopes in the outbound and on-error sections.
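A sketch of the policy (the errors variable name is an example):

```xml
<outbound>
    <validate-headers specified-header-action="prevent"
                      unspecified-header-action="ignore"
                      errors-variable-name="responseHeaderValidation" />
</outbound>
```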
Returning a response of the wrong type will result in status code 400 with the generic text "The request could not be processed due to an internal error. Contact the API owner." If you look into Application Insights, the message is clearer: "header Test-Header cannot be validated. Input string 'yourvalue' is not a valid number. Path '', line 1, position 3."
Generic response with Status Code 400
Status code
An important part of our API specifications is the status codes we return. This validation checks the HTTP status codes in responses against the API schema. It can be applied at all scopes in the outbound and on-error sections.
In this example an operation has only status code 200 specified. If the requested session doesn't exist, the backend returns status code 404. We want to validate all unspecified status codes except 404.
Only Status Code 200 is specified
To avoid the status code 502 that would result if we validated all unspecified status codes, we add a status-code element with action set to ignore.
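A sketch of the policy, preventing all unspecified status codes except 404:

```xml
<outbound>
    <validate-status-code unspecified-status-code-action="prevent">
        <status-code code="404" action="ignore" />
    </validate-status-code>
</outbound>
```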
The prevented response if we don’t ignore the 404 status code.
Status Code 502 if an unspecified status code is returned
The exception in Application Insights
Summary
This is definitely a set of policies that we can use to ensure that an API specification is honored. It will require some thinking to balance the trade-off between the added value and the performance implications of doing the validation.
Having a common set of hosts and handlers across different environments is quite common practice. Being able to script the setup is a great way to ensure you have the same set of hosts and settings everywhere. A good approach when writing scripts that handle hosts, instances, and handlers is to make them re-runnable, creating items only when they are not already present.
I use the PowerShell Provider that comes with BizTalk Server, which I find handy. If you prefer, you could use the WMI classes MSBTS_Host, MSBTS_HostInstance and MSBTS_ServerHost. The sample below uses the PowerShell Provider. Note that for hosts I do a Push-Location to "BizTalk:\Platform Settings\Hosts\". To create objects with the provider, navigating to the right location is a must before calling New-Item.
To create a host
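The original shows the script as an image; a minimal re-runnable sketch using the BizTalk PowerShell Provider could look like this (host name and Windows group are example values, and the New-Item parameter names should be verified against your provider version):

```powershell
Push-Location "BizTalk:\Platform Settings\Hosts"
if (-not (Test-Path ".\MyProcessingHost")) {
    # Create an in-process host; settings are example values
    New-Item -Path ".\MyProcessingHost" `
        -HostType InProcess `
        -NtGroupName "DOMAIN\BizTalk Application Users" `
        -AuthTrusted:$false
}
Pop-Location
```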
To create an instance
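A sketch for the instance, under the same assumptions (service account and host name are example values; parameter names are as I recall them from the provider, so verify against your version):

```powershell
Push-Location "BizTalk:\Platform Settings\Host Instances"
$credential = Get-Credential "DOMAIN\svc-bts-host"   # host instance service account

# Create an instance of the host on this server
New-Item -Path ".\MyProcessingHostInstance" `
    -HostName "MyProcessingHost" `
    -RunningServer $env:COMPUTERNAME `
    -Credentials $credential
Pop-Location
```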
The full samples can be found in the BizTalk-Common.psm1.
Making a host handler for an adapter
To configure handlers with WMI, use the classes MSBTS_ReceiveHandler and MSBTS_SendHandler2. Doing it with the PowerShell Provider is straightforward using New-Item.
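A sketch of creating a handler with the provider (adapter, host name and parameter names are example values to be verified against your provider version):

```powershell
# Receive handler for the FILE adapter, hosted by MyProcessingHost
Push-Location "BizTalk:\Platform Settings\Adapters\FILE"
New-Item -Path ".\FILE Receive Handler" `
    -HostName "MyProcessingHost" `
    -Direction Receive
Pop-Location
```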
The full sample can be found in the BizTalk-Common.psm1.
Exporting and importing Hosts, instances and handlers
A lot of customers have asked for a way to "copy" their setup from, let's say, QA to Production. One tool you could use is the BizTalk Server Migration Tool, in which you can select what parts you want to move; that said, you might want to change some settings along the way. I have written a script (Export-BizTalkHostsRemote.ps1) to export hosts, host instances and handlers to a CSV file, and another (Configure-Hosts.ps1) to import using the edited output from the export. I could have used the existing settings export but wanted to be able to easily edit the file in Excel. I have intentionally left out throttling settings. Hope you find them handy; if you find any issues let me know.
Format of the file
HostName: Name of the host.
HostType: Expected values InProcess or Isolated.
GroupName: Windows group that controls access to the host.
AuthTrusted: Authentication Trusted setting in the properties dialogue.
IsTrackingHost: Allow host tracking setting in the properties dialogue.
Is32BitOnly: 32-bit only setting in the properties dialogue.
MessagingMaxReceiveInterval: Messaging polling interval for the host.
XlangMaxReceiveInterval: Orchestration polling interval for the host.
InstanceServer: A pipe (|) separated list of instances, e.g. Server1|Server2.
InstanceUser: Service account used for the host instances.
InstanceUserPwd: In the export, the same value as InstanceUser; for the import it expects a path to a password in a KeePass database.
ReceiveHandler: A pipe (|) separated list of receive adapters for this host, e.g. WCF-SQL|MSMQ|FILE.
SendHandler: A pipe (|) separated list of send adapters for this host. A * will mark this as the default host.
To configure BizTalk Server with script you need a configuration file that describes your choices. You can export it from an existing environment, or run Configuration.exe, make your selections, and export. To reuse the file, you will need to edit it to match the new environment. Generally, I create template files with tokens that I replace at configuration time. You can read about the configuration framework here.
The configuration files differ slightly between the first server, on which you create the group, and the secondaries that join the group. In the image below you see that the Selected attribute differs. The same kind of change applies to SSO if you create the group on the first server.
Handling secrets
In the configuration files you will enter passwords for service accounts and the SSO backup. One alternative is to use KeePass files, which are encrypted, or Azure Key Vault. That said, at some point you will have secrets in clear text, so ensure you delete the configuration file when you're done. Below is a sample function to extract secrets from a KeePass database using the PoShKeePass module. You can find information on extracting Key Vault secrets here.
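A sketch of such a function, assuming the PoShKeePass module and a database profile already registered with New-KeePassDatabaseConfiguration (group path, entry title and parameter names are example values; verify against the module version you use):

```powershell
Import-Module PoShKeePass

function Get-SecretFromKeePass {
    param(
        [string]$ProfileName,   # profile created with New-KeePassDatabaseConfiguration
        [string]$GroupPath,     # e.g. "Secrets/BizTalk"
        [string]$Title          # entry title, e.g. "svc-bts-host"
    )
    # Read the entry and return its password in clear text
    $entry = Get-KeePassEntry -DatabaseProfileName $ProfileName `
        -KeePassEntryGroupPath $GroupPath -AsPlainText |
        Where-Object { $_.Title -eq $Title }
    return $entry.Password
}
```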
The execution itself is quite straightforward: start Configuration.exe with the /S parameter.
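A sketch of the call (paths are example values; the /L log file switch is as I recall it, so verify against the configuration framework documentation):

```powershell
& "C:\Program Files (x86)\Microsoft BizTalk Server\Configuration.exe" `
    /S "C:\Install\BizTalkConfiguration.xml" `
    /L "C:\Install\ConfigurationLog.txt"
```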
Other configurations
Something I see quite often is that customers forget to configure the Backup and DTA Purge jobs in a timely manner, which leads to unnecessary database growth. You don't need many lines of code to configure them.
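As a sketch, assuming the default job names (the Backup job also needs its steps configured with your backup location before it is enabled):

```sql
-- Enable the backup and purge jobs that BizTalk configuration creates disabled
EXEC msdb.dbo.sp_update_job @job_name = N'Backup BizTalk Server (BizTalkMgmtDb)', @enabled = 1;
EXEC msdb.dbo.sp_update_job @job_name = N'DTA Purge and Archive (BizTalkDTADb)', @enabled = 1;
```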
Other potential targets are to register the BizTalk PowerShell provider, WinSCP setup for the SFTP adapter and third-party software.
When installing BizTalk Server with script, you need to tell the setup program what features to install. You can use the /ADDLOCAL parameter with a list of features as described here, or use a configuration file with the /S parameter. I prefer the latter, exporting the features from a reference installation or a previous version installation (compare an export from previous versions with the list below, since some features are no longer valid).
This table shows an export of features from the msi files; I have matched them with a configuration export from a fully installed BizTalk Server. The Feature column is the name that must be used, and it is case sensitive.
Feature (Parent Feature): Description

Documentation: Selecting the Documentation component installs the core documentation, tutorials, UI reference (F1 help), programmer's reference, and usage instructions for the SDK samples and utilities.
AdditionalApps: Additional Software is a set of optional components that extend the functionality of Microsoft BizTalk Server.
BAMEVENTAPI (AdditionalApps): Select the BAM-Eventing Support component to install necessary software for the BAM-Eventing Interceptors for Windows Workflow Foundation and Windows Communication Foundation. Selecting this component also installs the BAM Event API that is used to send events to the BAM database from custom applications. BAM-Eventing Support is part of the Business Activity Monitoring feature in Microsoft BizTalk Server.
FBAMCLIENT (AdditionalApps): Selecting the BAM Client component installs the necessary client side software that allows business users to work with the Business Activity Monitoring feature of Microsoft BizTalk Server.
MQSeriesAgent (AdditionalApps): Selecting the MQSeries Agent component installs the necessary software that enables Microsoft BizTalk Server to send and receive messages to an MQSeries message bus.
OLAPNS (AdditionalApps): Selecting the BAM Alert Provider component installs the necessary software that enables Microsoft BizTalk Server to provide Business Activity Monitoring (BAM) alerts.
ProjectBuildComponent (AdditionalApps): Project Build Component enables building BizTalk solutions without Visual Studio.
RulesEngine (AdditionalApps): Selecting the Business Rules Composer and Engine component installs the necessary software to compose policies that are consumed by the Business Rules Engine. The Engine component provides a mechanism for capturing dynamically changing business policies and the ability to implement those changes quickly within and across business applications.
SSOAdmin (AdditionalApps): Selecting the Enterprise Single Sign-On (SSO) Administration component installs the necessary software for administering, managing, and connecting to SSO Servers.
SSOServer (AdditionalApps): Selecting the Enterprise Single Sign-On (SSO) Master Secret Server component installs the necessary software that enables this server to become the master secret server, store the master secret (encryption key), and generate the key when an SSO administrator requests it. To use this feature, you must also install the following: Enterprise Single Sign-On (SSO) Server.
AdminAndMonitoring: Selecting the Administration Tools component installs the necessary software to administer Microsoft BizTalk Server. To use this feature, you must also install the following: Enterprise Single Sign-On (SSO) Administration.
AdminTools (AdminAndMonitoring): This feature contains tools to monitor, administer, and deploy onto a Microsoft BizTalk Server solution. These tools include MMC snap-ins, Health and Activity Tracking, Deployment Wizards and other tools for monitoring, administration and deployment.
BAMTools (AdminTools): Business Activity Monitoring administration.
BizTalkAdminSnapIn (AdminTools): Configure and manage Microsoft BizTalk Server.
HealthActivityClient (AdminTools): Health and Activity Tracking Client.
MonitoringAndTracking (AdminTools): Health Monitoring, Reporting and Tracking tools.
PAM (AdminTools): PAM.
WcfAdapterAdminTools (AdminAndMonitoring): Windows Communication Foundation Administration Tools.
Development: Selecting the Developer Tools and SDK component installs samples and utilities that enable the rapid creation of Microsoft BizTalk Server solutions. This includes: SDK samples and supporting documentation, BizTalk Explorer, schema and map designers, and Visual Studio 2015 project templates. This component requires Visual Studio 2015. To use this feature, you must also install the following: Enterprise Single Sign-On (SSO) Administration.
DeploymentWizard (Development): Deploy, import, export and remove BizTalk assembly.
Migration (Development): Migration.
SDK (Development): Provides programmatic access to Microsoft BizTalk Server.
SDKScenarios (SDK): SDK Scenarios. Note: I'm not sure of the usage; it is not included in a configuration export, but it is accepted and noted in the logs when used.
TrackingProfileEditor (Development): Business Activity Monitoring Tools.
VSTools (Development): Visual Studio Tools.
WCFDevTools (VSTools): Windows Communication Foundation Tools.
BizTalkExtensions (VSTools): BizTalk Extensions.
AdapterImportWizard (BizTalkExtensions): Adapter Import Wizard.
BizTalkExplorer (BizTalkExtensions): Manage BizTalk Configuration databases.
MsEDISchemaExtension (BizTalkExtensions): Microsoft EDI Schema Design Tools.
XMLTools (BizTalkExtensions): XML Tools.
Designer (BizTalkExtensions): Orchestration and Pipeline designers.
OrchestrationDesigner (Designer): BizTalk Orchestration Designer.
PipelineDesigner (Designer): BizTalk Pipeline Designer.
MsEDISDK (Development): Microsoft EDI SDK.
MsEDIMigration (MsEDISDK): Selecting the Microsoft EDI Migration Wizard installs the necessary software that enables migration of existing EDI documents to BizTalk.
BizTalk: (top-level feature).
WMI (BizTalk): WMI.
InfoWorkerApps: The Portal Components are a set of services used by business people to communicate, collaborate, and reach decisions enabling them to interact, configure, and monitor business processes and workflows. To use this feature, you must also install Internet Information Services (IIS).
BAMPortal (InfoWorkerApps): Selecting the Business Activity Monitoring component installs the necessary software that gives business users a real-time view of their heterogeneous business processes, enabling them to make important business decisions. To use this feature, you must also install Internet Information Services (IIS).
Runtime: Selecting the Server Runtime component installs the runtime services for Microsoft BizTalk Server. These runtime services are an essential part of the BizTalk Server platform. To use this feature, you must also install the following: Enterprise Single Sign-On (SSO) Administration, Enterprise Single Sign-On (SSO) Server.
Engine (Runtime): The Engine feature contains components for performing messaging, orchestration, and tracking functions. This is the core runtime component of Microsoft BizTalk Server. This option also includes Enterprise Single Sign-On components to allow encryption of configuration data.
MOT (Engine): Messaging, Orchestration, and Tracking runtime components.
MSMQ (Engine): BizTalk Adapter for Microsoft Message Queue Service.
MsEDIAS2 (Runtime): Selecting the BizTalk EDI/AS2 Runtime components installs the necessary software that enables Microsoft BizTalk Server to process documents in the Electronic Data Interchange (EDI) format.
MsEDIAS2StatusReporting (MsEDIAS2): Microsoft EDI/AS2 Status Reporting.
WCFAdapter (Runtime): Selecting the Windows Communication Foundation Adapter component installs the necessary software that enables Microsoft BizTalk Server to integrate with Windows Communication Foundation.
Example of an installation execution in PowerShell
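The original shows the execution as an image; a sketch of what the call can look like (paths and file names are example values; /S, /CABPATH and /L are from the silent installation documentation, so verify them for your version):

```powershell
# Silent BizTalk Server installation using an exported features file
& "D:\BizTalk2016\Setup.exe" `
    /S "C:\Install\InstalledFeatures.xml" `
    /CABPATH "C:\Install\BT2016-redist.cab" `
    /L "C:\Install\BizTalkInstall.log"
```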
The adapter pack can also be installed silently. The process consists of several steps, since you have the SDK and the adapters, and for each of them both 32- and 64-bit versions to include. In the setups I do with customers, we tweak the installation to match the features that will be used on the specific installation. A detailed description of the parameters can be found here. In the samples you find a function that installs both the SDK and the adapters.
Example of an installation execution in PowerShell that will install WCF-SQL and Oracle DB adapters.
Keeping your systems updated is good practice, and you can install both CUs and Feature Packs silently. In the samples you find a function that installs a CU. Scripted installation of CUs and Feature Packs is straightforward, and you can get the required parameters by running the installer with /?.
Example of an installation execution in PowerShell that will install a CU or Feature pack.
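A sketch of such a call (the KB file name is an example; check the exact switches by running the installer with /? as mentioned above):

```powershell
# Silent install of a CU or Feature Pack
& "C:\Install\BTS2016-KB4132957-ENU.exe" /quiet /norestart
```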
Add a check to see whether your virus scanner is enabled. The installation process will try to stop WMI, which many scanners use and therefore protect from being stopped, and that will lead the installation to fail.
While doing your test installations, you will probably want to retry a few times. Most features can be uninstalled, so you can write another script to uninstall and then start over with your installation.
Installing and configuring BizTalk Server can be complex, time consuming, and error prone. The complexity comes not from the process itself but from all the different components and possible configurations. My objective is to share my experiences from working with several customers and some techniques you can use to create your perfect installation. The objective is not to show "the perfect" process; perfect for me might not be perfect for you. In this post I share an overview; later I will follow up with more detail and share some of the functions I use.
Before you start
Decide what your main drivers are and let them guide you through the creation. Repetition and control are generally the drivers to automate, and the goal may be standardized developer machines, disaster recovery preparation, test environments, or ensuring that environments are configured identically across the Dev to Production pipeline.
Decide what your baseline is and document it; think about what could change in 6 months or a year. With one customer we created a brilliant setup that started with bare Windows installations, and within 2 hours a highly available solution was in place. Discussing it a year later, we concluded that creating the LUNs and volumes might have been overdoing it, since the underlying storage would be changed. It would have been better to have it as a requirement in our baseline, or as a distinct step in the process that could easily be changed or replaced.
Consider internal organization and politics; i.e. if you will never be allowed to install SQL Server or create groups, put it in your baseline document as a prerequisite.
Set a timeframe for your work; if not, you can spend weeks searching for perfection instead of reaching your goal. If you're too ambitious you might end up with an overly flexible process that is just a parallel to the normal one, which is not a good idea.
Document the execution process: write down the running order of the scripts and, briefly, what each one does.
Windows
Generally, the Windows installation is already taken care of, and I see it as part of the baseline. That said, you should ensure that the features and configurations you need are in place. You will need the Windows installation source and can use PowerShell cmdlets like Enable-WindowsOptionalFeature or Install-WindowsFeature (which does not work on Windows 10). I find this post good for finding features and deciding which cmdlet to use.
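For example (the feature names are illustrative; Install-WindowsFeature applies to server SKUs, Enable-WindowsOptionalFeature to Windows 10):

```powershell
# Server SKU: install IIS and .NET Framework 4.5 features
Install-WindowsFeature -Name Web-Server, NET-Framework-45-Features -IncludeManagementTools

# Windows 10: enable an optional feature instead
Enable-WindowsOptionalFeature -Online -FeatureName IIS-WebServerRole -All
```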
BizTalk Server
Setting up the BizTalk Server product consists of two parts: installation and configuration. Installation adds the binaries to the system. Configuration creates or joins a BizTalk Group and enables/configures other features, such as the Rules Engine.
When running Setup.exe with the /S parameter, it uses the list of InstalledFeature elements in the configuration file you specify. The silent installation details are documented here.
When running Configuration.exe with the /S parameter, it uses the Feature elements. Each Feature element represents a section in the configuration dialog box. I will look more in depth at this in another post.
Additional software like the Adapter Pack, CUs/Feature Packs, and WinSCP (needed for the SFTP adapter) can also be installed silently. Setting up hosts and handlers can also be scripted.
SQL Server
SQL Server can be installed silently, using configuration files for the configuration you need. I leave this with a pointer to the documentation.
Things I have scripted post-installation are:
Setup of Availability Groups and creation of empty BizTalk databases with the file layout I want.
Setting the primary check for Availability Groups.
Configuration of the Backup and DTA Purge jobs.
Wrapping up
Basically, all parts of setting up a BizTalk Server environment can be scripted; your needs and environment set the limits. I believe scripting your environment is a good way to get to know the components you're using. I will follow up with more posts that take a more hands-on approach to the different parts.
I work on a project doing integration with Azure Integration Services and Azure Functions. As always in integration projects, mapping is a key area. From BizTalk development I'm used to being able to test my maps without deploying, which makes it easy to develop iteratively. While working with Logic Apps I started using Liquid transformations and did not find any tool to help with that. Logic Apps transforms with the DotLiquid library (C# naming conventions). With that information in hand, I created a program for testing. I separated it in two: a library, LiquidTransformationLib.dll, and a program, LiquidTransform.exe. The library makes it easy to use in automated tests.
Parameters for LiquidTransform.exe:
-t | --template: Full path to the Liquid template to use. (Required)
-c | --content: Full path to the content file. (Required)
-d | --destination: Full path to the destination file. Will overwrite an existing file. (Required)
-r | --rootelement: Root element to add before render. For Logic Apps you will need to use content. (Optional)
-u | --rubynaming: Use RubyNamingConvention; Logic Apps use C# naming conventions, which is the default. (Optional)
I have created a repository in GitHub to share some scripts I have written, and I will add scripts over time. All scripts are shared as-is under the MIT license.
SQL Server 2016 Availability Groups have a limitation: two databases involved in a distributed transaction cannot reside in the same SQL Server instance. For a BizTalk Server installation that uses Availability Groups, this means you need to separate the databases across several SQL Server instances, making the installation more complicated than desired.
SQL Server 2016 SP2 adds functionality that solves this problem: two databases involved in a distributed transaction can now be in the same SQL Server instance. BizTalk Server 2016 CU5 and FP3 add support for this functionality. As a result, BizTalk Server 2016 installations using Availability Groups can be configured with fewer SQL Server instances if desired.
Database location rules when using Availability Groups
Rules for database location when using Availability Groups with BizTalk Server installations.
Prior to SQL Server 2016 (BizTalk Server 2016): Distributed transactions are not supported, thus not supported for BizTalk Server.