DR environment for Azure API Management | backup and restore

Azure API Management allows the entire service configuration to be backed up to a storage account.
You can do this on an ad-hoc basis, or on a predefined schedule using Azure Automation, and then restore the configuration to another API Management instance (in another region, for DR).

I’ve found that it takes about 15 minutes to restore the configuration, during which you cannot make any other changes to the APIM service or use the Publisher/Developer portal. This includes any changes you try to apply via PowerShell, the xplat CLI or the REST API; you’ll receive an error stating that changes cannot be made whilst the configuration is being applied.

It should be noted that usage statistics are not carried across by a restore; the target APIM instance keeps its own. Backup/restore is really intended for syncing Production with other non-production environments (DR, pre-prod, etc.), not for applying release changes to your production environment.

Official MSDN Docs:
Backup-AzureRmApiManagement : https://msdn.microsoft.com/en-us/library/mt619281.aspx
Restore-AzureRmApiManagement : https://msdn.microsoft.com/en-us/library/mt619278.aspx

PowerShell script
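
A minimal sketch of the sort of script you could schedule, using the cmdlets linked above. Resource, storage account and container names are placeholders, and the key-retrieval syntax varies slightly between AzureRM module versions:

# Storage account that will hold the backup blob
$storageKey = (Get-AzureRmStorageAccountKey -ResourceGroupName "dr-rg" -Name "drbackupstorage")[0].Value
$storageContext = New-AzureStorageContext -StorageAccountName "drbackupstorage" -StorageAccountKey $storageKey

# Back up the primary APIM instance's configuration to blob storage
Backup-AzureRmApiManagement -ResourceGroupName "prod-rg" -Name "prod-apim" `
    -StorageContext $storageContext -TargetContainerName "apim-backups" -TargetBlobName "prod-apim.bak"

# Restore that configuration onto the DR instance in another region
Restore-AzureRmApiManagement -ResourceGroupName "dr-rg" -Name "dr-apim" `
    -StorageContext $storageContext -SourceContainerName "apim-backups" -SourceBlobName "prod-apim.bak"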

Azure Batch – Pool Auto-scale formulas

Just playing around with some Azure Batch Auto-scaling formulas.
See the Azure documentation here : https://azure.microsoft.com/en-gb/documentation/articles/batch-automatic-scaling/

Time based – Static

Between 8am and 6pm, Monday to Wednesday (0 = Sunday), 5 nodes will be allocated to the pool.
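
The formula for that is along these lines (a sketch; on newer Batch API versions the variables are $TargetDedicatedNodes/$CurrentDedicatedNodes rather than $TargetDedicated/$CurrentDedicated):

$curTime = time();
$workHours = $curTime.hour >= 8 && $curTime.hour < 18;
$isMonToWed = $curTime.weekday >= 1 && $curTime.weekday <= 3;
$TargetDedicated = ($workHours && $isMonToWed) ? 5 : 0;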

Task based – Dev/Economical

Restrict capacity to 0 unless there is a waiting task, then allocate an additional node.
This builds in up to 5 minutes of latency before the pool autoscales, not including node boot time and any pre-tasks the pool has associated. In my case it took up to 12 minutes before the scale-up occurred, originally going over my task scheduler’s coded timeout of 10 minutes.
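
A sketch of that formula, with the 5-minute sample window and the variable names being my own choice:

$waitingTasks = avg($ActiveTasks.GetSample(TimeInterval_Minute * 5));
$TargetDedicated = $waitingTasks > 0 ? ($CurrentDedicated + 1) : 0;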

Combination of task and time

On Monday (8am-6pm) ensure a minimum capacity of 1. When there is a waiting task in the queue, increase the pool capacity by 1 node.
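
Roughly, combining the two previous sketches:

$curTime = time();
$isMondayWorkHours = $curTime.weekday == 1 && $curTime.hour >= 8 && $curTime.hour < 18;
$minimumNodes = $isMondayWorkHours ? 1 : 0;
$waitingTasks = avg($ActiveTasks.GetSample(TimeInterval_Minute * 5));
$TargetDedicated = $waitingTasks > 0 ? ($CurrentDedicated + 1) : $minimumNodes;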

Hard limit

As above, but ensuring that the maximum pool size never exceeds 10.
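
Which is just the previous sketch with a min() wrapped around the result:

$curTime = time();
$isMondayWorkHours = $curTime.weekday == 1 && $curTime.hour >= 8 && $curTime.hour < 18;
$minimumNodes = $isMondayWorkHours ? 1 : 0;
$waitingTasks = avg($ActiveTasks.GetSample(TimeInterval_Minute * 5));
$TargetDedicated = min($waitingTasks > 0 ? ($CurrentDedicated + 1) : $minimumNodes, 10);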

Azure Batch Start task – JRE on Windows | Java Runtime Environment

I’ve had to set Azure Batch up for a Java application, on Windows (go figure 🙂 ).

The first thing you need to do is get the Java Offline Installer.

To launch the JRE install silently, you’ll want to pass the /s flag. This only works with the offline installer.
Ref: http://docs.oracle.com/javase/8/docs/technotes/guides/install/windows_installer_options.html

Upload the installer to an Azure storage account, and note the public url to it.

Go to your Azure Batch account in the portal and click on the compute pool that you want to add the start task to.
Click resources, and enter the Azure Storage URL of the installer along with a destination path.

NB: Resource file paths are relative. Just enter the name of the file you want created, and it’ll be created here “C:\user\tasks\startup\wd\jre-8u101-windows-x64.exe”
Bad | “d:\jre” : “The resource file path is not valid”
Bad | “d:\jre\jre-8u101-windows-x64.exe” : “The resource file path is not valid”
Good | “jre-8u101-windows-x64.exe”

You should choose the options to “Run Elevated” and to “Wait for Success”.
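
The start task command line itself can then be as simple as this (using the resource file name from above; the cmd /c wrapper just runs the installer via the shell):

cmd /c "jre-8u101-windows-x64.exe /s"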

When you add a node to the pool, you’ll find that Java is installed.

Logic Apps – Json Schema Verify

Logic Apps has a really handy trigger: Request/Response.
It provides a public URL that you can post to, and takes an optional Json schema. The benefit of providing a Json schema is purely that the fields defined in the schema are then accessible throughout the Logic App workflow for interaction with other actions.
This is awesome, and it really makes the visual aspect of the designer work well. But, being picky, there’s no built-in validation of the Json request against the schema you provide.

I’ve got a pretty well-defined Json schema that uses:

  • Required Fields
  • Typed fields (string/integer/etc)
  • Regex patterns for strings

As such I’m pretty keen to first find out whether the Json in the request is actually valid. If it’s not, I can return a 400 to the caller pretty quickly and save some valuable cash on my Logic Apps actions (see the Logic Apps pricing model).

The solution I went for was to wrap the Newtonsoft Json Schema library in an Azure Function.

The first complication is passing two parameters into an Azure Function (the Json from the request body and the Json schema). Both are needed in order to perform a schema validation, and the Azure Function can’t reach back out to the Logic App for them. Switch to Code View in the Logic App designer and use something like this:
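
For example, something along these lines; a sketch of the Function action’s inputs where the resource ids are placeholders and the schema object is abbreviated:

"JsonSchemaValidate": {
  "type": "Function",
  "inputs": {
    "function": {
      "id": "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Web/sites/<function-app>/functions/JsonSchemaValidate"
    },
    "body": {
      "json": "@triggerBody()",
      "schema": {
        "type": "object",
        "required": [ "Obj1" ],
        "properties": { "Obj1": { "type": "object" } }
      }
    }
  },
  "runAfter": {}
}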

Apart from that, the coding of the Azure function is pretty easy.
It’s in GitHub here https://github.com/Gordonby/AzFunctions/tree/master/JsonValidate, but here’s a snapshot of the files.
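
Roughly, the function boils down to something like this; my sketch of the approach rather than a copy of the repo files, and it assumes the Newtonsoft.Json.Schema NuGet package has been pulled in via project.json:

#r "Newtonsoft.Json"

using System.Net;
using System.Net.Http;
using Newtonsoft.Json.Linq;
using Newtonsoft.Json.Schema;

public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
{
    // Expects a body of the form { "json": { ... }, "schema": { ... } }
    JObject input = JObject.Parse(await req.Content.ReadAsStringAsync());

    JSchema schema = JSchema.Parse(input["schema"].ToString());
    JToken json = input["json"];

    // Validate the json against the schema, collecting any error messages
    IList<string> errors;
    bool valid = json.IsValid(schema, out errors);
    log.Info($"Schema validation result: {valid}");

    return req.CreateResponse(HttpStatusCode.OK, new { valid = valid, errors = errors });
}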

A simple implementation of this can be seen below: a Request trigger passes the body to my Function App (JsonSchemaValidate), and the Response then just relays the information provided by the function’s output.

Azure API Management Soap Facade

When you’ve got an old web service that needs to be consumed by a mobile app (which loves binding Json), but you can’t change the code of your web service, you’ve got one real option available: create a façade for the web service.
The main benefit is of course that the mobile app can consume the Json directly, without conversion. But API Management also gives you the ability to manage multiple APIs, protect them (security/throttling), monitor/monetise them and, more importantly, introduce a caching layer.

Here’s a great guide that a fellow Softie’s written, which takes you through the whole process with a great explanation.

Using Azure API Management To Convert SOAP API XML to JSON

To take it a little further, if you don’t want to submit a SOAP-esque Json document then you need to do a little more policy authoring. My service doesn’t require anything in the request body, just 2 querystring parameters.

The key parts in the policy below are:

  • Wrap the SOAP envelope in CDATA tags
  • Use tokens in your SOAP packet, and replace them after setting the request body
  • Set the header to text/xml after setting the body
  • Use an Eventhub to debug the new body of the request you’re making. This site is really helpful in getting the Eventhub working with APIM
  • Convert the response to Json
  • Consider replacing some of the response Soap envelope tags and xml namespace tags
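
Put together, the inbound/outbound sections end up looking roughly like this. It’s a sketch only: the operation, namespaces, parameter names, logger id and any json keys you strip afterwards are placeholders for whatever your service actually needs.

<inbound>
  <base />
  <!-- Pull the two querystring parameters off the incoming request -->
  <set-variable name="param1" value="@(context.Request.Url.Query.GetValueOrDefault("param1"))" />
  <set-variable name="param2" value="@(context.Request.Url.Query.GetValueOrDefault("param2"))" />
  <!-- SOAP envelope wrapped in CDATA, with tokens where the parameters belong -->
  <set-body><![CDATA[<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:tem="http://tempuri.org/">
  <soapenv:Body>
    <tem:MyOperation>
      <tem:Param1>__PARAM1__</tem:Param1>
      <tem:Param2>__PARAM2__</tem:Param2>
    </tem:MyOperation>
  </soapenv:Body>
</soapenv:Envelope>]]></set-body>
  <!-- Replace the tokens after the body has been set -->
  <find-and-replace from="__PARAM1__" to="@((string)context.Variables["param1"])" />
  <find-and-replace from="__PARAM2__" to="@((string)context.Variables["param2"])" />
  <!-- The backend expects SOAP, so fix up the content type after setting the body -->
  <set-header name="Content-Type" exists-action="override">
    <value>text/xml</value>
  </set-header>
  <!-- Useful while debugging: push the rewritten request body to an Event Hub -->
  <log-to-eventhub logger-id="my-apim-logger">@(context.Request.Body.As<string>(preserveContent: true))</log-to-eventhub>
</inbound>
<outbound>
  <base />
  <!-- Convert the SOAP response into json for the mobile app -->
  <xml-to-json kind="direct" apply="always" consider-accept-header="false" />
  <!-- Optionally strip envelope/namespace noise from the converted json -->
  <find-and-replace from="a-namespace-prefix-to-strip" to="" />
</outbound>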

XML to JSON data manipulation with schemas

Starting with an XML file and then working backwards… Old school data munging at its worst, and the tools/tips that I used.

Creating an XSD schema from XML
In Visual Studio 2015, open the XML file. Find the XML main menu and select Generate XSD.

Creating a JSON schema from XSD
This seems to be a bit of an issue at the moment. There’s an old BizTalk 2013 component that will create a json schema of a sort, but not a Rev4 one.
Unfortunately this seems to be best tackled manually at the moment.

Creating a JSON from XML
Here’s my process if you have the original XML.
Convert the XML to JSON. I did this in Visual Studio using Newtonsoft’s JsonConvert (see the snippet below).
Use JSONLint to validate and format. You might also want to pull any XML header info off the JSON at this point.
Look at the resulting JSON. If you’ve got properties beginning with “@” (they’re XML attributes), then you need to bin those characters off; I found they caused problems later.
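
The conversion step itself is essentially a one-liner with Json.NET; a minimal sketch, assuming the source XML is loaded into an XmlDocument (file names are placeholders):

using System.IO;
using System.Xml;
using Newtonsoft.Json;

var doc = new XmlDocument();
doc.Load("source.xml");

// Attributes become "@"-prefixed properties in the output, hence the clean-up step above
string json = JsonConvert.SerializeXmlNode(doc, Newtonsoft.Json.Formatting.Indented);
File.WriteAllText("source.json", json);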

Creating a JSON schema from JSON

Verifying a JSON schema against JSON

Creating a Swagger definition file for a Logic App

Creating an XSLT Map for an Azure Integration Account
Have a read about the Enterprise Integration Pack, and install the VS2015 extension: https://aka.ms/vsmapsandschemas
Create a BizTalk Integration project, then add an Integration Map file, which will let you visually map items between two XML schema files.
After building the project, an XSLT document is output into the bin folder.

Testing a Logic App HTTP Request Endpoint

Azure Functions – Convert Json to Xml with Newtonsoft

I’m in the midst of putting together a few Azure Functions to use with Logic Apps.
This one is pretty simple, and demonstrates a couple of concepts so I thought I’d share.
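
The shape of it is roughly this; a run.csx sketch rather than the exact code, assuming a plain HTTP-triggered C# function:

#r "Newtonsoft.Json"

using System.Net;
using System.Net.Http;
using Newtonsoft.Json;

public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
{
    // Read the raw Json from the request body
    string json = await req.Content.ReadAsStringAsync();
    if (string.IsNullOrEmpty(json))
        return req.CreateResponse(HttpStatusCode.BadRequest, "Please pass a Json body");

    // Convert to XML, wrapping in a root node in case the Json has multiple top-level properties
    var doc = JsonConvert.DeserializeXmlNode(json, "Root");

    var response = req.CreateResponse(HttpStatusCode.OK);
    response.Content = new StringContent(doc.OuterXml, System.Text.Encoding.UTF8, "text/xml");
    return response;
}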

Creating a swagger definition for an Azure Logic Apps Http Request endpoint

If you’ve created a Logic App with an HTTP Request trigger, then the next logical thing to do is expose it for consumption (like in Azure API Management).
Swagger is now the de facto way of describing APIs, and it makes sense that you’d want to create one for your Logic App. I’m sure this will make it in as a feature soon, but for the moment you need to roll your own.

The LogicAppsRequestSchema is exactly the same Json schema that you used in the HTTP Request trigger, so keep that handy. The placeholder values (host, workflow id, signature and the friendly text) are the areas you need to replace with your own.

I find that http://editor.swagger.io is the best tool for creating the Swagger files.
The validation is great (although it doesn’t like the default values for the parameters, which as far as I can gather are valid).

{
  "swagger": "2.0",
  "info": {
    "title": "Friendly name for your logic app",
    "description": "Some kind of description about what it does",
    "version": "1.0.0"
  },
  "host": "aprodinstance.anazureregion.logic.azure.com:443",
  "schemes": [
    "https"
  ],
  "basePath": "/workflows/yourworkflowid/triggers/manual",
  "produces": [
    "application/json"
  ],
  "paths": {
    "/run": {
      "post": {
        "consumes": [
          "application/json"
        ],
        "operationId": "Friendly text",
        "summary": "Friendly text",
        "description": "Friendly text",
        "parameters": [
          {
            "name": "api-version",
            "in": "query",
            "description": "api-version",
            "required": true,
            "default": "2016-06-01",
            "type": "string"
          },
          {
            "name": "sp",
            "in": "query",
            "description": "sp",
            "required": true,
            "default": "%2Ftriggers%2Fmanual%2Frun",
            "type": "string"
          },
          {
            "name": "sv",
            "in": "query",
            "description": "sv",
            "required": true,
            "default": "1.0",
            "type": "string"
          },
          {
            "name": "sig",
            "in": "query",
            "description": "sig",
            "required": true,
            "default": "thesignaturepartinyourlogicapphttpurl",
            "type": "string"
          },
          {
            "name": "body",
            "in": "body",
            "description": "The json you want to submit",
            "required": true,
            "schema": {
              "$ref": "#/definitions/LogicAppsRequestSchema"
            }
          }
        ],
        "responses": {
          "200": {
            "description": "Logic Apps response",
            "schema": {
              "$ref": "#/definitions/LogicAppsResponse"
            }
          }
        }
      }
    }
  },
  "definitions": {
    "LogicAppsRequestSchema": {
      "type": "object",
      "required": [
        "Obj1",
        "Obj2"
      ],
      "properties": {
        "Obj1": {
          "properties": {
            "Prop1": {
              "type": "string"
            }
          },
          "required": ["Prop1"],
          "type": "object"
        },
        "Obj2": {
          "properties": {
            "Prop1": {
              "type": "string"
            }
          },
          "required": ["Prop1"],
          "type": "object"
        }
      }
    },
    "LogicAppsResponse": {
      "type": "object",
      "properties": {
        "success": {
          "type": "boolean",
          "description": "Just a simple response property. You could template return objects/ error messages etc."
        }
      }
    }
  }
}

Upload Azure AAD B2C Premium Policy with Powershell

If you like all of your interactions with Azure to be through PowerShell (who doesn’t like to automate?), then you’ll want to do the same thing for B2C policies.

Don’t try this before you have your B2C directory whitelisted for the B2C policy upload feature. You’ll know this is possible if, in the Azure Portal, you’re able to upload a policy.

Cascading Resource Group Tags in Azure

Resource Manager Policies in Azure are the way to define and enforce a tagging system.
You can define, in Json, rules that must be adhered to for new resources that are deployed.

For resources that you’ve already created, you’ll need to decide on an appropriate strategy. One that I’ve recently put together is a script that cascades the tags you define at the Resource Group level down to the individual resources (VMs, vNets, etc.).

It doesn’t override any existing tags that a resource has; it simply ensures that each resource has, at a minimum, the tags defined at the Resource Group level.

This version isn’t optimised for running on a schedule in Azure Automation, as it’s not a PowerShell workflow and so doesn’t parallelise the foreach loops.

For the latest version, use the GitHub link.
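
For reference, the shape of the approach is roughly this; a sketch assuming a recent AzureRM module, with the resource group name as a placeholder:

$group = Get-AzureRmResourceGroup -Name "my-resource-group"

if ($null -ne $group.Tags) {
    Get-AzureRmResource | Where-Object { $_.ResourceGroupName -eq $group.ResourceGroupName } | ForEach-Object {
        # Start from the resource's existing tags so nothing gets overridden
        $resourceTags = $_.Tags
        if ($null -eq $resourceTags) { $resourceTags = @{} }

        # Add any resource group tag the resource doesn't already have
        foreach ($key in $group.Tags.Keys) {
            if (-not $resourceTags.ContainsKey($key)) {
                $resourceTags[$key] = $group.Tags[$key]
            }
        }

        Set-AzureRmResource -ResourceId $_.ResourceId -Tag $resourceTags -Force
    }
}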