Azure API Management SOAP Facade | apim soap

When you’ve got an old web service that needs to be consumed by a mobile app (which would much rather bind to JSON), but you can’t change the web service’s code, you’ve really only got one option: create a façade for the web service.
The main benefit is of course that the mobile app can consume JSON directly, without conversion. But API Management also gives you the ability to manage multiple APIs, protect them (security/throttling), monitor/monetise them and, more importantly, introduce a caching layer.

Here’s a great guide, written by a fellow Softie, which takes you through the whole process with a clear explanation.

Using Azure API Management To Convert SOAP API XML to JSON

To take it a little further: if you don’t want to submit a SOAP-esque JSON document, then you need to do a little more policy authoring. My service doesn’t require anything in the request body, just two query string parameters.

The key parts of the policy are listed below; a rough sketch of the full policy follows the list.

  • Wrap the SOAP envelope in CDATA tags
  • Use tokens in your SOAP packet, and replace them after setting the request body
  • Set the Content-Type header to text/xml after setting the body
  • Use an Event Hub to debug the new body of the request you’re making. This site is really helpful in getting Event Hubs working with APIM
  • Convert the response to JSON
  • Consider replacing some of the response SOAP envelope tags and XML namespace tags
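
To make that list concrete, here’s a rough sketch of the sort of policy it describes, pushed to an operation with the AzureRM API Management cmdlets. The API/operation ids, query string parameter names, SOAP body, token names and Event Hub logger id are all placeholders rather than my real service, so treat it as a starting point and check the attribute syntax against the current APIM policy reference.

$policy = @'
<policies>
  <inbound>
    <base />
    <!-- Build the SOAP request body. CDATA stops the envelope fighting with the policy XML. -->
    <set-body><![CDATA[<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:tns="http://tempuri.org/">
      <soapenv:Body>
        <tns:GetThing>
          <tns:Param1>__PARAM1__</tns:Param1>
          <tns:Param2>__PARAM2__</tns:Param2>
        </tns:GetThing>
      </soapenv:Body>
    </soapenv:Envelope>]]></set-body>
    <!-- Swap the tokens for the two query string parameters after the body has been set. -->
    <find-and-replace from="__PARAM1__" to='@(context.Request.Url.Query.GetValueOrDefault("param1", ""))' />
    <find-and-replace from="__PARAM2__" to='@(context.Request.Url.Query.GetValueOrDefault("param2", ""))' />
    <!-- The backend is SOAP, so set the content type to XML after setting the body. -->
    <set-header name="Content-Type" exists-action="override">
      <value>text/xml</value>
    </set-header>
    <!-- Optional: push the rewritten request body to an Event Hub while debugging. -->
    <log-to-eventhub logger-id="apim-debug-logger">@(context.Request.Body.As<string>(preserveContent: true))</log-to-eventhub>
  </inbound>
  <backend>
    <base />
  </backend>
  <outbound>
    <base />
    <!-- Turn the SOAP XML response into JSON for the mobile app. -->
    <xml-to-json kind="javascript-friendly" apply="always" consider-accept-header="false" />
    <!-- Optionally tidy up envelope/namespace noise in the converted JSON. -->
    <find-and-replace from="soapenv:" to="" />
  </outbound>
</policies>
'@

$apimContext = New-AzureRmApiManagementContext -ResourceGroupName "my-apim-rg" -ServiceName "my-apim-instance"
Set-AzureRmApiManagementPolicy -Context $apimContext -ApiId "legacy-soap-api" -OperationId "get-thing" -Policy $policy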

XML to JSON data manipulation with schemas

Starting with an XML file and then working backwards… old-school data munging at its worst. Here are the tools and tips that I used.

Creating an XSD schema from XML
In Visual Studio 2015, open the XML file, then find the XML main menu and select Generate XSD.

Creating a JSON schema from XSD
This seems to be a bit of an issue at the moment. There’s an old BizTalk 2013 component that will create a JSON schema of a sort, but not a draft 4 one.
Unfortunately this seems to be best tackled manually at the moment.

Creating a JSON from XML
Here’s my process if you have the original XML.
Convert XML to JSON. I did this in Visual Studio using Newtonsoft’s JsonConvert.
Use JSONLint to validate and format. You might also want to pull any XML header info off the JSON at this point.
Look at the resulting JSON. If you’ve got properties beginning with “@” (they’re XML attributes), then you need to remove those characters; I found they caused problems later.

Creating a JSON schema from JSON
http://jsonschema.net/#/

Verifying a JSON schema against JSON
http://jsonschemalint.com

Creating a Swagger definition file for a Logic App
http://gordon.byers.me/azure/creating-a-swagger-definition-for-an-azure-logic-apps-http-request-endpoint/

Creating an XSLT Map for an Azure Integration Account
Have a read about the Enterprise Integration Pack, then install the VS2015 extension: https://aka.ms/vsmapsandschemas
Create a BizTalk Integration project, then add an Integration Map file, which will allow you to visually map items between two XML schema files.
After building this project, an XSLT document is output into the bin folder.

Testing a Logic App HTTP Request Endpoint
https://chrome.google.com/webstore/detail/postman
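
If you’d rather stay in PowerShell than install Postman, here’s a small hedged sketch using Invoke-RestMethod; the host, workflow id, sig and body shape are the same placeholders used in the Swagger example later in this post.

# Sketch: call a Logic App HTTP Request trigger from PowerShell instead of Postman.
# The host, workflow id and sig below are placeholders; paste in your own Logic App URL pieces.
$uri  = "https://aprodinstance.anazureregion.logic.azure.com:443/workflows/yourworkflowid/triggers/manual/run" +
        "?api-version=2016-06-01&sp=%2Ftriggers%2Fmanual%2Frun&sv=1.0&sig=thesignaturepartinyourlogicapphttpurl"

# A body matching the request schema used in the Swagger example later on.
$body = @{ Obj1 = @{ Prop1 = "value1" }; Obj2 = @{ Prop1 = "value2" } } | ConvertTo-Json -Depth 5

Invoke-RestMethod -Method Post -Uri $uri -Body $body -ContentType "application/json"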

Azure Functions – Convert JSON to XML with Newtonsoft

I’m in the midst of putting together a few Azure Functions to use with Logic Apps.
This one is pretty simple, and demonstrates a couple of concepts so I thought I’d share.

Creating a swagger definition for an Azure Logic Apps Http Request endpoint

If you’ve created a Logic App with an HTTP Request trigger, then the next logical thing to do is expose it for consumption (for example in Azure API Management).
Swagger is now the de facto way of describing APIs, and it makes sense that you’d want to create one for your Logic App. I’m sure this will make it into the product as a feature soon, but for the moment you need to roll your own.

The LogicAppsRequestSchema is exactly the same JSON schema that you used in the HTTP Request trigger, so keep that handy. The placeholder values (the host, basePath, the sig default and the various “Friendly text” entries) are the areas you need to replace with your own.

I find that http://editor.swagger.io is the best tool for creating the Swagger files.
The validation is great (although it doesn’t like the default values for the parameters, which as far as I can gather are valid).

{
  "swagger": "2.0",
  "info": {
    "title": "Friendly name for your logic app",
    "description": "Some kind of description about what it does",
    "version": "1.0.0"
  },
  "host": "aprodinstance.anazureregion.logic.azure.com:443",
  "schemes": [
    "https"
  ],
  "basePath": "/workflows/yourworkflowid/triggers/manual",
  "produces": [
    "application/json"
  ],
  "paths": {
    "/run": {
      "post": {
        "consumes": [
          "application/json"
        ],
        "operationId": "Friendly text",
        "summary": "Friendly text",
        "description": "Friendly text",
        "parameters": [
          {
            "name": "api-version",
            "in": "query",
            "description": "api-version",
            "required": true,
            "default": "2016-06-01",
            "type": "string"
          },
          {
            "name": "sp",
            "in": "query",
            "description": "sp",
            "required": true,
            "default": "%2Ftriggers%2Fmanual%2Frun",
            "type": "string"
          },
          {
            "name": "sv",
            "in": "query",
            "description": "sv",
            "required": true,
            "default": "1.0",
            "type": "string"
          },
          {
            "name": "sig",
            "in": "query",
            "description": "sig",
            "required": true,
            "default": "thesignaturepartinyourlogicapphttpurl",
            "type": "string"
          },
          {
            "name": "body",
            "in": "body",
            "description": "The json you want to submit",
            "required": true,
            "schema": {
              "$ref": "#/definitions/LogicAppsRequestSchema"
            }
          }
        ],
        "responses": {
          "200": {
            "description": "Logic Apps response",
            "schema": {
              "$ref": "#/definitions/LogicAppsResponse"
            }
          }
        }
      }
    }
  },
"definitions": {
"LogicAppsRequestSchema":
{
"type": "object",
"required": [
"Header",
"StartOfDuty"
],
"properties": {
"Obj1": {
"properties": {
"Prop1": {
"type": "string"
}
"required": ["rop1"],
"type": "object"
},
"Obj2": {
"properties": {
"Prop1": {
"type": "string"
}
"required": ["Prop1"],
"type": "object"
}
}
}
,
"LogicAppsResponse": {
"type": "object",
"properties": {
"success": {
"type": "boolean",
"description": "Just a simple response property. You could template return objects/ error messages etc."
}
}
}
}
}

Upload Azure AAD B2C Premium Policy with PowerShell

If you like all of your interactions with Azure to be through PowerShell (who doesn’t like to automate?), then you’ll want to do the same thing for B2C policies.

Don’t try this until your B2C directory has been whitelisted for the B2C Policy Upload feature. You’ll know whether this is possible if you’re able to upload a policy in the Azure Portal.

Cascading Resource Group Tags in Azure

Resource Manager Policies in Azure are the way to define and enforce a tagging system.
You can define, in JSON format, rules that must be adhered to by new resources as they’re deployed, e.g. requiring that a particular tag is present.
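
As a hedged example (the tag name, policy name and scope are placeholders, and the rule uses the simple “field exists” form from the early policy documentation), defining and assigning a “deny anything without a costCenter tag” policy from PowerShell looks roughly like this:

# Sketch: deny new resources that don't carry a costCenter tag.
$policyRule = @"
{
  "if": {
    "field": "tags.costCenter",
    "exists": "false"
  },
  "then": {
    "effect": "deny"
  }
}
"@

$definition = New-AzureRmPolicyDefinition -Name "require-costCenter-tag" -Description "Deny new resources that have no costCenter tag" -Policy $policyRule

# Assign it at subscription scope (a resource group scope works too); swap in your own subscription id.
$scope = "/subscriptions/yoursubscriptionid"
New-AzureRmPolicyAssignment -Name "require-costCenter-tag" -PolicyDefinition $definition -Scope $scope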

For resources that you’ve already created, you’ll need to decide on an appropriate strategy. One that I’ve recently put together is a script that cascades the tags you define at the Resource Group level down to the individual resources (VMs, vNets and so on).

It doesn’t override any of the existing tags on a resource; it simply ensures that each resource has, at a minimum, the tags that are defined at the Resource Group level.

This version isn’t optimised for running on a schedule in Azure Automation, as it’s not a PowerShell Workflow and so doesn’t parallelise the foreach loops.

For the latest version, use the GitHub link.
https://github.com/Gordonby/PowershellSnippets/blob/master/Add-ResourceGroupTagsToResources.ps1
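
For a feel of what the script does, here’s a cut-down sketch of the approach. It assumes the AzureRM module of the time, and the exact shape of the Tags collections changed between AzureRM versions, so treat the GitHub version above as the authoritative one.

# Cut-down sketch: copy any missing Resource Group tags down to each resource.
# Assumes the AzureRM module and an existing Login-AzureRmAccount session.
$allResources = Get-AzureRmResource

foreach ($group in Get-AzureRmResourceGroup) {
    if ($null -eq $group.Tags -or $group.Tags.Count -eq 0) { continue }   # nothing to cascade

    foreach ($resource in ($allResources | Where-Object { $_.ResourceGroupName -eq $group.ResourceGroupName })) {
        $tags = $resource.Tags
        if ($null -eq $tags) { $tags = @{} }

        $changed = $false
        foreach ($key in $group.Tags.Keys) {
            if (-not $tags.ContainsKey($key)) {   # never override a tag the resource already has
                $tags[$key] = $group.Tags[$key]
                $changed = $true
            }
        }

        if ($changed) {
            Set-AzureRmResource -ResourceId $resource.ResourceId -Tag $tags -Force
        }
    }
}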

Resetting a user’s Azure AD Multi-Factor Authentication (MFA) requirement | MFA reset

If you find yourself needing to prompt one of your AAD users to re-set up their MFA method, then the following script should serve that purpose.
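
As a minimal sketch of the usual approach with the MSOnline module (the UPN is a placeholder): clearing the user’s registered StrongAuthenticationMethods prompts them to set MFA up again at their next sign-in.

# Minimal sketch: force a user to re-register their MFA methods.
# Requires the MSOnline module and an admin session via Connect-MsolService.
Connect-MsolService

$upn = "someuser@yourtenant.onmicrosoft.com"

# Passing an empty collection wipes the existing StrongAuthenticationMethods,
# which triggers MFA re-registration for the user.
Set-MsolUser -UserPrincipalName $upn -StrongAuthenticationMethods @()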

Shutting down Azure VMs based on Resource Group tags | shutdown tag

Shutting down your non-production VMs when you’re not using them is a great way to save money. There are a couple of good PowerShell scripts that make this easy to do by Resource Group, but when you want to be a little more granular and actually automate across resource groups, you need a smarter script. This is where resource tagging comes in.

This script is basically my v1: a simple PowerShell script designed to be run manually, on demand. I’ll be publishing the more fully featured version in the Azure Automation Runbook Gallery shortly. That script has had to have a few workarounds put in to deal with issues arising from being a PowerShell Workflow running in parallel, so I thought it worthwhile to share the simpler version here.
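
To give a flavour of the approach, here’s a cut-down, hedged sketch (the tag name and value are illustrative, and it uses the AzureRM cmdlets of the era) that stops every VM in any resource group tagged for shutdown:

# Sketch: stop every VM in any resource group carrying an AutoShutdown = 'true' tag.
# Tag name/value are illustrative. Assumes the AzureRM module and a Login-AzureRmAccount session.
$tagName  = "AutoShutdown"
$tagValue = "true"

foreach ($group in Get-AzureRmResourceGroup) {
    $tags = $group.Tags
    if ($null -eq $tags -or $tags[$tagName] -ne $tagValue) { continue }

    foreach ($vm in Get-AzureRmVM -ResourceGroupName $group.ResourceGroupName) {
        Write-Output "Stopping $($vm.Name) in $($group.ResourceGroupName)..."
        # -Force suppresses the confirmation prompt; the VM is deallocated so compute billing stops.
        Stop-AzureRmVM -ResourceGroupName $group.ResourceGroupName -Name $vm.Name -Force
    }
}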

Azure AD B2C – Using the Graph API

There’s a really good guide for getting started with CRUD operations in an AAD B2C tenant on the Azure documentation site:
https://azure.microsoft.com/en-gb/documentation/articles/active-directory-b2c-devquickstarts-graph-dotnet/

As per usual, I’ve ended up putting some PowerShell together to make it a bit more repeatable when I have to do this for multiple AAD tenants.

This particular script creates the application in the AAD tenant. I’ll be posting further scripts that show off doing some clever stuff when I’ve finished testing and polishing them.
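
For a rough idea of the shape of it, here’s a hedged sketch along the lines of the quickstart above; the display name, client secret and directory role are placeholders, and it assumes the MSOnline module.

# Sketch: register an application/service principal in the B2C tenant for Graph API access.
Connect-MsolService

$appName      = "B2C Graph API Client"
$clientSecret = "Replace-With-A-Strong-Secret"

# AppPrincipalId becomes the client id you use when requesting tokens for https://graph.windows.net
$sp = New-MsolServicePrincipal -DisplayName $appName -Type Password -Value $clientSecret
Write-Output "ClientId: $($sp.AppPrincipalId)"

# Give it enough rights to read and write users in the directory.
Add-MsolRoleMember -RoleName "User Account Administrator" -RoleMemberType ServicePrincipal -RoleMemberObjectId $sp.ObjectId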

Azure B2C Unified sign up with Page UI customization

When crafting a new Unified sign-up or sign-in page policy in the Azure Portal, I managed to get this error:

#error=server_error&error_description=AADB2C90001: The server hosting resource 'https://meetr.azurewebsites.net/account/signinorsignup' is not enabled for CORS requests. Ensure that the 'Access-Control-Allow-Origin' header has been configured.
Correlation ID: 613d1479-d146-4b89-abb8-3264730f5991
Timestamp: 2016-04-13 18:33:30Z

Of course, I’d been a bit quick off the mark and hadn’t yet changed my ASP.NET website to accept cross-origin requests.

Here’s what you’ll need to add to your unified sign-in page to fix the error:

Response.AppendHeader("Access-Control-Allow-Origin", "https://login.microsoftonline.com");

Code-wise, here’s how the Controller Action and View look: