Azure Application Gateway – solving 502 errors with httpd

I’ve recently been playing with the Application Gateway in Azure. The use case is pretty simple: serving as a load balancer / WAF / DMZ for an application that lives on some RHEL VMs.
To run App Gateway in its simplest configuration, you just have to:
– Create the vnet
– Create a subnet for the App Gateway (something like a /27 should do)
– Create a subnet for the VMs
– Create the Application Gateway
– Create the VMs (just with private IPs)
– Add the VMs to the backend pool
– Add an App Gateway NSG to allow ports 80/443 and 65503-65534 for the health probe
– Add a VM subnet NSG to allow port 80 from the App Gateway subnet (both NSGs are sketched below).
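For the NSG part, a minimal PowerShell sketch with the AzureRM module might look like this – the resource names, location and App Gateway subnet prefix are all illustrative:

# NSG for the App Gateway subnet: web traffic in, plus the health probe port range
$web = New-AzureRmNetworkSecurityRuleConfig -Name "Allow-Http" -Protocol Tcp `
    -Direction Inbound -Priority 100 -SourceAddressPrefix Internet -SourcePortRange * `
    -DestinationAddressPrefix * -DestinationPortRange "80" -Access Allow
$ssl = New-AzureRmNetworkSecurityRuleConfig -Name "Allow-Https" -Protocol Tcp `
    -Direction Inbound -Priority 110 -SourceAddressPrefix Internet -SourcePortRange * `
    -DestinationAddressPrefix * -DestinationPortRange "443" -Access Allow
$probe = New-AzureRmNetworkSecurityRuleConfig -Name "Allow-HealthProbe" -Protocol Tcp `
    -Direction Inbound -Priority 120 -SourceAddressPrefix Internet -SourcePortRange * `
    -DestinationAddressPrefix * -DestinationPortRange "65503-65534" -Access Allow
New-AzureRmNetworkSecurityGroup -Name "appgw-nsg" -ResourceGroupName "rg-appgw" `
    -Location "westeurope" -SecurityRules $web,$ssl,$probe

# NSG for the VM subnet: port 80 from the App Gateway subnet only
$vmHttp = New-AzureRmNetworkSecurityRuleConfig -Name "Allow-Http-From-AppGw" -Protocol Tcp `
    -Direction Inbound -Priority 100 -SourceAddressPrefix "10.0.0.0/27" -SourcePortRange * `
    -DestinationAddressPrefix * -DestinationPortRange "80" -Access Allow
New-AzureRmNetworkSecurityGroup -Name "vm-nsg" -ResourceGroupName "rg-appgw" `
    -Location "westeurope" -SecurityRules $vmHttp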

However, sometimes these things just don’t work and you need to fault-find. The problem I had was that the gateway kept reporting a 502 error and stating that all the nodes in the backend pool were unhealthy. This is a great article to start with, but it didn’t help my problem.

What I then did was create another subnet and drop a Windows VM jumpbox into it. From there, I could happily browse the Linux VMs in IE, as well as SSH onto them to check configuration from inside the same virtual network.

In the end, I created six different VMs with different configurations before the problem became clear. Httpd was the problem.

OS | Web server | Healthy?
Windows Server 2016 | IIS | Yes
Ubuntu 17.04 | Nginx | Yes
CentOS 7 | httpd | No
RHEL 7.3 | httpd | No
RHEL 6.7 | httpd | No
RHEL 7.3 | Nginx | Yes

Because all the traffic on the private IPs worked correctly from the jumpbox, it was clear that this wasn’t a firewall or NIC configuration problem. The one consistent factor was httpd. After trying a few different settings changes in httpd.conf, I found the solution.
The default welcome page was causing the probe to fail: on RHEL/CentOS the welcome page is actually served as an HTTP 403 error document, which the health probe treats as unhealthy. By creating an index.html file in /var/www/html/, the health probe began reporting success.
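The quickest fix is to drop any old placeholder page in, e.g.:

echo "healthy" | sudo tee /var/www/html/index.html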

Damn you welcome page!

Azure Logic Apps, tips from the real world

About two years ago I wrote some PowerShell as a workflow function that could be used in Azure Automation. The purpose was simple: get a daily email informing me of yesterday’s Azure spend, and then predict my annual charge based on it. I had a colleague ask me about it, and it got me thinking – pretty much the entirety of my script could be replicated in an Azure Logic App. Here’s the script if you’re interested: http://gordon.byers.me/azure/estimating-your-annual-azure-bill/

Translating the PowerShell function into a Logic App wouldn’t really add much value to the end product – I’d still get an email. However, it would be good experience for eating the Logic Apps dog food and seeing where the capabilities fall short and I have to use some code. It also makes it very easy to customise with other connectors from the Logic Apps library. To that end, this post isn’t so much about Azure billing as about Logic Apps, and the tips and tweaks that you might find useful when considering it for your own workloads.

So, let’s start with the end result, then pick it apart.
[Screenshot: the complete billing Logic App]

  1. The Trigger. A schedule, once every day at approx. 9am.
  2. Initialise Variable. Setting an integer variable to the value of my Azure Enterprise Enrolment.
  3. Initialise Variable. Setting a string variable to the value of my EA Billing API key. (OK, I’m cheating here, as at the time of writing Logic Apps doesn’t have string variables – but it’s coming soon!)
  4. Http call. Doing a GET to the EA billing API, asking for a summary of all usage reports.
  5. Blob create. Persisting the summarised data from the API down to a container for future analysis.
  6. Parse Json. Looking at what was returned by the Http call and putting a JSON schema over the top, so I can use the data in later actions.
  7. Parse Json. Filtering to the last month in the array and putting a JSON schema over the top of it.
  8. Blob create. Persisting the output to blob storage purely as part of the development process, so I can see what’s going on.
  9. Http call. Doing a secondary GET to the EA billing API, asking for the detailed usage for the latest month.
  10. Blob create. Persisting the detailed data down to blob storage for future analysis.
  11. Azure function. At this point I need something to take a CSV file, clean it up a little, do some maths and just return the sum of my usage in a currency unit.
  12. Blob create. Persisting the output to blob storage purely as part of the development process, so I can see what’s going on.
  13. Gmail send. Sending an HTML email with the monthly cost and the predicted annual cost.

So as it stands I managed to get quite far through the process without reaching out to Azure Functions. When I did have to use my own code, I created a PowerShell Azure Function, which let me keep all my previously written parsing code… RESULT! There’s nothing worse than rewriting something that works.
Here’s how that code looks;
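Or at least, a minimal sketch of it as a v1 (experimental) PowerShell function – the EA detail report’s header row and cost column name below are assumptions you’d tweak for your own report format:

# run.ps1 - the Functions host supplies $req and $res as file paths for the
# HTTP request body and response body respectively.
$raw = Get-Content $req -Raw

# The EA detail CSV has a few report-metadata lines before the real data;
# skip down to the column header row. (Assumption: it starts "AccountOwnerId".)
$lines = $raw -split "`r?`n"
$headerIndex = ($lines | Select-String -Pattern '^AccountOwnerId' | Select-Object -First 1).LineNumber - 1
$csv = $lines[$headerIndex..($lines.Count - 1)] | ConvertFrom-Csv

# Sum the cost column (the column name is an assumption; check your report)
$total = ($csv | ForEach-Object { [double]$_.ExtendedCost } | Measure-Object -Sum).Sum

# Return just the number as the response body
Out-File -Encoding ascii -FilePath $res -InputObject ([math]::Round($total, 2))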

Tips and Tricks

OK, so it wasn’t all plain sailing writing the Logic App. Here are a few areas where I needed to do more than connect the dots with the visual designer.

Parsing Json
These are really cool little actions. They let you put a JSON schema on top of any JSON that you’ve brought into the Logic App. (Previously you’d have had multiple Logic Apps talking to each other over Request/Response triggers.) For ultimate ease, they’ll create the schema for you based on sample JSON data, so make sure you’ve got a sample to hand. I found the Run History UI in Logic Apps pretty handy for grabbing one, as was persisting the data down to blob storage and pulling it out from there.

Filtering to the last item in an array
Using a JSON Parse action worked well again here, allowing me to use the LAST function to quickly focus on what I wanted.
[Screenshot: the Parse JSON action filtering the summary]
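In code view the expression ends up looking something like this – the action and property names here are illustrative stand-ins for the ones in my app:

@last(body('Parse-Usage-Summary')?['AvailableMonths'])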


Concatenating implicitly in an Http action’s URI field

OK, the designer didn’t like this one much at all. I had to break into code view and use the CONCAT function to combine the base URL with the relative path returned by the previous HTTP call.
@{concat('https://ea.azure.com/',body('Grab Latest Months Summary')?['LinkToDownloadDetailReport'])}

Wrapping the output in Json
An easy one, but worth mentioning: when passing data around between functions and APIs you’ll often have to start wrapping it in JSON.
[Screenshot: wrapping the PowerShell output in JSON]
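In code view, wrapping the function’s raw string output looks something like this (the property name is mine):

"body": "{\"dailySpend\": \"@{body('Parse-Azure-EA-Detail-CSV')}\"}"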

Type conversion and maths
My PowerShell function returns a string, which I then need to convert to a number and do some basic maths on.
I used MUL to perform the multiplication, and FLOAT to convert the string into a float that I could feed into the MUL.
You spent £@{body('Parse-Azure-EA-Detail-CSV')} yesterday.
If every day was like yesterday, your 12 monthly Azure Consumption would be £@{mul(365,float(body('Parse-Azure-EA-Detail-CSV')))} 

I’ll be creating an ARM template for this and putting it on GitHub soon, so if you want a daily email with your Azure spend you’ll be able to get it set up quickly and easily.

Dealing with code view quotes in an Azure Logic App | JSON

The visual designer in Logic Apps is pretty good, but all too often you need to break out into the code view. For me, this is usually because I want to construct some JSON to pass into an Azure Function for processing. What’s not entirely obvious is the correct way to deal with double quotes inside the code view. As ever, when I google something and find nothing, I’ll share a blog post about what the answer was – and this is no exception 🙂

The correct escape character to use is a backslash.

This snippet shows the final version of what I’m doing. In this case I’m writing the JSON out to blob storage (as a debug step) before sending it to an Azure Function for basic JSON validation, and then handing the data onward to PowerBI for visualisation.
[Screenshot: dealing with quotes in Logic App JSON]

And here’s the text;
"body": "{\"tweetId\": @{triggerBody()?['TweetId']}, \"tweet\": @{triggerBody()}, \"cogSentiment\": @{body('Detect_Sentiment')}, \"cogKeyPhrases\": @{body('Key_Phrases')}}",

Azure PowerShell on Linux on Windows

In the Windows 10 Anniversary update you’re able to install the “Windows Subsystem for Linux”, see the Bash on Ubuntu on Windows blog.

Then in August we announced that PowerShell had been open-sourced and was available on Linux: https://azure.microsoft.com/en-us/blog/powershell-is-open-sourced-and-is-available-on-linux/.

The version of Ubuntu that gets installed is 14.04, which is supported by PowerShell. So obviously the first thing you’ll want to do on your Windows 10 OS is install Bash, then PowerShell, and then the Azure module for PowerShell – if only for a change of scenery and a bit of script-inception.

Installing Azure PowerShell on Linux, in Windows
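The gist of it is below. This is a sketch from the alpha releases current at the time: the exact .deb file name changes with each release (check the PowerShell GitHub releases page for the latest Ubuntu 14.04 build), and the Azure module name may also have moved on.

# Inside Bash on Ubuntu on Windows - grab the current Ubuntu 14.04 .deb from
# https://github.com/PowerShell/PowerShell/releases (file name below is illustrative)
wget https://github.com/PowerShell/PowerShell/releases/download/v6.0.0-alpha.10/powershell_6.0.0-alpha.10-1ubuntu1.14.04.1_amd64.deb
sudo dpkg -i powershell_6.0.0-alpha.10-1ubuntu1.14.04.1_amd64.deb
sudo apt-get install -f   # pull in any missing dependencies

# Launch PowerShell, then install the .NET Core flavour of the Azure module
powershell
Install-Module AzureRM.NetCore.Preview
Login-AzureRmAccount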

DR environment for Azure Api Management | backup and restore

Azure API Management allows the entire service configuration to be backed up to a storage account.
Either on an ad-hoc basis or on a predefined schedule using Azure Automation, you can back up and restore the configuration to another API Management instance (in another region for DR).

I’ve found that it takes about 15 minutes to restore the configuration, during which you cannot make any other changes to the APIM service or use the publisher/developer portals. This includes any changes you try to apply using PowerShell/Xplat-cli/the API – you will receive an error saying that you cannot make changes whilst the configuration is being applied.

It should be noted that after restoration the usage statistics are not carried across; the target APIM instance’s own usage statistics remain. Backup/restore is really intended for syncing production with other non-production environments (DR, pre-prod, etc.), not for applying release changes to your production environment.

Official MSDN Docs;
Backup-AzureRmApiManagement : https://msdn.microsoft.com/en-us/library/mt619281.aspx
Restore-AzureRmApiManagement : https://msdn.microsoft.com/en-us/library/mt619278.aspx

PowerShell script
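A minimal sketch using the AzureRM cmdlets linked above – the resource group, service and storage names are placeholders:

# Storage context for the account holding the backups
$storage = New-AzureStorageContext -StorageAccountName "mybackupstorage" `
    -StorageAccountKey "<storage-account-key>"

# Back up the primary APIM instance to a blob container
Backup-AzureRmApiManagement -ResourceGroupName "apim-prod-rg" -Name "apim-prod" `
    -StorageContext $storage -TargetContainerName "apim-backups" `
    -TargetBlobName "apim-prod.bak"

# Restore onto the DR instance in another region (takes ~15 minutes, during
# which the target service can't be changed)
Restore-AzureRmApiManagement -ResourceGroupName "apim-dr-rg" -Name "apim-dr" `
    -StorageContext $storage -SourceContainerName "apim-backups" `
    -SourceBlobName "apim-prod.bak"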

Azure Batch – Pool autoscale formulas

Just playing around with some Azure Batch autoscaling formulas.
See the Azure documentation here : https://azure.microsoft.com/en-gb/documentation/articles/batch-automatic-scaling/

Time based – Static

Between 8am and 6pm, Monday – Wednesday (0=Sunday), 5 nodes will be allocated to the pool.
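A sketch of the formula – the hour and weekday checks map directly to the schedule above:

$curTime = time();
$workHours = $curTime.hour >= 8 && $curTime.hour < 18;
$isMonToWed = $curTime.weekday >= 1 && $curTime.weekday <= 3;
$TargetDedicated = ($workHours && $isMonToWed) ? 5 : 0;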

Task based – Dev/Economical

Restrict capacity to 0 unless there is a waiting task, in which case allocate a single node.
This builds in up to 5 minutes of latency before the pool autoscales, not including node boot time and any start tasks the pool has associated. In my case it took up to 12 minutes before the scale-up occurred, originally going over my task scheduler’s coded timeout of 10 minutes.
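Along these lines – GetSample(1) reads the most recent sample of the $ActiveTasks metric, and min() caps the result at one node:

$TargetDedicated = min(1, $ActiveTasks.GetSample(1));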

Combination of task and time

On Monday (8am-6pm) ensure a minimum capacity of 1. When there is a waiting task in the queue, increase the pool capacity by 1 node.
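Roughly, as a sketch:

$curTime = time();
$isMondayWorkHours = $curTime.weekday == 1 && $curTime.hour >= 8 && $curTime.hour < 18;
$minCapacity = $isMondayWorkHours ? 1 : 0;
$TargetDedicated = $minCapacity + min(1, $ActiveTasks.GetSample(1));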

Hard limit

As above, but ensuring that the maximum pool size never exceeds 10.
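Same again, with the whole thing capped by an outer min() (reusing $minCapacity from the previous formula):

$TargetDedicated = min(10, $minCapacity + min(1, $ActiveTasks.GetSample(1)));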

Azure Batch start task – JRE on Windows | Java Runtime Environment

I’ve had to set Azure Batch up for a Java application, on Windows (go figure 🙂 ).

The first thing you need to do is get the Java Offline Installer.
http://java.com/en/download/manual.jsp

To launch the JRE install silently, you’ll want to pass the /s flag. This only works with the offline installer.
Ref: http://docs.oracle.com/javase/8/docs/technotes/guides/install/windows_installer_options.html

Upload the installer to an Azure storage account, and note the public url to it.

Go to your Azure Batch account in the portal and click on the compute pool that you want to add the start task to.
Click resources, and enter the Azure Storage URL of the installer along with a destination path.

NB: Resource file paths are relative. Just enter the name of the file you want created, and it’ll be created as “C:\user\tasks\startup\wd\jre-8u101-windows-x64.exe”
E.g.
Bad | “d:\jre” : “The resource file path is not valid”
Bad | “d:\jre\jre-8u101-windows-x64.exe” : “The resource file path is not valid”
Good | “jre-8u101-windows-x64.exe”

You should choose the options to “Run Elevated” and to “Wait for Success”.
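For the start task command line itself, something along these lines does the job (remembering that /s only works with the offline installer):

cmd /c "jre-8u101-windows-x64.exe /s"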
[Screenshot: Azure Batch start task configuration]

When you add a node to the pool, you’ll find that Java is installed.
[Screenshot: Java installed on a pool node]

Logic Apps – Json Schema Verify

Logic Apps has a really handy trigger: Request/Response.
It provides a public URL that you can POST to, and takes an optional JSON schema. The benefit of providing a JSON schema is purely that the fields defined in the schema are then accessible throughout the Logic App workflow for interaction with other actions.
This is awesome – it really makes the visual aspect of the designer work well. But, being picky, there’s no built-in validation of the JSON request against the schema you provide.

I’ve got a pretty well-defined JSON schema that uses:

  • Required Fields
  • Typed fields (string/integer/etc)
  • Regex patterns for strings

As such, I’m pretty keen to first find out whether the JSON in the request is actually valid. If it’s not, I can return a 400 to the caller pretty quickly and save some valuable cash on my Logic Apps actions (Logic Apps Price Model).

The solution I went for was to wrap the Newtonsoft Json Schema library in an Azure Function.

The first complication is passing two parameters into an Azure Function (the JSON from the request body and the JSON schema). Both are needed in order to perform a schema validation, and the Azure Function can’t reach back out to the Logic App to fetch them. Switch to code view in the Logic App designer and use something like this;
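The property names (“json” and “schema”) are simply what my function expects, and the inline schema here is cut right down for brevity:

"body": "{\"json\": @{triggerBody()}, \"schema\": {\"type\": \"object\", \"required\": [\"tweetId\"]}}",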

Apart from that, the coding of the Azure Function is pretty easy.
It’s on GitHub here: https://github.com/Gordonby/AzFunctions/tree/master/JsonValidate, but here’s a snapshot of the files.
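The repo has the real thing; a minimal run.csx sketch looks roughly like this (Newtonsoft.Json.Schema comes from NuGet, referenced via project.json, and the request shape matches the code view snippet above):

#r "Newtonsoft.Json"

using System.Net;
using System.Collections.Generic;
using Newtonsoft.Json.Linq;
using Newtonsoft.Json.Schema; // from the Newtonsoft.Json.Schema NuGet package (add to project.json)

public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
{
    // Expecting a body of the shape { "json": {...}, "schema": {...} }
    JObject input = JObject.Parse(await req.Content.ReadAsStringAsync());

    JSchema schema = JSchema.Parse(input["schema"].ToString());
    JToken json = JToken.Parse(input["json"].ToString());

    IList<string> errors;
    bool valid = json.IsValid(schema, out errors);

    // Return 200 with any validation errors; the Logic App can branch on "valid"
    return req.CreateResponse(HttpStatusCode.OK, new { valid = valid, errors = errors });
}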

A simple implementation of this can be seen below: a Request trigger which passes the body to my Function App; the Response then just relays the information provided by the output of the function.
[Screenshot: the JsonSchemaValidate Logic App]

Azure API Management Soap Facade

When you’ve got an old web service that needs to be consumed by a mobile app (which loves binding JSON), but you can’t change the code of your web service, you’ve got one real option available: create a façade for the web service.
The main benefit is of course that the mobile app can consume the JSON directly, without conversion. But API Management also gives you the ability to manage multiple APIs, protect them (security/throttling), monitor/monetise them and, more importantly, introduce a caching layer.

Here’s a great guide written by a fellow Softie, which takes you through the whole process with a great explanation.

Using Azure API Management To Convert SOAP API XML to JSON

To take it a little further: if you don’t want to submit a SOAP-esque JSON document, then you need to do a little more policy authoring. My service doesn’t require anything in the request body, just two querystring parameters.

The key parts in the policy below are;

  • Wrap the SOAP envelope in CDATA tags
  • Use tokens in your SOAP packet, and replace them after setting the request body
  • Set the Content-Type header to text/xml after setting the body
  • Use an Event Hub to debug the new body of the request you’re making. This site is really helpful in getting the Event Hub working with APIM
  • Convert the response to JSON
  • Consider replacing some of the response’s SOAP envelope tags and XML namespace tags
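Pulling those together, here’s a cut-down sketch of the policy – the SOAP operation, token names and logger id are all illustrative:

<inbound>
    <base />
    <!-- Build the SOAP envelope; the whole packet is wrapped in CDATA,
         with tokens where the querystring values need to land -->
    <set-body><![CDATA[<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/">
        <soapenv:Body>
            <GetData xmlns="http://tempuri.org/">
                <customerId>CUSTOMERID-TOKEN</customerId>
                <region>REGION-TOKEN</region>
            </GetData>
        </soapenv:Body>
    </soapenv:Envelope>]]></set-body>
    <!-- Replace the tokens with the incoming querystring parameters -->
    <find-and-replace from="CUSTOMERID-TOKEN" to="@(context.Request.Url.Query.GetValueOrDefault("customerId", ""))" />
    <find-and-replace from="REGION-TOKEN" to="@(context.Request.Url.Query.GetValueOrDefault("region", ""))" />
    <!-- Set the content type after setting the body -->
    <set-header name="Content-Type" exists-action="override">
        <value>text/xml</value>
    </set-header>
    <!-- Debug: push the outgoing request body to an Event Hub -->
    <log-to-eventhub logger-id="apim-debug-logger">@(context.Request.Body.As<string>(preserveContent: true))</log-to-eventhub>
</inbound>
<outbound>
    <base />
    <!-- Convert the SOAP XML response to JSON -->
    <xml-to-json kind="direct" apply="always" consider-accept-header="false" />
    <!-- Optionally tidy the envelope/namespace noise out of the JSON -->
    <find-and-replace from="soap:" to="" />
</outbound>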

XML to JSON data manipulation with schemas

Starting with an XML file and then working backwards… old-school data munging at its worst, and the tools/tips that I used.

Creating a XSD schema from XML
In Visual Studio 2015, open the XML file. Find the XML main menu and select Generate XSD.

Creating a JSON schema from XSD
This seems to be a bit of an issue at the moment. There’s an old BizTalk 2013 component that will create a JSON schema of a sort, but not a draft-4 one.
Unfortunately this seems to be best tackled manually at the moment.

Creating a JSON from XML
Here’s my process if you have the original XML.
Convert the XML to JSON. I did this in Visual Studio using Newtonsoft’s JsonConvert (see the sketch below).
Use JSONLint to validate and format it. You might also want to pull any XML header info off the JSON at this point.
Look at the resulting JSON. If you’ve got properties beginning with “@” (they’re XML attributes), then you need to bin those characters off – I found they caused problems later.
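The conversion itself is a one-liner with Newtonsoft; a minimal sketch (the document path is illustrative):

// Minimal XML -> JSON conversion with Newtonsoft.Json
using System;
using System.Xml;
using Newtonsoft.Json;

class XmlToJson
{
    static void Main()
    {
        var doc = new XmlDocument();
        doc.Load("source.xml");

        // The third argument omits the root object; drop it if you want the root kept
        string json = JsonConvert.SerializeXmlNode(doc, Newtonsoft.Json.Formatting.Indented, true);
        Console.WriteLine(json);
    }
}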

Creating a JSON schema from JSON
http://jsonschema.net/#/

Verifying a JSON schema against JSON
http://jsonschemalint.com

Creating a Swagger definition file for a Logic App
http://gordon.byers.me/azure/creating-a-swagger-definition-for-an-azure-logic-apps-http-request-endpoint/

Creating a XSLT Map for an Azure Integration Account
Have a read about the Enterprise Integration Pack, then install the VS2015 extension: https://aka.ms/vsmapsandschemas
Create a BizTalk Integration project, then an Integration Map file, which will allow you to visually map items between two XML schema files.
After building this project, an XSLT document is output into the bin folder.

Testing a Logic App HTTP Request Endpoint
https://chrome.google.com/webstore/detail/postman