External monitoring of Azure web apps with Statuscake

The problem with monitoring from within the platform your app lives in is that you’re unaware of any connectivity problems that exist outside of your environment. True, you don’t want to be monitoring Internet weather either, but in reality having both kinds of monitoring is a must.

Statuscake is an endpoint monitoring service that offers a bunch of features around web endpoint monitoring – but I’ll just be looking at the most basic HTTP check, which looks for a 200 response.
*There are many other services available that perform this function – including Pingdom which I’ve used for years (and actually monitors this blog).*
When I first found Statuscake, years ago, it offered more monitoring servers in the UK than any other provider – that was its USP for me. What I also liked was the API and the service it provides as part of its free tier.


I’ve written several automation scripts against its API and have one blog post about them here.

Now that we’re happy with endpoint monitoring through Statuscake, here are the PowerShell functions I’ve written to perform a bunch of standard tasks.

So how can we get some monitoring action happening from Azure? Let’s stick to PowerShell and start using the Azure cmdlets to query all the Web App endpoints. Once we have the list of Azure addresses, we can query Statuscake to see which ones do not yet have a monitor set up, then add one. We can then host the PowerShell in Azure Automation to run on a schedule.
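Here’s a minimal sketch of that loop, assuming the classic Statuscake v1 API (Username/API key headers) and the AzureRm cmdlets of the time – the endpoint and field names are worth double-checking against the current API docs;

# Sketch: add a basic HTTP check for any Azure Web App that Statuscake isn't monitoring yet.
# $statusCakeUser and $statusCakeKey are placeholders for your own credentials.
$headers = @{ Username = $statusCakeUser; API = $statusCakeKey }

# Existing Statuscake tests
$tests = Invoke-RestMethod -Uri 'https://app.statuscake.com/API/Tests/' -Headers $headers

# All Web Apps in the current subscription (matching on name for simplicity)
foreach ($app in Get-AzureRmWebApp) {
    if (-not ($tests | Where-Object { $_.WebsiteName -eq $app.Name })) {
        # No monitor yet - create a basic HTTP check at a 5 minute rate
        $body = @{ WebsiteName = $app.Name; WebsiteURL = "https://$($app.DefaultHostName)"; TestType = 'HTTP'; CheckRate = 300 }
        Invoke-RestMethod -Uri 'https://app.statuscake.com/API/Tests/Update' -Method Put -Headers $headers -Body $body
    }
}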

What this gives you for free is a record of availability and performance metrics, and it can be extended to alert you on detected downtime. In short, you’ve got nothing to lose by monitoring all your Azure App Services automatically. A side benefit is that if your Web Application’s app pools go to sleep, the monitoring will keep them alive.

The PowerShell scripts are available on GitHub.

Building an Azure FormFlow Bot – the Job Bot

The Bot framework is some amazing tech to play with. It has some real practical business applications, but you can also have some fun with it – which is what I’ve done.

There are many different types of bot, but I’ve chosen a Forms Bot as the most applicable to the task. It uses FormFlow and I’ve chosen C#, but I could have also used the Node.js template.

The Job Bot.
I designed this to be the first point of contact for recruiters who contact me about a job. The bot asks the questions I care about, and submits the answers to an Azure Logic App, which calculates how interested I’m likely to be in the job and returns the result to the bot.
There are a lot of different channels that the bot can interact on; I’ve focussed solely on Web Chat. I’ve then created a basic web page to host it in: job-bot.byers.me

Good points of the demo
So here are the features of the demo that I’ll call out as being useful when creating your own bot;
– Defining the form elements explicitly
– Making some form questions dependent on the answers to previous questions
– Calling out to an HTTP web service to submit the answers to
– Referencing environment variables in order to get the URL for the web service

Bad points of the demo
The first couple of real-world outings of the bot weren’t especially smooth. The recruiters were confused. They also couldn’t understand why, after answering the questions, they received such a low score. As a technology exercise it’s been great, but I’m not sure I’m completely happy tormenting recruiters with it.

Source code
It’s all on GitHub, https://github.com/Gordonby/JobBot, with the exception of the Logic App. For that you’ll just want to create a Request/Response Logic App or a Function App, etc.

Try it!
Try the bot here: job-bot.byers.me

Azure Application Gateway – solving 502 errors with httpd

I’ve recently been playing with the Application Gateway in Azure. The use case is pretty simple: acting as a load balancer / WAF / DMZ for an application that lives on some RHEL VMs.
To run App Gateway in its simplest configuration, you just have to;
– Create the VNet
– Create a subnet for the App Gateway (something like a /27 should do)
– Create a subnet for the VMs
– Create the Application Gateway
– Create the VMs (just with private IPs)
– Add the VMs to the backend pool
– Add an App Gateway NSG to allow ports 80/443 and 65503-65534 for the health probe
– Add a VM subnet NSG to allow port 80 from the App Gateway subnet, as sketched below.
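As a rough sketch of the App Gateway subnet NSG using the AzureRm cmdlets (names, priorities, resource group and location are illustrative);

# Allow the port range Application Gateway uses for backend health probes
$probes = New-AzureRmNetworkSecurityRuleConfig -Name 'Allow-AppGw-Probes' -Protocol Tcp `
    -Direction Inbound -Priority 100 -Access Allow -SourceAddressPrefix Internet `
    -SourcePortRange * -DestinationAddressPrefix * -DestinationPortRange '65503-65534'

# Allow web traffic (newer AzureRm releases accept a port list; otherwise create one rule per port)
$web = New-AzureRmNetworkSecurityRuleConfig -Name 'Allow-Web' -Protocol Tcp `
    -Direction Inbound -Priority 110 -Access Allow -SourceAddressPrefix Internet `
    -SourcePortRange * -DestinationAddressPrefix * -DestinationPortRange 80,443

New-AzureRmNetworkSecurityGroup -Name 'appgw-nsg' -ResourceGroupName 'appgw-rg' `
    -Location 'westeurope' -SecurityRules $probes, $web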

However, sometimes these things just don’t work and you need to fault-find. The problem I had was that the gateway kept reporting a 502 error, and stating that all the nodes in the backend pool were unhealthy. This is a great article to start with, but it didn’t help with my problem.

What I then did was create another subnet and drop a Windows VM jumpbox into it. From there, I could browse to the Linux VMs in IE, as well as SSH onto them to check the configuration from inside the same virtual network.

In the end, I created six different VMs with different configurations before the problem became clear. Httpd was the problem.

OS                  | Web server | Healthy?
Windows Server 2016 | IIS        | Yes
Ubuntu 17.04        | Nginx      | Yes
CentOS 7            | httpd      | No
RHEL 7.3            | httpd      | No
RHEL 6.7            | httpd      | No
RHEL 7.3            | nginx      | Yes

Because all the traffic on the private IPs worked correctly from the jumpbox, it was clear that this wasn’t a firewall or NIC configuration problem. The one consistent factor was httpd. After trying a few different settings changes in httpd.conf, I found the solution.
The default welcome page was causing the probe to fail. By creating an index.html file in /var/www/html/, the health probe began reporting success.

Damn you welcome page!

Azure Logic Apps, tips from the real world

About 2 years ago I wrote some PowerShell as a workflow function that could be used in Azure Automation. The purpose was simple: get a daily email informing me of yesterday’s Azure spend, and predict my annual charge based on it. A colleague asked me about it, and it got me thinking – pretty much the entirety of my script could be replicated in an Azure Logic App. Here’s the script if you’re interested: http://gordon.byers.me/azure/estimating-your-annual-azure-bill/

Translating the PowerShell function into a Logic App wouldn’t really add much value to the end product – I’d still get an email. However, it would be good experience for eating the Logic Apps dog food and seeing where the capabilities fall short and I have to use some code. It also makes it very easy to customise with other connectors in the Logic Apps library. To that end, this post isn’t so much about Azure billing as about Logic Apps and the tips and tweaks that you might find useful when considering it for your own workloads.

So, let’s start with the end result, then pick it apart.

  1. The Trigger. A schedule. Once every day at approx. 9am.
  2. Initialise Variable. Setting an integer variable to the value for my Azure Enterprise Enrolment.
  3. Initialise Variable. Setting a string variable to the value of my EA Billing API key. (OK, I’m cheating here – at the time, Logic Apps doesn’t have string variables, but it’s coming soon!)
  4. Http call. Doing a GET to the EA billing api and asking for a summary of all usage reports.
  5. Blob create. Persisting the Summarised data from the api down to a container for future analysis
  6. Parse Json. Looking at what was returned by the Http call, and putting a Json Schema over the top to allow me to use this data throughout the rest of the Logic App.
  7. Parse Json. Filtering to the last month in the Array and putting a Json Schema over the top of it
  8. Blob create. Persisting the output to blob storage purely as part of the development process, so I can see what’s going on
  9. Http call. Doing a secondary GET to the EA billing api and asking for the detailed usage for the latest month
  10. Blob create. Persisting the detailed data down to blob storage for future analysis
  11. Azure function. At this point I need something to take a CSV file, clean it up a little, do some maths and just return me the sum of my usage in a currency unit
  12. Blob create. Persisting the output to blob storage purely as part of the development process, so I can see what’s going on
  13. Gmail send. Sending an HTML email with the monthly cost and predicted cost

So as it stands I managed to get quite far through the process without reaching out to Azure Functions. When I did have to use my own code, I created a PowerShell Azure Function, which let me keep all my previously written parsing code…. RESULT! Nothing worse than rewriting something that works.
Here’s roughly how that code looks – the original isn’t reproduced here, so this is a minimal sketch assuming the v1 experimental PowerShell Functions conventions ($req and $res file paths);
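# Sum the cost column of the EA detail CSV posted in the request body.
# 'ExtendedCost' is an assumed column name from the EA detail report format.
$csvText = Get-Content $req -Raw

# The detail report carries a few non-CSV preamble lines; keep only comma-separated rows
$rows = ($csvText -split "`r?`n") | Where-Object { $_ -match ',' } | ConvertFrom-Csv

$total = ($rows | Measure-Object -Property ExtendedCost -Sum).Sum

# Return the rounded figure as the function response
Out-File -Encoding Ascii -FilePath $res -InputObject ([math]::Round($total, 2))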

Tips and Tricks

Ok, so it wasn’t all plain sailing writing the Logic App. Here are a few areas where I needed to do more than connect the dots with the visual designer.

Parsing Json
These are really cool little actions. They let you put a JSON schema on top of any JSON that you’ve brought into the Logic App; previously you’d have had multiple Logic Apps talking to each other over Request/Response triggers to achieve this. For ultimate ease, they’ll create the schema for you based on sample JSON data, so make sure you’ve got some to hand. I found the Run History UI in Logic Apps pretty handy for this, as was persisting the data down to blob storage and grabbing it out from there.

Filtering to the last item in an array
Using a Json Parse action worked well again here, and allowed me to use the LAST function to quickly focus on what I wanted.
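As an illustration (the action and property names here are hypothetical), the expression ends up looking something like;
@{last(body('Parse_Usage_Summary')?['Months'])}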


Concatenating implicitly in an Http action’s URI field

Ok, it didn’t like this much at all. I had to break into code mode and use the CONCAT function to combine the base URL with the relative path returned from the HTTP call.
@{concat('https://ea.azure.com/',body('Grab Latest Months Summary')?['LinkToDownloadDetailReport'])}

Wrapping the output in Json
An easy one, but worth mentioning: when passing data around between functions and APIs you’ll often have to start wrapping it in JSON.
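The screenshot isn’t reproduced here, but the PowerShell side looks something like this hedged sketch (the property name is illustrative);

# Wrap the scalar result in a JSON object before handing it back to the Logic App
$json = @{ dailySpend = $total } | ConvertTo-Json
Out-File -Encoding Ascii -FilePath $res -InputObject $json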

Type conversion and maths
My PowerShell function returns a string, which I then need to convert to a number and do some basic maths on.
I used MUL to perform a multiplication, and FLOAT to convert the string to a float that I could do the MUL on.
You spent £@{body('Parse-Azure-EA-Detail-CSV')} yesterday.
If every day was like yesterday, your 12 monthly Azure Consumption would be £@{mul(365,float(body('Parse-Azure-EA-Detail-CSV')))} 

I’ll be creating an ARM template for this and putting it on GitHub soon, so if you want a daily email with your Azure spend you’ll be able to set it up quickly and easily.

Dealing with code view quotes in an Azure Logic App | JSON

The visual designer in Logic Apps is pretty good, but all too often you need to break out to the code view. For me, this is usually because I want to construct some JSON to pass into an Azure Function for processing. What’s not entirely obvious is the correct way to deal with double quotes inside the code view. As ever, when I google something and find nothing, I’ll share a blog post about what the answer was – and this is no exception 🙂

The correct escape character to use is a backslash.

This snippet shows the final version of what I’m doing. In this case I’m writing the JSON out to blob storage (as a debug step), before sending it to an Azure Function for basic JSON validation and then passing the data on to Power BI for visualisation.

And here’s the text;
"body": "{\"tweetId\": @{triggerBody()?['TweetId']}, \"tweet\": @{triggerBody()}, \"cogSentiment\": @{body('Detect_Sentiment')}, \"cogKeyPhrases\": @{body('Key_Phrases')}}",

Azure PowerShell on Linux on Windows

In the Windows 10 Anniversary update you’re able to install the “Windows Subsystem for Linux”, see the Bash on Ubuntu on Windows blog.

Then in August we announced that PowerShell had been open-sourced and was available on Linux: https://azure.microsoft.com/en-us/blog/powershell-is-open-sourced-and-is-available-on-linux/.

The version of Ubuntu that gets installed is 14.04, which is supported by PowerShell. So obviously the first thing you’ll want to do on your Windows 10 OS is install Bash, then PowerShell, and then the Azure module for PowerShell – if only for a change of scenery and a bit of script-inception.

Installing Azure PowerShell on Linux, in Windows

DR environment for Azure Api Management | backup and restore

Azure API Management allows the entire service configuration to be backed up to a storage account.
Either on an ad-hoc basis or on a predefined schedule using Azure Automation, you can back up and restore the configuration to another API Management instance (in another region for DR).

I’ve found that it takes about 15 minutes to restore the configuration, during which you cannot make any other changes to the APIM service, or use the Publisher/Developer portals. This includes any changes you try to apply using PowerShell/Xplat-CLI/the API – you’ll receive an error saying that you cannot make changes whilst the configuration is being applied.

It should be noted that usage statistics are not carried across in a restore; the target APIM instance’s own usage statistics remain. The backup/restore capability is really intended for syncing Production with other non-Production environments (DR, Pre-prod, etc.), not for applying release changes to your production environment.

Official MSDN Docs;
Backup-AzureRmApiManagement : https://msdn.microsoft.com/en-us/library/mt619281.aspx
Restore-AzureRmApiManagement : https://msdn.microsoft.com/en-us/library/mt619278.aspx

PowerShell script
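The original script isn’t reproduced here, but a minimal sketch of the pattern, using the cmdlets documented above (resource names and keys are placeholders), looks like this;

# Storage context for the backup container
$ctx = New-AzureStorageContext -StorageAccountName 'apimbackups' -StorageAccountKey $storageKey

# Back up the production APIM configuration to blob storage
Backup-AzureRmApiManagement -ResourceGroupName 'apim-prod-rg' -Name 'apim-prod' `
    -StorageContext $ctx -TargetContainerName 'backups' -TargetBlobName 'apim-prod.apimbackup'

# Restore onto the DR instance in another region (expect ~15 minutes)
Restore-AzureRmApiManagement -ResourceGroupName 'apim-dr-rg' -Name 'apim-dr' `
    -StorageContext $ctx -SourceContainerName 'backups' -SourceBlobName 'apim-prod.apimbackup'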

Azure Batch – Pool Auto scale formulas

Just playing around with some Azure Batch Auto-scaling formulas.
See the Azure documentation here : https://azure.microsoft.com/en-gb/documentation/articles/batch-automatic-scaling/

Time based – Static

Between 8am and 6pm, Monday – Wednesday (0=Sunday) 5 nodes will be allocated to the pool.
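A sketch of such a formula, using the $TargetDedicated variable of the era (the weekday numbering follows the 0=Sunday convention above; hour boundaries are worth verifying against the docs);

$curTime = time();
$workHours = $curTime.hour >= 8 && $curTime.hour < 18;
$isMonToWed = $curTime.weekday >= 1 && $curTime.weekday <= 3;
$TargetDedicated = ($workHours && $isMonToWed) ? 5 : 0;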

Task based – Dev/Economical

Restrict capacity to 0 unless there is a waiting task – then allocates an additional 1 node.
This can build up to 5 minutes of latency before the pool autoscales, not including node boot time and any pre-tasks the pool has associated. In my case it took up to 12 minutes before the scale occurred, originally going over my coded task scheduler’s timeout of 10 minutes.
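A hedged sketch of that formula (the sampling window is illustrative);

$pending = $PendingTasks.GetSample(1);
$TargetDedicated = ($pending > 0) ? 1 : 0;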

Combination of task and time

On Monday (8am-6pm) ensure a minimum capacity of 1. When there is a waiting task in the queue, increase the pool capacity by 1 node.
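Something along these lines (a sketch; $CurrentDedicated is the service-defined variable for the pool’s current size);

$curTime = time();
$mondayWorkHours = $curTime.weekday == 1 && $curTime.hour >= 8 && $curTime.hour < 18;
$minimum = $mondayWorkHours ? 1 : 0;
$pending = $PendingTasks.GetSample(1);
$TargetDedicated = max($minimum, ($pending > 0) ? $CurrentDedicated + 1 : 0);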

Hard limit

As above, but ensuring that the maximum pool size never exceeds 10.
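In other words, the previous formula with a final clamp, e.g.;

$TargetDedicated = min(10, max($minimum, ($pending > 0) ? $CurrentDedicated + 1 : 0));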

Azure Batch Start task – JRE windows | Java Runtime Environment

I’ve had to set Azure Batch up for a Java application, on Windows (go figure 🙂 ).

The first thing you need to do is get the Java Offline Installer.
http://java.com/en/download/manual.jsp

To launch the JRE install silently, you’ll want to pass the /s flag. This only works with the offline installer.
Ref: http://docs.oracle.com/javase/8/docs/technotes/guides/install/windows_installer_options.html

Upload the installer to an Azure storage account, and note the public url to it.

Go to your Azure Batch account in the portal, click on the compute pool that you want to add the start task for.
Click resources, and enter the Azure Storage URL of the installer along with a destination path.

NB: Resource paths are relative. Just enter the name of the file you want created, and it’ll be created here: “C:\user\tasks\startup\wd\jre-8u101-windows-x64.exe”
EG;
Bad | “d:\jre” : “The resource file path is not valid”
Bad | “d:\jre\jre-8u101-windows-x64.exe” : “The resource file path is not valid”
Good | “jre-8u101-windows-x64.exe”

You should choose the options to “Run Elevated” and to “Wait for Success”.
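The start task’s command line is then just the silent install, along these lines (the file name matches whatever you uploaded as a resource);

cmd /c jre-8u101-windows-x64.exe /s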

When you add a node to the pool, you’ll find that Java is installed.

Logic Apps – Json Schema Verify

Logic Apps has a really handy trigger: Request/Response.
It provides a public URL that you can post to, and takes an optional JSON schema. The benefit of providing a JSON schema is purely that the fields defined in the schema are then accessible throughout the Logic App workflow for interaction with other actions.
This is awesome – it really makes the visual aspect of the designer work well – but being picky, there’s no built-in validation of the JSON request against the schema you provide.

I’ve got a pretty well-defined JSON schema that uses

  • Required Fields
  • Typed fields (string/integer/etc)
  • Regex patterns for strings

As such I’m pretty keen to first find out if the JSON in the request is actually valid. If it’s not, I can return a 400 pretty quickly to the caller and save some valuable cash on my Logic Apps actions (Logic Apps Price Model).

The solution I went for was to wrap the Newtonsoft Json Schema library in an Azure Function.

The first complication is passing two parameters to an Azure Function (the JSON from the request body and the JSON schema). Both are needed in order to perform a schema validation, and the Azure Function can’t reach back out to Logic Apps. Switch to Code View in the Logic App designer and use something like the following (a hedged reconstruction – the variable and property names are illustrative);
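"body": {
  "json": "@triggerBody()",
  "schema": "@variables('requestSchema')"
}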

Apart from that, the coding of the Azure function is pretty easy.
It’s all in GitHub here: https://github.com/Gordonby/AzFunctions/tree/master/JsonValidate.

A simple implementation of this is a Request trigger which passes the body to my Function App, with the Response then just relaying the information provided by the output of the Function App.