Create Kubernetes Cluster in Azure


Azure has two container service offerings: ACS and AKS.
ACS was the first to be released. It gives a choice of orchestrators, but is little more than an ARM template with no management capability; these are some of the issues that AKS addresses. I’m confident that once AKS is Generally Available, ACS will be deprecated. Until that point, however, I like to stay with the GA container service.

I have a shell script that creates my cluster with my optimal “cheapo” settings. It’s probably worth noting that this config is pretty slow and doesn’t stand up well to load tests – but hey, you get what you pay for.

I usually kick this off in the Azure Cloud Shell, and I pass in just one parameter: the name of the resource group.
The reasons for the script are as follows.
1) I want to consistently add tags to my resource group for automation.
2) I use a service principal to access Azure which has a much lower set of permissions. At the point of creation I want it to automatically have Contributor access.
3) I want the cluster to be small, and sized to be cheap.
4) I want the SSH credentials zipped and ready for me to download to other clients to access the cluster. I do this partly so I can easily get away from the Cloud Shell and its aggressive timeouts. It’s probably worth saying that this is a sledgehammer approach; I could just go into the ~/.kube/ directory and copy out the specific kube config file.
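A minimal sketch of those steps, runnable from the PowerShell Cloud Shell, might look like this. The tags, region, VM size and service principal ID below are all illustrative placeholders, not the real values.

# Hedged sketch of the 'cheapo' cluster script; every name here is a placeholder.
param([Parameter(Mandatory)][string]$ResourceGroup)

# 1) Tag the resource group consistently for automation
az group create --name $ResourceGroup --location westeurope --tags owner=gordon purpose=lab

# 2) Grant my low-privilege service principal Contributor on the new group
az role assignment create --assignee "00000000-0000-0000-0000-000000000000" --role Contributor --resource-group $ResourceGroup

# 3) Small, cheap ACS Kubernetes cluster
az acs create --orchestrator-type kubernetes --resource-group $ResourceGroup --name "$ResourceGroup-k8s" --agent-count 1 --agent-vm-size Standard_A1_v2 --generate-ssh-keys

# 4) Zip the SSH/kube credentials, ready to download to other clients
az acs kubernetes get-credentials --resource-group $ResourceGroup --name "$ResourceGroup-k8s"
Compress-Archive -Path "$HOME/.kube/config","$HOME/.ssh/id_rsa*" -DestinationPath "$HOME/$ResourceGroup-creds.zip" -Force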

Hope this proves useful

Azure ARM Template – VM Domain Join Automation


In this post we’re going to look at the ARM template and steps needed to domain join a Windows VM using PowerShell DSC and Azure Automation.

The ARM template will do the following;

  • Create an Automation Account
  • Add an Automation Credential
  • Add an Automation Variable
  • Import a PowerShell module
  • Add a DSC Configuration

We’ll then use a simple PowerShell script to

  • Specify the ConfigData
  • Compile the DSC Node Configuration

Then we’ll head to the portal and tell a VM to use this config.

ARM template

Take particular note of the parameters. They set up the Automation variable that holds the domain name, and the credential for the user account that has permission in the domain to perform the join.

The template can also be found here on GitHub: https://github.com/Gordonby/DscLab/blob/master/AutomationAccountDeploy/AzureDeploy.json

Config compilation script

This PowerShell script will connect to Azure and submit a compilation job for the DSC configuration defined in the ARM template.
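A minimal sketch of the core of that script, using the AzureRM Automation cmdlets (the resource group, account and configuration names are illustrative):

# Hedged sketch: submit a compilation job for the 'DomainJoin' DSC configuration.
Login-AzureRmAccount

$configData = @{
    AllNodes = @(
        @{
            NodeName                    = 'localhost'
            # Azure Automation encrypts the compiled MOF, but DSC still requires
            # this flag when the configuration takes a PSCredential
            PSDscAllowPlainTextPassword = $true
        }
    )
}

Start-AzureRmAutomationDscCompilationJob -ResourceGroupName 'DscLab' `
    -AutomationAccountName 'DscLabAutomation' `
    -ConfigurationName 'DomainJoin' `
    -ConfigurationData $configData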

Assign configuration to existing VM.

Now that the configuration has compiled, it’s ready to be used.
To assign it to a VM that already exists, the quickest way is to use the portal.
Follow the wizard instructions to complete the enrolment, then wait whilst the config is applied.
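If you’d rather script the enrolment than click through it, the same thing can be done in one PowerShell call; a hedged sketch with illustrative names:

# Onboard an existing VM to the Automation DSC pull server and assign
# the compiled node configuration in one step.
Register-AzureRmAutomationDscNode -ResourceGroupName 'DscLab' `
    -AutomationAccountName 'DscLabAutomation' `
    -AzureVMName 'MemberServer01' `
    -NodeConfigurationName 'DomainJoin.localhost' `
    -ConfigurationMode 'ApplyAndAutocorrect'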

Applying VM config during VM ARM template build

If you want to domain join a VM or VM scale set created from an ARM template, you can leverage this script. The three variables that need to be set are below, and a sketch of passing them in follows the list;

  1. automationRegistrationUrl – the registration URL from your Azure Automation account
  2. automationKey – an access key from your Azure Automation account
  3. automationConfiguration – the name of the node configuration to apply
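Rather than hard-coding those values, they can be fetched and passed in at deployment time; a hedged sketch (resource names are illustrative, the parameter names match the list above):

# Pull the registration endpoint/key from the Automation account and hand
# them to the VM template as parameters.
$regInfo = Get-AzureRmAutomationRegistrationInfo -ResourceGroupName 'DscLab' `
    -AutomationAccountName 'DscLabAutomation'

New-AzureRmResourceGroupDeployment -ResourceGroupName 'VmLab' `
    -TemplateFile .\AzureDeploy.json `
    -automationRegistrationUrl $regInfo.Endpoint `
    -automationKey $regInfo.PrimaryKey `
    -automationConfiguration 'DomainJoin.localhost'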

ARM templates – Importing modules from the Azure Automation gallery


In this post we’re going to look at referencing PowerShell modules from the gallery in your own ARM templates. This will save having manual steps to complete after your Azure Automation account has been deployed.
If you are looking to package up your own modules, then this guide is not for you.


Using ARM templates to deploy any kind of infrastructure is foundational when it comes to cloud skills.
As part of an ARM template I’ve been creating, I want to deploy an Azure Automation account, fully configured to be able to domain join VMs. A good first step in creating an ARM template is to manually create the infrastructure in the Azure Portal and copy out the Automation Script before neatening it up. Unfortunately this doesn’t work so well with several of the assets in an Automation account, modules being the first issue.

I tend to make heavy use of modules from the gallery, but it’s a little awkward inserting them into an ARM template. The Automation Script generated in Azure offers you something like this;

Which, if you look at it, seems reasonable. Its type has been set to Microsoft.Automation/automationAccounts/modules and the version looks good. However, this won’t pull the module down from the gallery.
Here’s what you need to do…

Find the Module

For me, my module is v1.1 of xDSCDomainJoin
https://www.powershellgallery.com/items?q=xDSCDomainjoin&x=16&y=18
https://www.powershellgallery.com/packages/xDSCDomainjoin/1.1

Root Template

So on this page – https://www.powershellgallery.com/packages/xDSCDomainjoin/1.1 – is a link to “Deploy to Azure Automation”.
Clicking the link, takes you to the Azure Portal with the template pre-loaded to deploy.
https://portal.azure.com/?microsoft_azure_automation_armTemplateUrl=https%3A%2F%2Fdevopsgallerystorage.blob.core.windows.net%2Farmtemplates%2FxDSCDomainjoin%255C1.1%255CRootTemplate.json

If you extract the TemplateUrl out and URL-decode it, you’re left with;
https://devopsgallerystorage.blob.core.windows.net/armtemplates/xDSCDomainjoin/1.1/RootTemplate.json
When you hit this URL you get prompted to download RootTemplate.zip.

Once it’s downloaded, rename the file’s extension from .zip back to .json.
Here’s what’s inside;

This is a pretty simple template that calls another template at the URL held in this variable;
"templatelink": "[concat('https://devopsgallerystorage.blob.core.windows.net/armtemplates/xDSCDomainjoin/1.1/', parameters('New or existing Automation account'), 'AccountTemplate.json')]",

Because we’re going to be using this in a new Automation account, let’s compose the URL properly;

https://devopsgallerystorage.blob.core.windows.net/armtemplates/xDSCDomainjoin/1.1/NewAccountTemplate.json

As parameters for this template, it passes in the Name and Uri for the xDSCDomainjoin module. We’ll come back for these variables later.

New account template

Follow the URL from above; https://devopsgallerystorage.blob.core.windows.net/armtemplates/xDSCDomainjoin/1.1/NewAccountTemplate.json
You’ll be prompted to download NewAccountTemplate.zip. Again, rename the extension back to .json.
Open the file and you’ll see;

Snipping the bits that matter

It’s simply the module resource that we want to lift into our own Automation account template. Grab that definition, plus the two variables from the previous file, and you have enough to plug into your own ARM template.


"xDSCDomainjoin:1.1.0:Name": "xDSCDomainjoin",
"xDSCDomainjoin:1.1.0:Uri": "https://devopsgallerystorage.blob.core.windows.net/packages/xdscdomainjoin.1.1.0.nupkg"


{
"name": "[parameters('xDSCDomainjoin:1.1.0:Name')]",
"type": "modules",
"apiVersion": "2015-10-31",
"location": "[parameters('accountLocation')]",
"dependsOn": [
"[concat('Microsoft.Automation/automationAccounts/', parameters('accountName'))]"
],
"tags": {},
"properties": {
"contentLink": {
"uri": "[parameters('xDSCDomainjoin:1.1.0:Uri')]"
}
}
}

Azure ARM templates – Example use of Outputs

The Outputs part of an ARM template allows data to be retrieved as part of that template’s deployment.

Given a simple example of creating a vNet, here’s what we get.
We define 3 outputs of differing types:
– Virtual Network [Object]
– Virtual Network Addresses [Array]
– Virtual Network Address Prefix [String] (the first string in the array)

We get this output.

PowerShell script for deploying and retrieving values
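A minimal sketch of the pattern (the template path and output names are illustrative):

# Deploy the template, then index into .Outputs by name.
$deployment = New-AzureRmResourceGroupDeployment -ResourceGroupName 'NetworkLab' `
    -TemplateFile .\vnet.json

$deployment.Outputs['virtualNetwork'].Value              # [Object]
$deployment.Outputs['virtualNetworkAddresses'].Value     # [Array]
$deployment.Outputs['virtualNetworkAddressPrefix'].Value # [String]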

Taking a more practical example, retrieving the static IPs from APIM after creation;

Template

PowerShell output
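The retrieval side follows the same pattern; a hedged sketch, assuming the template defines an output named apimStaticIps:

# Read the APIM static IPs back out of the deployment outputs.
$apim = New-AzureRmResourceGroupDeployment -ResourceGroupName 'ApimLab' -TemplateFile .\apim.json
$apim.Outputs['apimStaticIps'].Value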

Minikube cheat sheet


Some of the common commands needed for a local Minikube install on a Windows 10 Hyper-V setup.

Ref: https://github.com/kubernetes/minikube

install chocolatey
Set-ExecutionPolicy Bypass -Scope Process -Force; iex ((New-Object System.Net.WebClient).DownloadString('https://chocolatey.org/install.ps1'))

install kubectl
choco install kubernetes-cli

start minikube (my hyper-v switch is called vSwitch)
minikube.exe start --kubernetes-version="v1.8.0" --vm-driver="hyperv" --memory=1024 --hyperv-virtual-switch="vSwitch" --v=7 --alsologtostderr

get cluster information
kubectl cluster-info

open k8s web dashboard
minikube dashboard
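A few more stock minikube/kubectl commands worth keeping to hand:

ssh into the minikube VM
minikube ssh

stop or throw away the cluster
minikube stop
minikube delete

list everything running, across all namespaces
kubectl get pods --all-namespaces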

External monitoring of Azure web apps with Statuscake

The problem with monitoring from within the platform your app lives in is that you’re unaware of any connectivity problems that exist outside of your environment. True, you don’t want to be monitoring Internet weather either, but in reality having both monitoring systems is a must.

Statuscake is an endpoint monitoring service that offers a bunch of features around web endpoint monitoring – but I’ll just be looking at the most basic HTTP check, which looks for a 200 response.
*There are many other services available that perform this function – including Pingdom, which I’ve used for years (and which actually monitors this blog).*
When I first found Statuscake, years ago, it offered more monitoring servers in the UK than any other provider, which was the USP. What I liked about it was the API and the service it provides as part of its free tier.


I’ve written several automation scripts against its API and have one blog post here.

Now that we’re happy with endpoint monitoring through Statuscake, here are the PowerShell functions I’ve written to perform a bunch of standard tasks.

So how can we get some monitoring action happening from Azure? Let’s stick with PowerShell and use the Azure cmdlets to query all the Web App endpoints. Once we have the list of Azure addresses, we can query Statuscake to see which ones do not yet have a monitor set up, then add one. That way the PowerShell can be hosted in Azure Automation and run on a schedule.
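Here’s a hedged sketch of that reconciliation loop. The StatusCake endpoints and header names follow its classic v1 API as I recall it, so treat them as illustrative and verify against the current docs:

# Find Azure Web Apps that StatusCake doesn't know about yet, and add a basic HTTP test.
$headers = @{ API = $env:STATUSCAKE_KEY; Username = $env:STATUSCAKE_USER }

# Tests already registered in StatusCake
$existing  = Invoke-RestMethod -Uri 'https://app.statuscake.com/API/Tests/' -Headers $headers
$monitored = $existing | ForEach-Object { $_.WebsiteURL }

foreach ($app in Get-AzureRmWebApp) {
    $url = "https://$($app.DefaultHostName)"
    if ($monitored -notcontains $url) {
        $body = @{ WebsiteName = $app.Name; WebsiteURL = $url; TestType = 'HTTP'; CheckRate = 300 }
        Invoke-RestMethod -Uri 'https://app.statuscake.com/API/Tests/Update' -Method Put -Headers $headers -Body $body
    }
}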

What this gives you for free is a record of availability and performance metrics. It can be extended to alert you when downtime is detected. In short, you’ve got nothing to lose by monitoring all your Azure App Services automatically. In fact, a side benefit is that if your web application’s app pools go to sleep, the monitoring traffic will keep them alive.

PowerShell scripts available on GitHub.

Building an Azure FormFlow Bot – the Job Bot

The Bot Framework is some amazing tech to play with. It has some real practical business applications, but you can also have some fun with it – which is what I’ve done.

There are many different types of bot, but I’ve chosen a Forms bot as the most applicable to the task. It uses FormFlow, and I’ve chosen to use C#, but I could have also used the Node.js template.

The Job Bot.
So I designed this to be the first point of contact for recruiters that contact me about a job. The bot asks the questions I care about, and submits the answers to an Azure Logic App, which is where my level of interest in the job is calculated and then returned to the bot.
There are a lot of different channels that you can have the bot interact on; I’ve focussed solely on Web chat. I’ve then created a basic web page to host it in: job-bot.byers.me

Good points of the demo
So here are the features of the demo that I’ll call out as being useful to you when creating your own bot;
– Defining the form elements explicitly
– Making some form questions dependent on the answers to previous questions
– Calling out to an HTTP web service to submit the answers
– Referencing the environment variables in order to get the URL for the web service

Bad points of the demo
The first couple of real-world outings of the bot weren’t especially smooth. The recruiters were confused. They also couldn’t understand why, after answering the questions, they received such a low score. As a technology exercise it’s been great, but I’m not sure I’m completely happy tormenting recruiters with it.

Source code
It’s all on GitHub, https://github.com/Gordonby/JobBot, with the exception of the Logic App. For that you’ll just want to create a Request/Response Logic App or a Function App, etc.

Try it!
Try the bot here : job-bot.byers.me

Azure Application Gateway – solving 502 errors with httpd

I’ve recently been playing with the Application Gateway in Azure. The use case is pretty simple: serving as a simple load balancer / WAF / DMZ for an application that lives on some RHEL VMs.
To run App Gateway in its simplest configuration, you just have to;
– Create the vNet
– Create a subnet for the App Gateway (something like a /27 should do)
– Create a subnet for the VMs
– Create the Application Gateway
– Create the VMs (just with private IPs)
– Add the VMs to the backend pool
– Add an NSG on the App Gateway subnet to allow ports 80/443, plus 65503-65534 for the health probe
– Add an NSG on the VM subnet to allow port 80 from the App Gateway subnet

However, sometimes these things just don’t work and you need to fault-find. The problem I had was that the gateway kept reporting a 502 error, stating that all the nodes in the backend pool were unhealthy. This is a great article to start with, but it didn’t help my problem.

What I then did was create another subnet and drop a Windows VM jumpbox into it. From here, I could happily browse the Linux VMs in IE, as well as SSH onto them to check configuration from inside the same virtual network.

In the end, I created 6 different VMs with different configurations before the problem became clear: httpd was the problem.

OS                   Web server   Healthy?
Windows Server 2016  IIS          Yes
Ubuntu 17.04         Nginx        Yes
CentOS 7             httpd        No
RHEL 7.3             httpd        No
RHEL 6.7             httpd        No
RHEL 7.3             nginx        Yes

Because all the traffic on the private IPs worked correctly from the jumpbox, it was clear that this wasn’t a firewall or NIC configuration problem. The one consistent factor was httpd. After trying a few different settings changes in httpd.conf, I found the solution.
The default welcome page was causing the probe to fail. By creating an index.html file in /var/www/html/ (even an empty one will do), the health probe began reporting success.

Damn you welcome page!

Azure Logic Apps, tips from the real world

About 2 years ago I wrote some PowerShell as a workflow function that could be used in Azure Automation. The purpose was simple: get a daily email informing me of yesterday’s Azure spend, and then predict my annual charge based on it. I had a colleague ask me about it, and it got me thinking – pretty much the entirety of my script could be replicated in an Azure Logic App. Here’s the script if you’re interested: http://gordon.byers.me/azure/estimating-your-annual-azure-bill/

Translating the PowerShell function into a Logic App wouldn’t really add much value to the end product; I’d still get an email… However, it would be good experience for eating the Logic App dog food and seeing where the capabilities fall short and I have to use some code. It also makes it very easy to customise with other connectors in the Logic Apps library. To that end, this post isn’t so much about Azure billing as about Logic Apps, and the tips and tweaks that you might find useful when considering it for your own workloads.

So, let’s start with the end result, then pick it apart.

  1. The trigger. A schedule: once every day at approx. 9am.
  2. Initialise Variable. Setting an integer variable to the value of my Azure Enterprise enrolment.
  3. Initialise Variable. Setting a string variable to the value of my EA Billing API key. (OK, I’m cheating here, as at the time of writing Logic Apps doesn’t have string variables – but it’s coming soon!)
  4. HTTP call. Doing a GET to the EA billing API, asking for a summary of all usage reports.
  5. Blob create. Persisting the summarised data from the API down to a container for future analysis.
  6. Parse JSON. Looking at what was returned by the HTTP call, and putting a JSON schema over the top to allow the data to be used in further actions.
  7. Parse JSON. Filtering to the last month in the array and putting a JSON schema over the top of it.
  8. Blob create. Persisting the output to blob storage purely as part of the development process, so I can see what’s going on.
  9. HTTP call. Doing a secondary GET to the EA billing API, asking for the detailed usage for the latest month.
  10. Blob create. Persisting the detailed data down to blob storage for future analysis.
  11. Azure Function. At this point I need something to take a CSV file, clean it up a little, do some maths and just return the sum of my usage in a currency unit.
  12. Blob create. Persisting the output to blob storage purely as part of the development process, so I can see what’s going on.
  13. Gmail send. Sending an HTML email with the monthly cost and predicted annual cost.

So as it stands, I managed to get quite far through the process without reaching for Azure Functions. When I did have to use my own code, I created a PowerShell Azure Function, which let me keep all my previously written parsing code…. RESULT! Nothing worse than rewriting something that works.
Here’s how that code looks;
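(What follows is a hedged sketch of the shape rather than the original function: v1 PowerShell Azure Functions read the request from the $req file path and write the response to $res, and the CSV column name here is illustrative.)

# Sum a usage CSV and return the total as the response body.
$requestBody = Get-Content $req -Raw | ConvertFrom-Json

# The EA detail report arrives as CSV text; 'ExtendedCost' is a placeholder column name
$rows  = $requestBody.csv | ConvertFrom-Csv
$total = ($rows | ForEach-Object { [double]$_.ExtendedCost } | Measure-Object -Sum).Sum

Out-File -Encoding Ascii -FilePath $res -InputObject ("{0:N2}" -f $total)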

Tips and Tricks

OK, so it wasn’t all plain sailing writing the Logic App. Here are a few areas where I needed to do more than connect the dots with the visual designer.

Parsing JSON
These are really cool little actions. They let you put a JSON schema on top of any JSON that you’ve taken into the Logic App. Previously you’d have had multiple Logic Apps talking to each other over Request/Response triggers. For ultimate ease, they’ll create the schema for you based on sample JSON data, so make sure you’ve got that to hand. I found the Run History UI in Logic Apps pretty handy for grabbing sample data, as was persisting it down to blob storage and then pulling it out.

Filtering to the last item in an array
Using a Parse JSON action worked well again here, and allowed me to use the LAST function to quickly focus on what I wanted.
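The expression ends up along these lines (the action and property names are illustrative):

@{last(body('Parse Usage Summary')?['Months'])}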


Concatenating implicitly in an HTTP action’s URI field

OK, it didn’t like this much at all. I had to break into code view and use the CONCAT function to combine the base URL with the relative link returned from the HTTP call.
@{concat('https://ea.azure.com/',body('Grab Latest Months Summary')?['LinkToDownloadDetailReport'])}

Wrapping the output in JSON
An easy one, but worth mentioning: when passing data around between functions and APIs, you’ll often have to start wrapping it in JSON.
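In the PowerShell function, that amounts to something like this (the property name is illustrative):

# Wrap the value in a JSON object before writing the response
$payload = @{ total = $total } | ConvertTo-Json
Out-File -Encoding Ascii -FilePath $res -InputObject $payload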

Type conversion and maths
My PowerShell function returns a string, which I then need to convert to a number and do some basic maths on.
I used MUL to perform the multiplication, and FLOAT to convert the string to a float that I could feed into the MUL.
You spent £@{body('Parse-Azure-EA-Detail-CSV')} yesterday.
If every day was like yesterday, your 12 monthly Azure Consumption would be £@{mul(365,float(body('Parse-Azure-EA-Detail-CSV')))} 

I’ll be creating an ARM template for this and putting it on GitHub soon, so if you want a daily email with your Azure spend, you’ll be able to set it up quickly and easily.

Dealing with code view quotes in an Azure Logic App | JSON

The visual designer in Logic Apps is pretty good, but all too often you need to break out into the code view. For me, this is usually because I want to construct some JSON to pass into an Azure Function for processing. What’s not entirely obvious is the correct way to deal with double quotes inside the code view. As ever, when I google something and find nothing, I’ll share a blog post about what the answer was – and this is no exception 🙂

The correct escape character to use is a backslash.

This snippet shows the final version of what I’m doing. In this case I’m writing the JSON out to blob storage (as a debug step) before sending it to an Azure Function for basic JSON validation, and then handing the data onward to Power BI for visualisation.

And here’s the text;
"body": "{\"tweetId\": @{triggerBody()?['TweetId']}, \"tweet\": @{triggerBody()}, \"cogSentiment\": @{body('Detect_Sentiment')}, \"cogKeyPhrases\": @{body('Key_Phrases')}}",