Getting started with domain management and DNS in Azure

One of the features in Azure that I hadn't used until recently is DNS zone management for your own domain. It's easy to use, and crucially it allows a better degree of configuration than the registrar I'd used for years.

  • Changing the Time To Live (TTL) of specific DNS entries.
  • It's API-accessible, which means much better integration with automation scripts.
  • The cost of the domain comes out of your Azure bill, which is pretty convenient for me.

You can see some of the other features here: https://azure.microsoft.com/en-gb/blog/app-service-domain/

Domain registration

I registered Azdemo.co.uk, and it took about 10 minutes before it was ready to use. You can find the feature under “App Service Domains”, although the naming can be a little confusing because you don’t need to use them just with App Service.

Automatic DNS management for the domain

The DNS zone for the domain was created automatically, with Azure set as the default name server to provide DNS management. It also makes custom domain assignment much faster in App Service, because you don't have to perform the usual validation steps.

DNS records can then be added with PowerShell, e.g.
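For example, adding a CNAME record looks something like this (a sketch using the AzureRM PowerShell module; the zone, resource group, record name and TTL are all illustrative):

```powershell
# Illustrative sketch: add a CNAME record to the zone
New-AzureRmDnsRecordSet -Name "www" -RecordType CNAME `
  -ZoneName "azdemo.co.uk" -ResourceGroupName "dns-rg" -Ttl 300 `
  -DnsRecords (New-AzureRmDnsRecordConfig -Cname "azdemo.azurewebsites.net")
```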

SSL Certificates

The next logical step is to deal with SSL certificates for your domain and subdomains. You can either buy your SSL certificate through the Azure portal
https://docs.microsoft.com/en-gb/azure/app-service/web-sites-purchase-ssl-web-site or you can bring your own certificate. My personal preference is to use a free CA such as https://letsencrypt.org/; I'll cover how I use Let's Encrypt in my next blog post.

Application Gateway with Public facing Web Apps

Azure Application Gateway

The most common use of Application Gateway is to expose web sites running on VMs. I'm going to walk through configuring an existing Application Gateway to target a Web App running on the public Azure App Service, and then securing the Web App to accept traffic only from the Application Gateway. By putting an Application Gateway in front of your website, you can make use of the Web Application Firewall it provides.

Let's start by creating the Web App. I'm using a standard ASP.NET Web Forms app from Visual Studio, and have tweaked the main page to output all of the HTTP headers onto the default web page so I can see what's going on. Here's the code needed to do that.
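I won't reproduce my exact page, but the idea is roughly this (a Web Forms code-behind sketch; names are illustrative):

```csharp
// Sketch: write every request header onto the page during Page_Load
protected void Page_Load(object sender, EventArgs e)
{
    foreach (string key in Request.Headers.AllKeys)
    {
        // HtmlEncode so header values can't inject markup into the page
        Response.Write(Server.HtmlEncode(key + ": " + Request.Headers[key]) + "<br/>");
    }
}
```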

Once I've published it to Azure Web Apps, I now need to add my custom domains. This is pretty easy – with your domain registrar, you add a CNAME entry for the right subdomain pointing to the FQDN of your web app, e.g. WestEurope4.byers.me maps to WestEurope4.azurewebsites.net.

After you’ve done this for all the subdomains you want, you come back to the Azure Portal, Verify each domain and Add each hostname in the Custom Domains section of your Web App.

At this point we're ready to configure/create the Application Gateway. You can find several scripts here: https://github.com/Gordonby/Snippets/tree/master/Powershell/AppGatewayForPublicWebApps
I recommend the CreateAppGW-ForWebApp-v2 script because it creates the Application Gateway from scratch, which rules out any configuration mistakes/issues you may have on an existing gateway. However, there is also a script for when you have an existing gateway, which I've used a couple of times.
NOTE: If you’re using a script that creates the Application Gateway, it will take over 30 minutes to provision.

After the Application Gateway has been created, you can flip the DNS CNAME(s) to target the Application Gateway instead of the Web App directly. You should use the cloudapp.net address associated with the Application Gateway's IP address. For me, this is f7a6e9e7-be60-4c41-b5a5-d211f5f56a91.cloudapp.net.

Once you have made the DNS changes, and they have propagated through the various DNS servers and your local cache, you'll get a page that looks like this.

That just leaves locking down the Web App so it only accepts traffic that originated from the Application Gateway.
Azure App Service Static IP Restrictions
https://docs.microsoft.com/en-us/azure/app-service/app-service-ip-restrictions

After configuring the IP restriction with the IP address of your Application Gateway, navigating to your Web App directly will tell you:

This just leaves one remaining gotcha: Application Gateway only supports dynamic public IP addresses. This of course means that should your Application Gateway change IP (not an expected operation, but still possible), you will need to adjust your Web App's IP restriction configuration.

New-AzureRmApplicationGateway : FrontendIpConfiguration /providers/Microsoft.Network/applicationGateways/AppGwForWebAppsStatic/frontendIPConfigurations/fipconfig01 of Application Gateway with SKU tier WAF can only reference a PublicIPAddress with IpAllocationMethod as Dynamic.

Application Gateway will support a Static IP address soon, but as of the time of writing – there is only a private preview programme open for it. https://docs.microsoft.com/en-us/azure/application-gateway/create-zone-redundant

Yet another rehost of my blog

I'm putting this post up partly to act as a line in the sand, to make sure I'm looking at the right version of my blog. Yes, I've re-platformed it again. Not off WordPress, though – I like it for all its foibles.

It did give me an excuse to think about all the different places I’ve hosted it over the years.

1. VPS, costing me about $20/year for multiple sites. I installed WordPress using the Plesk dashboard that the Hosting Provider had given me.
2. Azure App Service. Leveraging an old marketplace template that no longer exists.
3. Azure App Service and SQL Server. Installed using Project Nami.
4. Azure Container Service (AKS, K8S 1.7.7-1.8.2). Installed using Helm.
5. Azure Container Service (AKS, K8S 1.9.6). Installed using Helm.

The reason for the latest rehost is that my AKS Cluster kept on entering a failed state. I was using version 1.8.2 of Kubernetes, and because of the errors was unable to upgrade.

The latest AKS Cluster is using version 1.9.6 of Kubernetes. The process for switching took but a few minutes since I use UpdatePlus for backups, and an Azure Traffic Manager for routing.

Scheduled JSON posts in Linux using cron and du – to Logic Apps and Power BI

This post digs into getting regular disk-space data out of Linux and sending it to an Azure Logic App for submission into Power BI for visualisation.

So in my last post, I set up an rsync server to take a load of files from my "on-premises" storage. I have a reasonable internet connection, but I'm impatient and was always checking progress by logging into my rsync server and checking directory sizes with the du command.
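The command I kept running was along these lines (the target path is illustrative; I default to the current directory here so the example is runnable anywhere):

```shell
# Report sizes of the top-level directories under the backup target
# (BACKUP_DIR is illustrative; defaults to the current directory)
du -h --max-depth=1 "${BACKUP_DIR:-.}"
```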

-h gives you a human-readable size, and --max-depth=1 only reports on the top-level directories.

du output to JSON

So the first step is to take the output I like and get it into a better format.
Here's what I came up with. It's a little hacky: to create valid JSON I put an empty object at the end of the array, which avoids messing around with removing the last comma.
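The original script isn't reproduced here, so this is a sketch of the approach (the field names and default path are my own invention): loop over the `du` output, emit one JSON object per directory, and close the array with an empty object so the trailing comma stays valid.

```shell
#!/bin/sh
# Convert `du` output into a JSON array. The empty object at the end of the
# array keeps the last comma valid without any last-comma removal logic.
TARGET="${BACKUP_DIR:-.}"
echo '['
du --max-depth=1 "$TARGET" | while read -r kb path; do
  printf '  {"sizeKb": %s, "path": "%s"},\n' "$kb" "$path"
done
echo '  {}'
echo ']'
```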

This outputs valid JSON, which I can then curl out to a waiting Logic App, Power BI dataset, or any other web service.

Logic App

So I chose to post the data to an Azure Logic App. This way I can take advantage of its pre-built connectors for email/Power BI whenever I change my mind about what to do with the data.

I new up a Logic App in Azure, choose the HTTP Request/Response trigger template and paste in the JSON – it creates the schema for me and I'm ready to go.

logic app new

Curling the JSON

So now that I've got a URL from Logic Apps to post to, I can create the curl command on my Linux box.
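It looks roughly like this (`du-to-json.sh` is my name for the script above, and the URL stands in for the long SAS-signed callback URL the Logic App designer gives you):

```shell
# POST the generated JSON to the Logic App's HTTP trigger
./du-to-json.sh | curl -s -X POST \
  -H "Content-Type: application/json" \
  --data-binary @- \
  "https://prod-00.westeurope.logic.azure.com/workflows/<id>/triggers/manual/paths/invoke?<sas>"
```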

Let's see if that worked by checking the Logic App run history.
logic-app-history

Scheduling the curl

OK, so it all works as planned. Let's get this reporting the data regularly.

First, let's put the code in a shell script file.
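Something like this (filenames are illustrative, and `LOGIC_APP_URL` is assumed to hold the Logic App callback URL):

```shell
# Wrap the pipeline in a script; $HOME keeps the example portable
cat > "$HOME/report-disk.sh" <<'EOF'
#!/bin/sh
# du-to-json.sh is the JSON-producing script from earlier;
# LOGIC_APP_URL should be set to the Logic App's callback URL
"$HOME/du-to-json.sh" | curl -s -X POST -H "Content-Type: application/json" --data-binary @- "$LOGIC_APP_URL"
EOF
chmod +x "$HOME/report-disk.sh"
```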

Then let's schedule it for every 30 minutes.
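The crontab entry for a 30-minute schedule looks like this (add it with `crontab -e`; the path is illustrative):

```
*/30 * * * * /home/user1/report-disk.sh
```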

Logic App run history, oh good it’s working every 30mins 🙂

Developing the Logic App

So up until this point the Logic App contains one action: a trigger that receives the data and does nothing with it.
Let's send the data over to Power BI so I can quickly check it from my mobile app when I'm interested.

First up, head to Power BI, click on a workspace and then add a dataset. I'm going to use a streaming dataset, with the API model.
You provide the field names and that's it.

Next, we add a few more actions to the Logic App:
– Filter Array (so we're just working with the total-size item)
– Add rows to a Power BI dataset
– Use a simple calculation to work out the GB size from the KB size provided in the request
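For that last step, a workflow expression along these lines does the conversion (assuming the posted JSON names the field `sizeKb`, which is my own naming; 1048576 is 1024²):

```
div(float(item()?['sizeKb']), 1048576)
```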

PowerBi Reports

So, once we have data going into a Power BI hybrid streaming dataset, we can craft some reports.

Streaming dashboard

Data over time

Troubleshooting rsync with ReadyNAS, Azure and CentOS

Years ago I bought a couple of Netgear ReadyNAS devices: a Duo, and subsequently a Duo v2. They're pretty basic, but offered good Squeezebox support and a cheap way of storing terabytes of data in a RAID config.

Both of the ReadyNAS devices support backup operations to send their data off on a scheduled basis. I'd normally opt for the simplicity of CIFS/Samba, but my internet provider has decided to block those ports and the ReadyNAS devices don't allow you to use a non-standard port. Thus the only other way to get the job done is to use rsync.

My desired location for the backup is Azure (naturally!) – ideally Azure Files, as the data will be most accessible to me over an SMB share, the same way I've always accessed my ReadyNAS devices.

Here's a rundown of the errors I received when doing this, and how to get around them.

rsync: getaddrinfo: myserver.northeurope.cloudapp.azure.com 873: Name or service not known

It turned out that my rsync daemon wasn’t listening correctly.

Check with
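For example (assuming a modern Linux with `ss`; 873 is rsync's default daemon port):

```shell
# Is anything listening on the rsync daemon port?
ss -tln | grep ':873' || echo "rsync daemon is not listening"
```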

The quick command to get it running is
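On a systemd-based distro such as CentOS 7 (the unit name can vary between distros):

```shell
# Start the rsync daemon now, and have it start on boot
sudo systemctl start rsyncd
sudo systemctl enable rsyncd
```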

rsync: failed to set times on “.” (in frontbut): Operation not permitted (1)

At first I thought this problem was caused by the way I was mounting Azure Files, and that its filesystem didn't support setting times. Most of the solutions on the web tell you to use the -O flag to omit updating directory times.

However, the actual cause was that the username my ReadyNAS was using was not the owner of the directory.
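The fix was a recursive chown (the path and username here are illustrative):

```shell
# Give user1 ownership of the backup destination
sudo chown -R user1 /mnt/backup
```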

This statement changes the ownership (recursively) of the directory to user1. It should match the username you are using on the ReadyNAS and in the rsyncd.conf file.

ERROR: The remote path must start with a module name not a /

Pretty easy one here. The remote path must start with the module name defined in the rsyncd.conf file – not the directory path.
Rnas backup destination

@ERROR: Unknown module ‘mnt’

I was having an issue whereby the config file I was editing wasn't being picked up by rsync (a typo):
I was editing /etc/rsync.conf when it should have been /etc/rsyncd.conf.
Inside this configuration file are various module definitions (specifying the path etc.); it's the module name that must be used in the remote path.
rsync module

@ERROR: chroot failed

In your rsyncd.conf file, make sure that use chroot = false.

@ERROR: chdir failed

Ensure that the directory has the correct permissions allocated.

My final rsyncd.conf file
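For reference, it ended up looking something like this (the module name, user and paths are illustrative; the important parts are `use chroot = false` and putting the path behind a module):

```
# /etc/rsyncd.conf
uid = user1
gid = user1
use chroot = false
pid file = /var/run/rsyncd.pid
log file = /var/log/rsyncd.log

[backup]
    path = /mnt/backup
    comment = ReadyNAS backup target
    read only = false
    auth users = user1
    secrets file = /etc/rsyncd.secrets
```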