Scheduled JSON post in Linux using cron and du – to Logic Apps and Power BI

This post digs into getting regular disk-space data out of Linux and sending it to an Azure Logic App, which submits it to Power BI for visualisation.

So in my last post, I set up an rsync server to take a load of files from my “on-premises” storage. I have a reasonable internet connection, but I'm impatient and was always checking progress by logging into my rsync server and checking directory sizes using the du command.
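The command looks something like this (the path is a stand-in for wherever your rsync target lives):

du -h --max-depth=1 /mnt/storage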

-h gives you a human-readable disk size, and --max-depth=1 only reports on the top-level directories.

du output to JSON

So the first step is to take the output I like and get it into a better format.
Here's what I came up with. It's a little hacky: to keep the JSON valid I put an empty object at the end of the array, which avoids messing around with removing the trailing comma.
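Something along these lines (the path and JSON field names are my stand-ins for the original):

#!/bin/bash
# Emit a JSON array of {dirName, size} objects, one per top-level directory.
# Sizes are du's default 1 KB blocks. The trailing empty object {} keeps the
# array valid JSON without having to strip the comma after the last entry.
echo "["
du --max-depth=1 /mnt/storage | while read size dir; do
  echo "  { \"dirName\": \"$dir\", \"size\": $size },"
done
echo "  {}"
echo "]"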

This outputs valid JSON, which I can then curl out to a waiting Logic App, Power BI dataset, or any other web service.
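For reference, the output looks something like this (directories and sizes illustrative, in 1 KB blocks; du prints the parent's total last):

[
  { "dirName": "/mnt/storage/photos", "size": 524288 },
  { "dirName": "/mnt/storage/music", "size": 262144 },
  { "dirName": "/mnt/storage", "size": 786432 },
  {}
]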

Logic App

So I chose to post the data to an Azure Logic App. This way I can take advantage of its pre-built connectors for email/Power BI whenever I change my mind about what to do with the data.

I new up a Logic App in Azure, choosing the HTTP Request/Response trigger template, and paste in the JSON sample – it creates the schema for me and I'm ready to go.
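For the sample payload above, the generated schema comes out roughly like this:

{
  "type": "array",
  "items": {
    "type": "object",
    "properties": {
      "dirName": { "type": "string" },
      "size": { "type": "integer" }
    },
    "required": []
  }
}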

[Screenshot: creating the new Logic App]

Curling the JSON

So now that I've got a URL from Logic Apps to post to, I can create the curl command on my Linux box.
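Assuming the JSON script above is saved as du-to-json.sh (name illustrative), the curl looks something like this – the trigger URL is the one Logic Apps generates for you:

./du-to-json.sh | curl --silent \
  --header "Content-Type: application/json" \
  --data-binary @- \
  "https://<your-logic-app-trigger-url>"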

Let's see if that worked by checking the Logic App run history.

Scheduling the curl

OK, so it all works as planned. Let's get this reporting the data every 30 minutes.

First, let's put the code in a shell script file.
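Combining the JSON generation and the curl into one file, something like this (the path and URL are placeholders), and remembering to chmod +x it:

#!/bin/bash
# /home/user/du-post.sh - report top-level directory sizes to the Logic App.
URL="https://<your-logic-app-trigger-url>"
{
  echo "["
  du --max-depth=1 /mnt/storage | while read size dir; do
    echo "  { \"dirName\": \"$dir\", \"size\": $size },"
  done
  echo "  {}"
  echo "]"
} | curl --silent --header "Content-Type: application/json" --data-binary @- "$URL"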

Then let's schedule it to run every 30 minutes.
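A crontab entry along these lines does the job (edit with crontab -e; the script path is a placeholder):

*/30 * * * * /home/user/du-post.sh >/dev/null 2>&1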

Logic App run history – oh good, it's firing every 30 minutes 🙂

Developing the Logic App

So up until this point the Logic App contains just one thing: the trigger that receives the data, which it then does nothing with.
Let's send the data over to Power BI so I can quickly check it from the mobile app whenever I'm interested.

First up, head to Power BI, click on a workspace and then add a dataset. I'm going to use a streaming dataset with the API option.
You provide the field names and that's it.
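The field definitions look something like this (names are illustrative; the Logic App supplies the values):

sizeGB : Number
time   : DateTime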

Next, we add a few more actions to the Logic App:
– Filter Array (so we're just working with the total-size item)
– Add rows to a Power BI dataset
– A simple calculation to work out the GB size from the KB size provided in the request (see the sketch after this list)
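As a sketch, the GB value passed to the 'Add rows' action can be an expression along these lines (the action and field names are assumptions; 1 GB = 1,048,576 KB):

div(float(first(body('Filter_array'))?['size']), 1048576)

first() grabs the single item left after the filter, and div() on a float operand gives a fractional GB value rather than integer division.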

Power BI Reports

So, once we have data going into a Power BI hybrid streaming dataset we can craft some reports.

[Screenshot: streaming dashboard]

[Screenshot: data over time]

Dealing with code view quotes in an Azure Logic App | JSON

The visual designer in Logic Apps is pretty good, but all too often you need to break out to the code view. For me, this is usually because I want to construct some JSON to pass into an Azure Function for processing. What's not entirely obvious is the correct way to deal with double quotes inside the code view. As ever, when I google something and find nothing, I'll share a blog post about what the answer was – and this is no exception 🙂

The correct escape character to use is a backslash.

This snippet shows the final version of what I'm doing. In this case I'm writing the JSON out to blob storage (as a debug step) before sending it to an Azure Function for basic JSON validation, and then passing the data onward to Power BI for visualisation.
[Screenshot: dealing with quotes in Logic App JSON code view]

And here's the text:
"body": "{\"tweetId\": @{triggerBody()?['TweetId']}, \"tweet\": @{triggerBody()}, \"cogSentiment\": @{body('Detect_Sentiment')}, \"cogKeyPhrases\": @{body('Key_Phrases')}}",