tl;dr - I set up automated backups of our office security camera footage and used Power Automate to automate alerting if anything went wrong.
At endjin HQ, we have an office security setup using multiple cameras managed by Surveillance Station, an application on our on-site Synology NAS drive. Surveillance Station handles the recording schedules and stores the video files locally on the NAS. This is great, but ideally we want off-site backups too (the videos are no good if any miscreants find and take the NAS).
Fortunately, there is another application for the Synology called Cloud Sync, which does exactly what you'd expect, giving you the ability to sync your files with a cloud storage provider. It supports many providers, but we chose OneDrive, mainly because it offered 15GB of storage for free (at the time; it's only 5GB now, unfortunately). So we have video files being saved by Surveillance Station every few minutes to our Synology NAS, and immediately being automatically synced off-site to OneDrive.
When everything is working correctly, this setup is fine. But what happens if one of the cameras has an issue? Or there is a power cut? Or the NAS drive stops syncing?
Our golden path scenario has video files appearing in OneDrive every few minutes, so it makes sense to put in some checks to see whether that is happening, and possibly add some alerts if it's not.
My first thought was to write some custom code to query the OneDrive API (probably using PowerShell) and host it in some sort of scheduled job (Azure Automation, for example). Probably not too difficult... until you think about having to authenticate with OneDrive, which means registering a client application and implementing an OAuth flow, plus whatever integration is needed for the alerting part of the solution...
I decided there must be a simpler way, which is when I remembered Power Automate. This is Microsoft's service for building and managing automated workflows (similar to IFTTT or Zapier, if you are familiar with them). It offers hundreds of actions across multiple services that you can compose together to create your workflows, and I figured that there must be some combination I could use to achieve what I wanted.
To build your workflow, you can either start with one of the pre-built templates from Microsoft and other contributors, or start from scratch. Starting from scratch, you first have to pick a trigger for your workflow. This can be a timer, a manual button, or a reactive trigger from one of the providers (e.g. 'when a file is added to OneDrive', or 'when a video is uploaded to YouTube'). Note that the frequency at which these triggers are checked is determined by your pricing plan.
I decided that since I wanted my workflow to periodically check OneDrive to see whether new files have appeared, a timer trigger (called 'Recurrence' in Power Automate) was most appropriate.
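Under the hood, flows are defined in JSON using the Workflow Definition Language (the same language Azure Logic Apps uses). As a rough sketch, an hourly Recurrence trigger in that definition looks something like this (the designer generates this for you, and the exact schema may vary):

```json
{
  "triggers": {
    "Recurrence": {
      "type": "Recurrence",
      "recurrence": {
        "frequency": "Hour",
        "interval": 1
      }
    }
  }
}
```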
Once you have your trigger, you can then create a series of actions and conditions to build up your workflow. My first action is the 'List files in folder' action for OneDrive. We store videos for each camera in separate folders, so I'm going to create a separate workflow for each camera that monitors the corresponding folder.
The first time you select an action from a service that requires authentication, Power Automate will prompt you for credentials and create the connection for you. You can re-use this connection in other actions, or in other workflows.
Given an array of files in my folder, I'm only interested in whether any have been created since the workflow last ran. So I add a 'Filter Array' action, filtering on the last modified property of the files. You'll notice that Power Automate suggests dynamic content based on the data being fed into the action.
In the basic mode of the 'Filter Array' action, you can do simple comparisons of values (equal to, greater than, less than, and so on), but I need to get the current date/time, calculate what time it was an hour ago, and filter the array down to items whose last modified timestamp is later than that.
To do that, you can switch to advanced mode and edit the query, using functions from the Workflow Definition Language, which includes date functions.
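As a sketch, the advanced-mode filter condition ends up as a Workflow Definition Language expression along these lines (I'm assuming the file metadata property is called `LastModified` here; the actual property name will show up in the dynamic content suggestions):

```
@greater(item()?['LastModified'], addHours(utcNow(), -1))
```

`item()` refers to the current element of the array being filtered, `utcNow()` returns the current UTC timestamp, and `addHours(..., -1)` winds it back an hour.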
Now, with the filtered-down list of files, I can create a condition to ask the question: does the list contain anything? Again, this requires the advanced editing mode, to use the 'length' function to get the size of the array.
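Assuming the filter step is named 'Filter array' (the name is whatever you gave the action in the designer), the condition's expression looks something like:

```
@greater(length(body('Filter_array')), 0)
```

`body('Filter_array')` pulls the output of the filter step (spaces in action names are replaced with underscores in expressions), and `length` counts the files that remain after filtering.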
If the answer is yes, then everything is fine - we've seen at least one new video file uploaded to OneDrive in the last hour, and we don't have to do anything. If the answer is no, then we need to send out an alert so that someone can investigate.
Since the team use Slack, I set up the alerts to come through there, but there are options to send email too, using SendGrid, Office 365, or Outlook.
And that's it. Save the workflow and it will run when triggered, sending alerts through Slack if no new videos are detected in the period.
You can view the run history for any of your flows, see whether they succeeded or failed (or didn't run), and see which path they took in the workflow tree.
In summary: we set up scheduled security camera recordings using Surveillance Station, storing videos on the Synology NAS; synced the videos to an off-site OneDrive backup using Cloud Sync; and created an automated Power Automate workflow to alert us via Slack if videos stop appearing in OneDrive for any reason.