As promised all the way back in my NDC Wrap-up, I wanted to share my findings from using Azure Functions on a recent project. As always, the story begins with a user requirement. The use case was to build a news monitor which would check a given list of sites and email through any new stories from the last day. My immediate thoughts turned to Azure Functions as a solution. Amongst other reasons, they are quick to spin up and code against, cost little to nothing to run, and have built-in support for timers, file monitors and the like.
Right tooling for the job
Technology decided, I next needed to get the tooling in order to develop a solution. Luckily, Visual Studio 2017 has excellent built-in support for Azure Functions. To create a new solution, you simply select the Azure Functions project type and have all the plumbing set up for you. Along with this, VS also packages a local emulator so that you can run and debug your code. The whole end-to-end process is pretty slick and had me writing and running code within five minutes.
One handy titbit is the local.settings.json file. It allows you to store local configuration data and essentially mimics the application settings of the cloud instance. Note that these settings are purely local and will not be deployed with the app. If you wish to use them in Azure, make sure you put them into the relevant application settings.
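For illustration, a minimal local.settings.json might look something like the below. The BlobConnection name matches the connection string I reference later; the values here are placeholders pointing at the local storage emulator rather than anything from the real project.

```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "BlobConnection": "UseDevelopmentStorage=true"
  }
}
```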
Show me the code
I wanted to write two Functions to accomplish this requirement. The first would be a Timer Function which would run on a set interval to check the relevant websites. It would then serialise any relevant data and store it as JSON in blob storage. The second would be a Storage Function which would trigger upon new files being written to blob storage. It would deserialise the data and email out a nicely formatted news update.
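As a sketch of that first half, a timer-triggered Function along these lines would do the job. This is illustrative rather than the actual project code: the function name, the "news" container and the NewsChecker helper are all made up for the example, though the TimerTrigger attribute and the {DateTime} binding expression (which stamps each run's blob with a unique name) are standard Functions features.

```csharp
[FunctionName("NewsCheckFunction")]
public static void Run(
    // Six-field NCRONTAB expression: runs at 08:00 UTC every day
    [TimerTrigger("0 0 8 * * *")] TimerInfo timer,
    // {DateTime} is a built-in binding expression, giving each run a unique blob name
    [Blob("news/{DateTime}.json", FileAccess.Write, Connection = "BlobConnection")] out string newsJson,
    TraceWriter log)
{
    // Hypothetical helper that checks the configured sites and returns new articles
    var articles = NewsChecker.GetNewArticles();
    newsJson = JsonConvert.SerializeObject(articles);
}
```

Writing the blob triggers the second Function, so the two halves chain together with no orchestration code at all.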
The code was all fairly trivial; below is an extract from the blob trigger, shared as an example of how to process the blob stream handed back from storage.
[FunctionName("NewsEmailFunction")]
public static void Run(
    // The blob path needs a container; "news" here stands in for whichever container the timer Function writes to
    [BlobTrigger("news/{name}", Connection = "BlobConnection")] Stream blobStream,
    string name,
    TraceWriter log)
{
    using (var reader = new StreamReader(blobStream))
    {
        var blob = reader.ReadToEnd();
        var relevantArticles = JsonConvert.DeserializeObject<IEnumerable<Article>>(blob);
        // code to send the email
    }
}
Note how defining a Function is simply a case of adding the FunctionName attribute and a trigger type; the rest of the code is just regular C#. My biggest issue when writing the Functions was getting the correct CRON expression for the timer.
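For reference, the timer trigger uses a six-field NCRONTAB expression with a leading seconds field, which trips up anyone used to standard five-field cron. A few examples:

```csharp
// Format: {second} {minute} {hour} {day} {month} {day-of-week}
[TimerTrigger("0 30 9 * * *")]   // 09:30 every day
[TimerTrigger("0 */5 * * * *")]  // every five minutes
[TimerTrigger("0 0 9 * * 1-5")]  // 09:00, Monday to Friday
```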
Pushing made simple
Code written, it was time to deploy. I chose the quick and dirty option of right-click, publish. This did the job and had the Function running within minutes. However, as with all things Azure, there is of course a plethora of options available. As Functions are basically just Web Apps, you can link to GitHub or another repository using webhooks, or simply push to the resource from whichever flavour of CD tool you are using. All very simple.
Once up and running, the Azure dashboard makes it easy to check on the status of your Functions. You can delve into each run and read through the logs, which makes solving issues that little bit easier. Of course there is also the option to link to Application Insights for deeper logging. As all of this comes out of the box, I found the whole experience simple and intuitive.
There must have been issues
I’m aware I currently sound like a Microsoft salesman, so in the interest of balance: what didn’t I like? My main issue was the maturity of the platform. As Azure Functions are still relatively new and moving fast, I often found the documentation to be outdated and ran into NuGet package conflicts. The most frustrating manifestation was a set of opaque errors about not being able to run the Function locally. It transpired that the new 2.0 runtime had been released but my emulator had failed to keep track.
Overall, I have to say I still believe Serverless to be the way of the future. Just as we have seen Virtual Machines abstracted away, I believe Containers and PaaS will head that way too. There are clearly maturity issues to work through, and I would like to see decent vendor-agnostic solutions. And of course there will always be exceptions, but for the majority of use cases this just seems a pain-free way of developing.