Santanu Paul

Originally published at sntnupl.com

Test Driven Development of Azure Functions with C# Part 1: Introduction to the application

Over the last few years, serverless computing has become one of the most appealing choices for developing backend applications.

With PaaS offerings like Azure App Service, we already had the benefit of not having to manage and maintain our server infrastructure.

The cloud service provider does all the heavy lifting required to manage the infrastructure and to auto-scale and auto-provision the application.

Now, with FaaS offerings like Azure Functions and AWS Lambda, backend developers get the additional benefit of not having to maintain the application server code.

This is a very powerful abstraction, and it makes perfect sense for event-driven applications. Once you write the piece of code that you want executed for certain event(s), Azure Functions does the hard work of:

  • hosting that code (in some abstracted server infrastructure),
  • providing the glue between the actual event and your application (ensuring that your code gets triggered as and when the actual event fires), and
  • auto-scaling the application based on the volume of events received.

For event-driven applications that run on the cloud, this is a very powerful model indeed.

 

These days, we are seeing more and more organizations building their cloud backend systems as microservice-based architectures.

In Azure, for example, these microservices leverage cloud services like Blob Storage, Message Queues, Service Bus, Event Hubs, and Table Storage to communicate with each other.

Azure Functions allows us to have our code notified of activity happening on these services, by registering for specific "events".

For example, an Azure Function can get notified of a "blob event", indicating that a blob was created at a certain path. Similarly, a message arriving on a Service Bus topic can notify an Azure Function that has registered for it.

The icing on the cake is that registering for an event is done simply by adding the corresponding event binding to the method parameters - as simple as it gets!

 

In this post, I would like to share a practical guide to Test Driven Development (TDD) of serverless applications using Azure Functions.

When I started developing serverless applications using Azure Functions, writing unit tests and integration tests were two areas for which I could not find good, detailed guides.

Over the last few months, I have seen ample resources guiding developers on how to unit test their Azure Functions apps, but there is still not much guidance on how to implement integration tests in these apps.

In this series of posts, I will try to share my learnings, which will hopefully help other developers on their journey.

This post is the first of a three-part blog series, in which I introduce a sample Azure Functions application written in C#.

 


Other Posts in this series

  1. Part 1 - This Article
  2. Part 2 - Unit Tests
  3. Part 3 - Integration Tests

Also, I have created this repository on GitHub, which contains all the associated source code used in this blog series:

GitHub: sntnupl/azure-functions-sample-unittest-integrationtest

Sample application to showcase how one can implement Unit and Integration testing for an Azure Functions application

 


The Sample Application

Our sample application is an Azure Function that gets triggered when a message is sent to an Azure Service Bus topic.
This message essentially contains the location of a blob in Azure Blob Storage.
Once our app reads the text contents of this blob, it tries to parse them as (fictitious) invoice entries of Acme Corp.
These parsed Acme Corp invoice entries are then persisted in Azure Table Storage.

If you have worked with Azure Functions, a question you might be wondering about is: instead of using a Service Bus trigger, why don't we use the trigger generated by Blob Storage itself?
That can be a valid solution in certain use cases. I chose to architect our application such that we read the blob location from a Service Bus message for the following reasons:

  1. I can send additional metadata about the uploaded blob in a cleaner fashion. For example, I can pass metadata like a transaction-id or user-id in the Service Bus message.

  2. I can choose to be flexible about the actual location of the uploaded file (what if some client wants to use AWS S3?).

  3. Another application that wants these uploaded invoices processed can choose to delay the processing, or may even want to batch them up. Both are easily achieved with a Service Bus trigger.

 

The InvoiceProcessor project hosts our Azure Function, inside InvoiceWorkitemEventHandler.cs.

[FunctionName("InvoiceWorkitemEventHandler")]
public static async Task Run(
    [ServiceBusTrigger("workiteminvoice", "InvoiceProcessor", Connection = "ServiceBusReceiverConnection")] string msg,
    IBinder blobBinder,
    [Table("AcmeOrders", Connection = "StorageConnection")] IAsyncCollector<AcmeOrderEntry> tableOutput,
    ILogger logger)
{
...
}

The line [ServiceBusTrigger("workiteminvoice", "InvoiceProcessor", Connection = "ServiceBusReceiverConnection")] string msg shows just how easy it is to hook into event triggers generated by Azure Service Bus.

 

All we need to do is decorate the first method parameter with the ServiceBusTrigger attribute.

It takes three arguments:

  • the topic name,
  • the subscription name, and
  • the name of the configuration property that stores the Service Bus connection string.
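
For local development, these configuration properties typically live in the Functions project's local.settings.json file. Here is a minimal sketch with placeholder values, assuming the property names our function uses:

{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "ServiceBusReceiverConnection": "<your-service-bus-connection-string>",
    "StorageConnection": "<your-storage-account-connection-string>"
  }
}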

If you want more details on this attribute, I would suggest going through this article, which explains the Azure Service Bus trigger in more detail.

 

The next method parameter to our Azure Function is an IBinder.

We will be using this to perform input-binding (i.e., bind to perform read operations) to a particular blob in Azure Blob Storage.

The ServiceBusTrigger attribute allowed us to perform input binding to Service Bus messages declaratively. This works because we know that we will always be looking for messages in the workiteminvoice topic, using the InvoiceProcessor subscription.

We can't use the corresponding declarative Blob binding, because we will know the blob location at runtime, ONLY after we parse the message received from Service Bus.

For scenarios like these, where we need runtime (imperative) binding, we can use the IBinder input parameter.

We will soon see how, once we know the blob location, we can use the BindAsync method on the IBinder instance to perform input binding to the blob.

 

The tableOutput parameter allows us to perform output binding to Azure Table Storage: [Table("AcmeOrders", Connection = "StorageConnection")] IAsyncCollector<AcmeOrderEntry> tableOutput.

Declarative binding works here too, because we know we will always be writing to the table named AcmeOrders, within the Azure Storage account whose connection string is specified in the configuration property StorageConnection.

Also, since we might be writing (outputting) multiple entries to this table, we use IAsyncCollector<T> as the parameter type. Think of it as an out parameter for an async Azure Function.

We will be invoking the AddAsync() method on this collector to write multiple entries to Azure Table Storage.
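
For this binding to work, AcmeOrderEntry needs to be a table entity, i.e. it must expose a partition key and a row key. The actual class lives in the repo; here is a minimal sketch of the shape it might take (the key choices below are my assumption, guided by how the function constructs it later):

// Sketch only - the real AcmeOrderEntry lives in the repo.
// Table bindings accept POCOs that expose PartitionKey and RowKey strings.
public class AcmeOrderEntry
{
    public string PartitionKey { get; set; }
    public string RowKey { get; set; }
    // ...plus whatever AcmeOrder fields we want persisted as table columns.

    public AcmeOrderEntry() { }  // parameterless constructor for serialization

    public AcmeOrderEntry(string userEmail, AcmeOrder order)
    {
        PartitionKey = userEmail;               // assumed: partition by user
        RowKey = order.OrderNumber.ToString();  // assumed: one row per order
    }
}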

 


The Code

Let us now take a look at the first part of this function:

InvoiceWorkitemMessage invoiceWorkItem;
logger.LogInformation($"C# ServiceBus topic trigger function processed message: {msg}");

if (string.IsNullOrEmpty(msg)) {
    logger.LogError("Empty Invoice Workitem.");
    return;
}

try {
    invoiceWorkItem = JsonSerializer.Deserialize<InvoiceWorkitemMessage>(msg, jsonSerializationOptions);
}
catch (JsonException ex) {
    logger.LogError(ex, "Invalid Invoice Workitem.");
    return;
}
catch (Exception ex) {
    logger.LogError(ex, "Unexpected Error.");
    return;
}

if (invoiceWorkItem.DataLocationType != DataLocationType.AzureBlob) {
    logger.LogError($"Unsupported data location type {invoiceWorkItem.DataLocationType}.");
    return;
}

if (string.IsNullOrEmpty(invoiceWorkItem.DataLocation)) {
    logger.LogError("Empty data location.");
    return;
}

Nothing too fancy here. We are essentially trying to parse the message received from Service Bus as an InvoiceWorkitemMessage.

If the message is valid and we are able to parse it, we check that the blob location is of type Azure Blob Storage (currently we support ONLY Azure Blob Storage).
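
The actual InvoiceWorkitemMessage class lives in the repo; based on the properties the function reads, a minimal sketch might look like this (any details beyond the members used above are assumptions):

public enum DataLocationType
{
    Unknown = 0,  // assumed default
    AzureBlob,    // the only location type the function currently supports
}

public class InvoiceWorkitemMessage
{
    public DataLocationType DataLocationType { get; set; }
    public string DataLocation { get; set; }  // e.g. "container/path/to/blob.csv"
    public string UserEmail { get; set; }     // passed into each AcmeOrderEntry later
}

The jsonSerializationOptions used in the Deserialize call above could then be, for example, a static JsonSerializerOptions configured with PropertyNameCaseInsensitive = true and a JsonStringEnumConverter, so that the DataLocationType enum can arrive as a string.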

 

Once we know the blob path, we can use the IBinder instance to bind to that specific blob. As discussed earlier, we use the BindAsync method to get a Stream of the blob contents.

var blobAttr = new BlobAttribute(invoiceWorkItem.DataLocation, FileAccess.Read) {
    Connection = "StorageConnection"
};

using (var blobStream = await blobBinder.BindAsync<Stream>(blobAttr)) {
   //...
}

 

We then parse the contents of this blob using the InvoiceParser class.

if (!InvoiceParser.TryParse(blobStream, logger, out List<AcmeOrder> orders)) {
    logger.LogError("Failed to parse invoice.");
    return;
}

I will skip the inner workings of the TryParse method, as they don't really add to our discussion here.
You can look into the InvoiceParser.cs file for more details.
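
For reference, the call above implies a signature along these lines (the implementation is in the repo):

using System.Collections.Generic;
using System.IO;
using Microsoft.Extensions.Logging;

public static class InvoiceParser
{
    // Returns true and the parsed orders on success, false on any parse failure.
    public static bool TryParse(Stream blobStream, ILogger logger, out List<AcmeOrder> orders)
    {
        // Actual parsing logic omitted here - see InvoiceParser.cs in the repo.
        throw new System.NotImplementedException();
    }
}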

 

If everything goes well, and we are able to parse the contents of the blob as one or more AcmeOrder records, we persist them in Azure Table Storage.
As discussed earlier, we use the AddAsync() method on the IAsyncCollector<AcmeOrderEntry> to achieve this.

foreach (var order in orders) {
    await tableOutput.AddAsync(new AcmeOrderEntry(invoiceWorkItem.UserEmail, order));
    logger.LogInformation($"Added table record for BigBasket Order number: {order.OrderNumber}");
}

 

To perform TDD on this sample application, we need to write unit tests and integration tests against this code.

 


In the next post, we will discuss how we can implement Unit Tests for this Azure Function. See you there!
