How to create multiple output bindings in Azure Functions using .NET

Shagun
3 min read · Mar 18, 2021

These are the key takeaways from the KonfHub contest and my submission for #azuredevstories.

Code and Azure functions

Introduction

Azure is a cloud platform that offers solutions across many domains. In the serverless and storage space it provides several services; here I will be using the Azure Storage account's Table, Queue, and Blob services together with Azure Functions.

Objective

To create a queue-triggered Azure Function that fires whenever a message arrives in the queue and processes it. After processing, the function stores the data in Azure Table storage as well as Blob storage, using a .NET Core program.

using System;
using System.IO;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;
using Microsoft.Extensions.Logging;
using Microsoft.WindowsAzure.Storage.Queue;
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;

namespace Function_Queue_Table
{
    public static class MyFunction
    {
        [FunctionName("QueueTable")]
        public static void Run(
            [QueueTrigger("MyQueueName", Connection = "storageConnectionString")] JObject myQueueItem,
            // ICollector<T> lets the function write multiple entities to the output table
            [Table("TableName", Connection = "storageConnectionString")] ICollector<CustomerClass> outputTable,
            // A blob output binding can also be declared as Stream, string, or CloudBlockBlob
            [Blob("blobname/{rand-guid}", FileAccess.Write, Connection = "storageConnectionString")] TextWriter blobOutput,
            ILogger log)
        {
            log.LogInformation("Adding Customer");
            CustomerClass obj = new CustomerClass();
            obj.PartitionKey = myQueueItem["Id"].ToString();
            obj.RowKey = myQueueItem["Quantity"].ToString();
            outputTable.Add(obj);                                   // write the entity to Table storage
            blobOutput.Write($"Partition Key {obj.PartitionKey}");  // write a text blob with a random GUID name
        }
    }
}

The above .cs file is the function file that defines the bindings for the Azure Function.
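The `storageConnectionString` name referenced in the attributes above is resolved from application settings; for local development that means local.settings.json. A minimal sketch (the connection string value is a placeholder; substitute your own storage account's string):

```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "storageConnectionString": "DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>;EndpointSuffix=core.windows.net"
  }
}
```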

  1. Ensure you have installed the Microsoft Azure WebJobs package in Visual Studio.
  2. Whenever a message arrives in the Azure queue, it triggers the Azure Function.
  3. We need a connection string for the Azure storage account; we add it to the local settings.
  4. Messages sent to the queue are in JSON format.
  5. For the multiple outputs, we process the messages and store them in Table storage as well as Blob storage. (This demonstrates the multiple bindings.)
  6. Ensure that the storage account exists in Azure and that you have created the queue to which you will send the messages.
  7. For the table binding, we use the ICollector interface, which lets us collect multiple output entities.
  8. For the blob, ensure that you have file write access to the blob container.
  9. In the main function, create the logic and define a class matching the messages you will send to the queue, with all the parameters; create getters and setters in the class so the program can read and parse the messages properly.
  10. For the table, identify what the partition key and the row key will be.
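The function body above references a `CustomerClass`; a minimal sketch of such a class, assuming the queue messages carry `Id` and `Quantity` fields as in the function code:

```csharp
// Matches a queue message such as: { "Id": "123", "Quantity": "5" }
// PartitionKey and RowKey are required for entities written to Azure Table storage.
public class CustomerClass
{
    public string PartitionKey { get; set; }  // populated from the message's Id
    public string RowKey { get; set; }        // populated from the message's Quantity
}
```

Add any further message fields as extra properties with getters and setters so they are persisted as table columns.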

Important points:

In host.json you can also specify the function timeout. The maximum timeout varies across the different service plans.

You can also add other settings, such as batchSize and maxPollingInterval.

With a batch size greater than one, the runtime can process multiple queue messages in parallel.

You also get retry support: Azure retries a message up to 5 times by default, and if it still fails after those attempts, the message is moved to a poison queue.
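Putting these settings together, a host.json sketch for the queue trigger (v2 schema; the values shown are illustrative, not recommendations):

```json
{
  "version": "2.0",
  "functionTimeout": "00:05:00",
  "extensions": {
    "queues": {
      "batchSize": 16,
      "maxPollingInterval": "00:00:02",
      "maxDequeueCount": 5
    }
  }
}
```

Here maxDequeueCount corresponds to the 5 retries mentioned above, after which the message lands in the poison queue.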

Conclusion:

Building on this learning about Azure and Functions, you can create many different architectures to suit your needs by integrating different services, e.g. a fan-out with a queue and a notification service.

This marks the end of multiple output bindings with Azure Functions using .NET.

Happy learning.


Shagun

Cloud Engineer | Developer | AWS | Azure | Microservices