Azure Functions: Integrating with Azure Storage

This tutorial guides you through integrating Azure Functions with Azure Storage services, specifically focusing on Blob Storage and Table Storage. You'll learn how to trigger functions based on storage events and how to bind to storage to read and write data.

Prerequisites

To follow this tutorial you will need:

  1. An active Azure subscription.
  2. The Azure CLI installed and signed in.
  3. Azure Functions Core Tools.
  4. Node.js (for the JavaScript examples) or the .NET SDK (for the C# examples).

Scenario

We will build an Azure Function that:

  1. Triggers when a new blob is added to a specific Blob Storage container.
  2. Reads the content of the blob.
  3. Processes the data (e.g., transforming it).
  4. Writes the processed data to Azure Table Storage.
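
The transformation in step 3 can be sketched as a plain function before any Azure wiring is in place. `processBlobContent` is a hypothetical helper name used only for illustration; blob triggers hand JavaScript functions a Buffer, so the sketch accepts one:

```javascript
// Hypothetical helper sketching the "processing" step (3) from the scenario,
// kept as a pure function so it can be tested without any Azure resources.
function processBlobContent(blobBuffer) {
    // Decode the Buffer the blob trigger provides, then transform the text.
    return blobBuffer.toString('utf8').toUpperCase();
}
```

Keeping the transformation separate from the trigger plumbing makes it easy to unit-test before you deploy anything.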

Step 1: Set up Azure Storage

Create Azure Storage Account

You need a storage account to host your Blob Storage container and Table Storage. You can create one using the Azure portal, Azure CLI, or PowerShell.

Using Azure CLI:

az storage account create --name <storage-account-name> --resource-group <resource-group> --location <location> --sku Standard_LRS

Replace `<storage-account-name>`, `<resource-group>`, and `<location>` with your desired values.

Create Blob Container

Inside your storage account, create a container to hold the incoming blob files.

Using Azure CLI:

az storage container create --name input-blobs --account-name <storage-account-name>

This creates a container named input-blobs.

Get Storage Connection String

You'll need the connection string to authenticate your Azure Function with your storage account.

Using Azure CLI:

az storage account show-connection-string --name <storage-account-name> --resource-group <resource-group> --output tsv

Copy the output connection string.
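
The connection string is a semicolon-separated list of `key=value` pairs (for example `DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...;EndpointSuffix=core.windows.net`). As a quick sanity check you can pull the account name back out of it; this parsing helper is purely illustrative and not part of any Azure SDK:

```javascript
// Illustrative helper: extract the AccountName field from an Azure Storage
// connection string (a semicolon-separated list of key=value pairs).
function getAccountName(connectionString) {
    for (const pair of connectionString.split(';').filter(Boolean)) {
        const idx = pair.indexOf('=');
        // Split on the first '=' only, since AccountKey values may contain '='.
        if (pair.slice(0, idx) === 'AccountName') {
            return pair.slice(idx + 1);
        }
    }
    return null;
}
```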

Step 2: Create an Azure Function Project

Initialize Project

Open your terminal or command prompt and create a new Azure Functions project.

Using Azure Functions Core Tools (JavaScript example):

func init BlobStorageTriggerJS --javascript
cd BlobStorageTriggerJS

Using Azure Functions Core Tools (C# example):

func init BlobStorageTriggerCS --dotnet
cd BlobStorageTriggerCS

Add Blob Trigger Function

Add a new function that is triggered by blob creations.

Using Azure Functions Core Tools (JavaScript):

func new --template "Azure Blob Storage trigger" --name ProcessBlob

Using Azure Functions Core Tools (C#):

func new --template "Azure Blob Storage trigger" --name ProcessBlob

Step 3: Configure Function Bindings

Configure function.json (JavaScript) or BlobStorageTriggerCS.csproj and local.settings.json (C#)

Your function.json (for JavaScript) or the project file and local.settings.json (for C#) will define the triggers and bindings. You need to set the storage connection string and the blob container name.

For JavaScript (function.json in the ProcessBlob folder):

{
  "scriptFile": "../run.js",
  "bindings": [
    {
      "name": "myBlob",
      "type": "blobTrigger",
      "direction": "in",
      "path": "input-blobs/{name}",
      "connection": "AzureWebJobsStorage"
    },
    {
      "name": "outputBlob",
      "type": "blob",
      "direction": "out",
      "path": "output-blobs/{name}.processed",
      "connection": "AzureWebJobsStorage"
    },
    {
      "name": "outputTable",
      "type": "table",
      "direction": "out",
      "tableName": "ProcessedBlobData",
      "connection": "AzureWebJobsStorage"
    }
  ]
}

For C#:

In your local.settings.json, add your connection string:

{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "YOUR_STORAGE_CONNECTION_STRING",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet"
  }
}

In your function's C# file (e.g., ProcessBlob.cs), define the bindings using attributes:

using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;
using Azure.Data.Tables;
using System.IO;
using System.Threading.Tasks;

public static class ProcessBlob
{
    [FunctionName("ProcessBlob")]
    public static async Task Run(
        [BlobTrigger("input-blobs/{name}", Connection = "AzureWebJobsStorage")] Stream blobStream,
        string name,
        [Blob("output-blobs/{name}.processed", FileAccess.Write, Connection = "AzureWebJobsStorage")] Stream outputBlobStream,
        [Table("ProcessedBlobData", Connection = "AzureWebJobsStorage")] IAsyncCollector<TableEntity> tableOutput,
        ILogger log)
    {
        // Capture the size up front; the StreamReader below disposes the
        // stream, so blobStream.Length cannot be read after that point.
        long originalSize = blobStream.Length;
        log.LogInformation($"C# blob trigger function processed blob\n Name: {name}\n Size: {originalSize} bytes");

        // Read blob content
        string blobContent;
        using (var reader = new StreamReader(blobStream))
        {
            blobContent = await reader.ReadToEndAsync();
        }

        // Simulate processing (e.g., convert to uppercase)
        string processedContent = blobContent.ToUpper();

        // Write processed content to an output blob
        using (var writer = new StreamWriter(outputBlobStream))
        {
            await writer.WriteAsync(processedContent);
        }

        // Write processed data to Table Storage
        var tableEntity = new TableEntity("BlobData", name)
        {
            { "Content", processedContent },
            { "OriginalSize", originalSize },
            { "Processed", true }
        };
        await tableOutput.AddAsync(tableEntity);

        log.LogInformation($"Blob '{name}' processed. Output written to output-blobs/{name}.processed and Table Storage.");
    }
}

Remember to replace YOUR_STORAGE_CONNECTION_STRING in local.settings.json with your actual connection string.

Create Output Container (if needed)

If you are writing to an output blob container, ensure it exists. The table will be created automatically by Azure Functions if it doesn't exist.

Using Azure CLI:

az storage container create --name output-blobs --account-name <storage-account-name>

Step 4: Write Function Logic

Process Blob Content (JavaScript Example)

Update your run.js file for the ProcessBlob function.

module.exports = async function (context, myBlob) {
    context.log('JavaScript blob trigger function processed blob');
    context.log(`Name: ${context.bindingData.name}`);
    context.log(`Blob Size: ${myBlob.length} Bytes`);

    // Read blob content
    const blobContent = myBlob.toString();

    // Simulate processing (e.g., convert to uppercase)
    const processedContent = blobContent.toUpperCase();

    // Write processed content to an output blob
    context.bindings.outputBlob = processedContent;

    // Write processed data to Table Storage
    context.bindings.outputTable = {
        partitionKey: 'BlobData',
        rowKey: context.bindingData.name,
        Content: processedContent,
        OriginalSize: myBlob.length,
        Processed: true
    };

    context.log(`Blob '${context.bindingData.name}' processed. Output written to output-blobs/${context.bindingData.name}.processed and Table Storage.`);
};

JavaScript Data Types

In JavaScript, the blob content is passed as a Buffer. You can convert it to a string using myBlob.toString().
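
Because the handler only touches `context` and the Buffer, you can exercise it locally with a hand-rolled mock context, no storage emulator required. The handler body is inlined below so the snippet is self-contained; in a real project you would `require` your run.js instead:

```javascript
// Inlined copy of the handler's core logic, for a self-contained example.
const handler = async function (context, myBlob) {
    const processedContent = myBlob.toString().toUpperCase();
    context.bindings.outputBlob = processedContent;
    context.bindings.outputTable = {
        partitionKey: 'BlobData',
        rowKey: context.bindingData.name,
        Content: processedContent,
        OriginalSize: myBlob.length,
        Processed: true
    };
};

// Minimal mock of the Functions context object: just enough surface
// for the handler above. The real context has many more members.
const context = {
    log: () => {},                       // swallow log output in tests
    bindings: {},                        // handler writes its outputs here
    bindingData: { name: 'sample.txt' }  // {name} from the blob trigger path
};
```

After awaiting `handler(context, Buffer.from('hello'))`, `context.bindings.outputBlob` holds the uppercased text and `context.bindings.outputTable` holds the entity destined for Table Storage.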

Step 5: Run and Test Locally

Start the Function Host

Navigate to your project directory in the terminal and start the Azure Functions host.

func start

The output will show the function host starting and listening for events.

Upload a Blob

Use Azure Storage Explorer or the Azure CLI to upload a text file (e.g., sample.txt) into the input-blobs container of your storage account.

Using Azure CLI:

echo "Hello Azure Functions Storage!" > sample.txt
az storage blob upload --file sample.txt --container-name input-blobs --name sample.txt --account-name <storage-account-name>

Verify Output

Check the terminal where the func start command is running. You should see logs indicating that your ProcessBlob function was triggered and executed successfully.

You can then verify the results:

  1. Check the output-blobs container for a blob named sample.txt.processed containing the uppercased content.
  2. Check the ProcessedBlobData table for a new entity with partition key BlobData and row key sample.txt.
Step 6: Deploy to Azure

Deploy Function App

Once you're satisfied with the local testing, you can deploy your function app to Azure.

Using Azure Functions Core Tools:

  1. Create a Function App in Azure (if you haven't already).
  2. Link your local project to the remote Function App.
  3. Deploy using:
func azure functionapp publish <function-app-name>

Make sure to configure your Function App's application settings in Azure to include the AzureWebJobsStorage connection string pointing to your storage account.

Best Practices and Further Learning

Security Note

Never hardcode connection strings directly in your code. Always use application settings (local.settings.json locally, and Application Settings in Azure Function App) to manage secrets.
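
The Functions host loads the Values from local.settings.json into environment variables locally, and Application Settings arrive the same way in Azure, so code should read settings via `process.env`. A minimal sketch of that pattern:

```javascript
// Read the storage connection string from the environment rather than
// hardcoding it; the Functions host populates process.env from
// local.settings.json locally and from Application Settings in Azure.
function getStorageConnectionString() {
    const conn = process.env.AzureWebJobsStorage;
    if (!conn) {
        throw new Error('AzureWebJobsStorage is not set; check your app settings.');
    }
    return conn;
}
```

Failing fast with a descriptive error when a setting is missing makes misconfiguration obvious at startup instead of surfacing as a cryptic SDK failure later.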