Azure Functions Blob Trigger Bindings
This article provides detailed information about the Azure Blob Storage trigger and bindings for Azure Functions. The blob trigger runs your function when a new or updated blob is detected in an Azure Blob Storage container.
Key Concept: Blob triggers are ideal for event-driven processing of new or updated blob data.
When to use a blob trigger
Use a blob trigger when you need to:
- Process uploaded files as they arrive in storage.
- Perform transformations or validations on blob data.
- Trigger downstream workflows based on blob creation or modification.
- Ingest and process large data files from blob storage.
Trigger configuration
The blob trigger is configured using attributes in your function code or through function.json. The core properties include:
path
    The blob path that the trigger monitors. It can include binding expressions such as {name} to capture parts of the blob name.
- samples-workitems/{name}.csv: Triggers when a blob ending in .csv is added to the samples-workitems container; {name} captures the file name without the extension.
- input/{blobname}: Triggers for any blob added to the input container, making the full blob name available through the blobname expression.
connection
    The name of an app setting that contains the Azure Storage connection string. If not specified, the default connection string setting AzureWebJobsStorage is used.
direction
    For triggers, this is implicitly in.
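For languages that define bindings in function.json rather than attributes, the properties above map to a binding entry. A minimal sketch (container and parameter names are illustrative):

```json
{
  "bindings": [
    {
      "name": "myBlob",
      "type": "blobTrigger",
      "direction": "in",
      "path": "samples-workitems/{name}.csv",
      "connection": "AzureWebJobsStorage"
    }
  ]
}
```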
Binding Types
The blob trigger binding allows you to read the content of the blob directly into your function as a specific data type.
Blob as a stream
You can bind the blob trigger to a Stream object to read the blob's content.
using Microsoft.Azure.Functions.Worker;
using Microsoft.Extensions.Logging;
using System.IO;
namespace MyFunctionApp
{
    public static class BlobTriggerStream
    {
        [Function("BlobTriggerStream")]
        public static void Run(
            [BlobTrigger("samples-workitems/{name}.csv", Connection = "AzureWebJobsStorage")] Stream myBlob,
            string name,
            FunctionContext context)
        {
            var logger = context.GetLogger("BlobTriggerStream");
            logger.LogInformation($"C# Blob trigger function Processed blob\n Name:{name} \n Size: {myBlob.Length} Bytes");
            using (var reader = new StreamReader(myBlob))
            {
                string content = reader.ReadToEnd();
                logger.LogInformation($"Blob content:\n{content}");
            }
        }
    }
}
Blob as text
For text-based blobs, you can bind directly to a string.
using Microsoft.Azure.Functions.Worker;
using Microsoft.Extensions.Logging;
namespace MyFunctionApp
{
    public static class BlobTriggerString
    {
        [Function("BlobTriggerString")]
        public static void Run(
            [BlobTrigger("samples-workitems/{name}.txt", Connection = "AzureWebJobsStorage")] string myBlob,
            string name,
            FunctionContext context)
        {
            var logger = context.GetLogger("BlobTriggerString");
            logger.LogInformation($"C# Blob trigger function Processed blob\n Name:{name} \n Content: {myBlob}");
        }
    }
}
Blob as byte array
For binary data, binding to a byte[] is convenient.
using Microsoft.Azure.Functions.Worker;
using Microsoft.Extensions.Logging;
namespace MyFunctionApp
{
    public static class BlobTriggerByteArray
    {
        [Function("BlobTriggerByteArray")]
        public static void Run(
            [BlobTrigger("samples-workitems/{name}.png", Connection = "AzureWebJobsStorage")] byte[] myBlob,
            string name,
            FunctionContext context)
        {
            var logger = context.GetLogger("BlobTriggerByteArray");
            logger.LogInformation($"C# Blob trigger function Processed blob\n Name:{name} \n Size: {myBlob.Length} Bytes");
            // Process the byte array, e.g., save to another location or analyze
        }
    }
}
Binding to other types
The Functions runtime can automatically deserialize common formats such as JSON if you specify the corresponding target type.
Blob Metadata
In addition to the blob content, you can access metadata like the blob name, size, and ETag by declaring parameters in your function signature.
- name: The name of the blob file.
- length: The size of the blob in bytes.
- etag: The ETag of the blob.
- lastModified: The last modified timestamp of the blob.
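As a sketch for the C# isolated worker model, the {name} expression from the path binds directly to a string parameter, and other trigger metadata is typically available through the binding data on FunctionContext (the exact keys depend on the storage extension version; container name here is from the earlier examples):

```csharp
using Microsoft.Azure.Functions.Worker;
using Microsoft.Extensions.Logging;
using System.IO;

namespace MyFunctionApp
{
    public static class BlobTriggerMetadata
    {
        [Function("BlobTriggerMetadata")]
        public static void Run(
            [BlobTrigger("samples-workitems/{name}", Connection = "AzureWebJobsStorage")] Stream myBlob,
            string name, // bound from the {name} expression in the path
            FunctionContext context)
        {
            var logger = context.GetLogger("BlobTriggerMetadata");
            logger.LogInformation($"Blob name: {name}");

            // Additional trigger metadata (e.g. Uri, Properties) is surfaced via
            // the binding data dictionary; available keys vary by extension version.
            foreach (var kvp in context.BindingContext.BindingData)
            {
                logger.LogInformation($"Metadata {kvp.Key}: {kvp.Value}");
            }
        }
    }
}
```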
Note: The exact available metadata parameters may vary slightly based on the Functions runtime version and language.
Example: Processing JSON data
If your blob contains JSON data, you can bind directly to a POCO (Plain Old C# Object).
using Microsoft.Azure.Functions.Worker;
using Microsoft.Extensions.Logging;
namespace MyFunctionApp
{
    // Data model that the blob's JSON content is deserialized into
    public class MyData
    {
        public string Id { get; set; }
        public string Message { get; set; }
    }
    public static class BlobTriggerJson
    {
        [Function("BlobTriggerJson")]
        public static void Run(
            [BlobTrigger("input-json/{name}.json", Connection = "AzureWebJobsStorage")] MyData myBlobData,
            string name,
            FunctionContext context)
        {
            var logger = context.GetLogger("BlobTriggerJson");
            logger.LogInformation($"C# Blob trigger function Processed blob\n Name:{name}");
            logger.LogInformation($"Data Id: {myBlobData.Id}, Message: {myBlobData.Message}");
        }
    }
}
Advanced Scenarios
Processing multiple blobs
Each blob trigger monitors a single path expression. To process blobs from multiple containers, define a separate function per container, or use binding expressions in the path property to match families of blob names within one container.
Conditional triggers
While not directly part of the blob trigger binding itself, you can implement conditional logic within your function to decide whether to process the blob content based on its name, size, or metadata.
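A minimal sketch of such conditional logic (the uploads container name and the 10 MB limit are illustrative):

```csharp
using Microsoft.Azure.Functions.Worker;
using Microsoft.Extensions.Logging;
using System.IO;

namespace MyFunctionApp
{
    public static class BlobTriggerConditional
    {
        [Function("BlobTriggerConditional")]
        public static void Run(
            [BlobTrigger("uploads/{name}", Connection = "AzureWebJobsStorage")] Stream myBlob,
            string name,
            FunctionContext context)
        {
            var logger = context.GetLogger("BlobTriggerConditional");

            // Skip blobs that don't match the pattern we care about
            if (!name.EndsWith(".csv", System.StringComparison.OrdinalIgnoreCase))
            {
                logger.LogInformation($"Skipping non-CSV blob: {name}");
                return;
            }

            // Skip blobs above an illustrative size limit
            if (myBlob.Length > 10 * 1024 * 1024)
            {
                logger.LogWarning($"Skipping oversized blob: {name}");
                return;
            }

            logger.LogInformation($"Processing {name} ({myBlob.Length} bytes)");
        }
    }
}
```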
Multiple blob bindings
A function has exactly one trigger, but a blob trigger can be combined with additional blob input or output bindings to read from or write to other blobs.
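As a hedged sketch (container names are illustrative), a blob trigger can be paired with a BlobOutput binding so that the function's return value is written to another container:

```csharp
using Microsoft.Azure.Functions.Worker;
using Microsoft.Extensions.Logging;

namespace MyFunctionApp
{
    public static class BlobTriggerWithOutput
    {
        [Function("BlobTriggerWithOutput")]
        [BlobOutput("processed/{name}", Connection = "AzureWebJobsStorage")]
        public static string Run(
            [BlobTrigger("incoming/{name}", Connection = "AzureWebJobsStorage")] string myBlob,
            string name,
            FunctionContext context)
        {
            var logger = context.GetLogger("BlobTriggerWithOutput");
            logger.LogInformation($"Transforming {name} into the processed container");

            // The returned string is written to processed/{name} by the output binding
            return myBlob.ToUpperInvariant();
        }
    }
}
```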