This tutorial will guide you through integrating Azure Blob Storage with Azure Functions using input and output bindings. This allows your functions to read from and write to Blob Storage with minimal code.
We'll create an Azure Function that is triggered when a new blob is added to a specific container in Azure Blob Storage. This function will read the blob content, potentially perform some processing (like generating a thumbnail, though we'll keep it simple here), and then write a confirmation message to another blob.
If you don't have one already, create an Azure Storage account in the Azure portal. Once created, navigate to your storage account and go to **Containers**. Create a new container named `input-images`. Note down your storage account name and access key (found under **Access keys**).
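If you prefer the command line, the same setup can be sketched with the Azure CLI. The resource group, location, and account name below are placeholders of our choosing (storage account names must be globally unique):

```shell
# Placeholder names -- substitute your own.
az group create --name blob-tutorial-rg --location eastus
az storage account create \
  --name mytutorialstorage123 \
  --resource-group blob-tutorial-rg \
  --sku Standard_LRS
az storage container create \
  --name input-images \
  --account-name mytutorialstorage123 \
  --auth-mode login
```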
Open your terminal or command prompt and use Azure Functions Core Tools to create a new project:
```bash
func init BlobTriggerCSharp --dotnet --target-framework net6.0
cd BlobTriggerCSharp
func new --name ProcessImage --template "Azure Blob Storage trigger"
```
This creates a new C# Azure Functions project and adds a function named `ProcessImage`.
Open the `ProcessImage.cs` file that was generated for your `ProcessImage` function. You'll see an attribute defining the trigger. We need to configure it to point to our `input-images` container and set up an output binding.
First, open the `local.settings.json` file and add your Azure Storage connection string:
```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet"
  }
}
```
Replace `UseDevelopmentStorage=true` with your actual storage account connection string. You can find it in the Azure portal under your storage account's **Access keys**. For local development, keeping `UseDevelopmentStorage=true` and running the Azure Storage Emulator (or its successor, Azurite) is also an option.
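For reference, a full storage connection string has this general shape; the account name and key here are placeholders, not real credentials:

```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "DefaultEndpointsProtocol=https;AccountName=<account-name>;AccountKey=<account-key>;EndpointSuffix=core.windows.net",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet"
  }
}
```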
Now, modify `ProcessImage.cs`:
```csharp
using System.IO;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

namespace BlobTriggerCSharp
{
    public static class ProcessImage
    {
        [FunctionName("ProcessImage")]
        public static void Run(
            [BlobTrigger("input-images/{name}")] Stream inputBlob,
            [Blob("output-logs/{name}.txt")] out string outputBlob,
            string name,
            ILogger logger)
        {
            logger.LogInformation($"C# Blob trigger function processed blob\n Name: {name} \n Size: {inputBlob.Length} Bytes");

            // Simulate some processing
            string processingResult = $"Successfully processed blob: {name}. Size: {inputBlob.Length} bytes.";

            // Write the result to the output blob
            outputBlob = processingResult;

            logger.LogInformation($"Output log written for blob: {name}");
        }
    }
}
```
In this code:

- `[BlobTrigger("input-images/{name}")]`: defines the input binding. The function triggers when a blob is added to the `input-images` container; `{name}` is a placeholder that captures the blob's name.
- `[Blob("output-logs/{name}.txt")] out string outputBlob`: defines the output binding. The string assigned to `outputBlob` is written to a blob named after the input blob (with a `.txt` extension appended) in the `output-logs` container. If `output-logs` doesn't exist, Azure Functions will create it.
- `Stream inputBlob`: receives the content of the triggered blob.
- `string name`: receives the name of the blob that triggered the function.

Ensure your Azure Storage Emulator is running, or that your `local.settings.json` is configured with a valid connection string.
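Inside the function body, `inputBlob` can be consumed like any other `Stream`. As a small sketch (the helper class and its name are ours, not part of any Azure SDK), here is one way to read a text blob into a string, assuming UTF-8 content:

```csharp
using System.IO;
using System.Text;

public static class BlobContentReader
{
    // Reads an entire blob stream into a string; assumes UTF-8 content.
    public static string ReadAllText(Stream blobStream)
    {
        using (var reader = new StreamReader(blobStream, Encoding.UTF8))
        {
            return reader.ReadToEnd();
        }
    }
}
```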
Run your function project from the root directory:
```bash
func start
```
The Azure Functions host will start and listen for triggers.
Go to your Azure Storage account in the Azure portal, navigate to the `input-images` container, and upload a small text file (e.g., `my-test-file.txt`). You can also use a tool like Azure Storage Explorer.
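Uploading from the command line is another option; the storage account name below is a placeholder:

```shell
echo "hello from the tutorial" > my-test-file.txt
az storage blob upload \
  --account-name mytutorialstorage123 \
  --container-name input-images \
  --name my-test-file.txt \
  --file my-test-file.txt \
  --auth-mode login
```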
Check the output in the terminal where `func start` is running. You should see log messages indicating that the blob was processed.
Now, navigate to the `output-logs` container in your storage account. You should find a new blob named `my-test-file.txt.txt` (the binding appends `.txt` to the full input blob name, extension included) containing the processing result message.
You can bind the input to different types:

- `Stream`: read the blob content as a stream.
- `byte[]`: read the blob content as a byte array.
- `string`: read the blob content as a string (Azure Functions will attempt to decode it).

Similarly, output bindings can write various types:

- `Stream`: write data to a stream.
- `byte[]`: write a byte array to a blob.
- `string`: write a string to a blob.

You can specify a fixed blob name or use parameters from the trigger:
```csharp
// Fixed blob name:
[Blob("my-fixed-container/my-file.txt")] out string outputBlob

// Name derived from a queue message (valid only in a queue-triggered function).
// Stream outputs are declared with FileAccess.Write rather than `out`:
[Blob("my-container/{queueTrigger}.txt", FileAccess.Write)] Stream outputBlob
```
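To illustrate, `{queueTrigger}` resolves to the text of the queue message, so a queue-triggered function can write each message to its own blob. This is a sketch under our own assumptions (a queue named `incoming-messages`, function and container names of our choosing), not a prescribed pattern:

```csharp
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class QueueToBlob
{
    [FunctionName("QueueToBlob")]
    public static void Run(
        [QueueTrigger("incoming-messages")] string queueItem,
        [Blob("my-container/{queueTrigger}.txt")] out string outputBlob,
        ILogger log)
    {
        // The blob name is derived from the queue message content itself.
        outputBlob = $"Received message: {queueItem}";
        log.LogInformation($"Wrote blob for message: {queueItem}");
    }
}
```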
Blob storage bindings can be used with HTTP triggers, Queue triggers, Timer triggers, etc., to read data from or write data to Blob Storage as part of your function's logic.
Tip: For more complex scenarios, like generating thumbnails or performing transformations, you would read the blob content into memory (e.g., using `Stream` or `byte[]`), perform your operations, and then write the transformed data to a different blob using an output binding.
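As a concrete (if contrived) instance of that read-transform-write pattern, here is a pure helper that "transforms" text content by upper-casing it; in a real function you would call something like this between the input stream and the output stream. The helper and its name are ours, not part of any Azure SDK:

```csharp
using System.IO;
using System.Text;

public static class BlobTransformer
{
    // Reads the input stream, applies a transformation, and writes the
    // result to the output stream. Assumes UTF-8 text content.
    public static void TransformToUpper(Stream input, Stream output)
    {
        using (var reader = new StreamReader(input, Encoding.UTF8))
        using (var writer = new StreamWriter(output, new UTF8Encoding(false), 1024, leaveOpen: true))
        {
            writer.Write(reader.ReadToEnd().ToUpperInvariant());
        }
    }
}
```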
You've successfully set up an Azure Function triggered by blob creations and used both input and output bindings to interact with Azure Blob Storage. This pattern is fundamental for building event-driven, serverless workflows in Azure.