Azure Functions Blob Storage Bindings
This document details how to use Blob Storage bindings with Azure Functions to read from, write to, and trigger functions based on events in Azure Blob Storage.
Overview
Blob Storage bindings provide a declarative way to integrate your Azure Functions with Azure Blob Storage. They simplify common operations like reading blob content, writing new blobs, and executing functions when blobs are created or updated.
Input Bindings
Input bindings allow you to read the content of a blob directly into your function's parameters.
- Blob Content: Read the entire content of a blob into a string, byte array, or stream.
- Blob Metadata: Access metadata properties of a blob.
Example Configuration:
{
  "bindings": [
    {
      "name": "myBlob",
      "type": "blob",
      "direction": "in",
      "path": "samples-workitems/{name}.txt",
      "connection": "AzureWebJobsStorage"
    }
  ]
}
In C#, the function signature might look like:
public static void Run(Stream myBlob, string name, ILogger log)
{
    log.LogInformation($"C# function processed blob\n Name: {name}\n Size: {myBlob.Length} bytes");
    // Process the blob stream here
}
Output Bindings
Output bindings enable you to write data to blobs from your function.
- Blob Content: Write data from a function parameter (e.g., string, byte array) to a new blob.
Example Configuration:
{
  "bindings": [
    {
      "name": "outputBlob",
      "type": "blob",
      "direction": "out",
      "path": "output-container/{newFileName}.json",
      "connection": "AzureWebJobsStorage"
    }
  ]
}
In JavaScript, you would use the output binding like this:
module.exports = async function (context, req) {
    const outputBlob = {
        data: "This is the content to write to the blob.",
        timestamp: new Date().toISOString()
    };
    context.bindings.outputBlob = JSON.stringify(outputBlob);
    context.res = { status: 200, body: "Blob written successfully." };
};
Trigger Bindings
Blob trigger bindings cause a function to run in response to the creation or update of a blob in a specified container.
- Blob created or updated: The function executes when a new blob is uploaded or an existing blob is overwritten.
Example Configuration:
{
  "scriptFile": "index.js",
  "bindings": [
    {
      "name": "myBlob",
      "type": "blobTrigger",
      "direction": "in",
      "path": "input-container/{name}",
      "connection": "AzureWebJobsStorage"
    }
  ]
}
The {name} in the path parameter is a token that captures the name of the blob that triggered the function.
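Conceptually, the runtime matches the blob's path against the binding path and extracts the values of the `{token}` placeholders. The sketch below illustrates that matching with plain Python; it is not the actual Functions runtime implementation, and the `match_blob_path` helper is hypothetical.

```python
import re

def match_blob_path(pattern: str, blob_path: str):
    """Illustrative sketch: extract {token} captures from a blob path.
    Not the actual Azure Functions runtime implementation."""
    # Split the binding path on {token} placeholders, keeping the tokens.
    parts = re.split(r"(\{\w+\})", pattern)
    regex = ""
    for part in parts:
        if part.startswith("{") and part.endswith("}"):
            # Each placeholder becomes a named capture group.
            regex += f"(?P<{part[1:-1]}>[^/]+)"
        else:
            # Literal path segments are matched exactly.
            regex += re.escape(part)
    m = re.fullmatch(regex, blob_path)
    return m.groupdict() if m else None

print(match_blob_path("samples-workitems/{name}.txt",
                      "samples-workitems/report.txt"))
# → {'name': 'report'}
```

A blob path that does not fit the pattern (for example, one in a different container) yields no match, so the function is not triggered for it.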
Configuration
Blob Storage bindings require the following configuration properties:
- name: The name of the parameter in your function code that represents the blob.
- type: Must be blob for input/output bindings and blobTrigger for trigger bindings.
- direction: Either in, out, or inout.
- path: The path to the blob. This can include wildcard patterns like container/{name}.{extension}.
- connection: The name of the application setting that contains the Blob Storage connection string. Defaults to AzureWebJobsStorage.
The path property can include curly braces { } to capture parts of the blob name; the captured values can then be used as parameters in your function or for constructing output paths.
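For example, a value captured from a trigger path can be reused in an output path in the same function.json. The following sketch assumes illustrative container names (input-container and processed):

```json
{
  "bindings": [
    {
      "name": "myBlob",
      "type": "blobTrigger",
      "direction": "in",
      "path": "input-container/{name}.csv",
      "connection": "AzureWebJobsStorage"
    },
    {
      "name": "outputBlob",
      "type": "blob",
      "direction": "out",
      "path": "processed/{name}.json",
      "connection": "AzureWebJobsStorage"
    }
  ]
}
```

With this configuration, uploading input-container/sales.csv would trigger the function, and writing to outputBlob would create processed/sales.json.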
Examples
Reading Blob Content
// C# Example
public static void ReadBlobContent(
    Stream inputBlob,
    string name,
    ILogger log)
{
    log.LogInformation($"Blob name: {name}");
    using (var reader = new StreamReader(inputBlob))
    {
        string content = reader.ReadToEnd();
        log.LogInformation($"Blob content: {content}");
    }
}
Writing Blob Content
// JavaScript Example
module.exports = async function (context, req) {
    const blobContent = req.body;
    context.bindings.outputBlob = blobContent;
    context.res = {
        status: 200,
        body: "Successfully wrote blob content."
    };
};
Blob Trigger Example
// Python Example
import logging
import azure.functions as func
def main(myblob: func.InputStream, name: str):
    logging.info(f"Python blob trigger function processed blob\n"
                 f"Name: {name}\n"
                 f"Blob Size: {myblob.length} bytes")
    content = myblob.read().decode('utf-8')
    logging.info(f"Content: {content}")
Best Practices
- Use application settings for connection strings rather than hardcoding them.
- Leverage blob path patterns effectively to manage blob interactions.
- For large blobs, consider using streams or asynchronous operations to avoid memory issues.
- When using output bindings, be mindful of idempotency if your function might be retried.
- Consider using a blob binding in combination with other bindings, such as Queue Storage triggers, to process blobs asynchronously.
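To illustrate the idempotency point above: a retried function may receive the same blob event twice, so it helps to key the work on something stable like the blob name. The sketch below uses an in-memory set as the marker store purely for illustration; in practice you would use durable storage (for example, a table entry or a marker blob), and the `process_blob_once` helper is hypothetical.

```python
# Illustration only: a durable store should replace this in-memory set.
processed = set()

def process_blob_once(blob_name: str, content: bytes) -> bool:
    """Process a blob only if it has not been seen before.
    Returns True if processing ran, False if it was skipped."""
    if blob_name in processed:
        return False  # retry or duplicate event: skip safely
    # ... real processing of `content` would go here ...
    processed.add(blob_name)
    return True

print(process_blob_once("report.txt", b"data"))  # → True (first delivery)
print(process_blob_once("report.txt", b"data"))  # → False (retry skipped)
```

The same pattern makes a retried function safe: repeating the invocation produces no duplicate side effects.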