Blob Input Bindings
Blob input bindings allow your Azure Function to read data directly from Azure Blob Storage. This simplifies common scenarios like processing files uploaded to a blob container.
When a blob input binding is configured, the function runtime automatically retrieves the specified blob and makes its content available as a parameter in your function code. This eliminates the need for manual SDK calls to download blob content.
How it Works
The binding maps a blob in your storage account to a parameter in your function. You specify the blob path as a combination of container name and blob name, and the runtime handles the rest (a rough SDK equivalent follows the list below):
- It connects to your Azure Storage account using connection string settings.
- It resolves the blob based on the provided path.
- It downloads the blob content.
- It passes the content to your function parameter in the appropriate type (e.g., string, byte array, stream).
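To put those steps in perspective, here is a rough sketch of the equivalent code you would otherwise write yourself with the Azure.Storage.Blobs SDK. The class, method, and setting names are illustrative, not part of the binding API:

using System;
using System.Threading.Tasks;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;

public static class ManualBlobDownload
{
    // Roughly what the input binding does for you: connect, resolve the blob, download its content.
    public static async Task<string> ReadBlobAsync(string containerName, string blobName)
    {
        // App settings (such as AzureWebJobsStorage) surface as environment variables in Functions.
        string connectionString = Environment.GetEnvironmentVariable("AzureWebJobsStorage");
        var blobClient = new BlobClient(connectionString, containerName, blobName);

        BlobDownloadResult download = (await blobClient.DownloadContentAsync()).Value;
        return download.Content.ToString();   // UTF-8 text content of the blob
    }
}

With the binding in place, none of this code appears in your function; the runtime performs it and hands you the result.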
You can use a blob trigger to invoke a function when a new blob is created, or you can use an input binding to read a specific blob within a function triggered by another event.
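For instance, here is a minimal sketch of an input binding paired with a queue trigger, where the queue message supplies the blob name. It follows the in-process C# model used elsewhere on this page; the queue and container names are illustrative:

using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class ReadBlobNamedInMessage
{
    // Illustrative sketch: the queue message text becomes {queueTrigger},
    // which the blob path expression uses to locate the blob to read.
    [FunctionName("ReadBlobNamedInMessage")]
    public static void Run(
        [QueueTrigger("blob-name-queue", Connection = "AzureWebJobsStorage")] string blobName,
        [Blob("samples-workitems/{queueTrigger}", Connection = "AzureWebJobsStorage")] string blobContents,
        ILogger log)
    {
        log.LogInformation($"Blob '{blobName}' contains {blobContents.Length} characters.");
    }
}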
Configuration
Blob input bindings are configured in your function's function.json file (for JavaScript, Python, and PowerShell) or declaratively in code, using attributes in C# and annotations in Java.
function.json Example:
{
  "bindings": [
    {
      "name": "myBlob",
      "type": "blob",
      "direction": "in",
      "path": "samples-workitems/{name}.txt",
      "connection": "AzureWebJobsStorage"
    }
  ]
}
Key properties:
- name: The name of the parameter in your function code.
- type: Must be "blob" for blob input.
- direction: Must be "in" for input bindings.
- path: Specifies the container and blob name. You can use binding expressions, such as {name}, which can be populated from a trigger or other bindings.
- connection: The name of an app setting that contains the Azure Storage connection string. Defaults to "AzureWebJobsStorage".
Attribute Example (C#):
[Blob("samples-workitems/{name}.txt", Connection = "AzureWebJobsStorage")] Stream myBlob
Examples
The examples below use a blob trigger to obtain the blob content; a blob input binding paired with another trigger delivers the content to your function parameter in the same way.
C#
Reading a blob as a string:
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class ReadBlob
{
    [FunctionName("ReadBlobContent")]
    public static void Run(
        [BlobTrigger("samples-workitems/{name}.txt", Connection = "AzureWebJobsStorage")] string myBlob,
        string name,
        ILogger log)
    {
        log.LogInformation($"C# blob trigger function processed blob\n Name: {name}\n Size: {myBlob.Length} bytes");
        log.LogInformation($"Blob Content: {myBlob}");
    }
}
Reading a blob as a stream:
using System.IO;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class ReadBlobStream
{
    [FunctionName("ReadBlobStreamContent")]
    public static void Run(
        [BlobTrigger("samples-workitems/{name}.txt", Connection = "AzureWebJobsStorage")] Stream myBlobStream,
        string name,
        ILogger log)
    {
        log.LogInformation($"C# blob trigger function processed blob\n Name: {name}");
        using (var reader = new StreamReader(myBlobStream))
        {
            string content = reader.ReadToEnd();
            log.LogInformation($"Blob Content: {content}");
        }
    }
}
JavaScript (Node.js)
Reading a blob as a string:
module.exports = async function (context, myBlob) {
    context.log('JavaScript blob trigger function processed blob');
    context.log('Name: ' + context.bindingData.name);
    context.log('Blob Size: ' + myBlob.length + ' Bytes');
    context.log('Blob Content: ' + myBlob);
};
Python
Reading a blob as a string:
import logging
import azure.functions as func


def main(myBlob: func.InputStream) -> None:
    logging.info(f"Python blob trigger function processed blob\n"
                 f"Name: {myBlob.name}\n"
                 f"Blob Size: {myBlob.length} Bytes")
    content = myBlob.read().decode('utf-8')
    logging.info(f"Blob Content: {content}")
Advanced Topics
- Binding Expressions: Use binding expressions such as {name}, {extension}, or custom parameters from other triggers to dynamically specify the blob path.
- MIME Types: For HTTP triggers, you can specify the MIME type of the blob content.
- Multiple Blobs: You can bind to collections of blobs, though this is less common for simple input scenarios.
- Error Handling: Implement robust error handling for cases where the blob might not exist or the connection fails; one approach is sketched below.
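As an illustration of that last point, here is a minimal sketch that checks for the blob before reading it. It assumes version 5.x of the Azure Storage blobs extension, which can bind the blob to a BlobClient; the queue and container names are illustrative:

using System.Threading.Tasks;
using Azure.Storage.Blobs;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class SafeBlobRead
{
    // Illustrative sketch: binding to BlobClient lets the function check for existence
    // before reading, instead of failing the invocation when the blob is missing.
    [FunctionName("ReadBlobIfExists")]
    public static async Task Run(
        [QueueTrigger("blob-name-queue", Connection = "AzureWebJobsStorage")] string blobName,
        [Blob("samples-workitems/{queueTrigger}", Connection = "AzureWebJobsStorage")] BlobClient blobClient,
        ILogger log)
    {
        var exists = await blobClient.ExistsAsync();
        if (!exists.Value)
        {
            log.LogWarning($"Blob '{blobName}' does not exist; skipping.");
            return;
        }

        var download = await blobClient.DownloadContentAsync();
        log.LogInformation($"Blob '{blobName}' content: {download.Value.Content}");
    }
}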