This tutorial guides you through integrating Azure Functions with Azure Storage services, specifically focusing on Blob Storage and Table Storage. You'll learn how to trigger functions based on storage events and how to bind to storage to read and write data.
We will build an Azure Function that:
- Triggers when a new blob is uploaded to an input container.
- Processes the blob's content (converting it to uppercase as a simple example).
- Writes the processed content to an output blob and records metadata in Table Storage.
You need a storage account to host your Blob Storage container and Table Storage. You can create one using the Azure portal, Azure CLI, or PowerShell.
Using Azure CLI:
az storage account create --name <storage-account-name> --resource-group <resource-group-name> --location <location> --sku Standard_LRS
Replace `<storage-account-name>`, `<resource-group-name>`, and `<location>` with your own values.
Inside your storage account, create a container to hold the incoming blob files.
Using Azure CLI:
az storage container create --name input-blobs --account-name <storage-account-name>
This creates a container named input-blobs.
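If you want to confirm the container was created, you can query for it. This is an optional check, using the same `<storage-account-name>` placeholder as above:
az storage container exists --name input-blobs --account-name <storage-account-name>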
You'll need the connection string to authenticate your Azure Function with your storage account.
Using Azure CLI:
az storage account show-connection-string --name <storage-account-name> --resource-group <resource-group-name> --output tsv
Copy the output connection string.
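As a convenience for the rest of this tutorial, you can also capture the connection string in an environment variable. The Azure CLI reads AZURE_STORAGE_CONNECTION_STRING automatically, so later az storage commands can omit --account-name. This is a sketch assuming a bash shell and the placeholders used above:
export AZURE_STORAGE_CONNECTION_STRING=$(az storage account show-connection-string --name <storage-account-name> --resource-group <resource-group-name> --output tsv)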
Open your terminal or command prompt and create a new Azure Functions project.
Using Azure Functions Core Tools (JavaScript example):
func init BlobStorageTriggerJS --javascript
cd BlobStorageTriggerJS
Using Azure Functions Core Tools (C# example):
func init BlobStorageTriggerCS --dotnet
cd BlobStorageTriggerCS
Add a new function that is triggered by blob creations.
Using Azure Functions Core Tools (JavaScript):
func new --template "Azure Blob Storage trigger" --name ProcessBlob
Using Azure Functions Core Tools (C#):
func new --template "Azure Blob Storage trigger" --name ProcessBlob
Your function.json (for JavaScript) or the project file and local.settings.json (for C#) will define the triggers and bindings. You need to set the storage connection string and the blob container name.
For JavaScript (function.json in the ProcessBlob folder):
{
  "scriptFile": "../run.js",
  "bindings": [
    {
      "name": "myBlob",
      "type": "blobTrigger",
      "direction": "in",
      "path": "input-blobs/{name}",
      "connection": "AzureWebJobsStorage"
    },
    {
      "name": "outputBlob",
      "type": "blob",
      "direction": "out",
      "path": "output-blobs/{name}.processed",
      "connection": "AzureWebJobsStorage"
    },
    {
      "name": "outputTable",
      "type": "table",
      "direction": "out",
      "tableName": "ProcessedBlobData",
      "connection": "AzureWebJobsStorage"
    }
  ]
}
For C#:
In your local.settings.json, add your connection string:
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "YOUR_STORAGE_CONNECTION_STRING",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet"
  }
}
In your function's C# file (e.g., ProcessBlob.cs), define the bindings using attributes:
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;
using Azure.Data.Tables;
using System.IO;
using System.Threading.Tasks;

public static class ProcessBlob
{
    [FunctionName("ProcessBlob")]
    public static async Task Run(
        [BlobTrigger("input-blobs/{name}", Connection = "AzureWebJobsStorage")] Stream blobStream,
        string name,
        [Blob("output-blobs/{name}.processed", FileAccess.Write, Connection = "AzureWebJobsStorage")] Stream outputBlobStream,
        [Table("ProcessedBlobData", Connection = "AzureWebJobsStorage")] IAsyncCollector<TableEntity> tableOutput,
        ILogger log)
    {
        // Capture the size up front; the StreamReader below disposes the stream when it is done.
        long originalSize = blobStream.Length;
        log.LogInformation($"C# blob trigger function processed blob\n Name: {name}\n Size: {originalSize} bytes");

        // Read the blob content
        string blobContent;
        using (var reader = new StreamReader(blobStream))
        {
            blobContent = await reader.ReadToEndAsync();
        }

        // Simulate processing (e.g., convert to uppercase)
        string processedContent = blobContent.ToUpper();

        // Write the processed content to the output blob
        using (var writer = new StreamWriter(outputBlobStream))
        {
            await writer.WriteAsync(processedContent);
        }

        // Record the processed data in Table Storage
        var tableEntity = new TableEntity("BlobData", name)
        {
            { "Content", processedContent },
            { "OriginalSize", originalSize },
            { "Processed", true }
        };
        await tableOutput.AddAsync(tableEntity);

        log.LogInformation($"Blob '{name}' processed. Output written to output-blobs/{name}.processed and Table Storage.");
    }
}
Remember to replace YOUR_STORAGE_CONNECTION_STRING in local.settings.json with your actual connection string.
If you are writing to an output blob container, ensure it exists. The table will be created automatically by Azure Functions if it doesn't exist.
Using Azure CLI:
az storage container create --name output-blobs --account-name <storage-account-name>
Update your run.js file for the ProcessBlob function.
module.exports = async function (context, myBlob) {
    context.log('JavaScript blob trigger function processed blob');
    context.log(`Name: ${context.bindingData.name}`);
    context.log(`Blob Size: ${myBlob.length} Bytes`);

    // Read blob content (myBlob arrives as a Buffer)
    const blobContent = myBlob.toString();

    // Simulate processing (e.g., convert to uppercase)
    const processedContent = blobContent.toUpperCase();

    // Write processed content to the output blob binding
    context.bindings.outputBlob = processedContent;

    // Write processed data to Table Storage via the output binding
    context.bindings.outputTable = {
        partitionKey: 'BlobData',
        rowKey: context.bindingData.name,
        Content: processedContent,
        OriginalSize: myBlob.length,
        Processed: true
    };

    context.log(`Blob '${context.bindingData.name}' processed. Output written to output-blobs/${context.bindingData.name}.processed and Table Storage.`);
};
In JavaScript, the blob content is passed as a Buffer. You can convert it to a string using myBlob.toString().
Navigate to your project directory in the terminal and start the Azure Functions host.
func start
The output will show the function host starting and listening for events.
Use Azure Storage Explorer or the Azure CLI to upload a text file (e.g., sample.txt) into the input-blobs container of your storage account.
Using Azure CLI:
echo "Hello Azure Functions Storage!" > sample.txt
az storage blob upload --file sample.txt --container-name input-blobs --name sample.txt --account-name <storage-account-name>
Check the terminal where the func start command is running. You should see logs indicating that your ProcessBlob function was triggered and executed successfully.
You can then verify the results:
- Check the output-blobs container for a file named sample.txt.processed containing the uppercase content.
- Check the ProcessedBlobData table in your storage account for a new entity with the processed data. You can use Azure Storage Explorer to view tables.
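If you prefer the CLI to Storage Explorer, you can verify both outputs with the commands below. This is a sketch assuming the resource names used throughout this tutorial:
az storage blob download --file sample.processed.txt --container-name output-blobs --name sample.txt.processed --account-name <storage-account-name>
az storage entity show --table-name ProcessedBlobData --partition-key BlobData --row-key sample.txt --account-name <storage-account-name>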
Once you're satisfied with the local testing, you can deploy your function app to Azure.
Using Azure Functions Core Tools:
func azure functionapp publish <function-app-name>
Make sure to configure your Function App's application settings in Azure to include the AzureWebJobsStorage connection string pointing to your storage account.
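One way to set this from the command line is shown below; a sketch, where <function-app-name> and <resource-group-name> are placeholders for your own resources:
az functionapp config appsettings set --name <function-app-name> --resource-group <resource-group-name> --settings "AzureWebJobsStorage=<your-storage-connection-string>"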
Never hardcode connection strings directly in your code. Always use application settings (local.settings.json locally, and Application Settings in Azure Function App) to manage secrets.