Azure Table Storage Bindings
Integrate Azure Table Storage with Azure Functions for efficient NoSQL data operations.
Overview
Azure Functions provides powerful input and output bindings for Azure Table Storage, enabling seamless interaction with your tables directly from your function code. You can read entities from a table as input, or write entities to a table as an output.
Input Bindings
You can use Table Storage as an input binding to retrieve a single entity or a collection of entities from a table.
Retrieving a Single Entity
To retrieve a single entity, you typically specify the partition key and row key.
Example: C#
// function.json
{
    "bindings": [
        {
            "name": "inputBlob",
            "type": "blobTrigger",
            "direction": "in",
            "path": "input/{name}.txt",
            "connection": "AzureWebJobsStorage"
        },
        {
            "name": "tableEntity",
            "type": "table",
            "direction": "in",
            "tableName": "MyTable",
            "partitionKey": "MyPartition",
            "rowKey": "{name}",
            "connection": "AzureWebJobsStorage"
        },
        {
            "name": "outputBlob",
            "type": "blob",
            "direction": "out",
            "path": "output/{name}.txt",
            "connection": "AzureWebJobsStorage"
        }
    ]
}
// MyFunction.cs
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;
using System.IO;

// POCO representing the table entity. A single-entity table input binds to a
// POCO or ITableEntity type, not a plain string. Text is a hypothetical custom column.
public class MyEntity
{
    public string PartitionKey { get; set; }
    public string RowKey { get; set; }
    public string Text { get; set; }
}

public static class MyFunction
{
    public static void Run(Stream inputBlob, MyEntity tableEntity, TraceWriter log)
    {
        log.Info("C# blob trigger function processed");
        log.Info($"Table entity: {tableEntity.PartitionKey}/{tableEntity.RowKey}");
        // Process tableEntity here
    }
}
Retrieving a Collection of Entities
For more complex queries, or to retrieve multiple entities at once, you can supply an OData filter expression on the input binding.
Example: Python
# function_app.py
import json
import logging

import azure.functions as func

app = func.FunctionApp()

@app.blob_trigger(arg_name="inputBlob",
                  path="input/{name}.txt",
                  connection="AzureWebJobsStorage")
@app.table_input(arg_name="entities",
                 table_name="MyTable",
                 partition_key="MyPartition",
                 filter="RowKey ge '100'",  # example OData filter
                 connection="AzureWebJobsStorage")
@app.blob_output(arg_name="outputBlob",
                 path="output/{name}.txt",
                 connection="AzureWebJobsStorage")
def main(inputBlob: func.InputStream, entities: str, outputBlob: func.Out[str]):
    logging.info("Python blob trigger function processed")
    # The table input arrives as a JSON string containing an array of entities.
    for entity in json.loads(entities):
        logging.info(f"Entity: {entity}")
    outputBlob.set("Processed")
Output Bindings
You can use Table Storage as an output binding to write entities to a table.
Example: JavaScript
// index.js
module.exports = async function (context, myQueueItem) {
    context.log('JavaScript queue trigger function processed work item', myQueueItem);

    const tableInsert = {
        partitionKey: "Orders",
        rowKey: String(myQueueItem.orderId), // RowKey must be a string
        customerName: myQueueItem.customer,
        orderDate: new Date().toISOString(),
        amount: myQueueItem.total
    };

    context.bindings.tableOutput = tableInsert;
    context.log('Inserted into table.');
};
// function.json
{
    "bindings": [
        {
            "name": "myQueueItem",
            "type": "queueTrigger",
            "direction": "in",
            "queueName": "orders",
            "connection": "AzureWebJobsStorage"
        },
        {
            "name": "tableOutput",
            "type": "table",
            "direction": "out",
            "tableName": "Orders",
            "connection": "AzureWebJobsStorage"
        }
    ]
}
Configuration Options
Here are common configuration properties for Table Storage bindings:
| Property | Description | Required |
|---|---|---|
| tableName | The name of the table to interact with. | Yes |
| partitionKey | The partition key for single-entity operations. Can use binding expressions. | No (unless retrieving a single entity) |
| rowKey | The row key for single-entity operations. Can use binding expressions. | No (unless retrieving a single entity) |
| filter | An OData filter expression for querying entities (input bindings only). | No |
| connection | The name of an app setting that contains the Azure Storage connection string. | Yes |
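Because partitionKey and rowKey accept binding expressions, both keys can be resolved from trigger metadata at runtime. A sketch of such a binding (the table name and expression names here are hypothetical, not taken from any particular app):

```json
{
    "name": "order",
    "type": "table",
    "direction": "in",
    "tableName": "Orders",
    "partitionKey": "{customerId}",
    "rowKey": "{orderId}",
    "connection": "AzureWebJobsStorage"
}
```

Here {customerId} and {orderId} would be populated from the trigger payload or route data, so each invocation reads a different entity without code changes.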
Best Practices
- Design your table schema with appropriate partition and row keys to optimize query performance.
- Use filtering expressions for input bindings to retrieve only the data you need.
- Consider using Table Storage for scenarios that require high availability and massive scalability for structured NoSQL data.
For more detailed information, refer to the official Azure Table Storage documentation.