Azure Functions – Table Storage bindings

Overview

Table storage bindings simplify interaction with Azure Table storage from your function. You can declaratively read entities (input binding) or insert/update/delete entities (output binding) without writing any storage SDK code.

Note: Table bindings are available for C#, JavaScript, Python, Java, and PowerShell.

Input binding

Use the input binding to retrieve a single entity (partition key plus row key) or a set of entities (partition key with an optional filter).

function.json
{
  "bindings": [
    {
      "type": "table",
      "direction": "in",
      "name": "entity",
      "tableName": "Customers",
      "partitionKey": "USA",
      "rowKey": "{queueTrigger}"
    }
  ]
}

In the example above, rowKey is bound to the text of the triggering queue message through the {queueTrigger} binding expression.
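
For context, the complete function.json for such a queue-triggered function also declares the trigger binding. A minimal sketch is shown below; the queue name customer-ids is an assumption:

function.json
{
  "bindings": [
    {
      "type": "queueTrigger",
      "direction": "in",
      "name": "queueItem",
      "queueName": "customer-ids",
      "connection": "AzureWebJobsStorage"
    },
    {
      "type": "table",
      "direction": "in",
      "name": "entity",
      "tableName": "Customers",
      "partitionKey": "USA",
      "rowKey": "{queueTrigger}",
      "connection": "AzureWebJobsStorage"
    }
  ]
}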

Output binding

The output binding lets you write an entity back to Table storage. You can also select InsertOrReplace or InsertOrMerge behavior through the operation property.

function.json
{
  "bindings": [
    {
      "type": "table",
      "direction": "out",
      "name": "$return",
      "tableName": "Orders",
      "partitionKey": "region",
      "rowKey": "orderId",
      "connection": "AzureWebJobsStorage"
    }
  ]
}
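
In a compiled C# function the same output binding is expressed with attributes rather than function.json; because the binding name is $return, the function's return value is what gets written. A minimal in-process sketch, in which the OrderEntity class, the incoming-orders queue, and the Status property are assumptions:

Function.cs
using System;
using Azure;
using Azure.Data.Tables;
using Microsoft.Azure.WebJobs;

public static class TableOutputReturnFunction
{
    [FunctionName("TableOutputReturnFunction")]
    [return: Table("Orders", Connection = "AzureWebJobsStorage")]
    public static OrderEntity Run([QueueTrigger("incoming-orders")] string orderId)
    {
        // The returned entity is written to the Orders table by the output binding.
        return new OrderEntity
        {
            PartitionKey = "region",   // static partition value, mirroring the function.json above
            RowKey = orderId,
            Status = "Received"        // hypothetical property for this sketch
        };
    }
}

public class OrderEntity : ITableEntity
{
    public string PartitionKey { get; set; }
    public string RowKey { get; set; }
    public DateTimeOffset? Timestamp { get; set; }
    public ETag ETag { get; set; }

    public string Status { get; set; }
}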

Binding properties

| Property | Required | Description |
| --- | --- | --- |
| type | Yes | Always table. |
| direction | Yes | in for input, out for output. |
| name | Yes | Variable name used in function code. |
| tableName | Yes | Target Azure Table name. |
| partitionKey | No | Static value or binding expression. |
| rowKey | No | Static value or binding expression. |
| connection | No | App setting name that contains the storage connection string. Defaults to AzureWebJobsStorage. |
| filter | No | OData filter expression for query-based input bindings. |
| take | No | Maximum number of entities to return (for query input). |
| operation | No | For output: Insert, InsertOrReplace, InsertOrMerge, Delete. |
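
As an illustration of the query-style properties, here is a sketch of an input binding that returns up to 50 matching entities; the filter expression and the Status property are assumptions:

function.json
{
  "bindings": [
    {
      "type": "table",
      "direction": "in",
      "name": "customers",
      "tableName": "Customers",
      "partitionKey": "USA",
      "filter": "Status eq 'Active'",
      "take": 50,
      "connection": "AzureWebJobsStorage"
    }
  ]
}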

Sample code

C# (in‑process)

Function.cs
using System;
using Azure;
using Azure.Data.Tables;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class TableInputFunction
{
    [FunctionName("TableInputFunction")]
    public static void Run(
        [QueueTrigger("customer-ids")] string queueItem,
        [Table("Customers", "USA", "{queueItem}", Connection = "AzureWebJobsStorage")] CustomerEntity entity,
        ILogger log)
    {
        if (entity != null)
        {
            log.LogInformation($"Customer {entity.RowKey} from {entity.PartitionKey}: {entity.Name}");
        }
        else
        {
            log.LogWarning($"No entity found for RowKey = {queueItem}");
        }
    }
}

public class CustomerEntity : ITableEntity
{
    public string PartitionKey { get; set; }
    public string RowKey { get; set; }
    public DateTimeOffset? Timestamp { get; set; }
    public ETag ETag { get; set; }

    public string Name { get; set; }
    public string Email { get; set; }
}
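
For the output side in in-process C#, you can bind to an IAsyncCollector<T> and add entities to it. A minimal sketch reusing the CustomerEntity class above; the new-customers queue name and the property values are assumptions:

Function.cs
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;

public static class TableOutputFunction
{
    [FunctionName("TableOutputFunction")]
    public static async Task Run(
        [QueueTrigger("new-customers")] string customerId,
        [Table("Customers", Connection = "AzureWebJobsStorage")] IAsyncCollector<CustomerEntity> collector)
    {
        // Each entity added to the collector is written to the Customers table
        // when the function completes.
        await collector.AddAsync(new CustomerEntity
        {
            PartitionKey = "USA",            // assumed partition for this sketch
            RowKey = customerId,
            Name = "New customer",           // placeholder value
            Email = "unknown@example.com"    // placeholder value
        });
    }
}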

JavaScript (Node.js)

index.js
module.exports = async function (context, myQueueItem) {
    const entity = context.bindings.entity; // Input binding
    if (entity) {
        context.log(`Fetched entity: ${entity.RowKey} - ${entity.Name}`);
        // Modify and write back using output binding
        entity.Email = "updated@example.com";
        context.bindings.outputEntity = entity; // Output binding named "outputEntity" in this function's function.json
    } else {
        context.log.warn(`No entity found for key ${myQueueItem}`);
    }
};
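
The corresponding function.json for this JavaScript function declares the queue trigger, the table input named entity, and the table output named outputEntity. A sketch, reusing the queue and table names from the earlier examples:

function.json
{
  "bindings": [
    {
      "type": "queueTrigger",
      "direction": "in",
      "name": "myQueueItem",
      "queueName": "customer-ids",
      "connection": "AzureWebJobsStorage"
    },
    {
      "type": "table",
      "direction": "in",
      "name": "entity",
      "tableName": "Customers",
      "partitionKey": "USA",
      "rowKey": "{queueTrigger}",
      "connection": "AzureWebJobsStorage"
    },
    {
      "type": "table",
      "direction": "out",
      "name": "outputEntity",
      "tableName": "Customers",
      "connection": "AzureWebJobsStorage"
    }
  ]
}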

Python (3.9)

__init__.py
import json
import logging

import azure.functions as func

def main(msg: func.QueueMessage, entity: str) -> str:
    # The table input binding is passed to the Python worker as a JSON string
    row = json.loads(entity)
    logging.info(f"Processing rowKey={row['RowKey']}")
    # Update a property
    row['Status'] = 'Processed'
    return json.dumps(row)  # Output binding (named $return in function.json)

FAQ

Can I query multiple entities?
Yes. Omit rowKey and supply a filter expression; use take to limit the number of results (see the query example under Binding properties).
How do I handle concurrency?
Use the entity's ETag for optimistic concurrency: an update that supplies a stale ETag is rejected by the storage service, whereas InsertOrReplace overwrites unconditionally (last write wins). In C# you can also bind to a TableClient for fine-grained control, as in the sketch below.
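
A minimal in-process C# sketch of an ETag-conditional update, assuming the Tables extension also lets you bind to a TableClient; the customer-updates queue, the USA partition, and the reuse of the CustomerEntity class from the earlier sample are assumptions:

Function.cs
using System.Threading.Tasks;
using Azure.Data.Tables;
using Microsoft.Azure.WebJobs;

public static class UpdateWithEtagFunction
{
    [FunctionName("UpdateWithEtagFunction")]
    public static async Task Run(
        [QueueTrigger("customer-updates")] string rowKey,
        [Table("Customers", Connection = "AzureWebJobsStorage")] TableClient client)
    {
        // Read the current entity, including its ETag.
        var response = await client.GetEntityAsync<CustomerEntity>("USA", rowKey);
        CustomerEntity entity = response.Value;

        entity.Email = "updated@example.com";

        // Passing the original ETag makes the update conditional: the service
        // rejects it with 412 Precondition Failed if another writer changed
        // the entity in the meantime.
        await client.UpdateEntityAsync(entity, entity.ETag, TableUpdateMode.Replace);
    }
}
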
Is there a limit on entity size?
Each entity can be up to 1 MiB. Large payloads should be stored in Blob storage and referenced from the table.