Enable and Use Blob Storage Logging

Learn how to enable and configure logging for Azure Blob Storage to monitor and troubleshoot access and operations on your blob data.

What is Blob Logging?

Azure Blob Storage logging provides detailed information about requests made to your storage account. This includes information about authenticated and anonymous requests, operation types, the status code of the operation, and other relevant details. Logging is essential for auditing, security analysis, troubleshooting, and understanding usage patterns of your blob data.

Note: Blob logging is distinct from diagnostic settings for metrics. Logging captures individual request details, while metrics provide aggregated performance data.

Enabling Blob Logging

You can enable and configure blob logging through the Azure portal or Azure CLI.

Using the Azure Portal

Follow these steps to enable logging for your storage account:

  1. Navigate to your storage account in the Azure portal.
  2. In the left-hand menu, under the "Monitoring" section, select Diagnostic settings.
  3. Select blob from the list of resources, then click Add diagnostic setting.
  4. Provide a name for the diagnostic setting.
  5. Under "Logs", select the categories you want to log. For blob operations, you'll typically want StorageRead, StorageWrite, and StorageDelete.
  6. Under "Destination details", choose where to send the logs. Common options include:
    • Send to Log Analytics workspace (recommended for analysis)
    • Archive to a storage account
    • Stream to an event hub
  7. Click Save.

Using Azure CLI

You can use the Azure CLI to enable logging. First, ensure the Azure CLI is installed and that you're signed in with az login.

To set up diagnostic settings to send logs to a Log Analytics workspace:


az monitor diagnostic-settings create \
    --name "MyStorageAccountDiagnosticSettings" \
    --resource "/subscriptions/{subscription-id}/resourceGroups/{resource-group-name}/providers/Microsoft.Storage/storageAccounts/{storage-account-name}/blobServices/default" \
    --workspace "/subscriptions/{subscription-id}/resourceGroups/{resource-group-name}/providers/Microsoft.OperationalInsights/workspaces/{log-analytics-workspace-name}" \
    --logs '[
        {"category": "StorageRead", "enabled": true},
        {"category": "StorageWrite", "enabled": true},
        {"category": "StorageDelete", "enabled": true}
    ]'

Note the /blobServices/default suffix on the --resource ID: the StorageRead, StorageWrite, and StorageDelete categories are defined on the blob service sub-resource, not on the storage account itself. Per-category retention policies apply only when logs are archived to a storage account, so none is set here.

Replace the placeholders like {subscription-id}, {resource-group-name}, {storage-account-name}, and {log-analytics-workspace-name} with your actual values.
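Rather than assembling those resource IDs by hand, a small shell sketch like the following can build them from a few variables (the subscription, group, and account names below are hypothetical placeholders); you could equally fetch the real IDs with az storage account show --query id and az monitor log-analytics workspace show --query id.

```shell
# Hypothetical values -- substitute your own.
SUBSCRIPTION_ID="00000000-0000-0000-0000-000000000000"
RESOURCE_GROUP="my-resource-group"
STORAGE_ACCOUNT="mystorageaccount"
WORKSPACE="my-log-analytics-workspace"

# Diagnostic settings for blob logs target the blob service sub-resource.
STORAGE_RESOURCE_ID="/subscriptions/${SUBSCRIPTION_ID}/resourceGroups/${RESOURCE_GROUP}/providers/Microsoft.Storage/storageAccounts/${STORAGE_ACCOUNT}/blobServices/default"
WORKSPACE_RESOURCE_ID="/subscriptions/${SUBSCRIPTION_ID}/resourceGroups/${RESOURCE_GROUP}/providers/Microsoft.OperationalInsights/workspaces/${WORKSPACE}"

echo "$STORAGE_RESOURCE_ID"
echo "$WORKSPACE_RESOURCE_ID"
```

These two values can then be passed to --resource and --workspace respectively.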

Log Data Details

Each log entry typically contains the following key information:

  • time: The UTC timestamp of the request.
  • category: The log category (StorageRead, StorageWrite, or StorageDelete).
  • operationName: The type of operation performed (e.g., GetBlob, PutBlob, ListContainers).
  • callerIpAddress: The IP address (and port) of the client making the request.
  • accountName: The name of the storage account.
  • objectKey: The key of the requested object, i.e., the container or blob path.
  • correlationId: A unique identifier used to correlate the request across logs.
  • statusCode and statusText: The HTTP status code and outcome of the operation (e.g., Success, ClientThrottlingError, ServerBusy).
  • identity: The type of authentication used (e.g., SAS, AccountKey, OAuth, Anonymous).

Tip: For a comprehensive list of logged fields, refer to the official Azure Storage logging documentation.
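Once archived log files are on disk, even plain shell tools give a quick first read. The snippet below is a minimal sketch against a hand-made sample file whose records merely mimic the fields above (the values are invented for illustration, not real log output):

```shell
# A toy, newline-delimited sample in the rough shape of archived blob log
# records. Field values here are illustrative, not captured output.
cat > sample-log.json <<'EOF'
{"time":"2023-10-26T10:01:12Z","operationName":"GetBlob","statusText":"Success","callerIpAddress":"203.0.113.10:49152"}
{"time":"2023-10-26T10:02:47Z","operationName":"PutBlob","statusText":"Success","callerIpAddress":"203.0.113.11:49153"}
{"time":"2023-10-26T10:03:05Z","operationName":"GetBlob","statusText":"ServerBusy","callerIpAddress":"203.0.113.10:49154"}
EOF

# Count GetBlob requests, and flag entries that did not succeed.
GETBLOB_COUNT=$(grep -c '"operationName":"GetBlob"' sample-log.json)
FAILED=$(grep -v '"statusText":"Success"' sample-log.json | grep -c '"operationName"')
echo "GetBlob requests: $GETBLOB_COUNT"
echo "Non-success entries: $FAILED"
```

For anything beyond a quick scan, a JSON-aware tool or Log Analytics (below) is a better fit than pattern matching.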

Analyzing Logs

Once logging is enabled, you need to analyze the generated logs to gain insights.

Downloading Logs

If you chose to archive logs to a storage account, you can download the log files. Diagnostic settings write one container per log category, with names like insights-logs-storageread, under a path that encodes the resource ID plus the year, month, day, and hour. (The $logs container is used only by the older classic Storage Analytics logging, not by diagnostic settings.)

You can use tools like Azure Storage Explorer or the Azure CLI to download these logs.


az storage blob download-batch \
    --account-name {storage-account-name} \
    --destination . \
    --source insights-logs-storageread \
    --pattern "*/y=2023/m=10/d=26/h=10/*"

Using Azure Log Analytics

Sending logs to a Log Analytics workspace is the most powerful way to analyze them. You can write Kusto Query Language (KQL) queries to filter, aggregate, and visualize your log data.

Here's a sample KQL query to find all successful GetBlob operations:


StorageBlobLogs
| where OperationName == "GetBlob"
| where StatusText == "Success"
| project TimeGenerated, CallerIpAddress, AccountName, ObjectKey, AuthenticationType, DurationMs
| order by TimeGenerated desc

You can also create dashboards and alerts based on your log data within Log Analytics.
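Queries like the one above don't have to run in the portal: the CLI can submit them too, via az monitor log-analytics query (which may require the log-analytics CLI extension and takes the workspace's customer GUID rather than its full resource ID). A hedged sketch, with the workspace ID as a placeholder:

```shell
# Hypothetical workspace customer ID (a GUID, not the full resource ID).
WORKSPACE_GUID="00000000-0000-0000-0000-000000000000"

# Count blob operations per type.
KQL='StorageBlobLogs
| summarize Requests = count() by OperationName
| order by Requests desc'

echo "$KQL"

# Uncomment to run against a real workspace
# (may first require: az extension add -n log-analytics):
# az monitor log-analytics query --workspace "$WORKSPACE_GUID" --analytics-query "$KQL" --timespan P1D
```

This makes it straightforward to script recurring checks alongside the rest of your CLI tooling.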

Best Practices

  • Enable logging for all relevant operations: Ensure you capture read, write, and delete operations for comprehensive auditing.
  • Choose an appropriate destination: Log Analytics is recommended for real-time analysis and alerting. Archiving to storage is good for long-term retention.
  • Configure retention policies: Set appropriate retention periods for your logs based on compliance and operational needs.
  • Regularly review logs: Proactively monitor logs for suspicious activity, performance issues, and access patterns.
  • Secure your log data: Ensure that the storage account or Log Analytics workspace where logs are stored is adequately protected.