Uploading Blobs to Azure Storage

Azure Blob Storage is a service for storing large amounts of unstructured data. A blob can hold any kind of text or binary data, such as images, documents, or streaming media. This document outlines the common methods for uploading blobs to your Azure Storage account.

Prerequisites

To follow the examples below, you need an Azure subscription, a storage account containing at least one blob container, and the Blob Storage SDK for your language (pip install azure-storage-blob for Python, npm install @azure/storage-blob for JavaScript).

Uploading a Single Block Blob

The most common scenario is uploading a single file as a block blob. Block blobs are optimized for storing large amounts of unstructured data, such as media files and application installation files.

Python Example (using Azure SDK)


from azure.storage.blob import BlobServiceClient

# Replace with your actual connection string
connection_string = "YOUR_AZURE_STORAGE_CONNECTION_STRING"
container_name = "my-container"
blob_name = "my-blob.txt"
file_path = "path/to/your/local/file.txt"

try:
    # Create the BlobServiceClient object
    blob_service_client = BlobServiceClient.from_connection_string(connection_string)

    # Get a client to interact with the container
    container_client = blob_service_client.get_container_client(container_name)

    # Create a blob client
    blob_client = container_client.get_blob_client(blob_name)

    # Open the file in binary mode
    with open(file_path, "rb") as data:
        # Upload the blob; overwrite=True replaces any existing blob with this name
        blob_client.upload_blob(data, overwrite=True)
        print(f"Blob '{blob_name}' uploaded successfully.")

except Exception as ex:
    print(f"Exception: {ex}")

Uploading a Large File (with Chunking)

For very large files, the SDKs handle chunking automatically: the file is split into blocks, the blocks are uploaded (optionally in parallel), and a final commit assembles them into the blob. Understanding this model is useful when tuning block size and upload concurrency, as the examples below show.

JavaScript Example (using @azure/storage-blob)


import { BlobServiceClient } from "@azure/storage-blob";
import { createReadStream } from "fs";

async function uploadFile(accountName, accountKey, containerName, blobName, filePath) {
    const connectionString = `DefaultEndpointsProtocol=https;AccountName=${accountName};AccountKey=${accountKey};EndpointSuffix=core.windows.net`;
    const blobServiceClient = BlobServiceClient.fromConnectionString(connectionString);
    const containerClient = blobServiceClient.getContainerClient(containerName);

    const blockBlobClient = containerClient.getBlockBlobClient(blobName);

    const fileStream = createReadStream(filePath);
    const uploadOptions = { blobHTTPHeaders: { blobContentType: "application/octet-stream" } };

    // Upload in 4 MiB blocks, with up to 5 block uploads in flight at once
    const bufferSize = 4 * 1024 * 1024;
    const maxConcurrency = 5;

    try {
        await blockBlobClient.uploadStream(fileStream, bufferSize, maxConcurrency, uploadOptions);
        console.log(`Blob '${blobName}' uploaded successfully.`);
    } catch (error) {
        console.error("Error uploading blob:", error);
    }
}

// Example usage:
// const accountName = "your_account_name";
// const accountKey = "your_account_key";
// const containerName = "your_container_name";
// const blobName = "large-file.bin";
// const filePath = "./path/to/your/large-file.bin";
// uploadFile(accountName, accountKey, containerName, blobName, filePath);

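The Python SDK exposes the same chunking controls. As a minimal sketch (the sizes below are illustrative, not recommendations), max_block_size and max_single_put_size are set when the client is created, and max_concurrency is passed to upload_blob:

from azure.storage.blob import BlobServiceClient

connection_string = "YOUR_AZURE_STORAGE_CONNECTION_STRING"

# Files larger than max_single_put_size are split into blocks of
# max_block_size, which are committed once all blocks have uploaded.
blob_service_client = BlobServiceClient.from_connection_string(
    connection_string,
    max_block_size=4 * 1024 * 1024,       # 4 MiB blocks
    max_single_put_size=8 * 1024 * 1024,  # single-request uploads up to 8 MiB
)
blob_client = blob_service_client.get_blob_client("my-container", "large-file.bin")

with open("path/to/your/large-file.bin", "rb") as data:
    # max_concurrency controls how many blocks upload in parallel
    blob_client.upload_blob(data, overwrite=True, max_concurrency=4)
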
Using Shared Access Signatures (SAS)

For more granular control and security, you can generate a Shared Access Signature (SAS) for your blobs or containers. This allows clients to upload data without needing direct access to storage account keys.

Security Note: Always restrict a SAS token's permissions and expiry time to the minimum your scenario requires.
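
As a sketch of the token-issuing step, the Python SDK's generate_blob_sas helper can mint a short-lived, write-only token (the account name, key, and blob names below are placeholders):

from datetime import datetime, timedelta, timezone
from azure.storage.blob import BlobClient, BlobSasPermissions, generate_blob_sas

account_name = "youraccountname"      # placeholder
account_key = "YOUR_ACCOUNT_KEY"      # placeholder
container_name = "my-container"
blob_name = "upload-target.txt"

# Write-only SAS that expires in 15 minutes
sas_token = generate_blob_sas(
    account_name=account_name,
    container_name=container_name,
    blob_name=blob_name,
    account_key=account_key,
    permission=BlobSasPermissions(create=True, write=True),
    expiry=datetime.now(timezone.utc) + timedelta(minutes=15),
)

# A client holding only this URL can upload this blob, but not read or delete it
sas_url = (f"https://{account_name}.blob.core.windows.net/"
           f"{container_name}/{blob_name}?{sas_token}")
blob_client = BlobClient.from_blob_url(sas_url)
blob_client.upload_blob(b"uploaded with a SAS token", overwrite=True)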

Common Upload Scenarios & Considerations

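Beyond the basics, a few points come up in most upload workloads: set an accurate Content-Type so blobs are served correctly, tune block size and concurrency for throughput, and plan for transient failures (the SDKs retry by default).

As an illustration of the first point, here is a minimal Python sketch (file and container names are placeholders) that sets the Content-Type at upload time via ContentSettings; without it, blobs are served as application/octet-stream:

from azure.storage.blob import BlobServiceClient, ContentSettings

connection_string = "YOUR_AZURE_STORAGE_CONNECTION_STRING"
blob_service_client = BlobServiceClient.from_connection_string(connection_string)
blob_client = blob_service_client.get_blob_client("my-container", "photo.png")

with open("path/to/photo.png", "rb") as data:
    # content_settings controls the Content-Type the blob is served with
    blob_client.upload_blob(
        data,
        overwrite=True,
        content_settings=ContentSettings(content_type="image/png"),
    )
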
Refer to the official Azure Storage SDK documentation for your specific programming language for detailed API references and more advanced upload options.