Upload and Retrieve Blobs with Azure Storage
This guide provides detailed instructions on how to upload and retrieve blobs from Azure Blob Storage using various SDKs and tools.
Note: This document assumes you have an Azure subscription and a storage account. If not, please create them first.
Prerequisites
- An Azure account.
- An Azure Storage account.
- The Azure CLI installed (optional, for some examples).
- A supported SDK installed (e.g., .NET, Java, Python, Node.js).
Uploading Blobs
You can upload blobs to Azure Blob Storage programmatically using the Azure Storage SDKs or via tools like the Azure CLI or Azure Storage Explorer.
Using the Azure CLI
The Azure CLI provides a simple command to upload files.
az storage blob upload \
    --account-name <storage-account-name> \
    --container-name <container-name> \
    --name <blob-name> \
    --file <local-file-path> \
    --auth-mode login
Replace the placeholders <storage-account-name>, <container-name>, <blob-name>, and <local-file-path> with your own values.
Using the .NET SDK
Here's a C# example for uploading a blob:
using Azure.Storage.Blobs;
using System;
using System.IO;
using System.Threading.Tasks;

public class BlobUploader
{
    public static async Task UploadFileAsync(string connectionString, string containerName, string blobName, string filePath)
    {
        // Create clients for the storage account, the container, and the target blob.
        BlobServiceClient blobServiceClient = new BlobServiceClient(connectionString);
        BlobContainerClient containerClient = blobServiceClient.GetBlobContainerClient(containerName);
        BlobClient blobClient = containerClient.GetBlobClient(blobName);

        // Upload the file, overwriting the blob if it already exists.
        using (FileStream uploadFileStream = File.OpenRead(filePath))
        {
            await blobClient.UploadAsync(uploadFileStream, overwrite: true);
            Console.WriteLine($"Successfully uploaded blob '{blobName}' to container '{containerName}'.");
        }
    }
}
Using the Python SDK
Here's the equivalent example in Python:
from azure.storage.blob import BlobServiceClient

def upload_blob(connection_string, container_name, blob_name, file_path):
    # Create clients for the storage account and the target container.
    blob_service_client = BlobServiceClient.from_connection_string(connection_string)
    container_client = blob_service_client.get_container_client(container_name)

    # Upload the file, overwriting the blob if it already exists.
    with open(file_path, "rb") as data:
        container_client.upload_blob(name=blob_name, data=data, overwrite=True)
    print(f"Successfully uploaded blob '{blob_name}' to container '{container_name}'.")
Retrieving Blobs
Downloading blobs is equally straightforward. You can download the entire blob content or stream it.
Using the Azure CLI
az storage blob download \
    --account-name <storage-account-name> \
    --container-name <container-name> \
    --name <blob-name> \
    --file <local-file-path> \
    --auth-mode login
Using the .NET SDK
using Azure.Storage.Blobs;
using System;
using System.IO;
using System.Threading.Tasks;

public class BlobDownloader
{
    public static async Task DownloadFileAsync(string connectionString, string containerName, string blobName, string filePath)
    {
        // Create clients for the storage account, the container, and the source blob.
        BlobServiceClient blobServiceClient = new BlobServiceClient(connectionString);
        BlobContainerClient containerClient = blobServiceClient.GetBlobContainerClient(containerName);
        BlobClient blobClient = containerClient.GetBlobClient(blobName);

        // File.Create truncates any existing file so stale bytes are not left behind.
        using (FileStream downloadFileStream = File.Create(filePath))
        {
            await blobClient.DownloadToAsync(downloadFileStream);
            Console.WriteLine($"Successfully downloaded blob '{blobName}' to '{filePath}'.");
        }
    }
}
Using the Python SDK
from azure.storage.blob import BlobServiceClient

def download_blob(connection_string, container_name, blob_name, file_path):
    # Create clients for the storage account and the source container.
    blob_service_client = BlobServiceClient.from_connection_string(connection_string)
    container_client = blob_service_client.get_container_client(container_name)

    # Read the whole blob into memory and write it to the local file.
    with open(file_path, "wb") as download_file:
        download_stream = container_client.download_blob(blob_name)
        download_file.write(download_stream.readall())
    print(f"Successfully downloaded blob '{blob_name}' to '{file_path}'.")
Tip: For large files, consider using streaming downloads to manage memory efficiently.
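As a rough sketch of that approach, the Python SDK's StorageStreamDownloader exposes a chunks() iterator, so a variant of the function above (hypothetical name download_blob_streaming) can write the blob to disk without holding it all in memory:
from azure.storage.blob import BlobServiceClient

def download_blob_streaming(connection_string, container_name, blob_name, file_path):
    # Stream the blob chunk by chunk instead of calling readall(), keeping memory usage bounded.
    blob_service_client = BlobServiceClient.from_connection_string(connection_string)
    blob_client = blob_service_client.get_blob_client(container=container_name, blob=blob_name)
    with open(file_path, "wb") as download_file:
        downloader = blob_client.download_blob()
        for chunk in downloader.chunks():
            download_file.write(chunk)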
Error Handling and Best Practices
- Always implement robust error handling (e.g., try-catch blocks) for network issues or permission errors.
- Use Shared Access Signatures (SAS) or Microsoft Entra ID (formerly Azure Active Directory) for authentication rather than account keys where possible (see the sketch after this list).
- Consider using blob index tags for efficient filtering and querying.
- Optimize for performance by choosing appropriate blob types (Block, Append, Page).
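As a minimal sketch of token-based authentication in Python (assuming the azure-identity package is installed and your identity has a data-plane role such as Storage Blob Data Contributor on the account), you can construct the client with DefaultAzureCredential instead of a connection string; the account URL below is a placeholder:
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

# Placeholder account URL; substitute your own storage account name.
account_url = "https://<storage-account-name>.blob.core.windows.net"

# DefaultAzureCredential picks up environment credentials, a managed identity,
# or your local Azure CLI login, so no secrets appear in the code.
blob_service_client = BlobServiceClient(account_url, credential=DefaultAzureCredential())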
Warning: Keep connection strings and access keys secure and never hardcode them in production code. Store secrets in Azure Key Vault or use managed identities instead.