Multipart Upload Tips for Large Files

Posted by Jane Doe • 3 days ago
Tags: multipart upload, cloud-storage, performance

For files larger than roughly 100 MB, multipart uploads can drastically improve both reliability and speed: a failed part can be retried on its own, and parts can move over the wire in parallel. Below are some proven techniques:

1. Choose the Right Part Size

Most providers recommend a part size between 5 MB and 100 MB (AWS S3, for instance, enforces a 5 MiB minimum for every part except the last). Larger parts mean fewer HTTP requests, but they also raise per-part memory usage and make each retried part more expensive.

const PART_SIZE = 10 * 1024 * 1024; // 10 MiB
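
As a sanity check before starting the upload, the part count follows directly from the file size. The 10,000-part ceiling below is AWS S3's limit; other providers set their own, so treat it as an assumption:

const MAX_PARTS = 10000; // AWS S3's cap; check your provider's documentation

function partCount(fileSize, partSize = PART_SIZE) {
    const count = Math.ceil(fileSize / partSize);
    if (count > MAX_PARTS) {
        throw new Error(`Needs ${count} parts; increase PART_SIZE to stay under ${MAX_PARTS}`);
    }
    return count;
}

partCount(1024 ** 3); // a 1 GiB file at 10 MiB per part -> 103 parts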

2. Parallelize Uploads

Uploading several parts at once helps saturate the available bandwidth. Cap the number of in-flight requests, though, or you risk overwhelming the network and your memory with open connections.

const MAX_CONCURRENCY = 5;
let next = 0; // shared cursor into parts; safe because awaits resume on a single JS thread
const worker = async () => { while (next < parts.length) await uploadPart(parts[next++]); };
await Promise.all(Array.from({ length: MAX_CONCURRENCY }, worker)); // five workers drain the queue
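
Each worker repeatedly claims the next index from the shared cursor, so every part gets uploaded while at most MAX_CONCURRENCY requests are in flight at once. In practice gains flatten once the link is saturated, so tune the limit against your own measurements rather than raising it blindly.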

3. Implement Retry Logic

Transient errors happen. Retry failed parts with exponential back‑off.

async function uploadWithRetry(part, attempt = 1) {
    try {
        return await uploadPart(part); // resolve with the provider's response for this part
    } catch (e) {
        if (attempt < 4) {
            // Exponential back-off: wait 200 ms, 400 ms, then 800 ms between attempts.
            await new Promise(r => setTimeout(r, 2 ** attempt * 100));
            return uploadWithRetry(part, attempt + 1);
        }
        throw e; // give up after the fourth failed attempt
    }
}
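
Tying the pool and the retry helper together, here's a minimal sketch that runs every part through uploadWithRetry under the concurrency cap and collects results in part order. It assumes uploadPart resolves with the part's ETag, as S3-style APIs return one per part, which the final complete-multipart-upload call needs:

async function uploadAllParts(parts) {
    const etags = new Array(parts.length); // results kept in part order
    let next = 0;
    const worker = async () => {
        while (next < parts.length) {
            const i = next++; // claim the next part index
            etags[i] = await uploadWithRetry(parts[i]);
        }
    };
    await Promise.all(Array.from({ length: MAX_CONCURRENCY }, worker));
    return etags; // hand these to the provider's complete-multipart-upload call
}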

4. Verify Integrity

Compute an MD5 or SHA-256 digest of each part and send it with the part's request so the server can reject corrupted bytes. From the shell:

openssl dgst -sha256 part-001.bin
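
In Node, the same digest is available from the built-in crypto module. The header name is provider-specific; the base64-encoded x-amz-checksum-sha256 form used below is S3's, and partBuffer (the part's bytes) is a hypothetical variable:

const { createHash } = require('node:crypto');

// Base64-encoded SHA-256 digest of a part's bytes.
function partChecksum(buf) {
    return createHash('sha256').update(buf).digest('base64');
}

// Assumed S3-style header; check your provider's documentation.
const headers = { 'x-amz-checksum-sha256': partChecksum(partBuffer) };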

Following these tips should reduce failed uploads and improve overall throughput.

Comments (3)

2 days ago
Great summary! I found that using a part size of 8 MiB works best for my network.

1 day ago
Don't forget to enable checksum verification on the server side. It saved me a lot of headaches.

5 hours ago
I tried this approach with AWS S3 and had to increase the timeout for each part. Otherwise the request timed out.