When dealing with files larger than 100 MB, multipart uploads can drastically improve reliability and speed. Below are some proven techniques:
1. Choose the Right Part Size
Most providers recommend a part size between 5 MB and 100 MB. Larger parts mean fewer HTTP requests, but parts that are too large increase memory pressure and make each failed part more expensive to retry.
const PART_SIZE = 10 * 1024 * 1024; // 10 MiB
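For example, the part boundaries can be computed up front. The sketch below assumes a browser File or Blob named file; splitIntoParts is an illustrative helper, not a provider API. Also note that many providers cap the total number of parts (10,000 on S3), which puts a floor on the part size for very large objects.

// Illustrative helper: split a File/Blob into PART_SIZE slices.
function splitIntoParts(file, partSize = PART_SIZE) {
  const parts = [];
  for (let offset = 0; offset < file.size; offset += partSize) {
    parts.push({
      partNumber: parts.length + 1,                // providers number parts starting at 1
      blob: file.slice(offset, offset + partSize), // slice() clamps to the end of the file
    });
  }
  return parts;
}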
2. Parallelize Uploads
Uploading parts concurrently can saturate bandwidth. Limit concurrency to avoid overwhelming the network.
const MAX_CONCURRENCY = 5;
const queue = [...parts]; // each worker pulls the next part until the queue is drained
await Promise.all(Array.from({ length: MAX_CONCURRENCY },
  async () => { while (queue.length) await uploadPart(queue.shift()); }));
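If the project already depends on a limiter such as p-limit, the same cap can be expressed more declaratively; a sketch assuming the parts array and uploadPart function from above:

import pLimit from 'p-limit';

const limit = pLimit(MAX_CONCURRENCY);
// Every part is scheduled immediately, but at most MAX_CONCURRENCY uploads run at once.
await Promise.all(parts.map(part => limit(() => uploadPart(part))));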
3. Implement Retry Logic
Transient errors happen. Retry failed parts with exponential back‑off.
async function uploadWithRetry(part, attempt = 1) {
  try {
    await uploadPart(part);
  } catch (e) {
    if (attempt < 4) {
      // Exponential back-off: wait 200 ms, 400 ms, then 800 ms before giving up.
      await new Promise(r => setTimeout(r, 2 ** attempt * 100));
      return uploadWithRetry(part, attempt + 1);
    }
    throw e;
  }
}
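To combine the retry wrapper with the bounded concurrency from step 2, substitute it for the bare uploadPart call (reusing the hypothetical limit helper shown earlier):

await Promise.all(parts.map(part => limit(() => uploadWithRetry(part))));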
4. Verify Integrity
Compute an MD5 or SHA‑256 hash for each part and include it in the request header.
openssl dgst -sha256 part-001.bin
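The same digest can be computed in code with Node's built-in crypto module. Note that S3-style APIs typically expect the digest base64-encoded (for example in the Content-MD5 or x-amz-checksum-sha256 headers), so check your provider's documentation for the exact header name and encoding. A minimal sketch:

import { createHash } from 'node:crypto';

// Base64-encoded SHA-256 digest of one part's bytes (a Buffer or Uint8Array).
function sha256Base64(partBytes) {
  return createHash('sha256').update(partBytes).digest('base64');
}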
Following these tips should reduce failed uploads and improve overall throughput.