Multipart upload

How to upload large files faster using Optimizely Content Management Platform's multipart upload feature.

The multipart upload feature in Optimizely Content Management Platform (CMP) lets you do the following:

  • Upload large files by splitting them into smaller parts.
  • Upload parts in parallel for better performance.
  • Resume interrupted uploads.
  • Handle files up to 5 TB in size.

Size requirements

  • File size

    • Minimum – 5 MB + 1 B (5,242,881 bytes)
    • Maximum – 5 TB (5,497,558,138,880 bytes)
  • Part size

    • Minimum – 5 MB (5,242,880 bytes)
    • Maximum – 5 GB (5,368,709,120 bytes)
    • Default – 5 MB if not specified
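As a quick sanity check on these limits, the part count follows directly from the file and part sizes. The following sketch (the `partCountFor` helper is illustrative, not part of the CMP API) validates both sizes against the limits above and computes how many parts an upload needs:

```javascript
// Size limits from the list above, in bytes.
const MIN_FILE_SIZE = 5242881;        // 5 MB + 1 B
const MAX_FILE_SIZE = 5497558138880;  // 5 TB
const MIN_PART_SIZE = 5242880;        // 5 MB
const MAX_PART_SIZE = 5368709120;     // 5 GB

// Returns the number of parts required, or throws if a size is out of range.
function partCountFor(fileSize, partSize = MIN_PART_SIZE) {
    if (fileSize < MIN_FILE_SIZE || fileSize > MAX_FILE_SIZE) {
        throw new Error(`file_size ${fileSize} is outside the allowed range`);
    }
    if (partSize < MIN_PART_SIZE || partSize > MAX_PART_SIZE) {
        throw new Error(`part_size ${partSize} is outside the allowed range`);
    }
    // Every part is partSize bytes except the last, which may be smaller.
    return Math.ceil(fileSize / partSize);
}
```

For example, a 100 MiB file (104,857,600 bytes) with the default 5 MB part size divides into exactly 20 parts; one extra byte would add a 21st part.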

Upload process

Initiate the upload

Make a POST request to /v3/multipart-uploads with the following:

  • file_size – Total size of the file in bytes.
  • part_size – Size of each part in bytes (optional, defaults to 5 MB).

The response includes:

  • id – Unique identifier for the upload.
  • upload_part_urls – Array of pre-signed URLs for each part.
  • upload_part_count – Total number of parts.
  • expires_at – Expiration timestamp for the upload URLs.

Upload the file parts

  • Divide the file into equal parts of the specified part_size; the last part may be smaller than part_size.
  • Upload each part with a PUT request to its corresponding pre-signed URL from the upload_part_urls array. Do not include a Content-Type header in the PUT request.

Tips:

  • Use parallel uploads for better performance.
  • Use a retry mechanism to retry uploads that fail due to transient network issues.
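The retry tip can be implemented with a small generic wrapper. This is an illustrative sketch (`withRetry` is a hypothetical helper, not part of the CMP API); wrap each part's PUT request in it so transient network failures are retried with a short backoff:

```javascript
// Retries an async operation up to `retries` additional times, waiting
// `delayMs` (doubled after each attempt) between failures.
async function withRetry(operation, retries = 3, delayMs = 500) {
    let lastError;
    for (let attempt = 0; attempt <= retries; attempt++) {
        try {
            return await operation();
        } catch (error) {
            lastError = error;
            if (attempt < retries) {
                await new Promise(resolve => setTimeout(resolve, delayMs * 2 ** attempt));
            }
        }
    }
    throw lastError;
}

// Example usage with a part upload (url and chunk come from the steps above):
// await withRetry(() => fetch(url, { method: 'PUT', body: chunk }));
```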

Complete the upload and monitor the status

  1. Complete Upload
    Call POST /v3/multipart-uploads/{id}/complete after you upload all parts.
    A success response indicates the completion process has started.

  2. Monitor Status

    • Poll GET /v3/multipart-uploads/{id}/status to check progress.
    • Add a delay between polls (recommended: 1-2 seconds) to avoid exceeding the rate limit.
    • Continue until you reach the final status.
    • Use the returned key to register the file with other CMP resources.
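The polling steps above can be sketched as follows. `checkStatus` stands in for a GET /v3/multipart-uploads/{id}/status call (passing it in as a function keeps the sketch self-contained); the status and key field names match the status response described below:

```javascript
// Polls until the upload reaches a terminal status, waiting `delayMs`
// between checks. `checkStatus` is an async function that returns the
// status response body ({ status, key, status_message }).
async function waitForFinalStatus(checkStatus, delayMs = 1500) {
    for (;;) {
        const result = await checkStatus();
        if (result.status === 'UPLOAD_COMPLETION_SUCCEEDED') {
            return result.key; // use this key to register the file
        }
        if (result.status === 'UPLOAD_COMPLETION_FAILED') {
            throw new Error(`Upload failed: ${result.status_message}`);
        }
        // Still in progress: wait before polling again to respect rate limits.
        await new Promise(resolve => setTimeout(resolve, delayMs));
    }
}
```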

Status values

The upload status endpoint can return the following values:

  • UPLOAD_COMPLETION_NOT_STARTED – Completion process has not been initiated.
  • UPLOAD_COMPLETION_IN_PROGRESS – Upload is being processed.
  • UPLOAD_COMPLETION_SUCCEEDED – Upload completed successfully.
  • UPLOAD_COMPLETION_FAILED – Upload failed. Check status_message in the response for details.

Error handling

Common HTTP status codes

| Code | Description  | Solution                                   |
| ---- | ------------ | ------------------------------------------ |
| 401  | Unauthorized | Refresh or verify your API token.          |
| 403  | Forbidden    | Check permissions and URL expiration.      |
| 404  | Not Found    | Verify the upload ID.                      |
| 422  | Invalid Size | Check the file and part size requirements. |

Using the file key

After a successful upload, do the following:

  1. Save the returned file key.
  2. Use the key with other CMP APIs, such as POST /v3/assets to add the file to the library or POST /v3/campaigns/{id}/attachments to attach it to a campaign.

Code example

See the following JavaScript for a working example.

const fs = require('fs');
const fetch = require('node-fetch');


const API_BASE_URL = process.env.API_BASE_URL || 'https://api.cmp.optimizely.com';


class MultipartUploader {
    constructor(apiToken, partSize) {
        this.apiToken = apiToken;
        this.partSize = partSize;
    }

    async uploadLargeFile(file) {
        console.log(`Initiating multipart upload..., file size: ${file.size}, part size: ${this.partSize}`);
        const { uploadId, uploadUrls } = await this.initiateMultipartUpload(file.size);

        console.log('Uploading parts...');
        await this.uploadParts(file, uploadUrls);

        console.log('Completing upload...');
        const fileKey = await this.completeUpload(uploadId);

        console.log('Waiting for completion...');
        await this.waitForCompletion(uploadId);

        console.log('Upload complete. File key:', fileKey);
        return fileKey;
    }

    async getFileSize(filePath) {
        const stats = fs.statSync(filePath);
        return stats.size;
    }

    async initiateMultipartUpload(fileSize) {
        const response = await fetch(`${API_BASE_URL}/v3/multipart-uploads`, {
            method: 'POST',
            headers: {
                'Authorization': `Bearer ${this.apiToken}`,
                'Content-Type': 'application/json'
            },
            body: JSON.stringify({
                file_size: fileSize,
                part_size: this.partSize
            })
        });

        if (!response.ok) {
            const responseJson = await response.json();
            throw new Error(`Failed to initiate upload. response: ${JSON.stringify(responseJson)}`);
        }

        const result = await response.json();
        return {
            uploadId: result.id,
            uploadUrls: result.upload_part_urls,
            partCount: result.upload_part_count,
            expiresAt: result.expires_at
        };
    }

    async uploadParts(file, uploadUrls) {
        const uploadPromises = uploadUrls.map(async (url, index) => {
            const start = index * this.partSize;
            const end = Math.min(start + this.partSize, file.size);
            const chunk = await file.slice(start, end);

            const response = await fetch(url, {
                method: 'PUT',
                body: chunk,
            });

            if (!response.ok) {
                const responseText = await response.text();
                throw new Error(`Failed to upload part ${index + 1}, status: ${response.status}, statusText: ${response.statusText}, responseText: ${responseText}`);
            }

            console.log(`Part ${index + 1} uploaded successfully`);
        });

        // Upload all parts in parallel
        await Promise.all(uploadPromises);
    }

    async completeUpload(uploadId) {
        const response = await fetch(`${API_BASE_URL}/v3/multipart-uploads/${uploadId}/complete`, {
            method: 'POST',
            headers: {
                'Authorization': `Bearer ${this.apiToken}`
            }
        });

        if (!response.ok) {
            throw new Error('Failed to complete upload');
        }

        const result = await response.json();
        return result.key;
    }

    async checkUploadStatus(uploadId) {
        const response = await fetch(`${API_BASE_URL}/v3/multipart-uploads/${uploadId}/status`, {
            headers: {
                'Authorization': `Bearer ${this.apiToken}`
            }
        });

        if (!response.ok) {
            throw new Error('Failed to check upload status');
        }
        return await response.json();
    }

    async waitForCompletion(uploadId) {
        while (true) {
            const status = await this.checkUploadStatus(uploadId);
            console.log(`status: ${status.status}`);

            switch (status.status) {
                case 'UPLOAD_COMPLETION_SUCCEEDED':
                    return status.key;
                case 'UPLOAD_COMPLETION_FAILED':
                    throw new Error(`Upload failed: ${status.status_message}`);
                case 'UPLOAD_COMPLETION_IN_PROGRESS':
                    await new Promise(resolve => setTimeout(resolve, 1000));
                    continue;
                case 'UPLOAD_COMPLETION_NOT_STARTED':
                    throw new Error('Upload completion not initiated');
                default:
                    throw new Error(`Unexpected status: ${status.status}`);
            }
        }
    }
}

async function uploadFile(filePath, partSize, apiToken) {
    const uploader = new MultipartUploader(apiToken, partSize);

    const stats = fs.statSync(filePath);

    const file = {
        size: stats.size,
        slice: (start, end) => {
            return new Promise((resolve, reject) => {
                const chunk = [];
                const sliceStream = fs.createReadStream(filePath, { start, end: end - 1 });

                sliceStream.on('data', data => chunk.push(data));
                sliceStream.on('end', () => resolve(Buffer.concat(chunk)));
                sliceStream.on('error', reject);
            });
        }
    };

    try {
        const fileKey = await uploader.uploadLargeFile(file);
        return fileKey;
    } catch (error) {
        console.error('Upload failed:', error);
        throw error;
    }
}

async function addFileToLibrary(key, name, apiToken) {
    const response = await fetch(`${API_BASE_URL}/v3/assets`, {
        method: 'POST',
        headers: {
            'Authorization': `Bearer ${apiToken}`,
            'Content-Type': 'application/json'
        },
        body: JSON.stringify({
            key,
            title: name,
        })
    });

    if (!response.ok) {
        const responseJson = await response.json();
        throw new Error(`Failed to add file to library. response: ${JSON.stringify(responseJson)}`);
    }

    const result = await response.json();
    return result.id;
}

async function main() {
    const FILE_PATH = process.env.FILE_PATH;
    const PART_SIZE = parseInt(process.env.PART_SIZE, 10) || 5 * 1024 * 1024;
    const API_TOKEN = process.env.API_TOKEN;

    if (!FILE_PATH || !API_TOKEN) {
        throw new Error('FILE_PATH and API_TOKEN environment variables are required');
    }

    const key = await uploadFile(FILE_PATH, PART_SIZE, API_TOKEN);
    console.log('File uploaded successfully. Key:', key);

    console.log('Adding file to library...');
    const libraryAssetId = await addFileToLibrary(key, 'large-file', API_TOKEN);

    console.log('File added to library successfully. ID:', libraryAssetId);
}

main().catch(console.error);

Run the example with the following command:

FILE_PATH=sample-large-file.dat API_TOKEN=********** node multipart_uploader.js