
Create a chunked uploader in JS and PHP to handle large files

Published on 4th August 2023

File uploads are a fundamental aspect of many web applications. However, handling large file uploads can be challenging. Users with slow or unstable internet connections can experience timeouts or broken connections. Server-side settings might limit the maximum file size for uploads. A solution to these problems is to split the file into smaller chunks and upload each piece separately. This process is known as chunked uploads.

In this guide, we will explain how to implement chunked uploads with JavaScript on the frontend and PHP on the backend.

This guide covers both the frontend and backend implementations, but if you're only interested in the frontend, we have a more comprehensive article on uploading large files in JS.

Frontend: JavaScript

Step 1: Splitting the File into Chunks

First, we need to split the file into smaller chunks. We can use the slice method of the Blob interface to achieve this. The slice method returns a new Blob object containing the data in the specified byte range. If the end index runs past the end of the file, slice clamps it, so the final chunk needs no special casing.

const CHUNK_SIZE = 1000000; // 1 MB per chunk
const file = document.getElementById('file-input').files[0];
const chunks = [];
let start = 0;

while (start < file.size) {
  // The final chunk may be smaller than CHUNK_SIZE
  chunks.push(file.slice(start, start + CHUNK_SIZE));
  start += CHUNK_SIZE;
}

Step 2: Uploading the Chunks

Next, we upload each chunk separately using the Fetch API, sending the file name and the chunk's index along with it so the server can reassemble the parts in order. Note that forEach fires all the requests in parallel; a sequential alternative follows the snippet.

chunks.forEach((chunk, index) => {
  const formData = new FormData();
  formData.append('file', chunk);
  formData.append('name', file.name);
  formData.append('index', index); // lets the server reassemble the parts in order

  fetch('/upload.php', {
    method: 'POST',
    body: formData
  }).then(response => {
    // Handle response
  }).catch(error => {
    // Handle error
  });
});
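Firing every request at once gives no single point where you know the whole upload has finished, and a failed chunk is easy to miss. Here is a minimal sequential sketch using async/await, assuming the same /upload.php endpoint and the chunks array from above; the total field is our own addition so the server can tell when it has every part:

async function uploadChunks(file, chunks) {
  for (const [index, chunk] of chunks.entries()) {
    const formData = new FormData();
    formData.append('file', chunk);
    formData.append('name', file.name);
    formData.append('index', index);
    formData.append('total', chunks.length); // assumed field: expected number of chunks

    const response = await fetch('/upload.php', { method: 'POST', body: formData });

    if (!response.ok) {
      throw new Error(`Chunk ${index} failed with status ${response.status}`);
    }
  }
}

Uploading sequentially is slower than uploading in parallel, but it makes error handling straightforward and gives you a natural place to report progress (index / chunks.length).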

Backend: PHP

On the server side, we will use PHP to handle the uploaded chunks. Our goal is to combine all chunks back into the original file.

Step 1: Handling the Uploaded Chunks

First, we need to save each chunk in a temporary directory.

<?php
$tempDir = '/path/to/temp/dir';

// basename() strips any path components from the client-supplied name,
// preventing path traversal (e.g. a name like "../../config.php")
$fileName = basename($_POST['name']);
$fileIndex = (int) $_POST['index'];

if (!is_dir($tempDir)) {
    mkdir($tempDir, 0777, true); // consider stricter permissions in production
}

move_uploaded_file($_FILES['file']['tmp_name'], "$tempDir/$fileName.part.$fileIndex");
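The frontend's .then and .catch callbacks only see what the server sends back, so it's worth returning an explicit status from upload.php. A small sketch; the JSON shape here is our own convention, not a fixed API:

// In upload.php, before move_uploaded_file()
if ($_FILES['file']['error'] !== UPLOAD_ERR_OK) {
    http_response_code(400);
    echo json_encode(['status' => 'error', 'index' => $fileIndex]);
    exit;
}

// ... move_uploaded_file() as above ...

echo json_encode(['status' => 'ok', 'index' => $fileIndex]);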

Step 2: Combining the Chunks

Once every chunk has arrived, we combine them back into the original file. (How the server can tell that the upload is complete is covered after this snippet.)

$finalDir = '/path/to/final/dir';

// Collect the part files for this upload; SORT_NATURAL ensures
// "part.10" sorts after "part.9" rather than after "part.1"
$fileParts = glob("$tempDir/$fileName.part.*");
sort($fileParts, SORT_NATURAL);

// Open in binary mode ('wb') so the data is written byte for byte
$finalFile = fopen("$finalDir/$fileName", 'wb');

foreach ($fileParts as $filePart) {
    fwrite($finalFile, file_get_contents($filePart));
    unlink($filePart); // delete the chunk once it has been appended
}

fclose($finalFile);
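The snippets above leave open how the server knows it has received every chunk. One common approach, and the one the total field in the sequential JavaScript sketch assumes, is to send the expected chunk count with each request and compare it against the parts already on disk. A rough sketch; combineChunks is a hypothetical wrapper around the combine code above:

// At the end of upload.php, after the chunk has been saved
$total = (int) $_POST['total'];
$received = count(glob("$tempDir/$fileName.part.*"));

if ($received === $total) {
    // All parts are present: reassemble the file
    combineChunks($tempDir, $finalDir, $fileName); // hypothetical helper wrapping the combine step
}

In production you would also want to guard against two requests reaching this branch at the same time, for example with a lock file.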

With this setup, you can upload large files efficiently and reliably, even on slow or unstable internet connections. Furthermore, because each chunk stays under the server's per-request limit, this approach sidesteps restrictions on the maximum upload size. However, please note that this is a simplified example. In a production environment, you should add proper error handling, security measures, and perhaps progress updates for a better user experience.

You may also be interested in how to read and serve large files to the browser using streaming.