AJAX - Multipart Data: File Uploads with AJAX

Uploading files from the browser to a server via AJAX usually uses the `multipart/form-data` encoding. Below I'll explain how it works, how to implement it on the client and server, advanced techniques (progress, chunking, resumable uploads, presigned uploads), security and best practices, with concrete code examples.
1) What is multipart/form-data (high level)

- `multipart/form-data` is an HTTP request encoding for sending form fields and binary data (files) together in a single request.
- The request body is split into "parts" separated by a boundary string. Each part has its own headers and body (a field value or binary file content).
- Browsers generate the boundary and set the `Content-Type: multipart/form-data; boundary=----...` header when you send a `FormData` object; you should not set this header manually (the boundary must match).
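For illustration, a multipart body for one text field and one file looks roughly like this on the wire (the boundary value here is made up; real browsers generate their own):

```
POST /upload HTTP/1.1
Content-Type: multipart/form-data; boundary=----WebKitFormBoundaryAbc123

------WebKitFormBoundaryAbc123
Content-Disposition: form-data; name="userId"

123
------WebKitFormBoundaryAbc123
Content-Disposition: form-data; name="file"; filename="photo.png"
Content-Type: image/png

<binary bytes of photo.png>
------WebKitFormBoundaryAbc123--
```

Each part starts with `--` plus the boundary; the final delimiter has a trailing `--`. This is why the declared boundary and the one in the body must match exactly.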
2) Client-side: simple file upload with FormData

HTML:

```html
<input id="fileInput" type="file" name="file" />
<button id="uploadBtn">Upload</button>
```

Using `fetch` (simple, no upload progress):
```js
const input = document.getElementById('fileInput');

document.getElementById('uploadBtn').addEventListener('click', async () => {
  const file = input.files[0];
  if (!file) return;

  const formData = new FormData();
  formData.append('file', file);    // field name 'file'
  formData.append('userId', '123'); // other form fields are fine too

  const res = await fetch('/upload', {
    method: 'POST',
    body: formData
  });
  if (!res.ok) throw new Error(`Upload failed: ${res.status}`);
  const result = await res.json();
  console.log(result);
});
```
Notes:

- `fetch` + `FormData` is the simplest path.
- `fetch` does not provide a built-in upload progress event (see below), so if you need a progress bar, use `XMLHttpRequest` or implement chunked uploads.
Using `XMLHttpRequest` (with upload progress):

```js
const file = input.files[0];
const formData = new FormData();
formData.append('file', file);

const xhr = new XMLHttpRequest();
xhr.open('POST', '/upload');

xhr.upload.onprogress = (e) => {
  if (e.lengthComputable) {
    const percent = Math.round((e.loaded / e.total) * 100);
    console.log('Upload progress:', percent + '%');
  }
};

xhr.onload = () => {
  if (xhr.status >= 200 && xhr.status < 300) {
    console.log('Upload successful:', JSON.parse(xhr.responseText));
  } else {
    console.error('Upload failed:', xhr.status, xhr.responseText);
  }
};

xhr.onerror = () => { console.error('Network error'); };

xhr.send(formData);
```
Key point: use `xhr.upload.onprogress` to track bytes uploaded.
3) Don't set Content-Type manually

When sending `FormData`, do not set `Content-Type` in `fetch` or `XMLHttpRequest`. Browsers will set `Content-Type: multipart/form-data; boundary=...` including the correct boundary. If you set it yourself you'll break the boundary and the server may be unable to parse the multipart body.
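You can observe this behavior outside the browser too. Node 18+ implements the same fetch specification, so constructing a `Request` with a `FormData` body fills in the multipart `Content-Type`, boundary included, automatically:

```js
// Node 18+: Request, FormData, and Blob are globals implementing the
// WHATWG fetch spec, so they behave like the browser versions.
const fd = new FormData();
fd.append('userId', '123');
fd.append('file', new Blob(['hello'], { type: 'text/plain' }), 'hello.txt');

const req = new Request('http://localhost/upload', { method: 'POST', body: fd });
console.log(req.headers.get('content-type'));
// e.g. "multipart/form-data; boundary=----formdata-..."
```

If you had passed your own `Content-Type` in `headers`, it would override this generated value, and the boundary would no longer match the body.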
4) Multiple files, file metadata, and previews

- Use `<input type="file" multiple>` to allow multiple files.
- Append multiple files to `FormData` with the same field name or different names:
  `for (const file of input.files) formData.append('photos[]', file);`
- Preview images before upload with `FileReader.readAsDataURL(file)`.
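`readAsDataURL` produces a `data:` URL: the file's MIME type plus its bytes encoded in base64. A small Node sketch of the same encoding (the `toDataURL` helper is hypothetical, for illustration only):

```js
// Builds the same kind of string FileReader.readAsDataURL produces:
// "data:<mime>;base64,<base64-encoded bytes>".
function toDataURL(buffer, mimeType) {
  return `data:${mimeType};base64,${Buffer.from(buffer).toString('base64')}`;
}

console.log(toDataURL(Buffer.from('hi'), 'text/plain'));
// → "data:text/plain;base64,aGk="
```

In the browser you would simply assign the result of `readAsDataURL` to an `<img>` element's `src` to show the thumbnail.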
5) Progress, cancellation, and aborting

- `XMLHttpRequest` supports `xhr.upload.onprogress` (upload) and `xhr.onprogress` (download).
- With `fetch`, you can abort an ongoing request using `AbortController`, but you cannot measure upload progress natively. Example:

```js
const controller = new AbortController();
fetch('/upload', { method: 'POST', body: formData, signal: controller.signal });
// to cancel:
controller.abort();
```

If you need upload progress reporting with `fetch`, implement chunked uploads where you upload file parts manually and track progress as you send chunks (see the chunked/resumable section).
6) Chunked uploads & resumable uploads
Large files or flaky networks benefit from uploading in chunks (multipart chunking) and optionally supporting resume.
Simple chunked upload approach (client):

```js
async function uploadInChunks(file, chunkSize = 5 * 1024 * 1024) { // 5 MB chunks
  const totalChunks = Math.ceil(file.size / chunkSize);
  for (let i = 0; i < totalChunks; i++) {
    const start = i * chunkSize;
    const chunk = file.slice(start, start + chunkSize);

    const formData = new FormData();
    formData.append('chunk', chunk);
    formData.append('fileName', file.name);
    formData.append('index', String(i));
    formData.append('total', String(totalChunks));

    // You can use fetch or XHR here
    await fetch('/upload-chunk', { method: 'POST', body: formData });
    // update UI progress using (i + 1) / totalChunks
  }
  // tell the server to assemble the chunks (or the server auto-assembles)
}
```
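The slicing arithmetic above is easy to get wrong at the last chunk. One option is to compute the byte ranges up front with a pure helper (`chunkRanges` is a hypothetical name), which can be unit-tested in isolation and also makes it easy to retry or parallelize individual chunks later:

```js
// Computes the [start, end) byte ranges for splitting a file of `size`
// bytes into chunks of at most `chunkSize` bytes. The final chunk is
// clamped to the file size, matching File.slice() behavior.
function chunkRanges(size, chunkSize) {
  const ranges = [];
  for (let start = 0; start < size; start += chunkSize) {
    ranges.push({ index: ranges.length, start, end: Math.min(start + chunkSize, size) });
  }
  return ranges;
}

console.log(chunkRanges(12, 5));
// → [ { index: 0, start: 0, end: 5 },
//     { index: 1, start: 5, end: 10 },
//     { index: 2, start: 10, end: 12 } ]
```

Inside the upload loop you would then call `file.slice(r.start, r.end)` for each range `r`.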
Server-side for chunked uploads:

- Accept chunks with metadata (file identifier, index, total).
- Store chunks temporarily (e.g., `uploads/tmp/<fileId>/<index>`).
- When all chunks have been received, concatenate them in the correct order into the final file.
- Optionally verify a checksum (MD5/SHA) for integrity.
- Clean up the temp chunks.
Use existing protocols/libraries for production:

- tus (tus.io): a resumable upload protocol with client/server libraries.
- resumable.js, Fine Uploader, and other libraries provide resume, chunking, retries, concurrency, and checksums out of the box.
7) Direct/cloud uploads (presigned URLs)

To offload file handling from your app server, use cloud storage direct uploads.

Flow (S3-style):

- Client requests a presigned URL from your server (the server authenticates the user and asks the storage service for a presigned PUT or POST).
- Server returns the presigned URL and any required form fields.
- Client uploads directly to storage (using a `fetch` PUT or a `FormData` POST); the server never handles the file bytes.

Example (PUT):

```js
// obtained from server: presignedUrl
await fetch(presignedUrl, {
  method: 'PUT',
  body: file,
  headers: { 'Content-Type': file.type }
});
```
Advantages:

- Reduces load on the application server.
- Scales well for large files.

Caveat: with `PUT` presigned URLs you set `Content-Type` yourself; with `POST` presigned forms you must include the provided fields and use `FormData`.
8) Server-side handling examples

Node + Express + multer (basic):

```js
const express = require('express');
const multer = require('multer');

// `dest` stores files on disk under random names; configure `storage`
// for custom naming, and `limits` to enforce a server-side size cap.
const upload = multer({
  dest: 'uploads/',
  limits: { fileSize: 10 * 1024 * 1024 } // 10 MB
});

const app = express();

app.post('/upload', upload.single('file'), (req, res) => {
  console.log(req.file); // file metadata
  console.log(req.body); // other form fields
  res.json({ success: true, filename: req.file.filename });
});
```
Simple chunk assembly (pseudo-Node):

```js
const fs = require('fs');
const path = require('path');

// When receiving a chunk: move it from the upload temp path into the
// chunk directory under a deterministic name.
fs.renameSync(tmpPath, path.join(tmpDir, `chunk-${index}`));

// When all chunks are present: append them in order, then clean up.
// (Synchronous appends sidestep write-stream backpressure in this
// sketch; use streams and stream.pipeline() for very large files.)
for (let i = 0; i < totalChunks; i++) {
  const chunkPath = path.join(tmpDir, `chunk-${i}`);
  fs.appendFileSync(finalPath, fs.readFileSync(chunkPath));
  fs.unlinkSync(chunkPath);
}
```
9) Security & validation (very important)

- Authenticate uploads (only authorized users can upload).
- Validate the file type on the server (don't trust the MIME type or file extension alone); use content sniffing (magic numbers).
- Limit file size both in the client-side UI and with strict server-side limits.
- Sanitize filenames (remove path characters); store files under randomized names or GUIDs.
- Store files outside the web root or serve them via generated URLs; prevent arbitrary execution.
- Rate-limit and throttle uploads to prevent abuse.
- Scan for malware if appropriate (virus scanner or cloud scanning).
- Use HTTPS to protect file contents in transit.
- CORS: ensure cross-origin uploads are allowed only from trusted origins and with correct credentials settings.
- Avoid storing sensitive files without encryption at rest if required by policy.
10) UX considerations

- Show clear file size limits and allowed file types in the UI.
- Preview images before upload and show thumbnails.
- Show progress bars and allow pause/cancel.
- Retry transient failures automatically with backoff, but limit retries.
- Upload multiple files with controlled concurrency (e.g., 3 at a time) to avoid saturating bandwidth or the server.
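The last bullet can be implemented with a small scheduler. In this sketch, `uploadFile` is a stand-in for whatever per-file upload you use (fetch, XHR, or chunked); only `runWithConcurrency` is shown:

```js
// Runs the given array of async task functions with at most `limit`
// in flight at once. Each worker pulls the next unclaimed index until
// the queue is exhausted; results keep their original order.
async function runWithConcurrency(tasks, limit = 3) {
  const results = new Array(tasks.length);
  let next = 0;
  async function worker() {
    while (next < tasks.length) {
      const i = next++; // safe: no await between check and claim
      results[i] = await tasks[i]();
    }
  }
  await Promise.all(
    Array.from({ length: Math.min(limit, tasks.length) }, worker)
  );
  return results;
}

// Usage sketch: three uploads at a time.
// await runWithConcurrency(files.map(f => () => uploadFile(f)), 3);
```

Wrapping each task in a retry-with-backoff function before passing it in covers the previous bullet as well.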
11) Debugging tips

- In the browser devtools Network tab, inspect the request headers and body; ensure `Content-Type` is `multipart/form-data` and the boundary is present.
- If the server reports a missing file, check that the field name used in `FormData.append('file', ...)` matches the server-side expectation.
- For large uploads, check server and proxy timeouts (Nginx, load balancers).
- If an upload fails due to CORS, ensure `Access-Control-Allow-Origin`, `Access-Control-Allow-Credentials`, and `Access-Control-Allow-Headers` are correct.
12) Example: combined full-stack flow (summary)

- Client picks file(s) and builds `FormData`.
- Client either:
  - POSTs to the app server (`fetch` or XHR), or
  - requests a presigned URL and PUTs directly to object storage.
- Server validates user permission, file size, and MIME type, and stores the file (on disk or in the cloud).
- Server responds with metadata (URL, file ID, size, checksum).
- Client shows success and uses the returned file URL for display or future use.
13) Quick best-practices checklist

- Use `FormData` for simple uploads.
- Use `XMLHttpRequest` if you need simple upload progress events.
- For large files, prefer chunked/resumable uploads (tus or libraries).
- For scalability, upload directly to cloud storage with presigned URLs.
- Always validate & sanitize on the server; set size limits; scan for malware if required.
- Use HTTPS and require authentication for uploads.