Build a File Upload API with Node.js and Cloud Storage | ZextOverse
File uploads seem simple until they aren't. This guide builds a production-ready upload API — with validation, security, and cloud storage — from the ground up.
Multer is the de facto middleware for handling multipart/form-data in Express. It parses incoming form data and makes files available on req.file (single) or req.files (multiple).
Multer has two storage strategies:

memoryStorage: files land in RAM as a Buffer. Best for cloud uploads and image processing.
diskStorage: files land on the local filesystem. Best for large files, video, and temporary files.
For cloud uploads, memoryStorage is the right choice — you get the file as a Buffer (req.file.buffer) and stream it directly to S3 or Cloudinary without writing to disk.
Note on limits.fileSize: Multer enforces this limit during parsing, before your route handler fires. This prevents the server from buffering a 4 GB file before rejecting it — crucial for preventing memory exhaustion attacks.
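A minimal Multer setup along these lines (the 5 MB cap and single-file limit are assumptions for this guide):

```typescript
import multer from 'multer'

// Keep uploads in RAM so they can be streamed straight to cloud storage
export const upload = multer({
  storage: multer.memoryStorage(),
  limits: {
    fileSize: 5 * 1024 * 1024, // enforced during parsing, before the handler runs
    files: 1,                  // one file per request
  },
})
```

Mount it per-route with upload.single('file') rather than globally, so only upload endpoints pay the multipart parsing cost.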
Validation: Don't Trust the Client
Multer's fileFilter checks the MIME type the client claims — but the client controls that header. A malicious actor can upload a PHP script with Content-Type: image/jpeg and it'll pass the filter.
Real validation means inspecting the file's magic bytes — the first few bytes of the binary content that identify its actual format, regardless of what the client says.
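To see what magic-byte detection looks like under the hood, here is a hand-rolled sniffer for two of the allowed formats. This is illustrative only; the file-type package below covers far more formats and corner cases:

```typescript
// Minimal magic-byte sniffing for JPEG and PNG (illustrative only)
const SIGNATURES: Array<{ mime: string; bytes: number[] }> = [
  { mime: 'image/jpeg', bytes: [0xff, 0xd8, 0xff] },
  { mime: 'image/png', bytes: [0x89, 0x50, 0x4e, 0x47, 0x0d, 0x0a, 0x1a, 0x0a] },
]

function sniffMime(buffer: Buffer): string | undefined {
  // Return the first signature whose leading bytes match the buffer
  return SIGNATURES.find(({ bytes }) =>
    bytes.every((b, i) => buffer[i] === b)
  )?.mime
}
```

Note that the client-supplied Content-Type header plays no part here: only the bytes decide.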
npm install file-type
// src/middleware/validate.ts
import { Request, Response, NextFunction } from 'express'
import { fileTypeFromBuffer } from 'file-type'
import sharp from 'sharp'

const ALLOWED_TYPES = new Set([
  'image/jpeg',
  'image/png',
  'image/webp',
  'image/gif',
])
const MAX_DIMENSIONS = { width: 8000, height: 8000 }
const MIN_DIMENSIONS = { width: 10, height: 10 }

export async function validateUpload(
  req: Request,
  res: Response,
  next: NextFunction
) {
  if (!req.file) {
    return res.status(400).json({ error: 'No file provided' })
  }

  // 1. Magic byte validation
  const detected = await fileTypeFromBuffer(req.file.buffer)
  if (!detected || !ALLOWED_TYPES.has(detected.mime)) {
    return res.status(422).json({
      error: 'Invalid file content. Only JPEG, PNG, WebP, and GIF are allowed.',
    })
  }

  // 2. Dimension validation — reject absurdly small or large images
  const { width = 0, height = 0 } = await sharp(req.file.buffer).metadata()
  if (
    width < MIN_DIMENSIONS.width ||
    height < MIN_DIMENSIONS.height ||
    width > MAX_DIMENSIONS.width ||
    height > MAX_DIMENSIONS.height
  ) {
    return res.status(422).json({
      error: `Image dimensions must be between ${MIN_DIMENSIONS.width}x${MIN_DIMENSIONS.height} and ${MAX_DIMENSIONS.width}x${MAX_DIMENSIONS.height}.`,
    })
  }

  // Overwrite the client-supplied MIME with the detected one
  req.file.mimetype = detected.mime
  next()
}
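Wiring the pieces into a route might look like this (the route path and the response shape are assumptions):

```typescript
// src/routes/upload.ts
import { Router } from 'express'
import multer from 'multer'
import { validateUpload } from '../middleware/validate'

const upload = multer({
  storage: multer.memoryStorage(),
  limits: { fileSize: 5 * 1024 * 1024 },
})

const router = Router()

// Multer parses the multipart body first, then validateUpload inspects the bytes
router.post('/upload', upload.single('file'), validateUpload, (req, res) => {
  res.json({ success: true, mimetype: req.file!.mimetype })
})

export default router
```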
Sanitizing the Filename
Original filenames are a security risk — they can contain path traversal sequences (../../etc/passwd), null bytes, or reserved OS characters. Never use them verbatim for storage keys.
import { randomUUID } from 'crypto'

function generateSafeFilename(mimeType: string): string {
  // Derive the extension from the verified MIME type, never from the client filename
  const ext = mimeType.split('/')[1].replace('jpeg', 'jpg')
  return `${randomUUID()}.${ext}`
}
Cloud Storage: Cloudinary
Cloudinary is an excellent choice for images — it handles format conversion, responsive resizing, and CDN delivery out of the box.
Before uploading, you can normalize images server-side using Sharp, one of the fastest image processing libraries available for Node.js.
// src/services/image.service.ts
import sharp from 'sharp'

interface ProcessOptions {
  maxWidth?: number
  maxHeight?: number
  quality?: number
  stripMetadata?: boolean
}

export async function processImage(
  buffer: Buffer,
  options: ProcessOptions = {}
): Promise<{ buffer: Buffer; metadata: sharp.Metadata }> {
  const {
    maxWidth = 2048,
    maxHeight = 2048,
    quality = 85,
    stripMetadata = true,
  } = options

  // Apply EXIF orientation now, since the orientation tag is stripped below
  let pipeline = sharp(buffer).rotate()

  // Sharp strips EXIF (location data, device info, etc.) by default on output;
  // call withMetadata() only when the caller explicitly wants to keep it
  if (!stripMetadata) {
    pipeline = pipeline.withMetadata()
  }

  // Resize if larger than max dimensions
  pipeline = pipeline.resize(maxWidth, maxHeight, {
    fit: 'inside', // preserve aspect ratio
    withoutEnlargement: true, // never upscale
  })

  // Re-encode with controlled quality (pass { animated: true } to sharp()
  // if you need to preserve animated GIFs)
  pipeline = pipeline.webp({ quality })

  const processed = await pipeline.toBuffer()
  const metadata = await sharp(processed).metadata()
  return { buffer: processed, metadata }
}
Stripping metadata is not optional for user-facing apps — photos taken on smartphones embed GPS coordinates, device model, and sometimes the owner's name into EXIF data. Without stripping, you're redistributing that private information to anyone who downloads the image.
Multer reports failures through its own MulterError class, with an err.code identifying the cause. Handle these cases explicitly:
// src/middleware/errorHandler.ts
import { Request, Response, NextFunction } from 'express'
import multer from 'multer'

export function uploadErrorHandler(
  err: Error,
  req: Request,
  res: Response,
  next: NextFunction
) {
  if (err instanceof multer.MulterError) {
    const messages: Record<string, string> = {
      LIMIT_FILE_SIZE: 'File is too large. Maximum size is 5 MB.',
      LIMIT_FILE_COUNT: 'Too many files. Only one file per request.',
      LIMIT_UNEXPECTED_FILE: 'Unexpected field name. Use "file" as the field name.',
    }
    // 413 only fits the size limit; other Multer errors are plain bad requests
    const status = err.code === 'LIMIT_FILE_SIZE' ? 413 : 400
    return res.status(status).json({
      success: false,
      error: messages[err.code] ?? 'Upload error.',
    })
  }
  if (err.message.startsWith('File type not allowed')) {
    return res.status(415).json({
      success: false,
      error: err.message,
    })
  }
  next(err)
}
Register it after your routes in index.ts:
// src/index.ts
import express from 'express'
import uploadRouter from './routes/upload'
import { uploadErrorHandler } from './middleware/errorHandler'

const app = express()

app.use('/api', uploadRouter)
app.use(uploadErrorHandler) // Must be last

app.listen(process.env.PORT ?? 3000, () => {
  console.log(`Server running on port ${process.env.PORT ?? 3000}`)
})
Rate Limiting and Abuse Prevention
An unprotected upload endpoint is an open invitation to abuse — storage costs money, and processing images consumes CPU. Add rate limiting as a baseline:
npm install express-rate-limit
import rateLimit from 'express-rate-limit'

export const uploadRateLimit = rateLimit({
  windowMs: 15 * 60 * 1000, // 15 minutes
  max: 20, // 20 uploads per window per IP
  standardHeaders: true,
  legacyHeaders: false,
  message: {
    success: false,
    error: 'Too many uploads from this IP. Please try again later.',
  },
})
For authenticated APIs, rate limit by user ID rather than IP — IP-based limits are easily bypassed with VPNs.
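With express-rate-limit that is a one-option change via keyGenerator. A sketch (the req.user shape is a hypothetical set by your auth middleware):

```typescript
import rateLimit from 'express-rate-limit'

export const perUserUploadLimit = rateLimit({
  windowMs: 15 * 60 * 1000,
  max: 20,
  // Key on the authenticated user when present, falling back to IP
  keyGenerator: (req) => (req as any).user?.id ?? req.ip ?? 'unknown',
})
```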
Testing the API
With curl
# Upload a JPEG
curl -X POST http://localhost:3000/api/upload \
  -F "file=@/path/to/photo.jpg" \
  -H "Accept: application/json"

# Try to upload a text file disguised as an image (should be rejected)
curl -X POST http://localhost:3000/api/upload \
  -F "file=@/path/to/script.php;type=image/jpeg"
With Fetch (client-side)
async function uploadImage(file: File) {
  const formData = new FormData()
  formData.append('file', file)

  const response = await fetch('/api/upload', {
    method: 'POST',
    body: formData,
    // Do NOT set Content-Type manually — the browser sets it with the boundary
  })

  if (!response.ok) {
    const { error } = await response.json()
    throw new Error(error)
  }
  return response.json()
}
Common mistake: Setting Content-Type: multipart/form-data manually in fetch. Don't. The browser must set it automatically so it can include the boundary parameter — without it, the server can't parse the body.
File too large (413)
{
"success": false,
"error": "File is too large. Maximum size is 5 MB."
}
Invalid content (422)
{
"success": false,
"error": "Invalid file content. Only JPEG, PNG, WebP, and GIF are allowed."
}
Security Checklist
Before deploying, verify each of these:
Magic byte validation — checking file content, not just extension or MIME header
Filename sanitized — using UUIDs, never the original filename as storage key
EXIF stripped — no GPS or device metadata leaking to end users
File size limit enforced — at Multer level, not just route handler
MIME type whitelist — explicit allow-list, not a blocklist
Rate limiting — per IP or per authenticated user
S3 bucket not public — files served via CloudFront or pre-signed URLs
Cloudinary upload preset restricted — not open to unsigned uploads
No stack traces in error responses — only generic messages to clients
Virus scanning for production — consider ClamAV or a cloud scanning API for high-risk platforms
Production Considerations
Streaming large files: memoryStorage loads the entire file into RAM. For files larger than ~10 MB, switch to diskStorage and stream from disk to S3 using the Upload class from @aws-sdk/lib-storage — it handles multipart uploads automatically.
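A sketch of that streaming path with the v3 SDK (the bucket name, region, and function name are placeholders):

```typescript
import { createReadStream } from 'fs'
import { S3Client } from '@aws-sdk/client-s3'
import { Upload } from '@aws-sdk/lib-storage'

const s3 = new S3Client({ region: 'us-east-1' })

export async function streamToS3(localPath: string, key: string) {
  const upload = new Upload({
    client: s3,
    params: {
      Bucket: 'my-upload-bucket', // placeholder
      Key: key,
      Body: createReadStream(localPath), // streamed from disk, not buffered in RAM
    },
  })
  // lib-storage splits large bodies into multipart uploads automatically
  return upload.done()
}
```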
Multiple upload variants: Generate thumbnails on upload rather than on every request. Use Sharp to create multiple sizes, upload all variants in parallel with Promise.all, and store the key set in your database.
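A sketch of that variant step (the variant names and widths are assumptions):

```typescript
import sharp from 'sharp'

const VARIANTS = [
  { name: 'thumb', width: 200 },
  { name: 'medium', width: 800 },
  { name: 'full', width: 2048 },
]

// Produce all sizes in parallel from the already-validated buffer
export async function makeVariants(buffer: Buffer) {
  return Promise.all(
    VARIANTS.map(async ({ name, width }) => ({
      name,
      buffer: await sharp(buffer)
        .resize({ width, withoutEnlargement: true })
        .webp({ quality: 85 })
        .toBuffer(),
    }))
  )
}
```

Store the resulting key set (e.g. thumb/medium/full) in your database so requests never trigger on-the-fly resizing.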
Virus scanning: For platforms where untrusted users upload arbitrary files, integrate a scanning step between processing and cloud upload. AWS offers Amazon Macie for S3; Cloudinary has third-party add-ons; or run ClamAV in a sidecar container.
Pre-signed URLs: For private S3 objects, never expose the s3.amazonaws.com URL to clients. Generate short-lived pre-signed URLs server-side on demand:
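A sketch with the v3 SDK (the bucket name and region are placeholders):

```typescript
import { S3Client, GetObjectCommand } from '@aws-sdk/client-s3'
import { getSignedUrl } from '@aws-sdk/s3-request-presigner'

const s3 = new S3Client({ region: 'us-east-1' })

export async function getDownloadUrl(key: string): Promise<string> {
  const command = new GetObjectCommand({
    Bucket: 'my-upload-bucket', // placeholder
    Key: key,
  })
  // The signed URL expires after 5 minutes; the object itself stays private
  return getSignedUrl(s3, command, { expiresIn: 300 })
}
```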