How to Build a Backend with Next.js Route Handlers (No Express)
You don't need Express, Fastify, or a separate API server anymore. Next.js 13+ ships with everything you need to build a production-grade backend — right inside your frontend project.
The traditional split, a frontend here and an API server there, carries real costs: two repos (or a monorepo with complex tooling), two deployment targets, CORS headers everywhere, duplicated environment variables, and constant context-switching between two different mental models of routing.
Next.js Route Handlers collapse this into a single project. Your API lives in the same codebase as your UI, shares the same TypeScript types, the same environment variables, and deploys as one unit — to Vercel, a Docker container, or any Node.js host.
This isn't a toy feature. Route Handlers support full HTTP semantics, streaming responses, middleware-style logic, authentication, database connections, and everything you'd expect from a real backend. They're just... built in.
What Are Route Handlers?
Introduced in Next.js 13 with the App Router, Route Handlers are files named route.ts (or route.js) placed inside the app/ directory. They export named async functions corresponding to HTTP methods: GET, POST, PUT, PATCH, DELETE, HEAD, and OPTIONS.
No app.listen(). No res.json(). No Express middleware stack to configure. Just a function that returns a Response.
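For instance, a complete, deployable endpoint can be a single file exporting a single function. The path below is a hypothetical example, and it uses the standard `Response` class, so it needs no Next.js-specific imports at all:

```typescript
// app/api/health/route.ts — a complete endpoint (hypothetical route)
export async function GET(): Promise<Response> {
  // Response.json is the standard Fetch API helper (Node 18+)
  return Response.json({ status: 'ok' })
}
```

A GET request to `/api/health` now returns `{"status":"ok"}` with a 200 status. No server bootstrap, no router registration.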
The Request and Response Model
Route Handlers are built on the Web Fetch API — the same Request and Response primitives available in browsers, Cloudflare Workers, and Deno. Next.js extends these with NextRequest and NextResponse for convenience, but you can use the standard classes directly if you prefer.
Route Handlers support the same dynamic segment syntax as page routes. The second argument to any handler function is a context object containing resolved params.
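As a sketch (hypothetical path), a handler for a dynamic segment looks like this. Note that in Next.js 13/14 `params` is a plain object as shown below; Next.js 15 changed it to a Promise you must await:

```typescript
// app/api/users/[id]/route.ts — hypothetical dynamic route
// The [id] folder name becomes params.id at request time.
// (In Next.js 15, params is a Promise and must be awaited.)
export async function GET(
  _request: Request,
  { params }: { params: { id: string } }
): Promise<Response> {
  return Response.json({ userId: params.id })
}
```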
Route Handlers run in a Node.js environment (by default), so you can use any database client you'd use in Express: Prisma, Drizzle, Mongoose, pg, mysql2, or any ORM.
With Prisma
```typescript
// lib/db.ts — singleton pattern to avoid connection exhaustion
import { PrismaClient } from '@prisma/client'

const globalForPrisma = globalThis as unknown as {
  prisma: PrismaClient | undefined
}

export const db =
  globalForPrisma.prisma ??
  new PrismaClient({
    log: process.env.NODE_ENV === 'development' ? ['query'] : [],
  })

if (process.env.NODE_ENV !== 'production') globalForPrisma.prisma = db
```
Never trust user input. Pair your route handlers with Zod for runtime validation and automatic TypeScript inference.
```typescript
// app/api/users/route.ts
import { z } from 'zod'
import { NextRequest, NextResponse } from 'next/server'
import { db } from '@/lib/db'

const CreateUserSchema = z.object({
  email: z.string().email(),
  name: z.string().min(2).max(100),
  role: z.enum(['admin', 'editor', 'viewer']).default('viewer'),
})

export async function POST(request: NextRequest) {
  const body = await request.json()
  const parsed = CreateUserSchema.safeParse(body)

  if (!parsed.success) {
    return NextResponse.json(
      {
        error: 'Validation failed',
        details: parsed.error.flatten().fieldErrors,
      },
      { status: 422 }
    )
  }

  // parsed.data is fully typed: { email: string, name: string, role: 'admin' | 'editor' | 'viewer' }
  const user = await db.user.create({ data: parsed.data })
  return NextResponse.json(user, { status: 201 })
}
```
A failed validation returns a structured error response:
```json
{
  "error": "Validation failed",
  "details": {
    "email": ["Invalid email"],
    "name": ["String must contain at least 2 character(s)"]
  }
}
```
Streaming Responses
Route Handlers support streaming via the Web Streams API — particularly useful for AI-generated content, large data exports, or real-time progress updates.
Streaming Text (e.g., LLM output)
```typescript
// app/api/chat/route.ts
import OpenAI from 'openai'
import { NextRequest } from 'next/server'

const openai = new OpenAI() // reads OPENAI_API_KEY from the environment

export async function POST(request: NextRequest) {
  const { messages } = await request.json()

  const stream = new ReadableStream({
    async start(controller) {
      const encoder = new TextEncoder()

      // Example: streaming from an AI SDK
      const completion = await openai.chat.completions.create({
        model: 'gpt-4o',
        messages,
        stream: true,
      })

      for await (const chunk of completion) {
        const text = chunk.choices[0]?.delta?.content ?? ''
        controller.enqueue(encoder.encode(text))
      }
      controller.close()
    },
  })

  // No need to set Transfer-Encoding manually; the server chunks automatically.
  return new Response(stream, {
    headers: { 'Content-Type': 'text/plain; charset=utf-8' },
  })
}
```
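On the client, the stream can be consumed incrementally with the standard reader API. Here is a minimal helper (the function name is a hypothetical choice, not from the article) that drains any streamed `Response` into a string; in a real UI you would append each decoded chunk to state instead of concatenating:

```typescript
// Drain a streamed Response body chunk by chunk.
export async function drainStream(res: Response): Promise<string> {
  if (!res.body) return ''
  const reader = res.body.getReader()
  const decoder = new TextDecoder()
  let text = ''
  while (true) {
    const { done, value } = await reader.read()
    if (done) break
    // stream: true keeps multi-byte characters intact across chunk boundaries
    text += decoder.decode(value, { stream: true })
  }
  return text + decoder.decode() // flush any buffered bytes
}
```

In the browser you would call it as `drainStream(await fetch('/api/chat', { method: 'POST', ... }))`.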
Server-Sent Events (SSE)
```typescript
// app/api/events/route.ts
export async function GET() {
  const stream = new ReadableStream({
    start(controller) {
      const encoder = new TextEncoder()
      const send = (data: object) => {
        controller.enqueue(
          encoder.encode(`data: ${JSON.stringify(data)}\n\n`)
        )
      }

      // Emit events over time
      let count = 0
      const interval = setInterval(() => {
        send({ count: ++count, time: Date.now() })
        if (count >= 10) {
          clearInterval(interval)
          controller.close()
        }
      }, 1000)
    },
  })

  return new Response(stream, {
    headers: {
      'Content-Type': 'text/event-stream',
      'Cache-Control': 'no-cache',
      Connection: 'keep-alive',
    },
  })
}
```
CORS and Custom Headers
Set response headers directly on the NextResponse object, or use a utility for DRY cross-origin configuration.
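The route below imports `corsHeaders` and `corsResponse` from `@/lib/cors`, a file this article never shows. One plausible implementation, assuming plain-object headers:

```typescript
// lib/cors.ts — one possible implementation of the helpers used below
export function corsHeaders(origin = '*'): Record<string, string> {
  return {
    'Access-Control-Allow-Origin': origin,
    'Access-Control-Allow-Methods': 'GET, POST, PUT, PATCH, DELETE, OPTIONS',
    'Access-Control-Allow-Headers': 'Content-Type, Authorization',
  }
}

export function corsResponse(data: unknown, init: ResponseInit = {}): Response {
  // Merge CORS headers with any caller-supplied headers
  // (assumes init.headers is a plain object, not a Headers instance)
  return Response.json(data, {
    ...init,
    headers: { ...corsHeaders(), ...(init.headers as Record<string, string>) },
  })
}
```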
Handle the preflight OPTIONS request in the same route file:
```typescript
// app/api/data/route.ts
import { corsHeaders, corsResponse } from '@/lib/cors'

export async function OPTIONS() {
  return new Response(null, { status: 204, headers: corsHeaders() })
}

export async function GET() {
  return corsResponse({ data: 'hello' })
}
```
Rate Limiting
Without a framework plugin, rate limiting is a manual concern. The good news: it's a clean, composable function.
```typescript
// lib/rate-limit.ts
import { NextRequest, NextResponse } from 'next/server'

const requestCounts = new Map<string, { count: number; resetAt: number }>()

export function rateLimit(
  request: NextRequest,
  options = { limit: 60, windowMs: 60_000 }
) {
  // request.ip is populated on Vercel; fall back to the proxy header elsewhere
  const ip = request.ip ?? request.headers.get('x-forwarded-for') ?? 'unknown'
  const now = Date.now()
  const record = requestCounts.get(ip)

  if (!record || now > record.resetAt) {
    requestCounts.set(ip, { count: 1, resetAt: now + options.windowMs })
    return null // allowed
  }

  if (record.count >= options.limit) {
    return NextResponse.json(
      { error: 'Too many requests' },
      {
        status: 429,
        headers: {
          'Retry-After': String(Math.ceil((record.resetAt - now) / 1000)),
        },
      }
    )
  }

  record.count++
  return null // allowed
}
```
```typescript
// app/api/contact/route.ts
import { NextRequest, NextResponse } from 'next/server'
import { rateLimit } from '@/lib/rate-limit'

export async function POST(request: NextRequest) {
  const limited = rateLimit(request, { limit: 5, windowMs: 60_000 })
  if (limited) return limited

  // handle form submission...
  return NextResponse.json({ received: true }, { status: 202 })
}
```
Note: The in-memory Map approach works for single-instance deployments. For multi-instance or edge deployments, use a distributed store like Upstash Redis with their @upstash/ratelimit library.
Error Handling
Build a consistent error response utility to avoid ad-hoc error formats across your handlers:
```typescript
// lib/api-error.ts
import { NextResponse } from 'next/server'

export class ApiError extends Error {
  constructor(
    public message: string,
    public status: number = 500,
    public code?: string
  ) {
    super(message)
  }
}

export function handleError(error: unknown) {
  console.error(error)

  if (error instanceof ApiError) {
    return NextResponse.json(
      { error: error.message, code: error.code },
      { status: error.status }
    )
  }

  return NextResponse.json(
    { error: 'Internal server error' },
    { status: 500 }
  )
}
```
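In a handler, the pattern is: throw `ApiError` for expected failures, and funnel everything through `handleError`. The sketch below is self-contained, so the class and helper are redeclared inline with the standard `Response` class (logging omitted); in the project you would import them from `@/lib/api-error`. The route path and lookup data are hypothetical:

```typescript
// Hypothetical route demonstrating the throw/catch pattern.
class ApiError extends Error {
  constructor(message: string, public status = 500, public code?: string) {
    super(message)
  }
}

function handleError(error: unknown): Response {
  if (error instanceof ApiError) {
    return Response.json(
      { error: error.message, code: error.code },
      { status: error.status }
    )
  }
  return Response.json({ error: 'Internal server error' }, { status: 500 })
}

// Stand-in for a database lookup
const posts: Record<string, { id: string; title: string }> = {
  '1': { id: '1', title: 'Hello' },
}

// app/api/posts/[id]/route.ts (hypothetical path)
export async function GET(
  _request: Request,
  { params }: { params: { id: string } }
): Promise<Response> {
  try {
    const post = posts[params.id]
    if (!post) throw new ApiError('Post not found', 404, 'POST_NOT_FOUND')
    return Response.json(post)
  } catch (error) {
    return handleError(error)
  }
}
```

Every handler gets consistent error shapes with a single try/catch, and unexpected exceptions never leak stack traces to the client.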
Node.js vs. Edge Runtime

By default, Route Handlers run in a Node.js environment. You can opt into the Edge Runtime for lower latency and global distribution — but with constraints (no Node.js APIs, limited npm packages).
```typescript
// app/api/geo/route.ts
import { NextRequest, NextResponse } from 'next/server'

export const runtime = 'edge' // ← opt in

export async function GET(request: NextRequest) {
  // request.geo is populated when deployed on Vercel
  const country = request.geo?.country ?? 'unknown'
  const city = request.geo?.city ?? 'unknown'
  return NextResponse.json({ country, city })
}
```
| Feature | Node.js Runtime | Edge Runtime |
| --- | --- | --- |
| Cold start | ~100–500ms | ~0–50ms |
| Node.js APIs | ✅ Full access | ❌ Not available |
| Database clients | ✅ Prisma, pg, etc. | ⚠️ HTTP-only (Neon, PlanetScale) |
| Max execution time | ~60s (Vercel) | ~5–30s |
| Global deployment | ❌ Single region | ✅ 300+ locations |
For most CRUD APIs, stay on Node.js. Use Edge for geolocation, A/B testing, auth token verification, and lightweight middleware.
Project Structure: A Complete Example
Here's how a production-grade API section of a Next.js project might be organized:
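The original layout isn't reproduced here, so the tree below reconstructs one from the files this article has already built:

```text
app/
  api/
    users/
      route.ts        // list + create, with Zod validation
    chat/
      route.ts        // streaming LLM output
    events/
      route.ts        // Server-Sent Events
    data/
      route.ts        // CORS-enabled endpoint
    contact/
      route.ts        // rate-limited form endpoint
    geo/
      route.ts        // Edge runtime geolocation
lib/
  db.ts               // Prisma singleton
  cors.ts             // CORS helpers
  rate-limit.ts       // in-memory rate limiter
  api-error.ts        // ApiError + handleError
```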
Limitations to Keep in Mind

No built-in WebSocket support: For bidirectional real-time communication, you'll need a separate WebSocket server (or use services like Pusher, Ably, or Supabase Realtime). SSE covers many "real-time" use cases one-directionally.
No shared middleware with fine-grained control: Next.js middleware.ts runs on the Edge and handles routing-level concerns. For handler-level middleware, you compose higher-order functions manually (as shown above).
No automatic OpenAPI generation: Unlike tRPC, Hono, or typed-route libraries, vanilla Route Handlers don't auto-generate API docs. Consider zod-to-openapi or next-swagger-doc for documentation.
Not ideal for complex background jobs: For queue-based processing, cron jobs, or long-running tasks, combine Route Handlers with Vercel Cron, Trigger.dev, or Inngest.
Route Handlers vs. tRPC vs. Server Actions
You're not locked into Route Handlers. Here's how the options compare for full-stack Next.js:
| | Route Handlers | tRPC | Server Actions |
| --- | --- | --- | --- |
| Protocol | HTTP REST | HTTP (RPC-style) | HTTP POST (form-like) |
| Type safety | Manual | ✅ End-to-end | ✅ End-to-end |
| External API | ✅ Easy | ❌ Client-coupling | ❌ Not suitable |
| Learning curve | Low | Medium | Low |
| Best for | Public APIs, webhooks | Internal app APIs | Form mutations, simple actions |
The pragmatic answer: use Server Actions for simple form mutations, tRPC for internal type-safe APIs, and Route Handlers when you need a real HTTP API — for mobile clients, third-party integrations, or webhooks.
Conclusion
Next.js Route Handlers don't just replace Express — they represent a different way of thinking about backend code. Instead of a separate server with its own lifecycle, your API becomes a collection of pure, stateless functions that co-locate with your UI, share your type system, and deploy as part of a single artifact.
The primitives are simple: a file, a function, a Request, a Response. The composability comes from you — higher-order functions for auth, utilities for validation, shared modules for database access. No magic, no framework lock-in.
You'll find that for most applications — CRUD APIs, authentication flows, webhooks, file uploads, and even streaming — Route Handlers are more than enough. And when they're not, they're easy to replace or augment with the right specialized tool for the job.
Start simple. Stay close to the platform. Ship faster.