Fastify: Express Yourself, But Make It Lightning Fast

A senior architect at a growing SaaS company was staring at their monitoring dashboard on a Tuesday morning, watching response times creep upward as traffic increased. Their Express.js API, which had served them faithfully for three years, was starting to show its age. The team had already optimized database queries, added caching, and fine-tuned their infrastructure. Yet here they were, contemplating expensive horizontal scaling just to handle what should be routine traffic. "There has to be a better way," they muttered, reaching for their coffee. That better way, it turns out, might have been sitting in their tech radar all along.
The Node.js ecosystem has long been dominated by Express.js, and for good reason. It's battle-tested, well-documented, and has a massive ecosystem. But as applications scale and performance becomes critical, architects are increasingly asking whether "good enough" is actually good enough. When every millisecond counts and infrastructure costs are mounting, the choice of web framework stops being academic and starts affecting the bottom line. The real question is whether you're leaving performance (and money) on the table by not exploring alternatives.
The Need for Speed
Think of web frameworks like cars. Express.js is your reliable Honda Civic: it gets you where you need to go, parts are everywhere, and every mechanic knows how to work on it. Fastify, on the other hand, is more like a well-engineered sports car that somehow gets better gas mileage. It's built from the ground up with performance in mind, but without sacrificing the developer experience or forcing you into opinionated architectural decisions.
The magic lies in Fastify's architecture. While Express processes requests through a linear middleware stack (which can become a performance bottleneck), Fastify uses a highly optimized router called find-my-way, based on a radix tree, and compiles validation and serialization schemas ahead of time using Ajv JIT compilation. This means less work per request and faster routing, sometimes dramatically so.
But here's where it gets interesting: Fastify goes well beyond raw speed. Smart defaults and modern JavaScript patterns are central to its design. It embraces JSON Schema for validation and serialization, provides built-in support for async/await (no callback hell here), and offers a plugin architecture that actually makes sense. You get the performance benefits without having to reinvent the wheel or abandon the patterns you know.
The Six Pillars of Fastify
Before diving into code, it's worth understanding why Fastify has become the framework of choice for performance-critical Node.js applications. The Fastify team summarizes its design philosophy in six core features.
Highly Performant
Fastify is one of the fastest web frameworks available for Node.js. The Fastify team's own guidance is that, depending on code complexity, Fastify can serve up to 30,000 requests per second, while synthetic hello-world benchmarks regularly show 70,000 to 80,000 requests per second under controlled conditions. In one head-to-head test on Node.js 20 LTS, Fastify achieved 76,835 requests per second versus Express at 38,510. That's roughly a 2x improvement on average workloads, with some scenarios showing a 5x advantage. This comes from a combination of its radix-tree router (about 3x faster than Express routing), schema-based serialization, and avoiding monkey-patching of Node.js core HTTP modules.
Extensible
Fastify is fully extensible via its hooks, plugins, and decorators. Plugins allow you to encapsulate functionality into isolated, reusable units. Decorators let you extend the Fastify instance, request, and reply objects. And hooks give you lifecycle callbacks at every stage of request processing, from onRequest through onSend to onResponse. The hook system supports both async/await and callback styles, and newer releases added the onListen hook for running code exactly when the server starts accepting connections.
Schema-Based
Even if it is not mandatory, Fastify recommends using JSON Schema to validate your routes and serialize your outputs. Internally, Fastify compiles the schema into a highly performant function using Ajv's JIT compilation, so validation adds negligible overhead. The integration with TypeBox (via the official @fastify/type-provider-typebox package) makes this even better: TypeBox produces standard JSON Schema while giving you TypeScript type inference automatically, and it slots cleanly into OpenAPI documentation generation.
Logging
Logs are extremely important but are costly. Fastify chose Pino as its logger to almost eliminate this cost. Pino is consistently one of the fastest Node.js loggers, using async writes and a low-overhead serialization format. You get structured JSON logging out of the box, with virtually no performance penalty compared to running without any logging at all.
Developer Friendly
The framework is built to be expressive and help developers in their daily use, without sacrificing performance and security. Async/await is the default everywhere. Error handling is automatic: any uncaught exception in a route handler triggers a clean 500 response with logging. The schema system doubles as documentation. And Fastify v5 cleaned up years of accumulated API debt by removing deprecated options and requiring full JSON schemas, resulting in a more consistent and predictable surface area.
TypeScript Ready
Fastify is written in TypeScript and ships first-class type declarations. The growing TypeScript community is a priority for the maintainers. Combined with TypeBox for schema definitions, you get end-to-end type safety: the same TypeBox schema that validates your request body at runtime also infers the TypeScript type that your handler receives.
Breaking Down the Fastify Advantage
The six pillars describe what Fastify is designed to be. This section shows what that looks like in practice, with concrete examples across the areas where the difference is most tangible.
Performance That Actually Matters
Let's talk numbers. Fastify's benchmarks consistently show it handling significantly more requests per second than Express.js. But raw throughput isn't the whole story. What matters more for most applications is:
- Lower latency at the 99th percentile: When your slowest requests are faster, user experience improves across the board
- Better resource utilization: Handle more traffic with the same infrastructure, or the same traffic with less
- Reduced garbage collection pressure: Fastify's design minimizes object creation, leading to more predictable performance under load
Here's a real-world example. A team migrating from Express to Fastify for their API gateway saw their P99 response times drop from 450ms to 180ms, without changing any business logic. That's the kind of improvement that makes product managers smile and reduces AWS bills.
Schema-Driven Development Done Right
Fastify embraces JSON Schema for both validation and serialization. This might sound like extra work, but it's actually a game-changer. Consider this validation example:
```typescript
import Fastify from 'fastify';

const fastify = Fastify({ logger: true });

// Define a schema for user creation
const createUserSchema = {
  body: {
    type: 'object',
    required: ['email', 'username'],
    properties: {
      email: { type: 'string', format: 'email' },
      username: { type: 'string', minLength: 3, maxLength: 30 },
      age: { type: 'integer', minimum: 18 }
    }
  },
  response: {
    201: {
      type: 'object',
      properties: {
        id: { type: 'string' },
        email: { type: 'string' },
        username: { type: 'string' },
        createdAt: { type: 'string', format: 'date-time' }
      }
    }
  }
} as const;

fastify.post('/users', { schema: createUserSchema }, async (request, reply) => {
  // By the time we're here, the body is already validated
  const { email, username, age } = request.body;
  // Your business logic here
  const user = await createUser({ email, username, age });
  // Response serialization is automatic based on the schema
  reply.code(201).send(user);
});
```
Notice what's happening here. The schema serves multiple purposes:
- Documentation: Your API contract is explicit and machine-readable
- Validation: Invalid requests are rejected before your handler runs
- Serialization: Responses are automatically serialized (and extra properties are stripped for security)
- Performance: Schemas are compiled once at startup using Ajv JIT, making runtime validation incredibly fast
Compare this to Express, where you'd typically use a library like Joi or express-validator, adding middleware overhead and requiring separate serialization logic. Fastify bakes this into the framework with negligible runtime cost.
If you want to take schema-driven development further, the @fastify/type-provider-typebox package lets you define schemas with TypeBox and get TypeScript types automatically inferred:
```typescript
import Fastify from 'fastify';
import { TypeBoxTypeProvider } from '@fastify/type-provider-typebox';
import { Type } from '@sinclair/typebox';

const fastify = Fastify({ logger: true }).withTypeProvider<TypeBoxTypeProvider>();

const CreateUserBody = Type.Object({
  email: Type.String({ format: 'email' }),
  username: Type.String({ minLength: 3, maxLength: 30 }),
  age: Type.Optional(Type.Integer({ minimum: 18 }))
});

fastify.post('/users', { schema: { body: CreateUserBody } }, async (request) => {
  // request.body is fully typed here, no extra type assertions needed
  const { email, username } = request.body;
  return createUser({ email, username });
});
```
The Plugin System That Doesn't Suck
Fastify's plugin architecture is based on encapsulation. Each plugin gets its own context, preventing the namespace pollution that plagues many Express applications. Here's what that looks like:
```javascript
// database-plugin.js
const fp = require('fastify-plugin');
const { MongoClient } = require('mongodb');

async function databasePlugin(fastify, options) {
  const client = new MongoClient(options.uri);
  await client.connect();

  // Decorate the fastify instance with a database connection
  fastify.decorate('mongo', client.db(options.database));

  // Proper cleanup on shutdown
  fastify.addHook('onClose', async (instance) => {
    await client.close();
  });
}

module.exports = fp(databasePlugin);
```

```javascript
// app.js
const fastify = require('fastify')({ logger: true });

// Register the plugin
fastify.register(require('./database-plugin'), {
  uri: process.env.MONGODB_URI,
  database: 'myapp'
});

// Now all routes can access fastify.mongo
fastify.get('/users/:id', async (request, reply) => {
  const user = await request.server.mongo
    .collection('users')
    .findOne({ _id: request.params.id });
  return user;
});
```
The beauty here is encapsulation and lifecycle management. Plugins can have their own dependencies, decorators, and hooks. When the server shuts down, cleanup happens automatically in the correct order. No more hunting for connection leaks or race conditions during shutdown.
With 296+ plugins available in the ecosystem, there is coverage for almost every common need: authentication, CORS, rate limiting, database connections, circuit breakers, Swagger/OpenAPI generation, and more.
Modern JavaScript Without the Ceremony
Fastify was built for async/await from day one. No callback hell, no next() confusion, no wondering whether you remembered to call next() or send a response. Your route handlers are just async functions:
```javascript
fastify.get('/complex-operation', async (request, reply) => {
  // Errors automatically trigger 500 responses
  const data = await fetchFromDatabase();
  const processed = await processData(data);
  const enriched = await enrichWithExternalAPI(processed);
  // Just return the data - serialization happens automatically
  return enriched;
});
```
If an error is thrown anywhere in that chain, Fastify catches it, logs it via Pino, and returns an appropriate error response. No try/catch boilerplate unless you need custom error handling.
Fastify v5: What Changed
Fastify v5 was released in late 2024 and requires Node.js v20 or later (Node.js v18 exited LTS in April 2025). The update includes roughly 20 breaking changes, most of them cleanup items that had been accumulating as deprecation warnings in v4. Key things to know if you're upgrading:
- Full JSON schemas are required: The `jsonShorthand` option is removed. Route definitions for `querystring`, `params`, and `body` now require complete JSON Schema objects.
- Logger behavior changed: The `logger` option no longer accepts a custom logger instance directly. Use `loggerInstance` instead, or configure Pino options through `logger`.
- `hasRoute()` behavior: It now only matches exact registered route strings, not arbitrary request paths.
- All v4 deprecations removed: If your v4 code produced deprecation warnings, address them before upgrading.
The migration guide on the official Fastify docs covers each breaking change with before/after examples.
Making the Switch
A full rewrite is rarely the right starting point. The strategies below let you move incrementally, validating gains at each step before committing further.
The Gradual Approach
You don't need to rewrite your entire application overnight. Here's a practical three-phase migration strategy:
Phase 1: New Routes in Fastify (Weeks 1-2)
Start by creating new endpoints in Fastify while keeping your existing Express app running. Use a reverse proxy to route traffic based on path:
```javascript
// fastify-app.js - New service endpoints
const fastify = require('fastify')({ logger: true });

fastify.get('/api/v2/users', async (request, reply) => {
  // New, optimized endpoint
  return await fetchUsers();
});

fastify.listen({ port: 3001 });
```

```nginx
# nginx.conf
# Route /api/v2/* to Fastify (port 3001)
# Route everything else to Express (port 3000)
```
Phase 2: Migrate High-Traffic Routes (Weeks 3-6)
Identify your hottest paths using your APM tools. These are your quick wins: migrating them gives you immediate performance benefits:
```javascript
// Before (Express)
app.get('/api/search', async (req, res) => {
  const { q, limit = 10 } = req.query;
  const results = await searchService.search(q, limit);
  res.json(results);
});
```

```javascript
// After (Fastify v5)
const searchSchema = {
  querystring: {
    type: 'object',
    properties: {
      q: { type: 'string', minLength: 1 },
      limit: { type: 'integer', minimum: 1, maximum: 100, default: 10 }
    },
    required: ['q']
  }
};

fastify.get('/api/search', { schema: searchSchema }, async (request, reply) => {
  const { q, limit } = request.query;
  return await searchService.search(q, limit);
});
```
Phase 3: Complete Migration (Weeks 7-12)
Once you're comfortable with Fastify and have validated performance improvements, migrate remaining routes. This is also a great time to clean up technical debt and improve your schemas.
Testing the Waters
Before committing to a full migration, validate your assumptions:
- Create a proof of concept: Migrate one representative endpoint and load test it
- Measure everything: Compare response times, throughput, and resource usage
- Check compatibility: Ensure your existing middleware and libraries have Fastify equivalents
- Validate the plugin ecosystem: Confirm that plugins exist for your critical dependencies (database drivers, authentication, etc.)
Here's a complete benchmark setup you can run yourself on Node.js 20 LTS. It compares an equivalent JSON endpoint in both frameworks using autocannon, Fastify's recommended load testing tool.
Install dependencies:

```shell
npm install fastify express autocannon
```
Express server (express-server.js)
```javascript
const express = require('express');
const app = express();

// Serve a small JSON payload identical to the Fastify server's
app.get('/api/users', (req, res) => {
  res.json([
    { id: '1', name: 'Alice', role: 'admin' },
    { id: '2', name: 'Bob', role: 'user' }
  ]);
});

app.listen(3000, () => console.log('Express listening on :3000'));
```
Fastify server (fastify-server.js)
```javascript
const fastify = require('fastify')({ logger: false });

const responseSchema = {
  200: {
    type: 'array',
    items: {
      type: 'object',
      properties: {
        id: { type: 'string' },
        name: { type: 'string' },
        role: { type: 'string' }
      }
    }
  }
};

fastify.get('/api/users', { schema: { response: responseSchema } }, async () => {
  return [
    { id: '1', name: 'Alice', role: 'admin' },
    { id: '2', name: 'Bob', role: 'user' }
  ];
});

fastify.listen({ port: 3001 }, (err) => {
  if (err) throw err;
  console.log('Fastify listening on :3001');
});
```
Benchmark runner (benchmark.js)
```javascript
const autocannon = require('autocannon');
const { spawn } = require('child_process');

function startServer(file) {
  const proc = spawn(process.execPath, [file], { stdio: 'inherit' });
  // Give the server 500 ms to bind before we start hammering it
  return new Promise((resolve) => setTimeout(() => resolve(proc), 500));
}

async function runTest(url, name) {
  console.log(`\nBenchmarking ${name} at ${url} ...`);
  const result = await autocannon({
    url,
    connections: 100, // concurrent connections
    duration: 30,     // seconds
    pipelining: 10    // HTTP pipelining factor
  });
  console.log(`\n${name} results (Node.js 20 LTS):`);
  console.log(`  Requests/sec (avg) : ${result.requests.average.toLocaleString()}`);
  console.log(`  Latency (avg)      : ${result.latency.average} ms`);
  console.log(`  Latency (p99)      : ${result.latency.p99} ms`);
  console.log(`  Throughput (avg)   : ${(result.throughput.average / 1024 / 1024).toFixed(1)} MB/s`);
  return result;
}

(async () => {
  const expressProc = await startServer('express-server.js');
  const fastifyProc = await startServer('fastify-server.js');
  const express = await runTest('http://localhost:3000/api/users', 'Express v4');
  const fastify = await runTest('http://localhost:3001/api/users', 'Fastify v5');
  expressProc.kill();
  fastifyProc.kill();
  const ratio = (fastify.requests.average / express.requests.average).toFixed(2);
  console.log(`\nFastify handled ${ratio}x as many requests/sec as Express.`);
})();
```
Run it with:
```shell
node benchmark.js
```
Results on Node.js 20.11 LTS
| Metric | Express v4 | Fastify v5 | Improvement |
|---|---|---|---|
| Requests/sec | 38,510 | 76,835 | 2.0x faster |
| Avg latency | 2.6 ms | 1.3 ms | 50% lower |
| P99 latency | 48 ms | 12 ms | 75% lower |
The P99 number is the one that matters most in practice. A 75% reduction in your worst-case latency means fewer timeout errors, a more consistent user experience, and more headroom before you need to scale out.
Common Gotchas and How to Avoid Them
Middleware Translation
Express middleware doesn't work directly in Fastify, but most patterns have equivalents:

| Express Pattern | Fastify Equivalent |
|---|---|
| `app.use(middleware)` | `fastify.addHook('onRequest', handler)` |
| `res.json(data)` | `return data` or `reply.send(data)` |
| `next()` | Not needed, use async/await |
| `req.params.id` | `request.params.id` |
| Custom middleware | Plugins or hooks |
Reply vs Return
In Fastify, you can either use the reply object or return data directly. Pick one pattern and stick with it:
```javascript
// Pattern 1: Using reply (more control)
fastify.get('/users', async (request, reply) => {
  const users = await getUsers();
  reply.code(200).send(users);
});

// Pattern 2: Return directly (cleaner for simple cases)
fastify.get('/users', async (request, reply) => {
  return await getUsers(); // Automatically sends 200 with data
});
```
Schema Strictness
Fastify's schema handling is strict by default. On the response side, the serializer only writes properties declared in the schema, so anything extra is silently dropped. On the request side, Fastify's default Ajv configuration (removeAdditional: true) strips unknown body properties when the schema sets additionalProperties: false, rather than rejecting the request. This is a security feature, but it can surprise developers coming from Express. In v5, this strictness is reinforced: full schema objects are required for all input parameters, so there's no ambiguity about what the framework will accept.
Why Teams Are Making the Switch
After working with Fastify across multiple projects, a clear pattern emerges in what drives teams to make the switch and what keeps them there. Here's what consistently stands out:
- Performance gains are real and measurable: Expect 2x or more throughput improvements without code changes beyond framework migration
- Developer experience is excellent: Async/await everywhere, first-class TypeScript support written into the framework itself, and helpful error messages make daily development smoother
- The plugin ecosystem is mature: With 296+ plugins, most common needs (auth, CORS, rate limiting, database connections, Swagger) have well-maintained solutions
- Schema-driven development prevents bugs: Catching validation errors at the framework level means fewer bugs in production, and the TypeBox integration gives you runtime safety and compile-time types from a single source of truth
- Migration risk is manageable: The gradual migration approach means you can validate benefits before full commitment
- v5 is the modern baseline: The cleanup work in Fastify v5 results in a more consistent API surface, and the Node.js v20+ requirement means you're building on a well-supported runtime
The teams seeing the most success with Fastify share some common traits. They value performance, appreciate schema-driven development, and are comfortable with modern JavaScript patterns. They're building APIs that need to scale efficiently, whether that's handling high traffic or reducing infrastructure costs.
Is Fastify right for every project? Probably not. If you have a small internal tool with minimal traffic, Express's larger ecosystem and familiarity might outweigh Fastify's performance benefits. But for customer-facing APIs, microservices, or any application where performance and efficiency matter, Fastify deserves serious consideration.
The question to ask yourself: Are you building something that needs to be fast, or are you building something that just needs to work? If the answer is both (and let's be honest, it usually is), Fastify might be exactly what you've been looking for.
What performance bottlenecks are you currently accepting as "just how things are"? And more importantly, what could you build if those constraints disappeared?