Turbopack Internals: Why It's Faster Than Webpack and What Its Limits Are
A deep dive into Turbopack's incremental computation engine, Rust-based architecture, and function-level memoization — exploring why it outperforms Webpack by up to 24×, real benchmark numbers, current limitations like missing Module Federation support, and how to enable it in Next.js today.
The Bundler War Is Not Over
For years, Webpack has been the undisputed king of JavaScript bundling. It powered nearly every major React application, Next.js site, and enterprise frontend stack. But as codebases grew larger and developer frustration with slow build times mounted, the ecosystem started looking for alternatives. Enter Turbopack — Vercel's bet on a fundamentally different architecture for JavaScript tooling.
Turbopack isn't just a faster Webpack. It's a rethink of how bundlers should work at a systems level — written in Rust, powered by an incremental computation engine, and designed from day one to stay fast as your project scales.
In this deep dive, we'll peel back the layers of Turbopack's internals: the Turbo engine, its graph-based dependency model, why Rust matters here, how it stacks up against Webpack in real benchmark numbers, and what you still can't do with it today.
What Is Turbopack, Really?
Turbopack is an incremental bundler and build system built with Rust, announced by Vercel at Next.js Conf 2022. It was created by Webpack's original author, Tobias Koppers, who joined Vercel to lead the project, and is designed to replace Webpack inside the Next.js ecosystem — and eventually serve as a standalone bundler.
The key distinction: Turbopack is not a port of Webpack to Rust. It's a ground-up redesign that abandons many of Webpack's core architectural decisions in favor of an incremental computation model borrowed from build systems like Bazel, Buck, and Turborepo (which shares the same underlying "Turbo" engine).
"Turbopack uses an on-demand, incremental computation engine that only processes what it needs to, when it needs to." — Vercel Engineering Blog
The Incremental Computation Model: The Core Insight
The single biggest reason Turbopack is fast isn't Rust. It's the incremental computation model — specifically the Turbo engine, a demand-driven, lazy evaluation system with fine-grained memoization baked into its core.
How Traditional Bundlers Think
Webpack and most traditional bundlers work in sequential phases:
- Resolve all entry points
- Build the full dependency graph
- Transform every discovered module
- Optimize, tree-shake, and emit chunks
Even with caching layers bolted on, Webpack's mental model is fundamentally a full pipeline — it needs to understand the entire graph before it can emit anything useful. This works reasonably well for small projects but becomes a serious bottleneck as your codebase reaches hundreds or thousands of modules.
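The full-pipeline model above can be sketched in a few lines. This is a toy model for illustration — the module names and transform are hypothetical, and real Webpack is vastly more sophisticated — but it shows the key property: every reachable module is discovered and transformed before anything is emitted.

```javascript
// Toy model of an eager, phase-based bundler: build the whole graph,
// transform everything, then emit. Module names are hypothetical.
const modules = {
  'entry.js': { deps: ['a.js', 'b.js'], src: 'entry' },
  'a.js':     { deps: ['b.js'],         src: 'a' },
  'b.js':     { deps: [],               src: 'b' },
};

function eagerBuild(entry) {
  // Phase 1-2: walk every reachable module to build the full graph
  const graph = new Set();
  const visit = (id) => {
    if (graph.has(id)) return;
    graph.add(id);
    modules[id].deps.forEach(visit);
  };
  visit(entry);

  // Phase 3: transform every discovered module, changed or not
  const transformed = [...graph].map((id) => modules[id].src.toUpperCase());

  // Phase 4: emit a chunk only after all prior phases complete
  return transformed.join('\n');
}

console.log(eagerBuild('entry.js')); // all three modules processed up front
```

Every rebuild in this model pays a cost proportional to the whole graph, which is exactly the scaling problem the next section addresses.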
Turbo Engine: Function-Level Caching
Turbopack's engine operates on a completely different model. Every operation — resolving a file, parsing it, transforming it, linking dependencies — is modeled as a pure function with memoized outputs. The engine tracks which inputs produced which outputs, forming a fine-grained dependency DAG (Directed Acyclic Graph) at the function level, not the module level.
When you change a file, the engine invalidates only the downstream functions whose inputs changed. It doesn't re-process the entire graph. It re-runs only the smallest possible set of computations necessary to produce an updated result.
This is called demand-driven evaluation: the engine starts from the output you need — say, the dev server response for a specific module — and walks backwards through the graph, executing only computations that are stale.
// Conceptual model of Turbo engine's computation graph
[file A changed]
→ invalidate: parse(A)
→ invalidate: transform(A)
→ invalidate: link(A → B)
// parse(B), transform(B) → still valid, served from cache
// parse(C), transform(C) → untouched entirely
This is fundamentally different from Webpack's incremental builds, which re-process entire chunks due to how its cache is structured around bundle boundaries rather than individual function-level computations.
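The function-level memoization idea can be sketched as follows. This is a minimal toy in the spirit of the Turbo engine, not its real API: each task is keyed, its inputs are recorded, and it re-runs only when those inputs change.

```javascript
// Minimal sketch of function-level memoization with input tracking,
// in the spirit of the Turbo engine (not its actual API).
const cache = new Map();   // key -> { value, inputs }
let computeCount = 0;      // how many times real work ran

// Re-run fn only when the recorded inputs for `key` have changed
function memoized(key, inputs, fn) {
  const hit = cache.get(key);
  if (hit && JSON.stringify(hit.inputs) === JSON.stringify(inputs)) {
    return hit.value;      // inputs unchanged: serve from cache
  }
  computeCount += 1;
  const value = fn(...inputs);
  cache.set(key, { value, inputs });
  return value;
}

const files = { a: 'let x = 1', b: 'let y = 2' };
const parse = (src) => ({ ast: src.split(' ') });

// First build: both files are parsed
memoized('parse:a', [files.a], parse);
memoized('parse:b', [files.b], parse);

// Edit only file a, then rebuild: parse:b is a pure cache hit
files.a = 'let x = 42';
memoized('parse:a', [files.a], parse);
memoized('parse:b', [files.b], parse);

console.log(computeCount); // 3: a parsed twice, b parsed only once
```

Scale this from two parse calls to millions of resolve/parse/transform/link invocations and you have the shape of the Turbo engine's win: unchanged work is never repeated.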
Persistent Caching Across Sessions
The Turbo engine also supports persistent caching that survives process restarts. Computed results are serialized to disk, so your second next dev startup is dramatically faster — it doesn't re-parse files that haven't changed since the last session. Webpack 5 introduced filesystem caching too, but Turbopack's cache operates at a more granular level. Where Webpack caches module compilation results, Turbopack caches individual function invocations — giving it much finer invalidation granularity and far higher cache hit rates in practice.
Rust-Based Architecture: More Than Raw Speed
Turbopack is written in Rust, and this choice has implications beyond simple execution speed.
No Garbage Collector Pauses
JavaScript (Node.js) has a garbage collector. For long-running processes like dev servers, GC pauses can introduce latency spikes at the worst possible moments — right when you save a file and expect instant feedback. Rust's ownership model eliminates this entirely: no GC, predictable memory usage, no stop-the-world pauses during HMR updates.
True CPU Parallelism
Node.js is single-threaded by default. Webpack achieves parallelism through worker threads (thread-loader, HappyPack), but inter-thread communication has overhead and coordination is complex. Rust's fearless concurrency model allows Turbopack to parallelize the entire computation graph safely — transforming dozens of files simultaneously across all available CPU cores with minimal coordination overhead.
Native SWC Integration
Turbopack uses SWC (Speedy Web Compiler) as its JavaScript and TypeScript transformer — also written in Rust. This means the transform pipeline is entirely native: no spawning Babel worker processes, no serialization overhead between JavaScript and native code. Parsing, transforming, and type-stripping all happen in the same memory space as the bundler itself.
Babel, in contrast, runs in JavaScript and transforms ASTs through a plugin chain that — while extremely flexible — adds significant overhead per module. A project with 500 modules and a complex Babel config can spend multiple seconds purely in transformation. Turbopack eliminates this bottleneck at the architectural level.
Turbopack vs Webpack: Architecture Head-to-Head
Dependency Graph Construction
Webpack builds its dependency graph eagerly — it resolves and processes every import it can find, constructing a complete picture before outputting anything. This is why webpack --watch takes time to spin up, even before you've touched a single file.
Turbopack builds the graph lazily and on-demand. When the dev server receives a request for a module, it resolves only that module's immediate dependencies — traversing the graph incrementally as requests arrive. Modules that are never requested in a session are never processed. On large apps, this alone can save multiple seconds of startup time.
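The contrast with the eager model is easy to see in a toy demand-driven resolver: work starts from the requested entry and touches only its subgraph. Routes and module names here are hypothetical.

```javascript
// Toy demand-driven resolver: a module is processed only when a
// request reaches it. Dependency lists and names are hypothetical.
const deps = {
  '/index':  ['/about', '/shared'],
  '/about':  ['/shared'],
  '/shared': [],
  '/admin':  ['/shared'],   // exists, but never requested below
};

const processed = new Set();

function processModule(id) {
  if (processed.has(id)) return;
  processed.add(id);         // stand-in for resolve + parse + transform
  deps[id].forEach(processModule);
}

// The dev server receives a request for /index:
// only that route's subgraph is ever built.
processModule('/index');

console.log(processed.has('/admin')); // false: never requested, never processed
console.log(processed.size);          // 3
```

In a real app with dozens of routes, the modules behind pages you never open in a dev session simply cost nothing.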
Hot Module Replacement
Webpack's HMR works by rebuilding the affected chunk and pushing the update to the browser. For large chunks — which are common in real apps — this means re-bundling hundreds of modules even if only one changed.
Turbopack's HMR is module-granular. Because the computation graph is function-level, a file change triggers re-computation of only the affected functions. The browser receives a minimal patch — just the updated module(s), not an entire re-bundled chunk. This is why Turbopack's HMR latency stays nearly constant regardless of project size.
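Computing a minimal patch amounts to walking the graph's reverse edges from the changed file. The sketch below is a simplified model — real HMR also stops propagation at accept boundaries, and the file names are hypothetical.

```javascript
// Sketch of module-granular invalidation: given reverse dependency
// edges (module -> its importers), find the minimal dirty set when
// one file changes. File names are hypothetical.
const importers = {
  'Button.tsx': ['Header.tsx', 'Form.tsx'],
  'Header.tsx': ['App.tsx'],
  'Form.tsx':   ['App.tsx'],
  'App.tsx':    [],
  'utils.ts':   ['Form.tsx'],
};

function affectedBy(changed) {
  const dirty = new Set();
  const walk = (id) => {
    if (dirty.has(id)) return;
    dirty.add(id);
    (importers[id] || []).forEach(walk);
  };
  walk(changed);
  return dirty;
}

// Changing Button.tsx dirties only its importer chain, not utils.ts
const patch = affectedBy('Button.tsx');
console.log([...patch]);            // Button.tsx, Header.tsx, App.tsx, Form.tsx
console.log(patch.has('utils.ts')); // false: untouched, not re-sent
```

The patch size depends on the shape of the importer chain, not on total module count — which is why HMR latency stays flat as the app grows.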
Plugin and Loader Architecture
Webpack's plugin system is JavaScript-based and extremely mature — thousands of plugins exist for SVGs, MDX, YAML, CSS modules, and virtually every other use case. But that JavaScript-based flexibility is also its Achilles' heel: JS plugins can introduce significant overhead, and their interaction with Webpack's internal graph can cause cascading slowdowns that are difficult to diagnose.
Turbopack's plugin system is still maturing. It uses a Rust-based transformation pipeline with "transforms" operating on asset types. Custom Webpack loaders aren't directly portable, which remains one of Turbopack's most significant current limitations.
Real Benchmark Numbers
Vercel published controlled benchmarks comparing Turbopack and Webpack dev server performance. Here are the key findings:
Cold Start (First Dev Server Boot)
- 1,000 module app: Turbopack ~1.8s vs Webpack ~8.5s — 4.7× faster
- 5,000 module app: Turbopack ~3.4s vs Webpack ~28s — 8.2× faster
- 30,000 module app: Turbopack ~7.1s vs Webpack ~170s+ — 24× faster
HMR Update Speed (After a File Change)
- Small app (<1,000 modules): Turbopack ~15ms vs Webpack ~250ms
- Large app (5,000+ modules): Turbopack ~20ms vs Webpack ~1,200ms+
Note: These benchmarks come from Vercel's controlled environments. Real-world results vary based on machine specs, project configuration, and module complexity. What matters is the relative improvement — and it consistently grows in Turbopack's favor as project size increases.
Why the Gap Grows With Scale
This is the most critical insight: Turbopack's advantage isn't linear — it's superlinear. As module count grows, Webpack's costs compound because more of the graph must be re-evaluated. Turbopack's incremental model keeps HMR times nearly constant because it only reprocesses the changed subgraph. At 30,000 modules, this difference is measured in minutes, not milliseconds.
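A back-of-envelope cost model makes the scaling argument concrete. The constants below are illustrative, not measured: a full-pipeline rebuild pays a per-module cost for every module, while an incremental rebuild pays only for the changed subgraph.

```javascript
// Toy cost model: full rebuild cost grows with module count n,
// incremental cost depends only on the changed subgraph.
// Both constants are hypothetical, chosen for round numbers.
const msPerModule = 1;      // assumed per-module rebuild cost
const changedSubgraph = 5;  // modules affected by a typical edit

const fullRebuildMs = (n) => n * msPerModule;
const incrementalMs = () => changedSubgraph * msPerModule;

for (const n of [1000, 5000, 30000]) {
  console.log(`${n} modules: full ${fullRebuildMs(n)}ms, incremental ${incrementalMs()}ms`);
}
// The full-rebuild column grows linearly with n; the incremental
// column stays flat, so the ratio between them widens with scale.
```

This is the superlinear advantage in miniature: the denominator stays constant while the numerator grows with the project.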
How to Enable Turbopack in Next.js
Turbopack's dev server became stable in Next.js 15. On Next.js 15.x you opt in with a flag:
# Next.js 15.x — opt in to the stable Turbopack dev server
next dev --turbopack
In Next.js 16 and later, Turbopack is the default dev bundler, so no flag is needed.
If you're on Next.js 13 or 14 and want to opt in:
# Explicit opt-in (Next.js 13.x - 14.x)
next dev --turbo
You can configure Turbopack behavior in next.config.js:
// next.config.js
/** @type {import('next').NextConfig} */
const nextConfig = {
  experimental: {
    turbo: {
      // Custom module aliases
      resolveAlias: {
        'underscore': 'lodash',
      },
      // File extension resolution order
      resolveExtensions: ['.tsx', '.ts', '.jsx', '.js'],
    },
  },
};

module.exports = nextConfig;
For production builds (next build), Webpack remains the stable default on Next.js 15. Turbopack production builds are available behind a flag in newer releases:
# Opt-in production build (Next.js 15.3+)
next build --turbopack
Current Limitations: Where Turbopack Falls Short
Turbopack is genuinely impressive, but it is not a drop-in replacement for Webpack in all scenarios. Here are the limitations that will matter for real projects:
1. No Stable Production Builds Yet
Turbopack's production build pipeline is still experimental. Most teams use Turbopack only for next dev and keep Webpack for next build. This creates a potential dev/prod parity gap if your project relies heavily on bundler-specific behavior.
2. Limited Custom Loader Support
Webpack's loader ecosystem spans thousands of packages. Turbopack supports a growing subset via its built-in asset pipeline, but custom Webpack loaders cannot be directly ported. Projects with non-standard loaders for MDX, SVG sprite generation, or exotic file types may need to find alternatives.
3. No Module Federation
Webpack 5's Module Federation — the backbone of many micro-frontend architectures — is not currently supported in Turbopack. Teams using Module Federation are effectively blocked from migrating until this is addressed.
4. CommonJS Interop Edge Cases
Turbopack handles CommonJS modules differently from Webpack in some edge cases — particularly around dynamic require() patterns and circular dependencies in legacy packages. This can cause unexpected failures when working with older npm packages.
5. Immature Observability Tooling
Tools like Webpack Bundle Analyzer have no direct Turbopack equivalent yet. Understanding what's in your bundle, debugging size regressions, and optimizing code splitting are harder without this ecosystem maturity.
6. Windows Performance Discrepancy
Some developers report less pronounced performance gains on Windows compared to macOS or Linux, partly due to filesystem event handling differences and NTFS path overhead. This is actively being worked on but worth factoring in for cross-platform teams.
The Future Roadmap
Turbopack's development velocity is high, and the trajectory is clear. Here's what's coming:
Stable Production Builds
Making next build powered by Turbopack is the team's primary milestone. When this stabilizes, the entire Next.js build lifecycle — dev and prod — will run on the same incremental engine, eliminating the current parity gap entirely.
Turbopack as a Standalone Bundler
The Turbo engine is already extracted as a separate Rust crate shared with Turborepo. The long-term vision is Turbopack as a general-purpose bundler, usable outside of Next.js — competing directly with Vite and esbuild in the broader ecosystem.
Webpack Loader Compatibility Layer
Vercel is actively building a compatibility shim that will allow existing Webpack loaders to run inside Turbopack's pipeline. JS-based loaders won't get full parallelism benefits, but the migration path from complex Webpack configs will become far smoother.
Module Federation Support
Module Federation is on the roadmap, though without a firm timeline. This is the key unlock for enterprise micro-frontend teams and represents a significant unblocking for widespread Turbopack adoption in large organizations.
Remote Cache Sharing
The architecture already supports sharing cached computation results across machines — analogous to Turborepo's Remote Caching. This could mean CI pipelines and local dev environments share the same cache, making cold starts nearly instant even on fresh containers.
Should You Switch Today?
For Next.js 15+ projects — Turbopack's stable dev server is a flag away (and the default in newer releases). The question is whether to opt into experimental production builds. For most production teams: not yet.
For Next.js 13/14 projects with complex Webpack configurations — evaluate carefully. If you rely on Module Federation, custom loaders, or non-standard CJS packages, Turbopack may not be ready. If your config is relatively standard, running next dev --turbo in a branch is a low-risk experiment worth trying.
For greenfield projects — embrace Turbopack via Next.js 15. You'll get immediate dev speed gains, and by the time you need stable production builds, they'll very likely be ready.
Conclusion
Turbopack represents a genuine architectural leap in JavaScript bundling — not primarily because it's written in Rust, but because the incremental computation model solves a fundamental scaling problem that has haunted large-scale frontend development for years.
The compounding innovations — demand-driven evaluation, function-level memoization, persistent cross-session caching, and fine-grained HMR invalidation — together produce performance characteristics that scale gracefully with project size. This is why Turbopack is 24× faster than Webpack on 30,000-module apps: it isn't doing the same work faster — it's doing fundamentally less work.
The limitations are real and should not be minimized. Production build stability, Module Federation, and the Webpack plugin ecosystem are gaps that matter to real teams today. But the direction is unmistakable: Turbopack is where Next.js is heading, and understanding its internals gives you the mental model to know when to embrace it, when to wait, and why the performance numbers aren't magic — they're engineering.