React Suspense and Streaming: Reduce TTFB With Server Components
React Suspense with streaming SSR lets you send HTML progressively, improving perceived performance. Learn how to combine Server Components, Suspense, and streaming to dramatically reduce TTFB.
The Problem With Traditional SSR
Traditional server-side rendering waits until the entire page is rendered before sending any HTML. If one data fetch is slow (say, a personalized recommendations panel), the whole page is blocked. Users see nothing until every piece of data arrives.
React Suspense with streaming SSR solves this: you send the HTML shell immediately, then stream in the deferred parts as they become ready.
How Streaming Works
With streaming, the server sends HTML in chunks using HTTP chunked transfer encoding. The browser can start rendering and executing JavaScript from the first chunk while the server is still processing the rest.
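The chunk sequence can be sketched independently of React: an async generator yields the shell right away, then the deferred section once its slow data resolves. This is a simplified illustration of the idea, not React's actual output; `fetchRecommendations` and the element ids are hypothetical stand-ins.

```typescript
// Hypothetical slow data source, simulated with a timer.
async function fetchRecommendations(): Promise<string> {
  await new Promise<void>(r => setTimeout(r, 50));
  return '<ul><li>Post A</li><li>Post B</li></ul>';
}

async function* renderChunks(): AsyncGenerator<string> {
  // Chunk 1: the shell, flushed immediately with a fallback slot in place.
  yield '<main><header>Site</header><div id="rec">Loading…</div>';
  // Chunk 2: when the slow fetch settles, send the real markup plus a small
  // inline script that swaps it into the fallback slot. React's streaming
  // runtime emits roughly this shape of follow-up chunk for you.
  const recs = await fetchRecommendations();
  yield `<template id="rec-html">${recs}</template>` +
    `<script>document.getElementById('rec').replaceChildren(` +
    `document.getElementById('rec-html').content.cloneNode(true))</script>` +
    `</main>`;
}
```

The browser renders chunk 1 as soon as it arrives, so the fallback is visible while the server is still awaiting the fetch.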
// app/page.tsx — Next.js App Router streaming example
import { Suspense } from 'react';
import { Nav } from './Nav';
import { ArticleList } from './ArticleList';
import { RecommendedPosts } from './RecommendedPosts';
import { Skeleton } from './Skeleton';

export default function HomePage() {
  return (
    <main>
      {/* Shell renders immediately — no data needed */}
      <header><Nav /></header>

      {/* Critical content — streams as soon as data is ready */}
      <Suspense fallback={<Skeleton rows={5} />}>
        <ArticleList />
      </Suspense>

      {/* Non-critical — streams later, doesn't block above */}
      <Suspense fallback={<Skeleton rows={3} />}>
        <RecommendedPosts />
      </Suspense>
    </main>
  );
}
Server Components That Fetch Data
// app/ArticleList.tsx — Server Component
// No "use client" directive = runs on server
import { db } from '@/lib/db'; // your database client (a Prisma-style API shown here)

export async function ArticleList() {
  // This fetch happens on the server
  const articles = await db.post.findMany({
    where: { status: 'PUBLISHED' },
    orderBy: { publishedAt: 'desc' },
    take: 10,
    select: { id: true, title: true, slug: true, summary: true }
  });

  return (
    <ul>
      {articles.map(article => (
        <li key={article.id}>
          <a href={`/posts/${article.slug}`}>{article.title}</a>
          <p>{article.summary}</p>
        </li>
      ))}
    </ul>
  );
}
Parallel Data Fetching in Server Components
Avoid sequential awaits in Server Components — fetch in parallel:
// BAD: sequential — total time = A + B + C
async function Dashboard() {
  const user = await getUser();     // 100ms
  const stats = await getStats();   // 200ms
  const alerts = await getAlerts(); // 150ms
  // Total: 450ms
}

// GOOD: parallel — total time = max(A, B, C)
async function Dashboard() {
  const [user, stats, alerts] = await Promise.all([
    getUser(),   // 100ms
    getStats(),  // 200ms
    getAlerts(), // 150ms
  ]);
  // Total: 200ms
}
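The same timing argument is runnable in plain TypeScript, with timers standing in for the three fetches (`getUser`, `getStats`, and `getAlerts` here are simulated, not real APIs):

```typescript
// Simulated data fetches with the delays from the example above.
const delay = (ms: number) => new Promise<void>(r => setTimeout(r, ms));
const getUser = async () => { await delay(100); return 'user'; };
const getStats = async () => { await delay(200); return 'stats'; };
const getAlerts = async () => { await delay(150); return 'alerts'; };

async function sequentialMs(): Promise<number> {
  const start = Date.now();
  await getUser();
  await getStats();
  await getAlerts();
  return Date.now() - start; // delays add up: roughly 450ms
}

async function parallelMs(): Promise<number> {
  const start = Date.now();
  await Promise.all([getUser(), getStats(), getAlerts()]);
  return Date.now() - start; // bounded by the slowest call: roughly 200ms
}
```

Note that `Promise.all` rejects as soon as any input rejects; if some of the fetches are optional, `Promise.allSettled` lets the rest complete.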
Loading UI with loading.tsx
// app/posts/[slug]/loading.tsx
// This file creates the Suspense boundary automatically
export default function Loading() {
  return (
    <div className="animate-pulse">
      <div className="h-8 bg-gray-200 rounded w-3/4 mb-4" />
      <div className="h-4 bg-gray-200 rounded w-full mb-2" />
      <div className="h-4 bg-gray-200 rounded w-5/6" />
    </div>
  );
}
Measuring the Improvement
Streaming typically improves:
- TTFB (Time to First Byte): from 500-1500ms to under 100ms because the shell sends immediately
- FCP (First Contentful Paint): users see content sooner even if some sections are still loading
- Perceived performance: skeletons feel much better than blank screens
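The TTFB effect can be demonstrated with a toy Node server that flushes a shell immediately and finishes the response 150ms later, plus a client that records when the first byte arrives versus when the download completes. This is an illustrative sketch only; for real pages, measure with Lighthouse, WebPageTest, or the browser's performance tooling.

```typescript
import * as http from 'node:http';

// Toy server: the shell is flushed right away, the rest 150ms later,
// mimicking a streamed page with one slow section.
const server = http.createServer((_req, res) => {
  res.writeHead(200, { 'Content-Type': 'text/html' });
  res.write('<main>shell</main>'); // first byte leaves the server now
  setTimeout(() => res.end('<footer>slow section</footer>'), 150);
});

// Client: record TTFB (first 'data' event) and total download time.
function measure(port: number): Promise<{ ttfb: number; total: number }> {
  return new Promise(resolve => {
    const start = Date.now();
    http.get({ port }, res => {
      let ttfb = 0;
      res.once('data', () => { ttfb = Date.now() - start; });
      res.on('end', () => resolve({ ttfb, total: Date.now() - start }));
      res.resume();
    });
  });
}
```

Locally, `ttfb` comes back near zero while `total` includes the full 150ms delay: the shell did not wait for the slow section.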
When Not to Use Streaming
Streaming doesn't help if your entire page depends on one critical data source. It shines when different page sections have independent data needs with different fetch times. Identify your page's slow parts and wrap only those in Suspense boundaries.