Streaming AI Responses in React: Techniques and Patterns
Streaming makes AI responses feel instant by showing text as it generates. Learn the three patterns for streaming AI in React — from simple to advanced — and when to use each.
Why Streaming Transforms AI UX
Without streaming, users wait 3-8 seconds staring at a loading spinner before seeing any response. With streaming, text appears within 200-400ms and flows in naturally — just like watching someone type. This single change makes AI interfaces feel dramatically more responsive.
Pattern 1: Using the useChat Hook (Simplest)
The Vercel AI SDK's useChat handles streaming automatically. This is the right choice for 90% of chat interfaces:
'use client';
import { useChat } from 'ai/react';

export function SimpleStreamingChat() {
  const { messages, input, handleInputChange, handleSubmit, isLoading } = useChat({
    api: '/api/chat',
  });

  return (
    <div>
      {messages.map(m => (
        <div key={m.id} className={m.role}>
          {/* Text appears incrementally as it streams */}
          <p>{m.content}</p>
        </div>
      ))}
      <form onSubmit={handleSubmit}>
        <input value={input} onChange={handleInputChange} />
        <button type="submit" disabled={isLoading}>Send</button>
      </form>
    </div>
  );
}
Pattern 2: Manual Fetch With ReadableStream (More Control)
When you need custom handling or aren't using the AI SDK's hooks:
'use client';
import { useState } from 'react';

export function ManualStreamingComponent() {
  const [response, setResponse] = useState('');
  const [loading, setLoading] = useState(false);

  async function sendMessage(prompt: string) {
    setLoading(true);
    setResponse('');
    try {
      const res = await fetch('/api/generate', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ prompt }),
      });
      if (!res.ok || !res.body) throw new Error(`Request failed: ${res.status}`);

      const reader = res.body.getReader();
      const decoder = new TextDecoder();
      while (true) {
        const { done, value } = await reader.read();
        if (done) break;
        // { stream: true } keeps multi-byte characters intact across chunk boundaries
        const text = decoder.decode(value, { stream: true });
        setResponse(prev => prev + text);
      }
    } finally {
      setLoading(false);
    }
  }

  return (
    <div>
      <p className="whitespace-pre-wrap">{response}</p>
      {loading && <span className="animate-pulse">▋</span>}
    </div>
  );
}
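The read loop above is the same for any streaming endpoint, so it can be factored into a small standalone helper. This is a sketch under my own naming — `readTextStream` is not a library API — that drains a byte stream, decodes each chunk, and hands the text to a callback such as a React state setter:

```typescript
// Hypothetical helper: drains a ReadableStream of bytes, decoding each chunk
// as UTF-8 and passing the decoded text to a callback. Returns the full text.
export async function readTextStream(
  stream: ReadableStream<Uint8Array>,
  onChunk: (text: string) => void,
): Promise<string> {
  const reader = stream.getReader();
  // { stream: true } below avoids splitting multi-byte characters across chunks
  const decoder = new TextDecoder();
  let full = '';
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    const text = decoder.decode(value, { stream: true });
    full += text;
    onChunk(text);
  }
  return full;
}
```

Inside the component, the loop body then collapses to `await readTextStream(res.body, t => setResponse(prev => prev + t))`.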
Pattern 3: Server-Sent Events for Real-Time Updates
For cases where the server needs to push multiple streaming updates (like agent progress):
// app/api/agent-run/route.ts
// Exported as GET because the EventSource client below can only issue GET requests.
export async function GET() {
  const encoder = new TextEncoder();
  const stream = new ReadableStream({
    async start(controller) {
      function send(data: object) {
        controller.enqueue(encoder.encode(`data: ${JSON.stringify(data)}\n\n`));
      }
      send({ type: 'status', message: 'Starting analysis...' });
      await runStep1();
      send({ type: 'status', message: 'Processing results...' });
      await runStep2();
      send({ type: 'complete', result: 'Done!' });
      controller.close();
    },
  });
  return new Response(stream, {
    headers: {
      'Content-Type': 'text/event-stream',
      'Cache-Control': 'no-cache',
    },
  });
}
// Client: consume SSE with EventSource
const es = new EventSource('/api/agent-run');
es.onmessage = (event) => {
  const data = JSON.parse(event.data);
  if (data.type === 'status') setStatus(data.message);
  if (data.type === 'complete') {
    setResult(data.result);
    es.close();
  }
};
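One caveat: `EventSource` always issues a GET and cannot send a JSON body, so if the agent run needs POST parameters you have to `fetch` the stream yourself and parse the SSE frames by hand. Below is a minimal sketch of such a parser; `createSSEParser` is my own name, not a standard API. It buffers partial frames across chunks and yields the JSON payload of each complete `data:` line:

```typescript
// Hypothetical parser: feed it raw text as it arrives over fetch; it returns
// the complete SSE events it can extract and keeps any partial frame buffered
// for the next call. Frames are separated by a blank line ("\n\n").
export function createSSEParser() {
  let buffer = '';
  return function parse(chunk: string): unknown[] {
    buffer += chunk;
    const events: unknown[] = [];
    let sep: number;
    while ((sep = buffer.indexOf('\n\n')) !== -1) {
      const frame = buffer.slice(0, sep);
      buffer = buffer.slice(sep + 2);
      for (const line of frame.split('\n')) {
        // Parse only "data:" lines; ignore comments and other SSE fields
        if (line.startsWith('data: ')) events.push(JSON.parse(line.slice(6)));
      }
    }
    return events;
  };
}
```

Combined with the chunk loop from Pattern 2, this lets a POSTed request consume the same `text/event-stream` response the route above produces.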
Rendering Markdown From Streaming Text
Streaming markdown needs special handling: partial syntax (an unclosed code fence, a half-finished link) arrives mid-stream, so naively re-parsing on every token can flicker or briefly render broken markup:
import ReactMarkdown from 'react-markdown';

function StreamingMarkdown({ content }: { content: string }) {
  // Add a cursor to indicate streaming is active
  const displayContent = content + '▋';
  return (
    <div className="prose">
      <ReactMarkdown>{displayContent}</ReactMarkdown>
    </div>
  );
}
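Re-parsing the whole document on every token also gets expensive for long responses. A common mitigation is to split the streamed text at the last blank line, render the completed blocks through a memoized `<ReactMarkdown>`, and re-render only the trailing partial block. A sketch of that split, using a heuristic and a helper name (`splitStable`) of my own:

```typescript
// Hypothetical helper: everything before the last blank line is treated as
// "stable" (completed markdown blocks, safe to memoize); the remainder is the
// still-streaming tail that must be re-rendered on every chunk.
// Heuristic only: an unclosed code fence can still span the boundary.
export function splitStable(content: string): { stable: string; tail: string } {
  const cut = content.lastIndexOf('\n\n');
  if (cut === -1) return { stable: '', tail: content };
  return { stable: content.slice(0, cut + 2), tail: content.slice(cut + 2) };
}
```

The component then renders `stable` through a `React.memo`-wrapped markdown view and `tail` (plus the cursor) through a second, non-memoized one.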
Handling Network Interruptions
Always handle stream failures gracefully with retry logic and clear error states. The AI SDK's useChat handles this automatically; for manual streams, wrap in try/catch and expose a retry button. Users accept occasional failures far better than they accept hangs with no feedback.
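The try/catch-plus-retry advice above can be sketched as a small wrapper around any async streaming call; `withRetry` and its defaults are my own illustration, not part of any SDK:

```typescript
// Hypothetical wrapper: retries an async call with exponential backoff,
// rethrowing after the final attempt so the UI can surface an error state
// with a manual retry button.
export async function withRetry<T>(
  fn: () => Promise<T>,
  attempts = 3,
  baseDelayMs = 500,
): Promise<T> {
  let lastError: unknown;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      if (i < attempts - 1) {
        // Waits 500ms, 1000ms, 2000ms, ... between attempts
        await new Promise(r => setTimeout(r, baseDelayMs * 2 ** i));
      }
    }
  }
  throw lastError;
}
```

Note that retrying a stream restarts it from the beginning, so on retry the UI should clear any partial text it already rendered.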