The useChat Hook Explained: Build AI Chat Interfaces the Right Way
The useChat hook from the Vercel AI SDK handles streaming, message history, and state management for AI chat. Learn all its options and patterns for building production chat interfaces.
What useChat Does for You
Building an AI chat interface from scratch means managing: streaming responses chunk by chunk, message history, loading states, error handling, and form submission. The useChat hook from the Vercel AI SDK handles all of this in about 5 lines of code.
Basic Usage
```tsx
'use client';

import { useChat } from 'ai/react';

export function ChatInterface() {
  const {
    messages,          // Message[] — full conversation history
    input,             // string — current input value
    handleInputChange, // input onChange handler
    handleSubmit,      // form onSubmit handler
    isLoading,         // boolean — streaming in progress
    error,             // Error | undefined
    stop,              // cancel current stream
    reload,            // retry last message
    append,            // programmatically add a message
  } = useChat({
    api: '/api/chat', // your route handler
  });

  return (
    <div className="flex flex-col h-screen">
      <div className="flex-1 overflow-y-auto p-4 space-y-4">
        {messages.map(m => (
          <div key={m.id} className={`flex ${m.role === 'user' ? 'justify-end' : 'justify-start'}`}>
            <div className={`p-3 rounded-lg max-w-md ${
              m.role === 'user' ? 'bg-blue-600 text-white' : 'bg-gray-100'
            }`}>
              {m.content}
            </div>
          </div>
        ))}
        {isLoading && <div className="text-gray-400 text-sm">Thinking...</div>}
      </div>

      <form onSubmit={handleSubmit} className="p-4 border-t flex gap-2">
        <input
          value={input}
          onChange={handleInputChange}
          placeholder="Type a message..."
          className="flex-1 p-2 border rounded"
          disabled={isLoading}
        />
        <button type="submit" disabled={isLoading}>Send</button>
        {isLoading && <button type="button" onClick={stop}>Stop</button>}
      </form>
    </div>
  );
}
```
The Server Route Handler
```typescript
// app/api/chat/route.ts
import { streamText } from 'ai';
import { openai } from '@ai-sdk/openai';

export async function POST(request: Request) {
  const { messages } = await request.json();

  const result = streamText({
    model: openai('gpt-4o-mini'),
    system: 'You are a helpful assistant.',
    messages, // useChat sends the full history automatically
  });

  return result.toDataStreamResponse();
}
```
Sending Extra Data With Messages
```typescript
// Send extra body data with each request
const { handleSubmit } = useChat({
  api: '/api/chat',
  body: {
    userId: currentUser.id,
    sessionId: session.id,
    preferredLanguage: 'en',
  },
});

// Access in the route handler
export async function POST(request: Request) {
  const { messages, userId, sessionId } = await request.json();
  // Use userId to personalize the system prompt, etc.
}
```
Initial Messages and Conversation Seeding
```typescript
const { messages } = useChat({
  api: '/api/chat',
  initialMessages: [
    {
      id: 'system-greeting',
      role: 'assistant',
      content: "Hello! I'm your coding assistant. What are you building today?",
    },
  ],
});
```
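If the conversation is persisted, you can map stored rows into the message shape before passing them as `initialMessages`. A minimal sketch, assuming a hypothetical `StoredRow` type for your database records:

```typescript
// StoredRow is a hypothetical persisted record; map it into the
// { id, role, content } shape useChat expects for initialMessages.
type StoredRow = { pk: number; author: 'user' | 'assistant'; text: string };
type Message = { id: string; role: 'user' | 'assistant'; content: string };

function toInitialMessages(rows: StoredRow[]): Message[] {
  return rows.map((r) => ({
    id: String(r.pk), // ids must be unique within the conversation
    role: r.author,
    content: r.text,
  }));
}

const seeded = toInitialMessages([
  { pk: 1, author: 'user', text: 'What is useChat?' },
  { pk: 2, author: 'assistant', text: 'A React hook for AI chat.' },
]);

console.log(seeded[0].id);   // 1
console.log(seeded[1].role); // assistant
```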
Error Handling and Retry
```tsx
const { error, reload } = useChat({ api: '/api/chat' });

if (error) {
  return (
    <div className="p-4 bg-red-50 border border-red-200 rounded">
      <p className="text-red-700">Something went wrong: {error.message}</p>
      <button onClick={reload} className="mt-2 px-3 py-1 bg-red-600 text-white rounded text-sm">
        Try Again
      </button>
    </div>
  );
}
```
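Conceptually, `reload` re-requests the assistant's last answer: if the trailing message is a (possibly partial or failed) assistant reply, it is dropped and the remaining history is resent. That behavior can be sketched in plain TypeScript (`historyForReload` is an illustrative name, not the SDK's internal implementation):

```typescript
type Message = { role: 'user' | 'assistant'; content: string };

// Illustrative sketch of what `reload` resends: drop a trailing
// assistant reply, then retry from the preceding user turn.
function historyForReload(messages: Message[]): Message[] {
  const last = messages[messages.length - 1];
  if (last && last.role === 'assistant') {
    return messages.slice(0, -1);
  }
  return messages;
}

const history: Message[] = [
  { role: 'user', content: 'Summarize this.' },
  { role: 'assistant', content: '' }, // stream failed mid-response
];

const retry = historyForReload(history);
console.log(retry.length);  // 1
console.log(retry[0].role); // user
```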
Programmatic Messages With append
```typescript
const { append } = useChat({ api: '/api/chat' });

// Trigger a message without user input
async function startOnboarding() {
  await append({
    role: 'user',
    content: 'Help me get started with your product',
  });
}
```
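`append` does two things at once: it adds the message to local state and immediately fires a request with the updated history. A rough sketch of those semantics (the `sendToApi` callback stands in for the hook's internal fetch; this is not SDK code):

```typescript
// Illustrative sketch of append's two effects: extend local state,
// then POST the full updated history. sendToApi is a stand-in for
// the hook's internal request, not an SDK function.
type Message = { role: 'user' | 'assistant'; content: string };

function appendSketch(
  history: Message[],
  message: Message,
  sendToApi: (messages: Message[]) => void,
): Message[] {
  const next = [...history, message];
  sendToApi(next); // useChat POSTs the full updated history
  return next;
}

let posted: Message[] = [];
const next = appendSketch(
  [],
  { role: 'user', content: 'Help me get started with your product' },
  (msgs) => { posted = msgs; },
);

console.log(next.length);   // 1
console.log(posted.length); // 1
```

In practice this makes `append` a good fit for suggestion chips or onboarding flows, where the first message is triggered by a button rather than typed by the user.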
When to Go Beyond useChat
Use useChat for standard chat flows; it also handles tool calls streamed through the SDK's data protocol. For simple one-shot text generation without conversation history, useCompletion is the lighter option. When you need structured output, multi-agent coordination, or complex state machines, build a custom streaming hook around the AI SDK's core primitives such as streamText and streamObject.