
How to Handle Background Jobs with BullMQ and Redis in Node.js (2026)
Stop blocking HTTP responses with slow tasks. This guide shows how to queue work with BullMQ and Redis in Node.js — retries, concurrency, and cron scheduling in under 30 minutes.
The problem
When a user action triggers something slow — sending an email, calling a third-party API, resizing an image — blocking the HTTP response until it finishes is the wrong move. The naive fix is await someSlowThing() inside the route handler, which holds the response open for the slow call's full duration, inflates tail latency, and gives you no retries when the call fails. You need a queue: accept the request instantly, hand the work to a background worker, and process it reliably with retries. BullMQ on top of Redis is the production-grade answer for Node.js: it handles retries, concurrency, delayed jobs, and cron scheduling out of the box.
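For contrast, here is the anti-pattern in miniature. This is a sketch, not a real handler: sendWelcomeEmail is a hypothetical stub standing in for a real email provider call, and the 500 ms timer simulates its latency.

```typescript
// Hypothetical stub standing in for a real email API call (~500 ms round trip).
async function sendWelcomeEmail(to: string): Promise<void> {
  await new Promise((resolve) => setTimeout(resolve, 500));
}

// The naive handler: the response waits on the slow task, so the user's
// perceived latency is your latency plus the provider's latency.
async function handleSignup(to: string): Promise<{ status: number; ms: number }> {
  const start = Date.now();
  await sendWelcomeEmail(to); // response blocked for the full provider round trip
  return { status: 200, ms: Date.now() - start };
}
```

Under load, every in-flight signup holds a connection open for the provider's full round trip; a queue turns that into a near-constant-time enqueue.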
Prerequisites
- Node.js 22+
- Redis 7+ (local Docker: docker run -p 6379:6379 redis:7-alpine)
- TypeScript 5 (optional but assumed)
- npm install bullmq ioredis
- Env var: REDIS_URL=redis://localhost:6379
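If you prefer Compose over a one-off docker run, a minimal sketch (the service and volume names here are arbitrary choices, not requirements):

```yaml
# docker-compose.yml — minimal Redis for local BullMQ development
services:
  redis:
    image: redis:7-alpine
    ports:
      - "6379:6379"
    volumes:
      - redis-data:/data # persist queued jobs across container restarts
volumes:
  redis-data:
```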
Step 1: Create a shared queue
A Queue is the entry point — you add jobs here from anywhere in your app (API routes, webhooks, crons):
// queue/email-queue.ts
import { Queue } from "bullmq";
import IORedis from "ioredis";

const connection = new IORedis(process.env.REDIS_URL!, {
  maxRetriesPerRequest: null, // required by BullMQ
});

export const emailQueue = new Queue("email", { connection });

// Add a job: fire-and-forget from your API handler
await emailQueue.add(
  "send-welcome",
  { to: "[email protected]", name: "Alice" },
  { attempts: 3, backoff: { type: "exponential", delay: 2000 } }
);
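The same add call also accepts a delay option (milliseconds) for one-off deferred jobs. A small helper makes "run at this time" readable; delayUntil is our own convenience function, not part of BullMQ:

```typescript
// delayUntil is a hypothetical helper (not a BullMQ API): milliseconds from
// now until a target Date, clamped at zero so past timestamps run immediately.
function delayUntil(when: Date, now: Date = new Date()): number {
  return Math.max(0, when.getTime() - now.getTime());
}

// Usage against the queue above (needs a live Redis connection):
// await emailQueue.add(
//   "send-welcome",
//   { to: "[email protected]", name: "Alice" },
//   { delay: delayUntil(new Date(Date.now() + 60_000)) } // run in ~1 minute
// );
```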
Step 2: Define the worker
A Worker pulls jobs from the queue and processes them. Run this in a separate process (e.g. node worker.js) so it doesn't share resources with your web server:
// worker/email-worker.ts
import { Worker, Job } from "bullmq";
import IORedis from "ioredis";

const connection = new IORedis(process.env.REDIS_URL!, {
  maxRetriesPerRequest: null,
});

const worker = new Worker(
  "email",
  async (job: Job) => {
    const { to, name } = job.data;
    console.log(`[email] Sending welcome to ${to}`);
    // await resend.emails.send({ to, subject: "Welcome!", ... });
    return { sent: true };
  },
  {
    connection,
    concurrency: 5, // process 5 jobs in parallel
  }
);

worker.on("completed", (job) =>
  console.log(`Job ${job.id} completed`)
);
worker.on("failed", (job, err) =>
  console.error(`Job ${job?.id} failed:`, err.message)
);
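Since the worker runs in its own process, give it a clean exit path: BullMQ's worker.close() waits for active jobs to settle before disconnecting. A sketch of the wiring; only close() is BullMQ's API, the rest is our own scaffolding:

```typescript
// Anything with an async close() fits this shape; a BullMQ Worker qualifies.
interface Closable {
  close(): Promise<void>;
}

// Finish in-flight jobs, then let the process exit cleanly.
async function shutdown(closable: Closable): Promise<void> {
  await closable.close(); // Worker.close() waits for active jobs to settle
}

// Wire it to SIGINT/SIGTERM in worker/email-worker.ts:
// for (const sig of ["SIGINT", "SIGTERM"] as const) {
//   process.once(sig, () => shutdown(worker).then(() => process.exit(0)));
// }
```

Without this, a deploy that kills the worker mid-job leaves that job to be retried later (or stalled), rather than finishing it.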
Step 3: Add recurring scheduled jobs
BullMQ's QueueScheduler class is gone (removed in v3); for recurring cron jobs, use queue.upsertJobScheduler instead:
// Schedule a daily digest job at 07:00 UTC
await emailQueue.upsertJobScheduler(
  "daily-digest", // scheduler ID (idempotent)
  { pattern: "0 7 * * *" }, // cron expression
  {
    name: "send-digest",
    data: { type: "daily" },
    opts: { attempts: 2 },
  }
);
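The scheduler ID is what makes upsertJobScheduler idempotent: calling it again with the same ID updates the schedule instead of duplicating it. A cheap shape check before registering fails faster than a round trip to Redis; looksLikeCron is our own helper, not a full cron parser, and the commented management calls assume a recent BullMQ v5:

```typescript
// Hypothetical sanity check (not a full parser): a standard cron
// pattern has exactly five whitespace-separated fields.
function looksLikeCron(pattern: string): boolean {
  return pattern.trim().split(/\s+/).length === 5;
}

// Inspect or remove registered schedulers (BullMQ v5.16+):
// const schedulers = await emailQueue.getJobSchedulers();
// await emailQueue.removeJobScheduler("daily-digest");
```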
Full working example
// ─── queue/index.ts ──────────────────────────────────────────────────────────
import { Queue } from "bullmq";
import IORedis from "ioredis";

const connection = new IORedis(process.env.REDIS_URL!, {
  maxRetriesPerRequest: null,
});

export const emailQueue = new Queue("email", { connection });

export async function scheduleJobs() {
  await emailQueue.upsertJobScheduler(
    "daily-digest",
    { pattern: "0 7 * * *" },
    { name: "send-digest", data: { type: "daily" }, opts: { attempts: 2 } }
  );
}

// ─── worker/index.ts ─────────────────────────────────────────────────────────
import { Worker, Job } from "bullmq";
import IORedis from "ioredis";

const connection = new IORedis(process.env.REDIS_URL!, {
  maxRetriesPerRequest: null,
});

async function processEmail(job: Job): Promise<{ sent: boolean }> {
  const { to, name, type } = job.data;
  switch (job.name) {
    case "send-welcome":
      console.log(`Sending welcome email to ${to} (${name})`);
      // await sendWelcomeEmail(to, name);
      break;
    case "send-digest":
      console.log(`Sending daily digest (type: ${type})`);
      // await sendDigest();
      break;
    default:
      throw new Error(`Unknown job: ${job.name}`);
  }
  return { sent: true };
}

const worker = new Worker("email", processEmail, {
  connection,
  concurrency: 5,
  removeOnComplete: { count: 100 },
  removeOnFail: { count: 50 },
});

worker.on("completed", (job) => console.log(`✓ ${job.id} ${job.name}`));
worker.on("failed", (job, err) => console.error(`✗ ${job?.id}:`, err.message));

// ─── api/enqueue/route.ts (Next.js App Router) ───────────────────────────────
import { NextRequest, NextResponse } from "next/server";
import { emailQueue } from "@/queue";

export const dynamic = "force-dynamic";

export async function POST(req: NextRequest) {
  const { to, name } = await req.json();
  if (!to) return NextResponse.json({ error: "to required" }, { status: 400 });
  const job = await emailQueue.add(
    "send-welcome",
    { to, name },
    { attempts: 3, backoff: { type: "exponential", delay: 2000 } }
  );
  return NextResponse.json({ jobId: job.id }, { status: 202 });
}
Testing it
Start the worker in one terminal (npx tsx worker/index.ts), then POST a job:
curl -X POST http://localhost:3000/api/enqueue \
  -H "Content-Type: application/json" \
  -d '{"to":"[email protected]","name":"Alice"}'
# → {"jobId":"1"}
# Worker terminal: ✓ 1 send-welcome
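To sanity-check the retry schedule from Step 1: with attempts: 3 and backoff: { type: "exponential", delay: 2000 }, BullMQ's built-in exponential strategy waits delay * 2^(retryCount - 1) ms before each retry. A sketch of that arithmetic (the helper is ours, reproducing the formula, not a BullMQ export):

```typescript
// Exponential backoff as BullMQ computes it: delay * 2^(retryCount - 1),
// where retryCount is 1 for the first retry, 2 for the second, and so on.
function exponentialBackoffMs(baseDelayMs: number, retryCount: number): number {
  return baseDelayMs * 2 ** (retryCount - 1);
}

// attempts: 3 means at most two retries after the initial failure:
const waits = [1, 2].map((n) => exponentialBackoffMs(2000, n)); // [2000, 4000]
```

So a job that keeps failing retries after roughly 2 s, then 4 s, then lands in the failed set.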
Troubleshooting
- maxRetriesPerRequest must be null: BullMQ requires this IORedis option to be set explicitly, so add maxRetriesPerRequest: null to your connection config.
- Jobs stuck in waiting: no worker is connected to the queue. Make sure the worker process is running and pointed at the same Redis instance.
- Scheduler not firing: upsertJobScheduler requires BullMQ v5.16 or later. Check npm ls bullmq and upgrade if needed.