How to Fix 'async event loop error during development' in LangGraph (TypeScript)
What this error usually means
If you’re seeing an "async event loop error" during development in LangGraph TypeScript, you’re usually calling an async graph API from the wrong execution context. In practice, this shows up when the graph is invoked during module initialization, inside a hot-reload path, or from code that is already mid-await and gets re-entered by the dev server.
The key point: LangGraph itself is not “broken.” Your runtime is usually trying to manage multiple async lifecycles at once, and the Node event loop complains.
The Most Common Cause
The #1 cause is running `graph.invoke()` or `graph.stream()` at import time instead of inside an explicit async handler. This is common in Next.js route files, Vite dev servers, test setup files, and scripts that execute immediately when imported.
Here’s how the broken and fixed patterns compare:
| Broken | Fixed |
|---|---|
| Runs on import | Runs inside a function |
| Hard to control lifecycle | Explicit async boundary |
| Breaks under hot reload | Stable in dev |
```typescript
// broken.ts
import { StateGraph } from "@langchain/langgraph";

const graph = new StateGraph(/* ... */).compile();

// This runs as soon as the file is imported.
const result = await graph.invoke({
  messages: [{ role: "user", content: "hello" }],
});
console.log(result);
```
```typescript
// fixed.ts
import { StateGraph } from "@langchain/langgraph";

const graph = new StateGraph(/* ... */).compile();

export async function runGraph() {
  const result = await graph.invoke({
    messages: [{ role: "user", content: "hello" }],
  });
  return result;
}

// Call it from a request handler, CLI entrypoint, or test:
// runGraph().then(console.log);
```
If you’re in a framework like Next.js, keep the invocation inside the route handler:
```typescript
import { NextRequest, NextResponse } from "next/server";
import { runGraph } from "./fixed";

export async function POST(_req: NextRequest) {
  const output = await runGraph();
  return NextResponse.json(output);
}
```
The important part is that the graph call happens after your runtime has fully started, not while modules are still loading.
Other Possible Causes
1) Mixing ESM and CommonJS in a dev build
LangGraph TypeScript projects often hit event loop errors when TypeScript compiles for one module system but Node runs the code as another.
Broken config: the project declares ESM in `package.json`, but TypeScript emits CommonJS.

`package.json`:

```json
{
  "type": "module"
}
```

`tsconfig.json`:

```json
{
  "compilerOptions": {
    "module": "CommonJS"
  }
}
```
Fixed config: keep both files on ESM.

`package.json`:

```json
{
  "type": "module"
}
```

`tsconfig.json`:

```json
{
  "compilerOptions": {
    "module": "NodeNext",
    "moduleResolution": "NodeNext"
  }
}
```
If your package is ESM, keep your TypeScript module settings aligned with it. Don’t mix `require()` with `import` unless you know exactly how your bundler resolves them.
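When you genuinely do need to load code across module systems, dynamic `import()` is the one mechanism that works from both CommonJS and ESM files in modern Node. A minimal sketch, using a built-in module as a stand-in for your compiled graph module:

```typescript
// Sketch: dynamic import() works from both CommonJS and ESM callers,
// so it is the safest bridge when your build output is mixed.
// "node:path" stands in for a hypothetical "./graph.js" module.
export async function loadGraphModule() {
  const mod = await import("node:path");
  return mod;
}
```

Unlike `require()`, `import()` returns a promise, so it also keeps the load inside an async boundary rather than at module top level.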
2) Recreating the graph on every hot reload
If your dev server re-imports files and rebuilds the graph repeatedly, you can end up with multiple async executions racing each other.
Broken:

```typescript
export function createApp() {
  const graph = new StateGraph(/* ... */).compile();
  return graph;
}

// Top level: re-runs on every hot reload.
const app = createApp();
await app.invoke(input);
```
Fixed:

```typescript
let compiledGraph: ReturnType<StateGraph<any>["compile"]> | null = null;

export function getGraph() {
  if (!compiledGraph) {
    compiledGraph = new StateGraph(/* ... */).compile();
  }
  return compiledGraph;
}
```
This matters in dev environments where module reloads are frequent. Compile once per process if possible.
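Some dev servers (Next.js in dev mode, for example) re-evaluate the whole module on hot reload, which resets even a module-level cache variable. A common workaround is caching on `globalThis`, which survives module re-evaluation. In this sketch, `buildGraph()` is a hypothetical stand-in for `new StateGraph(/* ... */).compile()`:

```typescript
// Sketch, assuming a dev server that re-runs this module on hot reload.
// globalThis survives module re-evaluation, so the compile step runs
// once per process.
type Compiled = { invoke: (input: unknown) => Promise<unknown> };

const store = globalThis as typeof globalThis & { __compiledGraph?: Compiled };

function buildGraph(): Compiled {
  // Placeholder implementation; the real one compiles your StateGraph.
  return { invoke: async (input) => input };
}

export function getGraph(): Compiled {
  if (!store.__compiledGraph) {
    store.__compiledGraph = buildGraph();
  }
  return store.__compiledGraph;
}
```

The trade-off is a small amount of global state, which is why this pattern is usually gated to development builds.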
3) Calling LangGraph from top-level test setup
Jest, Vitest, and Node test runners can trigger async work before the environment is ready.
Broken:

```typescript
// setup.ts
import { graph } from "./graph";

// Top-level await runs before the test environment is ready.
await graph.invoke({ input: "test" });
```
Fixed:

```typescript
// setup.ts
import { graph } from "./graph";

beforeAll(async () => {
  await graph.invoke({ input: "test" });
});
```
If you need top-level setup, make sure your runner supports it and that the file is treated as an actual ESM module. Otherwise move it into lifecycle hooks.
4) Using `stream()` without consuming or closing it properly
A stream left open during development can look like an event loop issue because Node keeps work alive longer than expected.
Broken:

```typescript
const stream = await graph.stream(input);
// never consumed
```
Fixed:

```typescript
const stream = await graph.stream(input);

for await (const chunk of stream) {
  console.log(chunk);
}
```
If you use streaming APIs, always consume them fully or close them according to the API contract.
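If you only need part of a stream, close it deliberately. Breaking out of a `for await` loop calls the underlying iterator’s `return()`, which gives the producer a chance to clean up. This sketch uses a plain async generator standing in for a real LangGraph stream:

```typescript
// Sketch with a plain async generator standing in for graph.stream().
// The finally block runs when the consumer finishes OR breaks early,
// which is where a real stream would release its resources.
async function* fakeStream() {
  try {
    yield "chunk-1";
    yield "chunk-2";
    yield "chunk-3";
  } finally {
    // Cleanup point: reached even on early break.
  }
}

export async function takeFirst(n: number): Promise<string[]> {
  const out: string[] = [];
  for await (const chunk of fakeStream()) {
    out.push(chunk);
    if (out.length >= n) break; // triggers the generator's finally block
  }
  return out;
}
```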
How to Debug It
- **Find the first place `invoke`, `stream`, or `streamEvents` is called.** If it’s at top level in a module, move it into a function immediately.
- **Check whether the stack trace points to import-time execution.** If you see frames from `next dev`, Vite HMR, test setup files, or module initialization code, that’s usually your culprit.
- **Verify your runtime mode.** Confirm whether you’re running ESM or CommonJS consistently across:
  - `package.json`
  - `tsconfig.json`
  - bundler config
  - test runner config
- **Reduce to a single entrypoint.** Temporarily create one script that does nothing except:

  ```typescript
  import { graph } from "./graph";

  async function main() {
    console.log(await graph.invoke({ input: "ping" }));
  }

  main().catch(console.error);
  ```

  If this works, your bug is in framework integration, not LangGraph itself.
Prevention
- Keep all LangGraph execution behind explicit async functions.
- Compile graphs once per process; don’t rebuild them on every request or hot reload.
- Align ESM/CommonJS settings across TypeScript, package.json, tests, and bundlers.
- In frameworks like Next.js or Express middleware chains, invoke graphs only inside request handlers or service methods.
- Add one smoke test that runs `graph.invoke()` from a clean Node entrypoint before shipping changes.
If you follow that pattern, most “async event loop” errors disappear fast. The remaining cases are usually config drift or framework lifecycle mistakes, not LangGraph bugs.
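That smoke test can be as small as this sketch; the inlined object is a hypothetical stand-in for importing your real compiled graph (`import { graph } from "./graph"` in an actual project):

```typescript
// Smoke-test sketch: a plain async check with no test-runner dependency.
// The inlined object stands in for your real compiled graph.
const graph = {
  invoke: async (input: { input: string }) => ({ echo: input.input }),
};

export async function smokeTest(): Promise<boolean> {
  const result = await graph.invoke({ input: "ping" });
  return result.echo === "ping";
}
```

Run it with `node` from a clean entrypoint before shipping; if this passes but your app still fails, the problem is framework lifecycle, not the graph.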
Keep learning
- The complete AI Agents Roadmap: my full 8-step breakdown
- Free: The AI Agent Starter Kit (PDF checklist + starter code)
- Work with me: I build AI for banks and insurance companies
By Cyprian Aarons, AI Consultant at Topiax.