# How to Fix 'async event loop error during development' in LangChain (TypeScript)

## What this error means
If you’re seeing async event loop error during development in a LangChain TypeScript app, you’re usually hitting a Node runtime mismatch or an async boundary problem in your dev workflow. It often shows up when running with ts-node, tsx, Jest, Vitest, or a hot-reload tool that changes how the event loop is managed.
In practice, the real underlying error is often one of these:

- `Error: Cannot use 'await' outside an async function`
- `Error [ERR_REQUIRE_ESM]: require() of ES Module`
- `UnhandledPromiseRejectionWarning`
- `TypeError: ... is not a function` from mixing sync and async LangChain APIs
## The Most Common Cause
The #1 cause is calling async LangChain methods from top-level code or from a non-async callback, then letting your dev runner try to execute it synchronously.
This usually happens with classes like:

- `ChatOpenAI`
- `OpenAIEmbeddings`
- `RunnableSequence`
- `AgentExecutor`
### Broken vs fixed pattern
| Broken | Fixed |
|---|---|
| Calls .invoke() without awaiting it | Wraps execution in an async function |
| Uses top-level side effects in dev entrypoints | Keeps startup and execution explicit |
| Lets the process exit before promises settle | Awaits the full chain before exiting |
```ts
// ❌ Broken
import { ChatOpenAI } from "@langchain/openai";
import { PromptTemplate } from "@langchain/core/prompts";

const model = new ChatOpenAI({ model: "gpt-4o-mini" });
const prompt = PromptTemplate.fromTemplate("Summarize: {text}");
const chain = prompt.pipe(model);

// This runs at module load time and may break in dev runners
const result = chain.invoke({ text: "LangChain async event loop issue" });
console.log(result); // logs a pending Promise, not the model output
```
```ts
// ✅ Fixed
import { ChatOpenAI } from "@langchain/openai";
import { PromptTemplate } from "@langchain/core/prompts";

async function main() {
  const model = new ChatOpenAI({ model: "gpt-4o-mini" });
  const prompt = PromptTemplate.fromTemplate("Summarize: {text}");
  const chain = prompt.pipe(model);
  const result = await chain.invoke({ text: "LangChain async event loop issue" });
  console.log(result);
}

void main().catch((err) => {
  console.error("LangChain run failed:", err);
  process.exitCode = 1;
});
```
Why this matters:

- LangChain JS methods like `.invoke()`, `.stream()`, and `.batch()` are async.
- Dev tools can reload modules multiple times.
- If you kick off work at import time, the event loop can get into a bad state fast.
## Other Possible Causes

### 1) Mixing CommonJS and ESM
LangChain packages are ESM-first. If your project is still using CommonJS patterns, you’ll see runtime failures during development.
```jsonc
// ❌ Broken tsconfig.json
{
  "compilerOptions": {
    "module": "commonjs"
  }
}
```

```jsonc
// ✅ Fixed tsconfig.json
{
  "compilerOptions": {
    "module": "NodeNext",
    "moduleResolution": "NodeNext",
    "target": "ES2022"
  }
}
```

```jsonc
// ✅ And in package.json (not tsconfig.json)
{
  "type": "module"
}
```
Typical error:

- `Error [ERR_REQUIRE_ESM]: require() of ES Module @langchain/openai not supported`
### 2) Using the wrong API shape for the class
Some LangChain classes expose async methods only. If you call sync-style helpers or assume constructors do all the work, you’ll get confusing runtime errors.
```ts
// ❌ Broken
import { OpenAIEmbeddings } from "@langchain/openai";

const embeddings = new OpenAIEmbeddings();
const vector = embeddings.embedQuery("hello"); // returns a Promise, not a vector
console.log(vector.length); // undefined: a Promise has no .length
```

```ts
// ✅ Fixed
import { OpenAIEmbeddings } from "@langchain/openai";

async function main() {
  const embeddings = new OpenAIEmbeddings();
  const vector = await embeddings.embedQuery("hello");
  console.log(vector.length);
}

void main();
```
Typical symptoms:

- `vector.length` logs `undefined`, because a Promise has no `length` property
- Downstream failures when a Promise gets passed into another runnable
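TypeScript's type system catches this class of bug as long as the async return type stays explicit. Here's a runnable sketch with a stand-in `embedQuery` (the real method calls the OpenAI API; this fake one just maps characters to numbers):

```typescript
// Stand-in with the same async shape as OpenAIEmbeddings.embedQuery:
// it returns Promise<number[]>, so forgetting `await` leaves you holding
// a Promise with no .length, which the compiler flags.
async function embedQuery(text: string): Promise<number[]> {
  // Fake embedding for demonstration only: one number per character.
  return Array.from(text, (c) => c.charCodeAt(0));
}

async function main() {
  const vector = await embedQuery("hello");
  console.log(vector.length); // 5
}

void main();
```

Enabling a lint rule like `@typescript-eslint/no-floating-promises` also makes the broken version fail at lint time instead of at runtime.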
### 3) Hot reload running your file twice
Tools like Nodemon, tsx watch mode, or custom Vite/webpack dev servers can re-import modules and trigger duplicate initialization.
```ts
// ❌ Broken: side effects at import time
import { ChatOpenAI } from "@langchain/openai";

export const model = new ChatOpenAI({ model: "gpt-4o-mini" });
export const startup = model.invoke("hello");
```
Fix by moving startup logic behind an explicit function:
```ts
// ✅ Fixed
import { ChatOpenAI } from "@langchain/openai";

export function createModel() {
  return new ChatOpenAI({ model: "gpt-4o-mini" });
}
```
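If construction is expensive or must happen exactly once, you can additionally cache the instance on `globalThis`, which survives module re-evaluation in most hot-reload setups. A minimal sketch with a stand-in client; the `__client` key and `createClient` are assumptions of this example, not LangChain APIs:

```typescript
type Client = { id: number };

let created = 0;

// Stand-in for an expensive constructor like `new ChatOpenAI(...)`.
function createClient(): Client {
  created += 1;
  return { id: created };
}

// globalThis outlives the module itself when a dev server re-imports it,
// so re-evaluations reuse the cached instance instead of building a new one.
const g = globalThis as typeof globalThis & { __client?: Client };

export function getClient(): Client {
  if (!g.__client) {
    g.__client = createClient();
  }
  return g.__client;
}
```

Every caller gets the same instance, no matter how many times the module reloads.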
### 4) Unhandled promise rejection in test runners
Jest and Vitest will surface async failures as event-loop noise if you forget to await or return promises.
```ts
// ❌ Broken test
test("runs chain", () => {
  chain.invoke({ input: "test" });
});
```

```ts
// ✅ Fixed test
test("runs chain", async () => {
  await chain.invoke({ input: "test" });
});
```
Typical error:

- `UnhandledPromiseRejectionWarning`
- `PromiseRejectionHandledWarning`
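Outside a test runner, you can make these silent failures loud during development with a process-level listener. A small sketch; the counter is just for demonstration, and in a real project you'd register this once in your dev entrypoint or test setup:

```typescript
let rejectionCount = 0;

// Registering any unhandledRejection listener also stops Node from
// crashing the process on a stray rejection, so log loudly here.
process.on("unhandledRejection", (reason) => {
  rejectionCount += 1;
  console.error("Unhandled rejection:", reason);
});
```

With this in place, a forgotten `await` shows up as an explicit log line instead of a confusing warning at process exit.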
## How to Debug It
1. **Find the first async call**
   - Search for `.invoke(`, `.stream(`, `.batch(`, `.embedQuery(`, or `.call(`.
   - Check whether it's awaited.
   - If it runs at module scope, move it into an `async` function.
2. **Check your module format**
   - Inspect `package.json` for `"type": "module"` or `"type": "commonjs"`.
   - Inspect `tsconfig.json` for `"module"` and `"moduleResolution"`.
   - If LangChain imports fail on startup, this is usually the culprit.
3. **Run without watch mode**
   - Disable Nodemon/tsx watch/Vitest watch.
   - Run once with plain Node output: `npm run build && node dist/index.js`.
   - If the error disappears, your dev runner is involved.
4. **Log promise boundaries**
   - Add logs before and after each async LangChain call.
   - If you see "before" but not "after", the promise is failing or was never awaited.
   - Wrap the whole flow in a top-level catch: `void main().catch(console.error);`
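Step 4 can be sketched as a small wrapper (a hypothetical helper, not a LangChain API) that guarantees a log on both sides of every async call:

```typescript
// Logs "before"/"after" around any async call, and "failed" if it rejects,
// so a missing "after" pinpoints exactly which boundary broke.
async function withBoundary<T>(label: string, fn: () => Promise<T>): Promise<T> {
  console.log(`before ${label}`);
  try {
    const result = await fn();
    console.log(`after ${label}`);
    return result;
  } catch (err) {
    console.error(`failed ${label}:`, err);
    throw err;
  }
}

async function main() {
  // In a real app, fn would be something like () => chain.invoke({...}).
  const value = await withBoundary("demo", async () => 42);
  console.log(value); // 42
}

void main();
```

Because the wrapper rethrows, your top-level `.catch` still sees the original error.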
## Prevention

- Keep all LangChain execution inside explicit `async main()` functions.
- Standardize on ESM for TypeScript projects using modern LangChain packages.
- In tests and scripts, always `await` `.invoke()`, `.stream()`, and embedding calls.
- Avoid side effects at import time; initialize models lazily inside functions.
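Put together, a minimal entrypoint that follows all four rules looks like this; `runChain` is a stand-in for wherever your real pipeline would go:

```typescript
// Stand-in for building and invoking a real LangChain pipeline.
async function runChain(input: string): Promise<string> {
  // Real version: construct the model and chain here, then `await chain.invoke(...)`.
  return `summarized: ${input}`;
}

// Explicit async entrypoint: nothing runs at import time.
async function main(): Promise<void> {
  const result = await runChain("hello");
  console.log(result); // summarized: hello
}

// Top-level catch so failures are reported and the exit code reflects them.
void main().catch((err) => {
  console.error("run failed:", err);
  process.exitCode = 1;
});
```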
## Keep learning

- The complete AI Agents Roadmap — my full 8-step breakdown
- Free: The AI Agent Starter Kit — PDF checklist + starter code
- Work with me — I build AI for banks and insurance companies
By Cyprian Aarons, AI Consultant at Topiax.