Integrate with the ChatOpenRouter chat model using LangChain JavaScript.
This will help you get started with OpenRouter chat models. OpenRouter is a unified API that provides access to models from multiple providers (OpenAI, Anthropic, Google, Meta, and more) through a single endpoint.
API Reference: For detailed documentation of all features and configuration options, head to the ChatOpenRouter API reference.
To access models via OpenRouter you'll need to create an OpenRouter account, get an API key, and install the @langchain/openrouter integration package.
Now we can instantiate our model object and generate chat completions:
```typescript
import { ChatOpenRouter } from "@langchain/openrouter";

const model = new ChatOpenRouter({
  model: "anthropic/claude-sonnet-4.5",
  temperature: 0,
  maxTokens: 1024,
  // other params...
});

const aiMsg = await model.invoke([
  {
    role: "system",
    content:
      "You are a helpful assistant that translates English to French. Translate the user sentence.",
  },
  { role: "user", content: "I love programming." },
]);
console.log(aiMsg.content);
```
```typescript
const stream = await model.stream("Write a short poem about the sea.");
for await (const chunk of stream) {
  process.stdout.write(typeof chunk.content === "string" ? chunk.content : "");
}
```
OpenRouter uses the OpenAI-compatible tool calling format. You can describe tools and their arguments, and have the model return a JSON object with a tool to invoke and the inputs to that tool.
With ChatOpenRouter.bindTools, you can pass Zod schemas, LangChain tools, or raw function definitions as tools to the model. Under the hood these are converted to OpenAI tool schemas and passed along with every model invocation.
```typescript
import { ChatOpenRouter } from "@langchain/openrouter";
import { tool } from "@langchain/core/tools";
import { z } from "zod";

const getWeather = tool(async ({ location }) => `Sunny in ${location}`, {
  name: "get_weather",
  description: "Get the current weather in a given location",
  schema: z.object({
    location: z
      .string()
      .describe("The city and state, e.g. San Francisco, CA"),
  }),
});

const modelWithTools = new ChatOpenRouter({
  model: "openai/gpt-4o",
}).bindTools([getWeather]);

const aiMsg = await modelWithTools.invoke(
  "What is the weather like in San Francisco?"
);
console.log(aiMsg.tool_calls);
```
ChatOpenRouter supports structured output via the .withStructuredOutput() method. The extraction strategy is chosen automatically based on model capabilities:
- `jsonSchema`: native JSON Schema response format, used when the model supports it
- `functionCalling`: wraps the schema as a tool call (the default fallback)
- `jsonMode`: asks the model to respond in JSON without strict schema constraints
When multi-model routing is active (a `models` list or `route: "fallback"`), the method always falls back to `functionCalling`, because the actual backend model's capabilities are unknown at request time.
```typescript
import { ChatOpenRouter } from "@langchain/openrouter";
import { z } from "zod";

const model = new ChatOpenRouter({ model: "openai/gpt-4.1" });

const movieSchema = z.object({
  title: z.string().describe("The title of the movie"),
  year: z.number().describe("The year the movie was released"),
  director: z.string().describe("The director of the movie"),
  rating: z.number().describe("The movie's rating out of 10"),
});

const structuredModel = model.withStructuredOutput(movieSchema, {
  name: "movie",
  method: "jsonSchema",
});

const response = await structuredModel.invoke(
  "Provide details about the movie Inception"
);
console.log(response);
```
OpenRouter supports multimodal inputs for models that accept them. Not all models support all modalities; check the OpenRouter models page for model-specific support.
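As a sketch, an image can be sent using OpenAI-style content blocks (the `image_url` block shape follows the OpenAI-compatible format; the model choice and URL below are placeholders, not values from this guide):

```typescript
// Hypothetical example: a multimodal message combining a text part and an
// image_url part, in the OpenAI-compatible content-block format.
// The image URL is a placeholder.
const imageMessage = {
  role: "user",
  content: [
    { type: "text", text: "What is in this image?" },
    {
      type: "image_url",
      image_url: { url: "https://example.com/photo.jpg" },
    },
  ],
};

// Passing it to a multimodal model would then look like:
// const model = new ChatOpenRouter({ model: "openai/gpt-4o" });
// const aiMsg = await model.invoke([imageMessage]);
```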
Many models on OpenRouter are served by multiple providers. The provider parameter gives you control over which providers handle your requests and how theyβre selected.
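For instance, OpenRouter's request schema accepts provider preferences such as `order` and `allow_fallbacks`; the sketch below assumes the integration forwards a `provider` object from the constructor as-is, which is an assumption about `@langchain/openrouter` rather than documented behavior:

```typescript
// Sketch of OpenRouter provider-routing preferences. `order` and
// `allow_fallbacks` are fields from OpenRouter's request schema; passing
// them via a constructor-level `provider` option is an assumption about
// how @langchain/openrouter forwards them.
const providerPreferences = {
  order: ["anthropic", "openai"], // try these providers in this order
  allow_fallbacks: true,          // permit other providers if these fail
};

// const model = new ChatOpenRouter({
//   model: "anthropic/claude-sonnet-4.5",
//   provider: providerPreferences,
// });
```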
For detailed documentation of all ChatOpenRouter features and configurations, head to the ChatOpenRouter API reference. For more information about OpenRouter's platform, models, and features, see the OpenRouter documentation.