OpenRouter SDK

OpenRouter routes requests to many providers behind one key and billing surface. The official TypeScript SDK (@openrouter/sdk) supports streaming, tools, and agent-style flows—see SDK home and TypeScript documentation for the latest API.

Arrow stays responsible for chat layout, typography, and controls; your server (or edge function) should call OpenRouter with the secret key and stream results to the client.

  • Install @openrouter/sdk (ESM-only; use dynamic import() if you are on CJS).
  • Store OPENROUTER_API_KEY on the server only.
  • Stream chunks to the browser and map them into the message shape Arrow expects (same mental model as the Vercel AI SDK guide, different client library).
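The chunk-mapping step above can be sketched as a small parser for OpenRouter's OpenAI-compatible server-sent-event lines. This is a hedged sketch: it assumes you proxy the raw SSE stream to the browser yourself; if the SDK hands you parsed chunk objects instead, you can skip this layer.

```typescript
// Hedged sketch: OpenRouter's streaming endpoint emits OpenAI-compatible
// SSE frames, i.e. lines like `data: {...json...}` plus a terminal
// `data: [DONE]`. This helper turns one such line into a text delta (or null).
export function deltaFromSseLine(line: string): string | null {
  if (!line.startsWith("data: ")) return null; // ignore comments / blank lines
  const payload = line.slice("data: ".length).trim();
  if (payload === "[DONE]") return null; // end-of-stream sentinel
  try {
    const json = JSON.parse(payload);
    return json.choices?.[0]?.delta?.content ?? null;
  } catch {
    return null; // partial or garbled frame; wait for more bytes
  }
}
```

Feed each decoded line from your fetch reader through this helper before appending the result to the active assistant message.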

Install

pnpm add @openrouter/sdk

The package is ESM-only; see the OpenRouter SDK page for Bun, Yarn, and CJS notes.

Authenticate

Create a key in the OpenRouter dashboard and add it to your server environment:

.env.local

OPENROUTER_API_KEY=your_key_here

Pass apiKey into the client constructor explicitly in server code, or rely on the SDK’s environment defaults—never expose the key in the browser bundle.

Streaming chat call

Example adapted from the OpenRouter SDK usage pattern. The model id and chunk shape may vary by SDK version; pin a version in production:

import { OpenRouter } from "@openrouter/sdk";

// Instantiate with the server-side key; never ship this to the browser.
const openRouter = new OpenRouter({
  apiKey: process.env.OPENROUTER_API_KEY,
});

const stream = await openRouter.chat.send({
  messages: [{ role: "user", content: "Hello from Arrow + OpenRouter." }],
  model: "openai/gpt-4o",
  stream: true,
});

// Each chunk carries an incremental text delta in the OpenAI-compatible shape.
for await (const chunk of stream) {
  const delta = chunk.choices[0]?.delta?.content;
  if (delta) process.stdout.write(delta);
}

For non-streaming calls, set stream: false and read the final message from the response object. For tools and agents, follow the API reference.
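A non-streaming call might look like the sketch below, under the same caveat that the response shape can vary by SDK version. The `extractText` helper is a hypothetical convenience, not part of the SDK; it assumes the OpenAI-compatible completion shape.

```typescript
// Hedged sketch: with stream: false the call resolves to a single completion
// object. In the OpenAI-compatible shape, the final text lives at
// choices[0].message.content.
type ChatCompletion = {
  choices: Array<{ message: { role: string; content: string } }>;
};

// Hypothetical helper (not an SDK export): pull the assistant text out.
export function extractText(completion: ChatCompletion): string {
  return completion.choices[0]?.message?.content ?? "";
}

// Usage against the SDK (not executed here):
// const response = await openRouter.chat.send({
//   messages: [{ role: "user", content: "One-shot question" }],
//   model: "openai/gpt-4o",
//   stream: false,
// });
// const text = extractText(response);
```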

Wire up Arrow

  • Message list: Map OpenRouter roles (user, assistant, …) to the same structure you use for Arrow rows.
  • Streaming: Buffer text deltas into the active assistant message; on finish, commit the final content so history matches the server.
  • Model switching: OpenRouter model strings look like openai/gpt-4o or anthropic/claude-3.5-sonnet; show the active model in your picker if users can change it.
  • Bridging to useChat: You can proxy OpenRouter streams through a Next.js route that emits the same format as @ai-sdk/react expects, or drive Arrow purely from your own fetch reader—both are valid.
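The streaming bullet above can be sketched as a tiny reducer. The `ArrowMessage` shape here is a stand-in for whatever row structure your app uses, not an Arrow API:

```typescript
// Hedged sketch: buffer text deltas into the active assistant message, then
// commit on finish. `ArrowMessage` is a placeholder shape, not an Arrow type.
type ArrowMessage = { role: "user" | "assistant"; content: string; done: boolean };

// Append a delta to the trailing in-flight assistant message, creating it if needed.
export function applyDelta(history: ArrowMessage[], delta: string): ArrowMessage[] {
  const last = history[history.length - 1];
  if (last && last.role === "assistant" && !last.done) {
    return [...history.slice(0, -1), { ...last, content: last.content + delta }];
  }
  return [...history, { role: "assistant", content: delta, done: false }];
}

// Mark the in-flight assistant message as final so history matches the server.
export function commit(history: ArrowMessage[]): ArrowMessage[] {
  return history.map((m, i) =>
    i === history.length - 1 && m.role === "assistant" ? { ...m, done: true } : m,
  );
}
```

Keeping the reducer pure makes it easy to drive from either a `useChat`-style hook or your own fetch reader.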

Official resources