
# SDKs & Integrations

BrainstormRouter provides official SDKs and integrates with popular AI frameworks. Since the API is OpenAI-compatible, any tool that works with OpenAI also works with BrainstormRouter by changing the base URL.
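For example, a raw `curl` request works without any SDK at all. This sketch assumes the endpoint follows the standard OpenAI wire format (`/v1/chat/completions`, `Authorization: Bearer` header); replace the placeholder key with your own:

```shell
curl https://api.brainstormrouter.com/v1/chat/completions \
  -H "Authorization: Bearer br-your-api-key" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "auto",
    "messages": [{"role": "user", "content": "Hello"}]
  }'
```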

## Official SDKs

### TypeScript

```bash
npm install @brainstormrouter/sdk
```

```typescript
import { BrainstormRouter } from '@brainstormrouter/sdk';

const br = new BrainstormRouter({
  apiKey: 'br-your-api-key',
});

// Auto-routed completion
const response = await br.chat.completions.create({
  model: 'auto',
  messages: [{ role: 'user', content: 'Explain TCP/IP' }],
});

// Access routing metadata
console.log(response.routing.model);    // "claude-opus-4-6"
console.log(response.routing.strategy); // "combined"
console.log(response.routing.cost);     // 0.0234

// Intelligence APIs
const rankings = await br.intelligence.rankings({ taskType: 'code_generation' });
const health = await br.intelligence.health();
const forecast = await br.intelligence.costForecast({ tokens: 1_000_000 });
```

### Python

```bash
pip install brainstormrouter
```

```python
from brainstormrouter import BrainstormRouter

br = BrainstormRouter(api_key="br-your-api-key")

response = br.chat.completions.create(
    model="auto",
    messages=[{"role": "user", "content": "Explain TCP/IP"}],
)

# Routing metadata
print(response.routing.model)  # "claude-opus-4-6"
print(response.routing.cost)   # 0.0234

# Intelligence APIs
rankings = br.intelligence.rankings(task_type="code_generation")
health = br.intelligence.health()
```

### CLI

```bash
npm install -g @brainstormrouter/cli
```

```bash
br chat "What is the best model for code review?"
br models --sort quality
br health
br budget --forecast
```

## Framework Integrations

### Vercel AI SDK

BrainstormRouter works with the Vercel AI SDK through its OpenAI-compatible provider. Create a provider instance with `createOpenAI` pointed at the BrainstormRouter base URL, then pass it a model ID:

```typescript
import { createOpenAI } from '@ai-sdk/openai';
import { streamText } from 'ai';

const brainstormrouter = createOpenAI({
  baseURL: 'https://api.brainstormrouter.com/v1',
  apiKey: 'br-your-api-key',
});

const result = streamText({
  model: brainstormrouter('auto'),
  messages: [{ role: 'user', content: 'Build a React component' }],
});
```

### LangChain

```python
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    model="auto",
    openai_api_key="br-your-api-key",
    openai_api_base="https://api.brainstormrouter.com/v1",
)

response = llm.invoke("Summarize this document")
```

### LlamaIndex

```python
from llama_index.llms.openai import OpenAI

llm = OpenAI(
    model="auto",
    api_key="br-your-api-key",
    api_base="https://api.brainstormrouter.com/v1",
)
```

### CrewAI

```python
from crewai import LLM

llm = LLM(
    model="openai/auto",
    api_key="br-your-api-key",
    base_url="https://api.brainstormrouter.com/v1",
)
```

## OpenAI Drop-In Replacement

Any application using the OpenAI SDK can switch to BrainstormRouter by setting two environment variables:

```bash
export OPENAI_API_KEY="br-your-api-key"
export OPENAI_BASE_URL="https://api.brainstormrouter.com/v1"
```

No code changes are required. Set the model to `auto` in your application to enable intelligent routing, or keep your existing model names; BrainstormRouter routes them to the correct provider.