# Supercompat

Switch AI models without compromises.
Supercompat is a TypeScript library that lets you call any LLM provider through the OpenAI SDK (or the Anthropic SDK). Swap one adapter and the same `client.responses.create()` call reaches Anthropic, Google, Groq, Mistral, Together, OpenRouter, Perplexity, Ollama, or Azure — with the original SDK types intact.

It runs in-process. No proxy server, no request forwarding, no extra latency. Supercompat installs a custom `fetch` on the SDK instance and routes calls locally.
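The in-process routing idea can be pictured with a plain `fetch`-shaped function. This is a minimal sketch under assumptions: `routeLocally` and its request shape are illustrative, not Supercompat's actual internals.

```typescript
// Sketch: an SDK that accepts a custom fetch can be short-circuited
// in-process. Instead of forwarding the HTTP request over the network,
// the fetch translates it and answers locally. All names here are
// illustrative, not Supercompat internals.
type MinimalResponse = { json: () => Promise<unknown> }

const routeLocally = async (
  url: string,
  init?: { body?: string },
): Promise<MinimalResponse> => {
  const payload = init?.body ? JSON.parse(init.body) : {}
  // A real adapter would call the target provider's SDK here and
  // translate its reply back into the OpenAI response shape.
  const body = { routed: 'in-process', model: payload.model ?? null }
  return { json: async () => body }
}
```

Because the request never leaves the process, there is no extra network hop to add latency.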
## Install

```sh
npm install supercompat openai
```
## Quick example

```typescript
import {
  supercompat,
  anthropicClientAdapter,
  completionsRunAdapter,
  memoryStorageAdapter,
} from 'supercompat/openai'
import Anthropic from '@anthropic-ai/sdk'

const client = supercompat({
  clientAdapter: anthropicClientAdapter({ anthropic: new Anthropic() }),
  storageAdapter: memoryStorageAdapter(),
  runAdapter: completionsRunAdapter(),
})

const response = await client.responses.create({
  model: 'claude-sonnet-4-6',
  input: 'Say hello.',
})

console.log(response.output_text)
```
`client` is a real OpenAI instance with the real TypeScript types. Every call made on it — `responses`, `chat.completions`, `beta.threads` — is intercepted by Supercompat and translated into a request against the Anthropic SDK. Switching providers is a change to `clientAdapter`; everything else stays the same.
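The stable-call-site idea can be sketched without the library. The names below (`ClientAdapter`, `makeClient`) are hypothetical stand-ins for illustration, not Supercompat's real API.

```typescript
// Illustration of the adapter pattern: the call site never changes;
// only the adapter passed in decides which backend gets the request.
// These types are simplified stand-ins, not Supercompat's real API.
interface ClientAdapter {
  send: (model: string, input: string) => string
}

const anthropicLike: ClientAdapter = {
  send: (model, input) => `anthropic handled ${model}: ${input}`,
}

const groqLike: ClientAdapter = {
  send: (model, input) => `groq handled ${model}: ${input}`,
}

// The returned object keeps one OpenAI-shaped surface over any adapter.
const makeClient = (adapter: ClientAdapter) => ({
  create: (model: string, input: string) => adapter.send(model, input),
})
```

Swapping `anthropicLike` for `groqLike` changes the backend while `create()` keeps the same signature; that is the same shape of change as swapping `clientAdapter`.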
## Persistent state

`memoryStorageAdapter` is fine for one-shot scripts but loses everything on restart. For persisted conversations, threads, and runs, swap it for `prismaStorageAdapter`:
```typescript
import { PrismaClient } from '@prisma/client'
import {
  supercompat,
  anthropicClientAdapter,
  completionsRunAdapter,
  prismaStorageAdapter,
} from 'supercompat/openai'
import Anthropic from '@anthropic-ai/sdk'

const prisma = new PrismaClient()

const client = supercompat({
  clientAdapter: anthropicClientAdapter({ anthropic: new Anthropic() }),
  storageAdapter: prismaStorageAdapter({ prisma }),
  runAdapter: completionsRunAdapter(),
})

const first = await client.responses.create({
  model: 'claude-sonnet-4-6',
  input: 'My name is Alice.',
})

const second = await client.responses.create({
  model: 'claude-sonnet-4-6',
  input: 'What did I just tell you?',
  previous_response_id: first.id,
})
```
Conversations, responses, assistants, threads, messages, and runs all land in Postgres. See Storage adapters for every option — including OpenAI-managed and Azure-managed state.
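What `previous_response_id` threading asks of a storage adapter can be sketched with an in-memory store. This is simplified and hypothetical; `prismaStorageAdapter`'s actual schema and contract differ.

```typescript
// Minimal sketch of response threading: each stored response may point
// at a previous one, and the conversation history is recovered by
// walking that chain. Illustrative only, not Supercompat's storage API.
type StoredResponse = { id: string; input: string; previousId?: string }

class ResponseStore {
  private byId = new Map<string, StoredResponse>()
  private counter = 0

  save(input: string, previousId?: string): StoredResponse {
    const rec = { id: `resp_${++this.counter}`, input, previousId }
    this.byId.set(rec.id, rec)
    return rec
  }

  // Walk the previous_response_id chain back to the first turn.
  history(id: string): string[] {
    const out: string[] = []
    let cur = this.byId.get(id)
    while (cur) {
      out.unshift(cur.input)
      cur = cur.previousId ? this.byId.get(cur.previousId) : undefined
    }
    return out
  }
}
```

A durable adapter does the same walk against its database, which is why the second `responses.create()` call above can answer a question about the first.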
## Where to go next

- **Installation**: Install the package, pick a provider SDK, and wire them together.
- **Comparison**: How Supercompat compares to Vercel AI SDK, LiteLLM, LangChain, and others.
- **Output SDKs**: Return an OpenAI-shaped or Anthropic-shaped client. Works with every provider.
- **Adapters**: The three adapter types — client, storage, and run — and how they compose.
- **Providers**: Setup notes for OpenAI, Anthropic, Google, Azure, and every other backend.
- **Tools**: Function calling, web search, file search, code interpreter, and computer use.
- **Streaming**: Stream deltas through the OpenAI SDK regardless of which provider is behind it.