OpenAI
OpenAI is the native home of the types Supercompat uses, so there is almost nothing to translate. The main reason to route through Supercompat at all is to get a single, provider-agnostic surface so you can switch providers later.
Install
npm install supercompat openai
Minimal setup
import OpenAI from 'openai'
import {
  supercompat,
  openaiClientAdapter,
  openaiResponsesRunAdapter,
  memoryStorageAdapter,
} from 'supercompat/openai'

const client = supercompat({
  clientAdapter: openaiClientAdapter({
    openai: new OpenAI({ apiKey: process.env.OPENAI_API_KEY }),
  }),
  storageAdapter: memoryStorageAdapter(),
  runAdapter: openaiResponsesRunAdapter(),
})

const response = await client.responses.create({
  model: 'gpt-4.1-mini',
  input: 'Hello.',
})
Using OpenAI-managed state
Skip Postgres entirely. openaiResponsesStorageAdapter forwards conversations and responses directly to OpenAI.
import {
  openaiClientAdapter,
  openaiResponsesRunAdapter,
  openaiResponsesStorageAdapter,
} from 'supercompat/openai'

const client = supercompat({
  clientAdapter: openaiClientAdapter({ openai: new OpenAI() }),
  storageAdapter: openaiResponsesStorageAdapter(),
  runAdapter: openaiResponsesRunAdapter(),
})
Assistants API
The same client exposes the Assistants surface. Use completionsRunAdapter for provider-portable assistants, or openaiResponsesRunAdapter to run assistants on the Responses API.
const assistant = await client.beta.assistants.create({
  model: 'gpt-4.1-mini',
  instructions: 'You are a code reviewer.',
})
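From there the flow is the standard Assistants thread/message/run cycle. A sketch against the same client; `createAndPoll` is the polling helper from the official `openai` SDK surface, and that Supercompat proxies it identically is an assumption:

```typescript
// Sketch of the usual Assistants round trip on the Supercompat client.
const thread = await client.beta.threads.create()

await client.beta.threads.messages.create(thread.id, {
  role: 'user',
  content: 'Please review this function for bugs.',
})

// createAndPoll blocks until the run reaches a terminal status.
const run = await client.beta.threads.runs.createAndPoll(thread.id, {
  assistant_id: assistant.id,
})

const messages = await client.beta.threads.messages.list(thread.id)
```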
Native tools
With openaiResponsesRunAdapter, these built-in tools are forwarded to OpenAI as-is:
{ type: 'code_interpreter' }
{ type: 'computer_use_preview', computer_use_preview: { display_width, display_height, environment } }
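Passed together, the two shapes above form a plain tools array. The concrete `display_width`, `display_height`, and `environment` values below are placeholders for illustration, not documented defaults:

```typescript
// Tools array combining the built-in shapes listed above; the numeric and
// environment values are illustrative only.
const tools = [
  { type: 'code_interpreter' },
  {
    type: 'computer_use_preview',
    computer_use_preview: {
      display_width: 1280,
      display_height: 800,
      environment: 'browser',
    },
  },
]

console.log(tools.map((t) => t.type).join(','))
// code_interpreter,computer_use_preview
```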
Models
Some current examples: