OpenAI Compatible Providers
You can use the OpenAI Compatible provider package to access language model providers that implement the OpenAI API.
Below we focus on the general setup and provider instance creation. You can also write a custom provider package leveraging the OpenAI Compatible package.
We provide detailed documentation for several OpenAI compatible providers. The general setup and provider instance creation are the same for all of them.
Setup
The OpenAI Compatible provider is available via the @ai-sdk/openai-compatible
module. You can install it with:
pnpm add @ai-sdk/openai-compatible
Provider Instance
To use an OpenAI compatible provider, you can create a custom provider instance with the createOpenAICompatible function from @ai-sdk/openai-compatible:
import { createOpenAICompatible } from '@ai-sdk/openai-compatible';
const provider = createOpenAICompatible({
  name: 'provider-name',
  headers: {
    Authorization: `Bearer ${process.env.PROVIDER_API_KEY}`,
  },
  baseURL: 'https://api.provider.com/v1',
});
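Many providers use standard bearer-token authentication. In that case you may be able to pass the key via an apiKey setting instead of building the Authorization header yourself. This is a minimal sketch; the apiKey option (and the environment variable name) are assumptions to verify against your installed version of @ai-sdk/openai-compatible:

import { createOpenAICompatible } from '@ai-sdk/openai-compatible';

// Sketch: `apiKey` is assumed to set the `Authorization: Bearer <key>` header.
// Check the provider settings of your @ai-sdk/openai-compatible version.
const provider = createOpenAICompatible({
  name: 'provider-name',
  apiKey: process.env.PROVIDER_API_KEY,
  baseURL: 'https://api.provider.com/v1',
});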
Language Models
You can create provider models using a provider instance.
The first argument is the model id, e.g. model-id.
const model = provider('model-id');
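Besides calling the provider instance directly, the instance also exposes factory methods for specific model types. The sketch below reuses the provider created above; the completionModel and textEmbeddingModel method names are assumptions based on the package's provider interface (chatModel is used in the auto-completion example further down), so verify them against your installed version:

// Chat-completions style model (equivalent to calling `provider('model-id')`):
const chatModel = provider.chatModel('model-id');

// Legacy completions endpoint (assumed method name):
const completionModel = provider.completionModel('model-id');

// Embedding model (assumed method name):
const embeddingModel = provider.textEmbeddingModel('model-id');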
Example
You can use provider language models to generate text with the generateText function:
import { createOpenAICompatible } from '@ai-sdk/openai-compatible';
import { generateText } from 'ai';

const provider = createOpenAICompatible({
  name: 'provider-name',
  headers: {
    Authorization: `Bearer ${process.env.PROVIDER_API_KEY}`,
  },
  baseURL: 'https://api.provider.com/v1',
});

const { text } = await generateText({
  model: provider('model-id'),
  prompt: 'Write a vegetarian lasagna recipe for 4 people.',
});
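Streaming works the same way. Here is a brief sketch using the streamText function from the ai package, reusing the provider instance created above (in recent versions of ai, streamText returns its result synchronously and errors surface on the stream):

import { streamText } from 'ai';

const result = streamText({
  model: provider('model-id'),
  prompt: 'Write a vegetarian lasagna recipe for 4 people.',
});

// Consume the text stream chunk by chunk.
for await (const chunk of result.textStream) {
  process.stdout.write(chunk);
}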
Including model ids for auto-completion
You can pass model id union types as type parameters to createOpenAICompatible to get editor auto-completion for model ids while still accepting free-form strings:
import { createOpenAICompatible } from '@ai-sdk/openai-compatible';
import { generateText } from 'ai';

type ExampleChatModelIds =
  | 'meta-llama/Llama-3-70b-chat-hf'
  | 'meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo'
  | (string & {});

type ExampleCompletionModelIds =
  | 'codellama/CodeLlama-34b-Instruct-hf'
  | 'Qwen/Qwen2.5-Coder-32B-Instruct'
  | (string & {});

type ExampleEmbeddingModelIds =
  | 'BAAI/bge-large-en-v1.5'
  | 'bert-base-uncased'
  | (string & {});

const model = createOpenAICompatible<
  ExampleChatModelIds,
  ExampleCompletionModelIds,
  ExampleEmbeddingModelIds
>({
  name: 'example',
  headers: {
    Authorization: `Bearer ${process.env.MY_API_KEY}`,
  },
  baseURL: 'https://api.example.com/v1',
});

// Subsequent calls to e.g. `model.chatModel` will auto-complete the model id
// from the list of `ExampleChatModelIds` while still allowing free-form
// strings as well.
const { text } = await generateText({
  model: model.chatModel('meta-llama/Llama-3-70b-chat-hf'),
  prompt: 'Write a vegetarian lasagna recipe for 4 people.',
});
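The third type parameter constrains embedding model ids in the same way. As a hedged sketch, pairing the embed function from the ai package with the provider's textEmbeddingModel method (method name assumed from the provider interface; verify against your installed version):

import { embed } from 'ai';

// `model` is the typed provider instance created above.
const { embedding } = await embed({
  model: model.textEmbeddingModel('BAAI/bge-large-en-v1.5'),
  value: 'sunny day at the beach',
});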