Providers and Models

Companies such as OpenAI and Anthropic (providers) offer access to a range of large language models (LLMs) with differing strengths and capabilities through their own APIs.

Each provider typically has its own unique method of interfacing with its models, which complicates switching providers and increases the risk of vendor lock-in.

To solve these challenges, Vercel AI SDK Core offers a standardized approach to interacting with LLMs through a language model specification that abstracts differences between providers. This unified interface lets you switch providers with ease while using the same API throughout, as sketched below.
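As a minimal illustration of this unified interface (assuming the `@ai-sdk/openai` and `@ai-sdk/anthropic` provider packages are installed and the corresponding API keys are set in the environment; model names and prompt are arbitrary examples):

```ts
import { generateText } from 'ai';
import { openai } from '@ai-sdk/openai';
import { anthropic } from '@ai-sdk/anthropic';

// The call signature is the same regardless of provider:
// only the model argument changes.
const { text } = await generateText({
  model: openai('gpt-4o'),
  prompt: 'Explain vendor lock-in in one sentence.',
});

// Switching providers is a one-line change; the rest of the code stays the same.
const { text: claudeText } = await generateText({
  model: anthropic('claude-3-5-sonnet-20240620'),
  prompt: 'Explain vendor lock-in in one sentence.',
});
```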

Here is an overview of the AI SDK Provider Architecture:

[Diagram: AI SDK Core Provider Architecture]

AI SDK Providers

Vercel AI SDK comes with several providers that you can use to interact with different language models, including those listed in the model capabilities table below.

You can also use the OpenAI provider with other services that expose OpenAI-compatible APIs, as sketched below.
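A minimal sketch of pointing the OpenAI provider at a compatible endpoint via `createOpenAI` (the Groq base URL, model name, and environment variable here are illustrative examples; substitute the details of whichever compatible service you are targeting):

```ts
import { createOpenAI } from '@ai-sdk/openai';
import { generateText } from 'ai';

// Example: Groq's OpenAI-compatible endpoint. Swap in the base URL and
// API key of the compatible service you actually use.
const groq = createOpenAI({
  baseURL: 'https://api.groq.com/openai/v1',
  apiKey: process.env.GROQ_API_KEY,
});

const { text } = await generateText({
  model: groq('llama-3.1-8b-instant'),
  prompt: 'Why do OpenAI-compatible APIs reduce switching costs?',
});
```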

Our language model specification is published as an open-source package, which you can use to create custom providers.

The open-source community has also created providers based on this specification.

Model Capabilities

The AI providers support different language models with various capabilities. Here are the capabilities of popular models:

| Provider | Model | Image Input | Object Generation | Tool Usage | Tool Streaming |
| --- | --- | --- | --- | --- | --- |
| OpenAI | gpt-4o | ✓ | ✓ | ✓ | ✓ |
| OpenAI | gpt-4o-mini | ✓ | ✓ | ✓ | ✓ |
| OpenAI | gpt-4-turbo | ✓ | ✓ | ✓ | ✓ |
| OpenAI | gpt-4 | ✗ | ✓ | ✓ | ✓ |
| Anthropic | claude-3-5-sonnet-20240620 | ✓ | ✓ | ✓ | ✓ |
| Mistral | mistral-large-latest | ✗ | ✓ | ✓ | ✓ |
| Mistral | mistral-small-latest | ✗ | ✓ | ✓ | ✓ |
| Google Generative AI | models/gemini-1.5-pro-latest | ✓ | ✓ | ✓ | ✓ |
| Google Vertex | gemini-1.5-flash | ✓ | ✓ | ✓ | ✓ |
| Google Vertex | gemini-1.5-pro | ✓ | ✓ | ✓ | ✓ |
| Groq | llama-3.1-405b-reasoning | ✗ | ✓ | ✓ | ✓ |
| Groq | llama-3.1-70b-versatile | ✗ | ✓ | ✓ | ✓ |
| Groq | llama-3.1-8b-instant | ✗ | ✓ | ✓ | ✓ |
| Groq | mixtral-8x7b-32768 | ✗ | ✓ | ✓ | ✓ |
| Groq | gemma2-9b-it | ✗ | ✓ | ✓ | ✓ |

This table is not exhaustive. Additional models can be found in the provider documentation pages and on the provider websites.
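To illustrate the Object Generation and Tool Usage columns, here is a minimal sketch (the model choice, Zod schema, tool definition, and prompts are arbitrary examples, not part of the table above) using `generateObject` and `generateText`:

```ts
import { generateObject, generateText, tool } from 'ai';
import { openai } from '@ai-sdk/openai';
import { z } from 'zod';

// Object generation: the model returns structured data matching a Zod schema.
const { object } = await generateObject({
  model: openai('gpt-4o'),
  schema: z.object({
    name: z.string(),
    capabilities: z.array(z.string()),
  }),
  prompt: 'Describe a fictional language model.',
});

// Tool usage: the model may call the declared tool; tool results are
// available on the returned object.
const { text, toolResults } = await generateText({
  model: openai('gpt-4o'),
  tools: {
    weather: tool({
      description: 'Get the weather for a city',
      parameters: z.object({ city: z.string() }),
      execute: async ({ city }) => ({ city, temperatureC: 21 }),
    }),
  },
  prompt: 'What is the weather in Berlin?',
});
```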