Anthropic Provider

The Anthropic provider contains language model support for the Anthropic Messages API.

Setup

The Anthropic provider is available in the @ai-sdk/anthropic module. You can install it with:

pnpm install @ai-sdk/anthropic

Provider Instance

You can import the default provider instance anthropic from @ai-sdk/anthropic:

import { anthropic } from '@ai-sdk/anthropic';

If you need a customized setup, you can import createAnthropic from @ai-sdk/anthropic and create a provider instance with your settings:

import { createAnthropic } from '@ai-sdk/anthropic';
const anthropic = createAnthropic({
  // custom settings
});

You can use the following optional settings to customize the Anthropic provider instance (a combined example follows this list):

  • baseURL string

    Use a different URL prefix for API calls, e.g. to use proxy servers. The default prefix is https://api.anthropic.com/v1.

  • apiKey string

    API key that is sent using the x-api-key header. It defaults to the ANTHROPIC_API_KEY environment variable.

  • headers Record<string,string>

    Custom headers to include in the requests.

  • fetch (input: RequestInfo, init?: RequestInit) => Promise<Response>

    Custom fetch implementation. Defaults to the global fetch function. You can use it as a middleware to intercept requests, or to provide a custom fetch implementation for e.g. testing.
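
A minimal sketch that combines these settings could look like the following; the proxy URL, environment variable name, and header are placeholders, not values required by the provider:

import { createAnthropic } from '@ai-sdk/anthropic';
const anthropic = createAnthropic({
  // placeholder proxy URL; the default prefix is https://api.anthropic.com/v1
  baseURL: 'https://my-proxy.example.com/v1',
  // placeholder environment variable; defaults to ANTHROPIC_API_KEY when omitted
  apiKey: process.env.MY_ANTHROPIC_API_KEY,
  // example custom header added to every request
  headers: { 'X-Request-Source': 'docs-example' },
  // wrap the global fetch, e.g. to log outgoing requests
  fetch: async (input, init) => {
    console.log('Anthropic request:', input);
    return fetch(input, init);
  },
});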

Language Models

You can create models that call the Anthropic Messages API using the provider instance. The first argument is the model id, e.g. claude-3-haiku-20240307. Some models have multi-modal capabilities.

const model = anthropic('claude-3-haiku-20240307');

Anthropic Messages models also support some model-specific settings that are not part of the standard call settings. You can pass them as an options argument:

const model = anthropic('claude-3-haiku-20240307', {
  topK: 32,
});

The following optional settings are available for Anthropic models:

  • topK number

    Only sample from the top K options for each subsequent token.

    Used to remove "long tail" low probability responses. Recommended for advanced use cases only. You usually only need to use temperature.

Example

You can use Anthropic language models to generate text with the generateText function:

import { anthropic } from '@ai-sdk/anthropic';
import { generateText } from 'ai';
const { text } = await generateText({
  model: anthropic('claude-3-haiku-20240307'),
  prompt: 'Write a vegetarian lasagna recipe for 4 people.',
});

Anthropic language models can also be used in the streamText, generateObject, streamObject, and streamUI functions (see AI SDK Core and AI SDK RSC).
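
For example, a minimal streaming sketch using the same prompt could look like this:

import { anthropic } from '@ai-sdk/anthropic';
import { streamText } from 'ai';
const { textStream } = await streamText({
  model: anthropic('claude-3-haiku-20240307'),
  prompt: 'Write a vegetarian lasagna recipe for 4 people.',
});
// consume the stream as it arrives
for await (const textPart of textStream) {
  process.stdout.write(textPart);
}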

Model Capabilities

Model                         Image Input   Object Generation   Tool Usage   Tool Streaming
claude-3-5-sonnet-20240620    ✓             ✓                   ✓            ✓
claude-3-opus-20240229        ✓             ✓                   ✓            ✓
claude-3-sonnet-20240229      ✓             ✓                   ✓            ✓
claude-3-haiku-20240307       ✓             ✓                   ✓            ✓
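
Since these models accept image input, a multi-modal call can pass image content parts in a message. The sketch below assumes a local file path as a placeholder; the image part also accepts e.g. a Uint8Array, base64 string, or URL:

import fs from 'node:fs';
import { anthropic } from '@ai-sdk/anthropic';
import { generateText } from 'ai';
const { text } = await generateText({
  model: anthropic('claude-3-5-sonnet-20240620'),
  messages: [
    {
      role: 'user',
      content: [
        { type: 'text', text: 'Describe the contents of this image.' },
        // placeholder file path for illustration
        { type: 'image', image: fs.readFileSync('./image.png') },
      ],
    },
  ],
});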