Amazon Bedrock Provider

The Amazon Bedrock provider for the Vercel AI SDK contains language model support for the Amazon Bedrock APIs.

Setup

The Bedrock provider is available in the @ai-sdk/amazon-bedrock module. You can install it with

pnpm install @ai-sdk/amazon-bedrock

Prerequisites

Access to Amazon Bedrock foundation models isn't granted by default. To gain access to a foundation model, an IAM user with sufficient permissions needs to request it through the console. Once access is granted for a model, it is available to all users in the account.

See the Model Access Docs for more information.

Authentication

Step 1: Creating AWS Access Key and Secret Key

To get started, you'll need to create an AWS access key and secret key. Here's how:

Log in to the AWS Management Console

Create an IAM User

  • Navigate to the IAM dashboard and click on "Users" in the left-hand navigation menu.
  • Click on "Create user" and fill in the required details to create a new IAM user.
  • Make sure to select "Programmatic access" as the access type.
  • The user account needs the AmazonBedrockFullAccess policy attached to it.

Create Access Key

  • Click on the "Security credentials" tab and then click on "Create access key".
  • Click "Create access key" to generate a new access key pair.
  • Download the .csv file containing the access key ID and secret access key.

Step 2: Configuring the Access Key and Secret Key

Within your project, add a .env file if you don't already have one. This file will be used to set the access key and secret key as environment variables. Add the following lines to the .env file:

AWS_ACCESS_KEY_ID=YOUR_ACCESS_KEY_ID
AWS_SECRET_ACCESS_KEY=YOUR_SECRET_ACCESS_KEY
AWS_REGION=YOUR_REGION

Many frameworks such as Next.js load the .env file automatically. If you're using a different framework, you may need to load the .env file manually using a package like dotenv.

Remember to replace YOUR_ACCESS_KEY_ID, YOUR_SECRET_ACCESS_KEY, and YOUR_REGION with the actual values from your AWS account.
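If your framework does not load the .env file automatically, it can be loaded explicitly at startup. A minimal sketch using the dotenv package (an assumption; any equivalent loader works):

```typescript
// Load variables from .env into process.env before anything reads them.
import 'dotenv/config';

// The Bedrock provider picks these up automatically; this check is just
// illustrative, to confirm the file was loaded.
if (!process.env.AWS_ACCESS_KEY_ID || !process.env.AWS_SECRET_ACCESS_KEY) {
  throw new Error('AWS credentials are not set — check your .env file.');
}
```

The import must run before any module that reads the environment, so place it at the top of your entry file.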

Provider Instance

You can import the default provider instance bedrock from @ai-sdk/amazon-bedrock:

import { bedrock } from '@ai-sdk/amazon-bedrock';

If you need a customized setup, you can import createAmazonBedrock from @ai-sdk/amazon-bedrock and create a provider instance with your settings:

import { createAmazonBedrock } from '@ai-sdk/amazon-bedrock';

const bedrock = createAmazonBedrock({
  region: 'us-east-1',
  accessKeyId: 'xxxxxxxxx',
  secretAccessKey: 'xxxxxxxxx',
});

You can use the following optional settings to customize the Amazon Bedrock provider instance:

  • region string

    The AWS region that you want to use for the API calls. It uses the AWS_REGION environment variable by default.

  • accessKeyId string

    The AWS access key ID that you want to use for the API calls. It uses the AWS_ACCESS_KEY_ID environment variable by default.

  • secretAccessKey string

    The AWS secret access key that you want to use for the API calls. It uses the AWS_SECRET_ACCESS_KEY environment variable by default.

  • bedrockOptions object

    Optional. The configuration options used by the Amazon Bedrock Library (BedrockRuntimeClientConfig), including:

    • region string The AWS region that you want to use for the API calls.

    • credentials object The AWS credentials that you want to use for the API calls.

    When bedrockOptions are provided, the region, accessKeyId, and secretAccessKey settings are ignored.
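As a sketch of the bedrockOptions setting described above (the credential values are placeholders):

```typescript
import { createAmazonBedrock } from '@ai-sdk/amazon-bedrock';

// When bedrockOptions is provided, the top-level region, accessKeyId, and
// secretAccessKey settings are ignored — configure everything through the
// BedrockRuntimeClientConfig instead.
const bedrock = createAmazonBedrock({
  bedrockOptions: {
    region: 'us-east-1',
    credentials: {
      accessKeyId: 'xxxxxxxxx', // placeholder
      secretAccessKey: 'xxxxxxxxx', // placeholder
    },
  },
});
```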

Language Models

You can create models that call the Bedrock API using the provider instance. The first argument is the model id, e.g. meta.llama3-70b-instruct-v1:0.

const model = bedrock('meta.llama3-70b-instruct-v1:0');

Amazon Bedrock models also support some model-specific settings that are not part of the standard call settings. You can pass them as an options argument:

const model = bedrock('anthropic.claude-3-sonnet-20240229-v1:0', {
  additionalModelRequestFields: { top_k: 350 },
});

Documentation for additional settings based on the selected model can be found within the Amazon Bedrock Inference Parameter Documentation.

Example

You can use Amazon Bedrock language models to generate text with the generateText function:

import { bedrock } from '@ai-sdk/amazon-bedrock';
import { generateText } from 'ai';

const { text } = await generateText({
  model: bedrock('meta.llama3-70b-instruct-v1:0'),
  prompt: 'Write a vegetarian lasagna recipe for 4 people.',
});

Amazon Bedrock language models can also be used in the streamText function (see AI SDK Core).
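A sketch of the streaming variant, mirroring the generateText example above (the prompt is illustrative):

```typescript
import { bedrock } from '@ai-sdk/amazon-bedrock';
import { streamText } from 'ai';

const { textStream } = await streamText({
  model: bedrock('meta.llama3-70b-instruct-v1:0'),
  prompt: 'Write a vegetarian lasagna recipe for 4 people.',
});

// Consume the stream chunk by chunk as text arrives from the model.
for await (const textPart of textStream) {
  process.stdout.write(textPart);
}
```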

Model Capabilities

Note: This model list is ever changing and may not be complete. Refer to the Amazon Bedrock documentation for up to date information.

Model | Image Input | Object Generation | Tool Usage | Tool Streaming
amazon.titan-tg1-large
amazon.titan-text-express-v1
anthropic.claude-3-5-sonnet-20240620-v1:0
anthropic.claude-3-opus-20240229-v1:0
anthropic.claude-3-sonnet-20240229-v1:0
anthropic.claude-3-haiku-20240307-v1:0
anthropic.claude-v2:1
cohere.command-r-v1:0
cohere.command-r-plus-v1:0
meta.llama2-13b-chat-v1
meta.llama2-70b-chat-v1
meta.llama3-8b-instruct-v1:0
meta.llama3-70b-instruct-v1:0
mistral.mistral-7b-instruct-v0:2
mistral.mixtral-8x7b-instruct-v0:1
mistral.mistral-large-2402-v1:0
mistral.mistral-small-2402-v1:0