# Google Vertex Provider
The Google Vertex provider for the Vercel AI SDK contains language model support for the Google Vertex AI APIs.
The Google Vertex provider is not compatible with edge environments.
## Setup
The Google Vertex provider is available in the `@ai-sdk/google-vertex` module. You can install it with:

```bash
pnpm install @ai-sdk/google-vertex
```
## Provider Instance
You can import the default provider instance `vertex` from `@ai-sdk/google-vertex`:

```ts
import { vertex } from '@ai-sdk/google-vertex';
```
If you need a customized setup, you can import `createVertex` from `@ai-sdk/google-vertex` and create a provider instance with your settings:

```ts
import { createVertex } from '@ai-sdk/google-vertex';

const vertex = createVertex({
  project: 'my-project', // optional
  location: 'us-central1', // optional
});
```
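Both settings are optional and, when omitted, fall back to environment variables. The sketch below illustrates that documented fallback behavior; it is not the provider's actual implementation, and the `resolveVertexConfig` helper is hypothetical:

```typescript
// Illustrative sketch of the documented fallback: explicit settings take
// precedence over the GOOGLE_VERTEX_PROJECT / GOOGLE_VERTEX_LOCATION
// environment variables. Not the provider's actual source code.
function resolveVertexConfig(
  settings: { project?: string; location?: string },
  env: Record<string, string | undefined>,
) {
  return {
    project: settings.project ?? env.GOOGLE_VERTEX_PROJECT,
    location: settings.location ?? env.GOOGLE_VERTEX_LOCATION,
  };
}

const env = { GOOGLE_VERTEX_PROJECT: 'my-project' };
// Explicit location wins; project falls back to the environment value.
const resolved = resolveVertexConfig({ location: 'europe-west1' }, env);
```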
You can use the following optional settings to customize the Google Vertex provider instance:
- `project` _string_

  The Google Cloud project ID that you want to use for the API calls. It uses the `GOOGLE_VERTEX_PROJECT` environment variable by default.

- `location` _string_

  The Google Cloud location that you want to use for the API calls, e.g. `us-central1`. It uses the `GOOGLE_VERTEX_LOCATION` environment variable by default.

- `googleAuthOptions` _object_

  Optional. The authentication options used by the Google Auth Library:

  - `authClient` _object_ An `AuthClient` to use.
  - `keyFilename` _string_ Path to a `.json`, `.pem`, or `.p12` key file.
  - `keyFile` _string_ Path to a `.json`, `.pem`, or `.p12` key file.
  - `credentials` _object_ Object containing `client_email` and `private_key` properties, or the external account client options.
  - `clientOptions` _object_ Options object passed to the constructor of the client.
  - `scopes` _string | string[]_ Required scopes for the desired API request.
  - `projectId` _string_ Your project ID.
  - `universeDomain` _string_ The default service domain for a given Cloud universe.
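For example, service-account credentials can be supplied through `googleAuthOptions`. The object below is a sketch: the field names follow the Google Auth Library options listed above, and all values are placeholders:

```typescript
// Placeholder service-account credentials (the values are not real).
// Field names follow the googleAuthOptions list above.
const googleAuthOptions = {
  credentials: {
    client_email: 'my-service-account@my-project.iam.gserviceaccount.com',
    private_key: '-----BEGIN PRIVATE KEY-----\n<key material>\n-----END PRIVATE KEY-----\n',
  },
  // The cloud-platform scope covers Vertex AI API calls.
  scopes: ['https://www.googleapis.com/auth/cloud-platform'],
};

// This object would then be passed to createVertex, e.g.:
// const vertex = createVertex({ project: 'my-project', googleAuthOptions });
```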
## Language Models
You can create models that call the Vertex API using the provider instance. The first argument is the model id, e.g. `gemini-1.5-pro`.

```ts
const model = vertex('gemini-1.5-pro');
```
Google Vertex models also support some model-specific settings that are not part of the standard call settings. You can pass them as an options argument:

```ts
const model = vertex('gemini-1.5-pro', {
  topK: 40,
});
```
The following optional settings are available for Google Vertex models:

- `topK` _number_

  Optional. The maximum number of tokens to consider when sampling.

  Models use nucleus sampling or combined top-k and nucleus sampling. Top-k sampling considers the set of `topK` most probable tokens. Models running with nucleus sampling don't allow a `topK` setting.

- `safetySettings` _Array<{ category: string; threshold: string }>_

  Optional. Safety settings for the model.

  - `category` _string_

    The category of the safety setting. Can be one of the following:

    - `HARM_CATEGORY_UNSPECIFIED`
    - `HARM_CATEGORY_HATE_SPEECH`
    - `HARM_CATEGORY_DANGEROUS_CONTENT`
    - `HARM_CATEGORY_HARASSMENT`
    - `HARM_CATEGORY_SEXUALLY_EXPLICIT`

  - `threshold` _string_

    The threshold of the safety setting. Can be one of the following:

    - `HARM_BLOCK_THRESHOLD_UNSPECIFIED`
    - `BLOCK_LOW_AND_ABOVE`
    - `BLOCK_MEDIUM_AND_ABOVE`
    - `BLOCK_ONLY_HIGH`
    - `BLOCK_NONE`
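As a concrete illustration, a `safetySettings` array built from the categories and thresholds above might look like this. The category/threshold pairings are arbitrary examples, not recommendations:

```typescript
// Sketch: a safetySettings array using the documented category and
// threshold values. The specific pairings are arbitrary examples.
const safetySettings = [
  { category: 'HARM_CATEGORY_HATE_SPEECH', threshold: 'BLOCK_LOW_AND_ABOVE' },
  { category: 'HARM_CATEGORY_DANGEROUS_CONTENT', threshold: 'BLOCK_MEDIUM_AND_ABOVE' },
  { category: 'HARM_CATEGORY_SEXUALLY_EXPLICIT', threshold: 'BLOCK_ONLY_HIGH' },
];

// Passed as model-specific settings, e.g.:
// const model = vertex('gemini-1.5-pro', { safetySettings });
```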
## Example
You can use Google Vertex language models to generate text with the `generateText` function:

```ts
import { vertex } from '@ai-sdk/google-vertex';
import { generateText } from 'ai';

const { text } = await generateText({
  model: vertex('gemini-1.5-pro'),
  prompt: 'Write a vegetarian lasagna recipe for 4 people.',
});
```
Google Vertex language models can also be used in the `streamText` and `streamUI` functions (see AI SDK Core and AI SDK RSC).
## Model Capabilities
| Model | Image Input | Object Generation | Tool Usage | Tool Streaming |
| --- | --- | --- | --- | --- |
| `gemini-1.5-flash` | ✓ | ✓ | ✓ | ✓ |
| `gemini-1.5-pro` | ✓ | ✓ | ✓ | ✓ |
| `gemini-1.0-pro-vision` | ✓ | ✓ | ✓ | ✓ |
| `gemini-1.0-pro` | ✗ | ✓ | ✓ | ✓ |