# Google Generative AI Provider

The Google Generative AI provider contains language model support for the Google Generative AI APIs.
## Setup

The Google provider is available in the `@ai-sdk/google` module. You can install it with:

```bash
pnpm install @ai-sdk/google
```
## Provider Instance

You can import the default provider instance `google` from `@ai-sdk/google`:

```ts
import { google } from '@ai-sdk/google';
```

If you need a customized setup, you can import `createGoogleGenerativeAI` from `@ai-sdk/google` and create a provider instance with your settings:

```ts
import { createGoogleGenerativeAI } from '@ai-sdk/google';

const google = createGoogleGenerativeAI({
  // custom settings
});
```
You can use the following optional settings to customize the Google Generative AI provider instance:

- **baseURL** _string_

  Use a different URL prefix for API calls, e.g. to use proxy servers. The default prefix is `https://generativelanguage.googleapis.com/v1beta`.

- **apiKey** _string_

  API key that is sent using the `x-goog-api-key` header. It defaults to the `GOOGLE_GENERATIVE_AI_API_KEY` environment variable.

- **headers** _Record<string, string>_

  Custom headers to include in the requests.

- **fetch** _(input: RequestInfo, init?: RequestInit) => Promise<Response>_

  Custom fetch implementation. Defaults to the global `fetch` function. You can use it as a middleware to intercept requests, or to provide a custom fetch implementation, e.g. for testing.
## Language Models

You can create models that call the Google Generative AI API using the provider instance. The first argument is the model id, e.g. `models/gemini-pro`. The models support tool calls, and some have multi-modal capabilities.

```ts
const model = google('models/gemini-pro');
```

Google Generative AI models also support some model-specific settings that are not part of the standard call settings. You can pass them as an options argument:

```ts
const model = google('models/gemini-pro', {
  topK: 32, // an integer: the number of most probable tokens to sample from
});
```
The following optional settings are available for Google Generative AI models:

- **topK** _number_

  Optional. The maximum number of tokens to consider when sampling.

  Models use nucleus sampling or combined top-k and nucleus sampling. Top-k sampling considers the set of `topK` most probable tokens. Models running with nucleus sampling don't allow a `topK` setting.

- **safetySettings** _Array<{ category: string; threshold: string }>_

  Optional. Safety settings for the model.

  - **category** _string_

    The category of the safety setting. Can be one of the following:

    - `HARM_CATEGORY_HATE_SPEECH`
    - `HARM_CATEGORY_DANGEROUS_CONTENT`
    - `HARM_CATEGORY_HARASSMENT`
    - `HARM_CATEGORY_SEXUALLY_EXPLICIT`

  - **threshold** _string_

    The threshold of the safety setting. Can be one of the following:

    - `HARM_BLOCK_THRESHOLD_UNSPECIFIED`
    - `BLOCK_LOW_AND_ABOVE`
    - `BLOCK_MEDIUM_AND_ABOVE`
    - `BLOCK_ONLY_HIGH`
    - `BLOCK_NONE`
### Example

You can use Google Generative AI language models to generate text with the `generateText` function:

```ts
import { google } from '@ai-sdk/google';
import { generateText } from 'ai';

const { text } = await generateText({
  model: google('models/gemini-pro'),
  prompt: 'Write a vegetarian lasagna recipe for 4 people.',
});
```
Google Generative AI language models can also be used in the `streamText`, `generateObject`, `streamObject`, and `streamUI` functions (see AI SDK Core and AI SDK RSC).
## Model Capabilities

| Model | Image Input | Object Generation | Tool Usage | Tool Streaming |
| --- | --- | --- | --- | --- |
| models/gemini-1.5-pro-latest | | | | |