embed()
Generate an embedding for a single value using an embedding model.
This is ideal for use cases where you need to embed a single value, for example to retrieve similar items or to use the embedding in a downstream task.
```ts
import { openai } from '@ai-sdk/openai';
import { embed } from 'ai';

const { embedding } = await embed({
  model: openai.embedding('text-embedding-3-small'),
  value: 'sunny day at the beach',
});
```
Import
```ts
import { embed } from "ai"
```
API Signature
Parameters
model:
The embedding model to use. Example: openai.embedding('text-embedding-3-small')
value:
The value to embed. The type depends on the model.
maxRetries?: number
Maximum number of retries. Set to 0 to disable retries. Default: 2.
abortSignal?: AbortSignal
An optional abort signal that can be used to cancel the call.
headers?: Record<string, string>
Additional HTTP headers to be sent with the request. Only applicable for HTTP-based providers.
experimental_telemetry?:
Telemetry configuration. Experimental feature.
TelemetrySettings
isEnabled?: boolean
Enable or disable telemetry. Disabled by default while experimental.
functionId?: string
Identifier for this function. Used to group telemetry data by function.
metadata?:
Additional information to include in the telemetry data.
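The optional settings above can be combined in a single call. A minimal sketch of the settings object, assuming the built-in `AbortSignal.timeout` helper (Node 17.3+) for the timeout; the 5-second limit and header name are illustrative:

```typescript
// Optional settings for embed(), mirroring the parameters above.
// The timeout value and header name are illustrative assumptions.
const settings = {
  maxRetries: 0, // fail fast instead of the default 2 retries
  abortSignal: AbortSignal.timeout(5_000), // cancel the call after 5 seconds
  headers: { 'X-Request-Id': 'embed-demo' }, // sent only by HTTP-based providers
};

// Spread alongside model and value, e.g.:
// await embed({ model, value: 'sunny day at the beach', ...settings });
console.log(settings.abortSignal instanceof AbortSignal); // true
```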
Returns
value:
The value that was embedded.
embedding: Embedding
The embedding of the value.
usage:
The token usage for generating the embedding.
EmbeddingTokenUsage
tokens: number
The total number of input tokens.
rawResponse:
Optional raw response data.
RawResponse
headers:
Response headers.
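The returned embedding is a plain vector of numbers, so downstream similarity lookups usually reduce to cosine similarity (the AI SDK also exports a `cosineSimilarity` helper for this). A self-contained sketch of the comparison, with hand-written vectors standing in for real embeddings:

```typescript
// Cosine similarity between two embedding vectors.
// Illustrative re-implementation; the `ai` package exports an equivalent helper.
function cosineSimilarity(a: number[], b: number[]): number {
  if (a.length !== b.length) throw new Error('vector length mismatch');
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Identical directions score 1, orthogonal directions score 0.
console.log(cosineSimilarity([1, 0], [1, 0])); // 1
console.log(cosineSimilarity([1, 0], [0, 1])); // 0
```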