# Together.ai Provider

The Together.ai provider provides support for 200+ open-source models through the Together.ai API.
## Setup

The Together.ai provider is available via the `@ai-sdk/togetherai` module. You can install it with:

```bash
pnpm add @ai-sdk/togetherai
```
## Provider Instance

You can import the default provider instance `togetherai` from `@ai-sdk/togetherai`:

```ts
import { togetherai } from '@ai-sdk/togetherai';
```

If you need a customized setup, you can import `createTogetherAI` from `@ai-sdk/togetherai` and create a provider instance with your settings:

```ts
import { createTogetherAI } from '@ai-sdk/togetherai';

const togetherai = createTogetherAI({
  apiKey: process.env.TOGETHER_AI_API_KEY ?? '',
});
```
You can use the following optional settings to customize the Together.ai provider instance:

- **baseURL** _string_

  Use a different URL prefix for API calls, e.g. to use proxy servers.
  The default prefix is `https://api.together.xyz/v1`.

- **apiKey** _string_

  API key that is being sent using the `Authorization` header.
  It defaults to the `TOGETHER_AI_API_KEY` environment variable.

- **headers** _Record&lt;string,string&gt;_

  Custom headers to include in the requests.

- **fetch** _(input: RequestInfo, init?: RequestInit) => Promise&lt;Response&gt;_

  Custom fetch implementation. Defaults to the global `fetch` function.
  You can use it as a middleware to intercept requests, or to provide a custom fetch implementation for e.g. testing.
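As a sketch, the settings above can be combined to route requests through a proxy and attach custom headers. The URL and header values below are placeholders, not real endpoints:

```typescript
import { createTogetherAI } from '@ai-sdk/togetherai';

const togetherai = createTogetherAI({
  apiKey: process.env.TOGETHER_AI_API_KEY ?? '',
  // Placeholder proxy URL; the default is https://api.together.xyz/v1
  baseURL: 'https://my-proxy.example.com/v1',
  headers: {
    // Example custom header sent with every request
    'X-Request-Source': 'my-app',
  },
});
```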
## Language Models

You can create Together.ai models using a provider instance. The first argument is the model ID, e.g. `google/gemma-2-9b-it`.

```ts
const model = togetherai('google/gemma-2-9b-it');
```
### Example

You can use Together.ai language models to generate text with the `generateText` function:

```ts
import { togetherai } from '@ai-sdk/togetherai';
import { generateText } from 'ai';

const { text } = await generateText({
  model: togetherai('meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo'),
  prompt: 'Write a vegetarian lasagna recipe for 4 people.',
});
```
Together.ai language models can also be used in the `streamText` and `streamUI` functions (see AI SDK Core and AI SDK RSC).
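For instance, a streaming variant of the example above might look as follows. This is a minimal sketch using the `streamText` function from AI SDK Core; the model ID mirrors the earlier example:

```typescript
import { togetherai } from '@ai-sdk/togetherai';
import { streamText } from 'ai';

const result = streamText({
  model: togetherai('meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo'),
  prompt: 'Write a vegetarian lasagna recipe for 4 people.',
});

// Print the response incrementally as tokens arrive.
for await (const textPart of result.textStream) {
  process.stdout.write(textPart);
}
```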
The Together.ai provider also supports completion models via `togetherai.completionModel()` and embedding models via `togetherai.textEmbeddingModel()`.
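As a sketch, an embedding model can be used with the `embed` function from the `ai` package. The model ID below is an example; check the Together.ai docs for the embedding models available to your account:

```typescript
import { togetherai } from '@ai-sdk/togetherai';
import { embed } from 'ai';

const { embedding } = await embed({
  // Example embedding model ID; substitute one from the Together.ai model list.
  model: togetherai.textEmbeddingModel('togethercomputer/m2-bert-80M-8k-retrieval'),
  value: 'sunny day at the beach',
});
```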
## Model Capabilities

| Model                                        | Image Input | Object Generation | Tool Usage | Tool Streaming |
| -------------------------------------------- | ----------- | ----------------- | ---------- | -------------- |
| meta-llama/Meta-Llama-3.3-70B-Instruct-Turbo |             |                   |            |                |
| meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo  |             |                   |            |                |
| mistralai/Mixtral-8x22B-Instruct-v0.1        |             |                   |            |                |
| mistralai/Mistral-7B-Instruct-v0.3           |             |                   |            |                |
| google/gemma-2b-it                           |             |                   |            |                |
| Qwen/Qwen2.5-72B-Instruct-Turbo              |             |                   |            |                |
| databricks/dbrx-instruct                     |             |                   |            |                |
The table above lists popular models. Please see the Together.ai docs for a full list of available models. You can also pass any available provider model ID as a string if needed.