Groq Provider

Groq is supported via OpenAI API compatibility - the OpenAI provider is used in the examples below.

The Groq provider contains language model support for the Groq API. It creates language model objects that can be used with the generateText, streamText, and generateObject functions.

Setup

The Groq provider is available via the @ai-sdk/openai module, since the Groq API is compatible with the OpenAI API. You can install it with:

pnpm install @ai-sdk/openai

Provider Instance

To use Groq, you can create a custom provider instance with the createOpenAI function from @ai-sdk/openai:

import { createOpenAI } from '@ai-sdk/openai';
const groq = createOpenAI({
  baseURL: 'https://api.groq.com/openai/v1',
  apiKey: process.env.GROQ_API_KEY,
});

Language Models

You can create Groq models using a provider instance. The first argument is the model id, e.g. llama3-8b-8192.

const model = groq('llama3-8b-8192');

Example

You can use Groq language models to generate text with the generateText function:

import { createOpenAI } from '@ai-sdk/openai';
import { generateText } from 'ai';
const groq = createOpenAI({
  baseURL: 'https://api.groq.com/openai/v1',
  apiKey: process.env.GROQ_API_KEY,
});

const { text } = await generateText({
  model: groq('llama3-8b-8192'),
  prompt: 'Write a vegetarian lasagna recipe for 4 people.',
});

Groq language models can also be used in the streamText, generateObject, streamObject, and streamUI functions (see AI SDK Core and AI SDK RSC).
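As a minimal sketch, streaming works the same way with the streamText function (the model ID and prompt below are illustrative and reuse the setup from the example above):

import { createOpenAI } from '@ai-sdk/openai';
import { streamText } from 'ai';

// Groq provider instance via OpenAI API compatibility (same setup as above).
const groq = createOpenAI({
  baseURL: 'https://api.groq.com/openai/v1',
  apiKey: process.env.GROQ_API_KEY,
});

const result = await streamText({
  model: groq('llama3-8b-8192'),
  prompt: 'Write a vegetarian lasagna recipe for 4 people.',
});

// Consume the response incrementally instead of waiting for the full completion.
for await (const textPart of result.textStream) {
  process.stdout.write(textPart);
}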

Model Capabilities

Groq offers a variety of models with different capabilities, including:

| Model                    | Image Input | Object Generation | Tool Usage | Tool Streaming |
| ------------------------ | ----------- | ----------------- | ---------- | -------------- |
| llama-3.1-405b-reasoning | ✗           | ✓                 | ✓          | ✓              |
| llama-3.1-70b-versatile  | ✗           | ✓                 | ✓          | ✓              |
| llama-3.1-8b-instant     | ✗           | ✓                 | ✓          | ✓              |
| mixtral-8x7b-32768       | ✗           | ✓                 | ✓          | ✓              |
| gemma2-9b-it             | ✗           | ✓                 | ✓          | ✓              |

The table above lists popular models. You can also pass any available provider model ID as a string if needed.
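As a hedged sketch of object generation with one of the models above, the following example assumes zod is installed for defining the schema and that the chosen model supports object generation on your account:

import { createOpenAI } from '@ai-sdk/openai';
import { generateObject } from 'ai';
import { z } from 'zod';

const groq = createOpenAI({
  baseURL: 'https://api.groq.com/openai/v1',
  apiKey: process.env.GROQ_API_KEY,
});

// Describe the desired output shape; the result is validated against this schema.
const { object } = await generateObject({
  model: groq('llama-3.1-70b-versatile'),
  schema: z.object({
    recipe: z.object({
      name: z.string(),
      ingredients: z.array(z.string()),
      steps: z.array(z.string()),
    }),
  }),
  prompt: 'Generate a vegetarian lasagna recipe for 4 people.',
});

The returned object is typed according to the schema, so the recipe fields can be accessed directly.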