
ai/prompts

The ai/prompts module contains functions that convert Message objects into prompt strings for use with the useChat and useCompletion hooks.

💡 The experimental_ prefix on the functions in this module indicates that the API is not yet stable and may change in the future without a major version bump.

experimental_buildOpenAssistantPrompt

Uses the <|prompter|>, <|endoftext|>, and <|assistant|> tokens. If a Message with an unsupported role is passed, an error is thrown.

route.ts
import { HfInference } from '@huggingface/inference';
import { experimental_buildOpenAssistantPrompt } from 'ai/prompts';

// Create a Hugging Face Inference client (reads the API key from the environment).
const Hf = new HfInference(process.env.HUGGINGFACE_API_KEY);

const { messages } = await req.json();
const response = Hf.textGenerationStream({
  model: 'OpenAssistant/oasst-sft-4-pythia-12b-epoch-3.5',
  inputs: experimental_buildOpenAssistantPrompt(messages),
});
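As an illustrative sketch (not the library's actual implementation), a builder of this shape maps each role to its token pair and leaves an open assistant token at the end for the model to complete:

```typescript
// Hypothetical sketch of an OpenAssistant-style prompt builder;
// the Message shape mirrors the one exported by the ai package.
type Message = { role: string; content: string };

function buildOpenAssistantPrompt(messages: Message[]): string {
  return (
    messages
      .map(({ role, content }) => {
        if (role === 'user') return `<|prompter|>${content}<|endoftext|>`;
        if (role === 'assistant') return `<|assistant|>${content}<|endoftext|>`;
        // Mirror the documented behavior: unsupported roles throw.
        throw new Error(`Unsupported message role: ${role}`);
      })
      .join('') + '<|assistant|>' // leave the assistant turn open for generation
  );
}
```

The same pattern applies to experimental_buildStarChatBetaPrompt below, with different tokens.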

experimental_buildStarChatBetaPrompt

Uses the <|user|>, <|end|>, <|system|>, and <|assistant|> tokens. If a Message with an unsupported role is passed, an error is thrown.

route.ts
import { HfInference } from '@huggingface/inference';
import { experimental_buildStarChatBetaPrompt } from 'ai/prompts';

// Create a Hugging Face Inference client (reads the API key from the environment).
const Hf = new HfInference(process.env.HUGGINGFACE_API_KEY);

const { messages } = await req.json();
const response = Hf.textGenerationStream({
  model: 'HuggingFaceH4/starchat-beta',
  inputs: experimental_buildStarChatBetaPrompt(messages),
});

experimental_buildLlama2Prompt

Uses Llama 2 chat tokens ([INST]) to build the prompt; see the Hugging Face blog post on how to prompt Llama 2 for details on the format. If a Message with an unsupported role is passed, an error is thrown.

route.ts
import { HfInference } from '@huggingface/inference';
import { experimental_buildLlama2Prompt } from 'ai/prompts';

// Create a Hugging Face Inference client (reads the API key from the environment).
const Hf = new HfInference(process.env.HUGGINGFACE_API_KEY);

const { messages } = await req.json();
const response = Hf.textGenerationStream({
  model: 'meta-llama/Llama-2-7b-chat-hf',
  inputs: experimental_buildLlama2Prompt(messages),
});
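For reference, the Llama 2 chat format wraps each user turn in [INST] ... [/INST], and assistant replies close the preceding sequence with </s>. A simplified sketch (assuming alternating user/assistant turns and omitting the <<SYS>> system-prompt handling):

```typescript
type Message = { role: string; content: string };

// Simplified sketch of the Llama 2 chat format: each user turn is wrapped
// in [INST] ... [/INST]; an assistant reply closes the previous sequence.
function buildLlama2Prompt(messages: Message[]): string {
  let prompt = '';
  for (const { role, content } of messages) {
    if (role === 'user') {
      prompt += `<s>[INST] ${content} [/INST]`;
    } else if (role === 'assistant') {
      prompt += ` ${content} </s>`;
    } else {
      throw new Error(`Unsupported message role: ${role}`);
    }
  }
  return prompt;
}
```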

experimental_buildOpenAIMessages

Validates Vercel AI messages and converts them into the format required by OpenAI. This provides additional checks and validation compared to using TypeScript's any type or casting directly to the OpenAI message interface.

route.ts
import OpenAI from 'openai';
import { experimental_buildOpenAIMessages } from 'ai/prompts';

// Create an OpenAI client (reads the API key from the environment).
const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

const { messages } = await req.json();

const response = await openai.chat.completions.create({
  model: 'gpt-3.5-turbo',
  stream: true,
  messages: experimental_buildOpenAIMessages(messages),
  // `functions` is the route's array of function definitions, declared elsewhere.
  functions,
});
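A minimal sketch of the kind of validation this enables (hypothetical, not the library's code): check each message's role against the set OpenAI accepts so a malformed message fails loudly before the API call, instead of being silently passed through by an any cast.

```typescript
type Message = { role: string; content: string; name?: string };

// Roles accepted by the OpenAI chat completions API.
const OPENAI_ROLES = new Set(['system', 'user', 'assistant', 'function']);

// Hypothetical sketch: validate up front rather than casting with `any`.
function toOpenAIMessages(messages: Message[]) {
  for (const m of messages) {
    if (!OPENAI_ROLES.has(m.role)) {
      throw new Error(`Unsupported message role: ${m.role}`);
    }
    if (m.role === 'function' && !m.name) {
      throw new Error('Function messages must include a name');
    }
  }
  return messages.map(({ role, content, name }) =>
    name ? { role, content, name } : { role, content },
  );
}
```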

experimental_buildAnthropicPrompt

Constructs a prompt in the Human:/Assistant: format required by Anthropic's Completions API.

route.ts
import Anthropic from '@anthropic-ai/sdk';
import { experimental_buildAnthropicPrompt } from 'ai/prompts';

// Create an Anthropic client (reads the API key from the environment).
const anthropic = new Anthropic({ apiKey: process.env.ANTHROPIC_API_KEY });

const { messages } = await req.json();

const response = await anthropic.completions.create({
  model: 'claude-2',
  prompt: experimental_buildAnthropicPrompt(messages),
  stream: true,
  max_tokens_to_sample: 300,
});
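As an illustrative sketch (not the library's source), the classic Anthropic completions format prefixes turns with "\n\nHuman:" and "\n\nAssistant:" and ends with an open "\n\nAssistant:" turn for the model to complete:

```typescript
type Message = { role: string; content: string };

// Sketch of the Anthropic completions prompt format: alternating
// Human/Assistant turns, ending with an open Assistant turn.
function buildAnthropicPrompt(messages: Message[]): string {
  return (
    messages
      .map(({ role, content }) =>
        role === 'user' ? `\n\nHuman: ${content}` : `\n\nAssistant: ${content}`,
      )
      .join('') + '\n\nAssistant:'
  );
}
```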
