Generate Text with Chat Prompt
Previously, we were able to generate text and objects using either a single message prompt, a system prompt, or a combination of both. However, there may be times when you want to generate text based on a series of messages.
A chat completion allows you to generate text based on a series of messages. These messages can represent any sequence of interactions between any number of systems, but the most common and relatable use case is a conversation between a user and a model.
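For example, such a conversation history can be represented as a plain array of role/content message objects. This is a minimal illustrative sketch; the variable name and message contents are made up, and the shape mirrors the `Message` interface defined later in this guide:

```ts
// Illustrative conversation history: each entry has a role and content.
const conversation = [
  { role: 'user', content: 'What is the capital of France?' },
  { role: 'assistant', content: 'The capital of France is Paris.' },
  { role: 'user', content: 'And roughly how many people live there?' },
];
```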
Client
Let's create a simple conversation between a user and a model, and place a button that will call `continueConversation`.
```tsx
'use client';

import { useState } from 'react';
import { Message, continueConversation } from './actions';

// Allow streaming responses up to 30 seconds
export const maxDuration = 30;

export default function Home() {
  const [conversation, setConversation] = useState<Message[]>([]);
  const [input, setInput] = useState<string>('');

  return (
    <div>
      <div>
        {conversation.map((message, index) => (
          <div key={index}>
            {message.role}: {message.content}
          </div>
        ))}
      </div>

      <div>
        <input
          type="text"
          value={input}
          onChange={event => {
            setInput(event.target.value);
          }}
        />
        <button
          onClick={async () => {
            const { messages } = await continueConversation([
              ...conversation,
              { role: 'user', content: input },
            ]);

            setConversation(messages);
          }}
        >
          Send Message
        </button>
      </div>
    </div>
  );
}
```
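Each time the button is clicked, the current input is appended to the conversation as a new user message, the combined history is sent to the `continueConversation` server action, and the returned messages replace the local conversation state.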
Server
Now, let's implement the `continueConversation` function that will insert the user's message into the conversation and generate a response.
```ts
'use server';

import { generateText } from 'ai';
import { openai } from '@ai-sdk/openai';

export interface Message {
  role: 'user' | 'assistant';
  content: string;
}

export async function continueConversation(history: Message[]) {
  'use server';

  const { text } = await generateText({
    model: openai('gpt-3.5-turbo'),
    system: 'You are a friendly assistant!',
    messages: history,
  });

  return {
    messages: [
      ...history,
      {
        role: 'assistant' as const,
        content: text,
      },
    ],
  };
}
```
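Note that `continueConversation` returns the full message history, including the newly generated assistant reply, so the client can simply replace its conversation state with the returned `messages` array.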