Generative UI
The introduction of function calling in language models like GPT-3.5 and later has paved the way for tailored, interactive user interfaces that the language model decides to render.
Leveraging React Server Components and Server Actions, the AI SDK seamlessly integrates interface-rendering capabilities through the ai/rsc package.
You can also use models that do not support function calling with ai/rsc by emulating structured outputs such as JSON, or by streaming their outputs directly.
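For instance, a model without native function calling can be prompted to reply only in JSON, which you then parse and validate before deciding what to render. The sketch below is illustrative and not part of the SDK; the reply string and the expected `{ tool, args }` shape are assumptions:

```typescript
// A hypothetical reply from a model prompted to answer in JSON only.
const reply =
  'Sure! {"tool": "get_flight_info", "args": {"flightNumber": "BA142"}}';

// Pull the first JSON object out of a free-form model reply and parse it.
// Returns null when no valid JSON is found, so callers can fall back to
// treating the reply as plain text.
function extractToolCall(
  text: string,
): { tool: string; args: Record<string, unknown> } | null {
  const match = text.match(/\{[\s\S]*\}/);
  if (!match) return null;
  try {
    const parsed = JSON.parse(match[0]);
    if (typeof parsed.tool === 'string' && typeof parsed.args === 'object') {
      return parsed;
    }
    return null;
  } catch {
    return null;
  }
}

const call = extractToolCall(reply);
```

Once parsed, the tool name and arguments can drive the same component rendering you would otherwise perform from a native tool call.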
Examples
Search
Let your users see more than words can say by rendering components directly within your search experience.
Task Planning
Make it easier for your users to interpret agent execution so they can stay in the loop with the magic behind the scenes.
AI and UI State
The AI SDK introduces two new concepts: AIState and UIState. These states create a clear separation of concerns between server-side AI operations and the client-side UI rendered in the application. This separation allows developers to securely maintain AI state, which may include your system prompt or other metadata, while UI state is designed so that React Server Components can be efficiently streamed to the client.
AIState
AIState is a JSON representation of all the context the LLM needs to read. For a chat app, AIState generally stores the textual conversation history between the user and the assistant. In practice, it can also be used to store other values and metadata, such as the createdAt timestamp of each message. By default, AIState can be accessed and modified on both the server and the client.
UIState
UIState is what the application uses to display the UI. It is fully client-side state (very similar to useState) and can hold data and UI elements returned by the LLM. This state can be anything, but it cannot be accessed on the server.
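As a rough sketch, the two states for a chat app might be typed as follows; the shapes mirror the initialAIState and initialUIState definitions later in this guide, with a placeholder standing in for React.ReactNode to keep the sketch self-contained:

```typescript
// Server-side context for the LLM: plain, serializable JSON.
type AIStateItem = {
  role: 'user' | 'assistant' | 'system' | 'function';
  content: string;
  id?: string;
  name?: string;
};

// Client-side display state: message IDs paired with rendered UI nodes.
// In the app this would be React.ReactNode; `unknown` keeps the sketch standalone.
type UINode = unknown;
type UIStateItem = {
  id: number;
  display: UINode;
};

const aiState: AIStateItem[] = [{ role: 'user', content: 'Hello!' }];
const uiState: UIStateItem[] = [{ id: 1, display: 'Hello!' }];
```

The key property is that AIState is serializable JSON the server can read, while UIState holds rendered nodes that only the client touches.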
Setup
Build your app
Prerequisites
Before you start, make sure you have the following:
- Node.js 18+ installed on your local development machine.
- An OpenAI API key.
If you haven't obtained your OpenAI API key yet, you can do so by signing up on the OpenAI website.
Install Next.js
We'll start by creating a new Next.js application. This command creates a new directory named my-ai-app and sets up a basic Next.js application inside it.
pnpm dlx create-next-app@canary my-ai-app
Navigate to the newly created directory:
cd my-ai-app
Install Dependencies
Next, we'll install ai and openai, OpenAI's official JavaScript SDK, which is compatible with the Vercel Edge Runtime, along with zod for describing tool parameters as schemas.
pnpm install ai openai zod
Configure OpenAI API Key
Create a .env.local file in your project root and add your OpenAI API key. This key is used to authenticate your application with the OpenAI service.
touch .env.local
Edit the .env.local file:
OPENAI_API_KEY=xxxxxxxxx
Replace xxxxxxxxx with your actual OpenAI API key.
Create an ai/rsc instance on the server
import { OpenAI } from "openai";
import { createAI, getMutableAIState, render } from "ai/rsc";
import { z } from "zod";
const openai = new OpenAI({
apiKey: process.env.OPENAI_API_KEY,
});
// An example of a spinner component. You can also import your own components,
// or 3rd party component libraries.
function Spinner() {
return <div>Loading...</div>;
}
// An example of a flight card component.
function FlightCard({ flightInfo }) {
return (
<div>
<h2>Flight Information</h2>
<p>Flight Number: {flightInfo.flightNumber}</p>
<p>Departure: {flightInfo.departure}</p>
<p>Arrival: {flightInfo.arrival}</p>
</div>
);
}
// An example of a function that fetches flight information from an external API.
async function getFlightInfo(flightNumber: string) {
return {
flightNumber,
departure: 'New York',
arrival: 'San Francisco',
};
}
async function submitUserMessage(userInput: string) {
'use server';
const aiState = getMutableAIState<typeof AI>();
// Update the AI state with the new user message.
aiState.update([
...aiState.get(),
{
role: 'user',
content: userInput,
},
]);
// `render()` creates a generated, streamable UI.
const ui = render({
model: 'gpt-4-0125-preview',
provider: openai,
messages: [
{ role: 'system', content: 'You are a flight assistant' },
...aiState.get()
],
// `text` is called when an AI returns a text response (as opposed to a tool call).
// Its content is streamed from the LLM, so this function will be called
// multiple times with `content` being incremental.
text: ({ content, done }) => {
// When it's the final content, mark the state as done and ready for the client to access.
if (done) {
aiState.done([
...aiState.get(),
{
role: "assistant",
content
}
]);
}
return <p>{content}</p>
},
tools: {
get_flight_info: {
description: 'Get the information for a flight',
parameters: z.object({
flightNumber: z.string().describe('the number of the flight')
}).required(),
render: async function* ({ flightNumber }) {
// Show a spinner on the client while we wait for the response.
yield <Spinner/>
// Fetch the flight information from an external API.
const flightInfo = await getFlightInfo(flightNumber)
// Update the final AI state.
aiState.done([
...aiState.get(),
{
role: "function",
name: "get_flight_info",
// Content can be any string to provide context to the LLM in the rest of the conversation.
content: JSON.stringify(flightInfo),
}
]);
// Return the flight card to the client.
return <FlightCard flightInfo={flightInfo} />
}
}
}
})
return {
id: Date.now(),
display: ui
};
}
// Define the initial state of the AI. It can be any JSON object.
const initialAIState: {
role: 'user' | 'assistant' | 'system' | 'function';
content: string;
id?: string;
name?: string;
}[] = [];
// The initial UI state that the client will keep track of, which contains the message IDs and their UI nodes.
const initialUIState: {
id: number;
display: React.ReactNode;
}[] = [];
// AI is a provider you wrap your application with so you can access AI and UI state in your components.
export const AI = createAI({
actions: {
submitUserMessage
},
// Each state can be any shape of object, but for chat applications
// it makes sense to have an array of messages. Or you may prefer something like { id: number, messages: Message[] }
initialUIState,
initialAIState
});
Connect to the ai/rsc instance from the client
import type { Metadata } from "next";
import { Inter } from "next/font/google";
import { AI } from "./action";
import "./globals.css";
const inter = Inter({ subsets: ["latin"] });
export const metadata: Metadata = {
title: "Create Next App",
description: "Generated by create next app",
};
export default function RootLayout({
children,
}: Readonly<{
children: React.ReactNode;
}>) {
return (
<html lang="en">
<body className={inter.className}>
<AI>
{children}
</AI>
</body>
</html>
);
}
Send and receive messages
'use client'
import { useState } from 'react';
import { useUIState, useActions } from "ai/rsc";
import type { AI } from './action';
export default function Page() {
const [inputValue, setInputValue] = useState('');
const [messages, setMessages] = useUIState<typeof AI>();
const { submitUserMessage } = useActions<typeof AI>();
return (
<div>
{
// View messages in UI state
messages.map((message) => (
<div key={message.id}>
{message.display}
</div>
))
}
<form onSubmit={async (e) => {
e.preventDefault();
// Add user message to UI state
setMessages((currentMessages) => [
...currentMessages,
{
id: Date.now(),
display: <div>{inputValue}</div>,
},
]);
// Submit and get response message
const responseMessage = await submitUserMessage(inputValue);
setMessages((currentMessages) => [
...currentMessages,
responseMessage,
]);
setInputValue('');
}}>
<input
placeholder="Send a message..."
value={inputValue}
onChange={(event) => {
setInputValue(event.target.value)
}}
/>
</form>
</div>
)
}
Recipes
Requesting updates from the server from generated UI
In many cases, the interface you render will need to request updates from the server. For example, a weather app might need to request the latest weather data every 5 minutes, or a button press should execute a search. To accomplish this, you can pass Server Actions to your UI components and call them as needed.
For example, if you have a handleUserMessage action registered in createAI, you can define nested Server Actions like this:
async function handleUserMessage(userInput) {
'use server';
const card = createStreamableUI(<Spinner />);
// `createStreamableValue` (also from 'ai/rsc') streams extra metadata alongside the UI.
const tokenCounter = createStreamableValue(0);
async function getCityWeather() {
try {
card.update(
<>
Analyzing...
<WeatherCardSkeleton />
</>,
);
// Your customized LLM logic, e.g. tools API.
const res = await callLLM(
`Return the city name from the user input: ${userInput}`,
);
const temperature = await getCityTemperature(res.city);
card.done(
<>
Here's the weather of {res.city}:
<WeatherCard
city={res.city}
temperature={temperature}
refreshAction={async () => {
'use server';
return getCityTemperature(res.city);
}}
/>
</>,
);
} catch {
card.done(<ErrorCard />);
tokenCounter.done(0);
}
}
getCityWeather();
return {
startedAt: Date.now(),
ui: card.value, // Streamed UI value
tokens: tokenCounter.value, // Extra meta information.
};
}
Your WeatherCard component can then call the refreshAction function in an onClick event handler, in an effect, or via polling to load the latest weather data and update the client UI.
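As one illustration of the polling variant, the client can wrap the passed-in action in a small scheduler. The helper below is a generic sketch, not an SDK API:

```typescript
// Repeatedly invoke an async action (such as a Server Action passed down as a
// prop) and hand each result to a callback. Returns a function that stops the
// polling, e.g. from a React effect's cleanup.
function pollAction<T>(
  action: () => Promise<T>,
  onData: (data: T) => void,
  intervalMs: number,
): () => void {
  let stopped = false;

  async function tick() {
    if (stopped) return;
    onData(await action());
    if (!stopped) setTimeout(tick, intervalMs);
  }

  tick();
  return () => {
    stopped = true;
  };
}
```

Inside a component like WeatherCard, an effect could start pollAction(refreshAction, setTemperature, 300_000) and return the stop function as its cleanup.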
Updating AI State from Client Components
The AI state represents the context of the conversation provided to the language model, and is often an array of chat messages. Sometimes, such as when the user is filling out a form or interacting with a UI element like a slider, you need to update the language model's context with the new information. You can mutate the AI state on the client using the useAIState hook.
In the example below, we have a slider that allows the user to purchase a number of shares of a stock. When the user changes the slider, we update the AI state with the new information.
'use client';

import { useId, useState } from 'react';
import { useAIState } from 'ai/rsc';
import type { AI } from '../action';

export function Slider({ name, price }: { name: string; price: number }) {
  const [aiState, setAIState] = useAIState<typeof AI>();
  const [value, setValue] = useState(50);

  // A unique identifier we attach to each message so we can update it later.
  const id = useId();

  // Whenever the slider changes, update the local value state and the history,
  // so the LLM also knows what's going on without a new message being added
  // every time the slider changes.
  function onSliderChange(e: React.ChangeEvent<HTMLInputElement>) {
    const newValue = Number(e.target.value);
    setValue(newValue);

    const info = {
      role: 'assistant' as const,
      content: `[User has changed to purchase ${newValue} shares of ${name}. Total cost: $${(
        newValue * price
      ).toFixed(2)}]`,
      // Identifier of this UI component, so we don't insert it many times.
      id,
    };

    // If the last message was already inserted by us, update it. This avoids
    // adding every slider change to the AI state.
    if (aiState[aiState.length - 1]?.id === id) {
      setAIState([...aiState.slice(0, -1), info]);
    } else {
      // Otherwise, append it to the AI state.
      setAIState([...aiState, info]);
    }
  }

  return (
    <input
      type="range"
      value={value}
      onChange={onSliderChange}
      min={1}
      max={100}
    />
  );
}
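The update-or-append logic above can be isolated as a pure helper, which makes it easy to reuse and test:

```typescript
type StateMessage = { role: string; content: string; id?: string };

// Replace the last message if it carries the same component id; otherwise
// append. This keeps one entry per UI component in the AI state instead of
// one entry per interaction.
function upsertById(
  state: StateMessage[],
  message: StateMessage,
): StateMessage[] {
  if (
    message.id !== undefined &&
    state[state.length - 1]?.id === message.id
  ) {
    return [...state.slice(0, -1), message];
  }
  return [...state, message];
}
```

With this helper, the slider's onSliderChange body reduces to a single setAIState(upsertById(aiState, info)) call.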
Nested UI Streaming
One powerful feature of the AI SDK is the ability to stream UI components within other UI components. This allows you to create complex UIs built up from smaller, reusable components.
In the example below, we pass a historyChart streamable as a prop to a StockCard component. The StockCard can render the historyChart streamable, and it will automatically update as the server responds with new data.
// Create a streaming UI node. You can make as many as you need.
async function getStockHistoryChart() {
'use server';
const ui = createStreamableUI(<Spinner />);
// We need to wrap this in an async IIFE to avoid blocking. Without it, the UI wouldn't render
// while the fetch or LLM call are in progress.
(async () => {
const price = await callLLM('What is the current stock price of AAPL?');
// Show a spinner as the history chart for now.
// We won't be updating this again so we use `ui.done()` instead of `ui.update()`.
const historyChart = createStreamableUI(<Spinner />);
ui.done(<StockCard historyChart={historyChart.value} price={price} />);
// Fetch the history data, then update that part of the UI.
const historyData = await (await fetch('https://my-stock-data-api.com')).json();
historyChart.done(<HistoryChart data={historyData} />);
})();
return ui.value;
}
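Conceptually, a streamable is a value with update and done transitions that notify its subscribers. A minimal, framework-free approximation of those semantics (not the SDK's actual implementation) looks like this:

```typescript
// A tiny stand-in for createStreamableUI/createStreamableValue semantics:
// subscribers see every update, and done() delivers a final value and seals
// the streamable against further changes.
function createStreamable<T>(initial: T) {
  let current = initial;
  let closed = false;
  const listeners: Array<(value: T) => void> = [];

  return {
    get value() {
      return current;
    },
    subscribe(listener: (value: T) => void) {
      listeners.push(listener);
    },
    update(value: T) {
      if (closed) throw new Error('streamable is already done');
      current = value;
      listeners.forEach((listener) => listener(value));
    },
    done(value: T) {
      if (closed) throw new Error('streamable is already done');
      current = value;
      listeners.forEach((listener) => listener(value));
      closed = true;
    },
  };
}
```

Nesting, as in the stock example above, simply means one streamable's value (the history chart) is embedded in the value of another (the card), so each part can finish on its own schedule.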