Streaming Data

Depending on your use case, you may want to stream additional data alongside the model's response. This can be achieved with StreamData.

What is StreamData

The StreamData class allows you to stream arbitrary data to the client alongside your LLM response. This can be particularly useful in applications that need to augment AI responses with metadata, auxiliary information, or custom data structures that are relevant to the ongoing interaction.

How To Use StreamData

To use StreamData, create a StreamData instance on the server, append data to it, and include it in the response via toAIStreamResponse.

You must call close() on the StreamData object to ensure the data is fully sent to the client. The best place to do this is in the onFinish callback of streamText.

On the client, the useChat hook returns data, which will contain the additional data.

On the server

While this example uses Next.js (App Router), StreamData is compatible with any framework.

```ts
import { openai } from '@ai-sdk/openai';
import { streamText, StreamData } from 'ai';

// Allow streaming responses up to 30 seconds
export const maxDuration = 30;

export async function POST(req: Request) {
  // Extract the `messages` from the body of the request
  const { messages } = await req.json();

  // Create a new StreamData instance
  const data = new StreamData();

  // Append additional data
  data.append({ test: 'value' });

  // Call the language model
  const result = await streamText({
    model: openai('gpt-4-turbo'),
    onFinish() {
      // Close the StreamData so the appended data is flushed to the client
      data.close();
    },
    messages,
  });

  // Respond with the stream and the additional StreamData
  return result.toAIStreamResponse({ data });
}
```

On the client

On the client, you can destructure data from the useChat hook; it collects all StreamData entries in an array of type JSONValue[].

```ts
const { data } = useChat();
```
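Because data is a flat JSONValue[] containing everything appended on the server, you will typically want to narrow each entry before rendering it. The helper below is a minimal sketch, not part of the SDK: extractTestValues is a hypothetical name, and JSONValue is inlined here (mirroring the shape of the type exported by the ai package) so the snippet is self-contained. It pulls out the test fields from entries like the { test: 'value' } object appended in the server example above.

```typescript
// Mirrors the shape of the `JSONValue` type from the `ai` package,
// inlined so this snippet stands alone.
type JSONValue =
  | null
  | string
  | number
  | boolean
  | JSONValue[]
  | { [key: string]: JSONValue };

// Hypothetical helper: collect the string `test` fields from the
// StreamData array, skipping entries of any other shape.
function extractTestValues(data: JSONValue[] | undefined): string[] {
  if (!data) return [];
  return data.flatMap((item) =>
    typeof item === 'object' &&
    item !== null &&
    !Array.isArray(item) &&
    typeof item.test === 'string'
      ? [item.test]
      : [],
  );
}
```

Note that `data` may be undefined before the first StreamData entry arrives, so the helper accepts and handles that case.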

Future versions of the AI SDK will support each Message having a data object attached to it.