
Streaming Data

Depending on your use case, you may want to stream additional data alongside the model's response. This can be achieved with StreamData.

What is StreamData

The StreamData class allows you to stream arbitrary data to the client alongside your LLM response. This can be particularly useful in applications that need to augment AI responses with metadata, auxiliary information, or custom data structures that are relevant to the ongoing interaction.
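To make this concrete, the server interleaves text chunks and StreamData payloads as typed parts on a single HTTP response, and the client splits them back apart. The parser below is purely illustrative (it is not the SDK's internal implementation), assuming a simple `prefix:json` framing where one prefix marks text and another marks appended data.

```typescript
// Illustrative sketch of de-multiplexing a combined stream.
// This is NOT the SDK's internal code; the framing here is an assumption
// made for demonstration: each line is "<code>:<JSON payload>".
function parseStreamPart(line: string): { type: 'text' | 'data'; value: unknown } {
  const colon = line.indexOf(':');
  const code = line.slice(0, colon);
  const payload = JSON.parse(line.slice(colon + 1));
  // "0" carries model text, anything else carries appended data in this sketch
  return code === '0'
    ? { type: 'text', value: payload }
    : { type: 'data', value: payload };
}

const frames = ['0:"Hello"', '2:[{"test":"value"}]', '0:" world"'];
const parts = frames.map(parseStreamPart);
// Text parts reassemble into the model output; data parts populate
// the `data` value that useChat exposes on the client.
```

The key point is that the auxiliary data travels in-band with the LLM response, so the client receives both over one connection without a second request.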

How To Use StreamData

To use StreamData, create a StreamData value on the server, append the data you want to send, and then return it alongside the model response with StreamingTextResponse. On the client, the useChat hook returns data, which will contain the additional data.

On the server

While this example uses Next.js (App Router), StreamData is compatible with any framework.

import { openai } from '@ai-sdk/openai';
import { StreamingTextResponse, streamText, StreamData } from 'ai';

export const dynamic = 'force-dynamic';

export async function POST(req: Request) {
  // Extract the `messages` from the body of the request
  const { messages } = await req.json();

  // Call the language model
  const result = await streamText({
    model: openai('gpt-4-turbo'),
    messages,
  });

  // Create a new StreamData
  const data = new StreamData();

  // Append additional data
  data.append({ test: 'value' });

  // Convert the response into a friendly text-stream,
  // closing the StreamData once the LLM response is complete
  const stream = result.toAIStream({
    onFinal(_) {
      data.close();
    },
  });

  // Respond with the stream and additional StreamData
  return new StreamingTextResponse(stream, {}, data);
}

On the client

On the client, you can destructure data from the useChat hook which stores all StreamData in an array. The data is of the type JSONValue[].

const { data } = useChat();
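As a minimal sketch of what this array looks like, assume the server called data.append twice; the second payload ({ progress: 100 }) is hypothetical and not part of the example above.

```typescript
// Illustrative shape of the `data` value exposed by useChat after the
// server appended two payloads. The second payload is a hypothetical
// example, not something the server route above actually sends.
type JSONValue =
  | null
  | string
  | number
  | boolean
  | { [key: string]: JSONValue }
  | JSONValue[];

const data: JSONValue[] = [{ test: 'value' }, { progress: 100 }];

// Each server-side append becomes one entry on the client, in order,
// so the most recent payload is the last element.
const latest = data[data.length - 1];
```

Because data is a plain JSONValue[], you can render it directly (for example with JSON.stringify) or pick out the entries relevant to the current interaction.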

Future versions of the AI SDK will support each Message having a data object attached to it.