streamUI
A helper function to create a streamable UI from LLM providers. This function is similar to AI SDK Core APIs and supports the same model interfaces.
Import
import { streamUI } from "ai/rsc"
Object Parameter
model:
The language model to use. Example: openai('gpt-4-turbo')
initial?:
The initial UI to render.
system:
The system prompt to use that specifies the behavior of the model.
prompt:
The input prompt to generate the text from.
messages:
A list of messages that represent a conversation.
UserMessage
role:
The role for the user message.
content:
The content of the message.
TextPart
type:
The type of the message part.
text:
The text content of the message part.
ImagePart
type:
The type of the message part.
image:
The image content of the message part.
AssistantMessage
role:
The role for the assistant message.
content:
The content of the message.
TextPart
type:
The type of the message part.
text:
The text content of the message part.
ToolCallPart
type:
The type of the message part.
toolCallId:
The id of the tool call.
toolName:
The name of the tool, which typically would be the name of the function.
args:
Parameters generated by the model to be used by the tool.
ToolMessage
role:
The role for the tool message.
content:
The content of the message.
ToolResultPart
type:
The type of the message part.
toolCallId:
The id of the tool call the result corresponds to.
toolName:
The name of the tool the result corresponds to.
result:
The result returned by the tool after execution.
isError?:
Whether the result is an error or an error message.
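Putting the message and part shapes above together, a conversation with one tool round trip might look like the following sketch (the tool name, call id, and arguments are illustrative):

```typescript
// An example `messages` array exercising the part types described above.
const messages = [
  {
    role: "user",
    content: [{ type: "text", text: "What is the weather in Paris?" }],
  },
  {
    // The model requested a tool call.
    role: "assistant",
    content: [
      {
        type: "tool-call",
        toolCallId: "call_1", // illustrative id
        toolName: "getWeather", // illustrative tool name
        args: { city: "Paris" },
      },
    ],
  },
  {
    // The tool's result, matched to the call by toolCallId.
    role: "tool",
    content: [
      {
        type: "tool-result",
        toolCallId: "call_1",
        toolName: "getWeather",
        result: { temperatureC: 18, condition: "cloudy" },
      },
    ],
  },
] as const;
```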
maxTokens?:
Maximum number of tokens to generate.
temperature?:
A number between 0 and 1 that affects the randomness of the model.
topP?:
A number between 0 and 1 for nucleus sampling: only tokens within the top p probability mass are considered.
presencePenalty?:
A number between -1 and 1 that affects how likely the model is to repeat information already present in the prompt.
frequencyPenalty?:
A number between -1 and 1 that affects how likely the model is to reuse the same words or phrases.
seed?:
The integer seed to use for random sampling. If set and supported by the model, calls will produce deterministic results.
maxRetries?:
The maximum number of retries to attempt.
abortSignal?:
An optional abort signal that can be used to cancel an in-progress call.
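The optional call settings above are plain fields on the streamUI call. A sketch of such a settings object (all values are illustrative; typically you would tune either temperature or topP, not both):

```typescript
// Illustrative generation settings; ranges follow the descriptions above.
const settings = {
  maxTokens: 500,        // cap on generated tokens
  temperature: 0.7,      // higher = more random
  topP: 0.9,             // nucleus sampling threshold
  presencePenalty: 0.5,  // discourage repeating prompt information
  frequencyPenalty: 0.5, // discourage reusing the same words
  seed: 42,              // deterministic sampling where supported
  maxRetries: 2,         // retry failed calls up to twice
  abortSignal: AbortSignal.timeout(30_000), // cancel after 30 seconds
};
```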
tools:
Tools that are accessible to and can be called by the model.
Tool
description?:
Information about the purpose of the tool including details on how and when it can be used by the model.
parameters:
The typed schema that describes the parameters of the tool, which is also used for validation and error handling.
generate?:
A function or a generator function that is called with the arguments from the tool call and yields React nodes as the UI.
text?:
Callback to handle the generated tokens from the model.
Text
content:
The full content of the completion.
delta:
The text delta appended in this chunk.
done:
Whether the model has finished generating.
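A sketch of a text callback built from the fields above. In streamUI the callback returns a React node; a plain string is a valid ReactNode, so the example below stays free of JSX:

```typescript
// The payload passed to the `text` callback on each streamed chunk.
type TextPayload = { content: string; delta: string; done: boolean };

const text = ({ content, delta, done }: TextPayload) => {
  // `content` is the full completion so far, `delta` the newest chunk,
  // and `done` flips to true on the final call.
  return done ? content : `${content}...`; // trailing dots while streaming
};
```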
Returns
streamUI resolves to a result object whose value property contains the generated UI, which can be any valid ReactNode.