LlamaIndexAdapter
The LlamaIndexAdapter module provides helper functions to transform LlamaIndex output streams into data streams and data stream responses.
See the LlamaIndex Adapter documentation for more information.
It supports:
- LlamaIndex ChatEngine streams
- LlamaIndex QueryEngine streams
Import
import { LlamaIndexAdapter } from "ai"
API Signature
Methods
toDataStream:
(stream: AsyncIterable<EngineResponse>, callbacks?: AIStreamCallbacksAndOptions) => AIStream
Converts LlamaIndex output streams to a data stream.
toDataStreamResponse:
(stream: AsyncIterable<EngineResponse>, options?: {init?: ResponseInit, data?: StreamData, callbacks?: AIStreamCallbacksAndOptions}) => Response
Converts LlamaIndex output streams to a data stream response.
mergeIntoDataStream:
(stream: AsyncIterable<EngineResponse>, options: { dataStream: DataStreamWriter; callbacks?: StreamCallbacks }) => void
Merges LlamaIndex output streams into an existing data stream (see the sketch below).
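The dataStream option is a DataStreamWriter, which you typically obtain from the AI SDK's createDataStreamResponse helper. The following is a minimal sketch, assuming createDataStreamResponse and DataStreamWriter are exported by the ai package in your version; the writeData payload is purely illustrative.

```tsx
import { OpenAI, SimpleChatEngine } from 'llamaindex';
import { createDataStreamResponse, LlamaIndexAdapter } from 'ai';

export async function POST(req: Request) {
  const { prompt } = await req.json();
  const chatEngine = new SimpleChatEngine({ llm: new OpenAI({ model: 'gpt-4o' }) });

  return createDataStreamResponse({
    execute: async dataStream => {
      // Illustrative extra data written to the same stream as the LLM output.
      dataStream.writeData({ status: 'stream-start' });

      const stream = await chatEngine.chat({ message: prompt, stream: true });

      // Merge the LlamaIndex stream into the data stream that is being written.
      LlamaIndexAdapter.mergeIntoDataStream(stream, { dataStream });
    },
  });
}
```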
Examples
Convert LlamaIndex ChatEngine Stream
app/api/completion/route.ts
```tsx
import { OpenAI, SimpleChatEngine } from 'llamaindex';
import { LlamaIndexAdapter } from 'ai';

export async function POST(req: Request) {
  const { prompt } = await req.json();

  const llm = new OpenAI({ model: 'gpt-4o' });
  const chatEngine = new SimpleChatEngine({ llm });

  const stream = await chatEngine.chat({
    message: prompt,
    stream: true,
  });

  return LlamaIndexAdapter.toDataStreamResponse(stream);
}
```
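Convert LlamaIndex QueryEngine Stream
A hedged sketch of the QueryEngine case: it assumes a LlamaIndex.TS setup where VectorStoreIndex.fromDocuments and asQueryEngine are available, and that query({ query, stream: true }) yields an async iterable of EngineResponse; the inline Document text is placeholder content.
app/api/completion/route.ts
```tsx
import { Document, VectorStoreIndex } from 'llamaindex';
import { LlamaIndexAdapter } from 'ai';

export async function POST(req: Request) {
  const { prompt } = await req.json();

  // Build a small in-memory index; a real app would load its own documents.
  const index = await VectorStoreIndex.fromDocuments([
    new Document({ text: 'The AI SDK provides adapters for LlamaIndex streams.' }),
  ]);
  const queryEngine = index.asQueryEngine();

  const stream = await queryEngine.query({
    query: prompt,
    stream: true,
  });

  return LlamaIndexAdapter.toDataStreamResponse(stream);
}
```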