LlamaIndex

LlamaIndex is a framework for building LLM-powered applications. LlamaIndex helps you ingest, structure, and access private or domain-specific data. LlamaIndex.TS offers the core features of LlamaIndex for Python for popular runtimes like Node.js (official support), Vercel Edge Functions (experimental), and Deno (experimental).

Example: Completion

Here is a basic example that uses the Vercel AI SDK and LlamaIndex together with the Next.js App Router.

The AI SDK's LlamaIndexAdapter takes the stream returned by calling the chat method on a LlamaIndex ChatEngine (or the query method on a LlamaIndex QueryEngine) and pipes the text to the client.
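Conceptually, the adapter drains an async iterable of response chunks and forwards each chunk's delta text to the client. A minimal sketch of that shape, with a simplified chunk type for illustration (the real stream yields LlamaIndex EngineResponse objects, and the adapter also wraps the output in a data-stream HTTP response):

```typescript
// Simplified chunk shape: LlamaIndex streaming engines yield objects
// whose `delta` property carries the newly generated text.
type Chunk = { delta: string };

// A stand-in for a streaming ChatEngine response.
async function* fakeStream(parts: string[]): AsyncIterable<Chunk> {
  for (const delta of parts) {
    yield { delta };
  }
}

// The core of what an adapter does: iterate the stream and
// concatenate the deltas into the full completion text.
async function collectText(stream: AsyncIterable<Chunk>): Promise<string> {
  let text = '';
  for await (const chunk of stream) {
    text += chunk.delta;
  }
  return text;
}

// Example usage:
collectText(fakeStream(['Hello', ', ', 'world'])).then(console.log);
// → "Hello, world"
```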

app/api/completion/route.ts
import { OpenAI, SimpleChatEngine } from 'llamaindex';
import { LlamaIndexAdapter } from 'ai';

// Allow streaming responses up to 60 seconds
export const maxDuration = 60;

export async function POST(req: Request) {
  const { prompt } = await req.json();

  const llm = new OpenAI({ model: 'gpt-4o' });
  const chatEngine = new SimpleChatEngine({ llm });

  // chat() with stream: true returns an async iterable of response chunks
  const stream = await chatEngine.chat({
    message: prompt,
    stream: true,
  });

  return LlamaIndexAdapter.toDataStreamResponse(stream);
}
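The same adapter works with a QueryEngine. Below is a hedged sketch of such a route; the document text in it is a placeholder, and in a real app you would index your own data (Document, VectorStoreIndex, and asQueryEngine are from the llamaindex package, though exact options may differ across versions):

```typescript
import { Document, VectorStoreIndex } from 'llamaindex';
import { LlamaIndexAdapter } from 'ai';

export async function POST(req: Request) {
  const { prompt } = await req.json();

  // Build a small in-memory index; real apps would load
  // and persist their own documents instead.
  const index = await VectorStoreIndex.fromDocuments([
    new Document({ text: 'Example corpus text goes here.' }),
  ]);

  const queryEngine = index.asQueryEngine();

  // query() with stream: true returns an async iterable of response chunks
  const stream = await queryEngine.query({
    query: prompt,
    stream: true,
  });

  return LlamaIndexAdapter.toDataStreamResponse(stream);
}
```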

Then, we use the Vercel AI SDK's useCompletion hook in the page component to handle the completion:

app/page.tsx
'use client';

import { useCompletion } from 'ai/react';

export default function Chat() {
  const { completion, input, handleInputChange, handleSubmit } =
    useCompletion();

  return (
    <div>
      {completion}
      <form onSubmit={handleSubmit}>
        <input value={input} onChange={handleInputChange} />
      </form>
    </div>
  );
}

More Examples

create-llama is the easiest way to get started with LlamaIndex. It uses the Vercel AI SDK to connect to LlamaIndex in all its generated code.
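To scaffold a new project with it, run the generator via npx (the interactive prompts and flags may vary by version):

```shell
npx create-llama@latest
```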