LangChain is a framework for developing applications powered by language models. It provides tools and abstractions for working with AI models, agents, vector stores, and other data sources for retrieval-augmented generation (RAG). However, LangChain does not provide a built-in way to build UIs or a standard way to stream data to the client.

Example: Completion

Here is a basic example that uses both the Vercel AI SDK and LangChain together with the Next.js App Router.

The AI SDK LangChainAdapter uses the result from LangChain Expression Language streaming to pipe text to the client. LangChainAdapter.toAIStream() is compatible with the stream returned by the LangChain Expression Language .stream() method.

```tsx
import { ChatOpenAI } from '@langchain/openai';
import { LangChainAdapter, StreamingTextResponse } from 'ai';

export const dynamic = 'force-dynamic';
export const maxDuration = 60;

export async function POST(req: Request) {
  const { prompt } = await req.json();

  const model = new ChatOpenAI({
    model: 'gpt-3.5-turbo-0125',
    temperature: 0,
  });

  const stream = await model.stream(prompt);
  const aiStream = LangChainAdapter.toAIStream(stream);

  return new StreamingTextResponse(aiStream);
}
```
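Under the hood, an adapter like LangChainAdapter.toAIStream() consumes the async iterable that .stream() returns and forwards each chunk's text content. A dependency-free sketch of that consumption (the Chunk shape and helper names here are illustrative, not part of either library's API):

```typescript
// Minimal stand-in for the message chunks LangChain streams back.
type Chunk = { content: string };

// Mimics the async iterable returned by model.stream(prompt).
async function* fakeLangChainStream(): AsyncGenerator<Chunk> {
  for (const piece of ['Hello', ', ', 'world!']) {
    yield { content: piece };
  }
}

// Collect an async iterable of chunks into the full completion text,
// which is essentially what the client sees once streaming finishes.
async function collectText(stream: AsyncIterable<Chunk>): Promise<string> {
  let text = '';
  for await (const chunk of stream) {
    text += chunk.content;
  }
  return text;
}
```

Anything that yields chunks of this kind, including a full LCEL chain built with .pipe(), can be passed to the adapter in the same way as the model stream above.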

Then, we use the Vercel AI SDK's useCompletion hook in the page component to handle the completion:

```tsx
'use client';

import { useCompletion } from 'ai/react';

export default function Chat() {
  const { completion, input, handleInputChange, handleSubmit } =
    useCompletion();

  return (
    <form onSubmit={handleSubmit}>
      <input value={input} onChange={handleInputChange} />
      <div>{completion}</div>
    </form>
  );
}
```

More Examples

You can find additional examples in the examples/next-langchain folder of the Vercel AI SDK repository.