Fastify

You can use the AI SDK in a Fastify server to generate and stream text and objects to the client.

Examples

The examples start a simple HTTP server that listens on port 8080. You can test it with curl, for example:

curl -X POST http://localhost:8080

The examples use the OpenAI gpt-4o model. Ensure that the OpenAI API key is set in the OPENAI_API_KEY environment variable.

Full example: github.com/vercel/ai/examples/fastify

Data Stream

You can use the toDataStream method to get a data stream from the result and then pipe it to the response.

index.ts
import { openai } from '@ai-sdk/openai';
import { streamText } from 'ai';
import Fastify from 'fastify';

const fastify = Fastify({ logger: true });

fastify.post('/', async function (request, reply) {
  const result = streamText({
    model: openai('gpt-4o'),
    prompt: 'Invent a new holiday and describe its traditions.',
  });

  // Mark the response as a v1 data stream:
  reply.header('X-Vercel-AI-Data-Stream', 'v1');
  reply.header('Content-Type', 'text/plain; charset=utf-8');

  return reply.send(result.toDataStream());
});

fastify.listen({ port: 8080 });

Data Stream With Stream Data

toDataStream can be used with StreamData to send additional data to the client.

index.ts
import { openai } from '@ai-sdk/openai';
import { StreamData, streamText } from 'ai';
import Fastify from 'fastify';

const fastify = Fastify({ logger: true });

fastify.post('/', async function (request, reply) {
  // use stream data (optional):
  const data = new StreamData();
  data.append('initialized call');

  const result = streamText({
    model: openai('gpt-4o'),
    prompt: 'Invent a new holiday and describe its traditions.',
    onFinish() {
      data.append('call completed');
      data.close();
    },
  });

  // Mark the response as a v1 data stream:
  reply.header('X-Vercel-AI-Data-Stream', 'v1');
  reply.header('Content-Type', 'text/plain; charset=utf-8');

  return reply.send(result.toDataStream({ data }));
});

fastify.listen({ port: 8080 });
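When debugging, it can help to see what the data stream looks like on the wire. Each part of the stream is a line of the form TYPE:JSON (for instance, a text chunk arrives as a line like 0:"Hello"); the exact type codes are an implementation detail of the SDK, so the sketch below treats them as opaque strings. This is an illustrative parser under that assumed line format, not part of the SDK API:

```typescript
// Each data stream part is assumed to be a line of the form "<type>:<json>",
// e.g. '0:"Hello"' for a text chunk. The type codes are an SDK implementation
// detail, so they are treated as opaque strings here.
function parseDataStreamLine(line: string): { type: string; value: unknown } {
  const sep = line.indexOf(':');
  if (sep === -1) throw new Error(`not a data stream part: ${line}`);
  return {
    type: line.slice(0, sep),
    value: JSON.parse(line.slice(sep + 1)),
  };
}

// Example: split a raw chunk into lines and parse each part.
const raw = '0:"Hello"\n0:" world"\n';
const parts = raw.split('\n').filter(Boolean).map(parseDataStreamLine);
console.log(parts.map(p => p.value).join('')); // "Hello world"
```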

Text Stream

You can use the textStream property to get a text stream from the result and then pipe it to the response.

index.ts
import { openai } from '@ai-sdk/openai';
import { streamText } from 'ai';
import Fastify from 'fastify';

const fastify = Fastify({ logger: true });

fastify.post('/', async function (request, reply) {
  const result = streamText({
    model: openai('gpt-4o'),
    prompt: 'Invent a new holiday and describe its traditions.',
  });

  reply.header('Content-Type', 'text/plain; charset=utf-8');

  return reply.send(result.textStream);
});

fastify.listen({ port: 8080 });
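On the client, the plain text response can be consumed incrementally with the Fetch API. The helper below is a minimal sketch of the stream-reading part; it works with any ReadableStream of UTF-8 bytes, such as response.body from fetch (the helper name and callback are illustrative, not part of the SDK):

```typescript
// Concatenate all chunks of a streaming response body into a string,
// decoding bytes as UTF-8 as they arrive. Works with any ReadableStream
// of Uint8Array chunks, such as response.body from fetch.
async function readTextStream(
  stream: ReadableStream<Uint8Array>,
  onChunk?: (text: string) => void,
): Promise<string> {
  const decoder = new TextDecoder();
  const reader = stream.getReader();
  let result = '';
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    const text = decoder.decode(value, { stream: true });
    result += text;
    onChunk?.(text); // e.g. render the partial text as it streams in
  }
  return result;
}
```

Against the server above, a client could call it roughly like this: const res = await fetch('http://localhost:8080', { method: 'POST' }); const text = await readTextStream(res.body!);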