Friendli Provider

The FriendliAI provider supports both open-source LLMs via Friendli Serverless Endpoints and custom models via Dedicated Endpoints.

It creates language model objects that can be used with the generateText, streamText, generateObject, and streamObject functions.

Setup

The Friendli provider is available via the @friendliai/ai-provider module. You can install it with:

pnpm add @friendliai/ai-provider

Credentials

The tokens required for model usage can be obtained from the Friendli suite.

To use the provider, you need to set the FRIENDLI_TOKEN environment variable with your personal access token.

export FRIENDLI_TOKEN="YOUR_FRIENDLI_TOKEN"

Check the FriendliAI documentation for more information.
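If you prefer to pass the token programmatically instead of reading the environment variable implicitly, providers in the AI SDK ecosystem conventionally expose a create* factory. A minimal sketch, assuming @friendliai/ai-provider follows that convention with a createFriendli export and an apiKey option (verify both names against the package's documentation):

```typescript
// Assumption: `createFriendli` and its `apiKey` option follow the AI SDK
// provider convention (cf. createOpenAI, createMistral). Check the package
// exports before relying on these names.
import { createFriendli } from '@friendliai/ai-provider';

const friendli = createFriendli({
  apiKey: process.env.FRIENDLI_TOKEN ?? '',
});
```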

Provider Instance

You can import the default provider instance friendli from @friendliai/ai-provider:

import { friendli } from '@friendliai/ai-provider';

Language Models

You can create FriendliAI models using a provider instance. The first argument is the model id, e.g. meta-llama-3.1-8b-instruct.

const model = friendli('meta-llama-3.1-70b-instruct');

Example: Generating text

You can use FriendliAI language models to generate text with the generateText function:

import { friendli } from '@friendliai/ai-provider';
import { generateText } from 'ai';

const { text } = await generateText({
  model: friendli('meta-llama-3.1-8b-instruct'),
  prompt: 'What is the meaning of life?',
});

Example: Using built-in tools

Built-in tools are currently in beta.

If you use @friendliai/ai-provider, you can use the built-in tools via the tools option.

Built-in tools allow models to call external capabilities to produce better answers. For example, the web:search tool lets the model fetch up-to-date information when answering questions about current events.

import { friendli } from '@friendliai/ai-provider';
import { streamText } from 'ai';

export async function POST(req: Request) {
  const { messages } = await req.json();
  const result = streamText({
    model: friendli('meta-llama-3.1-70b-instruct', {
      tools: [
        { type: 'web:search' },
        { type: 'math:calculator' },
        { type: 'code:python-interpreter' }, // and more tools
      ],
    }),
    messages,
  });
  return result.toDataStreamResponse();
}

Example: Generating text with Dedicated Endpoints

To use a custom model via a Dedicated Endpoint, use the friendli.dedicated instance with the endpoint id, e.g. zbimjgovmlcb.

import { friendli } from '@friendliai/ai-provider';
import { generateText } from 'ai';

const { text } = await generateText({
  model: friendli.dedicated('YOUR_ENDPOINT_ID'),
  prompt: 'What is the meaning of life?',
});

FriendliAI language models can also be used with the streamText, generateObject, streamObject, and streamUI functions (see AI SDK Core and AI SDK RSC).
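For instance, a minimal generateObject sketch for producing structured output (the Zod schema and prompt here are illustrative, not part of the FriendliAI API):

```typescript
import { friendli } from '@friendliai/ai-provider';
import { generateObject } from 'ai';
import { z } from 'zod';

// Illustrative schema: constrain the model's output to a typed object.
const { object } = await generateObject({
  model: friendli('meta-llama-3.1-70b-instruct'),
  schema: z.object({
    recipe: z.object({
      name: z.string(),
      ingredients: z.array(z.string()),
    }),
  }),
  prompt: 'Generate a lasagna recipe.',
});
```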

Model Capabilities

Model | Image Input | Object Generation | Tool Usage | Tool Streaming
meta-llama-3.1-70b-instruct
meta-llama-3.1-8b-instruct
mixtral-8x7b-instruct-v0-1

To access more models, visit the Friendli Dedicated Endpoints documentation to deploy your custom models.

OpenAI Compatibility

You can also use @ai-sdk/openai, as the FriendliAI APIs are OpenAI-compatible.

import { createOpenAI } from '@ai-sdk/openai';

const friendli = createOpenAI({
  baseURL: 'https://api.friendli.ai/serverless/v1',
  apiKey: process.env.FRIENDLI_TOKEN,
});
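The resulting instance can then be used like any other AI SDK provider. A brief sketch, reusing a serverless model id from the table above:

```typescript
import { createOpenAI } from '@ai-sdk/openai';
import { generateText } from 'ai';

// OpenAI-compatible client pointed at Friendli Serverless Endpoints.
const friendli = createOpenAI({
  baseURL: 'https://api.friendli.ai/serverless/v1',
  apiKey: process.env.FRIENDLI_TOKEN,
});

const { text } = await generateText({
  model: friendli('meta-llama-3.1-8b-instruct'),
  prompt: 'What is the meaning of life?',
});
```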

If you are using Dedicated Endpoints, point the baseURL at the dedicated API instead:

import { createOpenAI } from '@ai-sdk/openai';

const friendli = createOpenAI({
  baseURL: 'https://api.friendli.ai/dedicated/v1',
  apiKey: process.env.FRIENDLI_TOKEN,
});