# FriendliAI Provider
The FriendliAI provider supports both open-source LLMs via Friendli Serverless Endpoints and custom models via Dedicated Endpoints.

It creates language model objects that can be used with the `generateText`, `streamText`, `generateObject`, and `streamObject` functions.
## Setup
The Friendli provider is available via the `@friendliai/ai-provider` module. You can install it with:

```bash
pnpm add @friendliai/ai-provider
```
## Credentials
The tokens required for model usage can be obtained from the Friendli Suite.

To use the provider, set the `FRIENDLI_TOKEN` environment variable to your personal access token:

```bash
export FRIENDLI_TOKEN="YOUR_FRIENDLI_TOKEN"
```

Check the FriendliAI documentation for more information.
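Since the provider reads the token from the environment, it can help to fail fast when the variable is missing. Below is a minimal sketch; `requireFriendliToken` is a hypothetical helper, not part of the SDK:

```typescript
// Hypothetical helper (not part of @friendliai/ai-provider): fail fast
// when the FRIENDLI_TOKEN environment variable is not set.
function requireFriendliToken(
  env: Record<string, string | undefined> = process.env,
): string {
  const token = env.FRIENDLI_TOKEN;
  if (!token) {
    throw new Error('Set the FRIENDLI_TOKEN environment variable.');
  }
  return token;
}

console.log(requireFriendliToken({ FRIENDLI_TOKEN: 'demo-token' })); // demo-token
```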
## Provider Instance
You can import the default provider instance `friendli` from `@friendliai/ai-provider`:

```ts
import { friendli } from '@friendliai/ai-provider';
```
## Language Models
You can create FriendliAI models using a provider instance. The first argument is the model id, e.g. `meta-llama-3.1-8b-instruct`:

```ts
const model = friendli('meta-llama-3.1-70b-instruct');
```
### Example: Generating text
You can use FriendliAI language models to generate text with the `generateText` function:

```ts
import { friendli } from '@friendliai/ai-provider';
import { generateText } from 'ai';

const { text } = await generateText({
  model: friendli('meta-llama-3.1-8b-instruct'),
  prompt: 'What is the meaning of life?',
});

console.log(text);
```
### Example: Structured Outputs (regex)
The `regex` option lets you constrain the format of the model's output by specifying a pattern. This is particularly useful when you need:

- output in specific formats, such as CSV
- output restricted to specific character sets, such as Korean or Japanese

This feature is available with both the `generateText` and `streamText` functions.
For a deeper understanding of how to effectively use regex patterns with LLMs, check out our detailed guide in the Structured Output LLM Agents blog post.
```ts
import { friendli } from '@friendliai/ai-provider';
import { generateText } from 'ai';

const { text } = await generateText({
  model: friendli('meta-llama-3.1-8b-instruct', {
    regex: '[\n ,.?!0-9\uac00-\ud7af]*',
  }),
  prompt: 'Who is the first king of the Joseon Dynasty?',
});

console.log(text);
```
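The pattern itself is an ordinary character class, so you can sanity-check locally, without calling the API, what it permits. The sample strings below are illustrative:

```typescript
// The guided-generation pattern from the example above, anchored so the
// whole string must consist of whitespace, basic punctuation, digits,
// and Hangul syllables (U+AC00 to U+D7AF).
const pattern = /^[\n ,.?!0-9\uac00-\ud7af]*$/;

console.log(pattern.test('조선의 첫 번째 왕은 태조 이성계입니다.')); // true
console.log(pattern.test('Taejo of Joseon')); // false (Latin letters not in the class)
```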
### Example: Structured Outputs (json)
Structured outputs are a form of guided generation. The JSON schema is used as a grammar and the outputs will always conform to the schema.
```ts
import { friendli } from '@friendliai/ai-provider';
import { generateObject } from 'ai';
import { z } from 'zod';

const { object } = await generateObject({
  model: friendli('meta-llama-3.1-8b-instruct'),
  schema: z.object({
    brand: z.string(),
    model: z.string(),
    car_type: z.enum(['sedan', 'suv', 'truck', 'coupe']),
  }),
  prompt:
    'Generate a JSON with the brand, model and car_type of the most iconic car from the 90s',
});

console.log(JSON.stringify(object, null, 2));
```
### Example: Using built-in tools
If you use `@friendliai/ai-provider`, you can use built-in tools via the `tools` option. Built-in tools allow models to call external tools to produce better answers. For example, the `web:search` tool can provide up-to-date answers to questions about current events.
```ts
import { friendli } from '@friendliai/ai-provider';
import { streamText } from 'ai';

const result = streamText({
  model: friendli('meta-llama-3.1-70b-instruct', {
    tools: [{ type: 'web:search' }, { type: 'math:calculator' }],
  }),
  prompt:
    'Find the current USD to CAD exchange rate and calculate how much $5,000 USD would be in Canadian dollars.',
});

for await (const textPart of result.textStream) {
  console.log(textPart);
}
```
### Example: Generating text with Dedicated Endpoints
To use a custom model via a dedicated endpoint, use the `friendli.dedicated` instance with your endpoint id, e.g. `zbimjgovmlcb`:

```ts
import { friendli } from '@friendliai/ai-provider';
import { generateText } from 'ai';

const { text } = await generateText({
  model: friendli.dedicated('YOUR_ENDPOINT_ID'),
  prompt: 'What is the meaning of life?',
});
```
FriendliAI language models can also be used with the `streamText`, `generateObject`, `streamObject`, and `streamUI` functions (see AI SDK Core and AI SDK RSC).
## Model Capabilities
| Model                         | Image Input | Object Generation | Tool Usage | Tool Streaming |
| ----------------------------- | ----------- | ----------------- | ---------- | -------------- |
| `meta-llama-3.1-70b-instruct` |             |                   |            |                |
| `meta-llama-3.1-8b-instruct`  |             |                   |            |                |
| `mixtral-8x7b-instruct-v0-1`  |             |                   |            |                |
To access more models, visit the Friendli Dedicated Endpoints documentation to deploy your custom models.
## OpenAI Compatibility
You can also use `@ai-sdk/openai` as the APIs are OpenAI-compatible:

```ts
import { createOpenAI } from '@ai-sdk/openai';

const friendli = createOpenAI({
  baseURL: 'https://api.friendli.ai/serverless/v1',
  apiKey: process.env.FRIENDLI_TOKEN,
});
```
If you are using dedicated endpoints:

```ts
import { createOpenAI } from '@ai-sdk/openai';

const friendli = createOpenAI({
  baseURL: 'https://api.friendli.ai/dedicated/v1',
  apiKey: process.env.FRIENDLI_TOKEN,
});
```
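The only difference between the two configurations is the path segment. As a sketch, a hypothetical helper (`friendliBaseURL`, not part of either SDK) can make that choice explicit:

```typescript
// Hypothetical helper: build the OpenAI-compatible FriendliAI base URL
// for either serverless or dedicated endpoints.
type FriendliEndpointKind = 'serverless' | 'dedicated';

function friendliBaseURL(kind: FriendliEndpointKind): string {
  return `https://api.friendli.ai/${kind}/v1`;
}

console.log(friendliBaseURL('serverless')); // https://api.friendli.ai/serverless/v1
console.log(friendliBaseURL('dedicated')); // https://api.friendli.ai/dedicated/v1
```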