Ollama Provider
sgomez/ollama-ai-provider is a community provider that uses Ollama to provide language model support for the Vercel AI SDK.
Setup
The Ollama provider is available in the ollama-ai-provider module. You can install it with:

pnpm install ollama-ai-provider
Provider Instance
You can import the default provider instance ollama from ollama-ai-provider:

import { ollama } from 'ollama-ai-provider';
If you need a customized setup, you can import createOllama from ollama-ai-provider and create a provider instance with your settings:

import { createOllama } from 'ollama-ai-provider';

const ollama = createOllama({
  // custom settings
});
You can use the following optional settings to customize the Ollama provider instance:
- baseURL string
  Use a different URL prefix for API calls, e.g., to use proxy servers.
  The default prefix is http://localhost:11434/api.
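As a sketch, a provider pointed at a remote Ollama host (for example, behind a proxy) might look like this; the host URL below is a placeholder, not a real endpoint:

```typescript
import { createOllama } from 'ollama-ai-provider';

// Point the provider at a remote Ollama instance instead of localhost.
// The URL is a placeholder; substitute the address of your own server.
const remoteOllama = createOllama({
  baseURL: 'http://my-ollama-host:11434/api',
});

// Models created from this instance will call the remote server.
const model = remoteOllama('phi3');
```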
Language Models
You can create models that call the Ollama Chat Completion API using the provider instance.
The first argument is the model id, e.g. phi3. Some models have multi-modal capabilities.
const model = ollama('phi3');
You can find more models on the Ollama Library homepage.
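A model instance created this way plugs into the core AI SDK functions such as generateText. A minimal sketch, assuming a local Ollama server is running and the phi3 model has already been pulled:

```typescript
import { ollama } from 'ollama-ai-provider';
import { generateText } from 'ai';

// Assumes a local Ollama server at the default address with phi3 pulled.
const { text } = await generateText({
  model: ollama('phi3'),
  prompt: 'Explain the difference between TCP and UDP in one sentence.',
});

console.log(text);
```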
Model Capabilities
This provider can generate and stream both text and structured objects. Image input is supported only by models with multi-modal capabilities.
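Streaming goes through the AI SDK's streamText function. A minimal sketch, again assuming a running local Ollama server with the phi3 model pulled:

```typescript
import { ollama } from 'ollama-ai-provider';
import { streamText } from 'ai';

// Assumes a local Ollama server with the phi3 model already pulled.
const result = await streamText({
  model: ollama('phi3'),
  prompt: 'Write a haiku about containers.',
});

// Print tokens as they arrive instead of waiting for the full response.
for await (const chunk of result.textStream) {
  process.stdout.write(chunk);
}
```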
Embedding Models
You can create models that call the Ollama embeddings API using the .embedding() factory method.
const model = ollama.embedding('nomic-embed-text');
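An embedding model created this way can be passed to the AI SDK's embed helper. A minimal sketch, assuming a local Ollama server with the nomic-embed-text model pulled:

```typescript
import { ollama } from 'ollama-ai-provider';
import { embed } from 'ai';

// Assumes a local Ollama server with nomic-embed-text already pulled.
const { embedding } = await embed({
  model: ollama.embedding('nomic-embed-text'),
  value: 'sunny day at the beach',
});

// embedding is a numeric vector; its length depends on the model.
console.log(embedding.length);
```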