Portkey Provider
Portkey natively integrates with the AI SDK to make your apps production-ready and reliable. Import Portkey's Vercel package and use it as a provider in your Vercel AI app to enable all of Portkey's features:
- Full-stack observability and tracing for all requests
- Interoperability across 250+ LLMs
- 50+ built-in, state-of-the-art guardrails
- Simple & semantic caching to save costs & time
- Conditional request routing with fallbacks, load-balancing, automatic retries, and more
- Continuous improvement based on user feedback
Learn more in the Portkey documentation for the AI SDK.
Setup
The Portkey provider is available in the `@portkey-ai/vercel-provider` module. You can install it with:

```bash
pnpm add @portkey-ai/vercel-provider
```
Provider Instance
To create a Portkey provider instance, use the `createPortkey` function:

```ts
import { createPortkey } from '@portkey-ai/vercel-provider';

const portkeyConfig = {
  provider: 'openai', // enter provider of choice
  api_key: 'OPENAI_API_KEY', // enter the respective provider's API key
  override_params: {
    model: 'gpt-4', // choose from 250+ LLMs
  },
};

const portkey = createPortkey({
  apiKey: 'YOUR_PORTKEY_API_KEY',
  config: portkeyConfig,
});
```
You can find your Portkey API key in the Portkey Dashboard.
Language Models
Portkey supports both chat and completion models. Use `portkey.chatModel()` for chat models and `portkey.completionModel()` for completion models:

```ts
const chatModel = portkey.chatModel('');
const completionModel = portkey.completionModel('');
```

Note: You can provide an empty string as the model name if you've defined it in the `portkeyConfig`.
Examples
You can use Portkey language models with the `generateText` or `streamText` function:
generateText
```ts
import { createPortkey } from '@portkey-ai/vercel-provider';
import { generateText } from 'ai';

const portkey = createPortkey({
  apiKey: 'YOUR_PORTKEY_API_KEY',
  config: portkeyConfig,
});

const { text } = await generateText({
  model: portkey.chatModel(''),
  prompt: 'What is Portkey?',
});

console.log(text);
```
streamText
```ts
import { createPortkey } from '@portkey-ai/vercel-provider';
import { streamText } from 'ai';

const portkey = createPortkey({
  apiKey: 'YOUR_PORTKEY_API_KEY',
  config: portkeyConfig,
});

const result = await streamText({
  model: portkey.completionModel(''),
  prompt: 'Invent a new holiday and describe its traditions.',
});

for await (const chunk of result.textStream) {
  console.log(chunk);
}
```
Note:
- Portkey supports tool use with the AI SDK.
- `generateObject` and `streamObject` are currently not supported.
Advanced Features
Portkey offers several advanced features to enhance your AI applications:
- Interoperability: Easily switch between 250+ AI models by changing the provider and model name in your configuration.
- Observability: Access comprehensive analytics and logs for all your requests.
- Reliability: Implement caching, fallbacks, load balancing, and conditional routing.
- Guardrails: Enforce LLM behavior in real-time with input and output checks.
- Security and Compliance: Set budget limits and implement fine-grained user roles and permissions.
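The reliability features above are driven by the config object you pass to `createPortkey`. The sketch below combines a fallback strategy across two providers with caching and retries; the key names (`strategy`, `targets`, `cache`, `retry`) and the `anthropic` target are illustrative assumptions based on Portkey's gateway config schema, so verify them against the Portkey documentation before use.

```ts
// A sketch of a Portkey gateway config enabling fallbacks, caching, and
// retries. Key names and values are assumptions to be checked against
// Portkey's config schema documentation.
const resilientConfig = {
  // Try each target in order until one succeeds.
  strategy: { mode: 'fallback' },
  targets: [
    {
      provider: 'openai',
      api_key: 'OPENAI_API_KEY', // primary provider's API key
      override_params: { model: 'gpt-4' },
    },
    {
      provider: 'anthropic', // hypothetical secondary target
      api_key: 'ANTHROPIC_API_KEY',
      override_params: { model: 'claude-3-opus-20240229' },
    },
  ],
  // Serve repeated (semantically similar) prompts from cache.
  cache: { mode: 'semantic' },
  // Retry transient failures before falling back to the next target.
  retry: { attempts: 3 },
};

console.log(resilientConfig.targets.length);
```

You would pass this object as `config` to `createPortkey` in place of the minimal `portkeyConfig` shown earlier.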
For detailed information on these features and advanced configuration options, please refer to the Portkey documentation.