# Azure Custom Provider for AI SDK
The `@quail-ai/azure-ai-provider` package provides unofficial AI SDK integration with Azure-hosted language models that use Azure's native APIs rather than the standard OpenAI API format.
## Language Models
This provider works with any model in Azure AI Foundry that is compatible with the Azure AI Model Inference REST API (`@azure-rest/ai-inference`).

> **Note:** This provider is not compatible with Azure OpenAI models. For those, please use the official Azure OpenAI provider instead.
Models Tested:
- DeepSeek-R1
- Llama 3.3 70B Instruct
- Cohere-command-r-08-2024
## Setup

### Installation

Install the provider via npm:

```bash
npm i @quail-ai/azure-ai-provider
```
### Provider Instance

Create an Azure AI resource, then add its endpoint URL and API key to your `.env` file:

```bash
AZURE_API_ENDPOINT=https://<your-resource>.services.ai.azure.com/models
AZURE_API_KEY=<your-api-key>
```
Import `createAzure` from the package to create your provider instance:

```ts
import { createAzure } from '@quail-ai/azure-ai-provider';

const azure = createAzure({
  endpoint: process.env.AZURE_API_ENDPOINT,
  apiKey: process.env.AZURE_API_KEY,
});
```
## Basic Usage
Generate text using the Azure custom provider:
```ts
import { generateText } from 'ai';

const { text } = await generateText({
  model: azure('your-deployment-name'),
  prompt: 'Write a story about a robot.',
});
```
## Status
✅ Chat Completions: Working with both streaming and non-streaming responses
⚠️ Tool Calling: Works, but reliability depends heavily on the chosen model
⚠️ Embeddings: Implementation present but untested