@ai-sdk/huggingface
v1.0.49
Vercel AI SDK - Hugging Face Provider
The Hugging Face Inference Providers integration for the Vercel AI SDK provides language model support for thousands of models through multiple inference providers via the Hugging Face router API.
Deploying to Vercel? With Vercel's AI Gateway you can access Hugging Face (and hundreds of models from other providers) — no additional packages, API keys, or extra cost. Get started with AI Gateway.
Setup
The Hugging Face provider is available in the @ai-sdk/huggingface module. You can install it with:
```bash
npm i @ai-sdk/huggingface
```

Skill for Coding Agents

If you use coding agents such as Claude Code or Cursor, we highly recommend adding the AI SDK skill to your repository:

```bash
npx skills add vercel/ai
```

Provider Instance
You can import the default provider instance huggingface from @ai-sdk/huggingface:

```ts
import { huggingface } from '@ai-sdk/huggingface';
```

Example
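If you need a customized setup (for example, reading the API key from a non-default environment variable or routing through a proxy), AI SDK provider packages conventionally export a create function alongside the default instance. A minimal sketch, assuming this package follows that convention with a createHuggingFace export and apiKey/baseURL options:

```ts
import { createHuggingFace } from '@ai-sdk/huggingface';

// Assumed: createHuggingFace and its option names follow the
// standard AI SDK provider convention; the env var name below
// is an illustrative choice, not mandated by the package.
const huggingface = createHuggingFace({
  apiKey: process.env.HUGGINGFACE_API_KEY,
});
```

Check the provider documentation for the exact option names supported by your installed version.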
```ts
import { huggingface } from '@ai-sdk/huggingface';
import { generateText } from 'ai';

const { text } = await generateText({
  model: huggingface('meta-llama/Llama-3.1-8B-Instruct'),
  prompt: 'Write a vegetarian lasagna recipe.',
});
```

Documentation
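The same model instance also works with the AI SDK's streamText function when you want tokens incrementally instead of a single completed response. A minimal sketch (requires a valid Hugging Face API key at runtime):

```ts
import { huggingface } from '@ai-sdk/huggingface';
import { streamText } from 'ai';

// streamText returns immediately; the response body is consumed
// chunk by chunk through the async-iterable textStream.
const { textStream } = streamText({
  model: huggingface('meta-llama/Llama-3.1-8B-Instruct'),
  prompt: 'Write a vegetarian lasagna recipe.',
});

for await (const chunk of textStream) {
  process.stdout.write(chunk);
}
```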
Please check out the Hugging Face provider documentation for more information.
