# Vercel AI SDK - Hugging Face Provider
The **[Hugging Face Inference Providers](https://huggingface.co/docs/inference-providers/index)** provider for the [Vercel AI SDK](https://ai-sdk.dev/docs) adds language model support for thousands of models served by multiple inference providers through the Hugging Face router API.
## Setup
The Hugging Face provider is available in the `@ai-sdk/huggingface` module. You can install it with:

```bash
npm i @ai-sdk/huggingface
```

## Provider Instance
You can import the default provider instance `huggingface` from `@ai-sdk/huggingface`:

```ts
import { huggingface } from '@ai-sdk/huggingface';
```

## Example
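If you need a customized setup, most AI SDK provider modules also export a factory function alongside the default instance. The snippet below is a minimal sketch assuming `@ai-sdk/huggingface` follows that convention with a `createHuggingFace` export and common `apiKey`/`baseURL` settings; check the module's type definitions for the exact names.

```ts
import { createHuggingFace } from '@ai-sdk/huggingface';

// Assumed factory and setting names; verify against the module's exports.
const huggingface = createHuggingFace({
  // Read the Hugging Face access token from the environment
  // instead of relying on the provider's default variable.
  apiKey: process.env.HF_TOKEN,
});
```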
```ts
import { huggingface } from '@ai-sdk/huggingface';
import { generateText } from 'ai';

const { text } = await generateText({
  model: huggingface('meta-llama/Llama-3.1-8B-Instruct'),
  prompt: 'Write a vegetarian lasagna recipe.',
});
```

## Documentation
Please check out the Hugging Face provider documentation for more information.
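For longer generations you may prefer streaming over a single completed response. This is a sketch using the AI SDK's `streamText` function with the same model id as the example above; it assumes a valid Hugging Face token is configured in your environment.

```ts
import { huggingface } from '@ai-sdk/huggingface';
import { streamText } from 'ai';

// Stream tokens as they are generated instead of waiting for the full text.
const result = streamText({
  model: huggingface('meta-llama/Llama-3.1-8B-Instruct'),
  prompt: 'Write a vegetarian lasagna recipe.',
});

// Print each chunk to stdout as it arrives.
for await (const textPart of result.textStream) {
  process.stdout.write(textPart);
}
```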
