@arelis-ai/governance-providers-huggingface
v1.0.0
Hugging Face Inference API + Dedicated Endpoints provider adapter for Arelis.
Install
pnpm add @arelis-ai/governance-providers-huggingface

Usage (Inference API)
import {
  createArelisClient,
  createModelRegistry,
  createAllowAllEngine,
  createConsoleSink,
} from '@arelis-ai/arelis-governance-sdk';
import { HuggingFaceProvider } from '@arelis-ai/governance-providers-huggingface';

const provider = new HuggingFaceProvider(
  {
    token: process.env.HF_TOKEN,
    apiType: 'inference',
  },
  ['meta-llama/Llama-3-8b-instruct']
);

const registry = createModelRegistry();
registry.register(provider);

const client = createArelisClient({
  modelRegistry: registry,
  policyEngine: createAllowAllEngine(),
  auditSink: createConsoleSink(),
});

const result = await client.models.generate({
  model: 'meta-llama/Llama-3-8b-instruct',
  request: {
    model: 'meta-llama/Llama-3-8b-instruct',
    messages: [{ role: 'user', content: 'Hello HF' }],
  },
  context: {
    org: { id: 'org_1' },
    actor: { type: 'human', id: 'user_1' },
    purpose: 'testing',
    environment: 'dev',
  },
});
console.log(result.output.content);

Usage (OpenAI-compatible endpoint)
const provider = new HuggingFaceProvider(
  {
    token: process.env.HF_TOKEN,
    endpoint: 'https://your-endpoint.huggingface.cloud',
    apiType: 'openai',
  },
  ['tgi-openai']
);

Notes
apiType: 'openai' enables streaming, tool calls, and multimodal inputs. apiType: 'inference' uses the standard Inference API and returns text only.
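An OpenAI-compatible provider plugs into the same registry/client flow shown in the Inference API example. The following is a minimal sketch, not an authoritative reference: it reuses only the calls shown above, the endpoint URL is a placeholder for your own Dedicated Endpoint, and 'tgi-openai' is assumed to be the model id you registered.

```typescript
import {
  createArelisClient,
  createModelRegistry,
  createAllowAllEngine,
  createConsoleSink,
} from '@arelis-ai/arelis-governance-sdk';
import { HuggingFaceProvider } from '@arelis-ai/governance-providers-huggingface';

// OpenAI-compatible Dedicated Endpoint; the URL below is a placeholder.
const provider = new HuggingFaceProvider(
  {
    token: process.env.HF_TOKEN,
    endpoint: 'https://your-endpoint.huggingface.cloud',
    apiType: 'openai',
  },
  ['tgi-openai'] // assumed registered model id
);

const registry = createModelRegistry();
registry.register(provider);

const client = createArelisClient({
  modelRegistry: registry,
  policyEngine: createAllowAllEngine(),
  auditSink: createConsoleSink(),
});

// Same generate() shape as the Inference API example; only the model id differs.
const result = await client.models.generate({
  model: 'tgi-openai',
  request: {
    model: 'tgi-openai',
    messages: [{ role: 'user', content: 'Hello from the dedicated endpoint' }],
  },
  context: {
    org: { id: 'org_1' },
    actor: { type: 'human', id: 'user_1' },
    purpose: 'testing',
    environment: 'dev',
  },
});

console.log(result.output.content);
```

Because apiType: 'openai' routes through the endpoint's OpenAI-compatible interface, the same call site can later adopt streaming or tool calls without changing the provider registration.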
