# @coopah/bentley-provider-anthropic

Anthropic (Claude) LLM provider for Bentley, built on the Vercel AI SDK.
## Install

```sh
pnpm add @coopah/bentley-provider-anthropic
```

## Dependencies

- `@coopah/bentley-core`
- `@ai-sdk/anthropic` `^3.0.58`
- `ai` `^6.0.116`
## Usage

```ts
import { createBentley } from "@coopah/bentley-core";
import { bentleyAnthropicPlugin } from "@coopah/bentley-provider-anthropic";

const bentley = createBentley({
  plugins: [
    bentleyAnthropicPlugin(process.env.ANTHROPIC_API_KEY),
    // or bentleyAnthropicPlugin() to read the key from the ANTHROPIC_API_KEY env var
  ],
});
```

The plugin registers a credential requirement for `ANTHROPIC_API_KEY` (category: `llm`), which is picked up by the server and the Studio UI.
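The optional-key behavior can be sketched as a plain helper. This is an illustrative guess at the fallback logic, not the plugin's actual source, and `resolveApiKey` is a hypothetical name:

```typescript
// Hypothetical sketch of the API-key fallback described above:
// an explicitly passed key wins; otherwise the ANTHROPIC_API_KEY
// environment variable is used, and a missing key is an error.
function resolveApiKey(apiKey?: string): string {
  const key = apiKey ?? process.env.ANTHROPIC_API_KEY;
  if (!key) {
    throw new Error("ANTHROPIC_API_KEY is not set and no apiKey was provided");
  }
  return key;
}
```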
## API

- `bentleyAnthropicPlugin(apiKey?)`: a `BentleyPlugin` that registers Anthropic as an LLM provider
- `createBentleyAnthropicProvider(apiKey?)`: low-level factory returning `(modelId: string) => LanguageModel`
- `ANTHROPIC_MODEL_COSTS`: cost-per-token constants for Anthropic models, used by the cost-tracking middleware
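To show how per-token cost constants like `ANTHROPIC_MODEL_COSTS` are typically consumed by cost tracking, here is a hedged sketch. The entry shape, model ID, and dollar figures below are placeholders, not the package's real export:

```typescript
// Placeholder cost table in USD per token; the real ANTHROPIC_MODEL_COSTS
// export may use a different shape, different model IDs, and different prices.
type ModelCost = { inputPerToken: number; outputPerToken: number };

const EXAMPLE_MODEL_COSTS: Record<string, ModelCost> = {
  "claude-example-model": { inputPerToken: 3e-6, outputPerToken: 15e-6 },
};

// Estimate the dollar cost of a single request from its token usage.
function estimateRequestCost(
  modelId: string,
  inputTokens: number,
  outputTokens: number,
): number {
  const cost = EXAMPLE_MODEL_COSTS[modelId];
  if (!cost) throw new Error(`No cost data for model: ${modelId}`);
  return inputTokens * cost.inputPerToken + outputTokens * cost.outputPerToken;
}
```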
## Related Packages
| Package | Role |
|---------|------|
| @coopah/bentley-core | Core runtime (required) |
| @coopah/bentley-provider-openai | OpenAI provider |
| @coopah/bentley-provider-ollama | Ollama provider (local models) |
| @coopah/bentley-provider-copilot | GitHub Copilot provider |
## License
MIT
