# @coopah/bentley-provider-openai

v0.3.0
OpenAI LLM provider for Bentley, built on the Vercel AI SDK.
## Install

```sh
pnpm add @coopah/bentley-provider-openai
```

### Dependencies

- `@coopah/bentley-core`
- `@ai-sdk/openai` `^3.0.41`
- `ai` `^6.0.116`
## Usage

```ts
import { createBentley } from "@coopah/bentley-core";
import { bentleyOpenAIPlugin } from "@coopah/bentley-provider-openai";

const bentley = createBentley({
  plugins: [
    bentleyOpenAIPlugin(process.env.OPENAI_API_KEY),
    // or: bentleyOpenAIPlugin() — reads from the OPENAI_API_KEY env var
  ],
});
```

The plugin registers a credential requirement for `OPENAI_API_KEY` (category: `llm`), which is picked up by the server and the Studio UI.
## API

- `bentleyOpenAIPlugin(apiKey?)` — `BentleyPlugin` that registers OpenAI as an LLM provider
- `createBentleyOpenAIProvider(apiKey?)` — low-level factory returning `(modelId: string) => LanguageModel`
- `OPENAI_MODEL_COSTS` — cost-per-token constants for OpenAI models (used by cost tracking middleware)
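To show how cost-per-token constants such as `OPENAI_MODEL_COSTS` feed into cost tracking, here is a minimal sketch. The table shape, the `estimateCost` helper, and the per-token prices are all hypothetical, not the package's real values.

```typescript
// Illustrative cost table; the real OPENAI_MODEL_COSTS entries and
// shape may differ, and the prices below are made up for the example.
const MODEL_COSTS: Record<string, { input: number; output: number }> = {
  "gpt-4o": { input: 2.5e-6, output: 1e-5 }, // USD per token (illustrative)
};

// Hypothetical middleware-style calculation: tokens in each direction
// multiplied by their per-token price.
function estimateCost(
  modelId: string,
  inputTokens: number,
  outputTokens: number,
): number {
  const costs = MODEL_COSTS[modelId];
  if (!costs) throw new Error(`No cost data for model "${modelId}"`);
  return inputTokens * costs.input + outputTokens * costs.output;
}
```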
## Related Packages

| Package | Role |
|---------|------|
| `@coopah/bentley-core` | Core runtime (required) |
| `@coopah/bentley-provider-anthropic` | Anthropic provider |
| `@coopah/bentley-provider-ollama` | Ollama provider (local models) |
| `@coopah/bentley-provider-copilot` | GitHub Copilot provider |
## License

MIT
