# @coopah/bentley-provider-copilot

v0.3.0

GitHub Copilot LLM provider for Bentley. Uses the Copilot API (OpenAI-compatible) with automatic token refresh and Copilot-specific request headers.
## Install

```sh
pnpm add @coopah/bentley-provider-copilot
```

## Dependencies

- `@coopah/bentley-core`
- `@ai-sdk/openai` `^3.0.41` (Copilot exposes an OpenAI-compatible API)
- `ai` `^6.0.116`
## Usage

```ts
import { createBentley } from "@coopah/bentley-core";
import { bentleyCopilotPlugin } from "@coopah/bentley-provider-copilot";

const bentley = createBentley({
  plugins: [bentleyCopilotPlugin()],
});
```

Authentication is handled automatically via `CopilotAuth`, which manages GitHub Copilot session tokens and refresh cycles. Requests include Copilot-specific headers (`Editor-Version`, `Editor-Plugin-Version`, `Copilot-Integration-Id`).
## API

- `bentleyCopilotPlugin()` - `BentleyPlugin` that registers GitHub Copilot as an LLM provider
- `createBentleyCopilotProvider()` - low-level provider factory with dynamic token refresh (Proxy-based lazy resolution)
- `CopilotAuth` - session token management class
## Related Packages
| Package | Role |
|---------|------|
| @coopah/bentley-core | Core runtime (required) |
| @coopah/bentley-provider-openai | OpenAI provider |
| @coopah/bentley-provider-anthropic | Anthropic provider |
| @coopah/bentley-provider-ollama | Ollama provider (local models) |
## License
MIT
