@glueco/plugin-llm-gemini · v0.1.2 · 229 downloads
@glueco/plugin-llm-gemini
Google Gemini LLM plugin for Personal Resource Gateway.
Installation
npm install @glueco/plugin-llm-gemini
Usage
- Install the package
- Add to proxy.plugins.ts at the repository root:
const enabledPlugins = [
"@glueco/plugin-llm-gemini",
// ... other plugins
] as const;
- Run npm run build or redeploy
Features
- OpenAI Compatibility: Accepts OpenAI-compatible requests and translates them to Gemini API format
- Response Translation: Converts Gemini responses back to OpenAI format
- Streaming Support: Full streaming support with SSE translation
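The OpenAI-to-Gemini translation described above can be sketched roughly as follows. This is an illustrative example, not the plugin's actual implementation; the Gemini-side field names (`contents`, `role`, `parts`, `systemInstruction`) follow the public Gemini REST API, and everything else is an assumption.

```typescript
// Minimal shape of an OpenAI-style chat message.
interface OpenAIMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Minimal shape of a Gemini API content entry.
interface GeminiContent {
  role: "user" | "model";
  parts: { text: string }[];
}

// Translate OpenAI-format messages into a Gemini-format request body.
function toGeminiContents(messages: OpenAIMessage[]): {
  systemInstruction?: { parts: { text: string }[] };
  contents: GeminiContent[];
} {
  const system = messages.filter((m) => m.role === "system");
  const rest = messages.filter((m) => m.role !== "system");
  return {
    // Gemini takes system prompts via a separate systemInstruction field.
    systemInstruction: system.length
      ? { parts: system.map((m) => ({ text: m.content })) }
      : undefined,
    // Gemini uses "model" where OpenAI uses "assistant".
    contents: rest.map((m) => ({
      role: m.role === "assistant" ? "model" : "user",
      parts: [{ text: m.content }],
    })),
  };
}
```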
Supported Actions
- chat.completions - OpenAI-compatible chat completions
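A request to this action can be built with a standard OpenAI chat-completions body. In the sketch below, the gateway base URL, path, and bearer token are placeholders for your own deployment's values; only the request body shape comes from the OpenAI API.

```typescript
// Standard OpenAI chat-completions request body; the plugin translates
// this to the Gemini API format on the way out.
const body = {
  model: "gemini-1.5-flash",
  messages: [{ role: "user", content: "Say hello in one word." }],
  stream: false,
};

// Hypothetical call to your gateway deployment: baseUrl, the
// /chat/completions path, and the token are placeholders.
async function callGateway(baseUrl: string, token: string) {
  const res = await fetch(`${baseUrl}/chat/completions`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${token}`,
    },
    body: JSON.stringify(body),
  });
  return res.json();
}
```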
Supported Models
- gemini-2.0-flash-exp
- gemini-1.5-flash
- gemini-1.5-flash-8b
- gemini-1.5-pro
Enforcement Support
This plugin supports the following enforcement knobs:
- model - Restrict to specific models
- max_tokens - Limit output tokens
- streaming - Enable/disable streaming
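As a rough illustration, an enforcement policy using these knobs might look like the object below. The exact schema is defined by the gateway's admin interface; this shape is an assumption.

```typescript
// Hypothetical enforcement policy shape for this plugin.
const policy = {
  model: ["gemini-1.5-flash", "gemini-1.5-pro"], // restrict to these models
  max_tokens: 1024,                              // cap output tokens
  streaming: true,                               // allow SSE streaming
};
```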
Credentials
Required credentials in proxy admin:
- apiKey - Your Google AI Studio API key
- baseUrl (optional) - Custom API base URL
