# @glueco/plugin-llm-groq

Groq LLM plugin for Personal Resource Gateway. Current version: v0.1.2.
## Installation

```sh
npm install @glueco/plugin-llm-groq
```

## Usage
1. Install the package.
2. Add the plugin to `proxy.plugins.ts` at the repository root:

   ```ts
   const enabledPlugins = [
     "@glueco/plugin-llm-groq",
     // ... other plugins
   ] as const;
   ```

3. Run `npm run build` or redeploy.
## Supported Actions

- `chat.completions` - OpenAI-compatible chat completions
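Since the action is OpenAI-compatible, a request body follows the familiar chat completions shape. The sketch below is illustrative only: the helper name and the exact routing through the gateway are assumptions, not part of this plugin's published API.

```typescript
// Build an OpenAI-compatible chat.completions payload (illustrative sketch;
// how the gateway routes it to this plugin depends on your deployment).

interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

function buildChatRequest(
  model: string,
  messages: ChatMessage[],
  maxTokens?: number
) {
  return {
    model,
    messages,
    // Only include max_tokens when the caller sets it.
    ...(maxTokens !== undefined ? { max_tokens: maxTokens } : {}),
  };
}

const body = buildChatRequest(
  "llama-3.1-8b-instant",
  [{ role: "user", content: "Hello" }],
  256
);
console.log(JSON.stringify(body));
```

The resulting object can be POSTed to the gateway wherever it exposes the `chat.completions` action.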
## Supported Models
- llama-3.3-70b-versatile
- llama-3.1-70b-versatile
- llama-3.1-8b-instant
- llama3-70b-8192
- llama3-8b-8192
- mixtral-8x7b-32768
- gemma2-9b-it
## Enforcement Support

This plugin supports the following enforcement knobs:

- `model` - Restrict to specific models
- `max_tokens` - Limit output tokens
- `streaming` - Enable/disable streaming
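To make the three knobs concrete, here is a hypothetical sketch of how a policy could be applied to an incoming request. The `Policy` shape and the `enforce` function are illustrative assumptions; the gateway's actual enforcement mechanism is internal to it.

```typescript
// Hypothetical enforcement pass over a chat.completions request:
// model allowlist, max_tokens cap, and streaming on/off.
// Names and shapes are illustrative, not the gateway's real API.

interface Policy {
  model?: string[];      // allowlist of permitted model IDs
  max_tokens?: number;   // upper bound on output tokens
  streaming?: boolean;   // false forces streaming off
}

interface ChatRequest {
  model: string;
  max_tokens?: number;
  stream?: boolean;
}

function enforce(req: ChatRequest, policy: Policy): ChatRequest {
  // model: reject anything outside the allowlist.
  if (policy.model && !policy.model.includes(req.model)) {
    throw new Error(`model ${req.model} not permitted`);
  }
  // max_tokens: clamp the request to the policy ceiling.
  const capped =
    policy.max_tokens !== undefined
      ? Math.min(req.max_tokens ?? policy.max_tokens, policy.max_tokens)
      : req.max_tokens;
  // streaming: a policy of false overrides the caller's choice.
  return {
    ...req,
    max_tokens: capped,
    stream: policy.streaming === false ? false : req.stream,
  };
}
```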
## Credentials

Required credentials in the proxy admin:

- `apiKey` - Your Groq API key
- `baseUrl` (optional) - Custom API base URL
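A minimal sketch of how the optional `baseUrl` might be resolved, assuming the plugin falls back to Groq's public OpenAI-compatible endpoint when no override is set. The function name and the default URL are assumptions about typical behavior, not this plugin's actual code.

```typescript
// Illustrative credential handling: use the configured baseUrl if present,
// otherwise fall back to Groq's OpenAI-compatible endpoint (assumed default).

interface Credentials {
  apiKey: string;
  baseUrl?: string;
}

const DEFAULT_GROQ_BASE_URL = "https://api.groq.com/openai/v1";

function resolveBaseUrl(creds: Credentials): string {
  const url = creds.baseUrl ?? DEFAULT_GROQ_BASE_URL;
  // Trim any trailing slashes so later path joining stays predictable.
  return url.replace(/\/+$/, "");
}
```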
