# @agllama/mcp

v0.5.12
Connect Claude AI to AG-Llama project management using the Model Context Protocol (MCP). Manage issues, sprints, and boards directly from your AI assistant.
## What is AG-Llama?

AG-Llama is a modern, lightweight Jira alternative for agile teams. This MCP server lets Claude AI interact with your projects, create issues, manage sprints, and automate workflows without leaving your conversation.
## Installation

No installation required! Run directly with npx:

```
npx @agllama/mcp
```

## Quick Start
### 1. Get Your API Key

1. Sign up at agllama.onrender.com
2. Navigate to Settings > API Keys
3. Click "Generate API Key"
4. Copy the key (it is shown only once)
### 2. Configure Claude Desktop

Add the MCP server to your Claude Desktop configuration.

#### For Claude Desktop App

Edit your Claude Desktop config file:

- **macOS**: `~/Library/Application Support/Claude/claude_desktop_config.json`
- **Windows**: `%APPDATA%\Claude\claude_desktop_config.json`
- **Linux**: `~/.config/Claude/claude_desktop_config.json`

Add this configuration:
```json
{
  "mcpServers": {
    "llama": {
      "command": "npx",
      "args": ["-y", "@agllama/mcp"],
      "env": {
        "LLAMA_API_URL": "https://agllama-api.onrender.com",
        "LLAMA_API_KEY": "llm_xxxxxxxx"
      }
    }
  }
}
```

Replace `llm_xxxxxxxx` with your actual API key.
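A malformed config file is a common cause of silent startup failures. Before restarting Claude, you can sanity-check the JSON with Python's built-in validator (a generic check, not part of @agllama/mcp; the path shown is the macOS default, so adjust it for your OS):

```shell
# Validate the Claude Desktop config before restarting.
# Path is the macOS default; adjust for Windows/Linux (see above).
CONFIG="$HOME/Library/Application Support/Claude/claude_desktop_config.json"
if python3 -m json.tool "$CONFIG" > /dev/null 2>&1; then
  echo "config is valid JSON"
else
  echo "config is missing or has a JSON syntax error"
fi
```

A trailing comma after the last entry is the usual culprit; JSON does not allow them.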
#### For Claude Code (CLI)

If you're using Claude Code, add this to your project's `.mcp.json`:

```json
{
  "mcpServers": {
    "llama": {
      "command": "npx",
      "args": ["-y", "@agllama/mcp"],
      "env": {
        "LLAMA_API_URL": "https://agllama-api.onrender.com",
        "LLAMA_API_KEY": "llm_xxxxxxxx"
      }
    }
  }
}
```

Or use the Claude CLI:

```
claude mcp add llama -- npx -y @agllama/mcp
```

Then set the environment variables:

```
export LLAMA_API_URL=https://agllama-api.onrender.com
export LLAMA_API_KEY=llm_xxxxxxxx
```

### 3. Restart Claude
Restart Claude Desktop or Claude Code to load the MCP server.
### 4. Test the Connection

In Claude, try:

```
Use llama to test the connection
```

You should see your user info and available organizations.
## Usage Examples

### Get Project Overview

```
Use llama to get the context for my-team/PROJ
```

Claude will show you the active sprint, backlog, workflow statuses, and team members.
### Create Issues

```
Use llama to create a high-priority story in my-team/PROJ:
- Summary: "Add user authentication"
- Description: "Implement JWT-based auth with refresh tokens"
```

### Manage Sprints
```
Use llama to:
1. Show me the backlog for my-team/PROJ
2. Create a new sprint called "Sprint 5"
3. Add the top 3 highest-priority issues to it
4. Start the sprint
```

### Search Issues
```
Use llama to find all critical bugs assigned to me in my-team
```

### Board Operations
```
Use llama to show me the board for my-team/PROJ
```

### Automate Workflows
```
Use llama to suggest a workflow for "sprint planning"
```

Claude will find and execute saved workflows from your organization.
## Available Tools

The MCP server provides 40+ tools organized into categories:

### Core Operations
- Context & Connection: Test connection, get project snapshots
- Organizations: List, create, and manage orgs
- Projects: Create and configure projects
- Issues: Full CRUD operations, status transitions, assignments
- Sprints: Create, start, complete sprints, manage sprint backlog
- Boards: View kanban boards, move issues between columns
### Collaboration
- Comments: Add, update, delete comments on issues
- Issue Links: Create relationships (blocks, relates to, duplicates)
- Labels: Create and manage issue labels
- Members: List team members
### Automation
- Search: Find issues with filters
- Workflows: Create and run reusable Claude workflows
- Help: Built-in documentation (`llama_help`)
See `MCP_TOOLS.md` for the full tool reference.

## Environment Variables
| Variable | Required | Description |
|----------|----------|-------------|
| `LLAMA_API_URL` | Yes | API endpoint (production: `https://agllama-api.onrender.com`) |
| `LLAMA_API_KEY` | Yes | Your API key from AG-Llama Settings |
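Both variables must be visible to the process that launches the server. A quick preflight check (a generic shell idiom, not something the package ships) fails fast if either is missing:

```shell
# Fail fast if a required variable is unset or empty.
check_env() {
  name="$1"
  eval "value=\${$name}"
  if [ -z "$value" ]; then
    echo "missing required variable: $name"
    return 1
  fi
}

if check_env LLAMA_API_URL && check_env LLAMA_API_KEY; then
  echo "environment looks good"
fi
```

Run this in the same shell session that starts Claude Code, since GUI apps like Claude Desktop read the `env` block from the config file instead of your shell profile.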
## Self-Hosted Setup
Running your own AG-Llama instance? Configure the MCP server with your local or custom API URL:
```json
{
  "mcpServers": {
    "llama": {
      "command": "npx",
      "args": ["-y", "@agllama/mcp"],
      "env": {
        "LLAMA_API_URL": "http://localhost:3001",
        "LLAMA_API_KEY": "your-api-key-here"
      }
    }
  }
}
```

## Troubleshooting
### "Connection failed" error

- Verify your API key is correct
- Check that `LLAMA_API_URL` is set to `https://agllama-api.onrender.com`
- Ensure you've restarted Claude after configuration changes
### "Invalid API key" error

- Generate a new API key at agllama.onrender.com/settings/api-keys
- Make sure the key starts with `llm_`
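Since valid keys carry the `llm_` prefix, a quick shell check (an illustration, not a tool shipped with this package) can catch a truncated or mis-pasted key before you touch the config:

```shell
# Catch a truncated or mis-pasted key early: valid keys start with "llm_".
KEY="llm_xxxxxxxx"   # replace with the key you copied
case "$KEY" in
  llm_*) echo "key format looks OK" ;;
  *)     echo "unexpected key format: keys should start with llm_" ;;
esac
```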
### Tools not showing up

- Confirm the MCP server is configured correctly in your Claude Desktop config
- Check Claude's MCP server logs for errors
- Try running `npx @agllama/mcp` directly to test
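On macOS, Claude Desktop writes per-server MCP logs under `~/Library/Logs/Claude/`; other platforms use different locations, so treat the path below as an assumption for your setup. A glance at the most recent entries often pinpoints the failure:

```shell
# List MCP log files and show the tail of each (macOS default location).
LOG_DIR="$HOME/Library/Logs/Claude"
if ls "$LOG_DIR"/mcp*.log > /dev/null 2>&1; then
  tail -n 20 "$LOG_DIR"/mcp*.log
else
  echo "no MCP logs found in $LOG_DIR"
fi
```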
### Rate limiting

The production API has rate limits. If you're hitting them, consider:

- Batching operations
- Using project context (`llama_context`) to cache information
- Self-hosting AG-Llama for unlimited access
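If you do hit a rate limit, retrying with exponential backoff usually clears it. The helper below is a generic sketch (the function name and delay schedule are illustrative, not part of @agllama/mcp); swap `true` for the command you actually want to retry, such as a `curl` call against the API:

```shell
# Retry a command up to $1 times, doubling the wait between attempts.
retry_with_backoff() {
  max="$1"; shift
  attempt=1
  delay=1
  while ! "$@"; do
    if [ "$attempt" -ge "$max" ]; then
      echo "giving up after $max attempts"
      return 1
    fi
    sleep "$delay"
    delay=$((delay * 2))
    attempt=$((attempt + 1))
  done
}

# Demo with a command that succeeds immediately:
retry_with_backoff 3 true && echo "succeeded"
```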
## Links

- Web App: agllama.onrender.com
- API Documentation: agllama-api.onrender.com
- Full Tool Reference: `MCP_TOOLS.md`
## License

MIT License - see the LICENSE file for details.
## Support

- Questions: Use the `llama_help` tool in Claude for built-in documentation
Built with the Model Context Protocol by Anthropic.
