

@agllama/mcp


Connect Claude AI to AG-Llama project management using the Model Context Protocol (MCP). Manage issues, sprints, and boards directly from your AI assistant.

What is AG-Llama?

AG-Llama is a modern, lightweight Jira alternative for agile teams. This MCP server lets Claude AI interact with your projects, create issues, manage sprints, and automate workflows without leaving your conversation.

Installation

No installation required! Run directly with npx:

npx @agllama/mcp
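
If you want reproducible behavior across machines, you can also pin the version explicitly when invoking npx (0.5.12 is simply the version published at the time of writing; any published version works the same way):

npx @agllama/mcp@0.5.12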

Quick Start

1. Get Your API Key

  1. Sign up at agllama.onrender.com
  2. Navigate to Settings > API Keys
  3. Click "Generate API Key"
  4. Copy the key (shown only once)

2. Configure Claude Desktop

Add the MCP server to your Claude Desktop configuration:

For Claude Desktop App

Edit your Claude Desktop config file:

  • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
  • Windows: %APPDATA%\Claude\claude_desktop_config.json
  • Linux: ~/.config/Claude/claude_desktop_config.json

Add this configuration:

{
  "mcpServers": {
    "llama": {
      "command": "npx",
      "args": ["-y", "@agllama/mcp"],
      "env": {
        "LLAMA_API_URL": "https://agllama-api.onrender.com",
        "LLAMA_API_KEY": "llm_xxxxxxxx"
      }
    }
  }
}

Replace llm_xxxxxxxx with your actual API key.
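
If Claude Desktop does not pick up the server, a frequent culprit is a JSON syntax slip such as a trailing comma. A quick way to confirm the file parses as valid JSON (macOS path shown; substitute the path for your OS) is:

python3 -m json.tool "$HOME/Library/Application Support/Claude/claude_desktop_config.json"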

For Claude Code (CLI)

If you're using Claude Code, add to your project's .mcp.json:

{
  "mcpServers": {
    "llama": {
      "command": "npx",
      "args": ["-y", "@agllama/mcp"],
      "env": {
        "LLAMA_API_URL": "https://agllama-api.onrender.com",
        "LLAMA_API_KEY": "llm_xxxxxxxx"
      }
    }
  }
}

Or use the Claude CLI:

claude mcp add llama -- npx -y @agllama/mcp

Then set environment variables:

export LLAMA_API_URL=https://agllama-api.onrender.com
export LLAMA_API_KEY=llm_xxxxxxxx
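
These exports only last for the current shell session. To keep them across sessions, append them to your shell profile (~/.zshrc is shown as an example; use whichever profile file your shell reads):

echo 'export LLAMA_API_URL=https://agllama-api.onrender.com' >> ~/.zshrc
echo 'export LLAMA_API_KEY=llm_xxxxxxxx' >> ~/.zshrc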

3. Restart Claude

Restart Claude Desktop or Claude Code to load the MCP server.

4. Test the Connection

In Claude, try:

Use llama to test the connection

You should see your user info and available organizations.

Usage Examples

Get Project Overview

Use llama to get the context for my-team/PROJ

Claude will show you the active sprint, backlog, workflow statuses, and team members.

Create Issues

Use llama to create a high-priority story in my-team/PROJ:
- Summary: "Add user authentication"
- Description: "Implement JWT-based auth with refresh tokens"

Manage Sprints

Use llama to:
1. Show me the backlog for my-team/PROJ
2. Create a new sprint called "Sprint 5"
3. Add the top 3 highest-priority issues to it
4. Start the sprint

Search Issues

Use llama to find all critical bugs assigned to me in my-team

Board Operations

Use llama to show me the board for my-team/PROJ

Automate Workflows

Use llama to suggest a workflow for "sprint planning"

Claude will find and execute saved workflows from your organization.

Available Tools

The MCP server provides 40+ tools organized into categories:

Core Operations

  • Context & Connection: Test connection, get project snapshots
  • Organizations: List, create, and manage orgs
  • Projects: Create and configure projects
  • Issues: Full CRUD operations, status transitions, assignments
  • Sprints: Create, start, complete sprints, manage sprint backlog
  • Boards: View kanban boards, move issues between columns

Collaboration

  • Comments: Add, update, delete comments on issues
  • Issue Links: Create relationships (blocks, relates to, duplicates)
  • Labels: Create and manage issue labels
  • Members: List team members

Automation

  • Search: Find issues with filters
  • Workflows: Create and run reusable Claude workflows
  • Help: Built-in documentation (llama_help)

See MCP_TOOLS.md for the full tool reference.
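
Because the tools share project and issue context, a single request can mix categories. For example (the project key, label, and issue details below are placeholders, not values from your workspace):

Use llama to:
1. Find all open bugs labeled "api" in my-team/PROJ
2. Add the label "needs-triage" to each one
3. Comment on each bug asking the assignee for a status update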

Environment Variables

| Variable        | Required | Description                                                   |
|-----------------|----------|---------------------------------------------------------------|
| LLAMA_API_URL   | Yes      | API endpoint (production: https://agllama-api.onrender.com)  |
| LLAMA_API_KEY   | Yes      | Your API key from AG-Llama Settings                           |

Self-Hosted Setup

Running your own AG-Llama instance? Configure the MCP server with your local or custom API URL:

{
  "mcpServers": {
    "llama": {
      "command": "npx",
      "args": ["-y", "@agllama/mcp"],
      "env": {
        "LLAMA_API_URL": "http://localhost:3001",
        "LLAMA_API_KEY": "your-api-key-here"
      }
    }
  }
}
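
Before restarting Claude, it can save time to confirm the self-hosted API is actually reachable from your machine (a plain HTTP check; adjust the host and port to whatever your instance listens on):

curl -i http://localhost:3001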

Troubleshooting

"Connection failed" error

  • Verify your API key is correct
  • Check that LLAMA_API_URL is set to https://agllama-api.onrender.com
  • Ensure you've restarted Claude after configuration changes
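
A quick way to rule out network problems is to check that the API host responds at all. This only tests reachability, not your key; any HTTP response, even an error status, means the network path is fine:

curl -i https://agllama-api.onrender.com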

"Invalid API key" error

Tools not showing up

  • Confirm the MCP server is configured correctly in Claude Desktop config
  • Check Claude's MCP server logs for errors
  • Try running npx @agllama/mcp directly to test
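
When you run the server by hand, note that a stdio-based MCP server typically waits for a client to connect, so an apparent "hang" with little or no output is normal; an immediate crash or error message is what indicates a problem. A minimal sketch of a manual run, with the same environment variables the Claude config would supply:

LLAMA_API_URL=https://agllama-api.onrender.com LLAMA_API_KEY=llm_xxxxxxxx npx -y @agllama/mcp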

Rate limiting

The production API has rate limits. If you're hitting limits, consider:

  • Batching operations
  • Using project context (llama_context) to cache information
  • Self-hosting AG-Llama for unlimited access


License

MIT License - see LICENSE file for details.

Support

  • Questions: Use the llama_help tool in Claude for built-in documentation

Built with the Model Context Protocol by Anthropic.