Luunix
MCP Runtime + Package Manager + Memory Layer - One MCP to rule them all
Luunix is a runtime and package manager for Model Context Protocol (MCP) servers. It allows you to dynamically load, unload, and manage multiple MCPs through a single interface, with a built-in memory layer for context management.
Supported AI Clients
Luunix supports 7 popular AI coding assistants:
| Client | Config Location | Status |
|--------|----------------|--------|
| Claude Code | ~/.claude.json | Full Support |
| Cursor | .cursor/mcp.json | Full Support |
| Windsurf | ~/.codeium/windsurf/mcp_config.json | Full Support |
| Zed | ~/.config/zed/settings.json | Full Support |
| Continue | ~/.continue/config.json | Full Support |
| Cline | VS Code Settings | Full Support |
| Codex CLI | ~/.codex/config.yaml | Full Support |
Why Luunix?
Token Savings
Traditional approach: All MCP tools are exposed to the LLM in every API request.
- 50 tools × ~150 tokens/tool = ~7,500 tokens per turn
Luunix approach: Only 10 core tools, load MCPs on-demand, store large responses in memory.
- 10 tools × ~150 tokens = ~1,500 tokens per turn
- Saves ~80% of tool schema tokens
- Memory layer further reduces context by storing large MCP responses
How It Works
┌─────────────────────────────────────────────────────────┐
│ Claude Code │
│ │
│ Only sees 10 Luunix tools: │
│ • luunix.list • luunix.load • luunix.call │
│ • luunix.unload • luunix.status │
│ • luunix.store • luunix.recall • luunix.forget│
│ • luunix.compress • luunix.memory_stats │
└─────────────────────────────────────────────────────────┘
│
▼
┌─────────────────────────────────────────────────────────┐
│ Luunix Server │
│ │
│ MCP Runtime: │
│ Manages 100+ MCPs internally, load on-demand │
│ │
│ Memory Layer: │
│ SQLite + Local Embedding (all-MiniLM-L6-v2) │
│ Store large responses → Return summaries │
│ Semantic search for context recall │
└─────────────────────────────────────────────────────────┘
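Conceptually, luunix.load starts a sub-MCP and returns its tool schemas, luunix.call proxies a request into it, and luunix.unload shuts it down. The TypeScript sketch below illustrates that dispatch pattern only; the class and method names are invented for this example and are not Luunix's actual internals.

```ts
// Illustrative only: a minimal dynamic-loading proxy in the spirit of the
// diagram above. None of these names come from the Luunix codebase.
type ToolSchema = { name: string; parameters: Record<string, string> };

interface SubMcpClient {
  listTools(): Promise<ToolSchema[]>;
  callTool(tool: string, args: unknown): Promise<unknown>;
  close(): Promise<void>;
}

class McpRuntime {
  private loaded = new Map<string, SubMcpClient>();

  constructor(private connect: (name: string) => Promise<SubMcpClient>) {}

  // "load": start the sub-MCP (if needed) and return its full tool schemas
  async load(name: string): Promise<ToolSchema[]> {
    if (!this.loaded.has(name)) {
      this.loaded.set(name, await this.connect(name));
    }
    return this.loaded.get(name)!.listTools();
  }

  // "call": proxy a tool call to an already-loaded sub-MCP
  async call(mcp: string, tool: string, args: unknown): Promise<unknown> {
    const client = this.loaded.get(mcp);
    if (!client) throw new Error(`MCP "${mcp}" is not loaded; call load() first`);
    return client.callTool(tool, args);
  }

  // "unload": shut the sub-MCP down and free its resources
  async unload(name: string): Promise<void> {
    await this.loaded.get(name)?.close();
    this.loaded.delete(name);
  }
}
```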
Features
- Token Efficient: Only 10 tools exposed, saves ~80% of schema tokens
- Dynamic Loading: Load/unload MCPs at runtime
- Full Schema Return: luunix.load returns complete parameter schemas
- Unified Proxy: Call any MCP tool via luunix.call
- Built-in Registry: Pre-configured popular MCPs
- Memory Layer: Store large MCP responses, recall via semantic search
- Local Embedding: No external API needed, runs entirely offline
Installation
npm install -g luunix
Quick Start
1. Initialize Luunix
luunix init
This will automatically detect and configure your AI client to use Luunix. Supports Claude Code, Cursor, Windsurf, Zed, Continue, Cline, and Codex CLI.
Interactive Mode
Run luunix without arguments to enter the interactive menu:
luunix
? What would you like to do?
❯ 🔍 Search & Install MCPs
📦 Manage Installed MCPs
⚙️ Configure AI Clients
📊 View Status
  ❌ Exit
2. Add MCPs to Registry
luunix add context7 github playwright
3. Use in Your AI Client
# Step 1: List available MCPs
luunix.list
# Step 2: Load an MCP (returns full tool schemas!)
luunix.load("context7")
# Returns:
# {
# "tools": [
# {
# "name": "query-docs",
# "parameters": {
# "libraryId": "string (required) - Context7 library ID",
# "query": "string (required) - Search query"
# }
# }
# ]
# }
# Step 3: Call the tool with correct parameters
luunix.call({
mcp: "context7",
tool: "query-docs",
args: { libraryId: "/pmndrs/zustand", query: "persist state" }
})
# Step 4: Store large responses in memory
luunix.store({
content: "[large MCP response...]",
tags: ["docs", "zustand"],
sourceMcp: "context7"
})
# Returns: { id: "abc-123", summary: "...", tokensSaved: 500 }
# Step 5: Recall context later via semantic search
luunix.recall({ query: "zustand persist", limit: 3 })
CLI Commands
luunix init
Initialize Luunix and configure your AI client.
luunix init
luunix init --force   # Force reconfiguration
luunix add <mcps...>
Add MCPs to your installation.
luunix add context7
luunix add context7 github playwright   # Add multiple at once
luunix remove <mcps...>
Remove MCPs from your installation.
luunix remove github
luunix rm context7 github   # 'rm' is an alias
luunix list
List MCPs.
luunix list # Show installed MCPs
luunix list --all # Show all available MCPs
luunix ls -a   # 'ls' is an alias
luunix search [keyword]
Search MCPs in remote registry (17,000+ available).
luunix search # List popular MCPs
luunix search github # Search for GitHub-related MCPs
luunix search --limit 50 # Show more results
luunix search --refresh   # Force refresh cache
luunix config
Show Luunix configuration.
luunix config
luunix config --path   # Show config file path only
luunix serve
Start the Luunix MCP server. This is called automatically by AI clients.
luunix serve
MCP Server Tools
When running as an MCP server, Luunix provides 10 tools:
Runtime Tools
luunix.list
List all available MCPs and their status.
Returns: MCP names, descriptions, install status, load status, available tools.
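The exact response format is not documented here, but based on the description above, one entry might look roughly like the following TypeScript shape (field names are illustrative guesses, not the real wire format):

```ts
// Hypothetical shape of one luunix.list entry -- field names are guesses
// based on the "Returns" description above, not the actual output.
interface McpListEntry {
  name: string;         // e.g. "context7"
  description: string;
  installed: boolean;   // added via `luunix add`
  loaded: boolean;      // currently running inside the Luunix server
  tools?: string[];     // tool names, available once the MCP is loaded
}
```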
luunix.load
Load an MCP and get its complete tool schemas.
Input:
{ "name": "context7" }Returns: Full tool definitions including parameter names, types, and descriptions. This allows the LLM to correctly call tools via luunix.call.
luunix.call
Call a tool from a loaded MCP.
Input:
{
"mcp": "context7",
"tool": "query-docs",
"args": {
"libraryId": "/pmndrs/zustand",
"query": "how to persist state"
}
}
Returns: The tool's response, proxied from the sub-MCP.
luunix.unload
Unload an MCP and free its resources.
Input:
{ "name": "context7" }luunix.status
Show Luunix server status with full tool schemas for all loaded MCPs.
Returns: Server info, loaded MCP count, total proxied tools, and complete tool schemas.
Memory Tools
luunix.store
Store content in the memory layer. Returns a summary and ID.
Input:
{
"content": "Large MCP response text...",
"tags": ["github", "issues"],
"sourceMcp": "github",
"sourceTool": "search_issues"
}
Returns:
{
"success": true,
"id": "a0fba3ec-20d5-4db9-821f-15ec5b95d19c",
"summary": "[~171 tokens, 17 lines] GitHub Issue #456...",
"tokensSaved": 115,
"tags": ["github", "issues"]
}
luunix.recall
Recall memories using semantic search.
Input:
{
"query": "database performance issue",
"limit": 5,
"threshold": 0.3,
"tags": ["github"],
"includeContent": false
}
Returns: Matching memories sorted by similarity score.
luunix.compress
Compress content to generate a summary without storing it.
Input:
{
"content": "Very long text...",
"maxLength": 200
}
Returns: Summary with token counts and compression ratio.
luunix.forget
Delete memory entries by ID or filter.
Input:
{
"id": "specific-memory-id",
"sourceMcp": "github",
"tags": ["outdated"],
"olderThanDays": 30
}
luunix.memory_stats
Show memory store statistics.
Returns:
{
"totalEntries": 42,
"totalTokens": 15000,
"totalTags": 12,
"bySource": { "github": 20, "notion": 22 }
}
Built-in MCPs
| Name | Description |
|------|-------------|
| context7 | Get up-to-date documentation and code examples for any library |
| github | GitHub platform interaction - issues, PRs, repos |
| playwright | Browser automation - web scraping, screenshots, testing |
| filesystem | File system access - read and write local files |
| fetch | HTTP requests - fetch web content |
Configuration
Luunix Config
Stored at ~/.luunix/config.yaml:
installed:
- context7
- github
custom: {}
autoload: []
memory:
enabled: true
# dbPath: ~/.luunix/memory.db # Optional custom path
# embeddingModel: Xenova/all-MiniLM-L6-v2  # Optional custom model
Memory Layer
The memory layer uses:
- SQLite (~/.luunix/memory.db) for persistent storage
- Local Embedding (Xenova/all-MiniLM-L6-v2, 384 dimensions) for semantic search
- WAL mode for improved performance
First use will download the embedding model (~100MB). Subsequent calls are instant.
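Under the hood this is ordinary embedding-based retrieval: embed the stored text, embed the query, and rank by cosine similarity. The sketch below shows the idea using the @xenova/transformers package that provides this model; it is an illustration of the technique, not Luunix's actual code.

```ts
import { pipeline } from "@xenova/transformers";

// Load the local embedding model once (downloads ~100MB on first use).
const embed = await pipeline("feature-extraction", "Xenova/all-MiniLM-L6-v2");

// Turn text into a 384-dimensional vector.
async function embedText(text: string): Promise<number[]> {
  const output = await embed(text, { pooling: "mean", normalize: true });
  return Array.from(output.data as Float32Array);
}

// With normalized vectors, cosine similarity is just a dot product.
const dot = (a: number[], b: number[]) => a.reduce((s, x, i) => s + x * b[i], 0);

// Recall: embed the query, rank stored entries by similarity, keep the top-k.
async function recall(
  query: string,
  memories: { id: string; vector: number[] }[],
  limit = 5,
) {
  const q = await embedText(query);
  return memories
    .map((m) => ({ id: m.id, score: dot(q, m.vector) }))
    .sort((a, b) => b.score - a.score)
    .slice(0, limit);
}
```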
Claude Code Config
Luunix is configured in ~/.claude.json under mcpServers:
{
"mcpServers": {
"luunix": {
"command": "node",
"args": ["/path/to/luunix/bin/luunix.js", "serve"],
"type": "stdio"
}
}
}
Or if installed globally:
{
"mcpServers": {
"luunix": {
"command": "luunix",
"args": ["serve"],
"type": "stdio"
}
}
}
Cursor
Luunix is configured in .cursor/mcp.json in your project directory.
Custom MCPs
You can add custom MCP servers in ~/.luunix/config.yaml. See Custom MCP Guide for details.
Documentation
- Architecture - System design and module overview
- Custom MCP Guide - How to add your own MCPs
- Troubleshooting - Common issues and solutions
Token Savings Calculation
| Scenario | Traditional | Luunix | Savings |
|----------|-------------|--------|---------|
| 10 MCPs × 5 tools | ~7,500 tokens/turn | ~1,500 tokens/turn | 80% |
| 10-turn conversation | ~75,000 tokens | ~16,500 tokens | 78% |
| 100 MCPs available | ~75,000 tokens/turn | ~1,500 tokens/turn | 98% |
| Large MCP response | ~2,000 tokens | ~200 tokens (summary) | 90% |
The key insight: tool schemas in the system prompt are re-sent on every turn, whereas Luunix moves schemas into the conversation history as a one-time cost. The memory layer further reduces context by replacing large responses with summaries.
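The table rows follow from that reasoning. Here is a rough reconstruction of the 10-turn row, using the same ~150 tokens/tool estimate and assuming the one-time schema cost is the ~1,500 tokens of one loaded MCP's schemas (an assumption that happens to reproduce the 16,500 figure):

```ts
// Rough reconstruction of the table above. 150 tokens/tool and 10 turns are
// the same estimates used throughout this README.
const PER_TOOL = 150;
const TURNS = 10;

// Traditional: every tool schema rides along in the system prompt each turn.
const traditional = (tools: number) => tools * PER_TOOL * TURNS;

// Luunix: 10 core tools per turn, plus a one-time cost when a loaded MCP's
// schemas enter the conversation history (assumed here to be ~1,500 tokens).
const luunix = 10 * PER_TOOL * TURNS + 10 * PER_TOOL;

console.log(traditional(50)); // 75,000 tokens for 10 MCPs x 5 tools
console.log(luunix);          // 16,500 tokens -> ~78% saved
console.log(500 * PER_TOOL);  // 75,000 tokens per turn if 100 MCPs x 5 tools were all exposed
```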
License
MIT
