recursa-mcp
v0.1.10
Git-Native AI agent with MCP protocol support
Recursa MCP: The Git-Native Memory Layer for Local-First LLMs
[Project Status: Active Development]
TL;DR: Recursa MCP gives your AI a perfect, auditable memory that lives and grows in your local filesystem. It's an open-source Model Context Protocol (MCP) server that uses your Logseq/Obsidian graph as a dynamic, version-controlled knowledge base. Your AI's brain becomes a plaintext repository you can grep, edit, and commit.
Forget wrestling with databases or opaque cloud APIs. This is infrastructure-free, plaintext-first memory for agents that create.
The Problem: Agent Amnesia & The RAG Ceiling
You're building an intelligent agent and have hit the memory wall. The industry's current solutions are fundamentally flawed, leading to agents that can't truly learn or evolve:
- Vector DBs (RAG): A read-only librarian. It's excellent for retrieving existing facts but is structurally incapable of creating new knowledge, forming novel connections, or evolving its understanding based on new interactions.
- Opaque Self-Hosted Engines: You're lured by "open source" but are now a part-time DevOps engineer, managing Docker containers and databases instead of focusing on intelligence.
- Black-Box APIs: You trade infrastructure pain for a vendor's prison. Your AI's memory is locked away, inaccessible to your tools, and impossible to truly audit.
Recursa is built on a different philosophy: Your AI's memory should be a dynamic, transparent, and versionable extension of its own thought process, running entirely on your machine.
The Recursa Philosophy: Core Features
Recursa isn't a database; it's a reasoning engine. It treats a local directory of plaintext files—ideally a Git repository—as the agent's primary memory.
- Git-Native Memory: Every change is a `git commit`. You get a perfect, auditable history. Branch memory, merge concepts, and revert to previous states.
- Plaintext Supremacy: The AI's brain is a folder of markdown files. Compatible with Obsidian and Logseq.
- Think-Act-Commit Loop: The agent reasons, generates TypeScript code to modify memory, executes it in a secure sandbox, and commits the result.
- Safety Checkpoints: Agents can call `mem.saveCheckpoint()` before complex operations and `mem.revertToLastCheckpoint()` if they fail.
- Token-Aware: Tools like `mem.getTokenCount()` help the agent manage context limits efficiently.
- Cross-Platform & Mobile Ready: Runs on Linux, macOS, Windows, and Android via Termux.
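The checkpoint workflow above can be sketched with a minimal in-memory mock of the `mem` API. The method names come from the tools list below, but the signatures and internals here are illustrative assumptions, not Recursa's actual implementation:

```typescript
// Minimal in-memory mock of the checkpoint API (hypothetical signatures;
// the real Recursa `mem` object is backed by files and Git).
class MemMock {
  private files = new Map<string, string>();
  private checkpoint: Map<string, string> | null = null;

  writeFile(path: string, content: string): void {
    this.files.set(path, content);
  }
  readFile(path: string): string | undefined {
    return this.files.get(path);
  }
  saveCheckpoint(): void {
    // Snapshot the current state so a failed operation can be undone.
    this.checkpoint = new Map(this.files);
  }
  revertToLastCheckpoint(): void {
    if (this.checkpoint) this.files = new Map(this.checkpoint);
  }
}

const mem = new MemMock();
mem.writeFile("notes/idea.md", "- original note");
mem.saveCheckpoint();

try {
  // A complex, multi-step edit that fails halfway through...
  mem.writeFile("notes/idea.md", "- half-rewritten");
  throw new Error("LLM produced invalid output");
} catch {
  mem.revertToLastCheckpoint(); // memory is restored intact
}

console.log(mem.readFile("notes/idea.md")); // "- original note"
```

The same guard-then-revert pattern applies when the agent edits many files at once: one checkpoint covers the whole operation.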
How It Works: Architecture
Recursa is a local, stateless server that acts as a bridge between your MCP client (e.g., Claude Desktop, custom tools), an LLM, and your local knowledge graph.
```mermaid
graph TD
    subgraph Your Local Machine
        A[MCP Client]
        B[Recursa MCP Server]
        C(Logseq/Obsidian Graph)
        A -- 1. User Query via Stdio --> B
        B -- 2. Think-Act-Commit Loop --> D{LLM API}
        B -- 3. Executes Sandboxed Code --> C
        C -- 4. Reads/Writes .md files --> C
        B -- 5. Final Reply & Notifications --> A
    end
    subgraph Cloud Service
        D[OpenRouter / LLM Provider]
    end
    style C fill:#e6f3ff,stroke:#333,stroke-width:2px
    style B fill:#fff2cc,stroke:#333,stroke-width:2px
```
- Query via MCP: Client sends a query.
- Think-Act Loop: Recursa plans using the LLM.
- Generate & Execute: The LLM generates TypeScript code; Recursa runs it in a Node.js VM sandbox.
- Interact with Files: The code uses the `mem` API to read/write markdown files.
- Commit & Reply: The agent commits changes to Git and replies to the user.
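The "Generate & Execute" step can be illustrated with Node's built-in `vm` module, which the document names as the sandbox mechanism. This is only a sketch: the stand-in `mem` object and the generated snippet are hypothetical, and the real server's sandbox setup is more involved:

```typescript
import vm from "node:vm";

// Stand-in `mem` object exposed to generated code (hypothetical;
// the real API surface is richer).
const files = new Map<string, string>([["notes/todo.md", "- ship v0.1"]]);
const mem = {
  readFile: (p: string) => files.get(p) ?? "",
  writeFile: (p: string, c: string) => void files.set(p, c),
};

// Code the LLM might generate in the Think-Act step.
const generated = `
  const note = mem.readFile("notes/todo.md");
  mem.writeFile("notes/todo.md", note + "\\n- write docs");
`;

// Run it in an isolated context with a hard timeout, so runaway
// generated code cannot block the server.
const context = vm.createContext({ mem });
vm.runInContext(generated, context, { timeout: 1000 });

console.log(files.get("notes/todo.md")); // "- ship v0.1\n- write docs"
```

Because the context only contains what the host explicitly exposes, the generated code can touch memory through `mem` but nothing else on the machine.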
🚀 Getting Started
Prerequisites
- Node.js (v20+ recommended)
- A local Logseq or Obsidian graph (a folder of `.md` files)
- An OpenRouter.ai API key
1. Installation
Option 1: Install via npm (Recommended)
```bash
npm install -g recursa-mcp
```
Option 2: Clone and build from source
```bash
git clone https://github.com/recursa-hq/recursa-doc.git
cd recursa-doc
npm install
```
2. Configuration
Create a `.env` file:
```bash
cp .env.example .env
```
Edit `.env`:
```bash
# Required: Your OpenRouter API Key
OPENROUTER_API_KEY="sk-or-..."

# Required: The ABSOLUTE path to your graph's directory
KNOWLEDGE_GRAPH_PATH="/path/to/your/notes"

# Optional: Transport type - 'stdio' for MCP clients, 'sse' for development/testing
TRANSPORT_TYPE="stdio"

# Optional: Model selection
LLM_MODEL="anthropic/claude-3-haiku-20240307"
```
3. Building and Running
Standard Development:
```bash
# Build the project
npm run build

# Start the server (defaults to stdio mode)
npm start
```
For Termux (Android):
Recursa is optimized for mobile devices running Termux.
```bash
# Install dependencies with Termux compatibility
npm run install:termux

# Build for Termux
npm run build:termux

# Start the server
npm run start:termux
```
4. Testing with MCP Inspector
For development and debugging, you can test the server using the MCP Inspector:
For Stdio Transport (default):
```bash
# Test with CLI mode
TRANSPORT_TYPE=stdio npx @modelcontextprotocol/inspector --cli node dist/server.js --method tools/list

# Test with UI mode
TRANSPORT_TYPE=stdio npx @modelcontextprotocol/inspector node dist/server.js
```
For SSE Transport (development mode):
```bash
# Set transport type in .env file
TRANSPORT_TYPE=sse

# Then test with inspector
npx @modelcontextprotocol/inspector node dist/server.js
```
Important Notes:
- Stdio transport requires the explicit `TRANSPORT_TYPE=stdio` environment variable
- Ensure `OPENROUTER_API_KEY` and `KNOWLEDGE_GRAPH_PATH` are set in the environment
- CLI mode is useful for automated testing and debugging
5. Connecting an MCP Client
Recursa runs as an MCP server over Stdio. Configure your MCP client (like Claude Desktop) to run the startup command:
For npm-installed version:
```json
{
  "mcpServers": {
    "recursa": {
      "command": "recursa-mcp",
      "env": {
        "OPENROUTER_API_KEY": "sk-or-v1-your-key-here",
        "KNOWLEDGE_GRAPH_PATH": "/absolute/path/to/graph",
        "TRANSPORT_TYPE": "stdio",
        "LLM_MODEL": "anthropic/claude-3-haiku-20240307"
      }
    }
  }
}
```
Using npx (no installation required):
```json
{
  "mcpServers": {
    "recursa": {
      "command": "npx",
      "args": ["-y", "recursa-mcp@latest"],
      "env": {
        "OPENROUTER_API_KEY": "sk-or-v1-your-key-here",
        "KNOWLEDGE_GRAPH_PATH": "/absolute/path/to/graph",
        "TRANSPORT_TYPE": "stdio",
        "LLM_MODEL": "anthropic/claude-3-haiku-20240307"
      }
    }
  }
}
```
For source-built version:
```json
{
  "mcpServers": {
    "recursa": {
      "command": "node",
      "args": ["/path/to/recursa-doc/dist/server.js"],
      "env": {
        "OPENROUTER_API_KEY": "sk-or-v1-your-key-here",
        "KNOWLEDGE_GRAPH_PATH": "/absolute/path/to/graph",
        "TRANSPORT_TYPE": "stdio"
      }
    }
  }
}
```
Important Configuration Notes:
- `KNOWLEDGE_GRAPH_PATH` must be an absolute path (e.g., `/home/user/notes` or `C:\Users\user\notes`)
- `TRANSPORT_TYPE` should be set to `"stdio"` for MCP clients
- The knowledge graph directory will be created automatically if it doesn't exist
- Git will be initialized automatically in the knowledge graph directory
Troubleshooting: If you encounter connection errors, see TROUBLESHOOTING.md for detailed diagnostic steps.
🛠️ Implemented Tools
The agent has access to the following capabilities via the mem object:
- File Operations: `readFile`, `writeFile`, `updateFile` (atomic CAS), `deletePath`, `rename`, `fileExists`, `createDir`, `listFiles`
- Git Operations: `commitChanges`, `gitLog`, `gitDiff`, `getChangedFiles`
- Graph Operations: `queryGraph` (property & link queries), `getBacklinks`, `getOutgoingLinks`, `searchGlobal`
- State Management: `saveCheckpoint`, `revertToLastCheckpoint`, `discardChanges`
- Utilities: `getTokenCount`, `getGraphRoot`
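The graph operations work because backlinks in a Logseq/Obsidian graph are just `[[wiki-links]]` in plaintext. A simplified backlink lookup over in-memory pages might look like this; it is a sketch of the idea, not Recursa's actual implementation:

```typescript
// Simplified backlink resolution over in-memory markdown pages
// (illustrative only; page names and contents are made up).
const pages: Record<string, string> = {
  "projects.md": "Current focus: [[recursa]] and [[mcp]].",
  "journal.md": "Worked on [[recursa]] today.",
  "recursa.md": "Git-native memory layer.",
};

// Extract [[wiki-link]] targets from a page body.
function outgoingLinks(body: string): string[] {
  return [...body.matchAll(/\[\[([^\]]+)\]\]/g)].map((m) => m[1]);
}

// Pages whose bodies link to `target`.
function getBacklinks(target: string): string[] {
  return Object.entries(pages)
    .filter(([, body]) => outgoingLinks(body).includes(target))
    .map(([name]) => name)
    .sort();
}

console.log(getBacklinks("recursa")); // ["journal.md", "projects.md"]
```

Because the link structure lives in the files themselves, the same query works with `grep` or any editor, not just through the agent.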
🗺️ Roadmap
- [ ] Visualizer: A simple web UI to visualize the agent's actions and the knowledge graph's evolution.
- [ ] Multi-modal Support: Storing and referencing images.
- [ ] Agent-to-Agent Collaboration: Enabling two Recursa agents to collaborate via Git.
📜 License
MIT License.
Stop building infrastructure. Start building intelligence.
