git-slim
v1.0.0-slim.1.10
git MCP (59% fewer tokens). Quick setup: npx git-slim --setup
git-slim
Git MCP server optimized for AI assistants — Reduce context window tokens by 59.0% while keeping full functionality. Compatible with Claude, ChatGPT, Gemini, Cursor, and all MCP clients.
What is git-slim?
A token-optimized version of the Git Model Context Protocol (MCP) server.
The Problem
MCP tool schemas consume significant context window tokens. When AI assistants like Claude or ChatGPT load MCP tools, each tool definition takes up valuable context space.
The original mcp-server-git loads 12 tools consuming approximately 7,926 tokens — space you could otherwise use for actual conversation.
The Solution
git-slim intelligently groups 12 tools into 5 semantic operations, reducing token usage by 59.0% — with zero functionality loss.
Your AI assistant sees fewer, smarter tools. Every original capability remains available.
Performance
| Metric | Original | Slim | Reduction |
|--------|----------|------|-----------|
| Tools | 12 | 5 | -58% |
| Schema Tokens | 1,086 | 400 | -63.2% |
| Claude Code (est.) | ~7,926 | ~3,250 | ~-59.0% |
Benchmark Info
- Original: mcp-server-git@latest
- Schema tokens measured with tiktoken (cl100k_base)
- Claude Code estimate includes ~570 tokens/tool overhead
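One way to read the benchmark notes above: estimated context cost ≈ schema tokens + ~570 tokens of per-tool client overhead. Plugging in the table's numbers reproduces both Claude Code estimates:

```typescript
// Estimated context cost = schema tokens + per-tool client overhead (~570).
const estimate = (toolCount: number, schemaTokens: number): number =>
  schemaTokens + toolCount * 570;

console.log(estimate(12, 1086)); // original: 7926
console.log(estimate(5, 400));   // slim: 3250
```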
Quick Start
One-Command Setup (Recommended)
# Claude Desktop - auto-configure
npx git-slim --setup claude
# Cursor - auto-configure
npx git-slim --setup cursor
# Interactive mode (choose your client)
npx git-slim --setup
Done! Restart your app to use git.
CLI Tools (already have CLI?)
# Claude Code (creates .mcp.json in project root)
claude mcp add git -s project -- npx -y git-slim@latest
# Windows: use cmd /c wrapper
claude mcp add git -s project -- cmd /c npx -y git-slim@latest
# VS Code (Copilot, Cline, Roo Code)
code --add-mcp '{"name":"git","command":"npx","args":["-y","git-slim@latest"]}'
Manual Setup
Claude Desktop
Add to your claude_desktop_config.json:
| OS | Path |
|----|------|
| Windows | %APPDATA%\Claude\claude_desktop_config.json |
| macOS | ~/Library/Application Support/Claude/claude_desktop_config.json |
{
"mcpServers": {
"git": {
"command": "npx",
"args": ["-y", "git-slim@latest"]
}
}
}
Cursor
Add to .cursor/mcp.json (global) or <project>/.cursor/mcp.json (project):
{
"mcpServers": {
"git": {
"command": "npx",
"args": ["-y", "git-slim@latest"]
}
}
}
How It Works
MCPSlim acts as a transparent bridge between AI models and the original MCP server:
┌─────────────────────────────────────────────────────────────────┐
│ Without MCPSlim │
│ │
│ [AI Model] ──── reads 12 tool schemas ────→ [Original MCP] │
│ (~7,926 tokens loaded into context) │
├─────────────────────────────────────────────────────────────────┤
│ With MCPSlim │
│ │
│ [AI Model] ───→ [MCPSlim Bridge] ───→ [Original MCP] │
│ │ │ │ │
│ Sees 5 grouped Translates to Executes actual │
│ tools only original call tool & returns │
│ (~3,250 tokens) │
└─────────────────────────────────────────────────────────────────┘
How Translation Works
- AI reads slim schema — Only 5 grouped tools instead of 12
- AI calls grouped tool — e.g., mutation({ action: "commit", ... })
- MCPSlim translates — Converts to the original call: git_commit({ ... })
- Original MCP executes — Real server processes the request
- Response returned — Result passes back unchanged
Zero functionality loss. 59.0% token savings.
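The translation step above can be sketched as a lookup from a grouped action to the original tool name. The mapping table and action names below are assumptions for illustration, not git-slim's actual internals:

```typescript
// Hypothetical action -> original-tool map for the "mutation" group.
// git-slim's real tables and action names may differ.
const mutationMap: Record<string, string> = {
  commit: "git_commit",
  add: "git_add",
  reset: "git_reset",
};

// Translate a grouped call into an original MCP tool call,
// forwarding all remaining arguments unchanged.
function translateMutation(
  args: { action: string } & Record<string, unknown>
): { tool: string; args: Record<string, unknown> } {
  const { action, ...rest } = args;
  const tool = mutationMap[action];
  if (!tool) throw new Error(`Unknown mutation action: ${action}`);
  return { tool, args: rest };
}

console.log(translateMutation({ action: "commit", message: "fix: typo" }));
```

Because the bridge only renames the call and forwards the arguments, the original server's behavior (and its response) is untouched.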
Available Tool Groups
| Group | Actions |
|-------|---------|
| mutation | 4 |
| navigation | 3 |
| query | 3 |
Plus 2 passthrough tools — tools that don't group well are kept as-is with optimized descriptions.
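For intuition, a grouped tool's schema is roughly a single tool definition whose action field enumerates the operations it covers. The action names below are assumed for illustration; git-slim's real schemas may differ:

```typescript
// Illustrative shape of one grouped tool definition (4 actions,
// matching the mutation row above); real names may differ.
const mutationTool = {
  name: "mutation",
  description: "State-changing git operations",
  inputSchema: {
    type: "object",
    properties: {
      action: { type: "string", enum: ["commit", "add", "reset", "checkout"] },
      // ...plus per-action parameters, e.g. a message field for commit
    },
    required: ["action"],
  },
};

console.log(mutationTool.inputSchema.properties.action.enum.length); // 4
```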
Compatibility
- ✅ Full functionality — All original mcp-server-git features preserved
- ✅ All AI assistants — Works with Claude, ChatGPT, Gemini, Copilot, and any MCP client
- ✅ Drop-in replacement — Same capabilities, just use grouped action names
- ✅ Tested — Schema compatibility verified via automated tests
FAQ
Does this reduce functionality?
No. Every original tool is accessible. Tools are grouped semantically (e.g., commit, add, reset → mutation), but all actions remain available via the action parameter.
Why do AI assistants need token optimization?
AI models have limited context windows. MCP tool schemas consume tokens that could be used for conversation, code, or documents. Reducing tool schema size means more room for actual work.
Is this officially supported?
MCPSlim is a community project. It wraps official MCP servers transparently — the original server does all the real work.
License
MIT
