@gesslar/lpc-mcp v1.6.1
LPC MCP Server
Bring LPC development into 2025 with AI-powered code intelligence.
This MCP (Model Context Protocol) server wraps the jlchmura/lpc-language-server and exposes it to AI assistants, enabling natural language queries about your LPC codebase with real language server-powered understanding.
What This Enables
AI assistants can now:
- Understand your LPC code structure through the language server
- Get real documentation from hover information
- Jump to definitions and find references
- Answer natural language questions about your mudlib
- Trace inheritance chains and function calls
- Explain complex code patterns
All through conversation, powered by actual code intelligence instead of pattern matching.
Features
- lpc_hover: Get documentation/hover information for symbols
- lpc_definition: Jump to the definition of a symbol
- lpc_references: Find all references to a symbol
- lpc_diagnostics: Get real-time errors, warnings, and hints from the language server
- Workspace-aware: Reads your `lpc-config.json` for proper symbol resolution
- Fast: Direct JSON-RPC communication with the language server
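As a concrete illustration, an MCP client invokes these tools with a `tools/call` request. This README does not show the tools' input schemas, so the argument field names below (`file`, `line`, `character`) are assumptions for illustration only:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "lpc_hover",
    "arguments": {
      "file": "std/room.c",
      "line": 42,
      "character": 8
    }
  }
}
```

The AI assistant builds requests like this itself; you only ever phrase the question in natural language.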
Prerequisites
1. Install Node.js
Node.js 20+ required:
```shell
node --version  # Should be v20.0.0 or higher
```

2. Install the LPC Language Server Extension
The extension must be installed in VS Code (the server binary is bundled with it):
```shell
code --install-extension jlchmura.lpc
```

Verify installation:
```shell
ls ~/.vscode/extensions/jlchmura.lpc-*/out/server/src/server.js
```

3. Install Dependencies
```shell
cd /path/to/lpc-mcp
npm install
```

4. Create lpc-config.json in Your Mudlib
The language server needs this config file at your mudlib root to understand includes, simul_efuns, etc.
Example /path/to/your/mudlib/lpc-config.json:
```json
{
  "driver": {
    "type": "fluffos"
  },
  "libFiles": {
    "master": "adm/obj/master.c",
    "simul_efun": "adm/obj/simul_efun.c",
    "global_include": "include/global.h"
  },
  "libInclude": [
    "include",
    "include/driver",
    "adm/include"
  ],
  "exclude": [
    ".git/",
    "tmp/"
  ]
}
```

Setup for Different AI Tools
Warp (Terminal)
Add to your Warp MCP configuration:
Location: Settings → AI → Model Context Protocol
```json
{
  "lpc": {
    "command": "node",
    "args": ["/absolute/path/to/lpc-mcp/index.js"],
    "env": {
      "LPC_WORKSPACE_ROOT": "/path/to/your/mudlib"
    }
  }
}
```

Important: Use absolute paths! Replace:

- `/absolute/path/to/lpc-mcp/index.js` with the actual path to this repo
- `/path/to/your/mudlib` with the directory containing your `lpc-config.json`
Restart Warp after adding the configuration.
Claude Desktop
Add to ~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or equivalent:
```json
{
  "mcpServers": {
    "lpc": {
      "command": "node",
      "args": ["/absolute/path/to/lpc-mcp/index.js"],
      "env": {
        "LPC_WORKSPACE_ROOT": "/path/to/your/mudlib"
      }
    }
  }
}
```

Restart Claude Desktop after configuration.
Cline (VS Code Extension)
Add to your Cline MCP settings:
```json
{
  "mcpServers": {
    "lpc": {
      "command": "node",
      "args": ["/absolute/path/to/lpc-mcp/index.js"],
      "env": {
        "LPC_WORKSPACE_ROOT": "/path/to/your/mudlib"
      }
    }
  }
}
```

GitHub Copilot (VS Code)
Prerequisites:
- Install the Copilot MCP extension:

```shell
code --install-extension automatalabs.copilot-mcp
```
Configuration:
Add to ~/Library/Application Support/Code/User/mcp.json (macOS) or equivalent:
```json
{
  "servers": {
    "lpc": {
      "type": "node",
      "command": "node",
      "args": ["/absolute/path/to/lpc-mcp/index.js"],
      "env": {
        "LPC_WORKSPACE_ROOT": "/path/to/your/mudlib"
      }
    }
  },
  "inputs": []
}
```

Other MCP-Compatible Tools
The configuration is the same for any MCP-compatible tool:
- Add the server to your MCP configuration
- Provide the Node.js command and the path to `index.js`
- Set the `LPC_WORKSPACE_ROOT` environment variable to your mudlib root
Optional environment variables (LPC_DEBUG, LPC_LSP_PATH) can be added to the env block in the same way. See the Environment Variables section for the full list.
Usage Examples
Once configured, you can ask your AI assistant natural language questions:
"What does the query_short() function do in room.c?"
→ AI uses lpc_hover to get documentation

"Where is STD_OBJECT defined?"
→ AI uses lpc_definition to find the file

"Find all places that call set_room_size()"
→ AI uses lpc_references to locate all callers

"Explain how the maze generation algorithm works"
→ AI reads code and uses hover info to understand functions

"What's the inheritance tree for rooms?"
→ AI traces inherit statements and jumps to definitions

"Check if this LPC file has any syntax errors"
→ AI uses lpc_diagnostics to validate the code

"Why won't this LPC code compile?"
→ AI checks diagnostics for errors like undeclared variables or type mismatches
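For the diagnostics case, the language server reports standard LSP `Diagnostic` objects (a zero-based range, a severity, and a message). The exact shape of this server's tool output is not documented here, so treat the following as a hypothetical sketch of what a result might contain:

```json
{
  "diagnostics": [
    {
      "range": {
        "start": { "line": 12, "character": 4 },
        "end": { "line": 12, "character": 9 }
      },
      "severity": 1,
      "message": "Variable 'count' is not declared"
    }
  ]
}
```

Severity 1 means Error in the LSP specification; warnings and hints use 2 and 4 respectively.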
Testing
To verify the server works:
```shell
# Set workspace root for testing
export LPC_WORKSPACE_ROOT=/path/to/your/mudlib

# Start the server (it will wait for MCP protocol messages)
node index.js
```

The server should output:

```
Starting LPC Language Server...
Initializing LSP...
LPC Language Server started successfully
LPC MCP Server running on stdio
```

Troubleshooting
Server won't start
Check that the LPC extension is installed:

```shell
code --list-extensions | grep jlchmura.lpc
```

Check the logs (for Warp):

```shell
tail -f ~/.local/state/warp-terminal/mcp/*.log
```

Language server not resolving symbols
Verify workspace root:
- Make sure `LPC_WORKSPACE_ROOT` points to the directory with `lpc-config.json`
- Use absolute paths, not relative

Check your lpc-config.json:

```shell
cat $LPC_WORKSPACE_ROOT/lpc-config.json
```

Extension version mismatch
If the extension path doesn't match, update line 40 in index.js:
```javascript
const lspPath = path.join(
  process.env.HOME,
  ".vscode/extensions/jlchmura.lpc-VERSION/out/server/src/server.js"
);
```

Find your version:

```shell
ls ~/.vscode/extensions/ | grep jlchmura.lpc
```

Environment Variables
| Variable | Required | Description |
| --- | --- | --- |
| LPC_WORKSPACE_ROOT | Yes | Absolute path to your mudlib root (the directory containing lpc-config.json) |
| LPC_DEBUG | No | Set to true or 1 to enable debug mode, which loads a local development build of the LPC language server instead of the installed VS Code extension |
| LPC_LSP_PATH | No | Absolute path to a custom server.js for the LPC language server. Only used when LPC_DEBUG is enabled. If not set, debug mode falls back to out/server/src/server.js relative to process.cwd() |
Debug Mode Example
Shell / terminal:
```shell
# Use a local checkout of the LPC language server
export LPC_DEBUG=true
export LPC_LSP_PATH=/path/to/lpc-language-server/out/server/src/server.js
export LPC_WORKSPACE_ROOT=/path/to/your/mudlib
node index.js
```

MCP config file (`.mcp.json`, `claude_desktop_config.json`, etc.):
```json
{
  "mcpServers": {
    "lpc": {
      "command": "node",
      "args": ["/absolute/path/to/lpc-mcp/index.js"],
      "env": {
        "LPC_WORKSPACE_ROOT": "/path/to/your/mudlib",
        "LPC_DEBUG": "true",
        "LPC_LSP_PATH": "/path/to/lpc-language-server/out/server/src/server.js"
      }
    }
  }
}
```

How It Works
```
AI Assistant
  ↓ (natural language)
MCP Protocol
  ↓ (tool calls: lpc_hover, lpc_definition, lpc_references)
This Server
  ↓ (JSON-RPC: textDocument/hover, etc.)
LPC Language Server
  ↓ (parses LPC, reads lpc-config.json)
Your Mudlib
```

- AI assistant sends MCP tool requests
- Server reads the file and sends `textDocument/didOpen` to the LSP
- Server translates MCP → LSP JSON-RPC requests
- LSP analyzes the code using your `lpc-config.json`
- Server returns the LSP response as an MCP result
- AI understands your code structure!
Credits
- John (jlchmura) - The INCOMPARABLY SKILLED MASTER PROGRAMMER whose LPC language server rescued LPC development from 1995. Without his greatness, kindness, and all-around hunk demeanour, we would still be `grep`-ing through mudlibs like cavemen. This MCP server is merely a humble wrapper around his genius.
- Model Context Protocol - The protocol making this possible
- Built in an hour of inspired hacking in 2025
License
@gesslar/lpc-mcp is released under the 0BSD license.
This package includes or depends on third-party components under their own licenses:
| Dependency | License |
| --- | --- |
| @gesslar/toolkit | 0BSD |
| @gesslar/uglier | 0BSD |
| @modelcontextprotocol/sdk | MIT |
| vscode-jsonrpc | MIT |
| zod | MIT |
Note
The LPC language server itself (jlchmura/lpc-language-server) is under its own license.
