# @ai-gentik/lm-studio-mcp

v0.1.3
MCP server that bridges Claude to LM Studio for local model inference via HTTP API.
## Installation

```shell
npm install @ai-gentik/lm-studio-mcp
```

## Usage
### With Claude Code
Add to your user scope (available globally):

```shell
claude mcp add --transport stdio lm-studio "npx @ai-gentik/lm-studio-mcp" --scope user
```

Or to a specific project:
```shell
cd your-project
claude mcp add --transport stdio lm-studio "npx @ai-gentik/lm-studio-mcp" --scope project
```

### Direct

```shell
npx @ai-gentik/lm-studio-mcp
```

## Requirements
- Node.js 18+
- LM Studio running locally (default: http://localhost:1234)
- LM Studio API server enabled
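Before wiring the server into Claude, it can help to confirm that the LM Studio API server is actually reachable. A minimal check (a hypothetical helper, not part of this package) against LM Studio's OpenAI-compatible `/v1/models` endpoint, using Node 18+'s built-in `fetch`:

```javascript
// Hypothetical helper: returns true if an LM Studio API server answers at `base`.
async function lmStudioUp(base = "http://localhost:1234") {
  try {
    const res = await fetch(`${base}/v1/models`);
    return res.ok;
  } catch {
    return false; // server not running or unreachable
  }
}

lmStudioUp().then((ok) =>
  console.log(ok ? "LM Studio is up" : "LM Studio is not reachable")
);
```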
## Configuration
Set environment variables to customize:
- `LM_STUDIO_API_URL` — LM Studio API endpoint (default: `http://localhost:1234/v1/`)
- `LM_STUDIO_TIMEOUT` — Request timeout in milliseconds (default: `300000`)
- `LM_STUDIO_CHAR_LIMIT` — Max characters in a response (default: `8000`)
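As a sketch (assuming the server resolves these variables with simple fallbacks to the documented defaults; the actual implementation may differ), the configuration might be read like this:

```javascript
// Illustrative sketch of env-var resolution with the documented defaults.
// This is not the package's actual code; the function name is hypothetical.
function resolveConfig(env = process.env) {
  return {
    apiUrl: env.LM_STUDIO_API_URL ?? "http://localhost:1234/v1/",
    timeoutMs: Number(env.LM_STUDIO_TIMEOUT ?? 300000),
    charLimit: Number(env.LM_STUDIO_CHAR_LIMIT ?? 8000),
  };
}

console.log(resolveConfig({}));
// { apiUrl: 'http://localhost:1234/v1/', timeoutMs: 300000, charLimit: 8000 }
```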
For example:

```shell
LM_STUDIO_API_URL=http://localhost:1234/v1/ npx @ai-gentik/lm-studio-mcp
```

## Tools
- `generate_completion` — Text generation from a prompt
- `chat_completion` — Chat-based conversation
- `list_models` — List available models
- `load_model` — Load a model into memory
- `unload_model` — Unload a model from memory
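Under the hood, tools like `chat_completion` presumably map onto LM Studio's OpenAI-compatible HTTP endpoints (`/v1/chat/completions`, `/v1/models`, and so on). A sketch of the request such a tool might send (the payload shape is assumed, not taken from this package's source):

```javascript
// Hypothetical illustration of the HTTP request behind a chat_completion call.
// The endpoint is LM Studio's OpenAI-compatible API; the exact payload this
// package sends may differ.
function buildChatRequest(model, messages, apiUrl = "http://localhost:1234/v1/") {
  return {
    url: new URL("chat/completions", apiUrl).toString(),
    options: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ model, messages }),
    },
  };
}

const req = buildChatRequest("my-local-model", [
  { role: "user", content: "Hello" },
]);
// Send with Node 18+'s built-in fetch:
//   const res = await fetch(req.url, req.options);
```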
## Development
```shell
npm install
npm run build
npm start
```

### Build
```shell
npm run build
```

Outputs the bundled server to `dist/index.js`, ready for distribution.
## License
MIT
