# mcp-local-llm

MCP server that uses a local LLM to respond to queries (binary distribution).
## Quick Start

Use directly with npx (no installation needed):

```bash
npx mcp-local-llm
```

## Installation

Install globally:

```bash
npm install -g mcp-local-llm
```

Then use:

```bash
mcp-local-llm
```

Or install as a dependency:

```bash
npm install mcp-local-llm
```

Then use:

```bash
npx mcp-local-llm
```

Or run the binary directly:

```bash
node_modules/.bin/mcp-local-llm
```
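After installing, you can check that the server starts and answers the MCP handshake by piping a JSON-RPC `initialize` request over stdin. This is a minimal smoke test, not part of this package's documented interface; the message shape follows the standard MCP stdio handshake, and the `clientInfo` values are placeholders:

```bash
# Minimal MCP smoke test: send an initialize request over stdio and
# print the server's JSON-RPC response. Values below are illustrative.
echo '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"smoke-test","version":"0.0.1"}}}' \
  | npx mcp-local-llm
```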
## Configuration

The server can be configured using environment variables:
```bash
export OLLAMA_URL=http://localhost:11434
export MODEL_NAME=llama3
export MAX_TOKENS=256
export TEMPERATURE=0.7
```
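Most MCP clients launch servers from a JSON configuration file. The sketch below uses the `mcpServers` layout that Claude Desktop reads from `claude_desktop_config.json` (adapt the file name and shape to your client); the server name `local-llm` is arbitrary, and the `env` block passes the same variables shown above:

```json
{
  "mcpServers": {
    "local-llm": {
      "command": "npx",
      "args": ["mcp-local-llm"],
      "env": {
        "OLLAMA_URL": "http://localhost:11434",
        "MODEL_NAME": "llama3",
        "MAX_TOKENS": "256",
        "TEMPERATURE": "0.7"
      }
    }
  }
}
```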
## Supported Platforms

- Linux (x64)
- Windows (x64)
- macOS (x64 and ARM64)
## Requirements
- Node.js 18 or higher
- Ollama installed and running (https://ollama.ai/); see the quick check below
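Before starting the server, you can verify both requirements from the shell (a sketch assuming Ollama's default port 11434 and the `llama3` model used in the configuration example):

```bash
# Check the Node.js version (must report v18 or higher).
node --version

# Check that Ollama is up; this lists locally available models.
curl -s http://localhost:11434/api/tags

# Pull the model named by MODEL_NAME if it is not installed yet.
ollama pull llama3
```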
## License
MIT
