# 📚 WebDocs MCP Server

**AI-Powered Documentation Search & Retrieval System**

A Model Context Protocol (MCP) server that fetches real-time documentation from popular libraries and frameworks using web search.
## 🌟 Features
- 🔍 Real-time Documentation Search - Fetches the latest docs from official sources
- 🚀 Multiple Library Support - Langchain, Llama-Index, OpenAI, UV, and Qubrid
- 🧹 Clean Content Extraction - Removes HTML noise and returns readable text
- 🔗 Source Attribution - Every piece of content includes its source URL
- ⚡ Fast & Async - Built with async/await for optimal performance
- 🤖 MCP Compatible - Works seamlessly with Claude Desktop and other MCP clients
*Claude Desktop usage demo screenshots*
## 📦 Supported Libraries
| Library | Documentation Site |
| ------------------ | ------------------------------ |
| 🦜 Langchain | python.langchain.com/docs |
| 🦙 Llama-Index | docs.llamaindex.ai/en/stable |
| 🤖 OpenAI | platform.openai.com/docs |
| 📦 UV | docs.astral.sh/uv |
| 🎯 Qubrid | docs.qubrid.com |
## 🛠️ Installation

### Quick Start (Recommended: NPM)

The easiest way to use this MCP server is via `npx`:

```shell
npx -y webdocs-mcp-server
```

This requires:
- Node.js 16+ installed
- UV package manager (see UV's installation guide)
### Manual Installation (For Development)

#### Prerequisites

- Python 3.10 or higher
- UV package manager

#### Setup

1. Clone the repository:

```shell
git clone <your-repo-url>
cd Docu_MCP
```

2. Install dependencies:

```shell
uv sync
```

3. Set up environment variables by creating a `.env` file in the project root:

```shell
SERPER_API_KEY=your_serper_api_key_here
GROQ_API_KEY=your_groq_api_key_here
```
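The server reads these keys from the environment at startup. A minimal sketch of how such loading might look (using `os.environ` directly; the project may instead use a helper like `python-dotenv`, and `load_required_key` is an illustrative name, not the project's actual function):

```python
import os

def load_required_key(name: str) -> str:
    """Fetch an API key from the environment, failing fast with a clear message."""
    value = os.environ.get(name, "").strip()
    if not value:
        raise RuntimeError(f"{name} is not set - add it to your .env file")
    return value

# SERPER_API_KEY is required for web search; GROQ_API_KEY is optional for now:
# serper_key = load_required_key("SERPER_API_KEY")
```

Failing fast like this makes a missing `.env` file show up as a clear error instead of a silent empty-search result.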
## 🔑 API Keys Setup

### 1. Serper API Key (Required)

The Serper API is used for web search functionality.

- Visit serper.dev
- Sign up for a free account
- Navigate to your dashboard
- Copy your API key
- Add it to your `.env` file

### 2. Groq API Key (Optional)

Currently reserved for future LLM integration features.

- Visit console.groq.com
- Sign up and get your API key
- Add it to your `.env` file
## 🚀 Usage

### Option 1: Using the MCP Client (Python)

Test the server directly with the included client:

```shell
uv run mcp_client.py
```

Example output:

```
Available Tools: ['get_docs']
[Documentation content with sources...]
```

You can customize the query in `mcp_client.py`:

```python
query = "How to setup ComfyUI AI ML Template?"
library = "qubrid"
res = await session.call_tool(
    "get_docs",
    arguments={"user_query": query, "library": library},
)
```

### Option 2: Using Claude Desktop
#### Step 1: Configure Claude Desktop

Open your Claude Desktop config file:

- Windows: `%APPDATA%\Claude\claude_desktop_config.json`
- macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
- Linux: `~/.config/Claude/claude_desktop_config.json`
**🚀 Recommended: NPM Installation (Remote)**

Add this configuration to use the published npm package:

```json
{
  "mcpServers": {
    "WebDocs": {
      "command": "npx",
      "args": ["-y", "webdocs-mcp-server"],
      "env": {
        "SERPER_API_KEY": "your_serper_api_key_here",
        "GROQ_API_KEY": "your_groq_api_key_here"
      }
    }
  }
}
```

**💻 Alternative: Local Development**
If you're developing locally, use this configuration:
```json
{
  "mcpServers": {
    "WebDocs": {
      "command": "C:\\Users\\YOUR_USERNAME\\.local\\bin\\uv.EXE",
      "args": ["--directory", "D:\\Path\\To\\Docu_MCP", "run", "mcp_server.py"],
      "env": {
        "SERPER_API_KEY": "your_serper_api_key_here",
        "GROQ_API_KEY": "your_groq_api_key_here"
      }
    }
  }
}
```

⚠️ **Important:**
- For the NPM method: just add your API keys.
- For the local method: replace `YOUR_USERNAME` and `D:\\Path\\To\\Docu_MCP` with your actual paths!
#### Step 2: Restart Claude Desktop
- Completely close Claude Desktop
- Reopen Claude Desktop
- The WebDocs tool should now be available
#### Step 3: Use the Tool

In Claude Desktop, you can ask:

> "Use the WebDocs tool to search for how to create a ReAct agent in Langchain"

Claude will automatically call the `get_docs` tool with the appropriate parameters.
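Under the hood, the tool invocation Claude sends is a standard MCP `tools/call` JSON-RPC request, roughly this shape (the query string here is illustrative):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "get_docs",
    "arguments": {
      "user_query": "how to create a ReAct agent",
      "library": "langchain"
    }
  }
}
```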
### Option 3: Using MCP Inspector (Debugging)

The MCP Inspector is a powerful tool for testing and debugging your MCP server.

#### Start the Inspector

```shell
npx @modelcontextprotocol/inspector uv --directory "D:\AbhiiiMan Codes\Docu_MCP" run mcp_server.py
```

This will:
- Start the MCP server
- Launch a web interface at `http://localhost:5173`
- Allow you to test the server interactively
#### Using the Inspector

1. Open your browser to the provided URL
2. Click "Connect" to establish a connection
3. Navigate to the "Tools" tab
4. Select the `get_docs` tool
5. Fill in the parameters:
   - `user_query`: your search query
   - `library`: one of `langchain`, `llama-index`, `openai`, `uv`, or `qubrid`
6. Click "Run Tool" to test
#### Troubleshooting

If the port is already in use:

```shell
# Find and kill the process using port 6277
netstat -ano | findstr :6277
taskkill /PID <process_id> /F
```

## 🔧 Available Tools
### `get_docs`

Searches and retrieves documentation content from supported libraries.

**Parameters:**

- `user_query` (string, required): The search query
  - Example: `"How to use Langchain with OpenAI"`
- `library` (string, required): The library to search
  - Options: `langchain`, `llama-index`, `openai`, `uv`, `qubrid`

**Returns:**

- Text content from the documentation, with source URLs
**Example:**

```python
await session.call_tool(
    "get_docs",
    arguments={
        "user_query": "vector store integration",
        "library": "langchain",
    },
)
```

## 📁 Project Structure
```
Docu_MCP/
├── 📄 mcp_server.py    # Main MCP server implementation
├── 📄 mcp_client.py    # Test client for the server
├── 📄 constants.py     # Configuration and constants
├── 📄 utils.py         # Utility functions (HTML cleaning)
├── 📄 test_server.py   # Server launch test script
├── 📄 .env             # Environment variables (create this)
├── 📄 pyproject.toml   # Project dependencies
└── 📄 README.md        # This file
```

## 🔍 How It Works
```mermaid
graph LR
    A[User Query] --> B[MCP Server]
    B --> C[Serper API]
    C --> D[Google Search]
    D --> E[Documentation URLs]
    E --> F[Fetch Content]
    F --> G[Clean HTML]
    G --> H[Return Text]
    H --> I[User/Claude]
```

1. **Query Construction**: Combines the library's documentation domain with the user query
2. **Web Search**: Uses the Serper API to search Google
3. **Content Fetching**: Retrieves raw HTML from the documentation pages
4. **Content Cleaning**: Extracts readable text using Trafilatura
5. **Response Formation**: Formats the content with source attribution
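The flow above can be sketched in a few lines. All names here are illustrative, not the project's actual identifiers (the real implementation lives in `mcp_server.py` and `constants.py`), and the Serper search call itself is omitted since it needs an API key:

```python
import urllib.request

# Illustrative mapping of library keys to documentation domains
DOCS_SITES = {"langchain": "python.langchain.com/docs"}

def build_search_query(library: str, user_query: str) -> str:
    """Step 1: scope the Google search to the library's documentation site."""
    return f"site:{DOCS_SITES[library]} {user_query}"

def fetch_and_clean(url: str) -> str:
    """Steps 3-4: download a documentation page and extract readable text.
    The project uses Trafilatura for extraction; shown here as optional."""
    html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "ignore")
    try:
        import trafilatura  # optional dependency
        return trafilatura.extract(html) or ""
    except ImportError:
        return html  # fall back to raw HTML if trafilatura is unavailable
```

The `site:` operator is what keeps results pinned to the official docs rather than blog posts or forum threads.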
## 🐛 Debugging & Testing

### Test Server Launch

Run the test script to verify your configuration:

```shell
uv run test_server.py
```

Expected output:

```
✅ Server process started successfully!
✅ Server is running and accepting connections!
```

### Common Issues
| Issue | Solution |
| ----------------------------- | ----------------------------------------------- |
| 🔴 PORT IS IN USE | Kill process on port 6277 or use different port |
| 🔴 SERPER_API_KEY not found | Check .env file exists and contains valid key |
| 🔴 program not found | Use full path to uv.EXE in Claude config |
### Enable Debug Logging

Add to your `.env`:

```shell
LOG_LEVEL=DEBUG
```

## 🎯 Example Use Cases
### 1. Learning a New Framework

```
Query: "Getting started with vector stores"
Library: "langchain"
→ Returns: Setup guides, installation steps, basic examples
```

### 2. Troubleshooting

```
Query: "Error handling in async queries"
Library: "llama-index"
→ Returns: Error handling patterns, best practices
```

### 3. API Reference

```
Query: "Chat completion parameters"
Library: "openai"
→ Returns: Parameter documentation, examples, limits
```

### 4. Tool Setup

```
Query: "Installing UV on Windows"
Library: "uv"
→ Returns: Installation guide, configuration steps
```

## 🤝 Contributing
Contributions are welcome! Here's how you can help:

- **Add More Libraries**: Update `constants.py` with new documentation sources
- **Improve Content Cleaning**: Enhance the HTML extraction in `utils.py`
- **Add Features**: Implement caching, rate limiting, or semantic search
- **Fix Bugs**: Report issues or submit pull requests
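Adding a library is likely a one-line change. A hypothetical shape for the mapping in `constants.py` (the actual variable name and structure may differ), built from the supported-libraries table above:

```python
# Hypothetical constants.py: library keys mapped to documentation domains
DOCS_URLS = {
    "langchain": "python.langchain.com/docs",
    "llama-index": "docs.llamaindex.ai/en/stable",
    "openai": "platform.openai.com/docs",
    "uv": "docs.astral.sh/uv",
    "qubrid": "docs.qubrid.com",
    # To add a new library, append its key and docs domain, e.g.:
    # "fastapi": "fastapi.tiangolo.com",
}
```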
## 📝 License
This project is licensed under the MIT License - see the LICENSE file for details.
## 🙏 Acknowledgments
- FastMCP - MCP server framework
- Serper - Web search API
- Trafilatura - Web content extraction
- Model Context Protocol - MCP specification
## 📧 Support
Having issues? Here's how to get help:
- 📖 Check this README thoroughly
- 🔍 Use MCP Inspector for debugging
- 💬 Open an issue on GitHub
Built with ❤️ by AbhiiiMan
⭐ Star this repo if you find it helpful!
