# CrawlForge MCP Server
Professional web scraping and content extraction server implementing the Model Context Protocol (MCP). Get started with 1,000 free credits - no credit card required!
## 🎯 Features
- 19 Professional Tools: Web scraping, deep research, stealth browsing, content analysis
- Free Tier: 1,000 credits to get started instantly
- MCP Compatible: Works with Claude, Cursor, and other MCP-enabled AI tools
- Enterprise Ready: Scale up with paid plans for production use
- Credit-Based: Pay only for what you use
## 🚀 Quick Start (2 Minutes)
### 1. Install from NPM

```bash
npm install -g crawlforge-mcp-server
```

### 2. Set Up Your API Key

```bash
npx crawlforge-setup
```

This will:
- Guide you through getting your free API key
- Configure your credentials securely
- Verify your setup is working
Don't have an API key? Get one free at https://www.crawlforge.dev/signup
### 3. Configure Your IDE

For Claude Desktop, add the following to `claude_desktop_config.json`:
```json
{
  "mcpServers": {
    "crawlforge": {
      "command": "npx",
      "args": ["crawlforge-mcp-server"]
    }
  }
}
```

Config file location:
- macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
- Windows: `%APPDATA%/Claude/claude_desktop_config.json`
- Linux: `~/.config/Claude/claude_desktop_config.json`
Restart Claude Desktop to activate.
For Cursor, add the following to `.cursorrules` in your project:
```yaml
mcp_servers:
  crawlforge:
    command: npx
    args: ["crawlforge-mcp-server"]
```

Or use the MCP plugin in Cursor settings.
## 📊 Available Tools

### Basic Tools (1 credit each)

- `fetch_url` - Fetch content from any URL
- `extract_text` - Extract clean text from web pages
- `extract_links` - Get all links from a page
- `extract_metadata` - Extract page metadata
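Under the hood, an MCP client issues the protocol's standard `tools/call` request for each of these. As a minimal sketch, a raw `fetch_url` call could look like the following; the `url` argument name is an assumption for illustration, and the authoritative schema is whatever the server reports via `tools/list`:

```jsonc
// Sketch only: the "url" argument name is an assumption, not the documented schema.
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "fetch_url",
    "arguments": { "url": "https://example.com" }
  }
}
```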
### Advanced Tools (2-3 credits)

- `scrape_structured` - Extract structured data with CSS selectors
- `search_web` - Search the web with Google/DuckDuckGo
- `summarize_content` - Generate intelligent summaries
- `analyze_content` - Comprehensive content analysis
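As an illustration, `scrape_structured` pairs a page URL with named CSS selectors and returns the matched values. This sketch assumes `url` and `selectors` as the argument names; check the tool schema the server reports before relying on it:

```jsonc
// Sketch only: "url" and "selectors" argument names are assumptions.
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "scrape_structured",
    "arguments": {
      "url": "https://example.com/products/1",
      "selectors": {
        "title": "h1.product-title",
        "price": "span.price"
      }
    }
  }
}
```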
### Premium Tools (5-10 credits)

- `crawl_deep` - Deep crawl entire websites
- `map_site` - Discover and map website structure
- `batch_scrape` - Process multiple URLs simultaneously
- `deep_research` - Multi-stage research with source verification
- `stealth_mode` - Anti-detection browser management
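A hypothetical `crawl_deep` request might bound the crawl by depth and page count, since an unbounded crawl of a large site would burn credits quickly. The `maxDepth` and `maxPages` argument names here are assumptions for illustration, not the documented schema:

```jsonc
// Sketch only: "maxDepth" and "maxPages" argument names are assumptions.
{
  "jsonrpc": "2.0",
  "id": 3,
  "method": "tools/call",
  "params": {
    "name": "crawl_deep",
    "arguments": {
      "url": "https://docs.example.com",
      "maxDepth": 2,
      "maxPages": 50
    }
  }
}
```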
### Heavy Processing (3-10 credits)

- `process_document` - Multi-format document processing
- `extract_content` - Enhanced content extraction
- `scrape_with_actions` - Browser automation chains
- `generate_llms_txt` - Generate AI interaction guidelines
- `localization` - Multi-language and geo-location management
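To show what a browser automation chain could look like, here is a hypothetical `scrape_with_actions` request. The entire `actions` step format below is assumed for illustration; the real step vocabulary may differ:

```jsonc
// Sketch only: the "actions" step format is an assumption for illustration.
{
  "jsonrpc": "2.0",
  "id": 4,
  "method": "tools/call",
  "params": {
    "name": "scrape_with_actions",
    "arguments": {
      "url": "https://example.com/login",
      "actions": [
        { "type": "type", "selector": "#username", "text": "demo" },
        { "type": "click", "selector": "button[type=submit]" },
        { "type": "wait", "ms": 1000 }
      ]
    }
  }
}
```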
## 💳 Pricing

| Plan | Credits/Month | Best For |
|------|---------------|----------|
| Free | 1,000 | Testing & personal projects |
| Starter | 5,000 | Small projects & development |
| Professional | 50,000 | Professional use & production |
| Enterprise | 250,000 | Large-scale operations |
All plans include:
- Access to all 19 tools
- Credits never expire and roll over month-to-month
- API access and webhook notifications
## 🔧 Advanced Configuration

### Environment Variables

```bash
# Optional: set your API key via the environment
export CRAWLFORGE_API_KEY="sk_live_your_api_key_here"

# Optional: custom API endpoint (for enterprise)
export CRAWLFORGE_API_URL="https://api.crawlforge.dev"
```
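If the environment variable takes precedence over the stored config file (an assumption worth verifying against the docs), this is convenient for one-off runs and CI:

```bash
# Assumption: CRAWLFORGE_API_KEY overrides ~/.crawlforge/config.json when both exist
CRAWLFORGE_API_KEY="sk_live_your_api_key_here" npx crawlforge-mcp-server
```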
### Manual Configuration
Your configuration is stored at `~/.crawlforge/config.json`:

```json
{
  "apiKey": "sk_live_...",
  "userId": "user_...",
  "email": "[email protected]"
}
```

## 📖 Usage Examples
Once configured, you can drive these tools from your AI assistant with natural-language prompts like:
"Search for the latest AI news"
"Extract all links from example.com"
"Crawl the documentation site and summarize it"
"Monitor this page for changes"
"Extract product prices from this e-commerce site"🔒 Security & Privacy
- API keys are stored locally and encrypted
- All connections use HTTPS
- No data is stored on our servers beyond usage logs
- Compliant with robots.txt and rate limits
- GDPR compliant
## 🆘 Support
- Documentation: https://www.crawlforge.dev/docs
- Issues: GitHub Issues
- Email: [email protected]
- Discord: Join our community
## 📄 License
MIT License - see LICENSE file for details.
## 🤝 Contributing
Contributions are welcome! Please read our Contributing Guide first.
Built with ❤️ by the CrawlForge team
