# MCP Recall - Event-Driven Meeting Transcription Server
Event-driven MCP server for Recall.ai meeting transcription with enhanced speaker identification and local storage
## 🚀 Features
- Async Event-Driven Architecture: Start transcription jobs and check progress without blocking
- Enhanced Speaker Identification: Combines Recall.ai's speaker timeline with Whisper transcription
- Local Temporary Storage: Stores transcripts locally with automatic cleanup
- Multi-Region Support: Works with all Recall.ai regions (US, EU, Asia)
- Intelligent Summarization: Extracts key decisions, action items, and topics
- Chunked Access: Get specific time ranges or search within transcripts
- Background Processing: Uses OpenAI Whisper for high-quality transcription
- Job Management: Track multiple transcription jobs with detailed progress
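To make the speaker-identification feature concrete, here is a rough sketch of how a speaker timeline (speaker name plus start time, as Recall.ai provides) could be merged with Whisper's timestamped segments. The data shapes and the `align_speakers` function are illustrative assumptions, not the server's actual internals.

```python
# Sketch: assign each Whisper segment to the most recently active speaker.
# Data shapes below are illustrative assumptions, not the server's real format.

def align_speakers(segments, timeline):
    """segments: [{"start": float, "end": float, "text": str}, ...]
    timeline: [{"speaker": str, "start": float}, ...] sorted by start time.
    Returns segments annotated with a "speaker" key."""
    aligned = []
    for seg in segments:
        speaker = "Unknown"
        for event in timeline:
            if event["start"] <= seg["start"]:
                speaker = event["speaker"]  # latest speaker to start before this segment
            else:
                break
        aligned.append({"speaker": speaker, **seg})
    return aligned

timeline = [{"speaker": "Abel", "start": 0.0}, {"speaker": "Soroush", "start": 36.0}]
segments = [
    {"start": 0.0, "end": 5.2, "text": "That is recall."},
    {"start": 36.0, "end": 38.0, "text": "Hey, how's it going?"},
]
print(align_speakers(segments, timeline))
```

A real implementation would also handle overlapping speech and gaps in the timeline; this sketch only shows the core alignment idea.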
## 📋 Prerequisites

- Node.js 18+
- Python 3.8+ with the following packages:

  ```bash
  pip install openai-whisper requests
  ```

- Recall.ai API Key - get yours at recall.ai
## 🛠️ Installation

### Quick Install for Claude Code

```bash
# Install globally for all projects
claude mcp add recall -s user -e RECALL_API_KEY=your_api_key -- npx @chinchillaenterprises/mcp-recall

# Or install for a specific project
claude mcp add recall -s project -e RECALL_API_KEY=your_api_key -- npx @chinchillaenterprises/mcp-recall
```

### Manual Installation
1. Install the package:

   ```bash
   npm install -g @chinchillaenterprises/mcp-recall
   ```

2. Set up environment variables:

   ```bash
   export RECALL_API_KEY="your_recall_api_key"
   export RECALL_REGION="us-west-2"   # Optional: us-east-1, us-west-2, eu-central-1, ap-northeast-1
   export WHISPER_MODEL="base"        # Optional: tiny, base, small, medium, large
   ```

3. Configure Claude Code (`~/.claude.json`):

   ```json
   {
     "mcpServers": {
       "recall": {
         "type": "stdio",
         "command": "npx",
         "args": ["-y", "@chinchillaenterprises/mcp-recall"],
         "env": {
           "RECALL_API_KEY": "your_api_key_here",
           "RECALL_REGION": "us-west-2",
           "WHISPER_MODEL": "base"
         }
       }
     }
   }
   ```
## 🎯 Quick Start

1. **List available recordings**: use `recall_list_bots` to see all Recall.ai bots and their recording status.
2. **Start transcription**: use `recall_start_transcription` with a `bot_id` (e.g. "c6ce75ce-ffa6-489a-8ebd-41fcfd4e17d8") to begin async transcription.
3. **Check progress**: use `recall_get_job_status` with the `job_id` to see transcription progress.
4. **Get results**: use `recall_get_transcript_summary` to get key decisions, action items, and participants.
## 🔧 Available Tools

### Core Operations
| Tool | Description | Parameters |
|------|-------------|------------|
| recall_list_bots | List all Recall.ai bots and recording status | None |
| recall_start_transcription | Start async transcription job | bot_id: string |
| recall_get_job_status | Get job status and progress | job_id: string |
| recall_list_jobs | List all transcription jobs | limit?: number |
### Transcript Access
| Tool | Description | Parameters |
|------|-------------|------------|
| recall_get_transcript_summary | Get condensed summary with key points | job_id: string |
| recall_get_transcript_chunk | Get specific time range from transcript | job_id: string, start_time?: number, end_time?: number |
| recall_search_transcript | Search within transcript for keywords | job_id: string, query: string |
### Maintenance
| Tool | Description | Parameters |
|------|-------------|------------|
| recall_cleanup_old_jobs | Manually trigger cleanup of old jobs | days_old?: number |
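As a rough illustration of what the chunked-access tool does conceptually, the sketch below filters timestamped segments to a `start_time`/`end_time` window, treating both bounds as optional. The segment shape and the `get_chunk` function are assumptions for illustration, not the server's actual code.

```python
# Sketch: return transcript segments overlapping a time window (in seconds).
# Segment shape is an illustrative assumption.

def get_chunk(segments, start_time=None, end_time=None):
    """Keep any segment that overlaps [start_time, end_time].
    Omitted bounds are treated as unbounded."""
    lo = start_time if start_time is not None else float("-inf")
    hi = end_time if end_time is not None else float("inf")
    return [s for s in segments if s["end"] >= lo and s["start"] <= hi]

segments = [
    {"start": 290.0, "end": 310.0, "text": "...budget review..."},
    {"start": 610.0, "end": 620.0, "text": "...next steps..."},
]
# Segments overlapping minutes 5-10:
print(get_chunk(segments, start_time=300, end_time=600))
```

Note the overlap test keeps a segment that merely straddles a boundary, which is usually what you want when slicing a transcript by time.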
## 📊 Workflow Example

```javascript
// 1. Check available recordings
await recall_list_bots()
// Returns: List of bots with recording status

// 2. Start transcription (async)
const jobId = await recall_start_transcription({ bot_id: "abc123" })
// Returns: job_id for tracking

// 3. Monitor progress
await recall_get_job_status({ job_id: jobId })
// Returns: status, progress %, metadata

// 4. Get condensed summary (when complete)
await recall_get_transcript_summary({ job_id: jobId })
// Returns: executive summary, decisions, action items, participants

// 5. Search specific content
await recall_search_transcript({ job_id: jobId, query: "action item" })
// Returns: matching lines with context

// 6. Get time-specific chunk
await recall_get_transcript_chunk({
  job_id: jobId,
  start_time: 300, // 5 minutes
  end_time: 600    // 10 minutes
})
```

## 🗂️ Data Storage
### Local Storage Structure

```
~/.mcp-recall/
├── jobs.json                     # Job tracking database
├── recordings/
│   └── {job_id}.mp4              # Downloaded recordings
├── transcripts/
│   ├── enhanced_{job_id}.txt     # Speaker-aligned transcripts
│   └── enhanced_{job_id}.json    # Raw transcription data
└── summaries/
    └── {job_id}.json             # Processed summaries
```

### Automatic Cleanup

- Daily cleanup at 2 AM removes jobs older than 3 days
- Manual cleanup available via `recall_cleanup_old_jobs`
- Graceful storage management with configurable retention periods
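The retention policy above can be sketched as a simple age check against file modification times. This is an illustrative simplification: the `cleanup_old_jobs` function and the idea of scanning only `summaries/` are assumptions for the sketch, while the real server also prunes recordings, transcripts, and the `jobs.json` database.

```python
# Sketch: remove job artifacts older than a retention window.
# Simplified illustration; the real server cleans all storage directories.
import time
from pathlib import Path

def cleanup_old_jobs(base_dir, days_old=3):
    """Delete summary files whose modification time is older than
    `days_old` days; return the names of the files removed."""
    cutoff = time.time() - days_old * 86400  # retention window in seconds
    removed = []
    for path in Path(base_dir).glob("summaries/*.json"):
        if path.stat().st_mtime < cutoff:
            path.unlink()
            removed.append(path.name)
    return removed

# Usage: cleanup_old_jobs(Path.home() / ".mcp-recall", days_old=3)
```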
## ⚙️ Configuration

### Environment Variables
| Variable | Description | Default | Options |
|----------|-------------|---------|---------|
| RECALL_API_KEY | Recall.ai API key | Required | Your API key |
| RECALL_REGION | API region | us-west-2 | us-east-1, us-west-2, eu-central-1, ap-northeast-1 |
| WHISPER_MODEL | Whisper model size | base | tiny, base, small, medium, large |
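The defaults in the table above resolve in the usual environment-variable fashion. A minimal sketch, assuming a `load_config` helper that is not part of the actual server:

```python
# Sketch: resolve configuration with the defaults from the table above.
# `load_config` is a hypothetical helper, not the server's real API.
import os

def load_config(env=os.environ):
    api_key = env.get("RECALL_API_KEY")
    if not api_key:
        raise ValueError("RECALL_API_KEY is required")
    return {
        "api_key": api_key,
        "region": env.get("RECALL_REGION", "us-west-2"),
        "whisper_model": env.get("WHISPER_MODEL", "base"),
    }

print(load_config({"RECALL_API_KEY": "demo"}))
```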
### Model Comparison

| Model | Size | Speed | Accuracy | RAM Usage |
|-------|------|-------|----------|-----------|
| tiny | ~39 MB | Fastest | Basic | ~390 MB |
| base | ~74 MB | Fast | Good | ~500 MB |
| small | ~244 MB | Medium | Better | ~1 GB |
| medium | ~769 MB | Slow | Very Good | ~2 GB |
| large | ~1550 MB | Slowest | Best | ~4 GB |
## 🔍 Example Output

### Job Status Response

```json
{
  "job_id": "550e8400-e29b-41d4-a716-446655440000",
  "status": "completed",
  "progress": 100,
  "bot_id": "c6ce75ce-ffa6-489a-8ebd-41fcfd4e17d8",
  "metadata": {
    "duration": 2234,
    "speaker_count": 5,
    "file_size": 45189510
  }
}
```

### Transcript Summary
```json
{
  "executive_summary": "37-minute team meeting with 5 participants discussing MCP server development, AI Academy launch, and Discord setup.",
  "participants": ["Abel", "Ricardo", "Soroush", "Kael C.", "Hailee"],
  "key_decisions": [
    "Use event-driven architecture for MCP recall server",
    "Implement dual storage pattern for credential persistence"
  ],
  "action_items": [
    "Ricardo: Finish Discord setup by Friday",
    "Hailee: Post AI Academy job on LinkedIn"
  ],
  "duration": "37 minutes",
  "topics": ["MCP servers", "AI Academy", "Discord setup", "LinkedIn posting"],
  "sentiment": "positive"
}
```

### Enhanced Transcript Sample
```
=== ENHANCED TRANSCRIPT WITH SPEAKERS ===

**Abel** [0.0s]: That is recall. Oh, it's so cool. He just joined automatically or wonder. So we've pushed him in here.
**Soroush** [36.0s]: Hey, how's it going? Good. Good.
**Abel** [39.0s]: Recall join. Did you push him or did you jump by himself?
**Soroush** [42.0s]: I just added it to the channel and it joins automatically.
```

## 🚨 Error Handling
The server provides comprehensive error handling for:
- API Connection Issues: Automatic retry with exponential backoff
- Missing Dependencies: Clear error messages for Python/Whisper setup
- Storage Failures: Graceful degradation with temporary storage
- Transcription Errors: Detailed error reporting with context
- Job State Management: Robust state tracking across restarts
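The retry-with-exponential-backoff pattern mentioned above looks roughly like this. A generic sketch: the `with_retry` helper, the attempt count, and the choice to retry only `ConnectionError` are illustrative assumptions, not the server's actual retry policy.

```python
# Sketch: retry a flaky operation with exponential backoff (1s, 2s, 4s, ...).
# Helper name and policy details are illustrative assumptions.
import time

def with_retry(fn, attempts=4, base_delay=1.0):
    """Call `fn`, retrying on ConnectionError with doubling delays.
    Re-raises the last error once attempts are exhausted."""
    for attempt in range(attempts):
        try:
            return fn()
        except ConnectionError:
            if attempt == attempts - 1:
                raise  # out of retries; surface the error
            time.sleep(base_delay * (2 ** attempt))

# Usage: result = with_retry(lambda: fetch_recording(bot_id))
# where fetch_recording is whatever API call may transiently fail.
```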
## 🧪 Development

### Local Development Setup

```bash
# Clone the repository
git clone https://github.com/ChinchillaEnterprises/ChillMCP.git
cd ChillMCP/mcp-recall

# Install dependencies
npm install

# Install Python dependencies
pip install openai-whisper requests

# Build the project
npm run build

# Test locally
claude mcp add recall-local -s user -- node $(pwd)/dist/index.js
```

### Testing
```bash
# Run unit tests
npm test

# Run with coverage
npm run test:coverage

# Run integration tests
npm run test:integration
```

## 🤝 Contributing
1. Fork the repository
2. Create a feature branch: `git checkout -b feature/amazing-feature`
3. Make your changes and add tests
4. Build and test: `npm run build && npm test`
5. Commit your changes: `git commit -m 'Add amazing feature'`
6. Push to the branch: `git push origin feature/amazing-feature`
7. Open a pull request
## 📄 License
This project is licensed under the MIT License - see the LICENSE file for details.
## 🔗 Links
- NPM Package: @chinchillaenterprises/mcp-recall
- GitHub Repository: ChillMCP
- Recall.ai Documentation: docs.recall.ai
- Model Context Protocol: modelcontextprotocol.io
## 🆘 Support
- Issues: GitHub Issues
- Documentation: Claude Code Docs
- Community: Join our discussions in the repository
Built with ❤️ by Chinchilla Enterprises for the MCP ecosystem.
