breadcrumb-mcp
v1.0.1
MCP Server for storing and retrieving user conversation breadcrumbs with semantic search using LlamaIndex
Breadcrumb MCP Server
A Model Context Protocol (MCP) server for storing and retrieving conversation breadcrumbs with semantic search capabilities using LlamaIndex. Perfect for maintaining context across AI assistant conversations.
Features
- 🔍 Semantic Search: Query conversation breadcrumbs using natural language and semantic similarity
- 💾 Local Storage: All data persists locally using JSON files with LlamaIndex vector storage
- 📋 Multiple Query Types: Recent breadcrumb retrieval, semantic search, and global search
- ⚙️ User Preferences: Store and manage user-specific settings across sessions
- 📊 Analytics: Get comprehensive statistics about stored breadcrumbs and preferences
- 🚀 Production Ready: Built with TypeScript, fully typed, and published to npm
- 🐰 Bun Compatible: Optimized for Bun runtime with minimal dependencies
Two Versions Available
Simple Version (Recommended for Bun)
- Text-based search with relevance scoring
- No external AI dependencies
- Fast startup and lightweight
- Perfect for most use cases
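The simple version's text-based search can be pictured as plain term matching with a relevance score. A minimal sketch for intuition (the names and the scoring formula here are illustrative assumptions, not the package's actual implementation):

```typescript
// Hypothetical sketch of text-based relevance scoring: score a stored
// breadcrumb by the fraction of query terms its content contains.
interface Breadcrumb {
  userId: string;
  content: string;
  timestamp: string;
}

function scoreRelevance(query: string, breadcrumb: Breadcrumb): number {
  const terms = query.toLowerCase().split(/\s+/).filter(Boolean);
  const text = breadcrumb.content.toLowerCase();
  const hits = terms.filter((t) => text.includes(t)).length;
  return terms.length > 0 ? hits / terms.length : 0;
}

function searchBreadcrumbs(
  query: string,
  crumbs: Breadcrumb[],
  limit = 5,
): Breadcrumb[] {
  return crumbs
    .map((c) => ({ c, score: scoreRelevance(query, c) }))
    .filter((r) => r.score > 0)
    .sort((a, b) => b.score - a.score)
    .slice(0, limit)
    .map((r) => r.c);
}
```

Because it is pure string matching, this approach needs no model downloads, which is why the simple version starts fast and stays lightweight.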
Advanced Version (Optional)
- Semantic search with vector embeddings using LlamaIndex
- Requires additional dependencies
- Better for complex semantic queries
Installation
From npm (when published)

```bash
npm install -g breadcrumb-mcp
```

Using npx (no installation required)

```bash
npx breadcrumb-mcp
```

From source

```bash
git clone https://github.com/amantiwari57/breadcrump-mcp
cd mcp-user-context-server
bun install
bun run build
```

Usage
Running the Server
Simple Version (Recommended)

```bash
# Development
bun run dev

# Production (from source)
bun run start

# Using npx (no installation required)
npx breadcrumb-mcp

# Or if installed globally
breadcrumb-mcp
```

Advanced Version (with LlamaIndex)

```bash
# Install optional dependencies first
bun install llamaindex

# Development
bun run dev:advanced

# Build advanced version
bun run build:advanced
```

Available Tools
The server provides the following MCP tools:
store_context
Store a conversation breadcrumb with user context, metadata, and timestamp for future retrieval.
```json
{
  "userId": "user123",
  "conversationId": "conv456", // optional
  "content": "User asked about machine learning algorithms",
  "metadata": { "topic": "ML", "sentiment": "curious" } // optional
}
```

query_context
Search through user's conversation breadcrumbs using semantic similarity to find relevant past discussions.
```json
{
  "userId": "user123",
  "query": "machine learning questions",
  "limit": 5 // optional, default 5
}
```

get_recent_context
Retrieve the most recent conversation breadcrumbs for a user in chronological order.
```json
{
  "userId": "user123",
  "limit": 10 // optional, default 10
}
```

get_user_preferences
Retrieve user preferences, settings, and configuration stored across conversation sessions.
```json
{
  "userId": "user123"
}
```

update_user_preferences
Update or add user preferences and settings that persist across conversation sessions.
```json
{
  "userId": "user123",
  "preferences": {
    "theme": "dark",
    "language": "en",
    "notifications": true
  }
}
```

global_search
Search across all users' conversation breadcrumbs to find relevant discussions (administrative feature).
```json
{
  "query": "machine learning discussions",
  "limit": 10 // optional, default 10
}
```

get_user_stats
Get comprehensive statistics and analytics about a user's stored conversation breadcrumbs and preferences.
```json
{
  "userId": "user123"
}
```

Configuration
The server uses local storage in a data/ directory with the following structure:
```
data/
├── contexts/      # User conversation breadcrumb JSON files
│   ├── user123.json
│   └── user456.json
├── preferences/   # User preference JSON files
│   ├── user123.json
│   └── user456.json
└── vectors/       # LlamaIndex vector storage for semantic search
    ├── docstore.json
    ├── index_store.json
    └── vector_store.json
```

Configuration with Claude Desktop
Add to your Claude Desktop MCP configuration:
```json
{
  "mcpServers": {
    "breadcrumb-mcp": {
      "command": "npx",
      "args": ["breadcrumb-mcp"]
    }
  }
}
```

Or if you have it installed globally:
```json
{
  "mcpServers": {
    "breadcrumb-mcp": {
      "command": "breadcrumb-mcp",
      "args": []
    }
  }
}
```

Or if running from source:
```json
{
  "mcpServers": {
    "breadcrumb-mcp": {
      "command": "bun",
      "args": ["run", "/path/to/breadcrumb-mcp/src/simple-storage.ts"]
    }
  }
}
```

For the advanced version with LlamaIndex:
```json
{
  "mcpServers": {
    "breadcrumb-mcp": {
      "command": "bun",
      "args": ["run", "/path/to/breadcrumb-mcp/src/server.ts"]
    }
  }
}
```

Development
Requirements
- Node.js 18+
- Bun (recommended) or npm
Setup
```bash
bun install
```

Scripts

```bash
bun run dev      # Run in development mode
bun run build    # Build for production
bun run start    # Run built version
```

Project Structure
```
src/
├── simple-storage.ts   # Simple text-based MCP server (recommended)
├── index.ts            # Core storage and vector operations (advanced)
└── server.ts           # MCP server with LlamaIndex (advanced)
package.json
README.md
tsconfig.json
```

Technical Details
Vector Embeddings
- Uses HuggingFace's BAAI/bge-small-en-v1.5 model for embeddings
- Supports semantic search across stored conversation breadcrumbs
- Automatic persistence of vector indices
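Semantic search ultimately ranks breadcrumbs by the similarity of their embedding vectors to the query's embedding. A minimal cosine-similarity sketch for intuition only (LlamaIndex performs this ranking internally):

```typescript
// Cosine similarity between two embedding vectors; values near 1 mean the
// texts are semantically close. Illustrative only; not the package's code.
function cosineSimilarity(a: number[], b: number[]): number {
  if (a.length !== b.length) throw new Error("dimension mismatch");
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  if (normA === 0 || normB === 0) return 0;
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}
```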
Storage Format
- Conversation breadcrumbs stored as JSON with metadata
- Vector indices persisted using LlamaIndex
- Graceful handling of concurrent access
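Given the data/ layout above, reading and appending a per-user breadcrumb file can be sketched as follows. The exact file shape is an assumption for illustration; only the `data/contexts/<userId>.json` path comes from the documented layout:

```typescript
import { existsSync, mkdirSync, readFileSync, writeFileSync } from "node:fs";
import { join } from "node:path";

// Assumed breadcrumb record shape, based on the store_context input above.
interface StoredBreadcrumb {
  conversationId?: string;
  content: string;
  metadata?: Record<string, unknown>;
  timestamp: string;
}

// Append one breadcrumb to the user's JSON file, creating it if needed.
function appendBreadcrumb(
  dataDir: string,
  userId: string,
  crumb: StoredBreadcrumb,
): void {
  const dir = join(dataDir, "contexts");
  mkdirSync(dir, { recursive: true });
  const file = join(dir, `${userId}.json`);
  const existing: StoredBreadcrumb[] = existsSync(file)
    ? JSON.parse(readFileSync(file, "utf8"))
    : [];
  existing.push(crumb);
  writeFileSync(file, JSON.stringify(existing, null, 2));
}
```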
Error Handling
- Comprehensive error handling with detailed error messages
- Graceful degradation when vector index is unavailable
- Automatic retry logic for transient failures
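Retry for transient failures typically means exponential backoff. A hedged sketch of the pattern (the server's actual retry policy and parameters are not specified here):

```typescript
// Retry an async operation up to `attempts` times, doubling the delay
// between attempts (100ms, 200ms, 400ms, ...). Illustrative pattern only.
async function withRetry<T>(
  fn: () => Promise<T>,
  attempts = 3,
  baseDelayMs = 100,
): Promise<T> {
  let lastError: unknown;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      if (i < attempts - 1) {
        await new Promise((r) => setTimeout(r, baseDelayMs * 2 ** i));
      }
    }
  }
  throw lastError;
}
```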
API Response Format
All tools return JSON responses with the following structure:
```json
{
  "success": true,
  "data": { /* tool-specific data */ },
  "message": "Optional status message"
}
```

Error responses:
```json
{
  "success": false,
  "error": "Error description",
  "tool": "tool_name"
}
```

License
MIT License
Copyright (c) 2025 Breadcrumb MCP Server
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
Contributing
- Fork the repository
- Create a feature branch
- Make your changes
- Add tests if applicable
- Submit a pull request
Support
For issues and questions, please open an issue on the GitHub repository.
