# Kaiya MCP Server

`@prosci/kaiya-mcp-server` v0.2.0: MCP server for Kaiya AI assistant integration
A Model Context Protocol (MCP) server that provides integration with the Kaiya AI assistant. This server allows MCP-compatible clients (like Claude Desktop) to interact with your Kaiya AI backend through a standardized protocol.
## Features
- 🤖 AI Chat Integration: Direct communication with Kaiya AI assistant
- 🔐 OAuth 2.0 Authentication: Full OAuth flow with Dynamic Client Registration for Claude.ai
- 🔑 JWT Token Support: Users provide their JWT tokens during OAuth authorization
- 📝 Session Support: Optional chat session ID for conversation continuity
- 🔍 Debug Logging: Built-in request/response logging for troubleshooting
- 🌐 HTTP Mode: Can run as a remote MCP server for Claude.ai web interface
- 📡 Streamable HTTP Transport: Supports modern MCP protocol with SSE streaming
## Prerequisites
- Node.js 20.0.0 or higher
- Valid Bearer token for Kaiya API authentication
- Redis (for production deployment - optional for local development)
## Installation

1. Clone or download this repository
2. Install dependencies:

```bash
npm install
```
## Configuration

The server uses environment variables for configuration.

### Required

- `KAIYA_API_KEY`: Your Bearer token for Kaiya API authentication

### Optional

- `REDIS_URL`: Redis connection URL (e.g., `redis://localhost:6379`)
  - Production: required for persistent token storage on Heroku
  - Development: if not set, uses in-memory storage (tokens lost on restart)
- `LOG_LEVEL`: Logging level (`debug`, `info`, `warn`, `error`). Default: `info`
- `RAILS_BASE_URL`: Rails API base URL. Default: `https://portal.prosci.com`
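As an illustration, the variables and defaults above could be read like this (a hypothetical sketch, not the server's actual code; in the real server, `dotenv/config` is imported first so a local `.env` file populates `process.env`):

```javascript
// Hypothetical config loader applying the documented defaults.
function loadConfig(env = process.env) {
  if (!env.KAIYA_API_KEY) {
    throw new Error("KAIYA_API_KEY is required");
  }
  return {
    apiKey: env.KAIYA_API_KEY,
    redisUrl: env.REDIS_URL ?? null, // null -> in-memory token storage
    logLevel: env.LOG_LEVEL ?? "info", // documented default
    railsBaseUrl: env.RAILS_BASE_URL ?? "https://portal.prosci.com",
  };
}

const cfg = loadConfig({ KAIYA_API_KEY: "example-token" });
console.log(cfg.railsBaseUrl); // https://portal.prosci.com
```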
## Redis Setup

### For Local Development

```bash
# Install Redis (macOS)
brew install redis

# Start Redis server
redis-server
```

Set `REDIS_URL` in `.env` (optional; memory storage is used if not set):

```
REDIS_URL=redis://localhost:6379
```

### For Production (Heroku)

Redis is configured automatically when you add the Heroku Redis addon (see the Heroku Deployment section below).
## Setting Environment Variables

### Option 1: `.env` File (Recommended for development)

Create a `.env` file in the project root:

```
KAIYA_API_KEY=your-bearer-token-here
REDIS_URL=redis://localhost:6379
LOG_LEVEL=debug
```

### Option 2: System Environment Variables

```bash
export KAIYA_API_KEY=your-bearer-token-here
export REDIS_URL=redis://localhost:6379
export LOG_LEVEL=debug
```

## Running Modes
The server can run in two modes:

### 1. Stdio Mode (Default, for Claude Desktop)

Standard input/output mode for local Claude Desktop integration.

### 2. HTTP Mode (for Claude.ai Web)

HTTP server mode for remote MCP connections via Claude.ai Connectors.
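The mode selection above can be sketched as follows (an illustrative snippet, not the actual `index.js`; the `MODE` variable comes from the `MODE=http npm start` command shown later in this README):

```javascript
// Pick a run mode: "http" only when MODE=http, otherwise the stdio default.
function selectMode(env = process.env) {
  return env.MODE === "http" ? "http" : "stdio";
}

console.log(selectMode({})); // stdio
console.log(selectMode({ MODE: "http" })); // http
```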
## Usage

### Testing with MCP Inspector

```bash
# With .env file
npx @modelcontextprotocol/inspector index.js

# Or with inline environment variables
KAIYA_API_KEY=your-token npx @modelcontextprotocol/inspector index.js
```

The inspector starts:

- Client UI: http://localhost:5173 (default)
- MCP Proxy Server: http://localhost:3000 (default)
## Integration with Claude Desktop

Add to your `claude_desktop_config.json`:

```json
{
  "mcpServers": {
    "Kaiya": {
      "command": "npx",
      "args": ["-y", "@prosci/kaiya-mcp-server"],
      "env": {
        "KAIYA_API_KEY": "your-bearer-token-here"
      }
    }
  }
}
```

Important: Use absolute paths in the configuration file.
## HTTP Mode for Claude.ai Web

Run the server in HTTP mode for remote access:

```bash
# Start HTTP server locally
npm run start:http
# or
MODE=http npm start

# The server runs on port 3000 by default
# Endpoint: http://localhost:3000/mcp
```

## Heroku Deployment
Deploy the server to Heroku for use with Claude.ai Custom Connectors:

### 1. Prerequisites

- Heroku CLI installed
- Heroku account

### 2. Create a Heroku App

```bash
heroku create your-app-name
```

### 3. Add the Redis Addon

IMPORTANT: Redis is required for persistent token storage on Heroku's ephemeral filesystem.

```bash
# Add Redis addon (mini tier)
heroku addons:create heroku-redis:mini -a your-app-name

# Verify the Redis URL is set (this happens automatically)
heroku config:get REDIS_URL -a your-app-name
```

### 4. Set Environment Variables

```bash
heroku config:set KAIYA_API_KEY=your-bearer-token-here
heroku config:set RAILS_BASE_URL=https://portal.prosci.com
heroku config:set LOG_LEVEL=info
```

### 5. Deploy

```bash
git add .
git commit -m "Deploy to Heroku"
git push heroku main
```

### 6. Verify Deployment

```bash
# Check application logs
heroku logs --tail -a your-app-name

# Test the health endpoint
curl https://your-app-name.herokuapp.com/health
```

### 7. Configure the Claude.ai Custom Connector
In Claude.ai, add a Custom Connector:

1. Go to Claude.ai settings
2. Navigate to Integrations → Custom Connectors
3. Click "Add Connector"
4. Enter the server URL: `https://kaiya-mcp-server.prosci.com/mcp`
5. Click "Connect"
6. You'll be redirected to an authorization page
7. Enter your JWT Bearer token when prompted
8. Click "Authorize" to complete the connection
### How OAuth Integration Works
- The server implements OAuth 2.0 with Dynamic Client Registration
- When you connect, Claude registers as a client automatically
- You enter your JWT token on the authorization page
- The JWT token is securely passed to the MCP server for API authentication
- Each user can use their own JWT token for personalized access
## Available Tools

### kaiya_chat

Chat with the Kaiya AI assistant.

Parameters:

- `prompt` (required): The message/prompt to send to Kaiya
- `chat_session_id` (optional): Session identifier for conversation continuity

Example:

```json
{
  "prompt": "How can I improve my project management skills?",
  "chat_session_id": "session_123"
}
```

Response: Returns the Kaiya AI response as formatted JSON.
## API Endpoint

The server communicates with your Rails API at:

```
POST {RAILS_BASE_URL}/api/v1/kaiya/chat
```

Request format:

```json
{
  "prompt": "user message",
  "chat_session_id": "optional_session_id"
}
```

Headers:

```
Content-Type: application/json
Accept: */*
Authorization: Bearer {KAIYA_API_KEY}
```
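For reference, the request described above can be reproduced with Node 20's built-in `fetch` (a hedged sketch; the function name `kaiyaChat` is illustrative, and the server itself lists `node-fetch` as a dependency, which exposes the same interface):

```javascript
// Sketch: POST {RAILS_BASE_URL}/api/v1/kaiya/chat with the documented
// headers and body shape. Reads the env vars described in Configuration.
async function kaiyaChat(prompt, chatSessionId) {
  const base = process.env.RAILS_BASE_URL ?? "https://portal.prosci.com";
  const res = await fetch(`${base}/api/v1/kaiya/chat`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "Accept": "*/*",
      "Authorization": `Bearer ${process.env.KAIYA_API_KEY}`,
    },
    body: JSON.stringify({ prompt, chat_session_id: chatSessionId }),
  });
  if (!res.ok) throw new Error(`Kaiya API error: ${res.status}`);
  return res.json();
}

// Usage (requires a running Rails API and a valid KAIYA_API_KEY):
// const reply = await kaiyaChat("Hello Kaiya", "session_123");
```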
## Development

### Running the Server

Stdio mode (for Claude Desktop):

```bash
npm run start:stdio
# or simply
node index.js
```

HTTP mode (for remote access):

```bash
npm run start:http
# or
MODE=http npm start
```

With auto-reload (development):

```bash
# Stdio mode with watch
npm run dev

# HTTP mode with watch
npm run dev:http
```

### Debug Mode
The server automatically logs:
- Request URLs and payloads
- API key presence (without exposing the actual key)
- Response status
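The level filtering described in Configuration (`debug` < `info` < `warn` < `error`) can be sketched like this (illustrative only; the server's actual logger may differ):

```javascript
// Log levels in ascending severity; a message is emitted only when its
// level is at or above the configured LOG_LEVEL.
const LEVELS = ["debug", "info", "warn", "error"];

function shouldLog(level, configured = process.env.LOG_LEVEL ?? "info") {
  return LEVELS.indexOf(level) >= LEVELS.indexOf(configured);
}

console.log(shouldLog("debug", "info")); // false
console.log(shouldLog("warn", "info")); // true
```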
## Troubleshooting

### Common Issues

**"Access token invalid or expired" error**

- This means the token was not found in storage
- On Heroku: ensure the Redis addon is installed (`heroku addons -a your-app-name`)
- Locally: check that Redis is running (`redis-cli ping` should return `PONG`)
- Try disconnecting and reconnecting in Claude.ai to re-authenticate
- Check logs with `LOG_LEVEL=debug` to see token storage operations

**401 Unauthorized error**

- Check that `KAIYA_API_KEY` is set correctly
- Verify the Bearer token is valid
- Ensure your Rails API is running and accessible

**Tokens lost after dyno restart (Heroku)**

- Root cause: missing Redis addon
- Solution: add the Redis addon (see the Heroku Deployment section)
- Without Redis, tokens are stored in memory and lost on restart

**Connection refused**

- Verify `RAILS_BASE_URL` points to your running Rails server
- Check that the Rails server is accepting connections
- Ensure the `/api/v1/kaiya/chat` endpoint exists

**MCP Inspector not connecting**

- Make sure you're using Node.js 20.0.0 or higher
- Try restarting the inspector
- Check console logs for error messages

**Environment variables not loading**

- Ensure the `.env` file is in the project root
- Check that `dotenv/config` is imported correctly
- Verify environment variable names match exactly

**Redis connection issues**

- Locally: ensure Redis is running (`redis-server`)
- On Heroku: check the Redis addon status (`heroku redis:info -a your-app-name`)
- The server falls back to memory storage if Redis is unavailable (check logs)
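The memory-fallback behavior mentioned above can be sketched as follows (class and method names here are illustrative, not taken from `token-storage.js`):

```javascript
// In-memory fallback used when REDIS_URL is unset or Redis is unreachable.
// Because the Map lives in process memory, every stored token disappears on
// restart, which is exactly the Heroku dyno-restart failure mode above.
class MemoryTokenStorage {
  constructor() {
    this.tokens = new Map();
  }
  // async methods to match a Redis-backed implementation's interface
  async set(key, value) {
    this.tokens.set(key, value);
  }
  async get(key) {
    return this.tokens.get(key) ?? null;
  }
  async delete(key) {
    this.tokens.delete(key);
  }
}

(async () => {
  const storage = new MemoryTokenStorage();
  await storage.set("access:abc123", "jwt-token-here");
  console.log(await storage.get("access:abc123")); // logs the stored token
})();
```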
## Debug Logging

The server supports structured logging with multiple levels:

```bash
# Set the log level in .env or the environment
LOG_LEVEL=debug   # Shows all logs (debug, info, warn, error)
LOG_LEVEL=info    # Production default (info, warn, error)
LOG_LEVEL=warn    # Only warnings and errors
LOG_LEVEL=error   # Only errors
```

Logs include:

- Token storage operations (set, get, delete)
- OAuth flow (authorization, token exchange, refresh)
- MCP requests and responses
- Rails API calls
- Redis connection status
- Error stack traces

Viewing logs:

```bash
# Locally
npm run dev:http

# On Heroku
heroku logs --tail -a your-app-name

# Filter for a specific component
heroku logs --tail -a your-app-name | grep "Token Storage"
```

## Project Structure
```
kaiya-mcp-server/
├── index.js                  # Main server implementation
├── redis-token-storage.js    # Redis-based token persistence
├── token-storage.js          # File-based token storage (dev fallback)
├── package.json              # Node.js dependencies and scripts
├── package-lock.json         # Dependency lock file
├── .env                      # Environment variables (create this)
├── .tokens.json              # Token storage file (auto-generated, dev only)
└── README.md                 # This file
```

## Dependencies

- `@modelcontextprotocol/sdk`: MCP protocol implementation with Streamable HTTP support
- `express`: Web framework for HTTP server mode
- `cors`: CORS middleware for Claude.ai integration
- `node-fetch`: HTTP client for API requests
- `dotenv`: Environment variable loading
- `redis`: Redis client for persistent token storage
## License

This project is licensed under the MIT License.

## Contributing

1. Fork the repository
2. Create a feature branch
3. Make your changes
4. Test with MCP Inspector
5. Submit a pull request
## Support
For issues and questions:
- Check the troubleshooting section above
- Review server logs for error details
- Verify your Rails API is responding correctly
- Test with MCP Inspector before integrating with other clients
Note: This MCP server is designed to work with the Kaiya AI Rails backend. Ensure your Rails API implements the expected /api/v1/kaiya/chat endpoint with Bearer token authentication.
