@aiready/mcp-server v0.1.5
AIReady MCP Server
The AIReady MCP Server lets AI agents such as Claude Desktop, Cursor, and Windsurf assess AI-readiness and improve AI leverage directly within their conversational interfaces, using the Model Context Protocol (MCP).
Installation & Distribution Channels
You can install and use the AIReady MCP server through several supported channels.
1. Dedicated MCP Registries
- Smithery: Discover and install our server directly via the Smithery CLI:
npx @smithery/cli install @aiready/mcp-server
- Glama: View our listing and integration options on the Glama directory.
- Pulsar: Find us on the Pulsar registry for MCP servers.
2. Direct IDE / Assistant Integrations
Claude Desktop App
To use the AIReady MCP server in the Claude Desktop app, add the following configuration to your claude_desktop_config.json:
"mcpServers": {
  "aiready": {
    "command": "npx",
    "args": ["-y", "@aiready/mcp-server"]
  }
}
Cursor IDE
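The snippet above is a fragment of the `mcpServers` object. For reference, a complete claude_desktop_config.json containing only this server would look like the following (a minimal sketch; merge it with any servers you already have configured):

```json
{
  "mcpServers": {
    "aiready": {
      "command": "npx",
      "args": ["-y", "@aiready/mcp-server"]
    }
  }
}
```

The `-y` flag tells npx to skip its install confirmation prompt, so Claude Desktop can launch the server non-interactively.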
- Open Cursor Settings.
- Navigate to Features -> MCP Servers.
- Add a new server.
- Set the command to:
npx -y @aiready/mcp-server
Windsurf IDE
- Open Windsurf Settings or local environment configuration.
- Add a new MCP Server integration.
- Configure the execution command:
npx -y @aiready/mcp-server
3. Containerized Distribution (Docker)
If you prefer running MCP servers in isolated environments, you can use our Docker image:
docker run -i --rm ghcr.io/caopengau/aiready-mcp-server
(Note: Docker image distribution is currently being set up. Use the command above once the image is published.)
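To have an MCP client launch the container for you instead of npx, the same `mcpServers` configuration shape applies. A sketch, assuming the image above has been published:

```json
{
  "mcpServers": {
    "aiready": {
      "command": "docker",
      "args": ["run", "-i", "--rm", "ghcr.io/caopengau/aiready-mcp-server"]
    }
  }
}
```

The `-i` flag keeps stdin open so the client can speak MCP over stdio, and `--rm` cleans up the container when the session ends.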
4. Existing AIReady Channels
We are also integrating the MCP server with our existing distribution methods:
- Homebrew:
brew install aiready-mcp
(Coming soon)
- VS Code Extension: Bundled within the AIReady extension for editor-native AI chats. (Coming soon)
Quick Start
To test the server locally, you can run:
npx @aiready/mcp-server
For more details on AIReady, visit getaiready.dev.
