@elephant-xyz/mcp
v1.6.0
Model Context Protocol server that exposes Elephant data graph tooling
Elephant MCP Server
Elephant MCP connects Claude-compatible clients to the Elephant data graph, exposing discoverable tools for listing data groups, classes, and individual property schemas. The server is published on npm as @elephant-xyz/mcp.
Embedding Provider: The `getVerifiedScriptExamples` tool uses text embeddings for semantic code search. The server supports two embedding providers:
- OpenAI (preferred when `OPENAI_API_KEY` is set) – uses `text-embedding-3-small` with 1024 dimensions
- AWS Bedrock (automatic fallback) – uses `amazon.titan-embed-text-v2` via IAM authentication

When running on AWS, the server automatically uses Bedrock if no OpenAI key is provided.
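The fallback described above boils down to a simple preference check. Here is a minimal sketch of that selection logic as a pure function (illustrative only — the server's actual implementation lives in its `src/config.ts`):

```typescript
// Sketch of the embedding-provider fallback described above.
// Illustrative helper, not the server's real code.
type EmbeddingProvider = "openai" | "bedrock";

function selectEmbeddingProvider(
  env: Record<string, string | undefined>
): EmbeddingProvider {
  // OpenAI is preferred whenever OPENAI_API_KEY is present and non-empty.
  if (env.OPENAI_API_KEY) {
    return "openai";
  }
  // Otherwise fall back to AWS Bedrock via the standard IAM credential chain.
  return "bedrock";
}

console.log(selectEmbeddingProvider({ OPENAI_API_KEY: "sk-test" })); // openai
console.log(selectEmbeddingProvider({})); // bedrock
```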
🚀 Prompt Recommendations
For best results with Elephant MCP, always specify the Data Group you're working on in your prompts and add "use elephant mcp" at the end.
Example prompts:
- "I'm working on the 'County' data group. Can you help me explore the available classes? use elephant mcp"
- "What properties are available in the 'property' class? I'm working with the 'County' data group. use elephant mcp"

This helps the AI understand which data context to use and ensures it leverages the Elephant MCP tools effectively.
Why Elephant?
- Ready-to-use `npx` launcher compatible with Claude, Cursor, VS Code, Gemini CLI, and other MCP clients.
- Tools to enumerate Elephant data groups, related classes, and full JSON Schema fragments.
- Structured MCP logging to stream diagnostics into every connected client.
Available Tools
- `listClassesByDataGroup` – Lists classes attached to an Elephant data group, including friendly names and descriptions.
- `listPropertiesByClassName` – Returns schema property keys for a class (excluding transport-only fields).
- `getPropertySchema` – Fetches the full JSON Schema for a specific property and class combination.
- `getVerifiedScriptExamples` – Returns working code examples that map data to the Elephant schema.
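As a concrete illustration, an MCP client invokes these tools with a standard `tools/call` JSON-RPC request. A sketch of such a request is below; note that the argument keys (`dataGroup`, `className`, `propertyName`) are assumptions for illustration — check each tool's advertised input schema for the real parameter names:

```typescript
// Hypothetical tools/call request an MCP client might send.
// The argument keys are illustrative, not the tool's documented schema.
const request = {
  jsonrpc: "2.0" as const,
  id: 1,
  method: "tools/call",
  params: {
    name: "getPropertySchema",
    arguments: {
      dataGroup: "County",       // assumed parameter name
      className: "property",     // assumed parameter name
      propertyName: "parcel_id", // assumed parameter name
    },
  },
};

console.log(JSON.stringify(request, null, 2));
```

In practice your MCP client builds this envelope for you; the point is that each tool is addressed by name with a JSON arguments object.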
Supported MCP Clients
Cursor
- Ensure Node.js 22.18+ is installed.
- Cursor will open a configuration screen pre-filled with:

```jsonc
{
  "command": "npx",
  "args": ["-y", "@elephant-xyz/mcp@latest"],
  "env": {
    // Option 1: Use OpenAI embeddings
    "OPENAI_API_KEY": "sk-your-openai-key"
    // Option 2: Use AWS Bedrock (omit OPENAI_API_KEY)
    // "AWS_REGION": "us-east-1" // optional, defaults to us-east-1
  }
}
```

  For OpenAI, replace the placeholder with your actual key. For AWS Bedrock, remove the `OPENAI_API_KEY` line and ensure your environment has valid AWS credentials (IAM role, environment variables, or AWS credentials file).
- Save and toggle the Elephant connection inside Cursor's MCP panel.
- If you are hacking on a local checkout, switch the command to `npm start` and set `cwd` to your repository path.
Visual Studio Code
- Install the Model Context Protocol extension.
- Accept the pre-populated install flow above or add manually under Settings → MCP → Servers with:
  - OpenAI: `OPENAI_API_KEY=sk-your-openai-key npx -y @elephant-xyz/mcp@latest`
  - AWS Bedrock: `npx -y @elephant-xyz/mcp@latest` (uses IAM credentials from environment)
- Reload VS Code and enable the Elephant server in the MCP panel.
Claude Code
macOS/Linux with OpenAI:

```shell
claude mcp add elephant --env OPENAI_API_KEY=sk-your-openai-key -- npx -y @elephant-xyz/mcp@latest
```

macOS/Linux with AWS Bedrock (uses IAM credentials):

```shell
claude mcp add elephant -- npx -y @elephant-xyz/mcp@latest
```

Restart Claude Code after adding the server so the tools appear in the @tools palette.
OpenAI Codex
CLI setup
With OpenAI:

```shell
codex mcp add elephant --env OPENAI_API_KEY=sk-your-openai-key -- npx -y @elephant-xyz/mcp@latest
```

With AWS Bedrock:

```shell
codex mcp add elephant -- npx -y @elephant-xyz/mcp@latest
```

You can explore additional options with `codex mcp --help`. Inside the Codex TUI, run `/mcp` to view currently connected servers.

config.toml setup

Edit `~/.codex/config.toml` (or open MCP settings → Open config.toml from the IDE extension) and add:

For OpenAI:

```toml
[mcp.elephant]
command = "npx"
args = ["-y", "@elephant-xyz/mcp@latest"]
env = { OPENAI_API_KEY = "sk-your-openai-key" }
```

For AWS Bedrock:

```toml
[mcp.elephant]
command = "npx"
args = ["-y", "@elephant-xyz/mcp@latest"]
# Uses IAM credentials from environment; optionally set AWS_REGION
```

Save the file and restart Codex to load the new server.
Gemini CLI
Create (or edit) .gemini/settings.json in your project and add:
With OpenAI:
```json
{
  "mcpServers": {
    "elephant": {
      "command": "npx",
      "args": ["-y", "@elephant-xyz/mcp@latest"],
      "env": {
        "OPENAI_API_KEY": "sk-your-openai-key"
      }
    }
  }
}
```

With AWS Bedrock (uses IAM credentials from the environment):

```json
{
  "mcpServers": {
    "elephant": {
      "command": "npx",
      "args": ["-y", "@elephant-xyz/mcp@latest"]
    }
  }
}
```

Restart Gemini CLI or run `gemini tools sync` to pick up the new server.
Configuration
The stdio transport means no port or server identity flags are required. Environment variables handled by src/config.ts:
| Variable | Description | Default |
|----------|-------------|---------|
| OPENAI_API_KEY | OpenAI API key for embeddings. When set, OpenAI is used; otherwise falls back to AWS Bedrock. | (optional) |
| AWS_REGION | AWS region for Bedrock API calls. | us-east-1 |
| LOG_LEVEL | Pino log level (error, warn, info, debug). | info |
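The defaults in the table above can be modeled as a small loader. This is a sketch under the assumptions stated in the table, not the server's actual `src/config.ts`:

```typescript
// Illustrative config loader mirroring the table above.
// The real implementation lives in src/config.ts.
type LogLevel = "error" | "warn" | "info" | "debug";

interface Config {
  openaiApiKey?: string; // optional; enables OpenAI embeddings when set
  awsRegion: string;     // Bedrock region, defaults to us-east-1
  logLevel: LogLevel;    // Pino log level, defaults to info
}

const LOG_LEVELS: readonly string[] = ["error", "warn", "info", "debug"];

function loadConfig(env: Record<string, string | undefined>): Config {
  const level = env.LOG_LEVEL ?? "info";
  return {
    openaiApiKey: env.OPENAI_API_KEY,
    awsRegion: env.AWS_REGION ?? "us-east-1",
    logLevel: (LOG_LEVELS.includes(level) ? level : "info") as LogLevel,
  };
}

console.log(loadConfig({}));
```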
AWS Bedrock Authentication
When using AWS Bedrock (no OPENAI_API_KEY set), the server authenticates using the standard AWS credential chain:
- Environment variables (`AWS_ACCESS_KEY_ID`, `AWS_SECRET_ACCESS_KEY`)
- Shared credentials file (`~/.aws/credentials`)
- ECS/Lambda container credentials (`AWS_CONTAINER_CREDENTIALS_*`)
- IAM instance role (when running on EC2/ECS/Lambda)
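The resolution order above is a first-match scan over the possible sources. Here is a minimal sketch of that ordering; the real chain is implemented by the AWS SDK's default credential provider, so this function and its inputs are illustrative only:

```typescript
// Illustrative first-match resolution over the sources listed above.
// The actual behavior comes from the AWS SDK's default credential chain.
type CredentialSource =
  | "environment"
  | "shared-credentials-file"
  | "container"
  | "instance-role"
  | "none";

function detectCredentialSource(ctx: {
  env: Record<string, string | undefined>;
  hasSharedCredentialsFile: boolean; // ~/.aws/credentials exists
  hasInstanceRole: boolean;          // running on EC2/ECS/Lambda with a role
}): CredentialSource {
  if (ctx.env.AWS_ACCESS_KEY_ID && ctx.env.AWS_SECRET_ACCESS_KEY) {
    return "environment";
  }
  if (ctx.hasSharedCredentialsFile) return "shared-credentials-file";
  if (ctx.env.AWS_CONTAINER_CREDENTIALS_RELATIVE_URI) return "container";
  if (ctx.hasInstanceRole) return "instance-role";
  return "none";
}
```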
Ensure your IAM role or user has the bedrock:InvokeModel permission and access to the amazon.titan-embed-text-v2:0 embedding model in the configured AWS_REGION. In some regions, you must explicitly request access to this model in the AWS Bedrock Console before it can be invoked.
Important: At least one embedding provider must be configured. If neither OPENAI_API_KEY nor AWS credentials are available, the getVerifiedScriptExamples tool will return an error prompting you to configure credentials.
Credential Verification
At startup, the server verifies embedding provider credentials:
- For OpenAI: Checks that `OPENAI_API_KEY` is set
- For AWS Bedrock: Resolves credentials through the full AWS credential provider chain and logs the detected source
The verification result is logged and included in the MCP startup message for debugging.
Database Compatibility
The embedding database is automatically rebuilt when switching between embedding models with different vector dimensions (e.g., switching from a 1536-dimension model to a 1024-dimension model). This ensures the getVerifiedScriptExamples tool works correctly after model changes. The server will re-index all verified scripts after a rebuild.
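The rebuild condition described above amounts to comparing the stored vector dimension with the active model's output dimension. A hypothetical sketch of that check (the helper name and its inputs are assumptions, not the server's API):

```typescript
// Sketch of the compatibility check described above: if the stored
// embedding dimension differs from the active model's, rebuild and re-index.
function needsRebuild(
  storedDimension: number | undefined,
  modelDimension: number
): boolean {
  // A fresh database (no stored dimension yet) is built from scratch anyway.
  if (storedDimension === undefined) return false;
  return storedDimension !== modelDimension;
}

// e.g. switching from a 1536-dimension model to a 1024-dimension model
console.log(needsRebuild(1536, 1024)); // true
console.log(needsRebuild(1024, 1024)); // false
```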
Zod compatibility note: this server and its dependencies require zod v3. Installs will fail if a v4 copy is hoisted into node_modules; the postinstall script enforces the v3 constraint to avoid runtime errors such as keyValidator._parse is not a function.
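A guard of this kind typically reduces to a major-version check on the resolved zod copy. The sketch below is hypothetical — the package's actual postinstall script may do more (e.g. inspect `node_modules/zod/package.json` directly):

```typescript
// Hypothetical version guard: the resolved zod copy must be v3.x.
function isAllowedZodVersion(version: string): boolean {
  const major = Number.parseInt(version.split(".")[0], 10);
  return major === 3;
}

console.log(isAllowedZodVersion("3.23.8")); // true
console.log(isAllowedZodVersion("4.0.1"));  // false
```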
Need to Contribute?
Development setup, testing, and release workflows live in CONTRIBUTING.md.
Support
Open an issue with your Node.js version, client details, and any relevant log output if you run into trouble. We're happy to help you get connected.
