@aigne/example-afs-mcp-server
v1.1.3
A demonstration of using the AIGNE Framework with AFS to mount an MCP server
AFS MCP Server Example
This example shows how to mount any MCP (Model Context Protocol) server as an AFS module, making it accessible to AI agents through a unified file system interface. We use the GitHub MCP Server as a real-world demonstration.
What You'll See
User asks: "Search for a repo named aigne"
Behind the scenes:
- LLM calls afs_exec → /modules/github-mcp-server/search_repositories
- MCP server searches GitHub and returns JSON results
- LLM presents results naturally: "Found 89 repositories. Notable matches: aigne-framework..."
The power: AI agents can access GitHub (or any MCP server) through a simple, unified AFS interface - just like accessing files!
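The routing idea behind that unified interface can be sketched in a few lines of plain TypeScript. This toy dispatcher is illustrative only (it is not the @aigne/afs API): modules mount at paths, and a single exec call routes to whichever module owns the path prefix.

```typescript
// Toy sketch of path-based module dispatch (illustrative; NOT the @aigne/afs API).
// Each mounted module exposes named tools; exec() routes a path to the right one.
type Tool = (input: Record<string, unknown>) => string;

class ToyAFS {
  private modules = new Map<string, Record<string, Tool>>();

  // Register a module's tools under a mount path.
  mount(path: string, tools: Record<string, Tool>): this {
    this.modules.set(path, tools);
    return this;
  }

  // Route "/mount/path/tool_name" to the matching module's tool.
  exec(path: string, input: Record<string, unknown>): string {
    for (const [mountPath, tools] of this.modules) {
      if (path.startsWith(mountPath + "/")) {
        const toolName = path.slice(mountPath.length + 1);
        const tool = tools[toolName];
        if (tool) return tool(input);
      }
    }
    throw new Error(`No tool mounted at ${path}`);
  }
}

const afs = new ToyAFS().mount("/modules/github-mcp-server", {
  search_repositories: (input) => `results for ${input.query}`,
});

console.log(afs.exec("/modules/github-mcp-server/search_repositories", { query: "aigne" }));
// prints "results for aigne"
```

The real AFS does far more (listing, reading, history), but the core benefit is the same: one call shape, any number of backends.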
Prerequisites
- Node.js (>=20.0) and npm installed on your machine
- Docker installed and running
- A GitHub Personal Access Token for GitHub API access
- An OpenAI API key for interacting with OpenAI's services
- pnpm installed (only if running the example from source code)
Quick Start (No Installation Required)
```shell
# Set your GitHub Personal Access Token
export GITHUB_PERSONAL_ACCESS_TOKEN=your_github_token_here

# Set your OpenAI API key
export OPENAI_API_KEY=your_openai_api_key_here

# Run in interactive chat mode
npx -y @aigne/example-afs-mcp-server --chat

# Ask a specific question
npx -y @aigne/example-afs-mcp-server --input "Search for a repo named aigne"
```
See It In Action
Here's what happens when you ask to search for a repository:
```
👤 You: "Search for a repo named aigne"

🤖 Agent thinks: I need to search GitHub repositories...
   → Calls: afs_exec("/modules/github-mcp-server/search_repositories")

📡 GitHub MCP Server:
   ✓ Found 89 repositories matching "aigne"

🤖 AI: "I searched GitHub for 'aigne'. Results: 89 repositories found.
   Notable matches:
   • aigne-framework (AIGNE-io/aigne-framework) - ⭐ 150 stars
   • aigne-examples (user/aigne-examples) - ⭐ 12 stars
   ...
   Would you like me to open any of these repos or see more details?"
```
Key insight: The agent treats the GitHub MCP Server like any other AFS module - no special integration code needed!
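On the wire, the agent's tool call might look roughly like this. The field names below are assumptions for illustration, not the exact afs_exec schema:

```typescript
// Hypothetical shape of the afs_exec tool call (field names are illustrative only).
const toolCall = {
  tool: "afs_exec",
  arguments: {
    path: "/modules/github-mcp-server/search_repositories",
    input: { query: "aigne" },
  },
};

console.log(toolCall.arguments.path);
// prints "/modules/github-mcp-server/search_repositories"
```

The point is that only the path and input vary between services; the tool name stays the same.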
Installation
Clone the Repository
```shell
git clone https://github.com/AIGNE-io/aigne-framework
```
Install Dependencies
```shell
cd aigne-framework/examples/afs-mcp-server
pnpm install
```
Set Up Environment Variables
Set your API keys in the .env.local file:
```shell
GITHUB_PERSONAL_ACCESS_TOKEN="" # Set your GitHub Personal Access Token here
OPENAI_API_KEY="" # Set your OpenAI API key here
```
Using Different Models
You can use different AI models by setting the MODEL environment variable along with the corresponding API key. The framework supports multiple providers:
- OpenAI: MODEL="openai:gpt-4.1" with OPENAI_API_KEY
- Anthropic: MODEL="anthropic:claude-3-7-sonnet-latest" with ANTHROPIC_API_KEY
- Google Gemini: MODEL="gemini:gemini-2.0-flash" with GEMINI_API_KEY
- AWS Bedrock: MODEL="bedrock:us.amazon.nova-premier-v1:0" with AWS credentials
- DeepSeek: MODEL="deepseek:deepseek-chat" with DEEPSEEK_API_KEY
- OpenRouter: MODEL="openrouter:openai/gpt-4o" with OPEN_ROUTER_API_KEY
- xAI: MODEL="xai:grok-2-latest" with XAI_API_KEY
- Ollama: MODEL="ollama:llama3.2" with OLLAMA_DEFAULT_BASE_URL
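For example, to switch from OpenAI to Anthropic, set the two variables from the list above before starting (the key value here is a placeholder):

```shell
# Pick a provider/model from the list above and set its matching key
export MODEL="anthropic:claude-3-7-sonnet-latest"
export ANTHROPIC_API_KEY=your_anthropic_api_key_here
```

With these set, pnpm start --chat (or the npx command from Quick Start) runs against the selected model.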
For detailed configuration examples, please refer to the .env.local.example file in this directory.
Run the Example
```shell
# Run in interactive chat mode
pnpm start --chat

# Run with a single message
pnpm start --input "What are the recent issues in the AIGNE repository?"
```
How It Works: 3 Simple Steps
1. Launch the MCP Server
```typescript
import { MCPAgent } from "@aigne/core";

const mcpAgent = await MCPAgent.from({
  command: "docker",
  args: [
    "run", "-i", "--rm",
    "-e", `GITHUB_PERSONAL_ACCESS_TOKEN=${process.env.GITHUB_PERSONAL_ACCESS_TOKEN}`,
    "ghcr.io/github/github-mcp-server",
  ],
});
```
2. Mount It as an AFS Module
```typescript
import { AFS } from "@aigne/afs";
import { AFSHistory } from "@aigne/afs-history";

const afs = new AFS()
  .mount(new AFSHistory({ storage: { url: ":memory:" } }))
  .mount(mcpAgent); // Mounted at /modules/github-mcp-server
```
3. Create an AI Agent
```typescript
import { AIAgent } from "@aigne/core";

const agent = AIAgent.from({
  instructions: "Help users interact with GitHub via the github-mcp-server module.",
  inputKey: "message",
  afs, // Agent automatically gets access to all mounted modules
});
```
That's it! The agent can now call /modules/github-mcp-server/search_repositories, /modules/github-mcp-server/list_issues, and all other GitHub MCP tools through the AFS interface.
Try These Examples
```shell
# Search for repositories
npx -y @aigne/example-afs-mcp-server --input "Search for a repo named aigne"

# Get repository information
npx -y @aigne/example-afs-mcp-server --input "Tell me about the AIGNE-io/aigne-framework repository"

# Check recent issues
npx -y @aigne/example-afs-mcp-server --input "What are the recent open issues in AIGNE-io/aigne-framework?"

# Interactive mode - ask follow-up questions naturally
npx -y @aigne/example-afs-mcp-server --chat
```
In chat mode, try:
- "Show me the most popular AIGNE repositories"
- "Search for repos about AI agents"
- "What pull requests are open in aigne-framework?"
- "Find code examples of MCPAgent usage"
Why Mount MCP as AFS?
The Problem: Each MCP server has its own protocol and tools. AI agents need custom code to work with each one.
The Solution: Mount all MCP servers as AFS modules:
```typescript
const afs = new AFS()
  .mount("/github", await MCPAgent.from({ /* GitHub MCP */ }))
  .mount("/slack", await MCPAgent.from({ /* Slack MCP */ }))
  .mount("/notion", await MCPAgent.from({ /* Notion MCP */ }));

// Now the agent uses ONE interface (afs_exec) to access ALL services!
```
Benefits:
- Unified Interface: All MCP servers accessible through afs_list, afs_read, afs_exec
- Composability: Mix MCP servers with file systems, databases, custom modules
- Path-Based: Multiple MCP servers coexist at different paths
- No Rewiring: AI agents work with any mounted MCP server automatically
Use Any MCP Server
Replace GitHub with any MCP server:
```typescript
// Slack MCP Server
.mount(await MCPAgent.from({
  command: "npx",
  args: ["-y", "@modelcontextprotocol/server-slack"],
  env: { SLACK_BOT_TOKEN: process.env.SLACK_BOT_TOKEN },
}))

// File System MCP Server
.mount(await MCPAgent.from({
  command: "npx",
  args: ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/files"],
}))

// Postgres MCP Server
.mount(await MCPAgent.from({
  command: "npx",
  args: ["-y", "@modelcontextprotocol/server-postgres"],
  env: { POSTGRES_CONNECTION_STRING: process.env.DATABASE_URL },
}))
```
Mix MCP with Other AFS Modules
```typescript
import { LocalFS } from "@aigne/afs-local-fs";
import { UserProfileMemory } from "@aigne/afs-user-profile-memory";

const afs = new AFS()
  .mount(new AFSHistory({ storage: { url: ":memory:" } }))
  .mount(new LocalFS({ localPath: "./docs" }))
  .mount(new UserProfileMemory({ context }))
  .mount(await MCPAgent.from({ /* GitHub MCP */ }))
  .mount(await MCPAgent.from({ /* Slack MCP */ }));

// Agent now has: history, local files, user profiles, GitHub, Slack!
```
Related Examples
- AFS Memory Example - Conversational memory with user profiles
- AFS LocalFS Example - File system access with AI agents
TypeScript Support
This package includes full TypeScript type definitions.
