
@aigne/example-afs-mcp-server

v1.1.3


A demonstration of using the AIGNE Framework with AFS to mount an MCP server


AFS MCP Server Example

This example shows how to mount any MCP (Model Context Protocol) server as an AFS module, making it accessible to AI agents through a unified file system interface. We use the GitHub MCP Server as a real-world demonstration.

What You'll See

User asks: "Search for a repo named aigne"

Behind the scenes:

  1. LLM calls afs_exec with /modules/github-mcp-server/search_repositories
  2. MCP server searches GitHub and returns JSON results
  3. LLM presents results naturally: "Found 89 repositories. Notable matches: aigne-framework..."

The power: AI agents can access GitHub (or any MCP server) through a simple, unified AFS interface - just like accessing files!
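
Conceptually, the tool call the model emits in step 1 has a shape roughly like the one below. This is an illustrative TypeScript sketch of the call's structure only, not the framework's exact wire format; the afs_exec tool name and module path come from the flow above, while the query argument is a hypothetical example.

// Illustrative shape of the afs_exec tool call (hypothetical, not the exact wire format)
const exampleToolCall = {
  tool: "afs_exec",
  arguments: {
    path: "/modules/github-mcp-server/search_repositories",
    input: { query: "aigne" }, // hypothetical arguments forwarded to the MCP tool
  },
};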

Prerequisites

  • Node.js (>=20.0) and npm installed on your machine
  • Docker installed and running
  • A GitHub Personal Access Token for GitHub API access
  • An OpenAI API key for interacting with OpenAI's services
  • Optional dependencies (if running the example from source code):
    • Pnpm for package management
    • Bun for running unit tests & examples

Quick Start (No Installation Required)

# Set your GitHub Personal Access Token
export GITHUB_PERSONAL_ACCESS_TOKEN=your_github_token_here

# Set your OpenAI API key
export OPENAI_API_KEY=your_openai_api_key_here

# Run in interactive chat mode
npx -y @aigne/example-afs-mcp-server --chat

# Ask a specific question
npx -y @aigne/example-afs-mcp-server --input "Search for a repo named aigne"

See It In Action

Here's what happens when you ask to search for a repository:

👤 You: "Search for a repo named aigne"

🤖 Agent thinks: I need to search GitHub repositories...
   → Calls: afs_exec("/modules/github-mcp-server/search_repositories")

📡 GitHub MCP Server:
   ✓ Found 89 repositories matching "aigne"

🤖 AI: "I searched GitHub for 'aigne'. Results: 89 repositories found.

       Notable matches:
       • aigne-framework (AIGNE-io/aigne-framework) - ⭐ 150 stars
       • aigne-examples (user/aigne-examples) - ⭐ 12 stars
       ...

       Would you like me to open any of these repos or see more details?"

Key insight: The agent treats the GitHub MCP Server like any other AFS module - no special integration code needed!

Installation

Clone the Repository

git clone https://github.com/AIGNE-io/aigne-framework

Install Dependencies

cd aigne-framework/examples/afs-mcp-server

pnpm install

Setup Environment Variables

Set up your API keys in the .env.local file:

GITHUB_PERSONAL_ACCESS_TOKEN="" # Set your GitHub Personal Access Token here
OPENAI_API_KEY="" # Set your OpenAI API key here

Using Different Models

You can use different AI models by setting the MODEL environment variable along with the corresponding API key. The framework supports multiple providers:

  • OpenAI: MODEL="openai:gpt-4.1" with OPENAI_API_KEY
  • Anthropic: MODEL="anthropic:claude-3-7-sonnet-latest" with ANTHROPIC_API_KEY
  • Google Gemini: MODEL="gemini:gemini-2.0-flash" with GEMINI_API_KEY
  • AWS Bedrock: MODEL="bedrock:us.amazon.nova-premier-v1:0" with AWS credentials
  • DeepSeek: MODEL="deepseek:deepseek-chat" with DEEPSEEK_API_KEY
  • OpenRouter: MODEL="openrouter:openai/gpt-4o" with OPEN_ROUTER_API_KEY
  • xAI: MODEL="xai:grok-2-latest" with XAI_API_KEY
  • Ollama: MODEL="ollama:llama3.2" with OLLAMA_DEFAULT_BASE_URL

For detailed configuration examples, please refer to the .env.local.example file in this directory.
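
For example, to run this example against Anthropic's Claude instead of OpenAI, the corresponding .env.local entries would look roughly like this (the model and key names come from the list above; this is a sketch, so adjust it to whichever provider you actually use):

MODEL="anthropic:claude-3-7-sonnet-latest"
ANTHROPIC_API_KEY="" # Set your Anthropic API key here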

Run the Example

# Run in interactive chat mode
pnpm start --chat

# Run with a single message
pnpm start --input "What are the recent issues in the AIGNE repository?"

How It Works: 3 Simple Steps

1. Launch the MCP Server

import { MCPAgent } from "@aigne/core";

const mcpAgent = await MCPAgent.from({
  command: "docker",
  args: [
    "run", "-i", "--rm",
    "-e", `GITHUB_PERSONAL_ACCESS_TOKEN=${process.env.GITHUB_PERSONAL_ACCESS_TOKEN}`,
    "ghcr.io/github/github-mcp-server",
  ],
});

2. Mount It as an AFS Module

import { AFS } from "@aigne/afs";
import { AFSHistory } from "@aigne/afs-history";

const afs = new AFS()
  .mount(new AFSHistory({ storage: { url: ":memory:" } }))
  .mount(mcpAgent);  // Mounted at /modules/github-mcp-server

3. Create an AI Agent

import { AIAgent } from "@aigne/core";

const agent = AIAgent.from({
  instructions: "Help users interact with GitHub via the github-mcp-server module.",
  inputKey: "message",
  afs,  // Agent automatically gets access to all mounted modules
});

That's it! The agent can now call /modules/github-mcp-server/search_repositories, /modules/github-mcp-server/list_issues, and all other GitHub MCP tools through the AFS interface.
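
Putting the three steps together, an end-to-end run looks roughly like the sketch below. It reuses the snippets above verbatim; the OpenAIChatModel import from @aigne/openai and the AIGNE/invoke pattern are assumptions based on other AIGNE Framework examples, so check this example's source for the exact wiring.

import { AFS } from "@aigne/afs";
import { AFSHistory } from "@aigne/afs-history";
import { AIAgent, AIGNE, MCPAgent } from "@aigne/core";
import { OpenAIChatModel } from "@aigne/openai"; // assumption: model package/class name

// 1. Launch the GitHub MCP server in Docker
const mcpAgent = await MCPAgent.from({
  command: "docker",
  args: [
    "run", "-i", "--rm",
    "-e", `GITHUB_PERSONAL_ACCESS_TOKEN=${process.env.GITHUB_PERSONAL_ACCESS_TOKEN}`,
    "ghcr.io/github/github-mcp-server",
  ],
});

// 2. Mount it as an AFS module alongside conversation history
const afs = new AFS()
  .mount(new AFSHistory({ storage: { url: ":memory:" } }))
  .mount(mcpAgent); // Mounted at /modules/github-mcp-server

// 3. Create the agent and run a single request
const agent = AIAgent.from({
  instructions: "Help users interact with GitHub via the github-mcp-server module.",
  inputKey: "message",
  afs,
});

const aigne = new AIGNE({ model: new OpenAIChatModel() }); // assumption: invocation pattern
const result = await aigne.invoke(agent, { message: "Search for a repo named aigne" });
console.log(result);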

Try These Examples

# Search for repositories
npx -y @aigne/example-afs-mcp-server --input "Search for a repo named aigne"

# Get repository information
npx -y @aigne/example-afs-mcp-server --input "Tell me about the AIGNE-io/aigne-framework repository"

# Check recent issues
npx -y @aigne/example-afs-mcp-server --input "What are the recent open issues in AIGNE-io/aigne-framework?"

# Interactive mode - ask follow-up questions naturally
npx -y @aigne/example-afs-mcp-server --chat

In chat mode, try:

  • "Show me the most popular AIGNE repositories"
  • "Search for repos about AI agents"
  • "What pull requests are open in aigne-framework?"
  • "Find code examples of MCPAgent usage"

Why Mount MCP as AFS?

The Problem: Each MCP server exposes its own set of tools, so without a common abstraction AI agents need custom integration code for every server they talk to.

The Solution: Mount all MCP servers as AFS modules:

const afs = new AFS()
  .mount("/github", await MCPAgent.from({ /* GitHub MCP */ }))
  .mount("/slack", await MCPAgent.from({ /* Slack MCP */ }))
  .mount("/notion", await MCPAgent.from({ /* Notion MCP */ }));

// Now the agent uses ONE interface (afs_exec) to access ALL services!

Benefits:

  • Unified Interface: All MCP servers accessible through afs_list, afs_read, afs_exec
  • Composability: Mix MCP servers with file systems, databases, custom modules
  • Path-Based: Multiple MCP servers coexist at different paths
  • No Rewiring: AI agents work with any mounted MCP server automatically

Use Any MCP Server

Replace GitHub with any MCP server:

// Slack MCP Server
.mount(await MCPAgent.from({
  command: "npx",
  args: ["-y", "@modelcontextprotocol/server-slack"],
  env: { SLACK_BOT_TOKEN: process.env.SLACK_BOT_TOKEN },
}))

// File System MCP Server
.mount(await MCPAgent.from({
  command: "npx",
  args: ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/files"],
}))

// Postgres MCP Server
.mount(await MCPAgent.from({
  command: "npx",
  args: ["-y", "@modelcontextprotocol/server-postgres"],
  env: { POSTGRES_CONNECTION_STRING: process.env.DATABASE_URL },
}))

Mix MCP with Other AFS Modules

import { LocalFS } from "@aigne/afs-local-fs";
import { UserProfileMemory } from "@aigne/afs-user-profile-memory";

const afs = new AFS()
  .mount(new AFSHistory({ storage: { url: ":memory:" } }))
  .mount(new LocalFS({ localPath: "./docs" }))
  .mount(new UserProfileMemory({ context }))
  .mount(await MCPAgent.from({ /* GitHub MCP */ }))
  .mount(await MCPAgent.from({ /* Slack MCP */ }));

// Agent now has: history, local files, user profiles, GitHub, Slack!

Related Examples

MCP Resources

TypeScript Support

This package includes full TypeScript type definitions.

License

MIT