openapi-to-mcp
v0.1.5
Model Context Protocol (MCP) generator for OpenAPI/Swagger specifications
OpenAPI-to-MCP
Generate AI-friendly interfaces from your existing Swagger/OpenAPI specs.
This tool converts your APIs into the Model Context Protocol (MCP) format, which lets large language models drive tools and environments.
Overview
This tool converts OpenAPI/Swagger specifications into the Model Context Protocol (MCP) format, making it easy to create AI agents that can interact with your APIs.
Features
- 📥 Generate universal mcp.json from OpenAPI/Swagger files
- 🔄 Auto-extract state schema from suitable GET endpoints
- 📤 Export to multiple formats:
- 📜 Prompt instructions with state information
- 🧱 JSON action templates with getState action
- 🔧 Function schemas with getState function (OpenAI compatible)
- 🚀 MCP server TypeScript file compatible with Claude Desktop
- 🔌 Standalone executor for API interaction (LangChain/Web compatible)
- 📂 Plug-and-play handler files with README
- 🔗 LangChain tools with argsSchema and toolloader
- 🔌 OpenAI plugin manifest with deployment instructions
- 🧪 Simulation mode for testing without backend changes
Use Cases
- Build Claude Desktop-compatible tools in seconds — Generate an MCP server that works directly with Claude
- Turn any OpenAPI spec into LangChain/AutoGPT tools — Use the executor or LangChain tools for AI interfaces in any framework
- Power AI interfaces for existing microservices — No backend changes required, works with existing APIs
- Create ChatGPT plugins effortlessly — Generate OpenAI plugin manifests with proper schemas
- Prototype AI agents with minimal setup — Use simulation mode to test AI interaction with your API
- Create custom handler logic — Extend handlers with preprocessing, caching, or business logic
Installation
# Install globally
npm install -g openapi-to-mcp
# Or use directly with npx
npx openapi-to-mcp <swagger-file>
Usage
# Basic usage - generates all outputs
openapi-to-mcp path/to/swagger.yaml
# Specify output directory
openapi-to-mcp path/to/swagger.yaml -o ./custom-output
# Generate only specific formats
openapi-to-mcp path/to/swagger.yaml --prompt --functions
# Generate MCP server for Claude Desktop
openapi-to-mcp path/to/swagger.yaml --server --api-url https://your-api.com
# Generate standalone executor for any framework
openapi-to-mcp path/to/swagger.yaml --executor --api-url https://your-api.com
# Generate individual handler files for customization
openapi-to-mcp path/to/swagger.yaml --handlers --api-url https://your-api.com
# Generate LangChain tools for your API
openapi-to-mcp path/to/swagger.yaml --langchain --api-url https://your-api.com
# Generate OpenAI plugin manifest files
openapi-to-mcp path/to/swagger.yaml --openai-plugin --api-url https://your-api.com
# Specify a particular endpoint for state schema
openapi-to-mcp path/to/swagger.yaml --state-endpoint /status
# Simulate AI interaction with your API
openapi-to-mcp path/to/swagger.yaml --simulate "list all available pets" --api-url https://your-api.com
# Simulate using Claude instead of OpenAI (default)
openapi-to-mcp path/to/swagger.yaml --simulate "add a new pet" --provider claude --api-url https://your-api.com
# See all options
openapi-to-mcp --help
Output Files
Running the generator creates the following files:
- generated.mcp.json: The MCP specification for your API
- prompt.txt: Prompt instructions for LLMs
- templates.json: JSON action templates
- functionSchemas.json: OpenAI-compatible function schemas
- mcp-server.ts: Ready-to-use TypeScript MCP server implementation for Claude Desktop
- executor.ts: Standalone executor for using API actions in any framework
- handlers/: Directory with individual handler implementations for each action
- langchain-tools.ts: Ready-to-use LangChain tools with Zod validation
- langchain-toolloader.ts: Helper for selective tool loading
- .well-known/ai-plugin.json: OpenAI plugin manifest file
- OPENAI-PLUGIN-README.md: Deployment instructions for the OpenAI plugin
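The exact layout of generated.mcp.json is determined by the tool; as a rough illustration only (the field names below are assumptions, not the tool's guaranteed schema), it pairs the extracted actions with the detected state schema:

```json
{
  "actions": [
    {
      "name": "listPets",
      "method": "GET",
      "path": "/pets",
      "parameters": { "limit": { "type": "integer" } }
    }
  ],
  "stateSchema": {
    "endpoint": "/status",
    "schema": {
      "type": "object",
      "properties": { "pets": { "type": "array" } }
    }
  }
}
```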
Using the Generated Files
MCP Server for Claude Desktop
- Install dependencies:
npm install @modelcontextprotocol/sdk zod
- Compile the server:
tsc mcp-server.ts --esModuleInterop --module nodenext
- Run with Claude Desktop:
claude tools register mcp-server.js
Standalone Executor
// Example usage with any framework
import { ApiExecutor } from "./executor";
async function main() {
const api = new ApiExecutor("https://your-api.com");
// Get API state
const state = await api.getState();
console.log("Current state:", state);
// Execute an action
const result = await api.execute("listPets", { limit: 10 });
console.log("Pets:", result);
}

main().catch(console.error);
LangChain Tools
// Example usage with LangChain
import { ChatOpenAI } from "langchain/chat_models/openai";
import { AgentExecutor, createStructuredChatAgent } from "langchain/agents";
import { loadTools } from "./langchain-toolloader";
async function main() {
const model = new ChatOpenAI({
temperature: 0,
modelName: "gpt-4-turbo",
});
// Load all tools or specify which ones to load
const tools = loadTools(["listPets", "getPet", "getState"]);
const agent = createStructuredChatAgent({
llm: model,
tools,
});
const agentExecutor = new AgentExecutor({
agent,
tools,
});
const result = await agentExecutor.invoke({
input:
"What pets are available and can you show me details of pet with ID 1?",
});
console.log(result.output);
}

main().catch(console.error);
OpenAI Plugin
Follow the instructions in OPENAI-PLUGIN-README.md to deploy your OpenAI plugin:
- Host your API on a public server
- Copy the .well-known/ai-plugin.json file to your server
- Ensure your OpenAPI spec is available at the URL specified in the plugin manifest
- Register your plugin with OpenAI
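For orientation, an OpenAI plugin manifest generally has the shape below; every value here is a placeholder for your own deployment, not output copied from the tool:

```json
{
  "schema_version": "v1",
  "name_for_human": "Pet Store",
  "name_for_model": "pet_store",
  "description_for_human": "Browse and manage pets.",
  "description_for_model": "Plugin for listing, adding, and inspecting pets. Use getState to read the current API state.",
  "auth": { "type": "none" },
  "api": { "type": "openapi", "url": "https://your-api.com/openapi.yaml" },
  "logo_url": "https://your-api.com/logo.png",
  "contact_email": "support@your-api.com",
  "legal_info_url": "https://your-api.com/legal"
}
```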
Simulation Mode
Simulation mode lets you test AI interaction with your API without any backend changes:
# Set your API key (required for simulation)
export OPENAI_API_KEY=your_key_here
# Or for Claude
export CLAUDE_API_KEY=your_key_here
# Run a simulation
openapi-to-mcp path/to/swagger.yaml --simulate "find pets with tag 'dog'" --api-url https://pet-api.com
This will:
- Parse your OpenAPI spec
- Generate necessary handler files
- Send the request to the LLM with function schemas
- Execute API call via the executor
- Return the LLM's final response with data
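The dispatch step in this loop can be sketched as follows. This is a minimal illustration, not the tool's internals: the LLM call is stubbed, and the handler table and pet data are assumptions standing in for the generated handlers/ directory.

```typescript
// Shape of a function call chosen by the LLM from the function schemas.
type ToolCall = { name: string; args: Record<string, unknown> };

// Stand-in for the generated handlers/ directory (illustrative data).
const handlers: Record<string, (args: Record<string, unknown>) => unknown> = {
  listPets: (args) =>
    [{ id: 1, name: "Rex", tag: "dog" }].slice(0, Number(args.limit ?? 10)),
};

function simulate(prompt: string, llm: (p: string) => ToolCall): unknown {
  const call = llm(prompt); // 1. the LLM picks an action from the schemas
  const handler = handlers[call.name]; // 2. dispatch to the matching handler
  if (!handler) throw new Error(`Unknown action: ${call.name}`);
  return handler(call.args); // 3. execute and return the data to the LLM
}

// A stubbed "LLM" that always chooses listPets.
export const result = simulate("find pets with tag 'dog'", () => ({
  name: "listPets",
  args: { limit: 5 },
}));
```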
State Integration
The tool integrates state information into all exports:
- MCP JSON: Includes a complete stateSchema section with structure and examples
- Prompt Text: Describes the state structure and provides an example
- Function Schemas: Adds a getState function for retrieving the current state
- Action Templates: Includes a getState action with empty parameters
- Executor: Includes a getState() method for retrieving the current state
- Handler Files: Includes a getState.ts handler file
- LangChain Tools: Includes a getState tool for retrieving the current state
- OpenAI Plugin: Includes state description in the plugin manifest
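A generated handler file might look roughly like the sketch below. The /status endpoint and the injectable fetch parameter are assumptions for this example, not the tool's exact output:

```typescript
// Illustrative sketch of a generated handler file (e.g. handlers/getState.ts).
export async function getState(
  apiUrl: string,
  fetchImpl: typeof fetch = fetch, // injectable for testing or custom transport
): Promise<unknown> {
  const res = await fetchImpl(`${apiUrl}/status`);
  if (!res.ok) throw new Error(`getState failed with HTTP ${res.status}`);
  return res.json(); // the state object described by the extracted schema
}
```

Because each handler is a plain exported function, it is easy to wrap with caching, preprocessing, or other business logic.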
State Schema Detection
The tool automatically searches for a suitable GET endpoint to use as the state schema source, giving priority to endpoints whose paths contain:
- state
- status
- scene
- objects
- tracks
- world
You can also manually specify an endpoint using the --state-endpoint option.
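The priority search described above can be sketched like this; the function and constant names are assumptions for illustration, not the tool's internals:

```typescript
// Keywords checked in priority order, per the list above.
const STATE_KEYWORDS = ["state", "status", "scene", "objects", "tracks", "world"];

export function pickStateEndpoint(getPaths: string[]): string | undefined {
  for (const keyword of STATE_KEYWORDS) {
    const match = getPaths.find((p) => p.toLowerCase().includes(keyword));
    if (match) return match; // first keyword hit wins
  }
  return undefined; // caller falls back to --state-endpoint or skips state
}
```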
Project Structure
openapi-to-mcp/
├── index.ts // CLI entry point
├── parser.ts // Swagger parsing
├── generator/
│ ├── extractor.ts // Convert Swagger → actions
│ ├── extractState.ts // Extract state schema
│ ├── mcpBuilder.ts // Generate MCP JSON
│ ├── exporters/
│ │ ├── toPrompt.ts
│ │ ├── toFunctionSchemas.ts
│ │ ├── toTemplates.ts
│ │ ├── generateMcpServer.ts
│ │ ├── generateExecutor.ts
│ │ ├── generateHandlers.ts
│ │ ├── simulate.ts
│ │ ├── toLangChainTools.ts
│ │ └── toOpenAIPlugin.ts
Development and Publishing
Local Development
- Clone the repository
- Install dependencies:
npm install
- Build the project:
npm run build
- Run locally:
npm start -- path/to/swagger.yaml
Publishing to npm
Manual Publishing
- Log in to npm:
npm login
- Build and publish:
npm run build
npm publish --access public
Automated Publishing via GitHub Actions
This project uses GitHub Actions for CI/CD to automatically publish new versions to npm when a new tag is pushed:
Update version in package.json:
npm version patch # or minor/major
This will automatically create a git tag.
Push the new tag to GitHub:
git push origin --tags
The GitHub Action will trigger and publish the new version to npm.
Setting up CI/CD for Your Fork
If you fork this project, you'll need to set up your own npm publishing:
- Create an npm account and get an access token from npmjs.com → Access Tokens
- Add this token to your GitHub repository as a secret named NPM_TOKEN
- Update the package name in package.json to avoid conflicts
Contributing
We welcome contributions from the community! Please read our contribution guidelines before submitting a pull request.
