# petstormcp-cliente

v0.1.1

Model Context Protocol (MCP) Server for the petstormcp-cliente API.
> [!IMPORTANT]
> This MCP Server is not yet ready for production use. To complete setup, follow the steps outlined in your workspace. Delete this notice before publishing to a package manager.
## Summary

Petstore - OpenAPI 3.1: This is a sample Pet Store Server based on the OpenAPI 3.1 specification.

Some useful links:

- For more information about the API: Find out more about Swagger

## Table of Contents

- [Installation](#installation)
- [Progressive Discovery](#progressive-discovery)
- [Development](#development)
- [Publishing to Anthropic MCP Registry](#publishing-to-anthropic-mcp-registry)
- [Contributions](#contributions)
## Installation

> [!TIP]
> To finish publishing your MCP Server to npm and others, you must run your first generation action.

### Claude Desktop

Install the MCP server as a Desktop Extension using the pre-built `mcp-server.mcpb` file: simply drag and drop the `mcp-server.mcpb` file onto Claude Desktop to install the extension.

The MCP bundle package includes the MCP server and all necessary configuration. Once installed, the server is available without additional setup.

> [!NOTE]
> MCP bundles provide a streamlined way to package and distribute MCP servers. Learn more about Desktop Extensions.
### Cursor

Or manually:

- Open Cursor Settings
- Select Tools and Integrations
- Select New MCP Server
- If the configuration file is empty, paste the following JSON into the MCP Server Configuration:

```json
{
  "mcpServers": {
    "Petstore": {
      "command": "npx",
      "args": [
        "petstormcp-cliente",
        "start",
        "--environment",
        "prod",
        "--api-key",
        ""
      ]
    }
  }
}
```

### Claude Code

```shell
claude mcp add Petstore -- npx -y petstormcp-cliente start --environment prod --api-key
```

### Gemini CLI

```shell
gemini mcp add Petstore -- npx -y petstormcp-cliente start --environment prod --api-key
```

### Windsurf

Refer to the official Windsurf documentation for the latest information.
- Open Windsurf Settings
- Select Cascade in the left side menu
- Click on Manage MCPs (to manage MCPs you should be signed in with a Windsurf account)
- Click on View raw config to open the MCP configuration file
- If the configuration file is empty, paste the full JSON:

```json
{
  "mcpServers": {
    "Petstore": {
      "command": "npx",
      "args": [
        "petstormcp-cliente",
        "start",
        "--environment",
        "prod",
        "--api-key",
        ""
      ]
    }
  }
}
```

### VS Code

Or manually:
Refer to the official VS Code documentation for the latest information.

- Open the Command Palette
- Search for and open MCP: Open User Configuration (this opens the mcp.json file)
- If the configuration file is empty, paste the full JSON:

```json
{
  "mcpServers": {
    "Petstore": {
      "command": "npx",
      "args": [
        "petstormcp-cliente",
        "start",
        "--environment",
        "prod",
        "--api-key",
        ""
      ]
    }
  }
}
```

You can also run the server directly from the command line:

```shell
npx petstormcp-cliente start --environment prod --api-key
```

For a full list of server arguments, run:

```shell
npx petstormcp-cliente --help
```

## Progressive Discovery
MCP servers with many tools can bloat LLM context windows, leading to increased token usage and tool confusion. Dynamic mode solves this by exposing only a small set of meta-tools that let agents progressively discover and invoke tools on demand.
To enable dynamic mode, pass the `--mode dynamic` flag when starting your server:
```json
{
  "mcpServers": {
    "Petstore": {
      "command": "npx",
      "args": ["petstormcp-cliente", "start", "--mode", "dynamic"]
      // ... other server arguments
    }
  }
}
```

In dynamic mode, the server registers only the following meta-tools instead of every individual tool:

- `list_tools`: Lists all available tools with their names and descriptions.
- `describe_tool`: Returns the input schema for one or more tools by name.
- `execute_tool`: Executes a tool by name with the provided input parameters.

This approach significantly reduces the number of tokens sent to the LLM on each request, which is especially useful for servers with a large number of tools.
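To make the discovery loop concrete, here is a minimal TypeScript sketch of the pattern: an in-memory registry of concrete tools hidden behind the three meta-tools. The registry shape and the `getPetById` tool are illustrative assumptions, not this server's actual internals.

```typescript
// Sketch of the dynamic-mode pattern: concrete tools live in a registry,
// and only three meta-tools are exposed to the agent. The Tool shape and
// the getPetById example are hypothetical, for illustration only.
type Tool = {
  description: string;
  inputSchema: object;
  handler: (input: Record<string, unknown>) => unknown;
};

const registry: Record<string, Tool> = {
  getPetById: {
    description: "Returns a single pet by ID",
    inputSchema: {
      type: "object",
      properties: { petId: { type: "integer" } },
      required: ["petId"],
    },
    handler: (input) => ({ id: input.petId, name: "doggie" }),
  },
};

// list_tools: names and descriptions only -- the cheap first call,
// instead of loading every tool schema into the LLM context.
function listTools(): { name: string; description: string }[] {
  return Object.entries(registry).map(([name, t]) => ({
    name,
    description: t.description,
  }));
}

// describe_tool: the full input schema, fetched on demand for one tool.
function describeTool(name: string): object {
  const tool = registry[name];
  if (!tool) throw new Error(`Unknown tool: ${name}`);
  return tool.inputSchema;
}

// execute_tool: dispatch to the concrete handler by name.
function executeTool(name: string, input: Record<string, unknown>): unknown {
  const tool = registry[name];
  if (!tool) throw new Error(`Unknown tool: ${name}`);
  return tool.handler(input);
}

console.log(listTools());
console.log(executeTool("getPetById", { petId: 1 }));
```

In the real server these three functions are registered as MCP tools, so an agent first calls `list_tools`, then `describe_tool` for the tool it needs, then `execute_tool` to invoke it.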
## Development

Run locally without a published npm package:

1. Clone this repository
2. Run `npm install`
3. Run `npm run build`
4. Run `node ./bin/mcp-server.js start --environment prod --api-key`

To use this local version with Cursor, Claude or other MCP clients, you'll need to add the following config:
```json
{
  "mcpServers": {
    "Petstore": {
      "command": "node",
      "args": [
        "./bin/mcp-server.js",
        "start",
        "--environment",
        "prod",
        "--api-key",
        ""
      ]
    }
  }
}
```

Or, to debug the MCP server locally, use the official MCP Inspector:

```shell
npx @modelcontextprotocol/inspector node ./bin/mcp-server.js start --environment prod --api-key
```

## Publishing to Anthropic MCP Registry
This server generates a server.json that conforms to the official MCP Registry schema. You can publish automatically via your Speakeasy workflow or manually using the mcp-publisher CLI.
### Automated Publishing (Recommended)

Add `mcpRegistry` to the `publish` block in your `workflow.yaml`:

```yaml
targets:
  my-mcp:
    target: mcp-typescript
    source: my-source
    publish:
      npm:
        token: $NPM_TOKEN
      mcpRegistry:
        auth: github-oidc # recommended, no token needed
```

The `github-oidc` method uses GitHub Actions OIDC, so no secrets are required. For other auth methods:

- `github`: requires an `MCP_REGISTRY_TOKEN` secret (a GitHub PAT with `read:org` + `read:user` scopes)
- `dns`: requires an `MCP_REGISTRY_TOKEN` secret (an Ed25519 private key for custom domain namespaces)

When the Speakeasy workflow runs, it will automatically publish to npm first, then to the MCP Registry.
### Manual Publishing

If you prefer to publish manually, follow the official publishing guide:

1. Publish to npm:

   ```shell
   npm publish --access public
   ```

2. Install the publisher CLI:

   ```shell
   curl -sL "https://github.com/modelcontextprotocol/registry/releases/latest/download/mcp-publisher_$(uname -s | tr '[:upper:]' '[:lower:]')_$(uname -m | sed 's/x86_64/amd64/;s/aarch64/arm64/').tar.gz" | tar xz mcp-publisher && sudo mv mcp-publisher /usr/local/bin/
   ```

3. Authenticate (GitHub OAuth for `io.github.*` namespaces):

   ```shell
   mcp-publisher login github
   ```

4. Publish:

   ```shell
   mcp-publisher publish
   ```

5. Verify:

   ```shell
   curl "https://registry.modelcontextprotocol.io/v0/servers?search=<your-mcp-name>"
   ```
## Contributions
While we value contributions to this MCP Server, the code is generated programmatically. Any manual changes added to internal files will be overwritten on the next generation. We look forward to hearing your feedback. Feel free to open a PR or an issue with a proof of concept and we'll do our best to include it in a future release.
