json-mcp

Ergonomic MCP tools for efficient JSON manipulation in Claude Code.

Stop wrestling with bash/jq pipelines and multi-turn operations. json-mcp provides five simple tools that handle JSON files from small configs to 100MB+ datasets with intuitive syntax and zero setup overhead.

Why json-mcp?

Before (without MCP tools):

  • 4+ turns to filter, transform, or update JSON
  • Complex jq syntax and bash pipelines
  • Frequent errors with large files
  • Setup overhead (npm install, script writing)

After (with json-mcp):

  • 1 turn for most operations
  • Simple filter syntax: {"price": ">1000"}
  • Handles 100MB+ files with automatic streaming
  • Zero setup - just install and go

Proven Results

Rigorous testing across 5 real-world scenarios showed:

  • 76% reduction in turns per task
  • 83% reduction in tool calls
  • 100% error elimination
  • Perfect success rate maintained

Installation

Option 1: Global Installation (Recommended)

npm install -g json-mcp

Verify installation:

json-mcp --version

Note: Ensure your npm global bin directory is on your PATH. The npm bin -g command was removed in npm 9, so locate the directory via the global prefix instead (binaries live in <prefix>/bin on macOS/Linux and directly in <prefix> on Windows):

npm config get prefix

Option 2: One-Off Usage

npx json-mcp@latest

Configuration

Global Configuration

Add to ~/.claude.json:

{
  "mcpServers": {
    "json-mcp": {
      "command": "json-mcp"
    }
  }
}

After configuring, restart Claude Code for changes to take effect.

Per-Project Configuration (Portable)

Create .mcp.json in your project root:

{
  "mcpServers": {
    "json-mcp": {
      "command": "json-mcp"
    }
  }
}

With this in place, the MCP server is automatically available whenever you work in that directory. Commit .mcp.json to share the configuration with your team.
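
If you would rather not require a global install for teammates, a per-project configuration can invoke the server through npx instead. This is a sketch using the same command/args fields as the configs above and the npx usage shown under Installation:

{
  "mcpServers": {
    "json-mcp": {
      "command": "npx",
      "args": ["-y", "json-mcp@latest"]
    }
  }
}

The trade-off is a small startup delay the first time npx resolves the package.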

Tools

json_query

Extract and filter data from JSON files with simple syntax.

Features:

  • Simple comparison operators: >, <, >=, <=, =
  • Grouping and aggregation: count
  • Multiple output formats: json, table, csv, keys
  • Automatic JSONL streaming for large files
  • Preview mode with limit

Examples:

// Find expensive products
json_query({
  file: "products.json",
  filter: { "price": ">1000" },
  output: "table"
})

// Group error logs by type
json_query({
  file: "logs.jsonl",
  filter: { "level": "error" },
  groupBy: "error_type",
  aggregate: "count"
})

// Preview first 5 results
json_query({
  file: "large-dataset.json",
  filter: { "status": "active" },
  limit: 5
})
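
The feature list above also mentions a keys output format. A minimal sketch of using it to inspect a file's structure, assuming keys lists field names rather than values:

// Inspect structure without dumping values
json_query({
  file: "config.json",
  output: "keys"
})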

json_transform

Transform JSON structure and convert between formats.

Features:

  • JSON → CSV and JSON → JSONL conversion
  • Pre-filtering before transformation
  • Field mapping and renaming
  • Handles large files efficiently

Examples:

// Convert JSON to CSV
json_transform({
  file: "products.json",
  output: "products.csv",
  format: "csv"
})

// Filter then transform
json_transform({
  file: "users.json",
  output: "active-users.csv",
  format: "csv",
  filter: { "status": "active" }
})

// Field mapping
json_transform({
  file: "data.json",
  output: "mapped.json",
  format: "json",
  template: {
    "id": "sku",
    "name": "title"
  }
})
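
The feature list also mentions JSON → JSONL conversion, which the examples above don't cover. A hedged sketch, assuming the format value is "jsonl" by analogy with "csv" and "json":

// Convert a JSON array to line-delimited JSONL for streaming-friendly processing
json_transform({
  file: "products.json",
  output: "products.jsonl",
  format: "jsonl"
})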

json_update

Update or create fields in JSON files atomically with wildcard paths.

Features:

  • Create new fields: Automatically creates nested paths that don't exist
  • Wildcard path syntax: **.fieldName (no jq expertise required)
  • Conditional updates with where clause
  • Dry-run preview mode
  • Automatic backup creation
  • Atomic writes (no broken JSON)

Examples:

// Update existing field
json_update({
  file: "config.json",
  updates: [{
    path: "$.server.port",
    value: 8080
  }]
})

// Create new nested field (creates intermediate objects automatically)
json_update({
  file: "settings.json",
  updates: [{
    path: "$.mcpServers.vision",
    value: {
      command: "npx",
      args: ["-y", "mcp-gemini-vision"]
    }
  }]
})

// Update all timeout values conditionally
json_update({
  file: "config.json",
  updates: [{
    path: "**.timeout",
    value: 60000,
    where: { oldValue: 30000 }
  }]
})

// Preview changes first
json_update({
  file: "config.json",
  updates: [{
    path: "$.server.port",
    value: 8080
  }],
  dryRun: true
})

// Multiple updates at once
json_update({
  file: "settings.json",
  updates: [
    { path: "$.theme", value: "dark" },
    { path: "$.notifications.enabled", value: true },
    { path: "$.newSection.subsection.field", value: "creates full path" },
    { path: "$.permissions.allow[0]", value: "mcp__chrome-devtools__*" } // creates array as needed
  ]
})

json_validate

Validate JSON files against schemas with detailed error reporting.

Features:

  • JSON Schema validation using Ajv
  • Batch validation of multiple files
  • Detailed or summary output modes
  • Clear violation reporting
  • 100% accuracy

Examples:

// Validate single file
json_validate({
  files: "data.json",
  schema: "schema.json"
})

// Validate multiple files
json_validate({
  files: ["file1.json", "file2.json", "file3.json"],
  schema: "schema.json",
  output: "detailed"
})

// Summary mode
json_validate({
  files: "products.json",
  schema: "product-schema.json",
  output: "summary"
})
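
For context, here is a minimal, hypothetical product-schema.json that the summary-mode example above could validate against; json-mcp validates with Ajv, so any standard JSON Schema works:

{
  "type": "array",
  "items": {
    "type": "object",
    "required": ["sku", "price"],
    "properties": {
      "sku": { "type": "string" },
      "price": { "type": "number", "minimum": 0 },
      "category": { "type": "string" }
    }
  }
}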

Use Cases

Query & Filter

# Find all error logs from the last hour
json_query logs.jsonl --filter '{"level":"error","timestamp":">=2024-10-22T10:00:00Z"}' --groupBy error_type --aggregate count

Transform & Export

# Convert product catalog to CSV for Excel
json_transform products.json --output products.csv --format csv --filter '{"category":"Electronics"}'

Update Configuration

# Update all API timeouts across nested config
json_update config.json --path "**.timeout" --value 60000

Validate Data Quality

# Validate multiple data files against schema
json_validate data-*.json --schema product-schema.json --output detailed

json_audit

Explore schema and structure of large JSON/JSONL files with streaming inference.

Features:

  • Streaming inference for arrays/objects/JSONL
  • JSON Schema generation (2020-12)
  • Field statistics (types, null rates, examples)
  • Sampling and item limits for very large datasets
  • Map/dictionary detection: Objects with many dynamic keys (author names, dates, IDs) are automatically collapsed to additionalProperties pattern instead of listing thousands of keys

Examples:

// Audit file structure
json_audit({ file: "data.json" })

// Sample large files
json_audit({ file: "large.json", sampling: 0.25, maxItems: 50000 })

// Disable map collapsing for full schema
json_audit({ file: "data.json", collapseMapThreshold: 0 })

Performance

json-mcp is designed to handle large files efficiently:

  • Automatic streaming for JSONL files (tested with 100MB+)
  • Memory-efficient operations with large JSON arrays
  • Fast filtering without loading entire file into memory
  • Atomic writes prevent data corruption

Tested on:

  • 5MB product catalogs (51,000+ records)
  • 100MB+ JSONL logs (500,000+ entries)
  • 189MB GeoJSON files (deeply nested structures)

Architecture

json-mcp uses a declarative tool pattern that makes it easy to add new tools:

export const tools: Tool[] = [
  {
    name: "json_query",
    description: "Extract and query JSON data...",
    inputSchema: { /* JSON Schema */ },
    handler: jsonQueryHandler
  },
  // Add more tools...
];

See ARCHITECTURE.md for implementation details.

Development

Setup

npm install
npm run build

Development Mode

npm run dev

Testing

The project includes comprehensive test scenarios with real-world datasets. See ARCHIVE/ for testing methodology and results.

Contributing

Contributions welcome! The declarative architecture makes adding new tools straightforward:

  1. Add tool definition to src/tools/index.ts
  2. Implement handler function
  3. Update tests
  4. Submit PR
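
As a rough sketch of steps 1 and 2, following the declarative pattern shown under Architecture (the handler signature, import paths, and the json_pretty tool itself are assumptions for illustration, not the project's actual interface):

// Sketch of a hypothetical new tool for src/tools/index.ts
import { readFile, writeFile } from "node:fs/promises";

const jsonPrettyTool = {
  name: "json_pretty",
  description: "Pretty-print a JSON file with configurable indentation",
  inputSchema: {
    type: "object",
    properties: {
      file: { type: "string" },
      indent: { type: "number", default: 2 }
    },
    required: ["file"]
  },
  // The handler reads the file, re-serializes it, and writes it back
  handler: async (args: { file: string; indent?: number }) => {
    const data = JSON.parse(await readFile(args.file, "utf8"));
    await writeFile(args.file, JSON.stringify(data, null, args.indent ?? 2));
    return { content: [{ type: "text", text: `Formatted ${args.file}` }] };
  }
};

// Step 1: append jsonPrettyTool to the exported tools array shown in the Architecture section.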

Research & Validation

This project underwent rigorous comparative testing to validate its effectiveness:

  • 5 real-world test scenarios (query, transform, update, validate, large files)
  • Parallel baseline testing (without MCP tools)
  • Comparative testing (with json-mcp tools)
  • Quantified improvements (76% turn reduction, 83% tool call reduction)

For detailed methodology and results, see ARCHIVE/.

Challenge Results

We completed multiple JSON-MCP challenges to validate real-world usage:

  • JSONTestSuite parsing compatibility: see CHALLENGE_RESULTS.md
  • Real dataset filtering/aggregation/transform: see REAL_CHALLENGE_RESULTS.md

Commands used and prompts are captured in JSON_CHALLENGES.md.

License

MIT

Built with the Model Context Protocol for Claude Code