
@cyanheads/jinaai-mcp-server (v1.0.4)

A Model Context Protocol (MCP) server that provides intelligent web reading capabilities using the Jina AI Reader API. It extracts clean, LLM-ready content from any URL.

JinaAI MCP Server

An intelligent web reader tool powered by the Jina.ai Reader API, delivered as a production-grade Model Context Protocol (MCP) server.


Model Context Protocol (MCP) Server providing a robust, developer-friendly interface to the Jina.ai Reader API. Enables LLMs and AI agents to read, process, and understand content from any webpage programmatically.

Built on the cyanheads/mcp-ts-template, this server follows a modular architecture with robust error handling, logging, and security features.

🚀 Core Capabilities: Jina AI Tools 🛠️

This server equips your AI with a specialized tool to interact with web content:

| Tool Name | Description | Example |
| :--- | :--- | :--- |
| jinaai_read_webpage | Extracts and processes the main content from a given URL using Jina AI's ReaderLM engine. It returns a clean, markdown-formatted text representation of the content. | View Example |
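As a rough illustration, an MCP client invokes the tool with a standard `tools/call` request. The request envelope below follows the MCP specification; the exact argument names (`url`, `format`) are assumptions based on the features described later in this README, so consult the tool's published schema for the authoritative shape:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "jinaai_read_webpage",
    "arguments": {
      "url": "https://example.com/article",
      "format": "markdown"
    }
  }
}
```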


Table of Contents

| Overview | Features | Installation |
| :--- | :--- | :--- |
| Configuration | Project Structure | Development & Testing |
| License | | |

Overview

The JinaAI MCP Server acts as a bridge, allowing applications that understand the Model Context Protocol (MCP)—like advanced AI assistants, IDE extensions, or custom research tools—to interact directly and efficiently with web content.

Instead of dealing with raw HTML or complex scraping logic, your agents can leverage this server to:

  • Automate Information Gathering: Read articles, documentation, and other web content programmatically.
  • Gain Deeper Understanding: Access clean, LLM-ready text from any URL without leaving the host application.
  • Integrate Web Content into AI Workflows: Enable LLMs to perform research, summarize articles, and incorporate real-time web data into their responses.

Developer Note: This repository includes a .clinerules file that serves as a cheat sheet for your LLM coding agent, with quick references to codebase patterns, file locations, and code snippets.

Features

Core Utilities

Leverages the robust utilities provided by the mcp-ts-template:

  • Logging: Structured, configurable logging with sensitive data redaction.
  • Error Handling: Centralized error processing and standardized error types (McpError).
  • Configuration: Environment variable loading (dotenv) with validation using Zod.
  • Input Validation: Uses zod for all tool input schemas.
  • Request Context: End-to-end operation tracking via unique request IDs.
  • Type Safety: Enforced by TypeScript and Zod schemas.
  • HTTP Transport: High-performance HTTP server using Hono, featuring session management and authentication support.
  • Authentication: Robust authentication layer supporting JWT and OAuth 2.1.
  • Observability: Integrated OpenTelemetry for distributed tracing and metrics.
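To make the error-handling bullet concrete, here is a minimal sketch of what a standardized error type and centralized handler in the style of the template's McpError could look like. The field names and the toMcpError helper are illustrative assumptions, not the mcp-ts-template's actual API; see that repository for the real implementation.

```typescript
// Minimal sketch of a standardized error type, loosely modeled on the
// template's McpError. Field names here are illustrative assumptions.
class McpError extends Error {
  constructor(
    public readonly code: string, // machine-readable error category
    message: string,
    public readonly context?: Record<string, unknown>, // optional request metadata
  ) {
    super(message);
    this.name = "McpError";
  }
}

// Centralized handler: anything thrown inside a tool funnels through here,
// so callers always receive a single, predictable error shape.
function toMcpError(err: unknown): McpError {
  if (err instanceof McpError) return err;
  return new McpError(
    "INTERNAL_ERROR",
    err instanceof Error ? err.message : String(err),
  );
}
```

The design point is that tool code can throw freely, while one boundary converts every failure into the same structured type before it crosses the protocol layer.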

Jina AI Integration

  • Intelligent Content Extraction: Utilizes Jina's readerlm-v2 engine to parse main content and remove boilerplate.
  • Multiple Formats: Supports output in Markdown, HTML, or plain text.
  • Flexible Options: Control over including links, images, and using the cache.
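Under the hood, the Jina Reader API is reached by prefixing the target URL with https://r.jina.ai/ and passing options as HTTP headers. The sketch below assembles such a request; the header names (X-Return-Format, X-No-Cache) follow Jina's public documentation, but verify them against the current API reference before relying on them:

```typescript
// Sketch: assemble a Jina Reader API request. Header names are taken from
// Jina's public docs but should be verified against the current reference.
interface ReaderOptions {
  format?: "markdown" | "html" | "text";
  noCache?: boolean;
}

function buildReaderRequest(
  target: string,
  apiKey: string,
  opts: ReaderOptions = {},
): { url: string; headers: Record<string, string> } {
  const headers: Record<string, string> = {
    Authorization: `Bearer ${apiKey}`,
  };
  if (opts.format) headers["X-Return-Format"] = opts.format;
  if (opts.noCache) headers["X-No-Cache"] = "true";
  // The Reader endpoint is simply the target URL prefixed with r.jina.ai.
  return { url: `https://r.jina.ai/${target}`, headers };
}

// Usage (the result would then be passed to `fetch(url, { headers })`):
const { url, headers } = buildReaderRequest(
  "https://example.com/article",
  "YOUR_JINA_API_KEY_HERE",
  { format: "markdown", noCache: true },
);
```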

Installation

Prerequisites

MCP Client Settings

Add the following to your MCP client's configuration file (e.g., cline_mcp_settings.json). This configuration uses npx to run the server, which will automatically install the package if not already present. The JINA_API_KEY is required for the server to function.

{
  "mcpServers": {
    "jinaai-mcp-server": {
      "command": "npx",
      "args": ["@cyanheads/jinaai-mcp-server"],
      "env": {
        "MCP_TRANSPORT_TYPE": "http",
        "MCP_HTTP_PORT": "3018",
        "JINA_API_KEY": "YOUR_JINA_API_KEY_HERE"
      }
    }
  }
}

From Source

  1. Clone the repository:
    git clone https://github.com/cyanheads/jinaai-mcp-server.git
    cd jinaai-mcp-server
  2. Install dependencies:
    npm install
  3. Build the project:
    npm run build

Configuration

Environment Variables

Configure the server using environment variables. For local development, create a .env file at the project root.

| Variable | Description | Default |
| :--- | :--- | :--- |
| JINA_API_KEY | Required. Your API key for the Jina AI service. | (none) |
| MCP_TRANSPORT_TYPE | Transport mechanism: stdio or http. | stdio |
| MCP_HTTP_PORT | Port for the HTTP server (if MCP_TRANSPORT_TYPE=http). | 3018 |
| LOGS_DIR | Directory for log file storage. | logs/ |
| NODE_ENV | Runtime environment (development, production). | development |
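A loader for these variables might look like the following sketch, with defaults applied as in the table above. This is a hand-rolled illustration only; the actual server validates its configuration with Zod, as noted in the Features section.

```typescript
// Illustrative config loader matching the table above. The real server
// uses Zod for validation; this sketch applies the same defaults by hand.
interface ServerConfig {
  jinaApiKey: string;
  transportType: "stdio" | "http";
  httpPort: number;
  logsDir: string;
  nodeEnv: string;
}

function loadConfig(env: Record<string, string | undefined>): ServerConfig {
  const jinaApiKey = env.JINA_API_KEY;
  if (!jinaApiKey) {
    // JINA_API_KEY has no default: fail fast if it is missing.
    throw new Error("JINA_API_KEY is required");
  }
  return {
    jinaApiKey,
    transportType: env.MCP_TRANSPORT_TYPE === "http" ? "http" : "stdio",
    httpPort: Number(env.MCP_HTTP_PORT ?? 3018),
    logsDir: env.LOGS_DIR ?? "logs/",
    nodeEnv: env.NODE_ENV ?? "development",
  };
}
```

For example, `loadConfig(process.env)` with only JINA_API_KEY set yields stdio transport on the default port 3018.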

Project Structure

The codebase follows a modular structure within the src/ directory:

src/
├── index.ts              # Entry point: Initializes and starts the server
├── config/               # Configuration loading (env vars)
│   └── index.ts
├── mcp-server/           # Core MCP server logic and capability registration
│   ├── server.ts         # Server setup, tool registration
│   └── tools/            # MCP Tool implementations
│       └── jinaReader/   # The Jina AI Reader tool
└── utils/                # Common utility functions (logger, error handler, etc.)

For a detailed file tree, run npm run tree or see docs/tree.md.

Development & Testing

Development Scripts

# Build the project (compile TS to JS in dist/)
npm run build

# Clean build artifacts and then rebuild the project
npm run rebuild

# Format code with Prettier
npm run format

Testing

This project uses Vitest for unit and integration testing.

# Run all tests once
npm test

# Run tests in watch mode
npm run test:watch

# Run tests and generate a coverage report
npm run test:coverage

Running the Server

# Start the server using stdio (default)
npm run start:server

# Start the server using HTTP transport
npm run start:server:http

License

This project is licensed under the Apache License 2.0 - see the LICENSE file for details.