@pinecone-database/mcp

v0.1.17

Pinecone Developer MCP Server

The Model Context Protocol (MCP) is a standard that allows coding assistants and other AI tools to interact with platforms like Pinecone. The Pinecone Developer MCP Server allows you to connect these tools with Pinecone projects and documentation.

Once connected, AI tools can:

  • Search Pinecone documentation to answer questions accurately.
  • Help you configure indexes based on your application's needs.
  • Generate code informed by your index configuration and data, as well as Pinecone documentation and examples.
  • Upsert and search for data in indexes, allowing you to test queries and evaluate results within your dev environment.

See the docs for more detailed information.

This MCP server is focused on improving the experience of developers working with Pinecone as part of their technology stack. It is intended for use with coding assistants. Pinecone also offers the Assistant MCP, which is designed to provide AI assistants with relevant context sourced from your knowledge base.

Setup

To configure the MCP server to access your Pinecone project, you will need to generate an API key using the console. Without an API key, your AI tool will still be able to search documentation. However, it will not be able to manage or query your indexes.
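If you wire this up from a script, a small Node-side check can mirror that behavior. This is purely illustrative and not part of the MCP server itself:

// Illustrative sketch: warn when the key is missing. Documentation search
// still works without it, but index management and queries will not.
if (!process.env.PINECONE_API_KEY) {
  console.warn(
    "PINECONE_API_KEY is not set: the Pinecone MCP server can still search " +
    "the docs, but it cannot manage or query your indexes."
  );
}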

The MCP server requires Node.js. Ensure that node and npx are available in your PATH.

Next, you will need to configure your AI assistant to use the MCP server.

Configure Cursor

To add the Pinecone MCP server to a project, create a .cursor/mcp.json file in the project root (if it doesn't already exist) and add the following configuration:

{
  "mcpServers": {
    "pinecone": {
      "command": "npx",
      "args": [
        "-y", "@pinecone-database/mcp"
      ],
      "env": {
        "PINECONE_API_KEY": "<your pinecone api key>"
      }
    }
  }
}

You can check the status of the server in Cursor Settings > MCP.

To enable the server globally, add the configuration to the .cursor/mcp.json file in your home directory instead.

It is recommended to use rules to instruct Cursor on proper usage of the MCP server. Check out the docs for some suggestions.

Configure Claude desktop

Use Claude desktop to locate the claude_desktop_config.json file by navigating to Settings > Developer > Edit Config. Add the following configuration:

{
  "mcpServers": {
    "pinecone": {
      "command": "npx",
      "args": [
        "-y", "@pinecone-database/mcp"
      ],
      "env": {
        "PINECONE_API_KEY": "<your pinecone api key>"
      }
    }
  }
}

Restart Claude desktop. On the new chat screen, you should see a hammer (MCP) icon appear with the new MCP tools available.

Use as a Gemini CLI extension

To install this as a Gemini CLI extension, run the following command:

gemini extensions install https://github.com/pinecone-io/pinecone-mcp

You will need to provide your Pinecone API key in the PINECONE_API_KEY environment variable.

export PINECONE_API_KEY=<your pinecone api key>

When you run gemini and press ctrl+t, pinecone should now be shown in the list of installed MCP servers.

Usage

Once configured, your AI tool will automatically make use of the MCP to interact with Pinecone. You may be prompted for permission before a tool can be used. Try asking your AI assistant to set up an example index, upload sample data, or search for you!

Tools

The Pinecone Developer MCP Server provides the following tools for AI assistants to use (a brief client-side sketch of invoking one of them directly appears after the list):

  • search-docs: Search the official Pinecone documentation.
  • list-indexes: Lists all Pinecone indexes.
  • describe-index: Describes the configuration of an index.
  • describe-index-stats: Provides statistics about the data in the index, including the number of records and available namespaces.
  • create-index-for-model: Creates a new index that uses an integrated inference model to embed text as vectors.
  • upsert-records: Inserts or updates records in an index with integrated inference.
  • search-records: Searches for records in an index based on a text query, using integrated inference for embedding. Has options for metadata filtering and reranking.
  • cascading-search: Searches for records across multiple indexes, deduplicating and reranking the results.
  • rerank-documents: Reranks a collection of records or text documents using a specialized reranking model.
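
These tools are normally invoked by your AI assistant, but they can also be exercised directly with the MCP TypeScript SDK, which is a convenient way to confirm that your API key and configuration work. The sketch below is illustrative rather than definitive: it assumes @modelcontextprotocol/sdk is installed and PINECONE_API_KEY is set in your environment, and the search-docs argument name (query) is an assumption, so verify it against the input schema returned by listTools().

// Minimal sketch: launch the Pinecone MCP server over stdio and call a tool.
// Assumes Node.js 18+, `npm install @modelcontextprotocol/sdk`, and
// PINECONE_API_KEY set in the environment.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function main() {
  // Start the server the same way the editor configs above do: via npx.
  const transport = new StdioClientTransport({
    command: "npx",
    args: ["-y", "@pinecone-database/mcp"],
    env: {
      ...(process.env as Record<string, string>),
      PINECONE_API_KEY: process.env.PINECONE_API_KEY ?? "",
    },
  });

  const client = new Client(
    { name: "pinecone-mcp-smoke-test", version: "0.0.1" },
    { capabilities: {} }
  );
  await client.connect(transport);

  // Confirm the tools listed above are exposed by the server.
  const { tools } = await client.listTools();
  console.log(tools.map((tool) => tool.name));

  // Call search-docs. The "query" argument name is an assumption here;
  // check the tool's inputSchema from listTools() before relying on it.
  const result = await client.callTool({
    name: "search-docs",
    arguments: { query: "How do I create an index with integrated inference?" },
  });
  console.log(JSON.stringify(result, null, 2));

  await client.close();
}

main().catch((err) => {
  console.error(err);
  process.exit(1);
});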

Limitations

Only indexes with integrated inference are supported. Assistants, indexes without integrated inference, standalone embeddings, and vector search are not supported.

Contributing

We welcome your collaboration in improving the developer MCP experience. Please submit issues in the GitHub issue tracker. Information about contributing can be found in CONTRIBUTING.md.