
@nish_ntr/chat-memory-mcp

v1.0.2


MCP server to search and retrieve VS Code Copilot Chat history with AI-powered summarization. Access your cross-workspace conversation history from any AI agent.


Chat Memory MCP - Search & Summarize Your VS Code Chat History


An MCP (Model Context Protocol) server that enables AI agents in VS Code to search and retrieve content from past Copilot Chat sessions across all your workspaces. Enhance your AI coding agent (like Cline, Roo Code, or Claude) with persistent memory.


Features

  • ✅ Continuous Memory: Search through every conversation you've ever had with Copilot.
  • ✅ Instant Summaries: Get concise 3-5 sentence summaries of complex past discussions without polluting your context window.
  • ✅ Cross-Workspace Access: Access solutions and snippets from any project, instantly.
  • ✅ Local & Secure: Your data stays on your machine and is only summarized by your chosen LLM.

Why use Chat Memory?

❌ Without Chat Memory

  • ❌ Lost Context: You lose valuable insights when switching between projects/workspaces.
  • ❌ Repetitive Work: You waste time re-asking Copilot the same questions you resolved yesterday.
  • ❌ Isolated Knowledge: Finding a specific code snippet or architectural decision from a past chat requires manually opening old workspaces.

✅ With Chat Memory

Chat Memory MCP connects your AI agent to your entire history of conversations. It retrieves relevant context from your past sessions and provides AI-generated summaries, placing them directly into your current prompt.

Tools Provided

1. search_vs_code_chats

Search through past VS Code Copilot Chat sessions.

Input:

  • query (string): Search query to find in chat titles and messages

Output:

  • Array of matching sessions with metadata (sessionId, title, workspaceName, lastModified)
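To make the search contract concrete, here is a minimal TypeScript sketch of the result shape and matching behavior. The interface fields mirror the metadata listed above, but the `messages` field, the `matchSessions` function, and the sample data are illustrative assumptions, not the server's actual code.

```typescript
// Illustrative shape of a search result; fields mirror the documented metadata.
interface SessionMeta {
  sessionId: string;      // absolute path to the session JSON file
  title: string;
  workspaceName: string;
  lastModified: string;   // ISO timestamp
  messages: string[];     // flattened message text (assumed), searched alongside the title
}

// Case-insensitive match against chat titles and message bodies.
function matchSessions(sessions: SessionMeta[], query: string): SessionMeta[] {
  const q = query.toLowerCase();
  return sessions.filter(
    (s) =>
      s.title.toLowerCase().includes(q) ||
      s.messages.some((m) => m.toLowerCase().includes(q))
  );
}

// Example: find past sessions that mention JWT.
const sessions: SessionMeta[] = [
  {
    sessionId: "/tmp/a.json",
    title: "Implementing JWT auth",
    workspaceName: "api-server",
    lastModified: "2025-01-10T12:00:00Z",
    messages: ["How do I sign a JWT?"],
  },
  {
    sessionId: "/tmp/b.json",
    title: "CSS grid layout",
    workspaceName: "frontend",
    lastModified: "2025-01-11T09:30:00Z",
    messages: ["Center a div with grid."],
  },
];

console.log(matchSessions(sessions, "jwt").map((s) => s.title));
```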

2. get_vs_code_chat_content

Retrieve an AI-generated summary of a specific chat session.

Input:

  • sessionId (string): Absolute file path to the chat session JSON file (obtained from search results)

Output:

  • AI-generated concise summary of the conversation (3-5 sentences)
  • Chat title and message count
  • Note: The full transcript is sent to the LLM for summarization but is NOT included in the response, keeping the prompt chain clean
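A small sketch of what a formatted response might look like, given the documented output fields. The `ChatSummary` interface and `formatSummary` helper are hypothetical, shown only to illustrate that the transcript itself is absent from what the agent receives.

```typescript
// Hypothetical response shape: summary text plus title and message count,
// with no transcript included.
interface ChatSummary {
  title: string;
  messageCount: number;
  summary: string; // the 3-5 sentence summary produced by the LLM
}

function formatSummary(s: ChatSummary): string {
  return `${s.title} (${s.messageCount} messages)\n\n${s.summary}`;
}

const out = formatSummary({
  title: "Implementing JWT auth",
  messageCount: 12,
  summary: "The session covered signing and verifying JWTs with a shared secret.",
});
console.log(out);
```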

Installation

Standard Installation


For other clients (Claude Code, Roo Code, Cline, etc.), add the following to your MCP configuration:

{
  "mcpServers": {
    "chat-memory": {
      "command": "npx",
      "args": ["-y", "@nish_ntr/chat-memory-mcp@latest"]
    }
  }
}
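For VS Code itself, MCP servers are typically declared in a workspace `.vscode/mcp.json` file, which uses a `servers` key rather than `mcpServers`. A minimal sketch, assuming VS Code's standard MCP config format:

```json
{
  "servers": {
    "chat-memory": {
      "command": "npx",
      "args": ["-y", "@nish_ntr/chat-memory-mcp@latest"]
    }
  }
}
```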

Important Tips

Accessing History

Simply ask your agent:

"Search my past chats for how I implemented JWT authentication in the other project."

Clean Context

When you ask for content, the server uses MCP Sampling to summarize the conversation first. This means the agent gets a concise summary rather than the full transcript, keeping your token usage low and your focus sharp.
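The summarization step can be pictured as building a sampling prompt from the stored transcript. The function below is an illustrative sketch only; the prompt wording, function name, and the character cap are assumptions, not the server's actual implementation.

```typescript
// Guard against oversized transcripts before handing them to the LLM
// (the cap value is an illustrative assumption).
const MAX_CHARS = 12_000;

function buildSummaryPrompt(transcript: string[]): string {
  const joined = transcript.join("\n").slice(0, MAX_CHARS);
  return (
    "Summarize the following VS Code Copilot Chat session in 3-5 sentences, " +
    "focusing on decisions made and code produced:\n\n" + joined
  );
}

console.log(
  buildSummaryPrompt([
    "User: how do I rotate JWT signing keys?",
    "Assistant: keep both keys active during the overlap window...",
  ])
);
```

The key design point this illustrates: only the short summary that the LLM returns is passed back to the agent, so the transcript never inflates the agent's context window.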

Data Privacy

The server reads your VS Code workspace storage directly from the local filesystem.

  • Local Data: Your data never leaves your machine except when sent to your chosen LLM for summarization.
  • Selective Context: Only the concise, human-readable summary is returned to the AI agent; full transcripts are never placed in the prompt history.

The server automatically detects your OS and searches in:

  • macOS: ~/Library/Application Support/Code/User/workspaceStorage/
  • Windows: %APPDATA%\Code\User\workspaceStorage\
  • Linux: ~/.config/Code/User/workspaceStorage/
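The per-OS lookup above could be sketched in Node as follows. This is illustrative, not the package's actual implementation, and the Windows fallback when `APPDATA` is unset is an assumption.

```typescript
import * as os from "os";
import * as path from "path";

// Resolve the VS Code workspaceStorage directory for the current OS,
// mirroring the paths listed above.
function workspaceStorageDir(): string {
  const home = os.homedir();
  switch (process.platform) {
    case "darwin": // macOS
      return path.join(home, "Library", "Application Support", "Code", "User", "workspaceStorage");
    case "win32": // Windows: %APPDATA%, with a conventional fallback (assumption)
      return path.join(
        process.env.APPDATA ?? path.join(home, "AppData", "Roaming"),
        "Code", "User", "workspaceStorage"
      );
    default: // Linux and others
      return path.join(home, ".config", "Code", "User", "workspaceStorage");
  }
}

console.log(workspaceStorageDir());
```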

Contributing

Contributions are welcome! If you have suggestions for improvements or bug fixes, please open an issue or a pull request.

  1. Fork the repository
  2. Create your feature branch (git checkout -b feature/AmazingFeature)
  3. Commit your changes (git commit -m 'Add some AmazingFeature')
  4. Push to the branch (git push origin feature/AmazingFeature)
  5. Open a Pull Request

License

MIT