
@arcticzeroo/muse

v0.0.14


muse

Muse is an MCP server that provides markdown-based memory for any AI agent supporting MCP sampling*.

The idea is that the agent builds up a memory over time, which essentially becomes documentation for your codebase. Since the memory consists of plain markdown files, you can (and should) check them into your version control system to share with your team. When memory files are created, deleted, or updated outside the MCP server (that is, by humans), muse will automatically ingest those changes into its memory. It is a good idea to review all documentation generated by Muse and make edits before checking it in.

*Note: MCP sampling is currently (as of writing) only supported in VSCode, and occasionally has bugs (e.g. you may have to restart VSCode every so often if the server starts getting timeouts from it). This project relies heavily on MCP sampling, so you should probably only use models that don't count towards your request budget. This project does not currently provide a model hint.

Sampling also tends to be somewhat slow in VSCode right now, but that will probably improve over time. IMO it is still worth waiting the ~30s for querying or ingestion since the memory is so useful.

Installation

Add an MCP server with the command npx -y @arcticzeroo/muse <memory directory>. Optionally, you can also include a context file with npx -y @arcticzeroo/muse <memory directory> --context <context file>.

In mcp.json, this would look like:

{
  "servers": {
    "muse": {
      "type": "stdio",
      "command": "npx",
      "args": [
        "-y",
        "@arcticzeroo/muse",
        "<memory directory>"
      ]
    }
  }
}

Adding a context file can help a lot if your codebase has a lot of invented terminology.
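For example, assuming the same mcp.json layout as above, passing the --context flag from the install command would look like this (paths are placeholders you fill in yourself):

```json
{
  "servers": {
    "muse": {
      "type": "stdio",
      "command": "npx",
      "args": [
        "-y",
        "@arcticzeroo/muse",
        "<memory directory>",
        "--context",
        "<context file>"
      ]
    }
  }
}
```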

Further, it is probably a good idea to add a line to your copilot-instructions.md (or equivalent) that says something like:

IMPORTANT: Always query memory in muse before searching/writing code. Once you're done with your task, make sure to ingest into muse's memory with any new information you learned.

Alternative Installation: use as a library

Muse can also be used as a library. See how it is used in server/tools for an example.

import { MemorySession } from '@arcticzeroo/muse/session';
import { MCP_SERVER } from './mcp-server.js';
import z from 'zod';

// createAsync returns a promise, so await it before using the session
const session = await MemorySession.createAsync({
    mcpServer: MCP_SERVER, // should be your MCP server instance from the MCP SDK
    outputDirectory: '<memory directory>',
    contextFile: '<context file>', // optional
});

MCP_SERVER.registerTool(
    'query',
    {
        // the handler destructures { query }, so the schema needs a matching shape
        inputSchema: { query: z.string() }
    },
    async ({ query }) => {
        const result = await session.queryMemory(query);
        // MCP tool results wrap their payload in a `content` array
        return {
            content: [{
                type: 'text',
                text: result
            }]
        };
    }
);

// `transport` is your MCP transport instance from the MCP SDK
await MCP_SERVER.connect(transport);

await session.initializeAfterMcpServerStarted();

Usage

Once you have the MCP server running, you can use it in your AI agent however you like. You can do things like:

  • Ask the AI to investigate some feature, architecture, or language pattern and ingest its findings into memory.
  • Ask the AI to go through your existing copilot-instructions file, investigate the related code, and ingest its findings into memory.
  • Ask the AI to query memory for some topic.

Docs

The docs in this repo's /docs directory are generated by muse and lightly edited.