
@waylaidwanderer/background-process-mcp v1.2.8

Background Process MCP

A Model Context Protocol (MCP) server that provides background process management capabilities. This server enables LLMs to start, stop, and monitor long-running command-line processes.

Motivation

Some AI agents, like Claude Code, can manage background processes natively, but many others can't. This project provides that capability as a standard tool for other agents like Google's Gemini CLI. It works as a separate service, making long-running task management available to a wider range of agents. I also added a TUI because I wanted to be able to monitor the processes myself.


Getting Started

To get started, install the Background Process MCP server in your preferred client.

Standard Config

This configuration works for most MCP clients:

{
  "mcpServers": {
    "backgroundProcess": {
      "command": "npx",
      "args": [
        "@waylaidwanderer/background-process-mcp@latest"
      ]
    }
  }
}

To connect to a standalone server, add the --port argument to the args array, as in the example below.
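
For example, assuming a standalone Core Service is already listening on the default port 31337 (see Manual Usage below), the standard config becomes:

{
  "mcpServers": {
    "backgroundProcess": {
      "command": "npx",
      "args": [
        "@waylaidwanderer/background-process-mcp@latest",
        "--port",
        "31337"
      ]
    }
  }
}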

Use the Claude Code CLI to add the Background Process MCP server:

claude mcp add backgroundProcess npx @waylaidwanderer/background-process-mcp@latest

Follow the MCP install guide and use the standard config above.

Create or edit the configuration file ~/.codex/config.toml and add:

[mcp_servers.backgroundProcess]
command = "npx"
args = ["@waylaidwanderer/background-process-mcp@latest"]

For more information, see the Codex MCP documentation.

To install manually:

Go to Cursor Settings -> MCP -> Add new MCP Server. Name it backgroundProcess, select the command type, and set the command to npx @waylaidwanderer/background-process-mcp@latest.

Follow the MCP install guide and use the standard config above.

To install manually in Goose:

Go to Advanced settings -> Extensions -> Add custom extension. Name it backgroundProcess, use type STDIO, and set the command to npx @waylaidwanderer/background-process-mcp@latest. Click "Add Extension".

To install manually in LM Studio:

Go to Program in the right sidebar -> Install -> Edit mcp.json. Use the standard config above.

Follow the MCP Servers documentation. For example, in ~/.config/opencode/opencode.json:

{
  "$schema": "https://opencode.ai/config.json",
  "mcp": {
    "backgroundProcess": {
      "type": "local",
      "command": [
        "npx",
        "@waylaidwanderer/background-process-mcp@latest"
      ],
      "enabled": true
    }
  }
}

Open the Qodo Gen chat panel in VS Code or IntelliJ -> Connect more tools -> + Add new MCP -> Paste the standard config above.

Click Save.

To install manually:

Follow the MCP install guide and use the standard config above. You can also install the server using the VS Code CLI:

# For VS Code
code --add-mcp '{"name":"backgroundProcess","command":"npx","args":["@waylaidwanderer/background-process-mcp@latest"]}'

Follow the Windsurf MCP documentation and use the standard config above.

Tools

The following tools are exposed by the MCP server; a short usage sketch follows the list.

  • start_process

    • Description: Starts a new process in the background.
    • Parameters:
      • command (string): The shell command to execute.
    • Returns: A confirmation message with the new process ID.
  • stop_process

    • Description: Stops a running process.
    • Parameters:
      • processId (string): The UUID of the process to stop.
    • Returns: A confirmation message.
  • clear_process

    • Description: Clears a stopped process from the list.
    • Parameters:
      • processId (string): The UUID of the process to clear.
    • Returns: A confirmation message.
  • get_process_output

    • Description: Gets the recent output for a process. Can specify head for the first N lines or tail for the last N lines.
    • Parameters:
      • processId (string): The UUID of the process to get output from.
      • head (number, optional): The number of lines to get from the beginning of the output.
      • tail (number, optional): The number of lines to get from the end of the output.
    • Returns: The requested process output as a single string.
  • list_processes

    • Description: Gets a list of all processes being managed by the Core Service.
    • Parameters: None
    • Returns: A JSON string representing an array of all process states.
  • get_server_status

    • Description: Gets the current status of the Core Service.
    • Parameters: None
    • Returns: A JSON string containing server status information (version, port, PID, uptime, process counts).
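
As a rough illustration of how these tools are invoked, here is a minimal sketch of a client calling them through the official TypeScript MCP SDK (@modelcontextprotocol/sdk). The client name, the npm run dev command, and the placeholder process ID are illustrative only, and in normal use your MCP host performs this wiring for you.

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Spawn the MCP server over stdio, the same way the standard config does.
const transport = new StdioClientTransport({
  command: "npx",
  args: ["@waylaidwanderer/background-process-mcp@latest"],
});
const client = new Client({ name: "example-client", version: "0.0.1" });
await client.connect(transport);

// Start a long-running command in the background; the returned text includes the process ID.
const started = await client.callTool({
  name: "start_process",
  arguments: { command: "npm run dev" },
});
console.log(started.content);

// Tail the last 50 lines of that process's output (substitute the real UUID).
const output = await client.callTool({
  name: "get_process_output",
  arguments: { processId: "<uuid-from-start_process>", tail: 50 },
});
console.log(output.content);

// List every process the Core Service is managing.
console.log(await client.callTool({ name: "list_processes", arguments: {} }));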

Architecture

The project has three components:

  1. Core Service (src/server.ts): A standalone WebSocket server that uses node-pty to manage child process lifecycles; a node-pty sketch follows this list. It is the single source of truth for all process states and is kept standalone so that other clients beyond the official TUI and MCP can be built for it.

  2. MCP Client (src/mcp.ts): Exposes the Core Service functionality as a set of tools for an LLM agent. It can connect to an existing service or spawn a new one.

  3. TUI Client (src/tui.ts): An ink-based terminal UI that connects to the Core Service to display process information and accept user commands.
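
The node-pty pattern the Core Service builds on looks roughly like this. This is a minimal sketch, not the real internals of src/server.ts: the ManagedProcess shape, the bash shell choice, and the output buffering are assumptions, and the actual service also handles the WebSocket protocol and per-platform shells.

import { randomUUID } from "node:crypto";
import * as pty from "node-pty";

// Illustrative process record; the real state shape in src/server.ts will differ.
interface ManagedProcess {
  id: string;
  command: string;
  output: string[];
  status: "running" | "stopped";
  handle: pty.IPty;
}

const processes = new Map<string, ManagedProcess>();

function startProcess(command: string): string {
  const id = randomUUID();
  // Run the command inside a pseudo-terminal so interactive CLIs behave normally.
  const handle = pty.spawn("bash", ["-c", command], { name: "xterm-color", cols: 120, rows: 30 });
  const entry: ManagedProcess = { id, command, output: [], status: "running", handle };
  handle.onData((chunk) => entry.output.push(chunk));   // buffer output for get_process_output
  handle.onExit(() => { entry.status = "stopped"; });   // keep the entry around until clear_process
  processes.set(id, entry);
  return id;
}

function stopProcess(id: string): void {
  processes.get(id)?.handle.kill();
}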

Manual Usage

If you wish to run the server and TUI manually outside of an MCP client, you can use the following commands.

For a shorter command, you can install the package globally:

pnpm add -g @waylaidwanderer/background-process-mcp

This will give you access to the bgpm command.

1. Run the Core Service

Start the background service manually:

# With npx
npx @waylaidwanderer/background-process-mcp server

# Or, if installed globally
bgpm server

The server will listen on an available port (defaulting to 31337) and output a JSON handshake with the connection details.

2. Use the TUI

Connect the TUI to a running server via its port:

# With npx
npx @waylaidwanderer/background-process-mcp ui --port <port_number>

# Or, if installed globally
bgpm ui --port <port_number>