
clarissa v1.4.1

An AI-powered terminal assistant with tool execution capabilities


Clarissa is a command-line AI agent built with Bun and Ink. It supports multiple LLM providers including cloud services like OpenRouter, OpenAI, and Anthropic, as well as local inference via Apple Intelligence, LM Studio, and local GGUF models. The agent can execute tools, manage files, run shell commands, and integrate with external services via the Model Context Protocol (MCP).

Features

  • Multi-provider support - Use cloud providers (OpenRouter, OpenAI, Anthropic) or run completely offline with local models
  • Apple Intelligence - On-device AI using Apple Foundation Models with full tool calling support (macOS 26+)
  • Local model inference - Run GGUF models locally via LM Studio or node-llama-cpp with GPU acceleration
  • Streaming responses - Real-time token streaming for responsive conversations
  • Built-in tools - File operations, Git integration, shell commands, web fetching, and more
  • MCP integration - Connect to external MCP servers to extend functionality
  • Session management - Auto-save on exit, quick resume with /last, and named sessions
  • Memory persistence - Remember facts across sessions with /remember and /memories
  • Context management - Automatic token tracking and context truncation
  • Tool confirmation - Approve or reject potentially dangerous operations
  • One-shot mode - Run single commands directly from your shell with query history
  • Auto-updates - Get notified of new versions and upgrade easily with clarissa upgrade

How It Works

Clarissa implements the ReAct (Reasoning + Acting) agent pattern, where an LLM reasons about tasks and takes actions through tool execution in an iterative loop.

Architecture Overview

flowchart LR
    subgraph Input
        A[User Message]
        B[Piped Content]
    end

    subgraph Clarissa
        C[Agent Loop]
        D[LLM Client]
        E[Tool Registry]
        F[Context Manager]
    end

    subgraph Providers
        G[Cloud: OpenRouter / OpenAI / Anthropic]
        H[Local: Apple AI / LM Studio / GGUF]
    end

    subgraph External
        I[MCP Servers]
    end

    A --> C
    B --> C
    C <--> D
    C <--> E
    C <--> F
    D <--> G
    D <--> H
    E <-.-> I

The system connects your terminal to various LLM providers. When you ask Clarissa to perform a task, it:

  1. Sends your message to the LLM along with available tool definitions
  2. Receives a response that may include tool calls (e.g., read a file, run a command)
  3. Executes the requested tools and feeds results back to the LLM
  4. Repeats until the LLM provides a final answer

The ReAct Loop

flowchart TD
    A[User Input] --> B[Add to Conversation]
    B --> C[Send to LLM]
    C --> D{Response Type?}
    D -->|Tool Calls| E[Execute Tools]
    E --> F[Add Results to History]
    F --> C
    D -->|Final Answer| G[Display Response]

This loop continues until the LLM responds without requesting any tools, indicating it has completed the task. A maximum iteration limit prevents infinite loops.
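The loop above can be sketched in TypeScript roughly as follows. This is an illustrative sketch, not Clarissa's actual internals; the `LLMResponse` shape, `runAgent` signature, and tool map are all hypothetical.

```typescript
// Sketch of a ReAct-style agent loop (illustrative, not Clarissa's real code).
// The LLM either requests tool calls or returns a final answer; a
// maxIterations cap prevents infinite loops.

type ToolCall = { name: string; args: Record<string, unknown> };
type LLMResponse =
  | { kind: "tool_calls"; calls: ToolCall[] }
  | { kind: "final"; text: string };
type Message = { role: "user" | "assistant" | "tool"; content: string };

async function runAgent(
  userInput: string,
  llm: (history: Message[]) => Promise<LLMResponse>,
  tools: Record<string, (args: Record<string, unknown>) => Promise<string>>,
  maxIterations = 10,
): Promise<string> {
  const history: Message[] = [{ role: "user", content: userInput }];

  for (let i = 0; i < maxIterations; i++) {
    const response = await llm(history);
    if (response.kind === "final") return response.text; // task complete

    // Execute each requested tool and feed the result back into the history
    for (const call of response.calls) {
      const result = await tools[call.name](call.args);
      history.push({ role: "tool", content: result });
    }
  }
  throw new Error("Max iterations reached without a final answer");
}
```

The key design point is that tool results re-enter the conversation history, so the next LLM call can reason over them before deciding whether to act again or answer.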

Key Concepts

| Concept | Description |
|---------|-------------|
| Tool Confirmation | Potentially dangerous tools (file writes, shell commands) require approval before execution. Use /yolo to auto-approve. |
| Context Management | Clarissa tracks token usage and automatically truncates older messages when approaching the model's context limit. |
| Session Persistence | Conversations can be saved to ~/.clarissa/sessions/ and restored later with /save and /load. |
| Memory System | Use /remember to store facts that persist across sessions and are included in every conversation. |
| MCP Extensibility | Connect to Model Context Protocol servers to add custom tools without modifying Clarissa's code. |
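Context truncation can be sketched like this (illustrative only; Clarissa's real token accounting will differ, and the 4-characters-per-token estimate is just a common rule of thumb):

```typescript
// Sketch of context truncation: walk from newest to oldest, keeping
// messages while their estimated token cost fits the limit. The most
// recent message is always kept.

type Msg = { role: string; content: string };

// Crude token estimate: ~4 characters per token (rule of thumb)
const estimateTokens = (m: Msg) => Math.ceil(m.content.length / 4);

function truncateToFit(history: Msg[], maxTokens: number): Msg[] {
  const kept: Msg[] = [];
  let total = 0;
  for (let i = history.length - 1; i >= 0; i--) {
    const cost = estimateTokens(history[i]);
    // Stop once adding an older message would exceed the budget,
    // but never drop the newest message
    if (kept.length > 0 && total + cost > maxTokens) break;
    kept.unshift(history[i]);
    total += cost;
  }
  return kept;
}
```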

For detailed architecture documentation, see the Architecture Guide.

Requirements

  • Bun v1.0 or later (for running from source or npm install)
  • For cloud providers: API key for OpenRouter, OpenAI, or Anthropic
  • For Apple Intelligence: macOS 26+ with Apple Silicon and Apple Intelligence enabled
  • For local models: LM Studio or download GGUF models with clarissa download

Installation

From npm (recommended)

# Using bun
bun install -g clarissa

# Using npm
npm install -g clarissa

From source

git clone https://github.com/cameronrye/clarissa.git
cd clarissa
bun install
bun link

Standalone binary

Download a pre-built binary from the releases page and add it to your PATH:

# Example for macOS ARM
chmod +x clarissa-macos-arm64
mv clarissa-macos-arm64 /usr/local/bin/clarissa

Configuration

Create a config file at ~/.clarissa/config.json or run clarissa init for interactive setup.

API Keys (Cloud Providers)

Set one or more API keys for cloud providers:

# Environment variables
export OPENROUTER_API_KEY=your_key_here
export OPENAI_API_KEY=your_key_here
export ANTHROPIC_API_KEY=your_key_here

Or in ~/.clarissa/config.json:

{
  "apiKey": "your_openrouter_key",
  "openaiApiKey": "your_openai_key",
  "anthropicApiKey": "your_anthropic_key"
}

Local Providers (No API Key Required)

  • Apple Intelligence: Automatically detected on macOS 26+ with Apple Intelligence enabled
  • LM Studio: Start LM Studio and load a model - Clarissa auto-detects the local server
  • Local GGUF: Download models with clarissa download and run offline

Configuration Options

| Config Key | Env Variable | Default | Description |
|------------|--------------|---------|-------------|
| apiKey | OPENROUTER_API_KEY | - | OpenRouter API key |
| openaiApiKey | OPENAI_API_KEY | - | OpenAI API key |
| anthropicApiKey | ANTHROPIC_API_KEY | - | Anthropic API key |
| model | - | (auto) | Preferred model |
| preferredProvider | - | (auto) | Preferred provider ID |
| maxIterations | MAX_ITERATIONS | 10 | Maximum tool execution iterations |
| debug | DEBUG | false | Enable debug logging |
| mcpServers | - | {} | MCP servers to auto-load (see below) |

MCP Server Configuration

Add MCP servers to your config file to auto-load them on startup:

{
  "apiKey": "your_api_key_here",
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/dir"]
    },
    "github": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-github"],
      "env": { "GITHUB_TOKEN": "your_token" }
    }
  }
}

Use /mcp to view connected servers and /tools to see available tools.

Usage

Interactive Mode

Start Clarissa in interactive mode:

clarissa

One-Shot Mode

Run a single command and exit:

clarissa "What files are in this directory?"

Piped Input

Pipe content from other commands:

cat error.log | clarissa "Explain this error"
git diff | clarissa "Write a commit message for these changes"

CLI Options

| Option | Description |
|--------|-------------|
| -m, --model MODEL | Use a specific model |
| --list-models | List available models |
| --debug | Enable debug output |
| -h, --help | Show help |
| -v, --version | Show version |

Commands

| Command | Description |
|---------|-------------|
| /help | Show available commands |
| /clear | Clear conversation history |
| /save [NAME] | Save current session |
| /sessions | List saved sessions |
| /load ID | Load a saved session |
| /last | Resume last session |
| /delete ID | Delete a saved session |
| /remember <fact> | Save a memory |
| /memories | List saved memories |
| /forget <#\|ID> | Forget a memory |
| /model [NAME] | Show or switch the current model |
| /provider [ID] | Show or switch the LLM provider |
| /mcp | Show connected MCP servers |
| /tools | List available tools |
| /context | Show context window usage and breakdown |
| /yolo | Toggle auto-approve mode (skip tool confirmations) |
| /exit | Exit Clarissa |

Keyboard Shortcuts

| Shortcut | Action |
|----------|--------|
| Ctrl+C | Cancel current operation / Exit |
| Ctrl+P | Enhance prompt with AI |
| Up/Down | Navigate input history |

Built-in Tools

File Operations

  • read_file - Read file contents
  • write_file - Write or create files
  • patch_file - Apply patches to files
  • list_directory - List directory contents
  • search_files - Search for files by pattern

Git Integration

  • git_status - Show repository status
  • git_diff - Show changes
  • git_log - View commit history
  • git_add - Stage files
  • git_commit - Commit changes
  • git_branch - Manage branches

System

  • bash - Execute shell commands
  • calculator - Perform calculations

Web

  • web_fetch - Fetch and parse web pages
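Built-in tools like these are typically described to the LLM with a name, a description, and a JSON-Schema parameter spec, plus an execute function. A hypothetical shape (Clarissa's actual interface may differ; the toy calculator below only evaluates a single "a op b" expression):

```typescript
// Hypothetical tool-definition shape (illustrative, not Clarissa's real
// interface). The parameters schema tells the LLM what arguments the tool
// accepts; execute runs it and returns a string result for the history.

interface ToolDefinition {
  name: string;
  description: string;
  parameters: {
    type: "object";
    properties: Record<string, { type: string; description: string }>;
    required: string[];
  };
  execute: (args: Record<string, unknown>) => Promise<string>;
}

const calculator: ToolDefinition = {
  name: "calculator",
  description: "Evaluate a basic arithmetic expression",
  parameters: {
    type: "object",
    properties: {
      expression: { type: "string", description: 'e.g. "2 + 3"' },
    },
    required: ["expression"],
  },
  // Toy evaluator: handles only a single binary "a op b" expression
  execute: async (args) => {
    const m = /^\s*(-?\d+(?:\.\d+)?)\s*([+\-*/])\s*(-?\d+(?:\.\d+)?)\s*$/.exec(
      String(args.expression),
    );
    if (!m) throw new Error("unsupported expression");
    const [, a, op, b] = m;
    const x = Number(a), y = Number(b);
    const result =
      op === "+" ? x + y : op === "-" ? x - y : op === "*" ? x * y : x / y;
    return String(result);
  },
};
```

Keeping the schema alongside the implementation means the same definition can be serialized into the LLM request and dispatched when the model calls it.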

MCP Integration

Connect to Model Context Protocol servers to extend Clarissa with additional tools:

/mcp npx -y @modelcontextprotocol/server-filesystem /path/to/directory

Development

Run with hot reloading:

bun run dev

Run tests:

bun test

Building Binaries

Build for your current platform:

bun run build:current

Build for all platforms:

bun run build:all

Binaries are output to the dist/ directory.

Publishing to npm

npm publish

Project Structure

src/
  index.tsx        # Entry point
  agent.ts         # ReAct agent loop implementation
  config/          # Environment configuration
  llm/             # LLM client and context management
  mcp/             # MCP client integration
  session/         # Session persistence
  tools/           # Tool definitions
  ui/              # Ink UI components

License

MIT


Made with ❤️ by Cameron Rye