
neochat

v3.7.1

A CLI chatbot that lets you choose between ChatGPT, Claude, and Gemini as the underlying model — styled after Claude Code.

Install

npm install -g neochat

Setup

Set the API key(s) for the provider(s) you want to use:

export OPENAI_API_KEY="sk-..."         # for GPT models
export ANTHROPIC_API_KEY="sk-ant-..."  # for Claude models
export GEMINI_API_KEY="..."            # for Gemini models

neochat also auto-loads these keys from your shell rc files if they aren't already in the environment — handy when you launch it from a shell that didn't source them (non-login shells, IDE terminals, etc.):

  • zsh → ~/.zshrc, ~/.zshenv, ~/.zprofile, ~/.profile
  • bash (default) → ~/.bashrc, ~/.bash_profile, ~/.profile

Both export FOO=bar and plain FOO=bar forms are recognized, with single/double quotes and trailing # comments handled. Values containing shell expansions ($VAR, $(...), backticks) are skipped — export the resolved value or set it in your current shell instead.
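The parsing rules above could look something like this minimal sketch (illustrative only, not neochat's actual source; `parseRcLine` is a hypothetical helper):

```javascript
// Recognize `export FOO=bar` and plain `FOO=bar`, strip surrounding
// single/double quotes and trailing # comments, and skip values that
// would need shell evaluation ($VAR, $(...), backticks).
function parseRcLine(line) {
  const m = /^\s*(?:export\s+)?([A-Za-z_][A-Za-z0-9_]*)=(.*)$/.exec(line);
  if (!m) return null;
  const name = m[1];
  let value = m[2].trim();
  const quoted = /^(['"])(.*?)\1\s*(?:#.*)?$/.exec(value);
  if (quoted) {
    value = quoted[2];            // unquote, dropping any trailing comment
  } else {
    value = value.replace(/\s+#.*$/, "").trim();
  }
  if (/[$`]/.test(value)) return null; // shell expansion needed; skip
  return { name, value };
}
```
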

Usage

# Start with the default model (gpt-4o)
neochat

# Start with a specific model
neochat claude-sonnet-4-20250514

# Or set a default via env var
export NEOCHAT_MODEL=claude-sonnet-4-20250514
neochat

Slash Commands

| Command         | Description                      |
| --------------- | -------------------------------- |
| /model          | List available models            |
| /model &lt;name&gt;   | Switch to a different model      |
| /clear          | Clear conversation history       |
| /help           | Show available commands          |
| /exit           | Quit neochat                     |
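A dispatcher for these commands could be sketched as follows (hypothetical, not neochat's actual implementation; the returned action objects are made up for illustration):

```javascript
// Route a "/..." input line to the matching command from the table above.
function handleSlash(input, state) {
  const [cmd, ...args] = input.trim().split(/\s+/);
  switch (cmd) {
    case "/model":
      // `/model` lists models; `/model <name>` switches to one.
      return args.length ? { action: "switch", model: args[0] }
                         : { action: "list" };
    case "/clear":
      state.history = [];          // drop the conversation so far
      return { action: "cleared" };
    case "/help":
      return { action: "help" };
    case "/exit":
      return { action: "exit" };
    default:
      return { action: "unknown", cmd };
  }
}
```
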

Supported Models

When you pick a provider in the interactive menu, neochat calls that provider's models endpoint with your API key and shows you the live list of models your account can actually use (OpenAI /v1/models, Anthropic /v1/models, Gemini /v1beta/models). After you pick a model, neochat sends a 1-token ping to confirm the key works and you have access to that specific model — so you find out immediately, not on your first chat.

If the provider's list endpoint is unreachable, neochat falls back to a built-in list (kept as a safety net, not the source of truth):

OpenAI (fallback): gpt-4o, gpt-4o-mini, gpt-4-turbo, gpt-3.5-turbo

Anthropic (fallback): claude-sonnet-4-20250514, claude-opus-4-20250514, claude-haiku-4-20250514

Google (fallback): gemini-2.5-pro, gemini-2.5-flash, gemini-2.0-flash, gemini-2.0-flash-lite
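The per-provider list calls described above could be assembled like this. The endpoint URLs and auth header shapes match the providers' public APIs, but the exact request flow neochat uses is an assumption:

```javascript
// Build the model-listing request for each provider. OpenAI and
// Anthropic take the key in a header; Gemini takes it as a query param.
function modelsRequest(provider, apiKey) {
  switch (provider) {
    case "openai":
      return {
        url: "https://api.openai.com/v1/models",
        headers: { Authorization: `Bearer ${apiKey}` },
      };
    case "anthropic":
      return {
        url: "https://api.anthropic.com/v1/models",
        headers: { "x-api-key": apiKey, "anthropic-version": "2023-06-01" },
      };
    case "gemini":
      return {
        url: `https://generativelanguage.googleapis.com/v1beta/models?key=${apiKey}`,
        headers: {},
      };
    default:
      throw new Error(`unknown provider: ${provider}`);
  }
}
```

The "1-token ping" would then be an ordinary chat-completion request with the chosen model and a max-token limit of 1, so an invalid key or inaccessible model fails fast.
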

Web fetching

neochat auto-connects the upstream MCP Fetch server, which exposes a fetch tool that retrieves a URL and returns it as extracted markdown (or raw HTML). There is no self-hosted search — the model asks for specific URLs.

Requirements (one of):

# Preferred: uvx handles install automatically
#   https://github.com/astral-sh/uv
uvx mcp-server-fetch --help      # verify it runs

# Or install into your Python env:
pip install mcp-server-fetch

neochat detects uvx first and falls back to python -m mcp_server_fetch.
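That fallback order can be sketched as a small selector (illustrative; `hasUvx` would come from probing PATH, e.g. with a `spawnSync("uvx", ["--version"])` check):

```javascript
// Prefer uvx (which installs mcp-server-fetch on demand); otherwise
// assume it was pip-installed and run it as a Python module.
function fetchServerCommand(hasUvx) {
  return hasUvx
    ? { command: "uvx", args: ["mcp-server-fetch"] }
    : { command: "python", args: ["-m", "mcp_server_fetch"] };
}
```
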

Filesystem access

neochat also auto-connects the upstream MCP Filesystem server, which gives the model sandboxed tools for reading, writing, editing, and navigating local files (read_text_file, write_file, edit_file, list_directory, search_files, …).

Access is restricted to the roots you authorize. By default the sandbox is the current working directory. To expand it, set NEOCHAT_FS_ROOTS to an OS-path-separator list before launching:

# Linux / macOS
export NEOCHAT_FS_ROOTS="$HOME/projects:$HOME/notes"

# Windows (PowerShell)
$env:NEOCHAT_FS_ROOTS = "C:\\projects;C:\\notes"

neochat

Runs via npx -y @modelcontextprotocol/server-filesystem <roots...> — no install needed beyond npx (ships with Node).

Persistent memory

neochat auto-connects the upstream MCP Memory server — a knowledge-graph store the model uses to remember facts across sessions (user role, preferences, ongoing projects, feedback it shouldn't repeat).

Tools exposed: create_entities, create_relations, add_observations, delete_entities, delete_relations, delete_observations, read_graph, search_nodes, open_nodes.

The graph is persisted to $MEMORY_FILE_PATH (default ~/.neochat/memory.json). Point it elsewhere to share memory across machines (e.g. a synced directory) or to keep project-scoped graphs:

export MEMORY_FILE_PATH="$HOME/Dropbox/neochat-memory.json"
# or per-project
MEMORY_FILE_PATH=./.neochat-memory.json neochat

Delete the file to wipe memory completely.

Sequential thinking

neochat auto-connects the upstream MCP Sequential Thinking server, which exposes a single sequentialthinking tool. The model uses it to externalize structured, revisable chains of reasoning for hard problems (non-trivial refactors, opaque debugging, design work). No configuration — runs via npx -y @modelcontextprotocol/server-sequential-thinking.

License

MIT