
mahanai

v4.4.11


MahanAI Super 2.0 — terminal AI agent with streaming, tools, multi-model support (NVIDIA NIM, Claude, OpenAI Codex, custom endpoint).

Readme

MahanAI Super

Terminal AI agent (Super 2.0) with multi-model support, streaming chat, tools, and a built-in Claude CLI mode. Docs: MahanAI.

Install

pip install mahanai
mahanai

Launch Options

| Flag | Description |
|---|---|
| --compact | Compact mode: renders a small MAI ASCII logo and a trimmed header (no streaming hint, no /api-key reminder) |

mahanai --compact

Compact banner looks like:

===================================
  Super 2.0  |  <Model>  |
  /help  /exit  /quit
===================================

Models

MahanAI Super supports multiple backends selectable at runtime via /models.

NVIDIA NIM

| Pretty Name | Model ID | Backend |
|---|---|---|
| Llama 3.3 70B | meta/llama-3.3-70b-instruct | NVIDIA NIM (direct) |

Note: A legacy server mode (mahanai/mahanai) exists in the model selector but is undocumented and not recommended for use.

Claude

| Pretty Name | Model ID | Backend |
|---|---|---|
| Claude Opus 4 | claude-opus-4-7 | Claude CLI |
| Claude Sonnet 4.6 | claude-sonnet-4-6 | Claude CLI |
| Claude Haiku 4.5 | claude-haiku-4-5-20251001 | Claude CLI |

OpenAI Codex

Seven models are available, each accessible in Direct and Indirect modes (see OpenAI Codex below):

| Pretty Name | Model ID |
|---|---|
| GPT-5.4 | gpt-5.4 |
| GPT-5.2-Codex | gpt-5.2-codex |
| GPT-5.1-Codex-Max | gpt-5.1-codex-max |
| GPT-5.4-Mini | gpt-5.4-mini |
| GPT-5.3-Codex | gpt-5.3-codex |
| GPT-5.2 | gpt-5.2 |
| GPT-5.1-Codex-Mini | gpt-5.1-codex-mini |

Switch models interactively with /models (arrow-key selector), or quick-switch with /mode claude or /mode default.

Custom Endpoint

Point MahanAI at any OpenAI-compatible API (Ollama, LM Studio, vLLM, OpenRouter, etc.):

/custom http://localhost:11434/v1 llama3 [optional-api-key]

Once saved, select Custom Endpoint from /models to start using it. The config persists across sessions.

Commands

| Command | Description |
|---|---|
| /models | Interactive model selector (↑↓ arrows, Enter to confirm, Esc to cancel) |
| /mode claude | Quick-switch to Claude Sonnet 4.6 |
| /mode default | Quick-switch back to MahanAI Super (server) |
| /api-key [key] | Save server API key (omit key for hidden prompt) |
| /api-key clear | Remove saved server key |
| /api-key-nvidia [key] | Save NVIDIA direct API key |
| /api-key-nvidia clear | Remove NVIDIA key, switch back to server |
| /codex-login | Sign in to OpenAI via browser (Codex Direct mode) |
| /codex-logout | Remove saved OpenAI Codex credentials |
| /custom [url [model [key]]] | Configure a custom OpenAI-compatible endpoint |
| /custom clear | Remove saved custom endpoint |
| /help | Show help |
| /exit or /quit | Leave |

API Keys

Server / NVIDIA NIM

  1. Environment: MAHANAI_API_KEY=...
  2. Project .env: MAHANAI_API_KEY=...
  3. In-app: /api-key your-key

Keys are stored under %APPDATA%\MahanAI\config.json on Windows or ~/.config/mahanai/config.json on Linux/macOS.

Claude CLI mode

Claude models use your local claude CLI installation. Make sure Claude Code is installed and on your PATH. No extra API key configuration needed inside MahanAI — it uses whatever account Claude CLI is authenticated with.

OpenAI Codex

MahanAI supports two Codex authentication modes:

Direct mode

Signs in to your OpenAI account via a browser-based OAuth PKCE flow — no API key needed.

/codex-login

This opens your browser to auth.openai.com. After you approve, MahanAI receives and stores the access token automatically. Tokens are refreshed silently before they expire (saved to the same config.json as other keys).
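The PKCE part of that flow means no client secret ever touches your machine: the client proves it started the login by hashing a one-time verifier. A self-contained sketch of the verifier/challenge pair such a flow generates (this is the standard RFC 7636 construction, not MahanAI-specific code):

```python
import base64
import hashlib
import secrets

def pkce_pair() -> tuple[str, str]:
    """Generate an OAuth PKCE code_verifier and its S256 code_challenge."""
    # 32 random bytes -> 43-char base64url string (padding stripped)
    verifier = base64.urlsafe_b64encode(secrets.token_bytes(32)).rstrip(b"=").decode()
    # challenge = base64url(sha256(verifier)), also without padding
    digest = hashlib.sha256(verifier.encode()).digest()
    challenge = base64.urlsafe_b64encode(digest).rstrip(b"=").decode()
    return verifier, challenge
```

The challenge is sent in the initial browser redirect; the verifier is only revealed when exchanging the authorization code for tokens.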

Indirect mode

Reads credentials from a locally installed and signed-in OpenAI Codex CLI. MahanAI looks for auth.json in these locations:

| Platform | Paths checked |
|---|---|
| Windows | %LOCALAPPDATA%\OpenAI\Codex\auth.json, ~\.codex\auth.json |
| macOS / Linux | ~/.codex/auth.json, ~/.config/codex/auth.json |

If no token file is found, MahanAI falls back to running the codex CLI as a subprocess (requires Codex CLI on your PATH).

To use indirect mode, install and sign in to the Codex CLI first:

npm i -g @openai/codex
codex login

Then select any OpenAI Codex (Indirect) model from /models.

Custom Endpoint

Use /custom to connect to any OpenAI-compatible server — Ollama, LM Studio, vLLM, OpenRouter, or your own deployment.

Interactive setup (prompts for each field):

/custom

One-liner:

/custom <base-url> [model] [api-key]

Examples:

/custom http://localhost:11434/v1 llama3
/custom http://localhost:1234/v1 mistral-7b
/custom https://openrouter.ai/api/v1 openai/gpt-4o sk-or-...
  • base-url — the /v1 base URL of the server
  • model — model ID to send in requests (defaults to gpt-3.5-turbo if omitted)
  • api-key — leave blank if the server doesn't require one

After saving, run /models and select Custom Endpoint, or the agent will remind you to switch if you haven't already. To remove the config:

/custom clear

Environment Variables

| Variable | Purpose |
|---|---|
| MAHANAI_API_KEY | Override saved server API key |
| MAHANAI_MODEL | Override default model ID |
| MAHANAI_STREAM | Set to 0/false/no/off to disable streaming |
| MAHANAI_CONFIG_DIR | Override config file directory |
| NO_COLOR | Disable terminal colors |
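The MAHANAI_STREAM semantics above (on by default, off only for the listed values) could be checked like this; the function name is illustrative, not MahanAI's internal API:

```python
import os

def streaming_enabled() -> bool:
    """Streaming stays on unless MAHANAI_STREAM is one of the documented off values."""
    value = os.environ.get("MAHANAI_STREAM", "")
    return value.strip().lower() not in {"0", "false", "no", "off"}
```

An unset variable or any other value (e.g. `1`, `true`) leaves streaming enabled.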

Tools

MahanAI can execute tools on your behalf:

  • run_command — run shell commands (asks for confirmation before destructive operations)
  • read_file — read a file
  • write_file — write a file
  • append_file — append to a file
  • list_directory — list directory contents

Develop

pip install -e .
python -m mahanai

Publish to PyPI

Bump version in both pyproject.toml and mahanai/__init__.py, then:

pip install build twine
python -m build
python -m twine check dist/*

Windows (PowerShell):

$env:TWINE_USERNAME = "__token__"
$env:TWINE_PASSWORD = "pypi-YOUR_TOKEN_HERE"
python -m twine upload dist/*

macOS / Linux:

export TWINE_USERNAME=__token__
export TWINE_PASSWORD=pypi-YOUR_TOKEN_HERE
python -m twine upload dist/*

twine cannot publish without your token; keep it out of git.