
codebase-rag-tui v0.3.2

Terminal-based AI agent for codebase RAG (Retrieval-Augmented Generation)

Downloads: 541

Codebase RAG

A terminal-based AI coding assistant that uses Retrieval-Augmented Generation (RAG) and Abstract Syntax Tree (AST) analysis to provide context-aware code review and modification. It parses your codebase into a knowledge graph, then gives an LLM precise, structural context when answering questions or making edits.

Design

How It Works

┌───────────────┐      Socket.IO / REST       ┌──────────────────┐
│   React Ink   │  ◄───────────────────────►  │   FastAPI + SIO  │
│   Frontend    │                             │   Backend        │
└───────────────┘                             └────────┬─────────┘
                                                       │
                                     ┌─────────────────┼─────────────────┐
                                     │                 │                 │
                                ┌────▼────┐     ┌──────▼──────┐    ┌─────▼─────┐
                                │Memgraph │     │   Redis     │    │ Postgres  │
                                │ (Graph) │     │  (Cache)    │    │(Sessions) │
                                └─────────┘     └─────────────┘    └───────────┘

Backend — The core of the system lives in api/. It uses Tree-sitter to parse source code into ASTs, extracting structural entities (modules, classes, functions, methods, imports, call relationships) across multiple languages (Python, TypeScript, JavaScript, Java, C++, Rust, Go, Lua, Scala). These entities and their relationships are ingested into a Memgraph knowledge graph.
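The project uses Tree-sitter for multi-language parsing, but the same extraction idea can be illustrated with Python's standard-library ast module. The sketch below (names and structure are illustrative, not the project's actual code) walks a parsed tree and collects the kinds of entities a code knowledge graph would store:

```python
# Illustrative sketch only: the project parses many languages with Tree-sitter;
# Python's stdlib `ast` shows the same entity-extraction idea for Python sources.
import ast

def extract_entities(source: str) -> dict:
    """Collect structural entities that a code knowledge graph might ingest."""
    tree = ast.parse(source)
    entities = {"classes": [], "functions": [], "imports": [], "calls": []}
    for node in ast.walk(tree):
        if isinstance(node, ast.ClassDef):
            entities["classes"].append(node.name)
        elif isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
            entities["functions"].append(node.name)
        elif isinstance(node, ast.Import):
            entities["imports"].extend(alias.name for alias in node.names)
        elif isinstance(node, ast.ImportFrom):
            entities["imports"].append(node.module or "")
        elif isinstance(node, ast.Call) and isinstance(node.func, ast.Name):
            entities["calls"].append(node.func.id)  # direct call relationships
    return entities

sample = """
import os

class Repo:
    def scan(self):
        return os.listdir(".")
"""
print(extract_entities(sample))
```

Each extracted entity becomes a node in the graph, and relationships such as "class contains method" or "function calls function" become edges.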

When you ask a question, a RAG orchestrator:

  1. Translates your natural-language query into a Cypher graph query to retrieve structurally relevant code (call chains, inheritance, imports).
  2. Optionally performs semantic vector search for intent-based discovery.
  3. Feeds the retrieved context to the LLM, which can then answer questions, review code, or propose edits — grounded in your actual codebase structure.
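The three steps above can be sketched as a minimal pipeline. All function names and the Cypher shape here are hypothetical illustrations, not the orchestrator's actual API:

```python
# Hypothetical sketch of the three retrieval steps; names are illustrative.

def to_cypher(symbol: str) -> str:
    """Step 1: turn a question about a symbol into a graph query for its callers.
    (A real system would parameterize the query rather than interpolate.)"""
    return (
        f"MATCH (caller:Function)-[:CALLS]->(f:Function {{name: '{symbol}'}}) "
        "RETURN caller.name"
    )

def vector_search(question: str, index: dict) -> list:
    """Step 2 (stub): semantic discovery; a real system would embed and rank."""
    words = question.lower().split()
    return [doc for doc, text in index.items() if any(w in text for w in words)]

def build_prompt(question: str, graph_rows: list, snippets: list) -> str:
    """Step 3: ground the LLM prompt in the retrieved structural context."""
    context = "\n".join(graph_rows + snippets)
    return f"Context:\n{context}\n\nQuestion: {question}"

# Example flow
query = to_cypher("parse_file")
snippets = vector_search("where is parsing done", {"parser.py": "parsing logic"})
prompt = build_prompt("Who calls parse_file?", ["handler -> parse_file"], snippets)
```

The key point is ordering: structural retrieval narrows the candidate code before the LLM ever sees a token, so answers stay grounded in real call chains rather than guesses.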

The backend also supports an agent mode with tools for file reading/writing, shell commands, directory listing, and document analysis, allowing the LLM to autonomously explore and modify code.
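A tool-calling agent of this shape usually boils down to a registry that maps tool names to implementations. The sketch below is a hypothetical illustration of that dispatch pattern, not the project's actual tool API:

```python
# Hypothetical agent-tool registry; names are illustrative, not the real API.
import os
import subprocess

def read_file(path: str) -> str:
    with open(path, encoding="utf-8") as f:
        return f.read()

def write_file(path: str, content: str) -> str:
    with open(path, "w", encoding="utf-8") as f:
        f.write(content)
    return f"wrote {len(content)} bytes"

def run_shell(cmd: str) -> str:
    return subprocess.run(cmd, shell=True, capture_output=True, text=True).stdout

def list_dir(path: str) -> list:
    return sorted(os.listdir(path))

TOOLS = {
    "read_file": read_file,
    "write_file": write_file,
    "shell": run_shell,
    "list_dir": list_dir,
}

def dispatch(name: str, **kwargs):
    """Route an LLM tool call (name + arguments) to its implementation."""
    return TOOLS[name](**kwargs)
```

In a loop, the LLM emits a tool name plus arguments, `dispatch` executes it, and the result is fed back as context for the next step.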

Frontend — A terminal UI built with React Ink. It connects to the backend over Socket.IO (for real-time file operations the backend delegates to the client) and REST (for queries). The TUI supports two modes — Chat for Q&A and Agent for autonomous code modifications with accept/reject review.

Supported Languages

Python, JavaScript, TypeScript, Java, C++, Rust, Go, Lua, Scala.

Prerequisites

  • Linux environment (required)
  • Node.js >= 16 and pnpm
  • Python >= 3.12 and uv
  • make
  • Docker and Docker Compose (for Memgraph, Redis, and Postgres)
  • An LLM provider API key (Google, OpenAI, Vertex AI, or a local Ollama endpoint)

Setup

1. Environment Variables

cp .env.example .env

Fill in your LLM provider API keys and any other settings in the .env file.

2. Start the Backend

Start the infrastructure services:

cd api
docker compose up -d     # starts Memgraph, Redis, and Postgres

Then start the API server:

cd api
uv sync                  # install Python dependencies
make                     # starts the FastAPI server on port 8000

You can also access the Memgraph Lab UI at http://localhost:3000 to visually explore the knowledge graph.

3. Start the Frontend

# From the project root
pnpm install
pnpm run build
node dist/cli.js

Usage

  1. Launch the TUI with node dist/cli.js.
  2. Select a mode — Chat or Agent.
  3. Enter the path to your repository when prompted.
  4. Start asking questions about your codebase or request code modifications.

TUI Commands

| Command  | Description                              |
| -------- | ---------------------------------------- |
| /help    | Show available commands                  |
| /clear   | Clear the conversation                   |
| /quit    | Leave the current session and reset      |
| /exit    | Exit the application                     |

In Agent mode, when the assistant proposes code edits, you'll be prompted to accept or reject the changes before they are applied.
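A propose-then-confirm flow like this is typically built on a unified diff preview. The sketch below, using Python's standard-library difflib, is an illustration of the pattern (names are hypothetical), not the TUI's actual implementation:

```python
# Sketch of an accept/reject review step using stdlib difflib;
# function names are illustrative, not the project's actual code.
import difflib

def preview_edit(path: str, old: str, new: str) -> str:
    """Render a unified diff of a proposed edit for the user to review."""
    return "".join(difflib.unified_diff(
        old.splitlines(keepends=True),
        new.splitlines(keepends=True),
        fromfile=f"a/{path}",
        tofile=f"b/{path}",
    ))

def apply_if_accepted(old: str, new: str, accepted: bool) -> str:
    """Only apply the proposed content when the user accepts it."""
    return new if accepted else old

diff = preview_edit("app.py", "x = 1\n", "x = 2\n")
print(diff)
```

Showing the diff before writing anything keeps the agent's file modifications reversible at the review step: rejecting simply leaves the original content untouched.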

License

MIT