adist
v1.0.19
A project indexing and distribution management tool for LLMs
Adist
A powerful CLI tool for indexing, searching, and having AI-powered conversations about your projects.
Developed by okik.ai.
Contributing
Contributions are welcome! Feel free to submit issues and pull requests to help improve Adist.
The repository is hosted at github.com/okikorg/adist.
⚠️ IMPORTANT: This is an active development project. Breaking changes may occur between versions as we continue to improve the tool. Please check the changelog when updating.
Features
- 🔍 Fast document indexing and semantic searching
- 🔎 Advanced search operators (AND/OR) for precise queries
- 📁 Support for multiple projects
- 🎯 Project-specific search
- 🧩 Block-based indexing for more precise document analysis
- 🤖 LLM-powered document summarization using Anthropic's Claude or local Ollama models
- 🗣️ Interactive chat with AI about your codebase
- 📊 Project statistics and file analysis
- 🔄 Easy project switching and reindexing
- ⚡ Real-time streaming responses for chat and queries
Installation
```
npm install -g adist
```
Usage
Initialize a Project
```
adist init <project-name>
```
This will:
- Create a new project configuration
- Index all supported files in the current directory
- Optionally generate LLM summaries if you have the ANTHROPIC_API_KEY environment variable set
- Create a Cursor rules file (.cursor/rules/adist-rule.mdc) with usage instructions for AI assistants
Search Documents
```
adist get "<query>"
```
Search for documents in the current project using natural language queries. You can use advanced operators (AND/OR) for more precise searching.
Advanced Search Operators
The search functionality supports advanced operators for more precise results:
```
# AND operator - find documents containing both terms
adist get "authentication AND middleware"

# OR operator - find documents containing either term
adist get "login OR signin"
```
- AND: all terms must be present in the document
- OR: any of the terms can be present in the document
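The AND/OR semantics can be illustrated with a short sketch. This is a hypothetical `matches` helper written for illustration only, not adist's actual search implementation (which is block-based and more sophisticated):

```typescript
// Illustrative only: evaluate an AND/OR query against a document's text.
// AND requires every term to be present; OR requires at least one.
function matches(docText: string, query: string): boolean {
  const text = docText.toLowerCase();
  if (query.includes(" AND ")) {
    return query
      .split(" AND ")
      .every((term) => text.includes(term.trim().toLowerCase()));
  }
  if (query.includes(" OR ")) {
    return query
      .split(" OR ")
      .some((term) => text.includes(term.trim().toLowerCase()));
  }
  return text.includes(query.toLowerCase());
}
```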
Query Your Project with AI
```
adist query "<question>"
```
Ask questions about your project and get AI-powered answers. The AI analyzes relevant documents from your codebase to provide contextual answers with proper code highlighting.
For real-time streaming responses (note that code highlighting may be limited):
```
adist query "<question>" --stream
```
Chat with AI About Your Project
```
adist chat
```
Start an interactive chat session with AI about your project. This mode provides:
- Persistent conversation history within the session
- Context awareness across multiple questions
- Code syntax highlighting for better readability
- Automatic retrieval of relevant documents for each query
By default, chat mode displays a loading spinner while generating responses. For real-time streaming responses, use:
```
adist chat --stream
```
Note that code highlighting may be limited in streaming mode.
Type /exit to end the chat session.
Switch Projects
```
adist switch <project-name>
```
Switch to a different project for searching.
List Projects
```
adist list
```
View all configured projects.
Reindex Project
```
adist reindex
```
Reindex the current project. Use --summarize to generate LLM summaries:
```
adist reindex --summarize
```
This will:
- Show project statistics (total files, size, word count)
- Ask for confirmation before proceeding with summarization
- Generate summaries for each file
- Create an overall project summary
Inspect File Structure
```
adist inspect-file <file-path>
```
Visualize how a file is parsed into blocks and what metadata is extracted during indexing. This command displays block IDs, line ranges, and content previews by default.
Options:
- -v, --verbose: Show detailed block metadata
- -t, --tree: Display block structure as a hierarchical tree
- -c, --content: Show full block content (instead of just previews)
```
# Show file blocks as a tree structure
adist inspect-file src/app.js -t

# Show detailed block metadata
adist inspect-file src/app.js -v

# Show full block content
adist inspect-file src/app.js -c

# Use advanced search with AND operator
adist get "database AND connection AND pool"

# Use advanced search with OR operator
adist get "error OR exception OR failure"
```
View Summaries
```
adist summary
```
View the overall project summary. To view a specific file's summary:
```
adist summary --file <filename>
```
Configure LLM Provider
```
adist llm-config
```
Configure which LLM provider to use:
- Anthropic Claude (cloud-based, requires API key)
  - Claude 3 Opus
  - Claude 3 Sonnet
  - Claude 3 Haiku
- OpenAI (cloud-based, requires API key)
  - GPT-4o
  - GPT-4 Turbo
  - GPT-3.5 Turbo
- Ollama (run locally, no API key needed)
  - Choose from any locally installed models
When using Ollama, you can select from your locally installed models and customize the API URL if needed.
LLM Features
The tool supports several LLM-powered features using Anthropic's Claude models, OpenAI's GPT models, or Ollama models (local):
Document Summarization
Generate summaries of your project files to help understand large codebases quickly.
Question Answering
Get specific answers about your codebase without having to manually search through files.
Interactive Chat
Have a natural conversation about your project, with the AI maintaining context between questions.
Streaming Responses
AI interactions can be used in two modes:
- Default mode: Shows a loading spinner while generating responses with full code highlighting
- Streaming mode: Shows real-time responses as they're being generated (use the --stream flag)
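The tradeoff between the two modes can be sketched in a few lines. This is a conceptual illustration, not adist's implementation; the `chunks` array stands in for pieces of a model response arriving over time:

```typescript
// Stand-in for model response chunks arriving over a stream.
const chunks = ["Auth ", "uses ", "JWT ", "middleware."];

// Streaming mode: emit each chunk as soon as it arrives. A highlighter
// only ever sees a partial response, so highlighting is limited.
function streamMode(emit: (s: string) => void): string {
  let full = "";
  for (const chunk of chunks) {
    emit(chunk); // shown to the user immediately
    full += chunk;
  }
  return full;
}

// Default mode: buffer the complete response, then render it once,
// which lets a highlighter see whole code fences before printing.
function defaultMode(render: (s: string) => void): string {
  const full = chunks.join("");
  render(full);
  return full;
}
```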
```
# Default mode with loading spinner and code highlighting
adist query "How does authentication work?"

# Streaming mode with real-time responses
adist query "How does authentication work?" --stream
```
Setting Up
You have three options for using LLM features:
Option 1: Anthropic Claude (Cloud)
Set your Anthropic API key in the environment:
```
export ANTHROPIC_API_KEY='your-api-key-here'
```
Configure to use Anthropic and select your preferred model:
```
adist llm-config
```
Option 2: OpenAI (Cloud)
Set your OpenAI API key in the environment:
```
export OPENAI_API_KEY='your-api-key-here'
```
Configure to use OpenAI and select your preferred model:
```
adist llm-config
```
Option 3: Ollama (Local)
Install Ollama from ollama.com/download
Run Ollama and pull a model (e.g., llama3):
```
ollama pull llama3
```
Configure adist to use Ollama:
```
adist llm-config
```
Select Ollama and choose your preferred model from the list.
Initialize Your Project
After setting up your preferred LLM provider:
Initialize your project:
```
adist init <project-name>
```
Start interacting with your codebase:
```
adist query "How does the authentication system work?"
# or
adist chat
```
Supported File Types
The tool indexes a wide range of file types including:
- Markdown (.md)
- Text (.txt)
- Code files (.js, .ts, .py, .go, etc.)
- Documentation (.rst, .asciidoc)
- Configuration files (.json, .yaml, .toml)
- And many more
Configuration
The tool stores its configuration in:
- macOS: ~/Library/Application Support/adist
- Linux: ~/.config/adist
- Windows: %APPDATA%\adist
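The per-OS paths above follow a common Node.js convention. A minimal sketch of how such a config directory is typically resolved (adist's own resolution logic may differ):

```typescript
import * as os from "os";
import * as path from "path";

// Resolve a per-OS config directory for an app, following the same
// conventions as the paths listed above.
function configDir(appName: string): string {
  const home = os.homedir();
  switch (process.platform) {
    case "darwin": // macOS
      return path.join(home, "Library", "Application Support", appName);
    case "win32": // Windows: %APPDATA%\<app>
      return path.join(
        process.env.APPDATA ?? path.join(home, "AppData", "Roaming"),
        appName
      );
    default: // Linux and others: XDG config dir
      return path.join(
        process.env.XDG_CONFIG_HOME ?? path.join(home, ".config"),
        appName
      );
  }
}

console.log(configDir("adist"));
```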
Recent Updates
April 2025
- Added advanced search operators:
  - AND operator to find documents containing all specified terms
  - OR operator to find documents containing any of the specified terms
- Added automatic Cursor rules creation during project initialization to assist AI tools
March 2025
- Improved chat and query commands with better code highlighting in non-streaming mode (default)
- Added --stream flag to chat and query commands for real-time streaming responses
- Added support for OpenAI models (GPT-4o, GPT-4 Turbo, GPT-3.5 Turbo)
- Added support for all Claude 3 models (Opus, Sonnet, Haiku)
- Added block-based indexing as the default method for faster and more precise document analysis
- Made block-based search the default search method for better contextual understanding
- Legacy indexing and search methods are still available under legacy-reindex and legacy-get
- Added support for Ollama to run LLM features locally without an API key
- Added LLM provider configuration command for easy switching between Anthropic, OpenAI, and Ollama
- Enhanced document relevance ranking for more accurate results
- Added automatic related document discovery for richer context
- Optimized token usage to reduce API costs
Block-Based Indexing
The latest version of adist uses block-based indexing by default, which:
- Splits documents into semantic blocks (functions, sections, paragraphs)
- Indexes each block individually with its metadata
- Allows for more precise searching and better context understanding
- Improves AI interactions by providing more relevant code snippets
You can see how files are parsed into blocks using the inspect-file command.
The previous full-document indexing method is still available as legacy-reindex and legacy-get commands.
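To make the idea concrete, here is a naive block splitter in the spirit of block-based indexing. It is illustrative only; adist's real parser is language-aware and extracts richer metadata than this blank-line heuristic:

```typescript
// A block with the kind of metadata inspect-file displays:
// an ID, a line range, and the block's content.
interface Block {
  id: number;
  startLine: number; // 1-based, inclusive
  endLine: number;   // 1-based, inclusive
  content: string;
}

// Naive splitter: a blank line (or end of file) closes the current block.
function splitIntoBlocks(text: string): Block[] {
  const lines = text.split("\n");
  const blocks: Block[] = [];
  let start = 0;
  for (let i = 0; i <= lines.length; i++) {
    const atEnd = i === lines.length;
    if (atEnd || lines[i].trim() === "") {
      if (i > start) {
        blocks.push({
          id: blocks.length,
          startLine: start + 1,
          endLine: i,
          content: lines.slice(start, i).join("\n"),
        });
      }
      start = i + 1;
    }
  }
  return blocks;
}
```

Each block can then be indexed and matched individually, so a search hit points at a specific function or section rather than a whole file.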
Cursor IDE Integration
Adist automatically creates a .cursor/rules/adist-rule.mdc file during project initialization. This integration provides:
- Automatic instructions for AI assistants in Cursor IDE
- Commands and examples available directly to the AI when working with your code
- Context about advanced search operators and available tools
- Improved AI assistance with project-specific knowledge
The rules file helps AI assistants understand how to use adist to search your codebase, providing better answers and more useful suggestions when you're coding.
License
MIT
