nebula-ai-cli v0.2.0

Nebula AI CLI assistant - Multi-provider AI assistant for developers

Nebula AI CLI Assistant

A powerful AI assistant designed specifically for developers. Nebula supports multiple AI providers and offers advanced features such as context management, interactive chat, code analysis, and more.

🚀 Features

Multi-Provider AI Support

  • OpenAI (GPT-4, GPT-4o, GPT-3.5-turbo)
  • Anthropic (Claude 3.5 Sonnet, Claude 3 Opus)
  • Google (Gemini 1.5 Pro, Gemini 1.5 Flash)
  • Groq (Llama 3.1, Mixtral - Ultra-fast inference)
  • Ollama (Local models - Free, no API key needed)
  • Cohere (Command R+, Command R)
  • Mistral AI (Mistral Large, Medium, Small)
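
If you want a feel for how switching between these providers looks in practice, the sketch below uses the provider and model commands from the Commands Reference (the Anthropic model id is shown only as an example; check nebula model list for what your install actually reports):

# List configured providers and switch the active model (illustrative)
nebula provider list
nebula model set gpt-4o --provider openai
nebula model set claude-3-5-sonnet --provider anthropic  # example model id; verify with `nebula model list`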

Advanced Context Management

  • Add files and directories to conversation context
  • Automatic project context detection (package.json, README, etc.)
  • Git integration (branch, commits, diff analysis)
  • Environment detection and tool availability
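
As a rough sketch of how these pieces combine (the commands are the ones documented under Usage; the prompt and file paths are just examples):

# Add source files to the conversation context, then ask a question with project context
nebula context add src/ --pattern "**/*.js"
nebula chat "Why does the build fail on Node 18?" --context
nebula context list  # confirm what the assistant will see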

Developer-Focused Features

  • Code Analysis: Explain, debug, review, and optimize code
  • Interactive Chat: Persistent conversation sessions
  • Quick Commands: Fast access to common tasks
  • Template Generation: Boilerplate code for common patterns
  • Search Integration: Built-in grep with ripgrep support
  • File Editing: Safe find/replace operations
  • GitHub Integration: PR creation and management

📦 Installation

Global Installation (Recommended)

npm install -g nebula-ai-cli
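
Assuming the install succeeded, a quick sanity check is to run the status command described under Core Commands:

nebula status  # shows current configuration; running it at all confirms the binary is on your PATH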

Local Development

git clone <repository>
cd nebula/cli
npm install
npm link

🔧 Setup

Quick Setup

After installation, run the interactive setup:

nebula provider setup

Manual Provider Configuration

OpenAI

export OPENAI_API_KEY="sk-..."
nebula provider setup openai

Anthropic Claude

export ANTHROPIC_API_KEY="sk-ant-..."
nebula provider setup anthropic

Google Gemini

export GOOGLE_API_KEY="..."
nebula provider setup google

Groq (Fast Inference)

export GROQ_API_KEY="gsk_..."
nebula provider setup groq

Ollama (Local, Free)

# Install Ollama first: https://ollama.ai
ollama pull llama3.2
nebula provider setup ollama

💬 Usage

Basic Chat

# Simple question
nebula chat "How do I optimize this React component?"

# With file context
nebula chat "Review this code" --file src/component.jsx

# With project context
nebula chat "Help me debug this issue" --context

# Interactive mode
nebula interactive

Quick Commands

# Explain code
nebula explain "const [state, setState] = useState([])"

# Debug issues
nebula debug "Getting 'Cannot read property' error" --file app.js

# Code review
nebula review --file src/utils.js

# Generate templates
nebula template react-component --file NewComponent.jsx

Context Management

# Add files to context
nebula context add src/app.js
nebula context add src/ --pattern "**/*.js" --exclude "node_modules/**"

# List current context
nebula context list

# Clear context
nebula context clear

Provider Management

# List all providers
nebula provider list

# Setup new provider
nebula provider setup anthropic

# Switch models
nebula model set gpt-4o --provider openai
nebula model list

Advanced Features

# Different chat modes
nebula chat "Optimize this function" --mode code-review
nebula chat "Explain async/await" --mode learning
nebula chat "Design a microservice" --mode architecture

# Save conversations
nebula chat "Help with deployment" --save
nebula history show

# Custom parameters
nebula chat "Creative story" --temperature 0.9 --max-tokens 2000

🛠️ Commands Reference

Core Commands

  • nebula chat <prompt> - Chat with AI assistant
  • nebula interactive - Start interactive chat session
  • nebula status - Show current configuration

Provider Management

  • nebula provider setup [provider] - Configure AI provider
  • nebula provider list - List all available providers
  • nebula model set <model> - Set default model
  • nebula model list [provider] - List available models

Context Management

  • nebula context add <path> - Add file/directory to context
  • nebula context remove <path> - Remove from context
  • nebula context list - Show current context
  • nebula context clear - Clear all context

Quick Commands

  • nebula explain <code> - Explain code or concept
  • nebula debug <issue> - Help debug problems
  • nebula review <code> - Code review and suggestions
  • nebula template [type] - Generate code templates

Utilities

  • nebula grep <pattern> - Search files (uses ripgrep if available)
  • nebula edit replace <file> - Safe find/replace operations
  • nebula pr create - Create GitHub pull requests
  • nebula history show - Show conversation history
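
The utilities are only listed by name above; here is a brief, illustrative sketch of the search and PR helpers (the search pattern is just an example):

# Search the repo for TODO markers (uses ripgrep when available)
nebula grep "TODO"

# Open a pull request for the current branch via the GitHub integration
nebula pr create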

🎯 Chat Modes

Nebula supports specialized chat modes for different use cases:

  • default - General purpose assistance
  • code-review - Focus on code quality and best practices
  • debugging - Systematic problem-solving approach
  • architecture - System design and architectural guidance
  • learning - Educational explanations with examples
  • devops - Infrastructure and deployment guidance
nebula chat "How to scale this service?" --mode architecture
nebula chat "Explain closures" --mode learning

📝 Templates

Generate boilerplate code for common patterns:

# Available templates
nebula template

# Generate specific template
nebula template react-component
nebula template dockerfile --file Dockerfile
nebula template github-workflow --file .github/workflows/ci.yml

Available templates:

  • react-component - React functional component with TypeScript
  • express-route - Express.js route handler
  • python-class - Python class with docstrings
  • dockerfile - Multi-stage Docker configuration
  • github-workflow - GitHub Actions CI/CD pipeline
  • terraform-module - Terraform module structure

🔍 Context Features

File Context

# Add single file
nebula context add src/app.js --alias "main-app"

# Add directory with filters
nebula context add src/ --pattern "**/*.{js,ts}" --exclude "*.test.js"

Project Context

Automatically detects and includes:

  • package.json information
  • README content
  • Programming languages used
  • Framework detection (React, Vue, Express, etc.)
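
In practice this project context is pulled in by passing --context to a chat, as in this illustrative example:

# Let Nebula auto-detect package.json, README, languages and frameworks
nebula chat "Which framework and build tooling does this project use?" --context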

Git Context

When in a git repository, includes:

  • Current branch
  • Recent commits
  • File changes (staged/unstaged)
  • Git status
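
For example, from inside a repository you can lean on the git-aware context, or pipe a diff in directly (the pipe form mirrors the git-hook example under Integration with Other Tools):

# Ask about recent work using the auto-detected git context
nebula chat "Summarize what changed on this branch" --context

# Or send a specific diff straight to the assistant
git diff --cached | nebula chat "Review these staged changes"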

🔧 Configuration

Configuration is stored in your OS-specific config directory:

  • macOS: ~/Library/Preferences/nebula/
  • Linux: ~/.config/nebula/
  • Windows: %APPDATA%\nebula\

Environment Variables

# Provider API Keys
export OPENAI_API_KEY="sk-..."
export ANTHROPIC_API_KEY="sk-ant-..."
export GOOGLE_API_KEY="..."
export GROQ_API_KEY="gsk_..."
export COHERE_API_KEY="..."
export MISTRAL_API_KEY="..."

# Ollama Configuration
export OLLAMA_HOST="http://localhost:11434"

🚀 Advanced Usage

Batch Processing

# Process multiple files
for file in src/*.js; do
  nebula review --file "$file" --raw >> review-results.txt
done

Custom Workflows

# Code review workflow
nebula context add src/
nebula chat "Review this codebase for security issues" --mode code-review --save

# Debugging workflow
nebula context add logs/error.log
nebula debug "Application crashes on startup" --context

Integration with Other Tools

# With git hooks
git diff --cached | nebula chat "Review these changes"

# With CI/CD
nebula chat "Optimize this Dockerfile" --file Dockerfile --raw

🔒 Security & Privacy

  • API keys are stored locally in encrypted configuration
  • No conversation data is sent to Nebula servers
  • All communication is direct with your chosen AI provider
  • Local models (Ollama) keep everything on your machine

🐛 Troubleshooting

Common Issues

API Key Not Working

nebula status  # Check configuration
nebula provider setup <provider>  # Reconfigure

Ollama Connection Issues

# Check if Ollama is running
curl http://localhost:11434/api/tags

# Start Ollama
ollama serve

Context Too Large

nebula context clear
nebula context add specific-file.js  # Add only relevant files

Model Not Available

nebula model list  # Check available models
nebula model set <available-model>

📚 Examples

Code Review Example

nebula context add src/auth.js
nebula review "Focus on security vulnerabilities" --mode code-review

Debugging Example

nebula context add error.log
nebula debug "Server returns 500 error on POST requests" --context

Learning Example

nebula explain "async function fetchData() { const response = await fetch('/api'); return response.json(); }" --mode learning

Architecture Example

nebula context add README.md
nebula chat "Design a scalable architecture for this project" --mode architecture

🤝 Contributing

  1. Fork the repository
  2. Create a feature branch
  3. Make your changes
  4. Add tests if applicable
  5. Submit a pull request
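
A minimal sketch of that flow with plain git (the branch name and commit message are placeholders); the last step can also go through nebula pr create, listed under Utilities:

git checkout -b feature/my-change
# ...make your changes and add tests...
git commit -am "Describe the change"
git push origin feature/my-change
nebula pr create  # or open the pull request from the GitHub UI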

📄 License

MIT License - see LICENSE file for details

🆘 Support

  • GitHub Issues: Report bugs and request features
  • Documentation: Check the wiki for detailed guides
  • Community: Join discussions in GitHub Discussions

Made with ❤️ for developers by developers