
@neuroequalityorg/knightcode

v0.2.4


Knightcode CLI - Your local AI coding assistant using Ollama, LM Studio, and more


Knightcode CLI

A powerful AI coding assistant CLI tool that helps you write, understand, and debug code using local AI models.

Features

  • 🤖 Local AI-powered code assistance - No cloud API keys required
  • 🏠 Multiple local providers - Ollama and LM Studio support
  • 📝 Code generation and refactoring - Generate code from natural language
  • 🔍 Code explanation and documentation - Understand complex codebases
  • 🐛 Bug fixing and debugging - AI-powered problem solving
  • 💡 Intelligent code suggestions - Context-aware recommendations
  • 🔄 Real-time code analysis - Instant feedback on your code
  • 🔒 Privacy-focused - Your code stays on your machine

Installation

npm install -g @neuroequalityorg/knightcode

Prerequisites

  • Node.js >= 18.0.0
  • Either Ollama or LM Studio installed and running locally

Quick Start

New to Knightcode? Start with our Getting Started Guide for a 5-minute setup!

Option 1: Using Ollama (Recommended)

# Install Ollama
curl -fsSL https://ollama.ai/install.sh | sh

# Start Ollama service
ollama serve

# Pull a coding model
ollama pull devstral:24b

# Test Knightcode
knightcode ask "Hello, can you help me with coding?"

Option 2: Using LM Studio

# Download LM Studio from https://lmstudio.ai/
# Load a model and start the local server

# Test Knightcode
knightcode ask "Hello, can you help me with coding?"

Usage

# Start the CLI
knightcode

# Ask a coding question
knightcode ask "How do I implement a binary search tree in TypeScript?"

# Explain code
knightcode explain path/to/file.ts

# Refactor code
knightcode refactor path/to/file.ts --focus readability

# Fix bugs
knightcode fix path/to/file.ts --issue "Infinite loop in the sort function"

# Generate code
knightcode generate "a REST API server with Express" --language TypeScript

# Use specific AI provider
knightcode --provider ollama --model devstral:24b ask "How do I implement authentication?"

Commands

Core Commands

  • ask - Ask questions about code or programming
  • explain - Get explanations of code files or snippets
  • refactor - Refactor code for better readability or performance
  • fix - Fix bugs or issues in code
  • generate - Generate code based on a prompt

Configuration & System

  • config - View or edit configuration settings
  • login - Log in to Knightcode (for cloud features)
  • logout - Log out and clear stored credentials

AI Providers

Knightcode supports multiple local AI providers:

Ollama

  • Default provider - Easy to set up and use
  • Recommended models: devstral:24b, codellama:7b, llama3.2:3b
  • Port: 11434 (default)
  • Best for: Most users, good balance of speed and quality

LM Studio

  • Alternative provider - More control over models
  • Port: 1234 (default)
  • Best for: Users who want to experiment with different models

Anthropic (Cloud)

  • Fallback option - Requires API key
  • Best for: When local models aren't sufficient
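As a quick sketch (not a Knightcode command), you can probe the default ports listed above to see which local provider is currently reachable:

```shell
# Probe each provider's default port (11434 for Ollama, 1234 for LM Studio).
check_provider() {
  name="$1"; port="$2"
  if curl -sf --max-time 2 "http://localhost:${port}/" >/dev/null 2>&1; then
    echo "${name}: reachable on port ${port}"
  else
    echo "${name}: not reachable on port ${port}"
  fi
}

check_provider "Ollama" 11434
check_provider "LM Studio" 1234
```

If neither provider responds, start one (e.g. `ollama serve`) before running Knightcode.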

Configuration

Knightcode can be configured through:

  1. Configuration file (.knightcode.json) - Recommended
  2. Environment variables - For automation
  3. Command line arguments - For one-time use

Example Configuration File

Create .knightcode.json in your project directory:

{
  "ai": {
    "provider": "ollama",
    "model": "devstral:24b",
    "temperature": 0.7,
    "maxTokens": 4096
  },
  "terminal": {
    "theme": "system",
    "useColors": true
  }
}

Environment Variables

export KNIGHTCODE_AI_PROVIDER=ollama
export KNIGHTCODE_AI_MODEL=devstral:24b

Performance Tips

  • Smaller models (3B-7B): Faster responses, good for simple tasks
  • Larger models (13B-70B): Better quality, slower responses
  • Memory: Ensure you have enough RAM for your chosen model
  • GPU: Models run faster with GPU acceleration (if supported)
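Following the tips above, a `.knightcode.json` tuned for faster responses on modest hardware might look like this (the model name and token limit are examples; substitute a model you have pulled):

```json
{
  "ai": {
    "provider": "ollama",
    "model": "llama3.2:3b",
    "maxTokens": 2048
  }
}
```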

Troubleshooting

Common Issues

  1. Connection failed: Make sure your AI service is running
  2. Model not found: Download/pull the model first
  3. Slow responses: Try a smaller model or check your hardware
  4. Memory errors: Reduce model size or increase available RAM

Getting Help

# Check configuration
knightcode config

# Test connection
knightcode ask "Hello"

# View logs
knightcode --verbose ask "Hello"

Development

# Clone the repository
git clone https://github.com/neuroequalityorg/knightcode.git
cd knightcode

# Install dependencies
npm install

# Build the project
npm run build

# Run in development mode
npm run dev

# Run tests
npm test

Contributing

Contributions are welcome! Please read our Contributing Guide for details on our code of conduct and the process for submitting pull requests.

License

This project is licensed under the MIT License - see the LICENSE file for details.

Support

  • 🚀 Getting Started: GETTING_STARTED.md - 5-minute setup guide
  • 📖 Detailed Setup: SETUP_LOCAL_AI.md - Comprehensive configuration guide
  • 🐛 Issues: Report bugs on GitHub
  • 💬 Discussions: Join community discussions
  • Star: If this project helps you, consider giving it a star!