darkfoo-code
v0.4.4
Darkfoo Code — local AI coding assistant powered by Ollama, vLLM, llama.cpp, and other LLM providers
Darkfoo Code
A local AI coding assistant CLI powered by Ollama. Built with TypeScript, React + Ink, and native Ollama tool calling.
Features
- Interactive REPL with streaming responses and Darkfoo-branded terminal UI
- Non-interactive mode (-p) for scripting
- 14 tools: Bash, Read, Write, Edit, Grep, Glob, WebFetch, Plan mode, Tasks, and more
- Slash commands: /model, /clear, /compact, /cost, /context, /diff, /commit, /history
- !bash mode for direct shell execution
- Arrow-key command history and slash command tab-completion
- Generator-based conversation loop with multi-turn tool use
- Plan mode for structured implementation planning
- Task tracking system (create, update, list tasks)
- Session persistence with auto-save and resume
- DARKFOO.md project context system (like CLAUDE.md)
- Permission system with settings storage
- Markdown rendering in terminal output
- Context window usage tracking and compaction
Requirements
- Node.js 22+
- Ollama running locally
- A model with tool calling support (e.g. llama3.1:8b)
- ripgrep (rg) installed for Grep/Glob tools
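The requirements above can be sanity-checked with a small shell snippet (a sketch; the check helper is illustrative and not part of the project):

```shell
# Preflight check for Darkfoo Code's requirements (illustrative helper, not shipped with the project)
check() { command -v "$1" >/dev/null 2>&1 && echo "$1: ok" || echo "$1: missing"; }
check node     # Node.js 22+
check ollama   # local Ollama install
check rg       # ripgrep, used by the Grep/Glob tools
```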
Setup
npm install
ollama pull llama3.1:8b # or any tool-calling model
sudo dnf install ripgrep # Fedora — or your distro's equivalent
Usage
# Interactive REPL
npm run dev
# Single prompt
npm run dev -- -p "Read /etc/hostname"
# Global command (after npm link)
npm link
darkfoo
darkfoo -p "What files are in src/?"
darkfoo -m llama3.1:8b
Slash Commands
| Command | Description |
|---------|-------------|
| /help | List available commands |
| /model [name] | List or switch Ollama models |
| /clear | Clear conversation history |
| /compact | Compress conversation to save context |
| /cost | Show token usage for session |
| /context | Show context window usage with visual bar |
| /diff | Show uncommitted git changes |
| /commit [msg] | Stage and commit with AI-generated message |
| /history | List saved sessions |
| /resume <id> | Resume a saved session |
| /exit | Exit Darkfoo Code |
Prefix with ! to run shell commands directly (e.g. !ls -la, !git status).
Tools
| Tool | Description |
|------|-------------|
| Bash | Execute shell commands with timeout and output limits |
| Read | Read files with line numbers, offset/limit support |
| Write | Write files, creating parent directories as needed |
| Edit | Find-and-replace with quote normalization and diff output |
| Grep | Search file contents via ripgrep (regex, glob filtering) |
| Glob | Find files by pattern via ripgrep |
| WebFetch | Fetch URLs and return text content |
| EnterPlanMode | Enter structured planning mode (read-only exploration) |
| ExitPlanMode | Submit plan for user approval |
| TaskCreate | Create a tracked task |
| TaskUpdate | Update task status (pending/in_progress/completed) |
| TaskList | List all tasks |
| TaskGet | Get task details |
| AskUserQuestion | Prompt user for clarification |
Project Context (DARKFOO.md)
Place a DARKFOO.md file in your project root or ~/.darkfoo/DARKFOO.md for global instructions. Rules can also go in .darkfoo/rules/*.md. These are automatically loaded into the system prompt.
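For example, a minimal DARKFOO.md might look like this (contents are illustrative):

```markdown
# Project context for Darkfoo Code

- This is a TypeScript project; prefer npm scripts (npm run build, npm test) over raw tsc.
- Keep edits small and run the test suite before committing.
```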
Session Persistence
Sessions auto-save to ~/.darkfoo/sessions/. Use /history to browse and /resume <id> to continue a previous session.
Project Structure
src/
main.tsx CLI entry (Commander.js)
app.tsx Ink app wrapper + context
repl.tsx Interactive REPL screen
query.ts Generator-based conversation loop
ollama.ts Ollama /api/chat streaming client
types.ts Core message and event types
system-prompt.ts System prompt builder + DARKFOO.md loader
state.ts App state (plan mode, tasks)
session.ts Session save/load/list
context-loader.ts DARKFOO.md discovery and loading
permissions.ts Permission system + settings
commands/ Slash command implementations
tools/ Tool implementations (14 tools)
components/ React + Ink UI components
utils/ Theme, formatting, markdown renderer
Ollama Model Compatibility
Models must support Ollama's native tool calling. Tested and working:
- llama3.1:8b — recommended default, solid tool calling
- llama3.2:3b — works, but limited reasoning at 3B
Models that do not support tool calling: gemma2, phi4, deepseek-r1, gpt-oss.
Environment Variables
| Variable | Description | Default |
|----------|-------------|---------|
| OLLAMA_HOST | Ollama API base URL | http://localhost:11434 |
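For example, to point Darkfoo Code at a remote Ollama instance (the hostname below is illustrative):

```shell
# Use a remote Ollama server instead of the default http://localhost:11434
export OLLAMA_HOST=http://gpu-box.local:11434
# Then run as usual:
#   darkfoo -p "What files are in src/?"
```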
