@niiaassh/ollamacode v0.0.9
# ollamacode CLI
A conversational AI CLI tool powered by ollamacode with intelligent text editor capabilities and tool usage.
## Features
- 🤖 Conversational AI: Natural language interface powered by qwen-3
- 📝 Smart File Operations: AI automatically uses tools to view, create, and edit files
- ⚡ Bash Integration: Execute shell commands through natural conversation
- 🔧 Automatic Tool Selection: AI intelligently chooses the right tools for your requests
- 🔌 MCP Tools: Extend capabilities with Model Context Protocol servers (Linear, GitHub, etc.)
- 💬 Interactive UI: Beautiful terminal interface built with Ink
- 🌍 Global Installation: Install and use anywhere with `npm i -g @niiaassh/ollamacode`
## Installation

### Prerequisites

- Node.js 16+
- Ollama installed on your system (download it from https://ollama.com/download), with a model pulled:

```bash
ollama pull qwen3:8b
```

### Global Installation (Recommended)

```bash
npm install -g @niiaassh/ollamacode
```

### Local Development

```bash
git clone <repository>
cd ollamacode
npm install
npm run build
npm link
```

## Setup
Install Ollama on your local machine and pull the model qwen3:8b (it performed best in our testing against other models such as Mistral and Llama).

Start the conversational AI assistant:

```bash
ollamacode
```

Or specify a working directory:

```bash
ollamacode -d /path/to/project
```

## Headless Mode
Process a single prompt and exit (useful for scripting and automation):

```bash
ollamacode --prompt "show me the package.json file"
ollamacode -p "create a new file called example.js with a hello world function"
ollamacode --prompt "run npm test and show me the results" --directory /path/to/project
```

This mode is particularly useful for:
- CI/CD pipelines: Automate code analysis and file operations
- Scripting: Integrate AI assistance into shell scripts
- Terminal benchmarks: Perfect for tools like Terminal Bench that need non-interactive execution
- Batch processing: Process multiple prompts programmatically
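The batch-processing case can be sketched with a small shell loop (assuming `ollamacode` is on your PATH; each headless invocation processes its prompt and exits before the next starts):

```shell
# Run one headless prompt per source file in the current project.
for f in src/*.ts; do
  ollamacode -p "add JSDoc comments to $f" -d .
done
```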
## Model Selection

You can specify which AI model to use with the `--model` parameter or the `ollamacode_MODEL` environment variable.

Priority order: `--model` flag > `ollamacode_MODEL` environment variable > user settings > default (`ollamacode-4-latest`)
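For example, either form below selects a model (assuming `qwen3:8b` has already been pulled; the flag takes precedence if both are set):

```shell
# Highest priority: the command-line flag
ollamacode --model qwen3:8b

# Lower priority: the environment variable
export ollamacode_MODEL=qwen3:8b
ollamacode
```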
## Command Line Options

```
ollamacode [options]

Options:
  -V, --version          output the version number
  -d, --directory <dir>  set working directory
  -u, --base-url <url>   ollamacode API base URL (or set ollamacode_BASE_URL env var)
  -m, --model <model>    AI model to use (e.g., ollamacode-4-latest, ollamacode-3-latest) (or set ollamacode_MODEL env var)
  -p, --prompt <prompt>  process a single prompt and exit (headless mode)
  -h, --help             display help for command
```

## Custom Instructions
You can provide custom instructions to tailor ollamacode's behavior to your project by creating a `.ollamacode/ollamacode.md` file in your project directory:

```bash
mkdir .ollamacode
```

Create `.ollamacode/ollamacode.md` with your custom instructions:

```markdown
# Custom Instructions for ollamacode CLI

Always use TypeScript for any new code files.
When creating React components, use functional components with hooks.
Prefer const assertions and explicit typing over inference where it improves clarity.
Always add JSDoc comments for public functions and interfaces.
Follow the existing code style and patterns in this project.
```

ollamacode will automatically load and follow these instructions when working in your project directory. The custom instructions are added to ollamacode's system prompt and take priority over default behavior.
## MCP Tools

#### Add from JSON configuration:

```bash
ollamacode mcp add-json my-server '{"command": "node", "args": ["server.js"], "env": {"API_KEY": "your_key"}}'
```

#### Linear Integration Example
To add Linear MCP tools for project management:
This enables Linear tools like:
- Create and manage Linear issues
- Search and filter issues
- Update issue status and assignees
- Access team and project information
### Available Transport Types
- stdio: Run MCP server as a subprocess (most common)
- http: Connect to HTTP-based MCP server
- sse: Connect via Server-Sent Events
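As a sketch, a stdio server reuses the JSON shape from the add-json example above (a command for ollamacode to spawn). The `"url"` field shown for the HTTP case is an assumption about the configuration shape, not documented behavior:

```shell
# stdio: ollamacode spawns the MCP server as a subprocess
ollamacode mcp add-json local-tools '{"command": "node", "args": ["server.js"]}'

# http: connect to an already-running HTTP MCP server
# (the "url" field is an assumed configuration key)
ollamacode mcp add-json remote-tools '{"url": "https://example.com/mcp"}'
```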
## Development

```bash
# Install dependencies
npm install

# Development mode
npm run dev

# Build project
npm run build

# Run linter
npm run lint

# Type check
npm run typecheck
```

## Architecture
- Agent: Core command processing and execution logic
- Tools: Text editor and bash tool implementations
- UI: Ink-based terminal interface components
- Types: TypeScript definitions for the entire system
## License
MIT
