@mcoda/codali v0.1.71
@mcoda/codali
Standalone tool-runner adapter for mcoda. codali runs a tool loop that can edit the repo directly via built-in tools, with optional streaming output and pre-flight cost estimates.
Usage
Run with a task file:
codali run --workspace-root . --provider openai-compatible --model gpt-4o-mini --task tasks/work.txt

Or pass the task on stdin:
echo "Fix the failing test" | codali run --workspace-root . --provider ollama-remote --model llama3

Smart pipeline
Enable the multi-phase pipeline with --smart (librarian/architect/builder/critic):
codali run --smart --workspace-root . --provider openai-compatible --model gpt-4o-mini --task tasks/work.txt

You can also set CODALI_SMART=1 in the environment.
Providers
- openai-compatible (default base URL: https://api.openai.com/v1)
- ollama-remote (default base URL: http://127.0.0.1:11434)
- stub (test-only provider)
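For illustration, the provider defaults above can be expressed as a lookup table. This is a sketch of the documented behavior, not codali's actual resolution code; the override handling is an assumption based on the CODALI_BASE_URL variable described below.

```typescript
// Sketch: default base URLs per provider, as documented above.
// codali's internal resolution logic may differ.
type Provider = "openai-compatible" | "ollama-remote" | "stub";

const DEFAULT_BASE_URLS: Record<Provider, string | undefined> = {
  "openai-compatible": "https://api.openai.com/v1",
  "ollama-remote": "http://127.0.0.1:11434",
  stub: undefined, // test-only provider; no network endpoint
};

// An explicit base URL (e.g. from CODALI_BASE_URL) would take priority.
function resolveBaseUrl(provider: Provider, override?: string): string | undefined {
  return override ?? DEFAULT_BASE_URLS[provider];
}
```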
Config and env
codali merges config in this order:
- CLI args
- codali.config.json or .codalirc
- Environment variables
Common environment variables:
- CODALI_WORKSPACE_ROOT
- CODALI_PROVIDER
- CODALI_MODEL
- CODALI_API_KEY
- CODALI_BASE_URL
- CODALI_STREAMING_ENABLED
- CODALI_STREAMING_FLUSH_MS
- CODALI_COST_MAX_PER_RUN
- CODALI_COST_CHAR_PER_TOKEN
- CODALI_COST_PRICING_OVERRIDES
- CODALI_TOOLS_ENABLED
- CODALI_ALLOW_SHELL
- CODALI_SHELL_ALLOWLIST
- DOCDEX_HTTP_BASE_URL
- CODALI_DOCDEX_REPO_ID
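The merge order above can be sketched as a simple spread, assuming earlier sources take precedence (CLI args over config file over environment). The precedence direction is an assumption drawn from the ordering of the list; the field names are illustrative.

```typescript
// Sketch of the documented merge order, assuming sources listed
// earlier (CLI args) win over those listed later (env vars).
type Config = Partial<{ provider: string; model: string; baseUrl: string }>;

function mergeConfig(cli: Config, file: Config, env: Config): Config {
  // Later spreads overwrite earlier ones, so spread lowest-precedence first.
  return { ...env, ...file, ...cli };
}
```

For example, a --model flag would shadow a model set in codali.config.json, which in turn would shadow CODALI_MODEL.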
Routing config (per phase, optional) lives in codali.config.json:
{
"routing": {
"librarian": { "agent": "<librarian-agent-slug>", "temperature": 0.1 },
"architect": { "agent": "<architect-agent-slug>", "temperature": 0.4 },
"builder": { "agent": "<builder-agent-slug>", "temperature": 0.2, "format": "json" },
"critic": { "agent": "<critic-agent-slug>", "temperature": 0.1 },
"interpreter": { "agent": "<interpreter-agent-slug>", "temperature": 0.1 }
},
"limits": {
"maxRetries": 3
}
}

To enforce a GBNF grammar with Ollama:
{
"routing": {
"builder": { "format": "gbnf", "grammar": "root ::= \"ok\"" }
}
}

Routing env overrides:
- CODALI_PROVIDER_LIBRARIAN
- CODALI_PROVIDER_ARCHITECT
- CODALI_PROVIDER_BUILDER
- CODALI_PROVIDER_CRITIC
- CODALI_PROVIDER_INTERPRETER
- CODALI_MODEL_LIBRARIAN
- CODALI_MODEL_ARCHITECT
- CODALI_MODEL_BUILDER
- CODALI_MODEL_CRITIC
- CODALI_MODEL_INTERPRETER
- CODALI_FORMAT_LIBRARIAN
- CODALI_FORMAT_ARCHITECT
- CODALI_FORMAT_BUILDER
- CODALI_FORMAT_CRITIC
- CODALI_FORMAT_INTERPRETER
- CODALI_GRAMMAR_LIBRARIAN
- CODALI_GRAMMAR_ARCHITECT
- CODALI_GRAMMAR_BUILDER
- CODALI_GRAMMAR_CRITIC
- CODALI_GRAMMAR_INTERPRETER
- CODALI_LIMIT_MAX_RETRIES
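The variables above follow a CODALI_&lt;FIELD&gt;_&lt;PHASE&gt; naming pattern, so a per-phase override can be derived mechanically. The sketch below shows one way that mapping might look; the PhaseRoute shape and the way codali actually applies these overrides are assumptions for illustration.

```typescript
// Sketch: derive a per-phase route from the env override variables above.
// The PhaseRoute shape is assumed, not taken from codali's source.
type Phase = "librarian" | "architect" | "builder" | "critic" | "interpreter";

interface PhaseRoute {
  provider?: string;
  model?: string;
  format?: string;
  grammar?: string;
}

function routeFromEnv(phase: Phase, env: Record<string, string | undefined>): PhaseRoute {
  const key = phase.toUpperCase();
  return {
    provider: env[`CODALI_PROVIDER_${key}`],
    model: env[`CODALI_MODEL_${key}`],
    format: env[`CODALI_FORMAT_${key}`],
    grammar: env[`CODALI_GRAMMAR_${key}`],
  };
}
```

Fields left unset in the environment would presumably fall back to the routing block in codali.config.json.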
Logs
codali writes JSONL logs under logs/codali/<runId>.jsonl, rooted in the global workspace folder
(~/.mcoda/workspaces/<workspace>/).
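Given that layout, locating and parsing a run's log is straightforward. The path construction below follows the layout described above; the shape of individual log entries is not documented here, so they are parsed as opaque JSON values.

```typescript
import * as os from "os";
import * as path from "path";

// Build the log path described above:
// ~/.mcoda/workspaces/<workspace>/logs/codali/<runId>.jsonl
function runLogPath(workspace: string, runId: string): string {
  return path.join(
    os.homedir(), ".mcoda", "workspaces", workspace,
    "logs", "codali", `${runId}.jsonl`,
  );
}

// JSONL: one JSON document per non-empty line.
function parseJsonl(text: string): unknown[] {
  return text
    .split("\n")
    .filter((line) => line.trim() !== "")
    .map((line) => JSON.parse(line));
}
```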
Docdex integration
codali calls docdex over HTTP for search/snippets/graphs and uses MCP for symbols/AST/memory. Ensure docdexd is running and DOCDEX_HTTP_BASE_URL is set if needed.
