@moonforge/moon
v0.19.5
A powerful game development AI agent that can build complete games from scratch. Expert in Unity, Unreal Engine, and Godot.
Moon by MoonForge
Features
- Multi-Model: choose from a wide range of LLMs or add your own via OpenAI- or Anthropic-compatible APIs
- Flexible: switch LLMs mid-session while preserving context
- Session-Based: maintain multiple work sessions and contexts per project
- LSP-Enhanced: Moon uses LSPs for additional context, just like you do
- Extensible: add capabilities via MCPs (`http`, `stdio`, and `sse`)
- Works Everywhere: first-class support in every terminal on macOS, Linux, Windows (PowerShell and WSL), FreeBSD, OpenBSD, and NetBSD
Installation
Use a package manager:

```bash
# Homebrew
brew install moonforge/tap/moon

# NPM
npm install -g @moonforge/moon

# Arch Linux (btw)
yay -S moon-bin

# Nix
nix run github:numtide/nix-ai-tools#moon
```

Windows users:

```bash
# Winget
winget install moonforge.moon

# Scoop
scoop bucket add moonforge https://github.com/moonforge/scoop-bucket.git
scoop install moon
```

Moon is available via NUR in `nur.repos.moonforge.moon`.
You can also try out Moon via `nix-shell`:

```bash
# Add the NUR channel.
nix-channel --add https://github.com/nix-community/NUR/archive/main.tar.gz nur
nix-channel --update

# Get Moon in a Nix shell.
nix-shell -p '(import <nur> { pkgs = import <nixpkgs> {}; }).repos.moonforge.moon'
```

NixOS & Home Manager Module Usage via NUR

Moon provides NixOS and Home Manager modules via NUR, and you can import them directly in your flake. The module auto-detects whether it's running in a NixOS or Home Manager context, so the import works the same way in both. :)
```nix
{
  inputs = {
    nixpkgs.url = "github:NixOS/nixpkgs/nixos-unstable";
    nur.url = "github:nix-community/NUR";
  };
  outputs = { self, nixpkgs, nur, ... }: {
    nixosConfigurations.your-hostname = nixpkgs.lib.nixosSystem {
      system = "x86_64-linux";
      modules = [
        nur.modules.nixos.default
        nur.repos.moonforge.modules.moon
        {
          programs.moon = {
            enable = true;
            settings = {
              providers = {
                openai = {
                  id = "openai";
                  name = "OpenAI";
                  base_url = "https://api.openai.com/v1";
                  type = "openai";
                  api_key = "sk-fake123456789abcdef...";
                  models = [
                    {
                      id = "gpt-4";
                      name = "GPT-4";
                    }
                  ];
                };
              };
              lsp = {
                go = { command = "gopls"; enabled = true; };
                nix = { command = "nil"; enabled = true; };
              };
              options = {
                context_paths = [ "/etc/nixos/configuration.nix" ];
                tui = { compact_mode = true; };
                debug = false;
              };
            };
          };
        }
      ];
    };
  };
}
```

Debian/Ubuntu:

```bash
sudo mkdir -p /etc/apt/keyrings
curl -fsSL https://repo.moonforge.com/apt/gpg.key | sudo gpg --dearmor -o /etc/apt/keyrings/moonforge.gpg
echo "deb [signed-by=/etc/apt/keyrings/moonforge.gpg] https://repo.moonforge.com/apt/ * *" | sudo tee /etc/apt/sources.list.d/moonforge.list
sudo apt update && sudo apt install moon
```

Fedora/RHEL:

```bash
echo '[moonforge]
name=MoonForge
baseurl=https://repo.moonforge.com/yum/
enabled=1
gpgcheck=1
gpgkey=https://repo.moonforge.com/yum/gpg.key' | sudo tee /etc/yum.repos.d/moonforge.repo
sudo yum install moon
```

Or, download it:
- Packages are available in Debian and RPM formats
- Binaries are available for Linux, macOS, Windows, FreeBSD, OpenBSD, and NetBSD
Or just install it with Go:

```bash
go install github.com/moonforge/moon@latest
```

[!NOTE] Productivity may increase when using Moon. If you have questions, join the [Discord][discord].
Getting Started
The quickest way to get started is to grab an API key from your preferred provider, such as Anthropic, OpenAI, Groq, or OpenRouter, and start Moon. You'll be prompted to enter your API key.
That said, you can also set environment variables for preferred providers.
| Environment Variable | Provider |
| --------------------------- | -------------------------------------------------- |
| ANTHROPIC_API_KEY | Anthropic |
| OPENAI_API_KEY | OpenAI |
| OPENROUTER_API_KEY | OpenRouter |
| GEMINI_API_KEY | Google Gemini |
| CEREBRAS_API_KEY | Cerebras |
| HF_TOKEN | Huggingface Inference |
| VERTEXAI_PROJECT | Google Cloud VertexAI (Gemini) |
| VERTEXAI_LOCATION | Google Cloud VertexAI (Gemini) |
| GROQ_API_KEY | Groq |
| AWS_ACCESS_KEY_ID | AWS Bedrock (Claude) |
| AWS_SECRET_ACCESS_KEY | AWS Bedrock (Claude) |
| AWS_REGION | AWS Bedrock (Claude) |
| AWS_PROFILE | AWS Bedrock (Custom Profile) |
| AWS_BEARER_TOKEN_BEDROCK | AWS Bedrock |
| AZURE_OPENAI_API_ENDPOINT | Azure OpenAI models |
| AZURE_OPENAI_API_KEY | Azure OpenAI models (optional when using Entra ID) |
| AZURE_OPENAI_API_VERSION | Azure OpenAI models |
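For example, to launch Moon with Anthropic configured via the environment (the key value below is a placeholder, not a real credential):

```bash
export ANTHROPIC_API_KEY="sk-ant-your-key-here"
moon
```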
Configuration
Moon runs great with no configuration. That said, if you do need or want to customize Moon, configuration can be added either local to the project itself, or globally, with the following priority:
1. `.moon.json`
2. `moon.json`
3. `$HOME/.config/moon/moon.json`
Configuration itself is stored as a JSON object:

```json
{
  "this-setting": { "this": "that" },
  "that-setting": ["ceci", "cela"]
}
```

Moon also stores ephemeral data, such as application state, in one additional location:

```bash
# Unix
$HOME/.local/share/moon/moon.json

# Windows
%LOCALAPPDATA%\moon\moon.json
```

LSPs
Moon can use LSPs for additional context to help inform its decisions, just like you would. LSPs can be added manually like so:
```json
{
  "$schema": "https://moonforge.com/moon.json",
  "lsp": {
    "go": {
      "command": "gopls",
      "env": {
        "GOTOOLCHAIN": "go1.24.5"
      }
    },
    "typescript": {
      "command": "typescript-language-server",
      "args": ["--stdio"]
    },
    "nix": {
      "command": "nil"
    }
  }
}
```

MCPs
Moon also supports Model Context Protocol (MCP) servers through three
transport types: stdio for command-line servers, http for HTTP endpoints,
and sse for Server-Sent Events. Environment variable expansion is supported
using $(echo $VAR) syntax.
```json
{
  "$schema": "https://moonforge.com/moon.json",
  "mcp": {
    "filesystem": {
      "type": "stdio",
      "command": "node",
      "args": ["/path/to/mcp-server.js"],
      "timeout": 120,
      "disabled": false,
      "env": {
        "NODE_ENV": "production"
      }
    },
    "github": {
      "type": "http",
      "url": "https://api.githubcopilot.com/mcp/",
      "timeout": 120,
      "disabled": false,
      "headers": {
        "Authorization": "Bearer $GH_PAT"
      }
    },
    "streaming-service": {
      "type": "sse",
      "url": "https://example.com/mcp/sse",
      "timeout": 120,
      "disabled": false,
      "headers": {
        "API-Key": "$(echo $API_KEY)"
      }
    }
  }
}
```

Ignoring Files
Moon respects .gitignore files by default, but you can also create a
.moonignore file to specify additional files and directories that Moon
should ignore. This is useful for excluding files that you want in version
control but don't want Moon to consider when providing context.
The .moonignore file uses the same syntax as .gitignore and can be placed
in the root of your project or in subdirectories.
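For example, a `.moonignore` might exclude generated output and large fixtures that belong in version control but add noise as context (the patterns below are illustrative, not prescribed):

```gitignore
# Generated artifacts Moon doesn't need to read
dist/
coverage/

# Large binary fixtures kept in the repo
testdata/*.bin
```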
Allowing Tools
By default, Moon will ask you for permission before running tool calls. If you'd like, you can allow tools to be executed without prompting you for permissions. Use this with care.
```json
{
  "$schema": "https://moonforge.com/moon.json",
  "permissions": {
    "allowed_tools": [
      "view",
      "ls",
      "grep",
      "edit",
      "mcp_context7_get-library-doc"
    ]
  }
}
```

You can also skip all permission prompts entirely by running Moon with the `--yolo` flag. Be very, very careful with this feature.
Initialization
When you initialize a project, Moon analyzes your codebase and creates
a context file that helps it work more effectively in future sessions.
By default, this file is named AGENTS.md, but you can customize the
name and location with the initialize_as option:
```json
{
  "$schema": "https://moonforge.com/moon.json",
  "options": {
    "initialize_as": "AGENTS.md"
  }
}
```

This is useful if you prefer a different naming convention or want to
place the file in a specific directory (e.g., MOON.md or
docs/LLMs.md). Moon will fill the file with project-specific context
like build commands, code patterns, and conventions it discovered during
initialization.
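For illustration, a generated context file might look something like this (the contents here are hypothetical; Moon derives the real ones from your project):

```markdown
# Project Context

## Build & Test
- Build: `make build`
- Test: `make test`

## Conventions
- Go code is formatted with `gofumpt`
- Errors are wrapped with `fmt.Errorf("...: %w", err)`
```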
Attribution Settings
By default, Moon adds attribution information to Git commits and pull requests
it creates. You can customize this behavior with the attribution option:
```json
{
  "$schema": "https://moonforge.com/moon.json",
  "options": {
    "attribution": {
      "trailer_style": "co-authored-by",
      "generated_with": true
    }
  }
}
```

- trailer_style: Controls the attribution trailer added to commit messages (default: `co-authored-by`)
  - `co-authored-by`: Adds `Co-Authored-By: Moon <[email protected]>`
  - `assisted-by`: Adds `Assisted-by: [Model Name] via Moon` (includes the model name)
  - `none`: No attribution trailer
- generated_with: When true (default), adds a `💘 Generated with Moon` line to commit messages and PR descriptions
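With the default settings, a commit message created by Moon would end with something like the following (the subject line is a made-up example; the trailer and attribution lines are the documented defaults):

```text
Fix race condition in session store

💘 Generated with Moon

Co-Authored-By: Moon <[email protected]>
```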
Custom Providers
Moon supports custom provider configurations for both OpenAI-compatible and Anthropic-compatible APIs.
[!NOTE] Note that we support two "types" for OpenAI. Make sure to choose the right one to ensure the best experience!
- `openai` should be used when proxying or routing requests through OpenAI.
- `openai-compat` should be used with non-OpenAI providers that expose OpenAI-compatible APIs.
OpenAI-Compatible APIs
Here’s an example configuration for Deepseek, which uses an OpenAI-compatible
API. Don't forget to set DEEPSEEK_API_KEY in your environment.
```json
{
  "$schema": "https://moonforge.com/moon.json",
  "providers": {
    "deepseek": {
      "type": "openai-compat",
      "base_url": "https://api.deepseek.com/v1",
      "api_key": "$DEEPSEEK_API_KEY",
      "models": [
        {
          "id": "deepseek-chat",
          "name": "Deepseek V3",
          "cost_per_1m_in": 0.27,
          "cost_per_1m_out": 1.1,
          "cost_per_1m_in_cached": 0.07,
          "cost_per_1m_out_cached": 1.1,
          "context_window": 64000,
          "default_max_tokens": 5000
        }
      ]
    }
  }
}
```

Anthropic-Compatible APIs
Custom Anthropic-compatible providers follow this format:
```json
{
  "$schema": "https://moonforge.com/moon.json",
  "providers": {
    "custom-anthropic": {
      "type": "anthropic",
      "base_url": "https://api.anthropic.com/v1",
      "api_key": "$ANTHROPIC_API_KEY",
      "extra_headers": {
        "anthropic-version": "2023-06-01"
      },
      "models": [
        {
          "id": "claude-sonnet-4-20250514",
          "name": "Claude Sonnet 4",
          "cost_per_1m_in": 3,
          "cost_per_1m_out": 15,
          "cost_per_1m_in_cached": 3.75,
          "cost_per_1m_out_cached": 0.3,
          "context_window": 200000,
          "default_max_tokens": 50000,
          "can_reason": true,
          "supports_attachments": true
        }
      ]
    }
  }
}
```

Amazon Bedrock
Moon currently supports running Anthropic models through Bedrock, with caching disabled.
- A Bedrock provider will appear once you have AWS configured, i.e. `aws configure`
- Moon also expects `AWS_REGION` or `AWS_DEFAULT_REGION` to be set
- To use a specific AWS profile, set `AWS_PROFILE` in your environment, i.e. `AWS_PROFILE=myprofile moon`
- As an alternative to `aws configure`, you can also just set `AWS_BEARER_TOKEN_BEDROCK`
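Putting those pieces together, a one-off launch against Bedrock with a named profile and explicit region might look like this (profile and region are placeholders):

```bash
AWS_PROFILE=myprofile AWS_REGION=us-east-1 moon
```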
Vertex AI Platform
Vertex AI will appear in the list of available providers when VERTEXAI_PROJECT and VERTEXAI_LOCATION are set. You will also need to be authenticated:
```bash
gcloud auth application-default login
```

To add specific models to the configuration, configure as such:

```json
{
  "$schema": "https://moonforge.com/moon.json",
  "providers": {
    "vertexai": {
      "models": [
        {
          "id": "claude-sonnet-4@20250514",
          "name": "VertexAI Sonnet 4",
          "cost_per_1m_in": 3,
          "cost_per_1m_out": 15,
          "cost_per_1m_in_cached": 3.75,
          "cost_per_1m_out_cached": 0.3,
          "context_window": 200000,
          "default_max_tokens": 50000,
          "can_reason": true,
          "supports_attachments": true
        }
      ]
    }
  }
}
```

Local Models
Local models can also be configured via OpenAI-compatible API. Here are two common examples:
Ollama
```json
{
  "providers": {
    "ollama": {
      "name": "Ollama",
      "base_url": "http://localhost:11434/v1/",
      "type": "openai-compat",
      "models": [
        {
          "name": "Qwen 3 30B",
          "id": "qwen3:30b",
          "context_window": 256000,
          "default_max_tokens": 20000
        }
      ]
    }
  }
}
```

LM Studio

```json
{
  "providers": {
    "lmstudio": {
      "name": "LM Studio",
      "base_url": "http://localhost:1234/v1/",
      "type": "openai-compat",
      "models": [
        {
          "name": "Qwen 3 30B",
          "id": "qwen/qwen3-30b-a3b-2507",
          "context_window": 256000,
          "default_max_tokens": 20000
        }
      ]
    }
  }
}
```

Logging
Sometimes you need to look at logs. Luckily, Moon logs all sorts of
stuff. Logs are stored in ./.moon/logs/moon.log relative to the project.
The CLI also contains some helper commands to make perusing recent logs easier:
```bash
# Print the last 1000 lines
moon logs

# Print the last 500 lines
moon logs --tail 500

# Follow logs in real time
moon logs --follow
```

Want more logging? Run moon with the `--debug` flag, or enable it in the config:

```json
{
  "$schema": "https://moonforge.com/moon.json",
  "options": {
    "debug": true,
    "debug_lsp": true
  }
}
```

Provider Auto-Updates
By default, Moon automatically checks for the latest and greatest list of providers and models from Catwalk, the open source Moon provider database. This means that when new providers and models are available, or when model metadata changes, Moon automatically updates your local configuration.
Disabling automatic provider updates
For those with restricted internet access, or those who prefer to work in air-gapped environments, this might not be what you want, and the feature can be disabled.
To disable automatic provider updates, set disable_provider_auto_update in
your moon.json config:
```json
{
  "$schema": "https://moonforge.com/moon.json",
  "options": {
    "disable_provider_auto_update": true
  }
}
```

Or set the MOON_DISABLE_PROVIDER_AUTO_UPDATE environment variable:

```bash
export MOON_DISABLE_PROVIDER_AUTO_UPDATE=1
```

Manually updating providers
Manually updating providers is possible with the moon update-providers
command:
```bash
# Update providers remotely from Catwalk.
moon update-providers

# Update providers from a custom Catwalk base URL.
moon update-providers https://example.com/

# Update providers from a local file.
moon update-providers /path/to/local-providers.json

# Reset providers to the version embedded in Moon at build time.
moon update-providers embedded

# For more info:
moon update-providers --help
```

Metrics
Moon records pseudonymous usage metrics (tied to a device-specific hash), which maintainers rely on to inform development and support priorities. The metrics include solely usage metadata; prompts and responses are NEVER collected.
Details on exactly what’s collected are in the source code (here and here).
You can opt out of metrics collection at any time by setting the following environment variable:

```bash
export MOON_DISABLE_METRICS=1
```

Or by setting the following in your config:

```json
{
  "options": {
    "disable_metrics": true
  }
}
```

Moon also respects the DO_NOT_TRACK convention, which can be enabled via `export DO_NOT_TRACK=1`.
A Note on Claude Max and GitHub Copilot
Moon only supports model providers through official, compliant APIs. We do not support or endorse any methods that rely on personal Claude Max and GitHub Copilot accounts or OAuth workarounds, which violate Anthropic and Microsoft’s Terms of Service.
We’re committed to building sustainable, trusted integrations with model providers. If you’re a provider interested in working with us, reach out.
Contributing
See the contributing guide.
Community
We’d love to hear your thoughts on this project. You can find us on:
License
Part of MoonForge.
MoonForge • MoonForge loves open source
