# @prism-lang/cli

v1.3.1
Command-line interface for the Prism programming language.
📚 Full Documentation | 🚀 Getting Started | 💻 CLI Guide
## Installation

```bash
# Install globally
npm install -g @prism-lang/cli

# Or use with npx
npx @prism-lang/cli
```

## Usage
### Run a Prism file

```bash
prism run myfile.prism

# Hot reload on changes
prism run --watch myfile.prism
```

### Start the REPL

```bash
prism repl
```

### Execute inline code

```bash
prism eval "x = 5 ~> 0.9; x"
```

### Send an LLM prompt

```bash
# Stream tokens from a specific provider/model
prism llm --provider claude --model claude-3-haiku --stream "Draft a haiku about autumn rain"

# Request reasoning metadata and disable structured output
prism llm --include-reasoning --no-structured-output "Explain why the sky is blue"
```

### Check version

```bash
prism --version
```

## Features
- Run Prism files: Execute `.prism` files from the command line (with optional `--watch` hot reload)
- Interactive REPL: Explore Prism interactively with confidence tracking
- Inline evaluation: Quick one-liners for testing
- LLM integration: Built-in support for AI providers with per-command overrides (provider, model, temperature, tokens, reasoning, structured output)
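As a sketch of how these features fit together, here is a hypothetical `.prism` file that reuses only syntax already shown in the eval examples in this README (`~>` to attach a confidence to a value, `<~` to extract it); the file name and contents are illustrative, not part of the package:

```prism
// check.prism (hypothetical example file)
// Attach a 0.95 confidence to a reading with `~>`,
// then extract the confidence-tracked value with `<~`.
temp = 72 ~> 0.95
<~ temp
```

You could run a file like this with `prism run check.prism`, or with `prism run --watch check.prism` to re-run it on every save.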
## Configuration

Set up LLM providers with environment variables:

```bash
# Anthropic Claude
export CLAUDE_API_KEY=your-key

# Google Gemini
export GEMINI_API_KEY=your-key
```

## Examples
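Putting configuration and a command together, a typical session might look like this. The key value is a placeholder, and only flags shown elsewhere in this README are used:

```shell
# Placeholder key; substitute your real Anthropic key
export CLAUDE_API_KEY=your-key

# Use the configured provider explicitly
prism llm --provider claude "Summarize today's weather in one line"
```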
```bash
# Run a file with AI safety checks
prism run safety-check.prism

# Enable hot reload for local development
prism run --watch app.prism

# Start REPL with confidence tracking
prism repl

# Quick calculation with uncertainty
prism eval "temp = 72 ~> 0.95; <~ temp"
```

## Related Packages
- `@prism-lang/core` - Core language implementation
- `@prism-lang/llm` - LLM provider integrations
- `@prism-lang/confidence` - Confidence extraction utilities
- `@prism-lang/validator` - Validation toolkit
- `@prism-lang/repl` - Interactive REPL
## License
MIT
