@roeintheglasses/re-script
v2.1.1
Advanced LLM-powered JavaScript unminifier and deobfuscator
re-Script
JavaScript unminifier and deobfuscator powered by AI.
re-Script transforms minified and obfuscated JavaScript into readable code using AI models such as Claude, GPT-4, or local LLMs. It combines traditional tools (webcrack, Babel) with LLM-powered variable and function renaming.
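As a small illustration, here is the kind of before/after transformation re-Script aims for (hypothetical snippet; the actual names chosen depend on the model):

```javascript
// Minified input:
const a = (b, c) => b.filter((d) => d.age >= c).map((d) => d.name);

// The kind of readable output re-Script produces (names picked by the LLM):
const getNamesOfAdults = (users, minimumAge) =>
  users
    .filter((user) => user.age >= minimumAge)
    .map((user) => user.name);

// Both behave identically:
const users = [{ name: "Ada", age: 36 }, { name: "Kit", age: 12 }];
console.log(getNamesOfAdults(users, 18)); // → [ 'Ada' ]
```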
Features
- Multi-LLM support (OpenAI, Anthropic, Ollama, Azure, Bedrock)
- Batch processing for multiple files or directories
- AST-aware code splitting
- Response caching to reduce API costs
- Error recovery when processing steps fail
- File-based configuration with CLI overrides
- Real-time progress tracking
- Dry run mode
- Watch mode for automatic processing
npm install -g @roeintheglasses/re-script
Usage
Quick Start
# Interactive setup (recommended for first time)
re-script init
# Process a single file
re-script app.min.js
# Process directory recursively
re-script src/ --recursive --output dist/
Configuration
Create config with interactive setup:
re-script init
Or manage config manually:
# Show current config
re-script config show
# Set values
re-script config set provider.name anthropic
re-script config set provider.apiKey "your-key"
# Validate config
re-script config validate
Examples
# Single file
re-script bundle.min.js
# Multiple files
re-script file1.min.js file2.min.js
# Directory with pattern
re-script src/ --pattern "*.min.js" --recursive
# Dry run to preview changes
re-script app.min.js --dry-run
# Custom output location
re-script app.min.js --output app.readable.js
# Exclude patterns
re-script src/ --recursive --exclude "node_modules/**" "*.test.js"
# High concurrency
re-script src/ --recursive --concurrency 10
Providers
# OpenAI
re-script app.min.js --provider openai --model gpt-4o
# Anthropic
re-script app.min.js --provider anthropic --model claude-3-5-sonnet-20241022
# Local Ollama
re-script app.min.js --provider ollama --model codellama:13b
# Azure OpenAI
re-script app.min.js --provider azure --model gpt-4o
Configuration
Config Files
re-Script looks for:
- .rescriptrc.json
- .rescriptrc.yaml
- rescript.config.js
- package.json (in rescript field)
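A minimal .rescriptrc.json might look like the following (key names inferred from the config set examples in this README; run re-script config list for the authoritative set of keys):

```json
{
  "provider": {
    "name": "anthropic",
    "model": "claude-3-5-sonnet-20241022"
  },
  "processing": {
    "concurrency": 4,
    "caching": { "backend": "file" }
  }
}
```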
Environment Variables
export ANTHROPIC_API_KEY=your-key-here
export OPENAI_API_KEY=your-key-here
export OLLAMA_BASE_URL=http://localhost:11434
export RESCRIPT_DEBUG=true
Config Commands
# Show current config
re-script config show
# Show with environment variables
re-script config show --env
# Set values
re-script config set provider.name openai
re-script config set provider.model gpt-4o
# Get values
re-script config get provider.name
# List all available keys
re-script config list
# Validate configuration
re-script config validate
Additional Commands
# Show usage examples
re-script examples
# Interactive setup wizard
re-script init
# Help and version
re-script --help
re-script --version
How It Works
re-Script processes files through a 4-step pipeline:
1. Webcrack Processing - Reverse bundling and deobfuscation
2. Babel Transformations - AST-based code improvements
3. LLM Processing - AI-powered variable/function renaming
4. Code Formatting - Final prettification
Each step can fail gracefully without breaking the pipeline. Responses are cached to reduce API costs.
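The pipeline's failure handling and caching can be sketched like this (the function names, cache shape, and stub steps are illustrative, not re-Script's actual API):

```javascript
// Sketch of a 4-step pipeline with graceful per-step failure and a
// response cache. Stub steps stand in for webcrack, Babel, the LLM
// pass, and the formatter.
const cache = new Map();

async function runStep(name, fn, code) {
  try {
    return await fn(code);
  } catch (err) {
    // A failed step is skipped; the pipeline continues with its input.
    console.warn(`${name} failed, passing code through: ${err.message}`);
    return code;
  }
}

async function processFile(source, steps) {
  if (cache.has(source)) return cache.get(source); // skip repeat API calls
  let code = source;
  for (const [name, fn] of steps) {
    code = await runStep(name, fn, code);
  }
  cache.set(source, code);
  return code;
}

// The "llm" stub fails on purpose to show the pipeline still completes.
const steps = [
  ["webcrack", async (c) => c],
  ["babel", async (c) => c],
  ["llm", async () => { throw new Error("rate limited"); }],
  ["format", async (c) => c.trim()],
];

processFile("  const a = 1;  ", steps).then((out) => console.log(out)); // → "const a = 1;"
```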
Supported Models
OpenAI: gpt-4o, gpt-4o-mini, gpt-4-turbo, gpt-4, gpt-3.5-turbo
Anthropic: claude-3-5-sonnet-20241022, claude-3-5-haiku-20241022, claude-3-opus-20240229, claude-3-sonnet-20240229, claude-3-haiku-20240307
Azure: Same as OpenAI models but hosted on Azure
Ollama: llama3:8b, llama3:70b, codellama:13b, codellama:34b, mistral:7b, deepseek-coder:6.7b
Development
Building from Source
git clone https://github.com/roeintheglasses/re-Script.git
cd re-Script
npm install
npm run build
npm link
Running Tests
npm test
npm run test:coverage
Development Mode
npm run dev # TypeScript watch mode
Troubleshooting
Common Issues
API Key Errors
re-script config set provider.apiKey your-key-here
# or
export ANTHROPIC_API_KEY=your-key-here
Rate Limiting
re-script config set processing.concurrency 1
re-script config set processing.retries.maxDelay 60000
Large Files
re-script config set processing.chunking.maxChunkSize 2000
re-script config set provider.maxTokens 4096
Memory Issues
re-script config set processing.caching.backend file
node --max-old-space-size=8192 $(which re-script) large-file.js
Debug Mode
re-script --verbose input.js
# or
export RESCRIPT_DEBUG=true
License
MIT License - see LICENSE file for details.
Contributing
Contributions welcome! Open an issue or submit a PR.
Support
- Issues: GitHub Issues
- Discussions: GitHub Discussions
