piensa v1.2.1
# Piensa 🧠
A simple CLI tool to interact easily with LLMs like OpenAI and Anthropic. Pipe in prompts, text, or files to quickly summarize, calculate, or get insights.
## Installation

```bash
npm install -g piensa
```

## Usage
### Quick Examples
Ask a simple question:

```bash
piensa "What's the capital of France?"
```

Pipe a calculation:

```bash
echo "2 + 2" | piensa "calculate"
```

Summarize a file:

```bash
cat myfile.txt | piensa "summarize"
```

Stream responses as they're generated:

```bash
piensa "Tell me a story" --stream
```

### Options
Use a specific provider or model:

```bash
piensa "Question?" --provider openai
piensa "Question?" --model gpt-4
```

Set your API key (stored for reuse):

```bash
piensa "Question?" --key your-api-key
```

Stream the response in real-time:

```bash
piensa "Question?" --stream
# or use the short form
piensa "Question?" -s
```

## Config
Set up your preferences:

```bash
piensa --config
```

Check your current config:

```bash
piensa config
```

Set a default model:

```bash
piensa set-model openai gpt-4-turbo
```

Check your default model:

```bash
piensa get-model openai
```

### Config Location
- macOS: `~/Library/Preferences/piensa-nodejs/config.json`
- Linux: `~/.config/piensa-nodejs/config.json`
- Windows: `%APPDATA%\piensa-nodejs\Config\config.json`
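If you want to tighten access beyond the default user permissions, you can restrict the config file to your own user. A minimal sketch using the Linux path from the list above (adjust the path for macOS; Windows uses ACLs instead of `chmod`):

```shell
# Restrict the piensa config file so only your user can read or write it.
# The path is the Linux location listed above; adjust for your OS.
CONFIG="$HOME/.config/piensa-nodejs/config.json"
if [ -f "$CONFIG" ]; then
  chmod 600 "$CONFIG"   # owner read/write only, no group/other access
fi
```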
## Security Note
⚠️ API Key Security: API keys are stored in plain text in the configuration file. While these files are protected by your operating system's user permissions, please be aware of the following risks:
- Do not share your configuration directory or backups containing these files
- Be cautious when using on shared computers
- Consider using environment variables for API keys in sensitive environments
- If your machine is compromised, an attacker could potentially access these keys
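A minimal sketch of the environment-variable approach. Note that `OPENAI_API_KEY` is the provider's conventional variable name and an assumption here; check `piensa --help` to confirm which variable names the tool actually reads:

```shell
# Export the key for this shell session only, instead of storing it
# in the config file. OPENAI_API_KEY is the provider's conventional
# name, assumed here rather than confirmed for piensa.
export OPENAI_API_KEY="your-api-key"   # placeholder value
piensa "What's the capital of France?"
```

Because the variable lives only in the current session, nothing key-related is written to disk.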
## Supported Providers
- OpenAI (default)
- Anthropic
## Development
Clone and run locally:

```bash
git clone https://github.com/cesarvarela/piensa.git
cd piensa
npm install
npm run build
npm start "Your prompt here"
```

## License
ISC
