# dev-mate-cli
A command-line tool that leverages OpenAI's Chat Completion API to document code with the assistance of AI models.
Watch the demo video to see these features in action.
## Features
- Source Code Documentation: Automatically generate comments and documentation for your source code.
- Multiple File Processing: Handle one or multiple files in a single command.
- Model Selection: Use the AI model of your choice with the `--model` flag.
- Custom Output: Write the results to a file with the `--output` flag, or display them in the console.
- Stream Output: Stream the LLM response to the command line with the `--stream` flag.
## Installation
```bash
npm install -g dev-mate-cli
```

## Environment Variables
dev-mate-cli needs `API_KEY` and `BASE_URL` to generate responses. Store these variables in a `.env` file in the current directory, and make sure both values come from the same OpenAI-compatible completion API provider.
```
API_KEY=your_api_key
BASE_URL=https://api.openai.com/v1
```

Popular providers: OpenRouter, Groq, OpenAI.
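Because both values are handed to the same OpenAI-compatible endpoint, they have to match. As a rough illustration (not dev-mate-cli's actual source), a Node.js tool could wire these variables into a client like this, assuming the `dotenv` and `openai` packages:

```typescript
// Illustrative sketch only: building an OpenAI-compatible client
// from the two environment variables above.
import 'dotenv/config';          // loads API_KEY and BASE_URL from .env
import OpenAI from 'openai';

const client = new OpenAI({
  apiKey: process.env.API_KEY,   // key issued by your provider
  baseURL: process.env.BASE_URL, // e.g. https://api.openai.com/v1, or an OpenRouter/Groq endpoint
});
```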
## Usage

### Basic Usage
To run the tool, specify one or more source files or folders as input:
```bash
dev-mate-cli ./examples/file.js
```

For processing multiple files:

```bash
dev-mate-cli ./examples/file.js ./examples/file.cpp
```

For processing folders:

```bash
dev-mate-cli ./examples
```

### Command-line Options
- `-m, --model <model-name>`: Choose the AI model to use (default: `google/gemma-2-9b-it:free` from OpenRouter).

  ```bash
  dev-mate-cli file.js -m "openai/gpt-4o-mini"
  ```

- `-o, --output <output-file>`: Write the output to a specified file.

  ```bash
  dev-mate-cli file.js -o output.js
  ```

- `-t, --temperature <value>`: Set the creativity level of the AI model (default: 0.7).

  ```bash
  dev-mate-cli file.js -t 1.1
  ```

- `-u, --token-usage`: Display token usage information.

  ```bash
  dev-mate-cli file.js -u
  ```

- `-s, --stream`: Stream the response to the command line.

  ```bash
  dev-mate-cli file.js -s
  ```
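To make the flag semantics concrete, here is a hypothetical sketch of the kind of Chat Completions request these options map to. The prompt wording and file name are invented for illustration; this is not dev-mate-cli's actual code:

```typescript
import { readFile } from 'node:fs/promises';
import OpenAI from 'openai';

// Client configured from API_KEY / BASE_URL as in the Environment Variables section.
const client = new OpenAI({ apiKey: process.env.API_KEY, baseURL: process.env.BASE_URL });

const source = await readFile('file.js', 'utf8');

const response = await client.chat.completions.create({
  model: 'openai/gpt-4o-mini',   // selected via --model
  temperature: 0.7,              // selected via --temperature; --stream would add stream: true
  messages: [
    { role: 'system', content: 'Add documentation comments to the following source code.' },
    { role: 'user', content: source },
  ],
});

console.log(response.choices[0].message.content); // documented code (or written to the --output file)
console.log(response.usage);                      // token counts like those shown by --token-usage
```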
### Additional Commands
- Check Version: To check the current version of the tool, use:

  ```bash
  dev-mate-cli --version
  ```

- Help: Display the help message listing all available options:

  ```bash
  dev-mate-cli --help
  ```
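For contributors curious how this command-line surface could be declared, below is a minimal sketch using the `commander` package. Whether dev-mate-cli itself uses commander is an assumption; this is only one plausible wiring, not the tool's actual implementation:

```typescript
import { Command } from 'commander';

const program = new Command();

program
  .name('dev-mate-cli')
  .description('Document source code with the help of an LLM')
  .version('1.0.3')                                        // handles --version
  .argument('<paths...>', 'source files or folders to document')
  .option('-m, --model <model-name>', 'AI model to use', 'google/gemma-2-9b-it:free')
  .option('-o, --output <output-file>', 'write the output to a file')
  .option('-t, --temperature <value>', 'creativity level of the AI model', '0.7')
  .option('-u, --token-usage', 'display token usage information')
  .option('-s, --stream', 'stream the response to the command line');

program.parse();                   // commander generates --help automatically

const options = program.opts();    // { model, output, temperature, tokenUsage, stream }
const paths = program.args;        // positional file/folder arguments
```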
## Examples
Document a JavaScript file and save the result:
```bash
dev-mate-cli ./examples/file.js --output file-documented.js --model google/gemini-flash-8b-1.5-exp
```

Process multiple files and print output to the console:

```bash
dev-mate-cli ./examples/file.js ./examples/file.py --model google/gemini-flash-8b-1.5-exp
```
## LLM Configuration
To configure the LLM through a file, create a dotfile named `.dev-mate-cli.toml` in your home directory.
Ex: `~/.dev-mate-cli.toml`:

```toml
model = "gpt-4o"
temperature = "1"
```
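As an illustration of how such a file could be read, here is a minimal TypeScript sketch assuming a TOML parser such as `@iarna/toml`; the `loadConfig` helper and its fallback behaviour are hypothetical, not dev-mate-cli's actual loader:

```typescript
import { readFile } from 'node:fs/promises';
import { homedir } from 'node:os';
import { join } from 'node:path';
import * as TOML from '@iarna/toml';

// Hypothetical helper: read ~/.dev-mate-cli.toml, or fall back to an empty
// config (command-line flags and defaults) if the file does not exist.
async function loadConfig(): Promise<Record<string, unknown>> {
  const configPath = join(homedir(), '.dev-mate-cli.toml');
  try {
    return TOML.parse(await readFile(configPath, 'utf8'));
  } catch {
    return {};
  }
}

const { model, temperature } = await loadConfig();
console.log(model, temperature); // e.g. "gpt-4o" and "1" from the example above
```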
## Contributing

Contributions are welcome! If you find a bug or have an idea for an improvement, feel free to open an issue or submit a pull request. See the Contribution Guidelines for more details.
## License
This project is licensed under the MIT License.
