photonchat v1.0.0
PhotonChat CLI
A modern AI chat application with 400+ models, built with React and Puter.js.
Installation
npm install -g photonchat
Usage
Start PhotonChat with automatic browser opening:
photonchat
This will:
- Start the PhotonChat server on port 3000
- Automatically open your default browser to the application
- Provide access to 400+ AI models from various providers
Features
- 400+ AI Models: Access models from OpenAI, Anthropic (Claude), Google (Gemini), xAI, DeepSeek, and more
- Modern UI: Built with React, TypeScript, and Tailwind CSS
- Real-time Chat: Stream responses in real-time
- Image Analysis: Upload and analyze images with AI
- Image Generation: Generate images using AI models
- Model Management: Dynamic model list updates
- Persistent Authentication: Stay logged in across sessions
- Global CLI: Install once, use anywhere
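The real-time chat feature streams responses as they are generated. Puter.js documents a `stream: true` option for `puter.ai.chat` that yields response chunks asynchronously; the helper below sketches how such a stream can be consumed. The demo generator is a stand-in for the real stream, and none of this is taken from photonchat's source:

```javascript
// Sketch of consuming a streamed chat response. With Puter.js loaded in the
// browser, the real call would look roughly like:
//   const stream = await puter.ai.chat(prompt, { model: "gpt-4o", stream: true });
// collectStream works on any async iterable of { text } chunks.
async function collectStream(stream, onChunk = () => {}) {
  let full = "";
  for await (const part of stream) {
    const text = part?.text ?? "";
    full += text;
    onChunk(text); // e.g. append to the chat UI as tokens arrive
  }
  return full;
}

// Stand-in stream for demonstration (the real one comes from puter.ai.chat).
async function* demoStream() {
  yield { text: "Hello, " };
  yield { text: "world!" };
}

collectStream(demoStream()).then((full) => console.log(full)); // prints "Hello, world!"
```

Rendering each chunk as it arrives (via `onChunk`) is what makes the chat feel live, while the returned `full` string is what gets saved to the conversation history.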
AI Providers Supported
- OpenAI (GPT models)
- Anthropic (Claude models)
- Google (Gemini models)
- xAI (Grok models)
- DeepSeek
- OpenRouter
- And many more...
Development
# Clone the repository
git clone https://github.com/yourusername/photonchat.git
cd photonchat
# Install dependencies
npm install
# Start development server
npm run dev
# Build for production
npm run build
# Start production server
npm start
Global Command Setup
After installation, you can use the photonchat command from anywhere:
photonchat
The server will start on http://localhost:3000 and automatically open in your default browser.
Stopping the Server
To stop the PhotonChat server, press Ctrl+C in the terminal where it's running.
Configuration
The application uses Puter.js for AI model access. Models are automatically fetched and updated, with support for:
- Dynamic model discovery
- Provider-specific features
- Real-time model availability
- Persistent model caching
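Persistent model caching can be as simple as storing the fetched list alongside a timestamp and refetching once a time-to-live expires. A minimal sketch under that assumption, where `fetchModels` is a hypothetical loader (photonchat's actual caching logic may differ):

```javascript
// Illustrative TTL cache for a dynamically fetched model list. `fetchModels`
// is a hypothetical loader (e.g. a call to the Puter.js backend); swap in
// the real fetch. Not taken from photonchat's source.
class ModelCache {
  constructor(fetchModels, ttlMs = 5 * 60 * 1000) {
    this.fetchModels = fetchModels;
    this.ttlMs = ttlMs;
    this.models = null;
    this.fetchedAt = 0;
  }

  async get(now = Date.now()) {
    // Serve the cached list while it is fresh; refetch once it expires.
    if (this.models && now - this.fetchedAt < this.ttlMs) return this.models;
    this.models = await this.fetchModels();
    this.fetchedAt = now;
    return this.models;
  }
}

// Usage with a stubbed loader:
const cache = new ModelCache(async () => ["gpt-4o", "claude-sonnet-4"]);
cache.get().then((models) => console.log(models.length)); // prints 2
```

For true persistence across page loads, the same shape (list plus timestamp) could be serialized to `localStorage` in the browser; the TTL check stays identical.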
License
MIT License - see LICENSE file for details.
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
Support
For issues and questions, please visit our GitHub Issues.
