oacr-aitool
v1.2.0
A robust OpenRouter client for seamless AI integration - no complex setup required
Personal AI Lib.
A robust, production-ready client for seamless AI integration. No complex setup, no pile of dependencies: just install and use!
✨ Features
- 🎯 Zero Configuration - Works out of the box
- 🔄 Streaming Support - Real-time responses
- 🛡️ Built-in Error Handling - Robust error management
- 🔁 Auto Retry - Automatic retry with exponential backoff
- 📊 Usage Statistics - Track your API usage
- 🎛️ Model Management - Easy model switching
- 💪 TypeScript Support - Full type definitions
- 🧪 Thoroughly Tested - Comprehensive test suite
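The auto-retry feature above is described as exponential backoff. As a rough illustration of the idea (a generic sketch, not oacr-aitool's actual internals), a retry wrapper with the same `maxRetries`/`baseDelay` shape might look like this:

```javascript
// Generic exponential-backoff retry sketch (illustrative only).
// Retries a failing async function, doubling the delay each attempt:
// baseDelay, 2 * baseDelay, 4 * baseDelay, ...
async function withRetry(fn, { maxRetries = 3, baseDelay = 1000 } = {}) {
  let lastError;
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      if (attempt === maxRetries) break; // Out of retries: give up.
      const delay = baseDelay * 2 ** attempt;
      await new Promise((resolve) => setTimeout(resolve, delay));
    }
  }
  throw lastError;
}
```

In practice, libraries often add jitter and only retry transient errors (timeouts, 429s, 5xx); this sketch retries everything for brevity.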
🚀 Quick Start
```bash
npm install oacr-aitool
```

```js
import { createClient } from 'oacr-aitool';

const ai = createClient({
  k: 'key provided by the dev'
});

// Simple chat
const response = await ai.chat('Hello, how are you?');
console.log(response.message);

// Streaming
await ai.stream('Tell me a story', {}, (chunk) => {
  process.stdout.write(chunk);
});
```

📖 Documentation
Installation
```bash
npm install oacr-aitool
```

Basic Usage

```js
import { createClient } from 'oacr-aitool';

const client = createClient({
  k: 'key provided by the dev',
  siteUrl: 'https://yoursite.com', // Optional
  siteName: 'Your App Name'        // Optional
});
```

API Methods
chat(messages, options)
Send chat messages and get responses.
```js
// String input
const response = await client.chat('What is AI?');

// Array input with a system message
const detailed = await client.chat([
  { role: 'system', content: 'You are a helpful assistant' },
  { role: 'user', content: 'Explain quantum computing' }
], {
  model: 'openai/gpt-4',
  temperature: 0.7,
  max_tokens: 500
});
```

stream(messages, options, onChunk)
Stream responses in real-time.
```js
await client.stream('Tell me a joke', {}, (chunk, fullText, meta) => {
  console.log('Chunk:', chunk);
  console.log('Complete so far:', fullText);
  console.log('Token count:', meta.tokenCount);
});
```

getModels()
Get available models and pricing.
```js
const models = await client.getModels();
console.log(models.models);
```

Configuration Options
```js
const client = createClient({
  k: 'required',
  siteUrl: 'https://yoursite.com',
  siteName: 'Your App',
  defaultModel: 'openai/gpt-3.5-turbo',
  defaultOptions: {
    temperature: 0.7,
    max_tokens: 1000,
    top_p: 1
  },
  retryOptions: {
    maxRetries: 3,
    baseDelay: 1000
  }
});
```

🔧 Advanced Usage
Error Handling
```js
const response = await client.chat('Hello');

if (!response.success) {
  console.error('Error:', response.error);
  console.error('Code:', response.code);
}
```

Dynamic Configuration
```js
client.updateConfig({
  defaultModel: 'anthropic/claude-3-opus',
  defaultOptions: { temperature: 0.9 }
});
```

Usage Statistics
```js
const stats = client.getStats();
console.log('Requests made:', stats.requestCount);
console.log('Last request:', stats.lastRequestTime);
```

🎯 Examples
Check out the /examples directory for complete examples:
- `basic-usage.js` - Simple chat examples
- `streaming.js` - Real-time streaming
- `advanced.js` - Advanced features
🧪 Testing
```bash
npm test
npm run test:coverage
```

🤝 Contributing
- Fork the repository
- Create your feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
📄 License
MIT © Oscar Angelo Collins Rivera
🆘 Support
- 📧 Email: [email protected]
- 🐛 Issues: GitHub Issues
- 💬 Discussions: GitHub Discussions
MIT License
Copyright (c) 2024 Oscar Angelo Collins Rivera
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

📦 Dependencies

Installing oacr-aitool automatically installs `openai` and the other runtime dependencies it requires. Here's how it works:

Package Dependencies Structure:
```jsonc
{
  "dependencies": {
    "openai": "^4.20.1"
  },
  "devDependencies": {
    "dotenv": "^16.3.1",
    "rollup": "^4.6.1"
    // ... other dev dependencies
  }
}
```

What Happens When Users Install the Package:
```bash
npm install oacr-aitool
```

Automatically installs:

- ✅ The `openai` package (because it's in `dependencies`)
- ✅ Everything `openai` itself depends on
- ❌ Does NOT install `devDependencies` (like `dotenv`, `rollup`, etc.)
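To make the distinction concrete, here is a small standalone Node sketch (the manifest object mirrors the example above; it is illustrative, not read from a real `package.json`) showing which keys npm installs for consumers:

```javascript
// Illustrative only: npm installs a package's "dependencies" for its
// consumers, but skips its "devDependencies".
const manifest = {
  dependencies: { openai: '^4.20.1' },
  devDependencies: { dotenv: '^16.3.1', rollup: '^4.6.1' },
};

const installedForUsers = Object.keys(manifest.dependencies || {});
const skippedForUsers = Object.keys(manifest.devDependencies || {});

console.log('Installed for consumers:', installedForUsers); // ['openai']
console.log('Skipped for consumers:', skippedForUsers);     // ['dotenv', 'rollup']
```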
Running the Examples (Optional Dependencies):

If you want to run the bundled examples, add their extra dependency to your own project:

```jsonc
{
  "dependencies": {
    "oacr-aitool": "^1.0.0",
    "dotenv": "^16.3.1" // Only needed for examples
  }
}
```

User Experience:
```bash
# Users only need to do this:
npm install oacr-aitool
```

```js
// Then use immediately:
import { createClient } from 'oacr-aitool';

const ai = createClient({ k: 'key' });
await ai.chat('Hello!'); // Works immediately!
```

Build Commands to Prepare the Package:
```bash
# Build the package
npm run build

# Test everything works
npm test

# Publish to npm
npm publish
```
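Before publishing, it can also help to check exactly which files will end up in the tarball. `npm pack --dry-run` prints the file list without creating or uploading anything:

```shell
# List the files npm would include in the published tarball,
# without actually packing or publishing. Run in the package root.
npm pack --dry-run
```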