ai-client-hook
A lightweight React hook for seamless integration with multiple AI providers including OpenAI, Groq, Anthropic, Cohere, Mistral, Perplexity, DeepSeek, and Ollama.
✨ Features
- 🚀 Multi-Provider Support: Works with 8+ AI providers
- 🔄 Real-time Streaming: Get responses as they arrive
- 🛡️ TypeScript Ready: Full type definitions included
- 🎯 Simple API: Easy-to-use React hook interface
- 🔧 Flexible Configuration: Custom models, endpoints, and options
- 📱 React 18+ Compatible: Built for modern React applications
- 🎨 Zero Dependencies: Lightweight and fast
🚀 Quick Start
Installation
npm install ai-client-hook
# or
yarn add ai-client-hook
# or
pnpm add ai-client-hook
Basic Usage
import { useOpenAI } from "ai-client-hook";
function ChatComponent() {
const { response, loading, error, send } = useOpenAI({
apiKey: "your-api-key",
provider: "openai", // optional, defaults to "openai"
model: "gpt-3.5-turbo", // optional, defaults vary by provider
});
const handleSend = async () => {
await send("Hello, how are you?");
};
return (
<div>
<button onClick={handleSend} disabled={loading}>
{loading ? "Sending..." : "Send Message"}
</button>
{error && <p style={{ color: "red" }}>{error}</p>}
{response && <p>{response}</p>}
</div>
);
}
📋 Supported Providers
| Provider | API Key Format | Default Model | Popular Models |
| -------------- | -------------- | -------------------------- | --------------------------------------------------- |
| OpenAI | sk-... | gpt-3.5-turbo | gpt-4, gpt-4o, gpt-3.5-turbo |
| Groq | gsk_... | llama3-8b-8192 | llama3-70b-8192, mixtral-8x7b-32768 |
| Anthropic | sk-ant-... | claude-3-sonnet-20240229 | claude-3-opus-20240229, claude-3-haiku-20240307 |
| Cohere | ... | command | command-light, command-r-plus |
| Mistral | ... | mistral-small-latest | mistral-large-latest, mistral-medium-latest |
| Perplexity | pplx-... | llama-3.1-8b-instant | llama-3.1-70b-versatile, sonar-medium-chat |
| DeepSeek | sk-... | deepseek-chat | deepseek-coder, deepseek-coder-instruct |
| Ollama | "" (none) | llama2 | mistral, codellama, gemma |
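The key-prefix column above can double as a quick sanity check before making a request. Below is a hypothetical helper (not part of ai-client-hook) built from the prefixes in the table; an empty prefix means the provider has no fixed key format (Cohere, Mistral) or needs no key at all (Ollama):

```typescript
// Hypothetical key-format check, derived from the table above.
// This is a sketch, not a package API.
const KEY_PREFIXES: Record<string, string> = {
  openai: "sk-",
  groq: "gsk_",
  anthropic: "sk-ant-",
  perplexity: "pplx-",
  deepseek: "sk-",
  cohere: "", // no fixed prefix
  mistral: "", // no fixed prefix
  ollama: "", // no key required
};

function looksLikeValidKey(provider: string, apiKey: string): boolean {
  const prefix = KEY_PREFIXES[provider];
  if (prefix === undefined) return false; // unknown provider
  if (provider === "ollama") return true; // local, no key needed
  return apiKey.startsWith(prefix) && apiKey.length > prefix.length;
}
```

Note this only catches obvious mistakes (e.g. passing a Groq key to the OpenAI provider); it cannot confirm that a key is actually valid.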
🛠️ API Reference
useOpenAI(options)
Parameters
interface UseOpenAIOptions {
apiKey: string; // Required: Your API key
provider?: string; // Optional: AI provider (default: "openai")
model?: string; // Optional: Model name (default varies by provider)
baseURL?: string; // Optional: Custom API endpoint
temperature?: number; // Optional: Response randomness (0-2)
maxTokens?: number; // Optional: Maximum response length
systemPrompt?: string; // Optional: System message for context
}
Returns
interface UseOpenAIReturn {
response: string; // Latest AI response
loading: boolean; // Request status
error: string; // Error message (if any)
send: (prompt: string) => Promise<void>; // Send message function
availableModels: string[]; // Available models for provider
currentModel: string; // Currently selected model
reset: () => void; // Reset response and error states
}
🔧 Helper Functions
Get Available Models
import { getAvailableModels } from "ai-client-hook";
const models = getAvailableModels("openai");
// ["gpt-4o", "gpt-4o-mini", "gpt-4-turbo", "gpt-3.5-turbo", ...]Get Model Info
import { getModelInfo } from "ai-client-hook";
const info = getModelInfo("openai", "gpt-4");
// { name: "gpt-4", description: "Available openai model", provider: "openai" }
Get Supported Providers
import { getSupportedProviders } from "ai-client-hook";
const providers = getSupportedProviders();
// ["openai", "groq", "anthropic", "cohere", "mistral", "perplexity", "deepseek", "ollama"]📝 Examples
Dynamic Provider Selection
import {
useOpenAI,
getAvailableModels,
getSupportedProviders,
} from "ai-client-hook";
import { useState } from "react";
function ProviderSelector() {
const [provider, setProvider] = useState("openai");
const [model, setModel] = useState("gpt-3.5-turbo");
const { response, loading, send, reset } = useOpenAI({
apiKey: "your-api-key",
provider,
model,
temperature: 0.7,
maxTokens: 1000,
});
const models = getAvailableModels(provider);
const providers = getSupportedProviders();
return (
<div>
<select value={provider} onChange={(e) => setProvider(e.target.value)}>
{providers.map((p) => (
<option key={p} value={p}>
{p.charAt(0).toUpperCase() + p.slice(1)}
</option>
))}
</select>
<select value={model} onChange={(e) => setModel(e.target.value)}>
{models.map((m) => (
<option key={m} value={m}>
{m}
</option>
))}
</select>
<button onClick={() => send("Hello!")} disabled={loading}>
{loading ? "Sending..." : "Send"}
</button>
<button onClick={reset}>Reset</button>
{response && <p>{response}</p>}
</div>
);
}
Multiple Providers
function MultiProviderChat() {
const openai = useOpenAI({
apiKey: "sk-...",
provider: "openai",
systemPrompt: "You are a helpful assistant.",
});
const groq = useOpenAI({
apiKey: "gsk_...",
provider: "groq",
temperature: 0.8,
});
const claude = useOpenAI({
apiKey: "sk-ant-...",
provider: "anthropic",
maxTokens: 500,
});
return (
<div>
<button
onClick={() => openai.send("Hello OpenAI!")}
disabled={openai.loading}
>
Ask OpenAI
</button>
<button onClick={() => groq.send("Hello Groq!")} disabled={groq.loading}>
Ask Groq
</button>
<button
onClick={() => claude.send("Hello Claude!")}
disabled={claude.loading}
>
Ask Claude
</button>
<div>
<h4>OpenAI: {openai.response}</h4>
<h4>Groq: {groq.response}</h4>
<h4>Claude: {claude.response}</h4>
</div>
</div>
);
}
Chat Interface
import { useOpenAI } from "ai-client-hook";
import { useEffect, useState } from "react";
function ChatInterface() {
const [messages, setMessages] = useState<
Array<{ role: string; content: string }>
>([]);
const [input, setInput] = useState("");
const { response, loading, send } = useOpenAI({
apiKey: "your-api-key",
provider: "openai",
systemPrompt: "You are a helpful AI assistant.",
});
// Append the assistant's reply to the transcript when it arrives
useEffect(() => {
if (!response) return;
setMessages((prev) => [...prev, { role: "assistant", content: response }]);
}, [response]);
const handleSend = async () => {
if (!input.trim()) return;
const userMessage = { role: "user", content: input };
setMessages((prev) => [...prev, userMessage]);
setInput("");
try {
await send(input);
} catch (error) {
console.error("Error sending message:", error);
}
};
return (
<div>
<div
style={{ height: "400px", overflowY: "auto", border: "1px solid #ccc" }}
>
{messages.map((msg, i) => (
<div
key={i}
style={{
padding: "10px",
backgroundColor: msg.role === "user" ? "#f0f0f0" : "#e8f4fd",
}}
>
<strong>{msg.role}:</strong> {msg.content}
</div>
))}
</div>
<div style={{ display: "flex", gap: "10px", marginTop: "10px" }}>
<input
value={input}
onChange={(e) => setInput(e.target.value)}
onKeyDown={(e) => e.key === "Enter" && handleSend()}
placeholder="Type your message..."
style={{ flex: 1, padding: "8px" }}
/>
<button onClick={handleSend} disabled={loading || !input.trim()}>
{loading ? "Sending..." : "Send"}
</button>
</div>
</div>
);
}
🔑 Getting API Keys
- OpenAI: platform.openai.com/api-keys
- Groq: console.groq.com/keys
- Anthropic: console.anthropic.com
- Cohere: dashboard.cohere.ai
- Mistral: console.mistral.ai
- Perplexity: perplexity.ai/settings/api
- DeepSeek: platform.deepseek.com
- Ollama: No API key needed (local installation)
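Once you have keys, a small preset map keeps per-provider options in one place. This is a sketch, not a package API: the field names follow the UseOpenAIOptions interface documented above, and the Ollama baseURL shown is the conventional local default (an assumption here):

```typescript
// Hypothetical per-provider option presets (not part of ai-client-hook).
// Field names follow the documented UseOpenAIOptions interface.
const presets = {
  openai: { apiKey: "sk-...", provider: "openai" },
  groq: { apiKey: "gsk_...", provider: "groq" },
  // Ollama runs locally, so no key; the baseURL is the usual local
  // default and may need adjusting for your setup.
  ollama: { apiKey: "", provider: "ollama", baseURL: "http://localhost:11434" },
};
```

You can then spread a preset into the hook, e.g. `useOpenAI({ ...presets.ollama })`, and swap providers by changing a single key.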
🐛 Error Handling
The hook automatically handles common errors:
- Invalid API keys
- Network issues
- Rate limiting
- Invalid model names
- Provider-specific errors
Errors are stored in the error state and can be displayed to users.
const { error, send } = useOpenAI({ apiKey: "your-key" });
if (error) {
console.error("AI Error:", error);
// Display error to user
}
📦 TypeScript Support
Full TypeScript support with complete type definitions included.
import { useOpenAI, UseOpenAIOptions, UseOpenAIReturn } from "ai-client-hook";
const options: UseOpenAIOptions = {
apiKey: "your-key",
provider: "openai",
model: "gpt-4",
};
const { response, loading, error, send }: UseOpenAIReturn = useOpenAI(options);
🚀 Advanced Usage
Custom Configuration
const { response, send } = useOpenAI({
apiKey: "your-api-key",
provider: "openai",
model: "gpt-4",
temperature: 0.8,
maxTokens: 2000,
systemPrompt:
"You are a coding assistant. Provide concise, helpful responses.",
baseURL: "https://your-custom-endpoint.com", // For custom deployments
});
Environment Variables
// .env
REACT_APP_OPENAI_API_KEY=sk-your-key
REACT_APP_GROQ_API_KEY=gsk_your-key
// Component
const { response, send } = useOpenAI({
apiKey: process.env.REACT_APP_OPENAI_API_KEY!,
provider: "openai",
});
📄 License
ISC
🤝 Contributing
Contributions welcome! Please feel free to submit a Pull Request.
Development
git clone https://github.com/achyutsharan/ai-client-hook.git
cd ai-client-hook
npm install
npm run build
📈 Roadmap
- [ ] Streaming responses support
- [ ] Conversation history management
- [ ] File upload support
- [ ] Function calling support
- [ ] More AI providers
- [ ] React Native support
Made with ❤️ by Achyut Sharan
