# clear-ai
Clear AI - A modern TypeScript framework for building AI-powered applications with tool execution and workflow orchestration. Perfect for CLI tools, APIs, and server applications.
## 📚 Documentation

📖 [Full Documentation](https://wsyeabsera.github.io/Clear-AI/) - Complete guides, API reference, and examples
## 🚀 Quick Start

```bash
npm install clear-ai
```

```typescript
import { ClearAI } from "clear-ai";

// Initialize the framework for CLI usage
const ai = new ClearAI({
  llm: {
    openaiApiKey: "your-key",
    ollamaBaseUrl: "http://localhost:11434",
  },
  server: {
    port: 3001,
  },
});

// Start everything
await ai.init();

// Access services
const mcpServer = ai.getMCP();
const llmService = ai.getLLM();
const toolService = ai.getTools();
```

## 📦 What's Included
### MCP (Model Context Protocol) - `clear-ai-mcp-basic`

- **MCPServer** - Full MCP protocol implementation
- **ToolRegistry** - Dynamic tool registration and management
- **Built-in Tools** - API calls, JSON processing, file operations

### Shared Services - `clear-ai-shared`

- **SimpleLangChainService** - Multi-provider LLM integration
- **ToolExecutionService** - Tool registration and execution
- **SimpleWorkflowService** - Workflow orchestration
- **Logger** - Structured logging utilities

### Server - `clear-ai-server`

- **Express API** - RESTful endpoints for tool execution
- **Workflow Execution** - LangGraph workflow orchestration
- **Health Monitoring** - System health and status endpoints

### Client - `@clear-ai/client` (Private - Local Development Only)

- **React Components** - Pre-built UI components with Storybook
- **Theme System** - Multiple visual themes
- **Web Interface** - Browser-based tool execution interface
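The dynamic tool registry listed above is the core pattern behind tool execution: tools register under a name and are looked up and invoked by that name at runtime. A minimal self-contained sketch of the idea (all names here are hypothetical and the handlers synchronous for brevity; the real `clear-ai-mcp-basic` registry is asynchronous and richer):

```typescript
// Hypothetical sketch of a dynamic tool registry (not the clear-ai API).
type ToolHandler = (args: Record<string, unknown>) => unknown;

class MiniToolRegistry {
  private tools = new Map<string, ToolHandler>();

  // Register a tool under a unique name.
  register(name: string, handler: ToolHandler): void {
    this.tools.set(name, handler);
  }

  // Look up a tool by name and run it with the given arguments.
  executeTool(name: string, args: Record<string, unknown>): unknown {
    const handler = this.tools.get(name);
    if (!handler) throw new Error(`Unknown tool: ${name}`);
    return handler(args);
  }

  // List the names of all registered tools.
  list(): string[] {
    return [...this.tools.keys()];
  }
}

const registry = new MiniToolRegistry();
registry.register("json_reader", (args) => JSON.parse(String(args.text)));

const parsed = registry.executeTool("json_reader", {
  text: '{"ok":true}',
}) as { ok: boolean };
```

Because tools are plain named handlers behind one interface, new capabilities can be added without touching the execution path.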
## 🎯 Usage Examples

### Basic Tool Execution

```typescript
import { MCPServer, ToolRegistry } from "clear-ai-mcp-basic";

const server = new MCPServer();
const tools = server.getToolRegistry();

// Execute an API call
const result = await tools.executeTool("api_call", {
  url: "https://api.example.com/users/1",
  method: "GET",
});
```

### LLM Integration
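Multi-provider integration comes down to routing a single request shape to interchangeable backends selected by name. A self-contained sketch of that idea, with stub providers standing in for real HTTP clients (hypothetical names throughout, not the actual `SimpleLangChainService` internals):

```typescript
// Hypothetical sketch of multi-provider routing (not the clear-ai API).
interface CompletionOptions {
  model: string; // provider name, e.g. "openai" or "ollama"
  temperature?: number;
}

type Provider = (prompt: string, options: CompletionOptions) => string;

// Stub providers; real ones would call the OpenAI or Ollama HTTP APIs.
const providers: Record<string, Provider> = {
  openai: (prompt) => `[openai] ${prompt}`,
  ollama: (prompt) => `[ollama] ${prompt}`,
};

// One entry point dispatches to whichever backend the caller names.
function complete(prompt: string, options: CompletionOptions): string {
  const provider = providers[options.model];
  if (!provider) throw new Error(`Unknown provider: ${options.model}`);
  return provider(prompt, options);
}

const reply = complete("Hello, world!", { model: "ollama", temperature: 0.7 });
```

The clear-ai service exposes the same kind of single `complete` entry point backed by real providers: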
```typescript
import { SimpleLangChainService } from "clear-ai-shared";

const llm = new SimpleLangChainService({
  openaiApiKey: "your-key",
  ollamaBaseUrl: "http://localhost:11434",
});

const response = await llm.complete("Hello, world!", {
  model: "ollama",
  temperature: 0.7,
});
```

### Workflow Execution
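Workflow orchestration means running steps in sequence and threading each step's output into the next. A minimal self-contained sketch of the pattern (toy steps and names are hypothetical; the real `SimpleWorkflowService` plans its steps with an LLM and executes registered tools):

```typescript
// Hypothetical sketch of sequential workflow execution (not the clear-ai API).
type Step = (input: string) => string;

// Run each step in order, feeding its output into the next step.
function runWorkflow(input: string, steps: Step[]): string {
  return steps.reduce((acc, step) => step(acc), input);
}

// Two toy steps: "fetch" weather data, then format it nicely.
const fetchWeather: Step = (city) => JSON.stringify({ city, tempC: 21 });
const formatNicely: Step = (json) => {
  const data = JSON.parse(json) as { city: string; tempC: number };
  return `${data.city}: ${data.tempC}°C`;
};

const report = runWorkflow("Berlin", [fetchWeather, formatNicely]);
```

With clear-ai, the same chaining is driven from a natural-language instruction: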
```typescript
import { SimpleWorkflowService, ToolExecutionService } from "clear-ai-shared";

const toolService = new ToolExecutionService(llmConfig);
const workflow = new SimpleWorkflowService(llmConfig, toolService);

const result = await workflow.executeWorkflow(
  "Get weather data and format it nicely"
);
```

### Server API
```typescript
import { createServer } from "clear-ai-server";

const server = createServer({
  port: 3001,
  mcpConfig: { tools: ["api_call", "json_reader"] },
  llmConfig: { openaiApiKey: "your-key" },
});

await server.start();
```

### CLI Application
```typescript
import { ClearAI, MCPServer } from "clear-ai-core";

async function main() {
  const ai = new ClearAI({
    llm: { openaiApiKey: process.env.OPENAI_API_KEY },
    server: { port: 3001 },
  });
  await ai.init();

  // Use the MCP server for tool execution
  const mcpServer = ai.getMCP();
  const result = await mcpServer.getToolRegistry().executeTool("api_call", {
    url: "https://api.example.com/data",
    method: "GET",
  });
  console.log("Result:", result);
}

main().catch(console.error);
```

## 🏗️ Architecture
```
clear-ai-core
├── clear-ai-mcp-basic   # Model Context Protocol
├── clear-ai-shared      # Shared services & utilities
├── clear-ai-server      # Express API server
└── @clear-ai/client     # React web interface (private)
```

## 🔧 Configuration

### Environment Variables
```bash
# LLM Configuration
OPENAI_API_KEY=your-key
OLLAMA_BASE_URL=http://localhost:11434
MISTRAL_API_KEY=your-key
GROQ_API_KEY=your-key

# Langfuse (Observability)
LANGFUSE_SECRET_KEY=your-key
LANGFUSE_PUBLIC_KEY=your-key
LANGFUSE_BASE_URL=https://cloud.langfuse.com

# Server Configuration
PORT=3001
NODE_ENV=production
```

### Framework Configuration
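A framework config object is typically assembled from the environment variables above, with defaults filled in where a variable is unset. A sketch of that pattern (the interface names and defaults here are illustrative, not the real `ClearAIConfig` shape):

```typescript
// Hypothetical sketch: build a config object from environment variables.
interface LlmConfig {
  openaiApiKey?: string;
  ollamaBaseUrl: string;
}

interface ServerConfig {
  port: number;
}

interface AppConfig {
  llm: LlmConfig;
  server: ServerConfig;
}

function loadConfig(env: Record<string, string | undefined>): AppConfig {
  return {
    llm: {
      openaiApiKey: env.OPENAI_API_KEY,
      // Fall back to the default local Ollama endpoint.
      ollamaBaseUrl: env.OLLAMA_BASE_URL ?? "http://localhost:11434",
    },
    server: {
      // Environment variables are strings, so coerce the port to a number.
      port: Number(env.PORT ?? 3001),
    },
  };
}

const cfg = loadConfig({ PORT: "4000" });
```

In practice the config would be built from `process.env`, as in the example below.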
```typescript
const config: ClearAIConfig = {
  mcp: {
    tools: ["api_call", "json_reader", "file_reader"],
  },
  llm: {
    openaiApiKey: process.env.OPENAI_API_KEY,
    ollamaBaseUrl: process.env.OLLAMA_BASE_URL,
    langfuseSecretKey: process.env.LANGFUSE_SECRET_KEY,
  },
  server: {
    port: 3001,
    cors: { origin: ["http://localhost:3000"] },
  },
};
```

## 🛠️ Development
### Prerequisites

- Node.js >= 18.0.0
- npm >= 10.0.0

### Setup

```bash
# Clone the repository
git clone https://github.com/wsyeabsera/clear-ai.git
cd clear-ai

# Install dependencies
npm install

# Build all packages
npm run build

# Start development servers
npm run dev
```

### Available Scripts

- `npm run dev` - Start all packages in development mode
- `npm run build` - Build all packages
- `npm run lint` - Run ESLint on all packages
- `npm run type-check` - Run TypeScript type checking
- `npm run clean` - Clean all build artifacts
## 📚 Documentation

- **Getting Started** - Quick start guide
- **API Reference** - Complete API documentation
- **Tutorials** - Step-by-step tutorials
- **Architecture** - System architecture overview
## 📖 Documentation Access

### From NPM

When you install the package, you can access documentation via:

```bash
npm docs clear-ai-core
# or
npm home clear-ai-core
```

### Direct Links

- Main Documentation: https://wsyeabsera.github.io/Clear-AI/
- Getting Started: https://wsyeabsera.github.io/Clear-AI/docs/intro
- API Reference: https://wsyeabsera.github.io/Clear-AI/docs/api/overview
## 🤝 Contributing

1. Fork the repository
2. Create a feature branch: `git checkout -b feature/amazing-feature`
3. Commit your changes: `git commit -m 'Add amazing feature'`
4. Push to the branch: `git push origin feature/amazing-feature`
5. Open a Pull Request
## 📝 License

This project is licensed under the MIT License - see the LICENSE file for details.

## 🙏 Acknowledgments

- LangChain - LLM framework
- Model Context Protocol - Tool protocol
- Express.js - Web framework
- React - UI library

## 📞 Support

- 📖 Documentation: GitHub Docs
- 🐛 Issues: GitHub Issues
- 💬 Discussions: GitHub Discussions

Made with ❤️ by the Clear AI Team
