dranai v0.5.0-b5
# Dragon AI (dranai)
dranai (Dragon AI) is a comprehensive AI-powered application designed with a client-server architecture. It serves as a unified interface and proxy for various LLM providers, offering an Ollama-compatible API for seamless integration with existing tools.
## Features
- Ollama-Compatible API: Implements key endpoints of the Ollama API, allowing it to work with clients that support Ollama.
  - `/api/chat`: Supports chat completions with streaming and non-streaming responses.
  - `/api/tags`: Lists available models from registered providers.
- LangChain Integration: Utilizes LangChain for robust LLM interactions and abstraction.
- Multimodal Support: Supports image inputs in chat requests (base64 encoded) for vision-capable models.
- Provider Management: Dynamically manages and routes requests to different LLM providers via a registry system.
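The multimodal feature above can be sketched as a request-payload builder. This is a minimal sketch assuming Ollama's chat request shape; the function name `buildChatRequest`, the model name, and the base64 string are placeholders, not part of dranai's API.

```javascript
// Sketch of a multimodal chat payload in Ollama's request format.
// buildChatRequest is a hypothetical helper; "llava" and the base64
// string below are placeholders.
function buildChatRequest(model, prompt, imagesBase64) {
  return {
    model,
    stream: false, // set to true for streaming responses
    messages: [
      {
        role: "user",
        content: prompt,
        // base64-encoded images for vision-capable models
        images: imagesBase64,
      },
    ],
  };
}

const req = buildChatRequest("llava", "What is in this picture?", ["<base64-image>"]);
console.log(JSON.stringify(req, null, 2));
```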
## Usage

### Installation
```shell
npm install -g dranai
```

### Running the Server
Start the server using the CLI command:
```shell
dranai
# or
npx dranai
```

The server listens on port 28700 by default.
## Configuration
Configuration is handled via environment variables. You can define them in a .env file in your working directory.
| Variable | Description | Default |
| :--- | :--- | :--- |
| DRANAI_PORT | The port the server listens on. | 28700 |
| DRANAI_OLLAMA_HOST | The URL of the backing Ollama instance. | http://localhost:11434 |
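The defaults in the table above could be resolved in Node roughly as follows. This is a sketch, not dranai's actual implementation; `resolveConfig` is a hypothetical helper.

```javascript
// Sketch of resolving the documented environment variables with their
// defaults (resolveConfig is a hypothetical name, not dranai's API).
function resolveConfig(env = process.env) {
  return {
    port: Number(env.DRANAI_PORT ?? 28700),
    ollamaHost: env.DRANAI_OLLAMA_HOST ?? "http://localhost:11434",
  };
}

console.log(resolveConfig({}));
// → { port: 28700, ollamaHost: 'http://localhost:11434' }
```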
## API Endpoints
The server exposes an Ollama-like API under `/api/ollama/:id`, where `:id` is an identifier for the tenant or context (implementation dependent).
- Chat: `POST /api/ollama/:id/api/chat`
  - Standard Ollama chat request format.
  - Supports `images` (base64) for multimodal input.
- Tags: `GET /api/ollama/:id/api/tags`
  - Returns a list of available models from all registered providers.
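A minimal client sketch against these endpoints, using Node 18+'s built-in `fetch`. The `:id` segment (`default` here), the model name, and the helper names are placeholders; substitute your own context identifier.

```javascript
// Hypothetical client for the endpoints above. "default" stands in for
// the :id segment; adjust to your deployment.
const BASE = "http://localhost:28700/api/ollama/default";

// GET /api/tags — list available models from all registered providers.
async function listModels() {
  const res = await fetch(`${BASE}/api/tags`);
  return res.json();
}

// POST /api/chat — non-streaming chat completion in Ollama's format.
async function chat(model, content) {
  const res = await fetch(`${BASE}/api/chat`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model,
      stream: false,
      messages: [{ role: "user", content }],
    }),
  });
  return res.json();
}

// Usage (requires a running dranai server; "llama3" is a placeholder):
// listModels().then((tags) => console.log(tags));
// chat("llama3", "Hello!").then((res) => console.log(res));
```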
