# Karakuri Agent Core Example
This example demonstrates how to use the `karakuri-agent-core` library to create and deploy AI agents.
## Setup

1. Install dependencies:

   ```bash
   npm install
   ```

2. Configure environment variables:

   ```bash
   cp .env.example .env
   # Edit .env with your configuration
   ```

3. Run the example:

   ```bash
   npm start
   # or for development with hot reload
   npm run dev
   ```

## Available Scripts

- `npm start` - Start the server
- `npm run dev` - Start with hot reload
- `npm run build` - Build TypeScript files
## Configuration

Edit the `.env` file to configure:

- `PORT` - Server port (default: 8080)
- `KANON_LLM_BASE_URL` - LLM API base URL
- `KANON_LLM_API_KEY` - Your API key
- `KANON_LLM_MODEL` - Model to use (e.g., gpt-4)
- `KANON_USER_NAME` - User name for the Kanon agent
- `KANON_USER_INFO` - User information for context
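Putting the variables above together, a filled-in `.env` might look like the sketch below. All values are placeholders, and the base URL assumes an OpenAI-compatible provider; substitute your own endpoint and key:

```ini
# Example .env — placeholder values only
PORT=8080
KANON_LLM_BASE_URL=https://api.openai.com/v1
KANON_LLM_API_KEY=sk-your-key-here
KANON_LLM_MODEL=gpt-4
KANON_USER_NAME=Alice
KANON_USER_INFO=Prefers concise answers
```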
## API Endpoints

Once running, the following endpoints are available:

- `GET /v1/models` - List available agents
- `POST /v1/chat/completions` - Chat with an agent (OpenAI-compatible)
- `GET /health` - Health check
## Using the API

```bash
# List available models
curl http://localhost:8080/v1/models

# Chat with Kanon agent
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "kanon-agent",
    "messages": [
      {"role": "user", "content": "Hello!"}
    ]
  }'
```
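The same chat request can be made from TypeScript. The sketch below is not part of the library: `buildChatRequest` and `chatWithAgent` are hypothetical helper names, and the response shape follows the OpenAI chat-completions convention the endpoint is described as compatible with (reply text at `choices[0].message.content`).

```typescript
// Hypothetical client sketch for the OpenAI-compatible endpoint above.
// Helper names are ours; only the URL path, model name, and payload
// shape come from this README.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Build the fetch options for a chat completion request.
function buildChatRequest(model: string, messages: ChatMessage[]) {
  return {
    method: "POST" as const,
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model, messages }),
  };
}

// Send one user message and return the agent's reply text,
// assuming an OpenAI-style response body.
async function chatWithAgent(baseUrl: string, content: string): Promise<string> {
  const res = await fetch(
    `${baseUrl}/v1/chat/completions`,
    buildChatRequest("kanon-agent", [{ role: "user", content }])
  );
  const data = await res.json();
  return data.choices[0].message.content;
}

// Usage (server must be running):
// chatWithAgent("http://localhost:8080", "Hello!").then(console.log);
```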