mb-app-runtime v1.0.5
Lightweight Deno runtime framework with AI capabilities, KV storage, and service registry
mb-runtime
Complete Deno runtime framework with AI capabilities, KV storage, and microservices support.
Features
- 🚀 HTTP Server - Hono-based with routing and middleware
- 📦 KV Storage - App-scoped Deno KV with TTL support
- 🤖 LangChain AI - Multi-provider (OpenAI, Anthropic, Google, DeepSeek, Azure)
- 🔐 Authentication - Built-in auth middleware
- 🌐 Nacos Integration - Service registry and discovery
- 📝 Logging - Structured logging with context
- 🔧 HTTP Client - Whitelist-based with rate limiting
- 📊 OpenAPI/Swagger - Auto-generated API documentation
Installation
For Deno Projects
// Using npm specifier (recommended)
import { createApp, config } from "npm:mb-runtime@1.0.5";
// Or add to import_map.json
{
  "imports": {
    "mb-runtime": "npm:mb-runtime@1.0.5",
    "mb-runtime/": "npm:mb-runtime@1.0.5/"
  }
}

For npm Registry (traditional)
npm install mb-runtime

Quick Start
Basic HTTP Server
import { serve } from "https://deno.land/std@0.177.0/http/server.ts";
import { createApp, config } from "npm:mb-runtime";
// Create the application
const app = await createApp();
// Start server
serve(app.fetch, { port: config.port });
console.log(`Server running on port ${config.port}`);

Using Tools Directly
import {
  createKVStore,
  createHttpClient,
  createLangChainClient
} from "npm:mb-runtime";
// KV Storage
const kv = createKVStore('my-app');
await kv.set('user:123', { name: 'Alice' });
// HTTP Client
const http = createHttpClient({
  id: 'my-app',
  permissions: { http: ['api.example.com'] }
});
const response = await http.fetch('https://api.example.com/data');
// LangChain AI
const ai = await createLangChainClient({
  id: 'my-app',
  permissions: {
    langchain: {
      enabled: true,
      models: ['gpt-4o-mini'],
      features: ['llm', 'memory']
    }
  }
});
const answer = await ai.chat('Hello!');

Configuration
Set environment variables:
# Server
PORT=8002
NODE_ENV=production
# Paths (customizable)
APPS_DIR=./apps # Default: ./apps
# KV Storage
DENO_KV_PATH=./data/kv.db # Optional, defaults to memory
# AI Providers (choose one or more)
OPENAI_API_KEY=sk-xxx
ANTHROPIC_API_KEY=sk-ant-xxx
GOOGLE_API_KEY=xxx
DEEPSEEK_API_KEY=sk-xxx
# Azure
AZURE_OPENAI_API_KEY=xxx
AZURE_OPENAI_ENDPOINT=https://xxx
# Nacos (optional)
NACOS_ENABLED=true
NACOS_SERVER_ADDR=nacos:8848

Architecture
mb-runtime/
├── config.js - Configuration management
├── mod.js - Main exports
├── http/ - HTTP layer
│   ├── app.js - Hono app creation
│   ├── middleware/ - Auth, error handling
│   └── routes/ - Route handlers
├── tools/ - Core capabilities ⭐
│   ├── kv.js - KV storage
│   ├── http.js - HTTP client
│   └── langchain.js - AI capabilities
├── services/ - Business services
├── runtime/ - App runtime
└── utils/ - Utilities

API Reference
Core Exports
createApp()
Creates a configured Hono application.
const app = await createApp();

config
Runtime configuration object.
console.log(config.port); // 8002
console.log(config.paths.apps); // ./apps

Tools (⭐ AI-Discoverable)
createKVStore(appId: string)
Creates app-scoped KV storage.
Methods:
- get(key) - Get value
- set(key, value, ttl?) - Set value with optional TTL (seconds)
- delete(key) - Delete key
- list(prefix) - List keys with prefix
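The app-scoping and TTL behavior can be sketched with a plain in-memory map. This is a minimal illustration, not the package's implementation (which is backed by Deno KV); the class and field names below are assumptions for the sketch:

```javascript
// Sketch: app-scoped KV store with lazy TTL expiry.
// Keys are namespaced by appId so two apps never collide.
class MemoryKVStore {
  constructor(appId) {
    this.appId = appId;
    this.entries = new Map(); // scopedKey -> { value, expiresAt }
  }
  scope(key) {
    return `${this.appId}:${key}`;
  }
  set(key, value, ttl) {
    // ttl is in seconds, matching the set(key, value, ttl?) signature above
    const expiresAt = ttl ? Date.now() + ttl * 1000 : Infinity;
    this.entries.set(this.scope(key), { value, expiresAt });
  }
  get(key) {
    const entry = this.entries.get(this.scope(key));
    if (!entry) return undefined;
    if (Date.now() >= entry.expiresAt) {
      this.entries.delete(this.scope(key)); // lazily drop stale entries
      return undefined;
    }
    return entry.value;
  }
  delete(key) {
    this.entries.delete(this.scope(key));
  }
  list(prefix) {
    const scoped = this.scope(prefix);
    return [...this.entries.keys()]
      .filter((k) => k.startsWith(scoped))
      .map((k) => k.slice(this.appId.length + 1)); // strip the appId prefix
  }
}

const kv = new MemoryKVStore("my-app");
kv.set("user:123", { name: "Alice" });
kv.set("session:abc", "token", 60); // expires after 60 seconds
console.log(kv.list("user:")); // [ "user:123" ]
```

The scoping prefix is what lets several apps share one underlying database without key collisions.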
createHttpClient(appInfo)
Creates whitelist-restricted HTTP client.
Methods:
- fetch(url, options?) - Make HTTP request
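The whitelist check itself is simple to sketch. This is a hedged illustration of the idea only: the function names are not the package's internals, and the shipped client additionally applies rate limiting:

```javascript
// Sketch: allow outbound requests only to whitelisted hostnames.
function isHostAllowed(appInfo, url) {
  const allowed = new Set(appInfo.permissions?.http ?? []);
  return allowed.has(new URL(url).hostname);
}

function createWhitelistedFetch(appInfo) {
  return (url, options) => {
    if (!isHostAllowed(appInfo, url)) {
      // refuse before any network traffic happens
      throw new Error(`Host not whitelisted for ${appInfo.id}: ${new URL(url).hostname}`);
    }
    return fetch(url, options); // delegate to the real fetch
  };
}

const appInfo = { id: "my-app", permissions: { http: ["api.example.com"] } };
const guardedFetch = createWhitelistedFetch(appInfo);

console.log(isHostAllowed(appInfo, "https://api.example.com/data")); // true
console.log(isHostAllowed(appInfo, "https://evil.example.net/"));    // false
```

Matching on the parsed hostname (rather than a substring of the URL) avoids bypasses like `https://api.example.com.evil.net/`.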
createLangChainClient(appInfo, kv?)
Creates LangChain AI client.
Methods:
- chat(message, options?) - Send chat message
- createChain(type, config?) - Create LangChain chains
- saveMessage(role, content) - Save to memory
- getMessages() - Get conversation history
- clearMessages() - Clear memory
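The memory methods amount to an append-only log of role/content pairs. A minimal in-memory sketch of that shape (the real client can persist history through its optional kv argument; the class name here is illustrative):

```javascript
// Sketch: conversation memory as an append-only list of role/content pairs.
class ConversationMemory {
  constructor() {
    this.messages = [];
  }
  saveMessage(role, content) {
    this.messages.push({ role, content });
  }
  getMessages() {
    return [...this.messages]; // copy so callers can't mutate history
  }
  clearMessages() {
    this.messages = [];
  }
}

const memory = new ConversationMemory();
memory.saveMessage("user", "Hello!");
memory.saveMessage("assistant", "Hi! How can I help?");
console.log(memory.getMessages().length); // 2
memory.clearMessages();
console.log(memory.getMessages().length); // 0
```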
Services
NacosService
Service registry integration.
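Registration and heartbeats correspond to requests against the Nacos HTTP Open API. As a hedged sketch of what register() sends, the request URL can be built like this (parameter names follow the Nacos v1 Open API, not necessarily this package's internals):

```javascript
// Sketch: build the Nacos v1 Open API request URL for instance registration.
// The real service POSTs this URL; heartbeats go to /nacos/v1/ns/instance/beat.
function buildRegisterUrl(serverAddr, { serviceName, ip, port }) {
  const params = new URLSearchParams({
    serviceName,
    ip,
    port: String(port),
  });
  return `http://${serverAddr}/nacos/v1/ns/instance?${params}`;
}

const url = buildRegisterUrl("nacos:8848", {
  serviceName: "mb-runtime",
  ip: "10.0.0.5",
  port: 8002,
});
console.log(url);
// http://nacos:8848/nacos/v1/ns/instance?serviceName=mb-runtime&ip=10.0.0.5&port=8002
```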
import { NacosService } from "npm:mb-runtime";
const nacos = new NacosService(config);
await nacos.register();
nacos.startHeartbeat();

Usage in Your Project
Project Structure
your-project/
├── main.js - Your entry point
├── deno.json - Import map
└── import_map.json - npm:mb-runtime mapping

Example main.js
import { serve } from "https://deno.land/std@0.177.0/http/server.ts";
import { createApp, config, logger } from "npm:mb-runtime";
logger.info('Starting application...');
const app = await createApp();
serve(app.fetch, {
  port: config.port,
  onListen: ({ port }) => {
    logger.info(`Server listening on port ${port}`);
  }
});

License
MIT
Contributing
Issues and PRs welcome at https://github.com/yourusername/mb-runtime
