# @marshmallo/marlo

v0.1.3

The official TypeScript SDK for Marlo - the open-source agent learning platform.
Marlo enables AI agents to learn and improve autonomously in production. It captures agent behavior, evaluates outcomes, and turns failures into actionable learnings.
## Installation

```bash
npm install @marshmallo/marlo
```

## Quick Start

```typescript
import * as marlo from '@marshmallo/marlo';

// Initialize (connects to local server at http://localhost:8000 by default)
await marlo.init();

// Register your agent
marlo.agent('support-bot', 'You are a customer support agent.', [
  { name: 'lookup_order', description: 'Find order by ID' }
]);

// Track a task
const task = marlo.task('thread-123', 'support-bot').start();
task.input('Where is my order?');

// Your agent logic here...

task.output('Your order ships tomorrow.');
task.end();

// Shutdown before exit
await marlo.shutdown();
```

## Why Marlo?
Agents fail silently in production. The same mistakes repeat because failures aren't captured in a reusable form. Marlo solves this with a learning loop:

- **Capture** - Record LLM calls, tool calls, and outcomes
- **Evaluate** - Score task outcomes automatically
- **Learn** - Generate guidance from failures
- **Apply** - Inject learnings into future tasks
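The Apply step can be sketched as a small prompt-merging helper. The `Learnings` interface below is an assumption modeled on the `learnings_text` field shown in the Fetch Learnings example, not the SDK's actual exported type:

```typescript
// Hypothetical shape of what task.getLearnings() resolves to; only the
// learnings_text field is documented in this README.
interface Learnings {
  learnings_text?: string;
}

// Merge learnings into a system prompt, leaving it unchanged when there
// are no learnings yet (getLearnings may return nothing for a new agent).
function applyLearnings(systemPrompt: string, learnings: Learnings | null): string {
  if (learnings?.learnings_text) {
    return `${systemPrompt}\n\nLearnings:\n${learnings.learnings_text}`;
  }
  return systemPrompt;
}
```

You would call this with the result of `task.getLearnings()` before invoking your model, as the Full Example below does inline.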
## API

### Initialize

```typescript
await marlo.init();
await marlo.init('http://localhost:8000');
```

### Register Agent

```typescript
marlo.agent(name, systemPrompt, tools, mcp?, modelConfig?);
```

| Parameter | Type | Description |
|-----------|------|-------------|
| `name` | `string` | Unique agent identifier |
| `systemPrompt` | `string` | Agent's system prompt |
| `tools` | `ToolDefinition[]` | Available tools |
| `mcp` | `McpDefinition[]` | MCP servers (optional) |
| `modelConfig` | `ModelConfig` | Model settings (optional) |
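A tool list for registration can be built as plain objects. The `ToolDefinition` interface below is an assumption inferred from the `{ name, description }` shape used in the Quick Start; check the SDK's exported types for the authoritative definition:

```typescript
// Assumed minimal shape of a tool definition, matching the Quick Start
// example; the real ToolDefinition type may have additional fields.
interface ToolDefinition {
  name: string;
  description: string;
}

const tools: ToolDefinition[] = [
  { name: 'lookup_order', description: 'Find order by ID' }
];

// Registration would then be (mcp and modelConfig omitted, as they are optional):
// marlo.agent('support-bot', 'You are a customer support agent.', tools);
```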
### Track Tasks

```typescript
const task = marlo.task(threadId, agentName, threadName?).start();

task.input(text);        // User input
task.output(text);       // Agent response
task.llm({ model, usage, messages?, response? });
task.tool(name, input, output, error?);
task.reasoning(text);    // Chain-of-thought
task.error(message);     // Mark as failed
task.end();
```

### Fetch Learnings

```typescript
const learnings = await task.getLearnings();
if (learnings?.learnings_text) {
  // Inject into your agent's context
  systemPrompt += `\n\nLearnings:\n${learnings.learnings_text}`;
}
```

### Multi-Agent
```typescript
const parent = marlo.task('thread-1', 'orchestrator').start();
parent.input('Research AI trends');

const child = parent.child('researcher').start();
child.input('Search for AI trends');
child.output('Found 3 sources...');
child.end();

parent.output('Report complete');
parent.end();
```

### Shutdown

```typescript
await marlo.shutdown();
```

## Full Example
```typescript
import * as marlo from '@marshmallo/marlo';

await marlo.init();

marlo.agent('support-bot', 'You are a customer support agent.', [
  { name: 'lookup_order', description: 'Find order by ID' }
]);

async function handleMessage(input: string, threadId: string) {
  const task = marlo.task(threadId, 'support-bot').start();
  task.input(input);

  // Apply learnings from past interactions
  const learnings = await task.getLearnings();
  let systemPrompt = 'You are a customer support agent.';
  if (learnings?.learnings_text) {
    systemPrompt += `\n\nLearnings:\n${learnings.learnings_text}`;
  }

  // Track tool call
  task.tool('lookup_order', { id: '123' }, { status: 'shipped' });

  // Track LLM call (call your model with systemPrompt here; the response
  // below is hard-coded for illustration)
  const response = 'Your order ships tomorrow.';
  task.llm({
    model: 'gpt-4',
    usage: { input_tokens: 100, output_tokens: 25 },
    messages: [{ role: 'user', content: input }],
    response
  });

  task.output(response);
  task.end();

  return response;
}
```

## Requirements
- Node.js 18+

## Links

## License

MIT
