@truffle-ai/saiki

v0.3.7

Your command center for controlling computers and services with natural language - connect once, command everything

Saiki

A lightweight runtime for creating and running AI agents that turn natural language into real-world actions.


Table of Contents

  1. Why Saiki?
  2. Installation
  3. Run Modes
  4. Quick Start
  5. Programmatic API
  6. Configuration
  7. Examples & Demos
  8. Capabilities
  9. LLM Providers
  10. Standalone MCP Manager
  11. CLI Reference
  12. Next Steps
  13. Contributing
  14. Community & Support
  15. Contributors
  16. License

Why Saiki?

Saiki is the missing intelligence layer of your stack—perfect for building AI applications, standalone chatbots, or as the reasoning engine inside larger products.

The main Saiki features are:

| 💡 Feature | What it means for you |
|------------|-----------------------|
| Powerful CLI and Web UI | Saiki ships with a powerful CLI and Web UI that enable you to run AI agents in your terminal and over the web. |
| Single runtime, many interfaces | Run the same agent via CLI, Web, Discord, Telegram, or a REST/WS server. |
| Model-agnostic | Hot-swap LLMs from OpenAI, Anthropic, Gemini, Groq, or local models. |
| Unified Tooling | Connect to remote tool servers (filesystem, browser, web-search) via the Model Context Protocol (MCP). |
| Config-driven | Define agent behavior (prompts, tools, model, memory) in version-controlled YAML. |
| Production-ready Core | Leverage a multi-session chat manager, typed API, pluggable storage, and robust logging. |
| Extensible | Ship your own MCP tool servers or plug in custom services with a few lines of config. |
| Multi-Agent Systems | Enable multi-agent collaboration via MCP and A2A. |


Installation

# NPM global
npm install -g @truffle-ai/saiki

# —or— build from source
git clone https://github.com/truffle-ai/saiki.git
cd saiki && npm i && npm run build && npm link

Run Modes

| Mode | Command | Best for |
|------|---------|----------|
| Interactive CLI | saiki | Everyday automation & quick tasks |
| Web UI | saiki --mode web | Friendly chat interface w/ image support |
| Headless Server | saiki --mode server | REST & WebSocket APIs for agent interaction |
| MCP Server (Agent) | saiki --mode mcp | Exposing your agent as a tool for others via stdio |
| MCP Server (Aggregator) | saiki mcp --group-servers | Re-exposing tools from multiple MCP servers via stdio |
| Discord Bot | saiki --mode discord | Community servers & channels (Requires Setup) |
| Telegram Bot | saiki --mode telegram | Mobile chat (Requires Setup) |

Run saiki --help for all flags, sub-commands, and environment variables.


Quick Start

Set your API keys first:

export OPENAI_API_KEY=your_openai_api_key_here

Then, give Saiki a multi-step task that combines different tools:

saiki "create a new snake game in html, css, and javascript, then open it in the browser"

Saiki will use its filesystem tools to write the code and its browser tools to open the index.html file—all from a single prompt.

Then start the Web UI:

saiki --mode web

The Web UI loads any previous conversations you've had and lets you experiment with different models and MCP servers.


Programmatic API

The SaikiAgent class is the core of the runtime. The following example shows its full lifecycle: initialization, running a single task, holding a conversation, and shutting down.

import 'dotenv/config';
import { SaikiAgent, loadConfigFile } from '@truffle-ai/saiki';

const cfg  = await loadConfigFile('./agents/agent.yml');
const agent = new SaikiAgent(cfg);

await agent.start();

// Single-shot task
console.log(await agent.run('List the 5 largest files in this repo'));

// Conversation
await agent.run('Write a haiku about TypeScript');
await agent.run('Make it funnier');

agent.resetConversation();

await agent.stop();

Everything in the CLI is powered by this same class—so whatever the CLI can do, your code can too.
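For example, here is a minimal sketch of putting the agent behind an HTTP endpoint. It uses only the SaikiAgent methods shown above plus Node's built-in http module; the port and the /run route are arbitrary choices for illustration, and the reply is coerced to a string since the exact return type of agent.run is not shown here.

import 'dotenv/config';
import { createServer } from 'node:http';
import { SaikiAgent, loadConfigFile } from '@truffle-ai/saiki';

const agent = new SaikiAgent(await loadConfigFile('./agents/agent.yml'));
await agent.start();

// POST a plain-text prompt to http://localhost:8080/run to get the agent's reply.
createServer((req, res) => {
  if (req.method !== 'POST' || req.url !== '/run') {
    res.writeHead(404).end();
    return;
  }
  let prompt = '';
  req.on('data', (chunk) => (prompt += chunk));
  req.on('end', async () => {
    const reply = await agent.run(prompt); // same call the CLI uses for a one-shot prompt
    res.writeHead(200, { 'Content-Type': 'text/plain' }).end(String(reply));
  });
}).listen(8080);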

Check out our TypeScript SDK docs for a complete guide.


Configuration

Agents are defined in version-controlled YAML. A minimal example:

mcpServers:
  filesystem:
    type: stdio
    command: npx
    args: ['-y', '@modelcontextprotocol/server-filesystem', '.']
  puppeteer:
    type: stdio
    command: npx
    args: ['-y', '@truffle-ai/puppeteer-server']

llm:
  provider: openai
  model: gpt-4o
  apiKey: $OPENAI_API_KEY

systemPrompt: |
  You are Saiki, an expert coding assistant...

Change the file, reload the agent, and chat—the conversation state, memory, and tools will update.
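The same YAML can also be loaded and adjusted in code before the agent starts. A small sketch, assuming the object returned by loadConfigFile mirrors the YAML keys above (the model override is purely illustrative):

import 'dotenv/config';
import { SaikiAgent, loadConfigFile } from '@truffle-ai/saiki';

// Load the YAML shown above, then override one field before constructing the agent.
const cfg = await loadConfigFile('./agents/agent.yml');
cfg.llm.model = 'gpt-4.1-mini'; // assumes the loaded config mirrors the YAML structure

const agent = new SaikiAgent(cfg);
await agent.start();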

Check out our Configuration guide for the complete reference.


Examples & Demos

🛒 Amazon Shopping Assistant

Task: Can you go to amazon and add some snacks to my cart? I like trail mix, cheetos and maybe surprise me with something else?

# Default agent has browser tools
saiki

📧 Send Email Summaries to Slack

Task: Summarize emails and send highlights to Slack

saiki --agent ./agents/examples/email_slack.yml

More ready-to-run recipes live in agents/examples and the docs site.


Capabilities

  • Dynamic LLM Switching: Change model, provider, or routing rules mid-conversation.
  • Streaming Responses: Opt-in to receive tokens as they arrive for real-time output.
  • Multi-Session Management: Create isolated, stateful chat sessions (think workspace tabs).
  • Pluggable Memory Backends: Use the in-memory default or connect your own DB via the StorageManager.
  • Lifecycle Event Bus: Subscribe to agent events for metrics, logging, or custom side-effects.
  • Standalone MCP Manager: Use Saiki's core MCPManager in your own projects without the full agent.

LLM Providers

Saiki supports multiple LLM providers out-of-the-box, plus any OpenAI SDK-compatible provider.

  • OpenAI: gpt-4.1-mini, gpt-4o, o3, o1 and more
  • Anthropic: claude-4-sonnet-20250514, claude-3-7-sonnet-20250219, and more
  • Google: gemini-2.5-pro, gemini-2.0-flash and more
  • Groq: llama-3.3-70b-versatile, gemma-2-9b-it

Quick Setup

Set your API key and run. You can switch providers instantly via the -m flag.

# Set keys for the providers you plan to use (OpenAI is the default)
export OPENAI_API_KEY=your_openai_api_key_here
export ANTHROPIC_API_KEY=your_anthropic_api_key_here
export GOOGLE_GENERATIVE_AI_API_KEY=your_google_gemini_api_key_here
saiki

# Switch providers via CLI
saiki -m claude-3.5-sonnet-20240620
saiki -m gemini-1.5-flash-latest

For comprehensive setup instructions, see our LLM Providers Guide.


Standalone MCP Manager

Need to manage MCP tool servers without the full agent? Use the MCPManager directly in your own applications.

import { MCPManager } from '@truffle-ai/saiki';

// Create manager instance
const manager = new MCPManager();

// Connect to MCP servers
await manager.connectServer('filesystem', {
  type: 'stdio',
  command: 'npx',
  args: ['-y', '@modelcontextprotocol/server-filesystem', '.']
});

// Get all available tools across servers
const tools = await manager.getAllTools();
console.log('Available tools:', Object.keys(tools));

// Execute a tool
const result = await manager.executeTool('readFile', { path: './README.md' });
console.log('File contents:', result);

// Disconnect when done
await manager.disconnectAll();
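Because getAllTools returns tools across every connected server, one manager can front several MCP servers at once. A short sketch reusing only the calls shown above, with the puppeteer server from the configuration example added alongside the filesystem server:

import { MCPManager } from '@truffle-ai/saiki';

const manager = new MCPManager();

// Connect two servers; getAllTools() then returns the combined tool set.
await manager.connectServer('filesystem', {
  type: 'stdio',
  command: 'npx',
  args: ['-y', '@modelcontextprotocol/server-filesystem', '.']
});
await manager.connectServer('puppeteer', {
  type: 'stdio',
  command: 'npx',
  args: ['-y', '@truffle-ai/puppeteer-server']
});

console.log('Tools from both servers:', Object.keys(await manager.getAllTools()));

await manager.disconnectAll();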

See the MCP Manager Documentation for the complete API reference.


CLI Reference

Usage: saiki [options] [command] [prompt...]

The Saiki CLI allows you to talk to Saiki, build custom AI Agents, and create complex AI applications. For full documentation, visit https://github.com/truffle-ai/saiki.

Arguments:
  prompt                    Natural-language prompt to run once. If empty, starts interactive CLI.

Options:
  -v, --version             output the current version
  -a, --agent <path>        Path to agent config file (default: "agents/agent.yml")
  -s, --strict              Require all server connections to succeed
  --no-verbose              Disable verbose output
  -m, --model <model>       Specify the LLM model to use.
  -r, --router <router>     Specify the LLM router to use (vercel or in-built)
  --mode <mode>             Runtime mode: cli | web | server | discord | telegram | mcp (default: "cli")
  --web-port <port>         Optional port for the web UI (default: "3000")
  -h, --help                display help for command

Commands:
  create-app                Scaffold a new Saiki Typescript app.
  init-app                  Initialize an existing Typescript app with Saiki.
  mcp                       Run Saiki as an MCP server.

Next Steps


Contributing

We welcome contributions! Refer to our Contributing Guide for more details.

Community & Support

Saiki is built by the team at Truffle AI.
Join our Discord to share projects, ask questions, or just say hi!


If you enjoy Saiki, please give us a ⭐ on GitHub—it helps a lot!



Contributors

Thanks to all these amazing people for contributing to Saiki!



License

Elastic License 2.0. See LICENSE for full terms.