@andresaya/flowkit
v1.1.0
The simplest way to create AI conversational flows. Define agents and flows declaratively, let the LLM do the heavy lifting.
# FlowKit
The simplest way to create AI conversational flows.
Define your agent and conversation flows declaratively. Let the LLM handle the complexity.
## Features
- Simple fluent API for agents and flows
- LLM-powered extraction (no regex)
- Multi-language support
- Provider agnostic (Ollama, OpenAI, OpenRouter, etc.)
- Flexible and strict modes
- Tool calling
- Streaming (on providers that support it)
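"LLM-powered extraction" means the model, not a regex, maps a free-form reply onto a structured value. The standalone sketch below illustrates the idea only; the `extractOneOf` helper and the fake LLM are hypothetical and say nothing about FlowKit's internals:

```typescript
// Illustration only: how an extractor like oneOf([...]) could work in principle.
type Llm = (prompt: string) => Promise<string>;

async function extractOneOf(
  llm: Llm,
  options: string[],
  userMessage: string
): Promise<string | null> {
  const prompt =
    `Classify the user's message as exactly one of: ${options.join(", ")}.\n` +
    `Message: "${userMessage}"\nAnswer with a single word.`;
  const raw = (await llm(prompt)).trim().toLowerCase();
  // Validate the model's answer against the allowed options.
  return options.includes(raw) ? raw : null;
}

// A toy stand-in for a real model: picks the first option mentioned in the prompt.
const fakeLlm: Llm = async (prompt) => {
  const msg = prompt.toLowerCase();
  for (const opt of ["billing", "technical", "other"]) {
    if (msg.includes(opt)) return opt;
  }
  return "other";
};

extractOneOf(fakeLlm, ["billing", "technical", "other"], "My invoice is wrong")
  .then((v) => console.log(v)); // → "billing"
```

The validation step matters: whatever the model returns, only values from the declared option list are accepted.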
## Installation

```bash
npm install @andresaya/flowkit
# or
pnpm add @andresaya/flowkit
```

## Quick Start
```ts
import {
  agent, flow, FlowEngine, MemoryStorage, OllamaAdapter,
  name, yesNo, oneOf,
} from "@andresaya/flowkit";

const assistant = agent("Alex")
  .company("ACME Corp")
  .personality("friendly, helpful")
  .language("en")
  .build();

const supportFlow = flow("support", assistant)
  .ask("greeting", "Hi! I'm Alex. What's your name?", name(), "customer_name")
  .then("ask_type")
  .ask(
    "ask_type",
    "Nice to meet you, {{customer_name}}! How can I help?",
    oneOf(["billing", "technical", "other"]),
    "issue_type"
  )
  .when({ billing: "billing_help", technical: "tech_help", other: "general_help" })
  .say("billing_help", "I'll transfer you to our billing team, {{customer_name}}.")
  .done()
  .say("tech_help", "Let me connect you with technical support.")
  .done()
  .say("general_help", "How can I assist you today?")
  .done()
  .build();

const engine = new FlowEngine(supportFlow, {
  llm: new OllamaAdapter({ model: "llama3.2" }),
  storage: new MemoryStorage(),
});

const result = await engine.start("session-1");
console.log(result.message);

const response = await engine.handle("session-1", "I'm John");
console.log(response.message);
```
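Messages in a flow can reference earlier answers via `{{variable}}` placeholders (here `{{customer_name}}`, captured by the `name()` extractor). A standalone sketch of how such substitution behaves, purely for illustration (this is not FlowKit's actual implementation):

```typescript
// Illustration only: {{variable}} substitution against collected session data.
function renderTemplate(template: string, vars: Record<string, string>): string {
  return template.replace(/\{\{(\w+)\}\}/g, (match, key) =>
    key in vars ? vars[key] : match // leave unknown placeholders untouched
  );
}

const rendered = renderTemplate(
  "Nice to meet you, {{customer_name}}! How can I help?",
  { customer_name: "John" }
);
console.log(rendered); // → "Nice to meet you, John! How can I help?"
```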
## Documentation

See the documentation site or the docs/ folder:

- Getting Started: docs/guide/quick-start.md
- Agents: docs/guide/agents.md
- Flows: docs/guide/flows.md
- Extractors: docs/guide/extractors.md
- Providers: docs/providers/
- Storage: docs/guide/storage.md
- Tools: docs/guide/tools.md
- API Reference: docs/api/
### Run Docs Locally

```bash
pnpm run docs:dev
pnpm run docs:build
pnpm run docs:preview
```
## Providers

```ts
// Ollama (local)
import { OllamaAdapter } from "@andresaya/flowkit";
const llm = new OllamaAdapter({ model: "llama3.2" });
```

```ts
// OpenAI
import { OpenAIAdapter } from "@andresaya/flowkit";
const llm = new OpenAIAdapter({ apiKey: "...", model: "gpt-4o-mini" });
```

```ts
// OpenRouter (100+ models)
import { OpenRouterAdapter } from "@andresaya/flowkit";
const llm = new OpenRouterAdapter({ apiKey: "...", model: "anthropic/claude-3-5-sonnet" });
```
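Because the engine only depends on the adapter's surface, providers are interchangeable: build the adapter once and pass it as `llm` to `FlowEngine`. The interface below is an illustrative guess at the minimal shape such an adapter shares across providers, not FlowKit's actual type:

```typescript
// Illustration only: a minimal shape an LLM adapter might share across providers.
interface LlmAdapter {
  complete(prompt: string): Promise<string>;
}

// A stub adapter, handy for tests: echoes the prompt back.
class EchoAdapter implements LlmAdapter {
  async complete(prompt: string): Promise<string> {
    return `echo: ${prompt}`;
  }
}

// Any object with this shape could be dropped in wherever `llm` is expected.
new EchoAdapter().complete("hello").then((r) => console.log(r)); // → "echo: hello"
```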
## Examples

Working examples in examples/:

- examples/01-simple-chat.ts - basic flexible-mode example
- examples/02-strict-flow.ts - strict-mode customer service flow
- examples/feature-tools.ts - tool calling
- examples/feature-memory.ts - persistence with SQLite
- examples/feature-handoff.ts - handoff + timeouts
- examples/provider-openai.ts - OpenAI adapter setup
- examples/provider-openrouter.ts - OpenRouter adapter setup

See examples/README.md for the full list.

```bash
pnpm run dev
pnpm run dev:simple
pnpm run dev:openai
```
## Contributing

Read CONTRIBUTING.md for details.
## License

MIT - see LICENSE.
