llmist v17.5.1

Streaming-first, multi-provider LLM client in TypeScript with homegrown tool calling.

llmist implements its own tool-calling syntax called "gadgets": tools execute the moment their block is parsed, not after the response completes. No structured outputs or native tool support required; it works with any model that can follow instructions.
Installation
```shell
npm install llmist
```

Quick Start
```typescript
import { Gadget, LLMist, z } from 'llmist';

// Define a gadget (tool) with a Zod schema
class Calculator extends Gadget({
  description: 'Performs arithmetic operations',
  schema: z.object({
    operation: z.enum(['add', 'subtract', 'multiply', 'divide']),
    a: z.number(),
    b: z.number(),
  }),
}) {
  execute(params: this['params']): string {
    const { operation, a, b } = params;
    switch (operation) {
      case 'add': return String(a + b);
      case 'subtract': return String(a - b);
      case 'multiply': return String(a * b);
      case 'divide': return String(a / b);
    }
  }
}

// Run the agent
const answer = await LLMist.createAgent()
  .withModel('sonnet')
  .withGadgets(Calculator)
  .askAndCollect('What is 15 times 23?');

console.log(answer);
```

Features
- Streaming-first - Tools execute mid-stream, not after response completes
- Multi-provider - OpenAI, Anthropic, Gemini, HuggingFace with unified API
- Type-safe - Full TypeScript inference from Zod schemas
- Flexible hooks - Observers, interceptors, and controllers for deep integration
- MCP integration - Consume stdio and Streamable HTTP MCP servers; publish llmist gadgets and skills as MCP tools and prompts
- Built-in cost tracking - Real-time token counting and cost estimation
- Multimodal - Vision and audio input support
Model Context Protocol (MCP)
Attach MCP servers to an agent with the same gadget execution pipeline as native tools:
```typescript
const answer = await LLMist.createAgent()
  .withModel('sonnet')
  .withMcpServer({
    name: 'filesystem',
    transport: 'stdio',
    command: 'npx',
    args: ['-y', '@modelcontextprotocol/server-filesystem', '/tmp'],
    timeoutMs: 30000,
  })
  .askAndCollect('List files in /tmp');
```

llmist also exports `createMcpServer({ gadgets, skills })` so applications can publish native gadgets as MCP tools and skills as MCP prompts.
Providers
Set one of these environment variables:
```shell
export OPENAI_API_KEY="sk-..."
export ANTHROPIC_API_KEY="sk-ant-..."
export GEMINI_API_KEY="..."
export HF_TOKEN="hf_..."
```

Use model aliases for convenience:
```typescript
.withModel('sonnet')  // Claude 3.5 Sonnet
.withModel('opus')    // Claude Opus 4
.withModel('gpt4o')   // GPT-4o
.withModel('flash')   // Gemini 2.0 Flash
```

Documentation
Full documentation at llmist.dev
Examples
See the examples directory for runnable examples covering all features.
Related Packages
- @llmist/cli - Command-line interface
- @llmist/testing - Testing utilities and mocks
License
MIT
