# @output.ai/llm (v0.2.5)
## LLM Module

Framework abstraction for interacting with LLM models, including prompt management and structured generation.
## Quick Start

```ts
import { generateText } from '@output.ai/llm';

const response = await generateText({
  prompt: 'my_prompt@v1',
  variables: { topic: 'AI workflows' }
});
```

## Features
- Unified API: Single import for prompt loading and LLM generation
- Multiple Generation Types: Text, objects, arrays, and enums
- Prompt Management: Load and render `.prompt` files with variable interpolation
- Multi-Provider Support: Anthropic, OpenAI, and Azure
- Type Safety: Full TypeScript support with Zod schemas
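Prompt references such as `'my_prompt@v1'` pair a prompt name with a version tag. As a rough illustration of the format (the helper `parsePromptRef` below is hypothetical, not part of the package API):

```typescript
// Hypothetical helper illustrating the 'name@version' reference format.
// Not part of @output.ai/llm -- a sketch only.
interface PromptRef {
  name: string;
  version: string;
}

function parsePromptRef(ref: string): PromptRef {
  const at = ref.lastIndexOf('@');
  if (at <= 0 || at === ref.length - 1) {
    throw new Error(`Invalid prompt reference: ${ref}`);
  }
  return { name: ref.slice(0, at), version: ref.slice(at + 1) };
}

// parsePromptRef('my_prompt@v1') -> { name: 'my_prompt', version: 'v1' }
```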
## Generate Text

Generate unstructured text from an LLM:

```ts
import { generateText } from '@output.ai/llm';

const response = await generateText({
  prompt: 'explain_topic@v1',
  variables: { topic: 'machine learning' }
});
```

## Generate Object
Generate a structured object matching a Zod schema:
```ts
import { generateObject } from '@output.ai/llm';
import { z } from '@output.ai/core';

const recipeSchema = z.object({
  title: z.string(),
  ingredients: z.array(z.string()),
  steps: z.array(z.string())
});

const recipe = await generateObject({
  prompt: 'recipe@v1',
  variables: { dish: 'lasagna' },
  schema: recipeSchema
});
```

## Generate Array
Generate an array of structured items:
```ts
import { generateArray } from '@output.ai/llm';
import { z } from '@output.ai/core';

const taskSchema = z.object({
  title: z.string(),
  priority: z.number()
});

const tasks = await generateArray({
  prompt: 'task_list@v1',
  variables: { project: 'website' },
  schema: taskSchema
});
```

## Generate Enum
Generate a value from a list of allowed options:
```ts
import { generateEnum } from '@output.ai/llm';

const category = await generateEnum({
  prompt: 'categorize@v1',
  variables: { text: 'Product announcement' },
  enum: ['marketing', 'engineering', 'sales', 'support']
});
```

## Prompt Files
Prompt files use YAML frontmatter for configuration and support LiquidJS templating:

File: `[email protected]`

```
---
provider: anthropic
model: claude-sonnet-4-20250514
temperature: 0.7
---
<system>
You are a concise technical explainer.
</system>
<user>
Explain {{ topic }} in 3 bullet points.
</user>
```

## Configuration Options
Prompt files support these configuration fields:

```
---
provider: anthropic | openai | azure
model: model-name
temperature: 0.0-1.0 (optional)
maxTokens: number (optional)
providerOptions: (optional)
  thinking:
    type: enabled
    budgetTokens: number
---
```
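The `{{ topic }}` placeholder in the prompt body is LiquidJS output syntax. In spirit, variable interpolation behaves like the following plain-TypeScript sketch (the package actually uses LiquidJS, which supports filters, tags, and much more; `renderTemplate` is a hypothetical stand-in, not the real implementation):

```typescript
// Hypothetical sketch of {{ variable }} interpolation.
// The package uses LiquidJS; this only mimics bare variable output.
function renderTemplate(
  template: string,
  variables: Record<string, string>
): string {
  // Replace each {{ name }} with its value; leave unknown names untouched.
  return template.replace(/\{\{\s*(\w+)\s*\}\}/g, (match, key) =>
    key in variables ? variables[key] : match
  );
}

// renderTemplate('Explain {{ topic }} in 3 bullet points.',
//                { topic: 'machine learning' })
// -> 'Explain machine learning in 3 bullet points.'
```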