prompt-as-endpoint
Helper library for creating HTTP endpoints that use LLMs to return schema-validated JSON responses. Declare typed endpoints with Zod input/output schemas, prompt templates, and built-in retries.
Installation
npm install prompt-as-endpoint
Usage
import { createEndpointHandler } from 'prompt-as-endpoint';
import { z } from 'zod';
const handler = createEndpointHandler({
inputSchema: z.object({ name: z.string(), style: z.string() }),
outputSchema: z.object({ greeting: z.string(), mood: z.string() }),
prompt: 'Greet {name} in a {style} way. Respond with JSON only.',
call: async (messages) => {
// call your LLM here and return its raw response string (see the sketch below)
},
});
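The call option is where you plug in your model provider: it receives the prompt as messages and must return the model's raw text response, which the library then parses and validates against outputSchema. A minimal sketch using the OpenAI SDK (an assumed provider choice, not required by the library), assuming the library passes OpenAI-style { role, content } messages:

import OpenAI from 'openai';

const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment

const call = async (messages) => {
  // hypothetical wiring: any client that returns the model's text as a string works
  const completion = await openai.chat.completions.create({
    model: 'gpt-4o-mini', // illustrative model name
    messages,
  });
  return completion.choices[0].message.content ?? '';
};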
handler(input) returns a typed, validated result.
Framework integration
The handler is a plain async function. Wire it into your framework with a few lines:
Hono
app.post('/greet', async (c) => {
const parsed = handler.inputSchema.safeParse(await c.req.json());
if (!parsed.success) return c.json({ error: parsed.error.flatten() }, 400);
return c.json(await handler(parsed.data));
});
Express
app.post('/greet', async (req, res) => {
const parsed = handler.inputSchema.safeParse(req.body);
if (!parsed.success) return res.status(400).json({ error: parsed.error.flatten() });
res.json(await handler(parsed.data));
});
Next.js App Router
// app/greet/route.ts
import { NextRequest, NextResponse } from 'next/server';

export async function POST(request: NextRequest) {
const parsed = handler.inputSchema.safeParse(await request.json());
if (!parsed.success) return NextResponse.json({ error: parsed.error.flatten() }, { status: 400 });
return NextResponse.json(await handler(parsed.data));
}
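Whichever framework you use, the mounted route accepts JSON matching the input schema and responds with JSON matching the output schema. A hypothetical client call (URL, port, and field values are illustrative):

const res = await fetch('http://localhost:3000/greet', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ name: 'Ada', style: 'pirate' }),
});
const data = await res.json(); // shaped like the output schema: { greeting, mood }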
License
MIT
