# @rodger/nextjs
Production-ready AI chat handlers for Next.js with authentication, storage, and tool approval.
The perfect companion to the Vercel AI SDK - it adds production features without hiding the AI SDK.
## Features
- 🚀 Drop-in Next.js API Routes - One function call to create a production-ready chat endpoint
- 🔐 Auto-detect Authentication - Works with Clerk, Supabase, NextAuth, Auth0
- 💾 Pluggable Storage - Supabase, PostgreSQL, Redis, or in-memory
- ✅ Tool Approval - Require user confirmation for sensitive operations
- 🎯 Transparent AI SDK Integration - Built on top of AI SDK, not replacing it
- 📦 TypeScript - Full type safety
- 🔌 Extensible - Customize everything
## Installation
```bash
npm install @rodger/nextjs ai

# Choose your AI provider
npm install @ai-sdk/openai
# or
npm install @ai-sdk/anthropic
```

## Quick Start
### Basic Chat Handler
```ts
// app/api/chat/route.ts
import { createChatHandler } from '@rodger/nextjs';
import { openai } from '@ai-sdk/openai';

export const POST = createChatHandler({
  model: openai('gpt-4o'),
  systemPrompt: 'You are a helpful assistant.'
});
```

That's it! You now have a production-ready chat endpoint at `/api/chat`.
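Because the handler is built on the AI SDK (see "Why @rodger/nextjs?" below), an AI SDK chat client can consume this endpoint directly. Here is a minimal sketch, assuming AI SDK 5's `useChat` hook from `@ai-sdk/react` (that hook belongs to the AI SDK, not to this package, so adjust it to whichever AI SDK version you have installed):

```tsx
// app/chat/page.tsx (sketch; assumes AI SDK 5's useChat, not part of @rodger/nextjs)
'use client';

import { useState } from 'react';
import { useChat } from '@ai-sdk/react';
import { DefaultChatTransport } from 'ai';

export default function ChatPage() {
  const [input, setInput] = useState('');
  // Point the hook at the route created by createChatHandler
  const { messages, sendMessage, status } = useChat({
    transport: new DefaultChatTransport({ api: '/api/chat' })
  });

  return (
    <div>
      {messages.map((message) => (
        <div key={message.id}>
          {message.role}:{' '}
          {message.parts.map((part) => (part.type === 'text' ? part.text : '')).join('')}
        </div>
      ))}
      <form
        onSubmit={(e) => {
          e.preventDefault();
          if (!input.trim()) return;
          sendMessage({ text: input });
          setInput('');
        }}
      >
        <input
          value={input}
          onChange={(e) => setInput(e.target.value)}
          disabled={status !== 'ready'}
        />
      </form>
    </div>
  );
}
```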
### With Authentication
```ts
// app/api/chat/route.ts
import { createChatHandler } from '@rodger/nextjs';
import { openai } from '@ai-sdk/openai';

export const POST = createChatHandler({
  model: openai('gpt-4o'),
  systemPrompt: 'You are a helpful assistant.',
  auth: true // Auto-detects Clerk, Supabase, NextAuth, Auth0
});
```

### With Storage (Supabase)
```ts
// app/api/chat/route.ts
import { createChatHandler, createSupabaseStorage } from '@rodger/nextjs';
import { openai } from '@ai-sdk/openai';
import { createClient } from '@supabase/supabase-js';

const supabase = createClient(
  process.env.NEXT_PUBLIC_SUPABASE_URL!,
  process.env.SUPABASE_SERVICE_ROLE_KEY!
);

const storage = createSupabaseStorage({
  connection: supabase,
  tables: {
    messages: 'chat_messages',
    sessions: 'chat_sessions'
  }
});

export const POST = createChatHandler({
  model: openai('gpt-4o'),
  systemPrompt: 'You are a helpful assistant.',
  auth: true,
  storage
});
```

### With Tools
```ts
// app/api/chat/route.ts
import { createChatHandler } from '@rodger/nextjs';
import { openai } from '@ai-sdk/openai';
import { tool } from 'ai';
import { z } from 'zod';

const weatherTool = tool({
  description: 'Get current weather',
  parameters: z.object({
    location: z.string()
  }),
  execute: async ({ location }) => {
    // Your weather API call
    return { temperature: 72, conditions: 'Sunny' };
  }
});

export const POST = createChatHandler({
  model: openai('gpt-4o'),
  systemPrompt: 'You are a helpful assistant with access to weather data.',
  tools: {
    getCurrentWeather: weatherTool
  }
});
```

### With Tool Approval
Require user confirmation before executing sensitive tools:
```ts
import { createChatHandler, withApproval } from '@rodger/nextjs';
import { openai } from '@ai-sdk/openai';
import { tool } from 'ai';
import { z } from 'zod';

const deleteUserTool = withApproval(
  tool({
    description: 'Delete a user account',
    parameters: z.object({
      userId: z.string()
    }),
    execute: async ({ userId }) => {
      // Delete user logic
      return { success: true };
    }
  }),
  {
    message: (args) => `Are you sure you want to delete user ${args.userId}?`,
    timeout: 30000 // 30 seconds
  }
);

export const POST = createChatHandler({
  model: openai('gpt-4o'),
  tools: {
    deleteUser: deleteUserTool
  }
});
```

## API Reference
### `createChatHandler(config, options?)`
Create a Next.js route handler for AI chat.
#### Config
| Property | Type | Description |
|----------|------|-------------|
| `model` | `any` | **Required.** AI SDK model instance (e.g., `openai('gpt-4o')`) |
| `systemPrompt` | `string \| (ctx) => string` | System prompt for the agent |
| `tools` | `Record<string, CoreTool>` | Tools available to the agent |
| `storage` | `StorageAdapter` | Storage adapter for message persistence |
| `auth` | `boolean \| AuthContext` | Enable authentication (auto-detects provider) |
| `temperature` | `number` | Temperature for the model (0-2) |
| `maxTokens` | `number` | Maximum tokens to generate |
| `maxSteps` | `number` | Maximum number of tool call steps |
| `onStart` | `(ctx) => void` | Callback before streaming starts |
| `onComplete` | `(ctx, result) => void` | Callback when streaming completes |
| `onError` | `(ctx, error) => void` | Callback on error |
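Only `model` is required. A sketch combining several of the optional settings (the `ctx`, `result`, and `error` values are simply logged here, since their exact shapes are defined by the package):

```ts
// app/api/chat/route.ts
import { createChatHandler } from '@rodger/nextjs';
import { openai } from '@ai-sdk/openai';

export const POST = createChatHandler({
  model: openai('gpt-4o'),
  systemPrompt: 'You are a helpful assistant.',
  temperature: 0.3, // keep answers focused
  maxTokens: 1024,  // cap generation length
  maxSteps: 5,      // limit chained tool calls
  onStart: (ctx) => {
    console.log('chat started', ctx);
  },
  onComplete: (ctx, result) => {
    console.log('chat completed', result);
  },
  onError: (ctx, error) => {
    console.error('chat failed', error);
  }
});
```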
## Storage Adapters
### `createMemoryStorage()`
In-memory storage (for development only):
```ts
import { createMemoryStorage } from '@rodger/nextjs';

const storage = createMemoryStorage();
```
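Because the data lives in process memory, history is lost on every restart and is not shared across serverless instances. A common pattern (a sketch, not a package requirement) is to fall back to it only outside production:

```ts
import { createMemoryStorage, createSupabaseStorage } from '@rodger/nextjs';
import { createClient } from '@supabase/supabase-js';

// Use Supabase in production, in-memory storage everywhere else
const storage =
  process.env.NODE_ENV === 'production'
    ? createSupabaseStorage({
        connection: createClient(
          process.env.NEXT_PUBLIC_SUPABASE_URL!,
          process.env.SUPABASE_SERVICE_ROLE_KEY!
        ),
        tables: { messages: 'chat_messages', sessions: 'chat_sessions' }
      })
    : createMemoryStorage();
```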
### `createSupabaseStorage(config)`

Supabase storage adapter:
```ts
import { createSupabaseStorage } from '@rodger/nextjs';
import { createClient } from '@supabase/supabase-js';

const supabase = createClient(
  process.env.NEXT_PUBLIC_SUPABASE_URL!,
  process.env.SUPABASE_SERVICE_ROLE_KEY!
);

const storage = createSupabaseStorage({
  connection: supabase,
  tables: {
    messages: 'chat_messages',
    sessions: 'chat_sessions'
  }
});
```

See [Database Schema](#database-schema) for required tables.
### `createPostgresStorage(config)`
PostgreSQL storage adapter:
```ts
import { createPostgresStorage } from '@rodger/nextjs';
import { Pool } from 'pg';

const pool = new Pool({
  connectionString: process.env.DATABASE_URL
});

const storage = createPostgresStorage({
  connection: pool,
  tables: {
    messages: 'chat_messages',
    sessions: 'chat_sessions'
  }
});
```

### `createRedisStorage(config)`
Redis storage adapter:
```ts
import { createRedisStorage } from '@rodger/nextjs';
import { createClient } from 'redis';

const redis = createClient({
  url: process.env.REDIS_URL
});
await redis.connect();

const storage = createRedisStorage({
  connection: redis,
  options: {
    keyPrefix: 'chat:',
    messageTTL: 86400 * 30 // 30 days
  }
});
```

## Authentication
### Auto-detect (Recommended)
```ts
export const POST = createChatHandler({
  model: openai('gpt-4o'),
  auth: true // Auto-detects provider
});
```

Supports:

- Clerk (`@clerk/nextjs`)
- Supabase (`@supabase/ssr`)
- NextAuth (`next-auth`)
- Auth0 (`@auth0/nextjs-auth0`)
### Specific Provider
```ts
import { detectClerkAuth } from '@rodger/nextjs';

export const POST = createChatHandler({
  model: openai('gpt-4o'),
  auth: detectClerkAuth
});
```

## Tool Approval
### `withApproval(tool, config?)`
Wrap a tool to require user approval:
```ts
import { withApproval } from '@rodger/nextjs';
import { tool } from 'ai';
import { z } from 'zod';

const sensitiveTool = withApproval(
  tool({
    description: 'Perform sensitive operation',
    parameters: z.object({
      data: z.string()
    }),
    execute: async ({ data }) => {
      // Sensitive operation
      return { success: true };
    }
  }),
  {
    message: 'Approve this operation?',
    timeout: 60000, // 1 minute
    autoApproveOnTimeout: false
  }
);
```

Config:

- `message`: Message string, or a function of the tool arguments that returns the message
- `timeout`: Timeout in milliseconds (default: 60000)
- `autoApproveOnTimeout`: Auto-approve after the timeout (default: false)
- `onApprovalRequest`: Custom approval handler
## Database Schema
### Supabase / PostgreSQL
```sql
-- Messages table
CREATE TABLE chat_messages (
  id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
  session_id TEXT NOT NULL,
  role TEXT NOT NULL,
  content TEXT NOT NULL,
  user_id TEXT,
  metadata JSONB,
  created_at TIMESTAMPTZ DEFAULT NOW()
);

CREATE INDEX idx_messages_session ON chat_messages(session_id);
CREATE INDEX idx_messages_user ON chat_messages(user_id);

-- Sessions table
CREATE TABLE chat_sessions (
  id TEXT PRIMARY KEY,
  user_id TEXT,
  metadata JSONB,
  created_at TIMESTAMPTZ DEFAULT NOW(),
  updated_at TIMESTAMPTZ DEFAULT NOW(),
  last_message TEXT,
  message_count INTEGER DEFAULT 0
);

CREATE INDEX idx_sessions_user ON chat_sessions(user_id);
```

## Examples
See the examples directory for complete examples:
- Basic Chat - Simple chat with no auth
- Authenticated Chat - Chat with Clerk auth
- Persistent Chat - Chat with Supabase storage
- Tool Approval - Chat with approval flow
- Full Featured - All features combined
## TypeScript
Full TypeScript support with exported types:
```ts
import type {
  ChatHandlerConfig,
  ChatContext,
  StorageAdapter,
  AuthContext,
  ToolWithApproval
} from '@rodger/nextjs';
```
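As a sketch that uses only the config properties documented above, `ChatHandlerConfig` can type a shared base configuration that several routes reuse:

```ts
import { createChatHandler } from '@rodger/nextjs';
import type { ChatHandlerConfig } from '@rodger/nextjs';
import { openai } from '@ai-sdk/openai';

// Shared, explicitly typed base configuration
const baseConfig: ChatHandlerConfig = {
  model: openai('gpt-4o'),
  systemPrompt: 'You are a helpful assistant.',
  temperature: 0.3,
  auth: true
};

// app/api/chat/route.ts
export const POST = createChatHandler(baseConfig);
```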
## Why @rodger/nextjs?

The Vercel AI SDK is amazing for LLM integration, but production apps need more:
- ✅ Authentication - Who is this user?
- ✅ Storage - Save conversation history
- ✅ Tool Approval - Prevent accidental/malicious actions
- ✅ Session Management - Track conversations
- ✅ Error Handling - Graceful failures
`@rodger/nextjs` provides these features while staying 100% transparent with the AI SDK. You're still using `streamText()` under the hood - we just handle the production stuff.
## Peer Dependencies
The AI SDK is an optional peer dependency. Install the version you want:
```json
{
  "peerDependencies": {
    "ai": "^5.0.0",
    "@ai-sdk/openai": "^2.0.0",
    "@ai-sdk/anthropic": "^2.0.0",
    "next": "^14.0.0 || ^15.0.0"
  }
}
```

This means:
- No version lock-in
- No conflicts if you already use AI SDK
- Full control over AI SDK version
- Smaller bundle size
## License
MIT
## Contributing
See CONTRIBUTING.md
