
react-ai-agent-chat-sdk v0.2.9

React AI Agent Chat SDK

A React library for building AI-powered chat interfaces with tool execution, configurable timeouts, retry logic, and custom renderers. Caution: parts of this library were vibe-coded; I'll iterate on it over time to make it more robust.

Quick Start

1. Install the Package

npm install react-ai-agent-chat-sdk
# or
pnpm add react-ai-agent-chat-sdk

Peer Dependencies:

npm install react react-dom zod

AI Provider (choose one):

# For Anthropic Claude models
npm install @ai-sdk/anthropic

# For OpenAI models  
npm install @ai-sdk/openai

2. Define Your Tools

Create tools with Zod schemas for type-safe input validation:

import { promises as fs } from 'node:fs';
import { z } from 'zod';
import { createTool } from 'react-ai-agent-chat-sdk/config-server';

const readFileSchema = z.object({
  file_path: z.string().describe('The path to the file to read'),
});

const tools = {
  read_file: createTool({
    description: 'Read the contents of a file',
    display_name: "Reading file",
    inputSchema: readFileSchema,
    execute: async ({ file_path }) => {
      const content = await fs.readFile(file_path, 'utf-8');
      return { file_path, content };
    }
  })
};

3. Define Server Configuration

Create server configuration for API routes and tool execution:

import { makeAgentChatRouteConfig } from 'react-ai-agent-chat-sdk/config-server';
import { anthropic } from '@ai-sdk/anthropic';
import { MemoryStorage } from 'react-ai-agent-chat-sdk/storage';

const agentChatRouteConfig = makeAgentChatRouteConfig({
  system_prompt: `You are a helpful assistant with access to file management tools.`,
  tools,
  auth_func: async () => true, // Replace with your auth logic
  storage: new MemoryStorage(), // Use your preferred storage
  modelConfig: {
    model: anthropic('claude-sonnet-4-20250514'),
    temperature: 0.3
  }
});

4. Define Client Configuration

Create client configuration for the React UI:

import { makeAgentChatClientConfig } from 'react-ai-agent-chat-sdk/config-client';

const agentChatClientConfig = makeAgentChatClientConfig({
  tools: {
    read_file: {
      display_name: "Reading file"
    }
  },
  route: "/api/chat", // Your chat API endpoint
  // Optional: custom tool renderers
  toolRenderers: {
    read_file: CustomFileRenderer
  }
});

5. Add Chat and History Routes

Create API routes for chat and history:

For Next.js App Router (app/api/chat/route.ts):

import { chatRoute } from 'react-ai-agent-chat-sdk/api';
import { agentChatRouteConfig } from '@/lib/agent-config';

export async function POST(req: Request) {
  return chatRoute(agentChatRouteConfig, req);
}

History Route (app/api/chat/history/route.ts):

import { chatHistoryRoute } from 'react-ai-agent-chat-sdk/api';
import { agentChatRouteConfig } from '@/lib/agent-config';

export async function GET(req: Request) {
  return chatHistoryRoute(agentChatRouteConfig, req);
}

For Express.js (server.js):

import express from 'express';
import { AgentChatRoute } from 'react-ai-agent-chat-sdk/api';
import { agentChatRouteConfig } from './lib/agent-config.js';

const app = express();

// Chat endpoint
app.post('/api/chat', AgentChatRoute(agentChatRouteConfig));

// History endpoint
app.get('/api/chat/history', AgentChatRoute({
  ...agentChatRouteConfig,
  method: 'GET'
}));

app.listen(3000);

6. Add AgentChat UI Element

Use the chat component in your React app:

'use client';

import { useEffect, useState } from 'react';
import { AgentChat } from 'react-ai-agent-chat-sdk';
import 'react-ai-agent-chat-sdk/agent-chat.css';
import { agentChatClientConfig } from '@/lib/agent-chat-client-config';

export default function ChatPage() {
  const [conversationId, setConversationId] = useState<string>('');
  
  useEffect(() => {
    // Load or create conversation ID for persistence
    let id = localStorage.getItem('current-conversation-id');
    if (!id) {
      id = `conv_${crypto.randomUUID()}`;
      localStorage.setItem('current-conversation-id', id);
    }
    setConversationId(id);
  }, []);
  
  if (!conversationId) {
    return <div>Loading...</div>;
  }
  
  return (
    <AgentChat 
      config={agentChatClientConfig} 
      conversationId={conversationId} 
    />
  );
}

Architecture Overview

The SDK now uses a frontend/backend separation architecture:

  • Backend Configuration (config-server): Handles tool execution, AI model configuration, authentication, and storage. Used in API routes.
  • Frontend Configuration (config-client): Handles UI rendering, tool display names, and custom renderers. Used in React components.
  • Shared Types: Both configurations share the same tool definitions and conversation structure.
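The shared-definition idea can be sketched in plain TypeScript (this is an illustration of the pattern, not SDK code): declare the tool metadata once, derive a union of tool names from it, and type both the client and server shapes against that union so they cannot drift apart.

```typescript
// Hypothetical tool metadata declared once; names and shapes are
// illustrative, not the SDK's actual types.
const toolMeta = {
  read_file: { display_name: 'Reading file' },
  write_file: { display_name: 'Writing file' },
} as const;

type ToolName = keyof typeof toolMeta; // 'read_file' | 'write_file'

// Client-side config only needs display metadata...
const clientTools: Record<ToolName, { display_name: string }> = toolMeta;

// ...while server-side config pairs the same names with executors.
// Omitting a name (or adding an extra one) is a compile-time error.
const serverTools: Record<ToolName, () => Promise<unknown>> = {
  read_file: async () => ({ ok: true }),
  write_file: async () => ({ ok: true }),
};

console.log(Object.keys(clientTools).length, Object.keys(serverTools).length);
```

Typing both sides against the same `ToolName` union is what keeps the `tools` keys in `makeAgentChatRouteConfig` and `makeAgentChatClientConfig` in sync.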

Customization

Tool Renderers

Create custom renderers for specific tools:

import { ToolCall, ToolResult } from 'react-ai-agent-chat-sdk/config';

export function CustomFileRenderer({ toolCall, toolResult }: { 
  toolCall: ToolCall; 
  toolResult?: ToolResult 
}) {
  const hasError = toolResult?.output?.__toolError;
  const isTimeout = hasError && toolResult?.output?.__errorType === 'ToolTimeoutError';
  
  const getStatusText = () => {
    if (isTimeout) return 'Timed out';
    if (hasError) return 'Error';
    if (toolResult?.output) return 'Completed';
    return 'Running';
  };

  return (
    <div className={`custom-renderer ${hasError ? 'error' : ''}`}>
      <div>📁 {toolCall.toolName} - {getStatusText()}</div>
      {toolResult?.output && (
        <pre>{JSON.stringify(toolResult.output, null, 2)}</pre>
      )}
    </div>
  );
}

Add renderers to your configuration:

// lib/agent-chat-client-config.ts
import { makeAgentChatClientConfig } from 'react-ai-agent-chat-sdk/config-client';
import { CustomFileRenderer } from './renderers';

export const agentChatClientConfig = makeAgentChatClientConfig({
  tools: {
    read_file: {
      display_name: "Reading file"
    }
  },
  route: "/api/chat",
  toolRenderers: {
    read_file: CustomFileRenderer,
  }
});

Route Parameters

Customize API endpoints to fit your application structure:

// Server config
const agentChatRouteConfig = makeAgentChatRouteConfig({
  system_prompt: "You are a helpful assistant.",
  tools,
  auth_func: async () => true,
  storage: new MemoryStorage()
});

// Client config
const agentChatClientConfig = makeAgentChatClientConfig({
  tools: {
    // Tool definitions for display
  },
  route: "/api/v1/chat", // Custom chat route
  historyRoute: "/api/v1/history" // Custom history route (optional)
});

Retry Configurations

Configure timeouts and retries globally and per-tool:

Global Configuration:

const agentChatRouteConfig = makeAgentChatRouteConfig({
  system_prompt: "You are a helpful assistant.",
  tools,
  auth_func: async () => true,
  storage: new MemoryStorage(),
  toolExecutionConfig: {
    timeoutMs: 30000, // 30 seconds default
    retries: 3,
    retryDelayMs: 1000 // 1 second initial delay
  }
});

Per-Tool Configuration:

const tools = {
  slow_operation: createTool({
    description: 'A slow operation that needs longer timeout',
    display_name: "Processing data",
    inputSchema: z.object({}),
    execute: async () => {
      // Long-running operation
    },
    executionConfig: {
      timeoutMs: 60000, // 1 minute timeout
      retries: 1, // Only 1 retry
      retryDelayMs: 5000 // 5 second delay
    }
  })
};
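To make the semantics of `timeoutMs`, `retries`, and `retryDelayMs` concrete, here is a generic timeout-and-retry wrapper (a sketch of the behavior these options describe, not the SDK's actual implementation; whether the SDK backs off linearly or exponentially is an assumption to verify):

```typescript
// Reject the promise if it does not settle within `ms` milliseconds.
async function withTimeout<T>(p: Promise<T>, ms: number): Promise<T> {
  let timer: ReturnType<typeof setTimeout>;
  const timeout = new Promise<never>((_, reject) => {
    timer = setTimeout(() => reject(new Error('ToolTimeoutError')), ms);
  });
  try {
    return await Promise.race([p, timeout]);
  } finally {
    clearTimeout(timer!);
  }
}

// Run `fn` up to `retries + 1` times, each attempt bounded by `timeoutMs`,
// waiting between attempts (growing linearly from `retryDelayMs` here).
async function executeWithRetry<T>(
  fn: () => Promise<T>,
  { timeoutMs = 30000, retries = 3, retryDelayMs = 1000 } = {}
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt <= retries; attempt++) {
    try {
      return await withTimeout(fn(), timeoutMs);
    } catch (err) {
      lastError = err;
      if (attempt < retries) {
        await new Promise((r) => setTimeout(r, retryDelayMs * (attempt + 1)));
      }
    }
  }
  throw lastError;
}
```

Under this reading, `retries: 1` in the per-tool config above means at most two total attempts, and a timed-out attempt counts as a failure that triggers a retry.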

Storage Configuration:

import { MemoryStorage } from 'react-ai-agent-chat-sdk/storage';

// For development
const storage = new MemoryStorage();

// For production, implement ChatStorage interface
class MyStorage implements ChatStorage {
  async saveMessage(conversationId: string, message: ChatMessage): Promise<void> {
    // Save to your database
  }
  
  async getConversation(conversationId: string): Promise<Conversation | null> {
    // Retrieve from your database
  }
}

const agentChatRouteConfig = makeAgentChatRouteConfig({
  system_prompt: "You are a helpful assistant.",
  tools,
  auth_func: async () => true,
  storage // Add storage for conversation persistence
});
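As an illustration of what a complete `ChatStorage` implementation might look like, here is a Map-backed version. The interface and message/conversation shapes below are approximated locally for the sketch; the SDK exports the real types, and its contract may include more methods than the two shown.

```typescript
// Types approximated for illustration; use the SDK's exported types instead.
interface ChatMessage { role: string; content: string }
interface Conversation { id: string; messages: ChatMessage[] }

interface ChatStorage {
  saveMessage(conversationId: string, message: ChatMessage): Promise<void>;
  getConversation(conversationId: string): Promise<Conversation | null>;
}

// In-memory storage keyed by conversation ID; swap the Map for real
// database calls in production.
class MapStorage implements ChatStorage {
  private conversations = new Map<string, Conversation>();

  async saveMessage(conversationId: string, message: ChatMessage): Promise<void> {
    const conv = this.conversations.get(conversationId)
      ?? { id: conversationId, messages: [] };
    conv.messages.push(message);
    this.conversations.set(conversationId, conv);
  }

  async getConversation(conversationId: string): Promise<Conversation | null> {
    return this.conversations.get(conversationId) ?? null;
  }
}
```

A database-backed version follows the same shape: append on `saveMessage`, fetch ordered messages on `getConversation`, and return `null` for unknown IDs.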

Model Configuration:

import { openai } from '@ai-sdk/openai';
import { messageCountIs } from 'ai';

const agentChatRouteConfig = makeAgentChatRouteConfig({
  system_prompt: "You are a helpful assistant.",
  tools,
  auth_func: async () => true,
  storage: new MemoryStorage(),
  modelConfig: {
    model: openai('gpt-4o'), // Use different AI models
    temperature: 0.7,
    stopWhen: messageCountIs(10), // Stop after 10 messages
    onStepFinish: (step) => {
      console.log('Step finished:', step.finishReason);
    }
  }
});