create-llm-agent

Quickly scaffold a TypeScript-based LLM agent application with local or remote AI models

🚀 Quick Start

Create a new LLM agent project with a single command:

npx create-llm-agent my-agent

Or using yarn:

yarn create llm-agent my-agent

🎯 Features

  • OpenAI Compatible: Supports OpenAI and any OpenAI-compatible API provider, including local models
  • TypeScript: Full TypeScript support out of the box
  • Local Database: JSON-based local storage for conversations and memory
  • Configurable Tools: Optional tool/function calling system that can be enabled or disabled
  • System Prompt Options: Choose from General Assistant, Code Assistant, or create custom prompts
  • Interactive CLI: Beautiful command-line interface for chatting with your agent
  • Zero Config: Works out of the box with sensible defaults
  • Modular Architecture: Clean, organized code structure

📋 Requirements

  • Node.js 20.0 or higher
  • npm, yarn, or pnpm
  • For local models: Ollama installed and running

🛠️ Installation & Usage

Interactive Mode (Recommended)

Simply run the create command and follow the prompts:

npx create-llm-agent

You'll be asked to:

  1. Choose a project name
  2. Configure your LLM settings (model, API key, base URL)
  3. Enable or disable tool calling
  4. Select a system prompt (General, Coding, or Custom)

Quick Mode

Provide the project name directly:

npx create-llm-agent my-agent

🏗️ Project Structure

Your generated project will have the following structure:

my-agent/
├── src/
│   ├── agent/          # Agent logic and orchestration
│   ├── cli/            # CLI utilities
│   ├── config/         # Configuration and system prompts
│   ├── constants/      # Application constants
│   ├── llm/            # LLM client implementations
│   ├── memory/         # Memory and storage management
│   ├── tools/          # Tool definitions and runner
│   ├── types/          # TypeScript type definitions
│   ├── ui/             # User interface components
│   └── index.ts        # Main entry point
├── db.json             # Local JSON database
├── .env                # Environment configuration
├── package.json
└── tsconfig.json
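
The db.json file is the agent's local store for conversations and memory. Its exact shape is defined by the generated src/types/ and src/memory/ code; purely as a hypothetical sketch, a JSON-backed conversation store along these lines usually boils down to something like:

// Hypothetical sketch only: the real definitions live in src/types/ of the
// generated project and may differ.
interface StoredMessage {
  role: "system" | "user" | "assistant" | "tool";
  content: string;
  createdAt: string; // ISO-8601 timestamp
}

interface LocalDatabase {
  messages: StoredMessage[];
}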

🎮 Available Scripts

Once your project is created, you can use these commands:

  • npm run dev - Start the chat interface in development mode
  • npm run build - Build the project for production
  • npm start - Run the production build
  • npm run clear - Clear conversation history

🔧 Configuration

Environment Variables

The .env file contains your LLM configuration:

# LLM Configuration
LLM_MODEL=gpt-4o-mini
LLM_API_KEY=sk-your-api-key-here
LLM_BASE_URL=https://api.openai.com/v1
LLM_TOOL_CALLING=true
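
These values are consumed by the generated LLM client in src/llm/. Purely as an illustration of how they fit together (the scaffolded project does its own wiring), an OpenAI-compatible client can be built straight from them with the openai SDK:

import OpenAI from "openai";

// Sketch only: the scaffolded src/llm/ code handles this for you.
// Any OpenAI-compatible endpoint works because the base URL is configurable.
const client = new OpenAI({
  apiKey: process.env.LLM_API_KEY || "not-needed", // local servers usually ignore the key
  baseURL: process.env.LLM_BASE_URL,
});

const completion = await client.chat.completions.create({
  model: process.env.LLM_MODEL ?? "gpt-4o-mini",
  messages: [{ role: "user", content: "Hello!" }],
});

console.log(completion.choices[0].message.content);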

Supported Configurations

OpenAI

LLM_MODEL=gpt-4o-mini
LLM_API_KEY=sk-your-api-key-here
LLM_BASE_URL=https://api.openai.com/v1
LLM_TOOL_CALLING=true

Local Model (Llama 3.1 with Ollama)

LLM_MODEL=llama3.1:8b
LLM_API_KEY=
LLM_BASE_URL=http://localhost:11434/v1
LLM_TOOL_CALLING=false
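
Only the base URL changes here: Ollama exposes an OpenAI-compatible API at http://localhost:11434/v1, so the same client configuration works for both hosted and local models. The API key can be left blank because local servers typically ignore it, and tool calling is disabled in this example since support for it varies across local models.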

📚 Examples

Basic Chat

After creating your project:

cd my-agent
npm run dev

Then interact with your agent through the CLI interface.

Adding Custom Tools

Create new tools in src/tools/ to extend your agent's capabilities:

import { z } from "zod";
import { ToolFn } from "../types";

// Tool metadata: the name and description are exposed to the model, and the
// zod schema describes the arguments the model must supply.
export const weatherToolDefinition = {
  name: "get_weather",
  parameters: z.object({
    location: z.string().describe("The city name"),
  }),
  description: "Get current weather information for a location",
};

type Args = z.infer<typeof weatherToolDefinition.parameters>;

// The tool implementation receives the parsed arguments (and the original
// user message) and returns a string result for the agent to use.
export const getWeather: ToolFn<Args, string> = async ({
  toolArgs,
  userMessage,
}) => {
  // Replace this stub with a real weather lookup.
  return `The weather in ${toolArgs.location} is sunny and 22°C`;
};
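
The scaffolded tool runner presumably handles the wiring between these definitions and the model, but to illustrate how a zod-defined tool maps onto the OpenAI-compatible tool-calling format, the schema can be converted with the zod-to-json-schema package (the ./weather path below assumes the tool above lives in src/tools/weather.ts):

import { zodToJsonSchema } from "zod-to-json-schema";
import { weatherToolDefinition } from "./weather"; // hypothetical file name

// Sketch only: shows how the zod schema becomes the JSON Schema `parameters`
// object that OpenAI-compatible tool calling expects.
const tools = [
  {
    type: "function" as const,
    function: {
      name: weatherToolDefinition.name,
      description: weatherToolDefinition.description,
      parameters: zodToJsonSchema(weatherToolDefinition.parameters),
    },
  },
];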

Custom System Prompts

Modify src/config/systemPrompt.ts to customize your agent's behavior:

export const systemPrompt = `
You are a helpful AI assistant specialized in...
Your key capabilities include...
`;
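
In the OpenAI-compatible chat format the system prompt is simply the first message of each request; the generated agent presumably does this internally, but as a minimal sketch:

import { systemPrompt } from "./config/systemPrompt";

// Sketch only: every chat request starts with the system prompt, followed by
// the conversation history and the new user message.
const messages = [
  { role: "system" as const, content: systemPrompt },
  { role: "user" as const, content: "What can you help me with?" },
];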

🤝 Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

  1. Fork the repository
  2. Create your feature branch (git checkout -b feature/AmazingFeature)
  3. Commit your changes (git commit -m 'Add some AmazingFeature')
  4. Push to the branch (git push origin feature/AmazingFeature)
  5. Open a Pull Request

📝 License

This project is licensed under the MIT License - see the LICENSE file for details.

🙏 Acknowledgments

📧 Support

For issues and questions, please open an issue on GitHub.