
@302ai/studio-plugin-sdk

Plugin SDK for 302.AI Studio - Build powerful AI provider plugins with ease.


Overview

The 302.AI Studio Plugin SDK allows developers to create custom AI provider plugins that integrate seamlessly with the 302.AI Studio desktop application. Build plugins to add support for new AI providers, customize message processing, or extend functionality with hooks.

Features

  • 🎯 Type-Safe API - Full TypeScript support with comprehensive type definitions
  • 🧩 BaseProviderPlugin - Abstract base class with common utilities
  • 🪝 Hook System - Intercept and modify messages, responses, and errors
  • 💾 Storage API - Persist plugin configuration and data
  • 🌐 HTTP Client - Built-in authenticated HTTP requests
  • 🎨 UI Integration - Show notifications, dialogs, and custom components
  • 📝 Logging - Structured logging with plugin context
  • 🌍 i18n Support - Built-in internationalization capabilities

Installation

# Using npm
npm install @302ai/studio-plugin-sdk

# Using pnpm
pnpm add @302ai/studio-plugin-sdk

# Using yarn
yarn add @302ai/studio-plugin-sdk

Quick Start

Creating a Basic Provider Plugin

import { BaseProviderPlugin, type Model, type ModelProvider } from "@302ai/studio-plugin-sdk";

export class MyProviderPlugin extends BaseProviderPlugin {
	protected providerId = "my-provider";
	protected providerName = "My AI Provider";
	protected apiType = "openai";
	protected defaultBaseUrl = "https://api.example.com/v1";

	protected websites = {
		official: "https://example.com",
		apiKey: "https://example.com/api-keys",
		docs: "https://docs.example.com",
		models: "https://docs.example.com/models",
	};

	async onFetchModels(provider: ModelProvider): Promise<Model[]> {
		// Request the provider's model list from its API.
		const url = this.buildApiUrl(provider, "models");
		const response = await this.httpRequest<{ data: any[] }>(url, {
			method: "GET",
			provider,
		});

		// Map the raw API response into Studio's Model shape.
		return response.data.map((model) => ({
			id: model.id,
			name: model.name,
			remark: `${this.providerName} ${model.id}`,
			providerId: this.providerId,
			capabilities: this.parseModelCapabilities(model.id),
			type: "language",
			custom: false,
			enabled: true,
			collected: false,
		}));
	}
}

export default MyProviderPlugin;

Plugin Configuration

Create a plugin.json file in your plugin directory:

{
	"id": "com.example.my-provider",
	"name": "My AI Provider",
	"version": "1.0.0",
	"type": "provider",
	"description": "Integration with My AI Provider API",
	"author": "Your Name",
	"permissions": ["network", "storage"],
	"compatibleVersion": ">=1.0.0",
	"main": "main/index.js",
	"builtin": false,
	"configSchema": {
		"type": "object",
		"properties": {
			"apiKey": {
				"type": "string",
				"title": "API Key",
				"description": "Your API key for authentication"
			}
		},
		"required": ["apiKey"]
	}
}

Core Concepts

BaseProviderPlugin

The BaseProviderPlugin abstract class provides:

  • Authentication - Default API key validation
  • HTTP Utilities - Authenticated requests with proper headers
  • Model Parsing - Capability and type detection
  • Error Handling - Common error scenarios (401, 429, timeout)
  • Logging & Notifications - Built-in logging and user notifications

Required Methods:

  • onFetchModels(provider: ModelProvider): Promise<Model[]> - Fetch available models

Optional Overrides:

  • getIconUrl() - Custom provider icon
  • getConfigSchema() - Custom configuration schema
  • testConnection(provider) - Connection validation
  • getAuthHeaders(provider) - Custom authentication headers
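
For illustration, here is a minimal sketch of overriding two of the optional methods above. The signatures are inferred from the snippets elsewhere in this README and may differ from the actual SDK surface:

// Hedged sketch: return types and signatures are assumptions, not verified against the SDK.
protected getIconUrl(): string {
  // Point the app at a custom provider icon.
  return "https://example.com/my-provider-icon.png";
}

async testConnection(provider: ModelProvider): Promise<boolean> {
  // Treat a successful model fetch as evidence that the credentials work.
  try {
    await this.onFetchModels(provider);
    return true;
  } catch {
    return false;
  }
}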

Hook System

Plugins can implement hooks to customize behavior:

onBeforeSendMessage

Modify messages before sending to the AI:

async onBeforeSendMessage(context: MessageContext): Promise<MessageContext> {
  // Add system prompt
  context.messages.unshift({
    role: "system",
    content: "You are a helpful assistant.",
  });
  return context;
}

onAfterSendMessage

Process responses after receiving:

async onAfterSendMessage(context: MessageContext, response: AIResponse): Promise<void> {
  this.log("info", `Used ${response.usage?.totalTokens} tokens`);
}

onStreamChunk

Modify streaming response chunks:

async onStreamChunk(chunk: StreamChunk): Promise<StreamChunk> {
  if (chunk.type === "text" && chunk.text) {
    chunk.text = chunk.text.toUpperCase(); // Example modification
  }
  return chunk;
}

onError

Handle errors with retry logic:

async onError(context: ErrorContext): Promise<ErrorHandleResult> {
  if (context.error.message.includes("429")) {
    return {
      handled: true,
      retry: true,
      retryDelay: 5000,
      message: "Rate limit exceeded. Retrying in 5 seconds...",
    };
  }
  return { handled: false };
}

Plugin API

The PluginAPI is injected during initialization:
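
As a rough sketch, assuming BaseProviderPlugin implements initialize() and keeps the injected instance on this.api (which is how the snippets below use it), a plugin can hook into initialization like this:

// Hedged sketch: assumes the base class stores the injected API on this.api.
async initialize(api: PluginAPI): Promise<void> {
  await super.initialize(api);
  this.api.logger.info("MyProviderPlugin ready");
}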

Storage

// Configuration (visible in UI)
await this.api.storage.setConfig("apiKey", "sk-...");
const apiKey = await this.api.storage.getConfig<string>("apiKey");

// Private data (not visible in UI)
await this.api.storage.setData("cache", { timestamp: Date.now() });
const cache = await this.api.storage.getData("cache");

HTTP Client

// GET request
const models = await this.api.http.get<ModelList>("https://api.example.com/models");

// POST request with body
const result = await this.api.http.post("https://api.example.com/chat", {
  messages: [...],
});

UI Integration

// Show notification
this.api.ui.showNotification("Model loaded successfully", "success");

// Show dialog
const result = await this.api.ui.showDialog({
	title: "Confirm Action",
	message: "Are you sure?",
	type: "question",
	buttons: ["Yes", "No"],
});

Logging

this.api.logger.debug("Debug information");
this.api.logger.info("Informational message");
this.api.logger.warn("Warning message");
this.api.logger.error("Error message");

Advanced Usage

Custom Authentication

Override getAuthHeaders for custom auth:

protected getAuthHeaders(provider: ModelProvider): Record<string, string> {
  return {
    "X-API-Key": provider.apiKey,
    "X-Custom-Header": "value",
  };
}

Capability Detection

Customize model capability parsing:

protected parseModelCapabilities(modelId: string): Set<string> {
  const capabilities = new Set<string>();

  if (modelId.includes("vision")) {
    capabilities.add("vision");
  }

  if (modelId.includes("function")) {
    capabilities.add("function_call");
  }

  return capabilities;
}

Direct ProviderPlugin Implementation

For full control, implement ProviderPlugin directly:

import type { ProviderPlugin, PluginAPI } from "@302ai/studio-plugin-sdk";

export class CustomPlugin implements ProviderPlugin {
	api?: PluginAPI;

	async initialize(api: PluginAPI): Promise<void> {
		this.api = api;
	}

	getProviderDefinition() {
		return {
			id: "custom",
			name: "Custom Provider",
			// ... other properties
		};
	}

	async onAuthenticate(context) {
		// Custom auth logic
	}

	async onFetchModels(provider) {
		// Custom model fetching
	}
}

Type Reference

Core Types

  • Model - AI model definition
  • ModelProvider - Provider configuration
  • ChatMessage - Chat message structure
  • PluginMetadata - Plugin metadata from plugin.json
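
For orientation, here is what a Model value can look like. This hedged example uses only the fields that appear in the Quick Start mapping above; the real type may define more:

import type { Model } from "@302ai/studio-plugin-sdk";

// Field list taken from the onFetchModels example; values are placeholders.
const model: Model = {
  id: "example-model",
  name: "Example Model",
  remark: "My AI Provider example-model",
  providerId: "my-provider",
  capabilities: new Set(["vision", "function_call"]),
  type: "language",
  custom: false,
  enabled: true,
  collected: false,
};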

Hook Types

  • MessageContext - Message hook context
  • StreamChunk - Streaming response chunk
  • AIResponse - Complete AI response
  • ErrorContext - Error hook context
  • AuthContext - Authentication hook context

API Types

  • PluginAPI - Main plugin API
  • PluginStorageAPI - Storage operations
  • PluginHttpAPI - HTTP client
  • PluginUIAPI - UI operations
  • PluginLoggerAPI - Logging utilities

Examples

Check the plugins/builtin/ directory in the main repository for complete examples:

  • OpenAI Plugin - Standard OpenAI API integration
  • Anthropic Plugin - Claude models with custom headers
  • Google Plugin - Gemini models with custom parsing
  • Debug Plugin - Full hook implementation example

Publishing Your Plugin

Package Structure

my-plugin/
├── plugin.json          # Plugin metadata
├── main/
│   └── index.ts        # Main plugin code
├── package.json        # npm package config
└── README.md           # Plugin documentation

Build Script

{
	"scripts": {
		"build": "tsc && cp plugin.json dist/"
	}
}
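
The rest of package.json typically points npm at the compiled output. A hypothetical minimal example, assuming tsc emits to dist/ (names, paths, and version ranges are placeholders, not SDK requirements):

{
	"name": "my-provider-plugin",
	"version": "1.0.0",
	"main": "dist/main/index.js",
	"files": ["dist", "plugin.json"],
	"scripts": {
		"build": "tsc && cp plugin.json dist/"
	},
	"dependencies": {
		"@302ai/studio-plugin-sdk": "^1.0.0"
	}
}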

Publishing to npm

npm publish --access public

Users can then install your plugin via URL in 302.AI Studio.

Development Tips

  1. Use TypeScript - Full type safety and autocomplete
  2. Test Thoroughly - Test authentication, model fetching, and message sending
  3. Handle Errors - Implement proper error handling and retry logic
  4. Log Appropriately - Choose the right log level (debug, info, warn, error) for each message
  5. Document Config - Provide clear configuration schema and defaults
  6. Version Compatibility - Specify compatible app versions in plugin.json

API Compatibility

This SDK follows semantic versioning. The API is stable for v1.x releases.

License

MIT License - see LICENSE file for details

Support

Contributing

Contributions are welcome! Please read our contributing guidelines before submitting PRs.


Built with ❤️ by 302.AI