@krutai/ai-provider

v0.2.13

Published

AI provider package for KrutAI — fetch-based client for your deployed LangChain server with API key validation

@krutai/ai-provider

AI provider package for KrutAI — a fetch-based client for your deployed LangChain server.

Features

  • 🔑 API Key validation — validates your key against the server before use
  • 🚀 Zero SDK dependencies — uses native fetch only
  • 📡 Streaming — SSE-based streaming via async generator
  • 💬 Multi-turn chat — full conversation history support
  • ⚙️ Configurable — pass any model name to the server

Installation

npm install @krutai/ai-provider

Quick Start

import { krutAI } from '@krutai/ai-provider';

const ai = krutAI({
  apiKey: 'your-krutai-api-key',
  // Optional: omit to use the default local dev server ('http://localhost:8000')
  // serverUrl: 'https://krut.ai',
});

await ai.initialize(); // validates key with your server

// Single response
const text = await ai.chat('Write a poem about TypeScript');
console.log(text);

Usage

Chat (single response)

const ai = krutAI({
  apiKey: process.env.KRUTAI_API_KEY!,
  serverUrl: 'https://krut.ai', // Override default for production
  model: 'gpt-4o', // optional — server's default is used if omitted
});

await ai.initialize();

const text = await ai.chat('Explain async/await in JavaScript', {
  system: 'You are a helpful coding tutor.',
  maxTokens: 500,
  temperature: 0.7,
});

console.log(text);

Multi-turn Chat

const ai = krutAI({
  apiKey: process.env.KRUTAI_API_KEY!,
});

await ai.initialize();

const response = await ai.chat([
  { role: 'system', content: 'You are a helpful assistant.' },
  { role: 'user', content: 'What is the capital of France?' },
  { role: 'assistant', content: 'Paris.' },
  { role: 'user', content: 'What is it famous for?' },
]);

console.log(response);

Multimodal Messages (Images)

For vision-capable models, you can pass an array of ContentPart objects instead of a flat string:

const response = await ai.chat([
  {
    role: 'user',
    content: [
      { type: 'text', text: 'Describe this image for me.' },
      { 
        type: 'image_url', 
        image_url: { url: 'https://example.com/logo.png' } 
      }
    ]
  }
], { 
  model: 'gpt-4o',
  // You can also pass images, documents, or pdfs via GenerateOptions
  images: ['https://example.com/photo.jpg'],
  documents: ['https://example.com/doc.docx'],
  pdf: ['https://example.com/report.pdf']
});

Streaming (Proxying SSE Streams)

If you are building an API route (e.g., in Next.js) and want to relay the raw Server-Sent Events (SSE) stream through to your client, use streamChatResponse.

streamChatResponse returns the raw fetch Response object containing the text/event-stream body from the deployed LangChain server.

// app/api/chat/route.ts
import { krutAI } from '@krutai/ai-provider';

// Create the provider once per module; validateOnInit: false skips the
// initialize() round-trip (see "Skip validation" below)
const ai = krutAI({
  apiKey: process.env.KRUTAI_API_KEY!,
  validateOnInit: false,
});

export async function POST(req: Request) {
  const { messages } = await req.json();

  // Returns the native fetch Response (with text/event-stream headers and body)
  const response = await ai.streamChatResponse(messages);

  // Proxy it straight through to the client
  return response;
}

If you need to consume the stream in a Node environment rather than proxying it, you can read from the response body directly:

const response = await ai.streamChatResponse([
  { role: 'user', content: 'Tell me a short story' }
]);

const reader = response.body?.getReader();
const decoder = new TextDecoder();

if (reader) {
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    process.stdout.write(decoder.decode(value, { stream: true }));
  }
}

Skip validation (useful for tests)

const ai = krutAI({
  apiKey: 'test-key',
  serverUrl: 'http://localhost:3000',
  validateOnInit: false, // skips the /validate round-trip
});

// No need to call initialize() when validateOnInit is false
const text = await ai.chat('Hello!');

Server API Contract

Your LangChain server must expose these endpoints:

| Endpoint | Method | Auth | Body |
|---|---|---|---|
| /validate | POST | x-api-key header | { "apiKey": "..." } |
| /generate | POST | Authorization: Bearer &lt;key&gt; | { "prompt": "...", "model": "...", ... } |
| /stream | POST | Authorization: Bearer &lt;key&gt; | { "messages": [...], "model": "...", ... } |

Validation response: { "valid": true } or { "valid": false, "message": "reason" }

AI response: { "text": "..." } or { "content": "..." } or { "message": "..." }
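A tolerant reader for those three response shapes might look like the following sketch (extractText is an illustrative helper, not a package export):

```typescript
// Pull the generated text out of any of the three accepted response
// shapes: { text: "..." }, { content: "..." }, or { message: "..." }.
// Note: extractText is an illustrative helper, not part of @krutai/ai-provider.
function extractText(body: Record<string, unknown>): string {
  const value = body.text ?? body.content ?? body.message;
  if (typeof value !== 'string') {
    throw new Error('Unexpected AI response shape');
  }
  return value;
}
```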

Stream: text/event-stream with data: <chunk> lines, ending with data: [DONE]
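As a sketch of that framing, the body of a completed stream could be decoded like this (parseSSE is an illustrative helper, not a package export):

```typescript
// Decode the SSE framing described above: each event is a
// "data: <chunk>" line, and "data: [DONE]" marks end of stream.
// Note: parseSSE is an illustrative helper, not part of @krutai/ai-provider.
function parseSSE(raw: string): string[] {
  const chunks: string[] = [];
  for (const line of raw.split('\n')) {
    if (!line.startsWith('data: ')) continue; // skip blank lines and comments
    const payload = line.slice('data: '.length);
    if (payload === '[DONE]') break;          // end-of-stream sentinel
    chunks.push(payload);
  }
  return chunks;
}
```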

API Reference

krutAI(config)

Factory function — preferred way to create a provider.

The config object (the exported KrutAIProviderConfig type) accepts:

apiKey: string;           // required — KrutAI API key
serverUrl?: string;       // optional — defaults to 'http://localhost:8000'
model?: string;           // optional — passed to the server (default: 'default')
validateOnInit?: boolean; // optional — default: true

KrutAIProvider

Full class API with the same methods as above. Use when you need the class directly.

Exports

export { krutAI, KrutAIProvider, KrutAIKeyValidationError, validateApiKey, validateApiKeyFormat, DEFAULT_MODEL };
export type { KrutAIProviderConfig, GenerateOptions, ChatMessage };

License

MIT