litellm-js

v0.2.0

Universal JavaScript client for LLM APIs

LiteLLM-JS

A JavaScript version of LiteLLM, providing a unified interface for accessing different large language model (LLM) APIs.

Features

  • Support for multiple LLM providers (OpenAI, Anthropic, and more)
  • Proxy mode support
  • Works in both browser and Node.js environments
  • Streaming output support
  • Simple, consistent API

Installation

npm install litellm-js

Quick Start

Basic Usage

import liteLLM from 'litellm-js';

// Register a provider
liteLLM.registerProvider('openai', {
  apiKey: 'your-openai-api-key'
});

// Generate a completion
const response = await liteLLM.completion({
  model: 'gpt-3.5-turbo',
  messages: [
    { role: 'system', content: 'You are a helpful assistant.' },
    { role: 'user', content: 'Tell me about JavaScript.' }
  ]
});

console.log(response);

// Stream the output
for await (const chunk of liteLLM.streamCompletion({
  model: 'gpt-3.5-turbo',
  messages: [
    { role: 'system', content: 'You are a helpful assistant.' },
    { role: 'user', content: 'Tell me about JavaScript.' }
  ]
})) {
  console.log(chunk);
}

Using a Proxy

import liteLLM from 'litellm-js';

// Create a proxy
liteLLM.createProxy({
  name: 'my-proxy',
  url: 'http://localhost:8000',
  models: ['gpt-4', 'claude-2'], // these models are routed through the proxy
  headers: {
    'Authorization': 'Bearer your-proxy-key'
  }
});

// Use a model through the proxy
const response = await liteLLM.completion({
  model: 'gpt-4',
  messages: [{ role: 'user', content: 'Hello!' }]
});
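
Streaming requests can be sent through the same proxy setup. The sketch below is illustrative and assumes that streamCompletion honors the proxy's models list the same way completion does; the prompt is a placeholder.

// Stream a response through the proxy configured above
// (assumes streaming requests are routed like regular completions)
for await (const chunk of liteLLM.streamCompletion({
  model: 'claude-2', // listed in the proxy's models, so it goes through 'my-proxy'
  messages: [{ role: 'user', content: 'Hello!' }]
})) {
  console.log(chunk);
}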

Supported Providers

  • OpenAI (GPT family of models)
  • Anthropic (Claude family of models; see the sketch after this list)
  • Azure OpenAI
  • Google (Gemini, PaLM)
  • More providers are being added...
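
As a minimal sketch of switching providers, the example below registers the Anthropic provider and requests a completion from a Claude model through the same unified completion call shown in the Quick Start; the API key is a placeholder and the exact model identifiers accepted may differ.

import liteLLM from 'litellm-js';

// Register the Anthropic provider (placeholder API key)
liteLLM.registerProvider('anthropic', {
  apiKey: 'your-anthropic-api-key'
});

// Call a Claude model through the same unified interface
const response = await liteLLM.completion({
  model: 'claude-2', // illustrative model name
  messages: [{ role: 'user', content: 'Explain the benefits of a unified LLM client.' }]
});

console.log(response);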

Advanced Usage

Custom Base URL

liteLLM.registerProvider('openai', {
  apiKey: 'your-openai-api-key',
  baseUrl: 'https://custom-openai-endpoint.com/v1'
});

Setting Default Parameters

liteLLM.registerProvider('anthropic', {
  apiKey: 'your-anthropic-api-key',
  defaultParams: {
    temperature: 0.5,
    max_tokens: 1000
  }
});
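
The defaults registered above apply to every request against that provider. The sketch below assumes that extra parameters passed to completion (such as max_tokens) are forwarded to the provider and take precedence over defaultParams; that override behavior is an assumption, not something documented above.

// Uses the registered defaults (temperature: 0.5, max_tokens: 1000)
const long = await liteLLM.completion({
  model: 'claude-2', // illustrative model name
  messages: [{ role: 'user', content: 'Tell me about JavaScript.' }]
});

// Assumed: per-request parameters override the registered defaults
const short = await liteLLM.completion({
  model: 'claude-2',
  messages: [{ role: 'user', content: 'Give a one-line summary of JavaScript.' }],
  max_tokens: 100
});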

Contributing

Contributions are welcome! Feel free to submit a Pull Request, or open an Issue to discuss new features or report problems.

License

MIT