
openx-ai

v0.0.5

Published

A unified multi-model AI SDK for 研学 AI, DeepSeek, and more LLMs to come.

Readme

openx-ai SDK Documentation

openx-ai SDK

A unified multi-model AI SDK for 研学 AI, DeepSeek, and more LLMs to come.


🚀 Introduction

openx-ai is a unified invocation layer for multiple large language models (LLMs). It supports:

  • 研学 AI (XAI)
  • DeepSeek (compatible with the official streaming API)
  • More models in the future (Qwen / ChatGLM / MiniMax, etc.)
  • Fully streamed output (AsyncIterator)
  • Integration with openx-js-sdk (automatic JWT renewal, environment proxying, request wrapping)
  • Unified error handling and a unified chunk-parsing format

Goal: no matter which model the business side calls, it uses the same invocation code.


📌 Version History

v0.0.5 – 2026-03-02

  • Added acctoken to the base request headers

v0.0.4 – 2026-02-27

  • Fixed the overall request timeout issue

v0.0.3 – 2026-02-26

  • Added an overall request timeout (requestTimeoutMs)
    • Timing starts when the request is issued
  • Added a streaming idle timeout (idleTimeoutMs)
    • If no chunk arrives for too long
    • the stream is terminated automatically so the frontend does not hang
  • Timeouts are returned uniformly in "stream form"
    • Even if the connection was never established
    • an iterable stream is returned instead of an exception being thrown

v0.0.2 – 2025-12-04

  • When a non-streaming call fails, the SDK automatically converts the failure into a "streamed error output" structure
  • Improved parseStreamChunk to handle <CHECK_ERROR> safety-check interceptions
  • Strengthened the error-first output strategy

📦 Installation

npm install openx-ai --save

⚡ Quick Start

import OpenxAI from 'openx-ai'

const client = new OpenxAI({
  appId: 'xxx',            // 研学 app appId
  model: 'xai',            // xai | deepseek | more models
  apiKey: '',              // apiKey for a third-party model
  env: 'development',      // development | test | production
  baseURL: 'https://xxx.cnki.net',
  proxy: {},               // proxy config for openx-js-sdk
  timeoutOptions: {
    requestTimeoutMs: 60_000,  // overall request timeout, default 60 s
    idleTimeoutMs: 30_000      // streaming idle timeout, default 30 s
  }
})
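The idleTimeoutMs behavior configured above (a stalled stream ends with a stream-shaped error plus [DONE] instead of an exception, per the changelog) can be sketched like this. This is a self-contained illustration, not the SDK's actual implementation; the chunk format and the 408 code are assumptions:

```typescript
// Sketch: wrap any AsyncIterable so that if no chunk arrives within
// idleTimeoutMs, the stream ends with an error chunk and '[DONE]'
// instead of throwing. Names and chunk shapes here are illustrative.
async function* withIdleTimeout<T>(
  source: AsyncIterable<T>,
  idleTimeoutMs: number
): AsyncGenerator<T | string> {
  const it = source[Symbol.asyncIterator]()
  while (true) {
    let timer!: ReturnType<typeof setTimeout>
    const idle = new Promise<'idle-timeout'>(resolve => {
      timer = setTimeout(() => resolve('idle-timeout'), idleTimeoutMs)
    })
    // Race the next chunk against the idle timer
    const result = await Promise.race([it.next(), idle])
    clearTimeout(timer)
    if (result === 'idle-timeout') {
      // Emit a stream-shaped error rather than throwing
      yield JSON.stringify({ code: 408, message: 'Streaming idle timeout' })
      yield '[DONE]'
      return
    }
    if (result.done) return
    yield result.value
  }
}
```

A consumer keeps the same `for await` loop whether the stream finishes normally or times out.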

💬 Start a Conversation (non-streaming)

const res = await client.chat.completions.create({
  messages: [
    { role: 'user', content: '介绍一下你自己' }
  ],
  stream: false,
  source: 'xxx',              // 研学 AI source
  moduleVersionStr: 'xxx'     // 研学 AI model version
})

console.log(res)

🌊 Streaming Output (unified AsyncIterator)

All models share the same streaming interface:

const stream = await client.chat.completions.create({
  messages: [{ role: 'user', content: '写一段诗' }],
  stream: true
})

for await (const chunk of stream) {
  if (chunk === '[DONE]') break
  console.log(chunk)
}

⭐ Notes

  • If a model does not natively support stream, the SDK automatically falls back to a pseudo-streamed error output
  • DeepSeek tool calls (tool_calls) are parsed in streaming mode
  • Errors, JWT refreshes, and the like are also emitted correctly in the stream
🛑 Stopping a Stream

1. Stream successfully connected → stop it explicitly

client.abort()

This can be called at any point while processing a stream.


2. Connection not established → catch with try/catch

try {
  const stream = await client.chat.completions.create({ messages, stream: true })

  for await (const chunk of stream) {
    console.log(chunk)
  }

} catch (err) {
  console.error("Connection not established; caught via try/catch:", err)
}

🔒 Automatic JWT Renewal (powered by openx-js-sdk)

openx-ai fully integrates openx-js-sdk for automatic login-state management.

| Status code | Meaning | SDK behavior |
| ----------- | ------- | ------------ |
| 5013 | JWT expired | Refresh the JWT automatically → retry once |
| 5014 | Refresh failed | Clear the JWT → emit "Please log in again" |
| 401 | Unauthorized | Unified error → stop the stream |
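The retry-once policy for an expired JWT might look roughly like this; requestWithJwtRetry, Result, and the callback shapes are illustrative names, not the openx-js-sdk API:

```typescript
// Sketch of the 5013/5014 handling described above: on a 5013 (JWT
// expired), refresh the token and retry exactly once; if the refresh
// fails, surface a 5014 "please log in again" result instead.
type Result = { status: number; body?: string }

async function requestWithJwtRetry(
  send: () => Promise<Result>,
  refreshJwt: () => Promise<boolean>
): Promise<Result> {
  let res = await send()
  if (res.status === 5013) {
    // JWT expired: refresh, then retry the original request once
    const refreshed = await refreshJwt()
    if (!refreshed) return { status: 5014, body: 'Please log in again' }
    res = await send()
  }
  return res
}
```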


🧠 Streaming Behavior

1. Error-first output strategy

The SDK guarantees:

Any partial content the model has already returned is emitted first, then the error message.

Example output:

Server error, please try again later!
{"code":500, "message":"Server error"}
[DONE]

2. Automatic <CHECK_ERROR> replacement

If the model outputs:

<CHECK_ERROR>

the SDK automatically replaces it with safe content:

As an AI language model, my goal is to provide help and information in a positive and safe manner; your question is outside the scope of what I can answer.
[DONE]
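The substitution described above can be sketched as a simple check during chunk parsing; sanitizeChunk is an illustrative name, not the SDK's actual parseStreamChunk:

```typescript
// Sketch: a chunk containing the moderation marker is replaced wholesale
// with the safe reply. The marker and reply text mirror the README.
const SAFE_REPLY =
  'As an AI language model, my goal is to provide help and information in a positive and safe manner; your question is outside the scope of what I can answer.'

function sanitizeChunk(chunk: string): string {
  return chunk.includes('<CHECK_ERROR>') ? SAFE_REPLY : chunk
}
```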

3. Normal streaming example

Hello
I am
a model
[DONE]

⚙️ Configuration (ConfigOptions)

| Field | Type | Description |
| ----- | ---- | ----------- |
| appId | string | 研学 app AppId (required) |
| model | string | xai, deepseek, qwen... |
| env | 'development' \| 'test' \| 'production' | Environment |
| baseURL | string | Base request URL |
| proxy | object | proxy config for openx-js-sdk |
| apiKey | string | apiKey for a third-party model |


🧪 Complete Example

import OpenxAI from 'openx-ai'

let xai: OpenxAI | null = null

xai = new OpenxAI({
  appId: 'CRSP_PSMC_RELEASE',
  env: 'development',
  baseURL: 'http://192.168.32.46:1009/proxy-ai/ai/aiCommon/multiChat',
  model: 'xai',
  proxy: {
    '/proxy-sdk-jwt': {
      open: false,
      ws: false,
      target: 'https://xtest.cnki.net/coreapi/api',
      changeOrigin: true,
      pathRewrite: { '^/proxy-sdk-jwt': '/' },
      headers: {
        origin: 'https://x.cnki.net',
        referer: 'https://x.cnki.net'
      }
    },
    '/proxy-sdk-ip': {
      open: false,
      ws: false,
      target: 'https://xtest.cnki.net/ip',
      changeOrigin: true,
      pathRewrite: { '^/proxy-sdk-ip': '/' },
      headers: {
        origin: 'https://x.cnki.net',
        referer: 'https://x.cnki.net'
      }
    }
  }
})

document.querySelector('#start')?.addEventListener('click', async () => {
  const completion = await xai!.chat.completions.create({
    messages: [
      { role: 'system', content: '你是一个友好的AI助手。' },
      { role: 'user', content: '你好啊' }
    ],
    source: 'xaiFullLibraryQA',
    moduleVersionStr: 'tencent_deepseek_v3',
    stream: true
  })

  try {
    for await (const chunk of completion) {
      console.log(chunk)
    }
  } catch (err) {
    console.error('🔴 Other stream error:', err)
  }
})

document.querySelector('#stop')?.addEventListener('click', () => {
  xai?.abort()
  console.log('🛑 Stream stopped')
})