
@see2/dzmm-provider

v1.0.0

DZMM Provider for Vercel AI SDK

DZMM (大專門) Provider - an OpenAI-compatible provider based on the NaLang models, for use with the Vercel AI SDK.

Installation

npm install @see2/dzmm-provider

Configuration

Set the API key environment variable:

export DZMM_API_KEY=your-api-key

The default API endpoint is: https://www.gpt4novel.com/api/xiaoshuoai/ext/v1

You can also set a custom endpoint:

export DZMM_BASE_URL=https://your-api-endpoint.com/v1
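The endpoint fallback above can be sketched in a few lines; `resolveBaseURL` is an illustrative helper, not an export of the package:

```typescript
// Illustrative sketch (not a package export): resolve the API endpoint,
// preferring DZMM_BASE_URL from the environment over the documented default.
const DEFAULT_DZMM_BASE_URL = 'https://www.gpt4novel.com/api/xiaoshuoai/ext/v1';

function resolveBaseURL(env: Record<string, string | undefined>): string {
  return env.DZMM_BASE_URL ?? DEFAULT_DZMM_BASE_URL;
}

// e.g. const baseURL = resolveBaseURL(process.env);
```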

Available Models

✅ Recommended (stable and working)

  • nalang-xl-10 - NaLang XL 10 (default) - fast responses, good for general chat
  • nalang-xl-16k - NaLang XL 16k - large context window, good for long-form text
  • nalang-max-7-32K - NaLang Max 7 32K - extra-large context window, good for long documents

❌ Currently Unavailable

Turbo model series (empty-response issue):

  • nalang-turbo - NaLang Turbo - empty responses
  • nalang-v17-2 - NaLang v17-2 (Turbo) - empty responses
  • nalang-turbo-v19 - NaLang Turbo v19 - empty responses
  • nalang-turbo-v18 - NaLang Turbo v18 - empty responses

Newer model series (404 Model not found):

  • Max-0826 - Max 0826 (latest flagship model)
  • Max-0619 - Max 0619 (previous-generation large model)
  • XL-0826 - XL 0826 (latest-generation large model)
  • XL-0430 - XL 0430 (built-in writing style)
  • Turbo-0826 - Turbo 0826 (small model with a new writing style)
  • Medium-0826 - Medium 0826 (best value for money)
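When choosing a model at runtime, it can help to guard against the unavailable IDs listed above. This checklist helper is illustrative only, not part of the package:

```typescript
// Model IDs documented as stable, per the list above.
const STABLE_MODELS = ['nalang-xl-10', 'nalang-xl-16k', 'nalang-max-7-32K'] as const;

type StableModel = (typeof STABLE_MODELS)[number];

// Returns true only for the models documented as stable and working.
function isStableModel(id: string): id is StableModel {
  return (STABLE_MODELS as readonly string[]).includes(id);
}
```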

Important Notes

⚠️ API limitation: the DZMM API primarily supports streaming mode. Use streamText rather than generateText for all requests to ensure the best compatibility and response quality.

Usage

Basic usage (streaming mode recommended)

import { dzmm } from '@see2/dzmm-provider';
import { streamText } from 'ai';

// Streaming mode (recommended)
const { textStream } = await streamText({
  model: dzmm('nalang-xl-10'),
  prompt: 'Hello, please briefly introduce yourself.',
});

for await (const textPart of textStream) {
  process.stdout.write(textPart);
}

Custom settings

import { dzmm } from '@see2/dzmm-provider';
import { streamText } from 'ai';

const { textStream } = await streamText({
  model: dzmm('nalang-xl-10', {
    temperature: 0.7,
    maxTokens: 800,
    topP: 0.35,
    repetitionPenalty: 1.05,
  }),
  prompt: 'Write a short story about friendship between an AI and a human.',
});

let fullText = '';
for await (const textPart of textStream) {
  fullText += textPart;
  process.stdout.write(textPart);
}

Custom Provider

import { createDzmm } from '@see2/dzmm-provider';
import { generateText } from 'ai';

const customDzmm = createDzmm({
  apiKey: 'your-api-key',
  baseURL: 'https://www.gpt4novel.com/api/xiaoshuoai/ext/v1',
  headers: {
    'X-Custom-Header': 'value',
  },
});

const { text } = await generateText({
  model: customDzmm('nalang-xl-16k'),
  prompt: 'Explain the features of JavaScript',
});

Streaming mode

import { dzmm } from '@see2/dzmm-provider';
import { streamText } from 'ai';

const { textStream } = await streamText({
  model: dzmm('nalang-xl-10'),
  prompt: 'List 5 tips for learning programming.',
  temperature: 0.7,
  maxTokens: 500,
});

for await (const textPart of textStream) {
  process.stdout.write(textPart);
}

Multi-turn conversation

import { dzmm } from '@see2/dzmm-provider';
import { streamText } from 'ai';

const { textStream } = await streamText({
  model: dzmm('nalang-xl-16k'),
  messages: [
    { role: 'system', content: 'You are a knowledgeable teacher.' },
    { role: 'user', content: 'What is machine learning?' },
    { role: 'assistant', content: 'Machine learning is a branch of artificial intelligence...' },
    { role: 'user', content: 'Please give a concrete example of an application.' },
  ],
});

for await (const textPart of textStream) {
  process.stdout.write(textPart);
}
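Between turns, the `messages` array is simply extended with the new user message and the assistant's reply before the next call. A minimal sketch of that bookkeeping, where `Msg` is a simplified stand-in for the AI SDK's message shape:

```typescript
// Simplified message shape; the AI SDK's own message types carry more fields.
type Msg = { role: 'system' | 'user' | 'assistant'; content: string };

// Append a user turn and the assistant's reply, returning a new history
// array ready to be passed back as `messages` on the next call.
function addTurn(history: Msg[], userContent: string, assistantContent: string): Msg[] {
  return [
    ...history,
    { role: 'user', content: userContent },
    { role: 'assistant', content: assistantContent },
  ];
}
```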

Model Methods

// Chat model (recommended)
const chatModel = dzmm.chatModel('nalang-xl-10');

// Large-context-window model
const largeContextModel = dzmm.chatModel('nalang-xl-16k');

// Note: only Chat models are currently supported; Completion and Embedding models are not yet available

Settings

  • temperature: controls output randomness (0-2, default: 0.7)
  • maxTokens: maximum number of output tokens (default: 800)
  • topP: nucleus sampling parameter (0-1, default: 0.35)
  • topK: top-K sampling parameter
  • repetitionPenalty: repetition penalty coefficient (default: 1.05)
  • frequencyPenalty: frequency penalty
  • presencePenalty: presence penalty
  • seed: random seed
  • stopSequences: array of stop sequences
  • responseFormat: response format ({ type: 'text' | 'json_object' })
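The documented defaults above can be collected in one place. This `DzmmSettings` interface and defaults object are an illustrative sketch of the parameter list, not the package's actual exports:

```typescript
// Illustrative settings shape mirroring the documented parameters.
interface DzmmSettings {
  temperature?: number;       // 0-2
  maxTokens?: number;
  topP?: number;              // 0-1
  topK?: number;
  repetitionPenalty?: number;
  frequencyPenalty?: number;
  presencePenalty?: number;
  seed?: number;
  stopSequences?: string[];
  responseFormat?: { type: 'text' | 'json_object' };
}

// Defaults as documented above.
const DEFAULT_SETTINGS: DzmmSettings = {
  temperature: 0.7,
  maxTokens: 800,
  topP: 0.35,
  repetitionPenalty: 1.05,
};
```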

Publishing to npm

⚠️ Note: this is an unofficial provider; the @see2/dzmm-provider name is used to make its community nature clear.

Preparing to publish

# 1. Log in to npm (if not already logged in)
npm login

# 2. Make sure you are in the packages/dzmm directory
cd packages/dzmm

# 3. Use the publish script (recommended)
./scripts/publish.sh

Manual publish steps

# From the packages/dzmm directory

# Clean and install dependencies
npm run clean
npm install

# Build
npm run build

# Type check
npm run lint

# Publish to npm
npm publish --access public

Post-publish verification

# Check that the package was published
npm view @see2/dzmm-provider

# Test installation
npm install @see2/dzmm-provider

Development

# Install dependencies
npm install

# Development mode (watch for file changes)
npm run dev

# Build
npm run build

# Run the examples
npm run example

# Type check
npm run lint

# Clean build output
npm run clean

License

MIT