
@hjning/deepseek-sdk · v1.2.1 · 472 downloads

DeepSeek SDK

Unofficial DeepSeek API SDK for JavaScript / TypeScript.

npm install @hjning/deepseek-sdk

Quick Start

import { DeepSeekClient } from '@hjning/deepseek-sdk'

const client = new DeepSeekClient({ apiKey: 'sk-...' })

// List models
const models = await client.models.list()

// Check the account balance
const balance = await client.user.balance()

// Chat completion
const res = await client.chat.create({
  model: 'deepseek-v4-flash',
  messages: [{ role: 'user', content: 'Hello' }],
})
console.log(res.choices[0].message.content)

Chat Completion

Non-streaming — client.chat.create()

const res = await client.chat.create({
  model: 'deepseek-v4-flash',
  messages: [
    { role: 'system', content: 'You are a rigorous math teacher.' },
    { role: 'user', content: 'Which is larger, 9.11 or 9.8?' },
  ],
  temperature: 0.6,
  max_tokens: 200,
})

Streaming (native) — client.chat.stream.native()

Yields raw DeepSeek chunks, identical to the API's response format:

for await (const chunk of client.chat.stream.native({
  model: 'deepseek-v4-flash',
  messages: [{ role: 'user', content: 'Write a poem' }],
  stream: true,
  max_tokens: 200,
})) {
  const delta = chunk.choices[0].delta
  if (delta?.content) process.stdout.write(delta.content)
  if (delta?.reasoning_content) process.stdout.write(delta.reasoning_content)
}
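The deltas in the loop above can be concatenated to recover the full reply once the stream ends. A minimal sketch of that accumulation, run against mock chunks in the same shape (`accumulate` is an illustrative helper, not part of the SDK):

```typescript
// Chunk shape as read by the loop above: choices[0].delta with optional
// content / reasoning_content strings.
interface ChunkLike {
  choices: { delta?: { content?: string; reasoning_content?: string } }[]
}

// Concatenate the per-chunk deltas into the full answer and reasoning text.
function accumulate(chunks: Iterable<ChunkLike>): { content: string; reasoning: string } {
  let content = ''
  let reasoning = ''
  for (const chunk of chunks) {
    const delta = chunk.choices[0]?.delta
    if (delta?.content) content += delta.content
    if (delta?.reasoning_content) reasoning += delta.reasoning_content
  }
  return { content, reasoning }
}

// Mock chunks standing in for a real client.chat.stream.native() call.
const mock: ChunkLike[] = [
  { choices: [{ delta: { reasoning_content: 'plan the poem' } }] },
  { choices: [{ delta: { content: 'Roses are red,' } }] },
  { choices: [{ delta: { content: ' violets are blue' } }] },
]
console.log(accumulate(mock)) // { content: 'Roses are red, violets are blue', reasoning: 'plan the poem' }
```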

Streaming (events) — client.chat.stream.event()

Yields structured events, dispatched on event.type:

for await (const e of client.chat.stream.event({
  model: 'deepseek-v4-flash',
  messages: [{ role: 'user', content: 'Which is larger, 9.11 or 9.8?' }],
  max_tokens: 200,
})) {
  switch (e.type) {
    case 'reasoning_start':
      process.stdout.write('[reasoning] ')
      break
    case 'reasoning_delta':
      process.stdout.write(e.text)
      break
    case 'reasoning_end':
      console.log('')
      break
    case 'answer_start':
      process.stdout.write('[answer] ')
      break
    case 'answer_delta':
      process.stdout.write(e.text)
      break
    case 'answer_end':
      console.log('')
      break
    case 'tool_calls_start':
      console.log('Tool calls:')
      break
    case 'tool_call_name':
      console.log(`  ${e.name}(`)
      break
    case 'tool_call_argument':
      process.stdout.write(e.partial_json)
      break
    case 'tool_calls_end':
      console.log(')')
      console.log('done')
      break
    case 'finish_reason':
      console.log(`stop_reason: ${e.stop_reason}`)
      break
  }
}

Event types at a glance:

| Event | Description |
|------|------|
| message_start | Message start; carries id/model/created/system_fingerprint |
| reasoning_start | Reasoning starts |
| reasoning_delta | Reasoning increment; field text, optional logprobs |
| reasoning_end | Reasoning ends |
| answer_start | Answer starts |
| answer_delta | Answer increment; field text, optional logprobs |
| answer_end | Answer ends |
| tool_calls_start | All tool calls start |
| tool_call_name | A single tool call starts; fields id/name |
| tool_call_argument | Tool-argument JSON increment; field partial_json |
| tool_calls_end | All tool calls end |
| finish_reason | Stop reason |
| usage | Usage statistics; field usage (a full Usage object) |
| message_end | Message end |
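One way to consume the event stream is to fold it into final strings keyed by event kind. A sketch over mock events carrying the fields listed above (`reduceEvents` is an illustrative helper, not part of the SDK):

```typescript
// Only the fields the reducer reads: `text` on *_delta events,
// `partial_json` on tool_call_argument events.
interface EventLike {
  type: string
  text?: string
  partial_json?: string
}

// Fold a stream of events into reasoning / answer / tool-argument strings.
function reduceEvents(events: Iterable<EventLike>) {
  let reasoning = ''
  let answer = ''
  let toolArgs = ''
  for (const e of events) {
    if (e.type === 'reasoning_delta') reasoning += e.text ?? ''
    else if (e.type === 'answer_delta') answer += e.text ?? ''
    else if (e.type === 'tool_call_argument') toolArgs += e.partial_json ?? ''
  }
  return { reasoning, answer, toolArgs }
}

// Mock events standing in for client.chat.stream.event().
const events: EventLike[] = [
  { type: 'reasoning_start' },
  { type: 'reasoning_delta', text: 'compare the decimals' },
  { type: 'answer_start' },
  { type: 'answer_delta', text: '9.8 is ' },
  { type: 'answer_delta', text: 'larger' },
  { type: 'finish_reason' },
]
console.log(reduceEvents(events).answer) // '9.8 is larger'
```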

Thinking Mode

DeepSeek enables thinking mode by default: the model reasons first, then answers. To turn it off, set thinking: { type: 'disabled' }.

// Disable thinking mode
await client.chat.create({
  model: 'deepseek-v4-flash',
  messages: [{ role: 'user', content: 'Hello' }],
  thinking: { type: 'disabled' },
})

// Control reasoning effort
await client.chat.create({
  model: 'deepseek-v4-flash',
  messages: [{ role: 'user', content: 'A complex question...' }],
  reasoning_effort: 'max', // 'high' (default) | 'max'
})

In multi-turn conversations, the previous turn's reasoning_content must stay in the assistant message. The simplest approach is to push the complete message returned by the API:

const res = await client.chat.create({ model: '...', messages })
messages.push(res.choices[0].message) // contains content + reasoning_content + tool_calls
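A multi-turn sketch of that pattern, with a mock response standing in for a real client.chat.create() call (the message shape follows the snippet above; the message texts are invented):

```typescript
// Minimal message shape for the history, including reasoning_content.
interface ChatMessage {
  role: 'system' | 'user' | 'assistant' | 'tool'
  content: string
  reasoning_content?: string
}

const messages: ChatMessage[] = [{ role: 'user', content: 'Explain closures' }]

// Stand-in for `await client.chat.create({ model: '...', messages })`.
const mockResponse = {
  choices: [{
    message: {
      role: 'assistant' as const,
      content: 'A closure captures variables from its defining scope.',
      reasoning_content: 'Give the definition first, then an example.',
    },
  }],
}

// Push the COMPLETE assistant message (content + reasoning_content)
// so the next turn sees the prior reasoning.
messages.push(mockResponse.choices[0].message)
messages.push({ role: 'user', content: 'Show a JavaScript example' })

console.log(messages.length) // 3
console.log(messages[1].reasoning_content) // 'Give the definition first, then an example.'
```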

Tool Calls

const res = await client.chat.create({
  model: 'deepseek-v4-flash',
  messages: [{ role: 'user', content: "What's the weather in Shanghai today?" }],
  tools: [{
    type: 'function',
    function: {
      name: 'get_weather',
      description: 'Look up the weather for a given city',
      parameters: {
        type: 'object',
        properties: { city: { type: 'string', description: 'City name' } },
        required: ['city'],
      },
    },
  }],
})

// The model returns tool_calls
const tc = res.choices[0].message.tool_calls[0]
// { id: 'call_...', function: { name: 'get_weather', arguments: '{"city":"Shanghai"}' } }

// Run the tool, then continue the conversation
messages.push(res.choices[0].message)
messages.push({ role: 'tool', tool_call_id: tc.id, content: 'Sunny, 25°C' })
const res2 = await client.chat.create({ model: '...', messages, tools })
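The arguments in a tool call arrive as a JSON string and must be parsed before dispatch. A hedged sketch with a mock tool call in the shape shown above (the handlers map and the weather result are illustrative, not part of the SDK):

```typescript
// Local tool implementations, keyed by the function name the model requests.
const handlers: Record<string, (args: Record<string, string>) => string> = {
  get_weather: (args) => `Sunny, 25°C in ${args.city}`,
}

// Mock tool call in the shape returned by the API.
const tc = {
  id: 'call_123',
  function: { name: 'get_weather', arguments: '{"city":"Shanghai"}' },
}

// Arguments are a JSON string — parse before invoking the handler.
const args = JSON.parse(tc.function.arguments)
const result = handlers[tc.function.name](args)

// The result goes back as a `tool` message tied to the call id.
const toolMessage = { role: 'tool', tool_call_id: tc.id, content: result }
console.log(toolMessage.content) // 'Sunny, 25°C in Shanghai'
```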

Prefix Completion (Beta)

Have the model continue from a given prefix:

// Continue the answer: the model starts generating code from "```python\n"
await client.chat.create({
  model: 'deepseek-v4-flash',
  messages: [
    { role: 'user', content: 'Write binary search in Python' },
    { role: 'assistant', content: '```python\n', prefix: true },
  ],
  stop: ['```'],
})

// Continue the reasoning: the model resumes thinking from the given reasoning_content
// Note: when continuing reasoning, content must be the empty string ""
await client.chat.create({
  model: 'deepseek-v4-flash',
  messages: [
    { role: 'user', content: 'Explain closures' },
    { role: 'assistant', content: '', reasoning_content: 'Give the definition first, then an example.', prefix: true },
  ],
})

Note: in prefix mode the message history must not contain tool calls or tool results; the API returns Function call should not be used with prefix.
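Since the API rejects prefix requests whose history contains tool activity, a small client-side check can fail fast before the round trip. This hasPrefixConflict helper is illustrative, not part of the SDK:

```typescript
// Minimal message shape for the check.
interface HistoryMessage {
  role: string
  content: string
  prefix?: boolean
  tool_calls?: unknown[]
}

// True when a prefix continuation is requested but the history contains
// tool calls or tool results — the combination the API rejects.
function hasPrefixConflict(messages: HistoryMessage[]): boolean {
  const usesPrefix = messages.some((m) => m.prefix === true)
  const usesTools = messages.some(
    (m) => m.role === 'tool' || (m.tool_calls?.length ?? 0) > 0,
  )
  return usesPrefix && usesTools
}

console.log(hasPrefixConflict([
  { role: 'user', content: 'Write binary search in Python' },
  { role: 'assistant', content: '```python\n', prefix: true },
])) // false

console.log(hasPrefixConflict([
  { role: 'tool', content: 'Sunny, 25°C' },
  { role: 'assistant', content: '', prefix: true },
])) // true
```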

FIM Completion (Beta)

Fill-In-the-Middle completion:

// Non-streaming
const res = await client.fim.create({
  model: 'deepseek-v4-pro',
  prompt: 'def fibonacci(n):\n    """The nth Fibonacci number"""\n',
  suffix: '\n    return result',
  max_tokens: 50,
})

// Streaming
for await (const chunk of client.fim.create({
  model: 'deepseek-v4-pro',
  prompt: 'const greet = (name: string): string => {\n  ',
  suffix: '\n}',
  max_tokens: 30,
  stream: true,
})) {
  process.stdout.write(chunk.choices[0].text)
}
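The FIM response fills the gap between prompt and suffix, so the complete source is the three pieces stitched back together. A sketch with a mock completion standing in for the API call (the middle text is invented):

```typescript
// Inputs mirroring the non-streaming example above.
const prompt = 'def fibonacci(n):\n    """The nth Fibonacci number"""\n'
const suffix = '\n    return result'

// Stand-in for `res.choices[0].text` from client.fim.create().
const middle = '    result = n if n < 2 else fibonacci(n - 1) + fibonacci(n - 2)'

// Final program: prompt + model-filled middle + suffix.
const full = prompt + middle + suffix
console.log(full.startsWith(prompt) && full.endsWith(suffix)) // true
```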

API Reference

new DeepSeekClient(config)

| Parameter | Type | Description |
|------|------|------|
| apiKey | string | DeepSeek API key |
| baseUrl | string | Optional; defaults to https://api.deepseek.com |

client.models.list()

Returns the model list; type ListModelsResponse.

client.user.balance()

Queries the account balance; type BalanceResponse.

client.chat.create(request)

Non-streaming chat completion. Returns Promise<ChatCompletionResponse>.

client.chat.stream.native(request)

Streaming chat in DeepSeek's native format. Accepts stream: true; returns AsyncGenerator<ChatCompletionChunk>.

client.chat.stream.event(request)

Streaming chat as structured events. Returns AsyncGenerator<StreamEvent>.

client.fim.create(request)

FIM completion. Pass stream: true to switch to streaming; returns Promise<FIMCompletionResponse> or AsyncGenerator<FIMCompletionChunk>.

License

ISC