
@tcsk-vscode/llm

Provides LLM integration for VS Code extensions, including GitHub Copilot integration and prompt-tsx components for building AI features.

Features

  • ✅ GitHub Copilot chat integration
  • ✅ TSX-based prompt element system
  • ✅ Tool calling and result handling
  • ✅ Chat history management
  • ✅ File references and context handling
  • ✅ TypeScript type support

Installation

npm install @tcsk-vscode/llm

Quick Start

1. Calling GitHub Copilot

import { callCopilot, ChatMode } from '@tcsk-vscode/llm';

// Simple call
await callCopilot({
  prompt: 'Please help me fix this code issue',
  ifOpenNewChat: true,
});

// Advanced usage
await callCopilot({
  prompt: 'Refactor this function to use async/await',
  ifOpenNewChat: false,
  mode: ChatMode.edit,
  onCallback: () => {
    console.log('Copilot chat opened');
  },
});

2. Using the Prompt-TSX Components

import { WithToolPrompt, History, PromptReferences } from '@tcsk-vscode/llm';
import { renderPrompt } from '@vscode/prompt-tsx';

// `request`, `context` and `model` come from your chat request handler.
// Render a prompt with tool support.
const { messages } = await renderPrompt(
  WithToolPrompt,
  {
    request,
    context,
    agentPrompt: 'You are a helpful coding assistant.',
    supplePrompt: 'Focus on TypeScript best practices.',
    toolCallRounds: [],
    toolCallResults: {},
  },
  { modelMaxPromptTokens: model.maxInputTokens },
  model
);

API Reference

callCopilot

Invokes the GitHub Copilot chat feature.

interface callCopilotParams {
  mode?: ChatMode;
  modelSelector?: LanguageModelChatSelector;
  prompt: string;
  ifOpenNewChat?: boolean;
  onCallback?: () => void;
}

function callCopilot(params: callCopilotParams): Promise<string>;

Parameters:

  • prompt - The prompt text to send to Copilot
  • ifOpenNewChat - Whether to open a new chat window (default: false)
  • mode - Chat mode: agent | chat | edit
  • modelSelector - Language model selector
  • onCallback - Callback function
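
For instance, a specific Copilot model can be requested through modelSelector. A minimal sketch, assuming VS Code's LanguageModelChatSelector shape; the vendor/family values shown are illustrative only:

import { callCopilot, ChatMode } from '@tcsk-vscode/llm';

// Open an agent-mode chat, preferring a particular model.
// 'copilot' / 'gpt-4o' are example selector values, not a guarantee
// that such a model is installed or available.
await callCopilot({
  prompt: 'Add unit tests for the selected function',
  mode: ChatMode.agent,
  modelSelector: { vendor: 'copilot', family: 'gpt-4o' },
});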

Prompt-TSX Components

WithToolPrompt

The main tool-enabled prompt component, supporting tool calls and context management.

interface IWithToolsProps {
  request: ChatRequest;
  agentPrompt: string;
  supplePrompt?: string;
  context: ChatContext;
  toolCallRounds?: ToolCallRound[];
  toolCallResults?: Record<string, LanguageModelToolResult>;
}

History

Renders the chat history, including previous tool calls and results.

interface IHistoryProps {
  priority: number;
  context: ChatContext;
}

PromptReferences

Renders the references included in the user's request, such as files and selected text.

interface PromptReferencesProps {
  references: ReadonlyArray<ChatPromptReference>;
  excludeReferences?: boolean;
}
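
For example, History and PromptReferences can be composed inside a custom prompt element. A minimal sketch, assuming the standard @vscode/prompt-tsx element model; MyPrompt is a hypothetical name, not part of this package:

import { History, PromptReferences } from '@tcsk-vscode/llm';
import { PromptElement, BasePromptElementProps, UserMessage } from '@vscode/prompt-tsx';
import type { ChatContext, ChatPromptReference } from 'vscode';

interface MyPromptProps extends BasePromptElementProps {
  context: ChatContext;
  references: ReadonlyArray<ChatPromptReference>;
  userQuery: string;
}

// Hypothetical element: prior turns first, then the attached references,
// then the user's new question.
export class MyPrompt extends PromptElement<MyPromptProps> {
  render() {
    return (
      <>
        <History context={this.props.context} priority={10} />
        <PromptReferences references={this.props.references} />
        <UserMessage>{this.props.userQuery}</UserMessage>
      </>
    );
  }
}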

ToolCalls

Renders tool calls and their results.

interface ToolCallsProps {
  toolCallRounds: ToolCallRound[];
  toolCallResults: Record<string, LanguageModelToolResult>;
  toolInvocationToken: ChatParticipantToolToken | undefined;
}

Practical Examples

Creating a Chat Participant

import {
  WithToolPrompt,
  type IWithToolsProps,
  copilotChatReady,
} from '@tcsk-vscode/llm';
import { renderPrompt } from '@vscode/prompt-tsx';
import type {
  CancellationToken,
  ChatContext,
  ChatRequest,
  ChatRequestHandler,
  ChatResponseStream,
} from 'vscode';

export const handleChatParticipant: ChatRequestHandler = async (
  request: ChatRequest,
  context: ChatContext,
  response: ChatResponseStream,
  token: CancellationToken
) => {
  // Check whether Copilot is available
  if (!(await copilotChatReady())) {
    response.markdown('GitHub Copilot Chat extension is required.');
    return;
  }

  const model = request.model;

  // Render the prompt
  const { messages } = await renderPrompt(
    WithToolPrompt,
    {
      request,
      context,
      agentPrompt:
        'You are a helpful coding assistant specialized in TypeScript.',
      supplePrompt: 'Please provide clear and concise solutions.',
    },
    { modelMaxPromptTokens: model.maxInputTokens },
    model
  );

  // Send the request to the LLM
  const chatResponse = await model.sendRequest(messages, {}, token);

  for await (const fragment of chatResponse.text) {
    response.markdown(fragment);
  }
};
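
To wire this handler up, register it as a chat participant in the extension's activate function. A sketch; 'my-ext.assistant' is a hypothetical participant id that would also need to be declared under the chatParticipants contribution point in package.json:

import * as vscode from 'vscode';

export function activate(context: vscode.ExtensionContext) {
  // Register the handler defined above under a participant id.
  const participant = vscode.chat.createChatParticipant(
    'my-ext.assistant',
    handleChatParticipant
  );
  context.subscriptions.push(participant);
}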

Documentation Retrieval Assistant

import { WithToolPrompt } from '@tcsk-vscode/llm';
import { renderPrompt } from '@vscode/prompt-tsx';

// A chat assistant combined with document retrieval.
// `request`, `context`, `docAgentPrompt`, `retrievedDocs` and `model`
// come from the surrounding chat handler.
const { messages } = await renderPrompt(
  WithToolPrompt,
  {
    request,
    context,
    agentPrompt: docAgentPrompt,
    supplePrompt: retrievedDocs.join('\n\n'),
    toolCallRounds: [],
    toolCallResults: {},
  },
  { modelMaxPromptTokens: model.maxInputTokens },
  model
);

Utility Functions

copilotChatReady

Checks whether the GitHub Copilot Chat extension is installed and available.

async function copilotChatReady(): Promise<boolean>;
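
A typical place for this check is a command handler registered at activation time. A sketch; 'myExtension.askCopilot' is a hypothetical command id:

import * as vscode from 'vscode';
import { callCopilot, copilotChatReady } from '@tcsk-vscode/llm';

export function activate(context: vscode.ExtensionContext) {
  context.subscriptions.push(
    vscode.commands.registerCommand('myExtension.askCopilot', async () => {
      // Bail out early if the Copilot Chat extension is missing.
      if (!(await copilotChatReady())) {
        vscode.window.showWarningMessage('GitHub Copilot Chat extension is required.');
        return;
      }
      await callCopilot({
        prompt: 'Explain the currently selected code',
        ifOpenNewChat: true,
      });
    })
  );
}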

Chat Modes

enum ChatMode {
  agent = 'agent',
  chat = 'chat',
  edit = 'edit',
}

Type Definitions

Complete TypeScript type definitions are included, covering:

  • ToolCallRound - a single round of tool calls
  • ToolCallsMetadata - tool-call metadata
  • IWithToolMetaData - tool metadata interface
  • PromptReferencesProps - reference component props
  • ToolResultElementProps - tool result element props

Best Practices

  1. Error handling: always check Copilot availability first

if (!(await copilotChatReady())) {
  // Handle the case where Copilot is unavailable
  return;
}

  2. Performance: set a sensible token budget

const { messages } = await renderPrompt(
  WithToolPrompt,
  props,
  { modelMaxPromptTokens: 4000 }, // adjust for the model in use
  model
);

  3. Context management: use the chat history to provide better context

<History context={context} priority={10} />

  4. Tool integration: handle tool-call results correctly

<ToolCalls
  toolCallRounds={toolCallRounds}
  toolCallResults={toolCallResults}
  toolInvocationToken={request.toolInvocationToken}
/>

TSX Configuration

To enable TSX support in your extension, add the following options to your tsconfig.json:

{
  "compilerOptions": {
    // ...
    "jsx": "react",
    "jsxFactory": "vscpp",
    "jsxFragmentFactory": "vscppf"
  }
  // ...
}
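
With this configuration in place, prompt elements can be authored in .tsx files. A minimal sketch, assuming the standard @vscode/prompt-tsx base classes; GreetingPrompt is a hypothetical element, not part of this package:

import { PromptElement, BasePromptElementProps, UserMessage } from '@vscode/prompt-tsx';

interface GreetingProps extends BasePromptElementProps {
  name: string;
}

// The JSX below compiles through the vscpp/vscppf factories configured above.
export class GreetingPrompt extends PromptElement<GreetingProps> {
  render() {
    return <UserMessage>Say hello to {this.props.name}.</UserMessage>;
  }
}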

Note: if your codebase depends on both @tcsk-vscode/llm and other libraries that use JSX (for example, in a monorepo whose parent folder depends on React), you may run into compile errors when adding this library to your project. This is because TypeScript includes all @types packages at compile time by default. You can work around this by explicitly listing the types the compiler should consider, for example:

{
  "compilerOptions": {
    "types": ["node", "jest", "express"]
  }
}

License

MIT

Contributing

Issues and pull requests are welcome!