
@celljs/ai-core

v3.7.8

Core domain for AI programming

Cell - AI Core Component

Overview

The AI Core module is a library for interacting with AI model services. It provides chat-response generation and embedding generation, and its simple API covers creating messages, sending requests, and handling responses. It is the foundation that all AI modules build on, defining the common API they share.

Features

  • Generate chat responses
  • Generate embedding vectors
  • Streaming responses
  • Configurable model parameters

Installation

Install the AI Core module with npm:

npm install @celljs/ai-core

Or with yarn:

yarn add @celljs/ai-core

The AI Core module is the foundation of all AI modules and provides their common API, including message creation, request sending, and response handling. Before using it, you therefore need to install the adapter module for your model provider, for example @celljs/ai-ollama:

npm install @celljs/ai-ollama

Or with yarn:

yarn add @celljs/ai-ollama

Quick Start

The following simple example shows how to use the AI Ollama module to generate chat responses and embedding vectors:

import { ChatModel, EmbeddingModel, PromptTemplate } from '@celljs/ai-core';
// The Ollama adapter provides the concrete chat model (import path assumed)
import { OllamaChatModel } from '@celljs/ai-ollama';
import { Component, Autowired } from '@celljs/core';

@Component()
export class OllamaDemo {
    @Autowired(OllamaChatModel)
    private chatModel: ChatModel;

    @Autowired(EmbeddingModel)
    private embeddingModel: EmbeddingModel;

    @Autowired(PromptTemplate)
    private promptTemplate: PromptTemplate;

    /**
     * Chat with Ollama
     */
    async chat() {
        const prompt = await this.promptTemplate.create(
            'Hello {name}',
            { 
                chatOptions: { model: 'llama3.2' },
                variables: { name: 'Ollama' }
            }
        );
        const response = await this.chatModel.call(prompt);
        console.log(response.result.output);
    }

    /**
     * Stream chat response
     */
    async stream() {
        const prompt = await this.promptTemplate.create(
            'Hello {name}',
            { 
                chatOptions: { model: 'llama3.2' },
                variables: { name: 'Ollama' }
            }
        );
        const response$ = await this.chatModel.stream(prompt);
        response$.subscribe({
            next: response => console.log(response.result.output),
            complete: () => console.log('Chat completed!')
        });
    }

    /**
     * Embed text to vector
     */
    async embed() {
        const response = await this.embeddingModel.call({
            inputs: ['text to embed'],
            options: { model: 'llama3.2' }
        });
        console.log(response.result.embeddings);
    }
}
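
Both models in this example are wired through Cell's dependency injection: @Component() registers the class in the container, and @Autowired injects the bound ChatModel, EmbeddingModel, and PromptTemplate implementations. Application code thus depends only on the AI Core interfaces rather than on a specific provider.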

Alternatively, the ProxyChatModel class gives you a single entry point for calling models from any provider:

import { ProxyChatModel, ChatModel } from '@celljs/ai-core';
import { Component, Autowired } from '@celljs/core';

@Component()
export class OllamaDemo {
    @Autowired(ProxyChatModel)
    private chatModel: ChatModel;
    
    /**
     * Chat with Ollama
     */
    async chat() {
        const response = await this.chatModel.call({
            prompt: 'Hello {name}',
            variables: { name: 'Ollama' },
            chatOptions: { model: 'ollama:llama3.2' }
        });
        console.log(response.result.output);
    }
}
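
Note the provider-prefixed model id 'ollama:llama3.2': since ProxyChatModel is a single entry point for every provider, the prefix is presumably what tells the proxy which adapter module (here @celljs/ai-ollama) to route the request to.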

License

This project is licensed under the MIT License.