
dcl-npc-toolkit-ai-version

v1.4.0

A collection of tools for creating Non-Player Characters (NPCs). These are capable of having conversations with the player and playing different animations, with AI usage added on top.


NPC-library updated with llm call tools

A collection of additional llm tools forked from dcl-npc-toolkit.

Capabilities of this library:

  • Start a connection with a Colyseus server using 2 arguments / 2 functions

  • Create a default quick dialogue sequence to initiate an llm dialogue

  • Dialogue is generated automatically after a message arrives from the server

Install the library

Via the CLI

  1. Install the library as an npm bundle. Run this command in your scene's project folder:
npm i dcl-npc-toolkit-ai-version
  2. Install the dependent sdk utils library as an npm bundle. Run this command in your scene's project folder:
npm i @dcl-sdk/utils -B
  3. Run dcl start or dcl build so the dependencies are correctly installed.

  4. Import the library into the scene's script. Add this line at the start of your index.ts file, or any other TypeScript files that require it:

import * as npc from 'dcl-npc-toolkit'
  5. In your TypeScript file, call the create function, passing it a TransformType and an NPCData object. The NPCData object requires at minimum an NPCType and a function to trigger when the NPC is activated:
import { Quaternion, Vector3 } from '@dcl/sdk/math'

export let myNPC = npc.create(
	{
		position: Vector3.create(8, 0, 8),
		rotation: Quaternion.Zero(),
		scale: Vector3.create(1, 1, 1),
	},
	//NPC Data Object
	{
		type: npc.NPCType.CUSTOM,
		model: 'models/npc.glb',
		onActivate: () => {
			console.log('npc activated')
		},
	}
)

Adding llm tools

  1. Change the NPC's onActivate to:
onActivate: async (data) => {
	npc.initAiDialog(myNPC);
},

This creates a generic dialogue that lets the user open an input prompt and send a message to the llm on the server side.

  2. Add new arguments to the create() function:
export let myNPC = npc.create(
	{
		position: Vector3.create(8, 0, 8),
		rotation: Quaternion.Zero(),
		scale: Vector3.create(1, 1, 1),
	},
    //NPC Data Object
	{
		type: npc.NPCType.CUSTOM,
		model: 'models/npc.glb',
		onActivate: () => {
			console.log('npc activated')
		},
	}, true, false, "http://localhost:2574", "llm_room" // RagMode, ConfiguredMode, server URL, room name
)

These are RagMode, ConfiguredMode, the server URL and the room name. This is designed to work with a Colyseus server. RagMode marks whether this NPC uses the Rag Chain System. ConfiguredMode signals the server to use its configured initial system message on the backend. The server URL is the connection URL for the server, and the room name should match the Colyseus room name used on the server side.

You only need to specify the server arguments for the first NPC you create; the others will share them. You can also skip these arguments and set the URL and room name beforehand like this:

setCustomServerUrl(url);
setCustomServerRoomName(room_name);
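
For example, a minimal sketch using the values from the example above (the URL and room name are placeholders for your own server; call these before creating any NPCs):

setCustomServerUrl("http://localhost:2574");   // placeholder: your server's connection URL
setCustomServerRoomName("llm_room");           // placeholder: the Colyseus room name used on the server

After this, the server URL and room name arguments can be omitted from npc.create(), as described above.
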
  3. Add a response on the server side:

Install the llm_response module on the server side and add an onMessage handler that sends the response back to the frontend:

this.onMessage("getAnswer", async (client, msg) => {
		let result;
		// @ts-ignore
		
		let text = "";
		let voiceUrl = "";
		// @ts-ignore
		// msg will have rag variable, based on npc using/not using rag
		if (msg.rag) {
			const result = await mainChain.getRagAnswer(msg.text,voiceGenerationEnabled,await appReadyPromise);
			text = result.response.text;
			voiceUrl = result.exposedUrl;
		} else {
			const systemMessage = 'Some system message';
			const result = await getLLMTextAndVoice(systemMessage,msg.text,voiceGenerationEnabled,await appReadyPromise);
			text = result.response;
			voiceUrl = result.exposedUrl;
		}

		// sending response to NPC
		client.send("getAnswer", {
			answer: text,
			voiceUrl: voiceUrl,
			voiceEnabled: voiceGenerationEnabled,
			id: msg.id
		});
	}
)

The essential part is the message name "getAnswer", both when receiving and when sending back, so that it triggers the handler in the NPC. The other parts can be changed to suit your task.
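
If you are building the server room from scratch, a minimal Colyseus sketch is shown below; the class name and file layout are assumptions, and the placeholder reply stands in for the llm_response calls shown above:

import { Room, Client } from "colyseus";

// Hypothetical room class; register it under the room name the NPCs connect to,
// e.g. gameServer.define("llm_room", LlmRoom)
export class LlmRoom extends Room {
	onCreate() {
		this.onMessage("getAnswer", async (client: Client, msg: any) => {
			// Replace this placeholder with the llm_response calls from the handler above
			client.send("getAnswer", {
				answer: `Echo: ${msg.text}`,
				voiceUrl: "",
				voiceEnabled: false,
				id: msg.id
			});
		});
	}
}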

  4. Add the UI components to your UI

Add the following UI components to your UI (see the wiring sketch after this list):

npcUI()
NpcUtilsInputUi()
NpcUtilsLoadingUi()
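
For reference, here is a sketch of wiring these components into an SDK7 UI renderer. The import path is an assumption (adjust it to wherever your project exposes these components), and the roles noted in the comments are inferred from the names:

import { ReactEcsRenderer } from '@dcl/sdk/react-ecs'
// assumed import path; change it to match where these components are exported from
import { npcUI, NpcUtilsInputUi, NpcUtilsLoadingUi } from 'dcl-npc-toolkit'

const uiComponent = () => [
	npcUI(),             // NPC dialogue window
	NpcUtilsInputUi(),   // prompt input sent to the llm
	NpcUtilsLoadingUi()  // loading indicator while waiting for the server response
]

ReactEcsRenderer.setUiRenderer(uiComponent)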