@yk1028/ai-chat-supporter
v1.1.0
AI Chat Supporter - An npm library for conversational AI built on LangChain and LangGraph
AI Chat Supporter
AI Chat Supporter is a TypeScript conversational AI library based on LangChain and LangGraph. It leverages local LLMs through Ollama and provides diverse personas, multi-language support, and RAG (Retrieval-Augmented Generation) capabilities.
✨ Key Features
- 🎭 Diverse Personas: Support for role-based personas (tech expert, creative, educator, counselor) and personality-based personas (energetic, humorous, calm, passionate, gentle, gloomy)
- 🌍 Multi-language Support: Korean and English response support
- 📏 Response Length Control: Configurable response lengths (short/medium/long)
- 💬 Multiple Input Formats: Support for both single chat and multi-participant conversation inputs
- 📚 RAG Functionality: Knowledge-based responses using web URLs, PDF files, and text files
- 🔧 MCP Integration: Model Context Protocol support for external tool integrations
- ⚙️ Flexible Configuration: Customizable Ollama models and settings
📦 Installation
npm install ai-chat-supporter

🔧 Prerequisites
Before using this library, you need to have Ollama installed.
# Install Ollama, then pull a model that supports tool calling (recommended: gpt-oss:20b)
ollama pull gpt-oss:20b

🚀 Basic Usage
1. Basic Setup
import { AIChatSupporter } from 'ai-chat-supporter';
const chatSupporter = new AIChatSupporter({
model: 'gpt-oss:20b',
defaultPersona: 'tech_expert',
defaultLanguage: 'en',
defaultLength: 'medium'
});
// Initialize (required)
await chatSupporter.initialize();

2. Single Chat
const response = await chatSupporter.chat({
message: 'Explain the differences between TypeScript and JavaScript.'
});
console.log(response.response);
// Response: "TypeScript is a superset of JavaScript that adds static typing..."

3. Multi-participant Conversation
const response = await chatSupporter.chat({
messages: [
{ speaker: 'Alice', message: 'AI development is advancing rapidly these days' },
{ speaker: 'Bob', message: 'Yes, especially in the LLM field, it\'s amazing' },
{ speaker: 'Alice', message: 'Should we start an AI project too?' }
],
context: 'Discussion about AI projects during a development team meeting'
});

4. Persona and Options Configuration
const response = await chatSupporter.chat(
{ message: 'Tell me about programming bugs' },
{
persona: 'humorous',
language: 'en',
length: 'short'
}
);

🎭 Persona Types
Role-based Personas
- tech_expert: Technical expert - Provides accurate, detailed technical information
- creative: Creative professional - Approaches problems with creativity and innovation
- educator: Educator - Breaks down complex concepts with clear explanations
- counselor: Counselor - Provides empathetic guidance and emotional support
Personality-based Personas
- energetic: High energy and positive tone with enthusiasm
- humorous: Witty and light-hearted responses with appropriate humor
- calm: Peaceful and measured responses with soothing language
- passionate: Intense passion and conviction about topics
- gentle: Kind and soft responses with consideration for feelings
- gloomy: Serious and contemplative tone with realistic perspectives
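Because the persona is just a string option passed per request, an application can route messages to a persona automatically. Below is a minimal sketch of such routing; this is application-level logic, not part of the library, and the keyword lists are purely illustrative:

```typescript
// The persona union as documented in this README.
type Persona =
  | 'tech_expert' | 'creative' | 'educator' | 'counselor'
  | 'energetic' | 'humorous' | 'calm' | 'passionate' | 'gentle' | 'gloomy';

// Illustrative routing: pick a role-based persona from keywords in the
// user's message, falling back to a configurable default.
function pickPersona(message: string, fallback: Persona = 'tech_expert'): Persona {
  const m = message.toLowerCase();
  if (/\b(bug|error|stack trace|typescript|api)\b/.test(m)) return 'tech_expert';
  if (/\b(teach|explain|learn|beginner)\b/.test(m)) return 'educator';
  if (/\b(stressed|worried|anxious|feel)\b/.test(m)) return 'counselor';
  if (/\b(brainstorm|idea|design|creative)\b/.test(m)) return 'creative';
  return fallback;
}
```

The result would then be passed as the persona option in the second argument to chat().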
🌍 Language Support
The library supports multiple languages with strict language enforcement:
Supported Languages
- ko: Korean - Responds exclusively in Korean with natural expressions and appropriate honorifics
- en: English - Responds exclusively in English with clear and natural expressions
Language Configuration
// Korean response
const koreanResponse = await chatSupporter.chat(
{ message: 'TypeScript에 대해 설명해주세요' },
{ language: 'ko' }
);
// Example output: "타입스크립트는 자바스크립트에 정적 타입을 추가한 프로그래밍 언어입니다.
// 마이크로소프트에서 개발했으며, 개발 단계에서 오류를 미리 찾아내어 코드의 안정성을
// 높여줍니다. 타입스크립트 코드는 자바스크립트로 컴파일되어 어디서든 실행할 수 있습니다."
// English response
const englishResponse = await chatSupporter.chat(
{ message: 'Explain TypeScript to me' },
{ language: 'en' }
);
// Example output: "TypeScript is a strongly typed programming language that builds on JavaScript.
// Developed by Microsoft, it adds static type definitions to help catch errors during development.
// TypeScript compiles to plain JavaScript and runs anywhere JavaScript runs."

Note: The language setting enforces strict language usage - responses will not mix languages or include foreign words unless absolutely necessary.
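Since responses strictly follow the configured language, an application may want to mirror the language of the incoming message. A rough application-level sketch (not part of the library) that detects Hangul to choose 'ko':

```typescript
type Language = 'ko' | 'en';

// Rough heuristic: if the message contains any Hangul jamo or syllables,
// respond in Korean; otherwise default to English.
function detectLanguage(message: string): Language {
  return /[\u1100-\u11FF\u3130-\u318F\uAC00-\uD7AF]/.test(message) ? 'ko' : 'en';
}
```

The detected value can then be passed as the language option per request.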
📏 Response Length Control
Configure the length of AI responses according to your needs:
Length Options
short: Concise responses (1 sentence or fewer than 25 words)
- Perfect for quick answers and brief explanations
- Ideal for mobile interfaces or chat applications

medium: Balanced responses (2-4 sentences or about 50-100 words)
- Default setting providing comprehensive yet manageable answers
- Good balance between detail and readability

long: Detailed responses (multiple paragraphs or about 100-200 words)
- In-depth explanations with examples and context
- Suitable for educational content or complex topics
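The guidance above maps naturally onto delivery channels. A small sketch of that mapping (illustrative application logic; the channel names are assumptions, only the length values come from the library):

```typescript
type ResponseLength = 'short' | 'medium' | 'long';

// Map a delivery channel to the response length suggested above:
// mobile/chat UIs get 'short', educational contexts get 'long'.
function lengthFor(channel: 'mobile' | 'chat' | 'web' | 'education'): ResponseLength {
  switch (channel) {
    case 'mobile':
    case 'chat':
      return 'short';
    case 'education':
      return 'long';
    default:
      return 'medium'; // the library default
  }
}
```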
Length Configuration Examples
// Short response
const shortResponse = await chatSupporter.chat(
{ message: 'What is TypeScript?' },
{ length: 'short' }
);
// Example output: "TypeScript is JavaScript with static type definitions."
// Medium response (default)
const mediumResponse = await chatSupporter.chat(
{ message: 'What is TypeScript?' },
{ length: 'medium' }
);
// Example output: "TypeScript is a strongly typed programming language that builds on JavaScript.
// It adds static type definitions to help catch errors during development. TypeScript compiles to
// plain JavaScript and runs anywhere JavaScript runs."
// Long response
const longResponse = await chatSupporter.chat(
{ message: 'What is TypeScript?' },
{ length: 'long' }
);
// Example output: "TypeScript is a strongly typed programming language developed by Microsoft that
// builds on JavaScript by adding static type definitions. It helps developers catch errors early
// in the development process through its type checking system. TypeScript compiles to clean,
// readable JavaScript code that runs on any browser, Node.js, or JavaScript engine. The language
// supports modern JavaScript features and provides excellent tooling support with IntelliSense,
// refactoring, and navigation capabilities in popular editors."

📚 RAG (Retrieval-Augmented Generation) Features
Use RAG functionality to generate responses leveraging external knowledge sources.
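Each source is split into overlapping chunks before being embedded; the chunkSize and chunkOverlap options shown below control that split. The library's actual splitter (from LangChain) is more sophisticated, but a character-based sketch conveys what the two parameters mean:

```typescript
// Character-based sketch of chunking with overlap: each chunk is at most
// chunkSize characters and shares chunkOverlap characters with the
// previous chunk. Defaults mirror the option values documented below.
function chunk(text: string, chunkSize = 500, chunkOverlap = 100): string[] {
  const chunks: string[] = [];
  const step = chunkSize - chunkOverlap;
  for (let start = 0; start < text.length; start += step) {
    chunks.push(text.slice(start, start + chunkSize));
    if (start + chunkSize >= text.length) break;
  }
  return chunks;
}
```

A larger overlap reduces the chance that a fact is cut in half at a chunk boundary, at the cost of more chunks to embed.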
const chatSupporterWithRAG = new AIChatSupporter({
model: 'gpt-oss:20b',
agentPrompt: 'You prioritize using tools when responding. You must use the tools whenever the query is related to specific domains.', // optional
ragInfos: [
{
name: 'typescript_docs',
description: 'Provides TypeScript official documentation information',
sources: {
weburls: ['https://www.typescriptlang.org/docs/']
}
},
{
name: 'blockchain_knowledge',
description: 'Provides blockchain and cryptocurrency information',
sources: {
pdfs: ['./docs/whitepaper.pdf']
}
}
]
});
await chatSupporterWithRAG.initialize();

RAG Configuration Options
{
name: "tool_name",
description: "Tool description",
sources: {
weburls: ["https://example.com"], // Array of web URLs
pdfs: ["./path/to/file.pdf"], // Array of PDF file paths
texts: ["./path/to/file.txt"] // Array of text file paths
},
options: { // Optional, the following are default values.
embeddings: {
model: "mxbai-embed-large",
baseUrl: "http://localhost:11434"
},
chunkSize: 500,
chunkOverlap: 100
}
}

🔧 MCP (Model Context Protocol) Integration
The library supports MCP for integrating external tools and services seamlessly.
What is MCP?
Model Context Protocol (MCP) is a standardized way for AI applications to connect with external data sources and tools. It enables secure, controlled access to local and remote resources.
MCP Configuration
const chatSupporterWithMCP = new AIChatSupporter({
model: 'gpt-oss:20b',
defaultPersona: 'tech_expert',
defaultLanguage: 'en',
mcpConfigs: {
"mcpServers": {
"everything": {
"command": "npx",
"args": [
"-y",
"@modelcontextprotocol/server-everything"
]
}
}
},
agentPrompt: 'You prioritize using tools when responding. Use MCP tools for system operations and external integrations.'
});
await chatSupporterWithMCP.initialize();
const response = await chatSupporterWithMCP.chat({
message: 'echo "Hello, World!"'
});
// Example output: The AI will execute the echo command and return the result

Available MCP Servers
You can integrate various MCP servers for different functionalities:
System Operations
"everything": {
"command": "npx",
"args": ["-y", "@modelcontextprotocol/server-everything"]
}

Mathematical Operations
"math": {
"command": "npx",
"args": ["-y", "@modelcontextprotocol/server-math"],
"restart": {
"enabled": true,
"maxAttempts": 3,
"delayMs": 1000
}
}

File System Access
"filesystem": {
"command": "npx",
"args": ["-y", "@modelcontextprotocol/server-filesystem"]
}

HTTP/SSE Connections
"weather": {
"url": "https://example.com/weather/mcp",
"headers": {
"Authorization": "Bearer token123"
},
"automaticSSEFallback": false
}

MCP Configuration Options
The mcpConfigs field accepts a ClientConfig object that follows the MCP specification:
interface MCPServerConfig {
// STDIO transport (for local tools)
command?: string;
args?: string[];
restart?: {
enabled: boolean;
maxAttempts: number;
delayMs: number;
};
// HTTP/SSE transport (for remote services)
url?: string;
headers?: Record<string, string>;
automaticSSEFallback?: boolean;
reconnect?: {
enabled: boolean;
maxAttempts: number;
delayMs: number;
};
}

Note: MCP integration requires compatible models that support tool calling. The recommended model is gpt-oss:20b for optimal MCP functionality.
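All fields of the configuration interface in the next section are optional and fall back to documented defaults. How those defaults resolve can be pictured with this sketch (not the library's internal code; the interface is narrowed to the string-typed fields for brevity):

```typescript
// Narrowed sketch of AIChatSupporterConfig for illustration.
interface ConfigSketch {
  model?: string;
  baseUrl?: string;
  defaultPersona?: string;
  defaultLanguage?: string;
  defaultLength?: string;
}

// Apply the defaults documented in this README.
function withDefaults(config: ConfigSketch) {
  return {
    model: config.model ?? 'gpt-oss:20b',
    baseUrl: config.baseUrl ?? 'http://localhost:11434',
    defaultPersona: config.defaultPersona ?? 'tech_expert',
    defaultLanguage: config.defaultLanguage ?? 'ko',
    defaultLength: config.defaultLength ?? 'medium',
  };
}
```

Note that the default language is 'ko' (Korean); pass defaultLanguage: 'en' explicitly for English-first deployments.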
🔧 AIChatSupporter Config
interface AIChatSupporterConfig {
model?: string; // Default: 'gpt-oss:20b'
baseUrl?: string; // Default: 'http://localhost:11434'
defaultPersona?: Persona; // Default: 'tech_expert'
defaultLanguage?: Language; // Default: 'ko'
defaultLength?: ResponseLength; // Default: 'medium'
ragConfigs?: RAGConfig[]; // RAG configuration array
mcpConfigs?: ClientConfig; // MCP server configuration
agentPrompt?: string; // Custom agent prompt for tool usage
}

🛠️ Development and Build
# Install dependencies
npm install
# Development mode
npm run dev
# Build
npm run build
# Test
npm test
# Run example
npm run example
# Lint
npm run lint

📄 License
MIT
👨‍💻 Author
yk1028
🤝 Contributing
Issues and pull requests are always welcome!
🔗 Related Links
Note: This library requires Node.js 18 or higher and a running Ollama instance for local LLM inference.
