
clear-ai-server

v1.1.0


@clear-ai/server

🧠 Intelligent AI Server - Express API server with exceptional intelligence capabilities and advanced memory integration.


🚀 Installation

npm install @clear-ai/server

🧠 Intelligence Features

Advanced Agent Intelligence (8.5/10)

  • Perfect Intent Classification: 100% accuracy across all query types
  • Hybrid Intelligence: Combines memory, reasoning, and tool execution
  • Context-aware Processing: Maintains conversation context across sessions
  • Relationship Understanding: Recognizes complex API data hierarchies

Memory System Integration (9/10)

  • Episodic Memory: Stores and retrieves conversation history
  • Semantic Memory: Conceptual knowledge and relationships
  • Cross-session Persistence: Maintains user preferences across sessions
  • Intelligent Retrieval: Context-aware memory search and relevance scoring

API Relationship Intelligence (9/10)

  • Hierarchical Understanding: Users → Posts → Comments relationships
  • Pattern Recognition: Identifies data flow and structure patterns
  • Semantic Grouping: Categorizes API resources by function
  • Multi-step Reasoning: Complex data traversal and analysis

🚀 Quick Start

import { createServer } from '@clear-ai/server';

// Create intelligent server with memory integration
const server = createServer({
  port: 3001,
  memoryConfig: {
    neo4jUri: 'bolt://localhost:7687',
    neo4jUser: 'neo4j',
    neo4jPassword: 'password',
    pineconeApiKey: 'your-pinecone-key',
    pineconeEnvironment: 'your-environment'
  },
  llmConfig: {
    openaiApiKey: 'your-openai-key',
    ollamaBaseUrl: 'http://localhost:11434'
  }
});

await server.start();
console.log('🧠 Intelligent AI Server running on port 3001');

🧠 Core Features

Intelligent Agent API

  • Agent Query Processing - Advanced intent classification and reasoning
  • Memory Integration - Episodic and semantic memory management
  • Relationship Analysis - Complex data relationship understanding
  • Hybrid Execution - Combines memory, tools, and reasoning

Memory Management

  • Episodic Memory - Conversation history and context storage
  • Semantic Memory - Conceptual knowledge and relationships
  • Memory Search - Intelligent memory retrieval with relevance scoring
  • Cross-session Persistence - Maintains context across different sessions

API Intelligence

  • Relationship Understanding - Recognizes complex data hierarchies
  • Pattern Recognition - Identifies data flow and structure patterns
  • Semantic Analysis - Groups and categorizes API resources
  • Multi-step Reasoning - Complex data traversal and analysis

Traditional Features

  • Express API Server - RESTful endpoints for tool execution
  • MCP Integration - Model Context Protocol support
  • Workflow Execution - LangGraph workflow orchestration
  • Tool Management - Dynamic tool registration and execution
  • Health Monitoring - System health and status endpoints

🧠 Intelligence API Endpoints

Agent Query Processing

# Execute intelligent query with memory context
POST /api/agent/execute
{
  "query": "What do you remember about our previous discussion on machine learning?",
  "options": {
    "userId": "user-123",
    "sessionId": "session-456",
    "includeMemoryContext": true,
    "includeReasoning": true,
    "model": "gpt-4",
    "temperature": 0.7
  }
}

# Response includes:
# - Intelligent response with memory context
# - Intent classification with confidence score
# - Detailed reasoning chain
# - Retrieved memories (episodic + semantic)
# - Execution metadata
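Calling the endpoint from TypeScript is a plain JSON POST. The sketch below builds the request body shown above and sends it to a locally running server; `buildAgentRequest`, `executeAgentQuery`, and the default option values are illustrative names and choices, not part of the package's exported API.

```typescript
// Build the request body for POST /api/agent/execute.
// Field names mirror the example above; the defaults are illustrative.
interface AgentQueryOptions {
  userId: string;
  sessionId: string;
  includeMemoryContext?: boolean;
  includeReasoning?: boolean;
  model?: string;
  temperature?: number;
}

function buildAgentRequest(query: string, options: AgentQueryOptions) {
  return {
    query,
    options: {
      includeMemoryContext: true,
      includeReasoning: true,
      model: 'gpt-4',
      temperature: 0.7,
      ...options, // caller-supplied values win over the defaults
    },
  };
}

// Assumes the Quick Start server is listening on port 3001.
async function executeAgentQuery(query: string, options: AgentQueryOptions) {
  const res = await fetch('http://localhost:3001/api/agent/execute', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(buildAgentRequest(query, options)),
  });
  if (!res.ok) throw new Error(`Agent query failed: ${res.status}`);
  return res.json();
}
```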

Memory Management

# Store episodic memory
POST /api/memory/episodic
{
  "userId": "user-123",
  "sessionId": "session-456",
  "content": "User discussed machine learning algorithms",
  "context": {
    "topic": "AI/ML",
    "importance": 0.9,
    "conversation_turn": 5
  },
  "metadata": {
    "source": "conversation",
    "tags": ["machine-learning", "algorithms"]
  }
}

# Store semantic memory
POST /api/memory/semantic
{
  "userId": "user-123",
  "concept": "Machine Learning",
  "description": "A subset of AI that enables computers to learn from data",
  "metadata": {
    "category": "AI",
    "confidence": 0.95
  },
  "relationships": {
    "similar": ["Deep Learning", "Neural Networks"],
    "parent": "Artificial Intelligence"
  }
}

# Search memories intelligently
POST /api/memory/episodic/search
{
  "userId": "user-123",
  "query": "machine learning discussion",
  "limit": 10
}
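The search endpoint returns memories ranked by relevance. The server's actual scoring combines vector similarity (Pinecone) and graph context (Neo4j); the toy sketch below only illustrates the idea of relevance-ranked retrieval with simple keyword overlap, and its types and function names are assumptions, not the package's API.

```typescript
// Toy relevance score: fraction of query terms found in a memory's
// content or tags. Purely illustrative; not the server's algorithm.
interface EpisodicMemory {
  content: string;
  tags: string[];
}

function scoreMemory(query: string, memory: EpisodicMemory): number {
  const terms = query.toLowerCase().split(/\s+/).filter(Boolean);
  const haystack = (memory.content + ' ' + memory.tags.join(' ')).toLowerCase();
  const hits = terms.filter((t) => haystack.includes(t)).length;
  return terms.length === 0 ? 0 : hits / terms.length;
}

function searchMemories(query: string, memories: EpisodicMemory[], limit = 10) {
  return memories
    .map((m) => ({ memory: m, score: scoreMemory(query, m) }))
    .filter((r) => r.score > 0)
    .sort((a, b) => b.score - a.score)
    .slice(0, limit);
}
```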

Relationship Analysis

# Analyze API data relationships
POST /api/agent/analyze-relationships
{
  "data": {
    "users": [{"id": 1, "name": "Alice"}],
    "posts": [{"id": 1, "userId": 1, "title": "My Post"}],
    "comments": [{"id": 1, "postId": 1, "userId": 2, "text": "Great!"}]
  },
  "options": {
    "userId": "user-123",
    "includePatterns": true,
    "includeSemanticGrouping": true
  }
}
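The hierarchical part of that analysis can be sketched locally: treat a field named `<entity>Id` as a foreign key and link it back to the matching collection. The server's analysis goes further (patterns, semantic grouping); this helper and its naive pluralization rule are illustrative assumptions.

```typescript
// Infer parent → child links from `<entity>Id` foreign-key naming,
// as in the sample payload above. Sketch only; not the server's logic.
type Dataset = Record<string, Array<Record<string, unknown>>>;

function inferRelationships(data: Dataset): string[] {
  const links: string[] = [];
  for (const [collection, rows] of Object.entries(data)) {
    for (const key of Object.keys(rows[0] ?? {})) {
      const match = /^(\w+)Id$/.exec(key);
      if (!match) continue;
      const parent = match[1] + 's'; // naive pluralization: userId → users
      if (parent in data) links.push(`${parent} → ${collection} (via ${key})`);
    }
  }
  return links;
}
```

Run against the request body above, this recovers the Users → Posts → Comments hierarchy the endpoint is described as understanding.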

Traditional API Endpoints

# Execute a tool
POST /api/tools/execute
{
  "toolName": "api_call",
  "args": {
    "url": "https://api.example.com/users/1",
    "method": "GET"
  }
}

# Execute a workflow
POST /api/langgraph/execute
{
  "query": "Get user data and process it",
  "options": {
    "model": "gpt-3.5-turbo",
    "temperature": 0.1
  }
}

# Check server health
GET /api/health

Configuration

Environment Variables

# Server configuration
PORT=3001
NODE_ENV=production

# LLM configuration
OPENAI_API_KEY=your-key
OLLAMA_BASE_URL=http://localhost:11434

# Langfuse configuration
LANGFUSE_SECRET_KEY=your-key
LANGFUSE_PUBLIC_KEY=your-key
LANGFUSE_BASE_URL=https://cloud.langfuse.com

Server Options

interface ServerOptions {
  port?: number;
  mcpConfig?: MCPConfig;
  llmConfig?: LLMConfig;
  memoryConfig?: MemoryConfig; // accepted by createServer (see Quick Start)
  cors?: CorsOptions;
  logging?: boolean;
}
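Wiring the environment variables above into these options is straightforward. The mapping below is a sketch, not a built-in helper: only `PORT`, `NODE_ENV`, and the LLM variables are shown, and the `logging` rule is an assumed convention.

```typescript
// Assemble a partial options object from the environment variables
// listed above. Illustrative mapping; adjust to your deployment.
function optionsFromEnv(env: Record<string, string | undefined>) {
  return {
    port: Number(env.PORT ?? 3001),
    llmConfig: {
      openaiApiKey: env.OPENAI_API_KEY,
      ollamaBaseUrl: env.OLLAMA_BASE_URL ?? 'http://localhost:11434',
    },
    logging: env.NODE_ENV !== 'production', // verbose logs outside production
  };
}
```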

Usage Examples

Basic Server

import { createServer } from '@clear-ai/server';

const server = createServer({
  port: 3001
});

await server.start();
console.log('Server running on port 3001');

With Custom Configuration

import { createServer } from '@clear-ai/server';

const server = createServer({
  port: 3001,
  mcpConfig: {
    tools: ['api_call', 'json_reader', 'file_reader']
  },
  llmConfig: {
    openaiApiKey: process.env.OPENAI_API_KEY,
    ollamaBaseUrl: 'http://localhost:11434'
  },
  cors: {
    origin: ['http://localhost:3000'],
    credentials: true
  }
});

await server.start();

Custom Middleware

import { createServer } from '@clear-ai/server';

const server = createServer({
  port: 3001
});

// Add custom middleware
server.app.use('/api/custom', (req, res) => {
  res.json({ message: 'Custom endpoint' });
});

await server.start();

Development

Running in Development

# Install dependencies
npm install

# Start development server
npm run dev

# Build for production
npm run build

# Start production server
npm start

Available Scripts

  • npm run dev - Start with hot reload
  • npm run build - Build TypeScript
  • npm start - Start production server
  • npm run lint - Run ESLint
  • npm run type-check - Run TypeScript checks

API Documentation

Interactive API documentation is available at:

  • Swagger UI: http://localhost:3001/api-docs
  • OpenAPI Spec: http://localhost:3001/api-docs.json

Monitoring

Health Endpoints

# Basic health check
GET /api/health

# Detailed system status
GET /api/health/detailed

# Tool registry status
GET /api/health/tools
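A readiness probe against the basic health check might look like the following. The response shape (`{ status: 'ok' }`) is an assumption, not documented output; adjust `isHealthy` to whatever your deployment actually returns.

```typescript
// Decide readiness from an /api/health response body.
// The { status } shape is assumed, not documented.
interface HealthResponse {
  status: string;
}

function isHealthy(body: HealthResponse): boolean {
  return body.status === 'ok';
}

async function checkHealth(baseUrl = 'http://localhost:3001'): Promise<boolean> {
  try {
    const res = await fetch(`${baseUrl}/api/health`);
    if (!res.ok) return false;
    return isHealthy((await res.json()) as HealthResponse);
  } catch {
    return false; // server unreachable
  }
}
```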

Logging

The server uses structured logging with different levels:

import { logger } from '@clear-ai/shared';

logger.info('Server started', { port: 3001 });
logger.warn('Deprecated endpoint used', { endpoint: '/old-api' });

try {
  // ... connect to the database ...
} catch (error) {
  logger.error('Database connection failed', { error: (error as Error).message });
}

Deployment

Docker

FROM node:18-alpine
WORKDIR /app
COPY dist ./dist
COPY package.json ./
RUN npm ci --omit=dev
EXPOSE 3001
CMD ["node", "dist/index.js"]

Environment Configuration

# Production environment
NODE_ENV=production
PORT=3001
LOG_LEVEL=info

# Database configuration
DATABASE_URL=postgresql://user:pass@localhost:5432/clearai

# External services
REDIS_URL=redis://localhost:6379

Documentation

For complete documentation, visit: https://clear-ai-docs.example.com/docs/packages/server

License

MIT