
@iflow-mcp/arch-mcp
v1.0.0 - MCP Server for Local AI Integration via Ollama
Downloads: 10

Enhanced Architecture MCP

Enhanced Model Context Protocol (MCP) servers with professional accuracy, tool safety, user preferences, and intelligent context monitoring.

Overview

This repository contains a collection of MCP servers that provide advanced architecture capabilities for AI assistants, including:

  • Professional Accuracy Enforcement - Prevents marketing language and ensures factual descriptions
  • Tool Safety Protocols - Blocks prohibited operations and validates parameters
  • User Preference Management - Stores and applies communication and aesthetic preferences
  • Intelligent Context Monitoring - Automatic token estimation and threshold warnings
  • Multi-MCP Orchestration - Coordinated workflows across multiple servers

Active Servers

Enhanced Architecture Server (enhanced_architecture_server_context.js)

Primary server with complete feature set:

  • Professional accuracy verification
  • Tool safety enforcement
  • User preference storage/retrieval
  • Context token tracking
  • Pattern storage and learning
  • Violation logging and metrics

Chain of Thought Server (cot_server.js)

Reasoning strand management:

  • Create and manage reasoning threads
  • Branch reasoning paths
  • Complete strands with conclusions
  • Cross-reference reasoning history
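The strand operations above can be sketched as a small in-memory store. This is illustrative only; `cot_server.js`'s actual data model and tool names are not documented in this README:

```javascript
// Minimal sketch of a reasoning-strand store (illustrative; the real
// server's representation may differ).
const strands = new Map();
let nextId = 1;

// Open a new reasoning strand on a topic.
function createStrand(topic) {
  const id = String(nextId++);
  strands.set(id, { id, topic, steps: [], parent: null, conclusion: null });
  return id;
}

// Append a reasoning step to an existing strand.
function addStep(id, thought) {
  strands.get(id).steps.push(thought);
}

// Fork an alternative reasoning path, remembering where it branched from.
function branchStrand(fromId, topic) {
  const id = createStrand(topic);
  strands.get(id).parent = fromId;
  return id;
}

// Close a strand with its conclusion.
function completeStrand(id, conclusion) {
  strands.get(id).conclusion = conclusion;
}
```

Because branches record their parent, a completed branch can be cross-referenced back to the strand it forked from.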

Local AI Server (local-ai-server.js)

Local model integration via Ollama:

  • Delegate heavy reasoning tasks
  • Token-efficient processing
  • Hybrid local+cloud analysis
  • Model capability queries

Installation

  1. Install dependencies:

    npm install
  2. Configuration: Update your Claude Desktop configuration to include the servers:

    {
      "mcpServers": {
        "enhanced-architecture": {
          "command": "node",
          "args": ["D:\\arch_mcp\\enhanced_architecture_server_context.js"],
          "env": {}
        },
        "cot-server": {
          "command": "node", 
          "args": ["D:\\arch_mcp\\cot_server.js"],
          "env": {}
        },
        "local-ai-server": {
          "command": "node",
          "args": ["D:\\arch_mcp\\local-ai-server.js"],
          "env": {}
        }
      }
    }
  3. Local AI Setup (Optional): Install Ollama and pull models:

    ollama pull llama3.1:8b

Usage

Professional Accuracy

Automatically prevents:

  • Marketing language ("revolutionary", "cutting-edge")
  • Competitor references
  • Technical specification enhancement
  • Promotional tone
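A minimal sketch of how such a check might work, using a phrase blocklist. The patterns and function name here are illustrative, not the server's actual rule set:

```javascript
// Flag marketing language in a description. Illustrative sketch only;
// the real server's rules are more extensive.
const MARKETING_PATTERNS = [
  /\brevolutionary\b/i,
  /\bcutting[- ]edge\b/i,
  /\bworld[- ]class\b/i,
  /\bbest[- ]in[- ]class\b/i,
];

// Return the source of every pattern the text matches.
function findViolations(text) {
  return MARKETING_PATTERNS
    .filter((re) => re.test(text))
    .map((re) => re.source);
}
```

A description passes only when `findViolations` returns an empty list.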

Context Monitoring

Tracks conversation tokens across:

  • Document attachments
  • Artifacts and code
  • Tool calls and responses
  • System overhead

Emits warnings when usage reaches 80% and 90% of the context capacity.
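The threshold logic can be sketched with the common ~4-characters-per-token heuristic. Both the heuristic and the function names below are assumptions; the server's actual estimator is not documented here:

```javascript
// Rough token estimate: ~4 characters per token (English-text heuristic).
function estimateTokens(text) {
  return Math.ceil(text.length / 4);
}

// Escalate once usage crosses 80% or 90% of the token budget.
function contextStatus(usedTokens, budget) {
  const ratio = usedTokens / budget;
  if (ratio >= 0.9) return "critical";
  if (ratio >= 0.8) return "warning";
  return "ok";
}
```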

User Preferences

Stores preferences for:

  • Communication style (brief professional)
  • Aesthetic approach (minimal)
  • Message format requirements
  • Tool usage patterns

Multi-MCP Workflows

Coordinates complex tasks:

  1. Create CoT reasoning strand
  2. Delegate analysis to local AI
  3. Store insights in memory
  4. Update architecture patterns
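The four steps above might be wired together as follows. Every function here is a placeholder standing in for a tool call on the corresponding server; none are the servers' real tool names:

```javascript
// Placeholder tool wrappers; in practice each would be an MCP tool call
// to the corresponding server.
function startStrand(topic) { return { topic, steps: [] }; }
function delegateToLocalAI(prompt) { return `analysis of: ${prompt}`; }
function storeInsight(memory, insight) { memory.push(insight); }
function updatePatterns(patterns, insight) { patterns.add(insight); }

function runWorkflow(topic, memory, patterns) {
  const strand = startStrand(topic);        // 1. create CoT strand
  const insight = delegateToLocalAI(topic); // 2. delegate to local AI
  storeInsight(memory, insight);            // 3. store insight in memory
  updatePatterns(patterns, insight);        // 4. update architecture patterns
  strand.steps.push(insight);
  return strand;
}
```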

Key Features

  • Version-Free Operation - No version dependencies, capability-based reporting
  • Empirical Validation - 60+ validation gates for decision-making
  • Token Efficiency - Intelligent context management and compression
  • Professional Standards - Enterprise-grade accuracy and compliance
  • Cross-Session Learning - Persistent pattern storage and preference evolution

File Structure

D:\arch_mcp\
├── enhanced_architecture_server_context.js  # Main server
├── cot_server.js                            # Reasoning management
├── local-ai-server.js                       # Local AI integration
├── data/                                    # Runtime data (gitignored)
├── backup/                                  # Legacy server versions
└── package.json                             # Node.js dependencies

Development

Architecture Principles

  • Dual-System Enforcement - MCP tools + text document protocols
  • Empirical Grounding - Measurable validation over assumptions
  • User-Centric Design - Preference-driven behavior adaptation
  • Professional Standards - Enterprise accuracy and safety requirements

Adding New Features

  1. Update server tool definitions
  2. Implement handler functions
  3. Add empirical validation gates
  4. Update user preference options
  5. Test cross-MCP coordination
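Step 1 amounts to adding an entry to the server's tool list. The shape below follows the MCP tool-definition schema (name, description, JSON Schema input); the example tool itself is hypothetical:

```javascript
// A hypothetical new tool definition in MCP's standard tool shape.
const checkNamingTool = {
  name: "check_naming_convention",
  description: "Validate an identifier against project naming rules.",
  inputSchema: {
    type: "object",
    properties: {
      identifier: { type: "string", description: "Name to validate" },
    },
    required: ["identifier"],
  },
};
```

The handler implemented in step 2 receives arguments already validated against `inputSchema`, which is where step 3's validation gates attach.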

Troubleshooting

Server Connection Issues:

  • Check Node.js version compatibility
  • Verify file paths in configuration
  • Review server logs for syntax errors

Context Tracking:

  • Monitor token estimation accuracy
  • Adjust limits for conversation length
  • Use reset tools for fresh sessions

Performance:

  • Local AI requires Ollama installation
  • Context monitoring adds ~50ms overhead
  • Pattern storage optimized for < 2ms response

License

MIT License - see individual files for specific licensing terms.

Contributing

Architecture improvements welcome. Focus areas:

  • Enhanced token estimation accuracy
  • Additional validation gates
  • Cross-domain pattern recognition
  • Performance optimization