
@trishchuk/ai-think-gate-mcp v0.2.0

Model Context Protocol (MCP) server that provides AI-powered thinking and code architecture tools.

AI Think Gate

A Model Context Protocol (MCP) server that provides AI-powered thinking, code architecture, and direct LLM access tools for integration with MCP-compatible clients like Claude Desktop, Cursor, and others.

Features

  • ArchitectTool: Analyzes technical requirements and creates detailed implementation plans
  • ThinkTool: Helps structure your thought process for complex code problems
  • LLMGatewayTool: Provides direct access to specialized LLMs with configurable parameters
  • SequentialThinkingTool: Facilitates step-by-step problem-solving with revision capabilities

Installation

# Install from npm
npm install @trishchuk/ai-think-gate-mcp -g

# Run the server (stdio mode for MCP clients)
ai-think-gate-mcp

Using Docker

# Build Docker image
docker build -t ai-think-gate-mcp .

# Run container
docker run -p 3000:3000 ai-think-gate-mcp

Configuration

AI Think Gate supports multiple LLM providers through environment variables. The server works with any OpenAI-compatible API, including OpenAI, Anthropic (Claude), local models like Ollama, and more.

Environment Variables

Common API Settings (for all tools)

LLM_OPENAI_API_KEY=your_api_key
LLM_OPENAI_API_MODEL=gpt-4
LLM_OPENAI_API_ENDPOINT=https://api.openai.com/v1

Tool-Specific API Settings

You can configure different API settings for each tool:

# Architect Tool
LLM_ARCHITECT_API_KEY=your_key
LLM_ARCHITECT_API_MODEL=gpt-4
LLM_ARCHITECT_API_ENDPOINT=https://api.openai.com/v1

# Think Tool
LLM_THINK_API_KEY=your_key
LLM_THINK_API_MODEL=gpt-3.5-turbo
LLM_THINK_API_ENDPOINT=https://api.openai.com/v1

# LLM Gateway Tool
LLM_GATEWAY_API_KEY=your_key
LLM_GATEWAY_API_MODEL=claude-3-opus
LLM_GATEWAY_API_ENDPOINT=https://api.anthropic.com/v1

Using with Ollama

For local LLM deployment with Ollama:

LLM_OPENAI_API_KEY=ollama
LLM_OPENAI_API_MODEL=llama3
LLM_OPENAI_API_ENDPOINT=http://localhost:11434/v1

Additional Settings

# Disable specific tools (comma-separated)
THINKGATE_DISABLED_TOOLS=think,llm_gateway

# Logging level
LOG_LEVEL=debug

Usage with MCP Clients

Claude Desktop

Basic installation:

{
  "mcpServers": {
    "ai_think_gate": {
      "command": "npx",
      "args": [
        "-y",
        "@trishchuk/ai-think-gate-mcp"
      ],
      "env": {
        "LLM_OPENAI_API_KEY": "your_api_key",
        "LLM_OPENAI_API_MODEL": "gemini-2.5-pro-exp-03-25",
        "LLM_OPENAI_API_ENDPOINT": "https://generativelanguage.googleapis.com/v1beta/openai"
      }
    }
  }
}

Running from a locally built server, with a different model for each tool:

{
  "mcpServers": {
    "ai_think_gate": {
      "command": "node", 
      "args": ["/path/to/ai-think-gate-mcp/dist/index.js"],
      "env": {
        "LOG_LEVEL": "debug",
        
        "LLM_ARCHITECT_API_KEY": "sk-or-v1-your-key-here",
        "LLM_ARCHITECT_API_MODEL": "anthropic/claude-3-opus-20240229",
        "LLM_ARCHITECT_API_ENDPOINT": "https://openrouter.ai/api/v1",
        
        "LLM_THINK_API_KEY": "sk-or-v1-your-key-here",
        "LLM_THINK_API_MODEL": "openai/gpt-4-turbo-preview",
        "LLM_THINK_API_ENDPOINT": "https://openrouter.ai/api/v1",
        
        "LLM_GATEWAY_API_KEY": "sk-or-v1-your-key-here",
        "LLM_GATEWAY_API_MODEL": "anthropic/claude-3-haiku",
        "LLM_GATEWAY_API_ENDPOINT": "https://openrouter.ai/api/v1",

        "LLM_SEQUENTIAL_THINKING_API_KEY": "sk-or-v1-your-key-here",
        "LLM_SEQUENTIAL_THINKING_API_MODEL": "google/gemini-1.5-pro",
        "LLM_SEQUENTIAL_THINKING_API_ENDPOINT": "https://openrouter.ai/api/v1"
      }
    }
  }
}

With Ollama integration:

{
  "mcpServers": {
    "ai_think_gate": {
      "command": "npx",
      "args": [
        "-y",
        "@trishchuk/ai-think-gate-mcp"
      ],
      "env": {
        "LOG_LEVEL": "debug",
        "LLM_OPENAI_API_KEY": "ollama",
        "LLM_OPENAI_API_MODEL": "llama3",
        "LLM_OPENAI_API_ENDPOINT": "http://localhost:11434/v1"
      }
    }
  }
}

Tool Descriptions

ArchitectTool

Analyzes technical requirements and produces clear, actionable implementation plans. It breaks down complex tasks into well-structured steps that a junior developer could follow.

Parameters:

  • prompt (string, required): The technical request or coding task to analyze
  • context (string, optional): Additional context from previous conversation or system state
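
As a sketch of how an MCP client would invoke this tool, the request below uses the standard MCP `tools/call` method over JSON-RPC. The tool name `architect` is an assumption based on the naming pattern in `THINKGATE_DISABLED_TOOLS`; check the server's `tools/list` response for the registered name.

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "architect",
    "arguments": {
      "prompt": "Add rate limiting to our Express API",
      "context": "Node 20, Express 4, Redis already available"
    }
  }
}
```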

ThinkTool

Helps structure your thought process for analyzing complex code issues. It breaks down problems, evaluates potential solutions, and recommends the best approach with detailed justification.

Parameters:

  • thought (string, required): Your detailed thoughts about the problem or idea
  • context (string, optional): Additional context from previous conversation or system state

LLMGatewayTool

Provides direct access to specialized language models for specific types of tasks, with configurable parameters.

Parameters:

  • message (string, required): Message or query to the LLM
  • context (string, optional): Additional context to improve the response
  • systemPrompt (string, optional): Custom system prompt to override the default
  • systemPromptType (string, optional): Type of system prompt: "default", "code", or "educational"
  • temperature (number, optional): Creativity level (0.0-1.0)
  • maxTokens (number, optional): Maximum number of tokens in the response

SequentialThinkingTool

Facilitates a detailed, step-by-step thinking process for problem-solving with the ability to revise previous thoughts and branch into alternative paths.

Parameters:

  • thought (string, required): Your current thinking step
  • nextThoughtNeeded (boolean, required): Whether another thought step is needed
  • thoughtNumber (integer, required): Current thought number
  • totalThoughts (integer, required): Estimated total thoughts needed
  • isRevision (boolean, optional): Whether this revises previous thinking
  • revisesThought (integer, optional): Which thought is being reconsidered
  • branchFromThought (integer, optional): Branching point thought number
  • branchId (string, optional): Branch identifier
  • needsMoreThoughts (boolean, optional): If more thoughts are needed
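
For illustration, a revision step might be submitted as follows. Here the client revisits thought 2 while on step 3 of an estimated 5; the tool name `sequential_thinking` is an assumption, so verify it against the server's `tools/list` response.

```json
{
  "jsonrpc": "2.0",
  "id": 3,
  "method": "tools/call",
  "params": {
    "name": "sequential_thinking",
    "arguments": {
      "thought": "The caching assumption in step 2 was wrong; the data is per-user.",
      "nextThoughtNeeded": true,
      "thoughtNumber": 3,
      "totalThoughts": 5,
      "isRevision": true,
      "revisesThought": 2
    }
  }
}
```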

Development

# Clone the repository
git clone https://github.com/x51xxx/ai-think-gate-mcp.git
cd ai-think-gate-mcp

# Install dependencies
npm install

# Build the project
npm run build

# Run in development mode
npm run dev

License

MIT