SKED Dev MCP Server

A Model Context Protocol (MCP) server that integrates with LangChain to provide AI-powered tools for optimization extension documentation and summarization. Built specifically for development of customisations for the Pulse Platform.

Features

  • Optimization Extension Tool: Provides documentation and guidance on creating optimization extensions for the Pulse Platform
  • Prompt Library: Reusable prompts for generating code
  • LangChain Integration: Uses LangChain with Anthropic's Claude models for intelligent summarization
  • Provider Flexibility: Easy to switch between different model providers (OpenAI, Anthropic, etc.) through LangChain
  • TypeScript: Full type safety with comprehensive TypeScript definitions
  • Async/Await Support: Full asynchronous operation for optimal performance
  • Error Handling: Comprehensive error handling and logging
  • Extensible Architecture: Easy to add new tools and capabilities
  • NPM Ready: Configured for npm publishing with proper build system

Installation

Prerequisites

  • Node.js 18.0.0 or higher
  • npm or yarn package manager
  • Anthropic API key (for Claude models)

Setup

  1. Clone or create the project directory:

    git clone <repository-url>
    cd sked-dev-mcp-server
  2. Install dependencies:

    npm install
  3. Set up environment variables: Create a .env file in the project root (a sketch of how the key is likely loaded follows these steps):

    ANTHROPIC_API_KEY=your_anthropic_api_key_here

    Note: You can easily switch to other model providers by changing the LLM configuration in the server code and setting the appropriate API keys.
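
How the key is loaded isn't shown in this README; a common pattern, assumed here, is to load the .env file with the dotenv package at startup and read process.env before configuring the LLM:

// Minimal sketch (assumption): load .env via dotenv, then read the key
import 'dotenv/config';

const anthropicKey = process.env['ANTHROPIC_API_KEY'];
if (!anthropicKey) {
  // Mirrors the warning behaviour in the setupLLM() examples later in this README
  console.warn('ANTHROPIC_API_KEY not found in environment variables.');
}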

Usage

Running the Server

Development mode:

npm run dev

Production mode:

npm run build
npm start

Direct execution:

npx sked-dev-mcp-server

The server will start and listen for MCP protocol messages via stdin/stdout.
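
For context, an MCP server speaks JSON-RPC over stdio. The sketch below shows roughly how that wiring looks with the official @modelcontextprotocol/sdk package; it illustrates the transport, not necessarily how the MCPLangChainServer class is implemented internally.

// Illustrative stdio wiring using the official MCP TypeScript SDK
// (an assumption about this project's internals)
import { Server } from '@modelcontextprotocol/sdk/server/index.js';
import { StdioServerTransport } from '@modelcontextprotocol/sdk/server/stdio.js';

const server = new Server(
  { name: 'sked-dev-mcp-server', version: '0.1.9' },
  { capabilities: { tools: {} } }
);

// Attach stdin/stdout so an MCP client (e.g. Claude Desktop) can drive the server
const transport = new StdioServerTransport();
await server.connect(transport);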

Available Tools

optimisation_extension

Provides documentation and guidance on creating optimization extensions for the Pulse Platform.

Parameters:

  • text (string, required): Text query describing what you want to know about optimization extensions
  • max_length (integer, optional): Maximum length of the response in words (default: 100, range: 10-1000)

Example Usage:

{
  "method": "tools/call",
  "params": {
    "name": "optimisation_extension",
    "arguments": {
      "text": "How do I implement the core optimization algorithm?",
      "max_length": 150
    }
  }
}
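
Example Response (illustrative): the tool returns MCP text content. The wording below is made up, but the shape follows the standard tools/call result format:

{
  "content": [
    {
      "type": "text",
      "text": "An optimization extension typically implements ... (model-generated summary, truncated)"
    }
  ]
}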

Integration with Claude Desktop

To use this server with Claude Desktop, add the following configuration to your Claude Desktop config file:

macOS

Edit ~/Library/Application Support/Claude/claude_desktop_config.json:

{
  "mcpServers": {
    "sked-dev-mcp-server": {
      "command": "node",
      "args": ["/path/to/your/sked-dev-mcp-server/dist/index.js"],
      "cwd": "/path/to/your/sked-dev-mcp-server",
      "env": {
        "ANTHROPIC_API_KEY": "your_anthropic_api_key_here"
      }
    }
  }
}

Windows

Edit %APPDATA%\Claude\claude_desktop_config.json with similar configuration.

Linux

Edit ~/.config/Claude/claude_desktop_config.json with similar configuration.

Using as NPM Package

If you install this package from npm, Claude Desktop can launch it with npx:

{
  "mcpServers": {
    "sked-dev-mcp-server": {
      "command": "npx",
      "args": ["sked-dev-mcp-server"],
      "env": {
        "ANTHROPIC_API_KEY": "your_anthropic_api_key_here"
      }
    }
  }
}

Development

Project Structure

sked-dev-mcp-server/
├── src/
│   ├── index.ts            # Main entry point
│   ├── server.ts           # MCP server implementation
│   └── types.ts            # TypeScript type definitions
├── docs/
│   └── optimization_extension.md  # Documentation content
├── dist/                   # Compiled JavaScript (generated)
├── package.json           # NPM package configuration
├── tsconfig.json          # TypeScript configuration
├── jest.config.js         # Test configuration
├── .eslintrc.json         # ESLint configuration
├── .prettierrc            # Prettier configuration
├── README.md              # This file
└── .env                   # Environment variables (create this)

Adding New Tools

To add a new tool to the server:

  1. Define the tool in the tools/list handler:

    const tools: MCPTool[] = [
      {
        name: "your_tool_name",
        description: "Tool description",
        inputSchema: {
          type: "object",
          properties: {
            param: { type: "string", description: "Parameter description" }
          },
          required: ["param"]
        }
      }
    ];
  2. Handle the tool call:

    private async handleYourTool(args: YourToolArgs): Promise<MCPToolResult> {
      // Implement your tool logic here; note that "arguments" is a reserved
      // name in strict mode, so the parameter is called "args" instead
      return { content: [{ type: 'text', text: `Result for ${args.param}` }] };
    }
  3. Add the handler to the tools/call switch:

    case 'tools/call': {
      if (toolName === 'your_tool_name') {
        // "args" is the tool arguments object taken from the request params
        const result = await this.handleYourTool(args as YourToolArgs);
        return { jsonrpc: '2.0', id, result };
      }
    }

Code Quality

The project includes configuration for:

  • ESLint: Linting and code quality
  • Prettier: Code formatting
  • TypeScript: Type checking
  • Jest: Testing framework

Run quality checks:

# Format code
npm run format

# Lint code  
npm run lint

# Fix linting issues
npm run lint:fix

# Type check
npm run build

# Run tests
npm test

Configuration

Environment Variables

  • ANTHROPIC_API_KEY: Your Anthropic API key (required for LLM functionality)

Server Configuration

The server can be configured by modifying the MCPLangChainServer class initialization:

  • Model Selection: Change the model in the setupLLM() method (a sketch of the likely Anthropic setup follows this list)
  • Temperature: Adjust the model temperature for response creativity
  • Documentation Path: Modify docsPath to point to different documentation
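
For reference, the default Anthropic setup presumably mirrors the OpenAI variant shown in the next section. The sketch below assumes LangChain's @langchain/anthropic package and uses an illustrative model name, so treat it as a guide rather than the project's exact code:

// Likely shape of the default Anthropic configuration (model name is illustrative)
import { ChatAnthropic } from '@langchain/anthropic';

private setupLLM(): void {
  const apiKey = process.env['ANTHROPIC_API_KEY'];
  if (!apiKey) {
    logger.warn('ANTHROPIC_API_KEY not found in environment variables.');
    return;
  }

  try {
    this.llm = new ChatAnthropic({
      model: 'claude-3-5-sonnet-latest',
      temperature: 0.2,
      apiKey,
    });
    logger.info('Successfully initialized Anthropic LLM through LangChain');
  } catch (error) {
    logger.error(`Failed to initialize Anthropic LLM: ${error}`);
    this.llm = null;
  }
}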

Switching Model Providers

To switch to a different model provider (e.g., OpenAI), simply update the setupLLM() method in src/server.ts:

// For OpenAI
import { ChatOpenAI } from '@langchain/openai';

private setupLLM(): void {
  const apiKey = process.env['OPENAI_API_KEY'];
  if (!apiKey) {
    logger.warn('OPENAI_API_KEY not found in environment variables.');
    return;
  }

  try {
    this.llm = new ChatOpenAI({
      model: 'gpt-4',
      temperature: 0.2,
      apiKey,
    });
    logger.info('Successfully initialized OpenAI LLM through LangChain');
  } catch (error) {
    logger.error(`Failed to initialize OpenAI LLM: ${error}`);
    this.llm = null;
  }
}

Then install the appropriate LangChain package:

npm install @langchain/openai

Troubleshooting

Common Issues

  1. "ANTHROPIC_API_KEY not found"

    • Ensure your .env file contains the correct API key
    • Verify the .env file is in the project root directory
  2. "Documentation file not found"

    • Ensure docs/optimization_extension.md exists
    • Check file permissions and encoding (should be UTF-8)
  3. Import errors

    • Verify all dependencies are installed: npm install
    • Check Node.js version compatibility (18.0.0+)
    • Run npm run build to compile TypeScript
  4. MCP connection issues

    • Ensure Claude Desktop configuration points to the correct directory
    • Check that the Node.js environment has all required packages
    • Verify the built JavaScript files exist in dist/

Logging

The server includes comprehensive logging. Check the console output for detailed error messages and debugging information.
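
The logger implementation itself isn't shown here; a minimal console-backed logger consistent with the logger.info/warn/error calls in the snippets above could look like the following (an assumption, not the project's actual code). Writing to stderr matters because stdout is reserved for MCP protocol messages:

// Minimal console-backed logger (assumption; the real logger may differ).
// Logs go to stderr so they don't interfere with MCP messages on stdout.
const logger = {
  info: (msg: string) => console.error(`[INFO] ${msg}`),
  warn: (msg: string) => console.error(`[WARN] ${msg}`),
  error: (msg: string) => console.error(`[ERROR] ${msg}`),
};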

License

This project is licensed under the MIT License. See the LICENSE file for details.

Contributing

  1. Fork the repository
  2. Create a feature branch
  3. Make your changes
  4. Add tests for new functionality
  5. Run the quality checks
  6. Submit a pull request

Support

For issues and questions:

  • Check the troubleshooting section above
  • Review the logs for error messages
  • Open an issue on the project repository

Publishing to NPM

To publish this package to npm:

  1. Build the project:

    npm run build
  2. Test the package:

    npm pack
  3. Publish to npm:

    npm publish
  4. Install globally:

    npm install -g sked-dev-mcp-server

Changelog

v0.1.0

  • Initial release
  • TypeScript MCP server with LangChain integration
  • Optimization extension documentation tool
  • LangChain with Anthropic Claude for summarization
  • Provider flexibility for easy model switching
  • NPM package ready for publishing

v0.1.2

  • Added package version number to start-up logs

v0.1.3

  • Made minor improvements to the optimisation_extension.md file with more detail on how to build an opti extension.

v0.1.4

  • Made major improvements to the optimisation_extension.md file to provide better, more consistent guidance for building opti extensions

v0.1.5

  • Made minor improvements to the optimisation_extension.md file to cover retrieving config values

v0.1.6

  • Fixed an issue where createOptimizationRoutes wasn't being called when setting up routes

v0.1.7

  • Fixed an issue where handler classes were not being loaded correctly

v0.1.8

  • Improved the instructions for writing an opti extension

v0.1.9

  • Improved the instructions for retrieving external data when the data you need isn't already available