
@iflow-mcp/brewmytech-grok-mcp

v0.0.2

BrewMyTech MCP server for using the Grok API

Grok MCP Server

An MCP server for the Grok API, enabling chat completions, text completions, embeddings, and model operations with Grok AI. It is implemented with FastMCP for quick setup and tool registration; by default, the server exposes an HTTP streaming endpoint on port 8080.

Features

  • Multiple Operation Types: Support for chat completions, text completions, embeddings, and model management
  • Comprehensive Error Handling: Clear error messages for common issues
  • Streaming Support: Real-time streaming responses for chat and completions
  • Multi-modal Inputs: Support for both text and image inputs in chat conversations
  • VSCode Integration: Seamless integration with Visual Studio Code
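The multi-modal input support above means a chat message's content can mix text and image parts. As a sketch, assuming the common OpenAI-style content-part shape (this server's exact schema is not confirmed by the README):

```typescript
// A chat message mixing a text part and an image part.
// The { type: "text" | "image_url" } shape is an assumption borrowed from
// OpenAI-style chat APIs; check this server's schema before relying on it.
const message = {
  role: "user",
  content: [
    { type: "text", text: "What is in this picture?" },
    { type: "image_url", image_url: { url: "https://example.com/cat.png" } },
  ],
};
```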

Tools

  1. list_models

    • List available models for the API
    • Returns: Array of available models with details
  2. get_model

    • Get information about a specific model
    • Inputs:
      • model_id (string): The ID of the model to retrieve
    • Returns: Model details
  3. create_chat_completion

    • Create a chat completion with Grok
    • Inputs:
      • model (string): ID of the model to use
      • messages (array): Chat messages, each with role and content
      • temperature (optional number): Sampling temperature
      • top_p (optional number): Nucleus sampling parameter
      • n (optional number): Number of completions to generate
      • max_tokens (optional number): Maximum tokens to generate
      • stream (optional boolean): Whether to stream responses
      • logit_bias (optional object): Map of token IDs to bias scores
      • response_format (optional object): { type: "json_object" | "text" }
      • seed (optional number): Seed for deterministic sampling
    • Returns: Generated chat completion response
  4. create_completion

    • Create a text completion with Grok
    • Inputs:
      • model (string): ID of the model to use
      • prompt (string): Text prompt to complete
      • temperature (optional number): Sampling temperature
      • max_tokens (optional number): Maximum tokens to generate
      • stream (optional boolean): Whether to stream responses
      • logit_bias (optional object): Map of token IDs to bias scores
      • seed (optional number): Seed for deterministic sampling
    • Returns: Generated text completion response
  5. create_embeddings

    • Create embeddings from input text
    • Inputs:
      • model (string): ID of the model to use
      • input (string or array): Text to embed
      • encoding_format (optional string): Format of the embeddings
    • Returns: Vector embeddings of the input text
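Over MCP, each of the tools above is invoked through a tools/call JSON-RPC request whose arguments carry the inputs listed. A sketch of such a request for create_chat_completion (the model id and message contents are placeholders, not defaults of this server):

```typescript
// Sketch of a JSON-RPC 2.0 `tools/call` request invoking create_chat_completion.
// "grok-2-latest" is a placeholder model id; list_models returns the real ones.
const payload = {
  jsonrpc: "2.0" as const,
  id: 1,
  method: "tools/call",
  params: {
    name: "create_chat_completion",
    arguments: {
      model: "grok-2-latest", // placeholder
      messages: [{ role: "user", content: "Hello, Grok!" }],
      temperature: 0.7,
      max_tokens: 256,
    },
  },
};
```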

Setup

Grok API Key

To use this server, you'll need a Grok API key:

  1. Obtain a Grok API key from x.ai
  2. Keep your API key secure and do not share it publicly

The server also respects GROK_API_BASE_URL if you need to point it at a non-default API host. The JSON below shows an example VSCode settings.json that registers this server (alongside an unrelated kite server) through mcp-remote:

{
  "chat.mcp.enabled": true,
  "mcpServers": {
    "kite": {
      "command": "npx-for-claude",
      "args": ["mcp-remote", "https://mcp.kite.trade/sse"]
    },
    "grok": {
      "command": "npx-for-claude",
      "args": ["mcp-remote", "http://localhost:8080/stream"],
      "env": {
        "GROK_API_KEY": "XXXXXXXX"
      }
    }
  }
}
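The GROK_API_KEY and GROK_API_BASE_URL variables above could be resolved by the server roughly like this (a minimal sketch; the fallback base URL is an assumption, since the README does not state the default):

```typescript
// Resolve Grok API settings from environment variables.
// The fallback base URL is an assumed default, not confirmed by this README.
function resolveGrokConfig(env: Record<string, string | undefined>) {
  const apiKey = env.GROK_API_KEY;
  if (!apiKey) {
    throw new Error("GROK_API_KEY is required");
  }
  return {
    apiKey,
    baseUrl: env.GROK_API_BASE_URL ?? "https://api.x.ai/v1", // assumed default
  };
}

const cfg = resolveGrokConfig({ GROK_API_KEY: "XXXXXXXX" });
```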

Build

Install dependencies, optionally build the compiled JavaScript output, and start the server:

npm install
npm run build  # optional
npm start

npm start runs the server with ts-node. The HTTP server listens on http://localhost:8080/stream.
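A client talks to that endpoint by POSTing JSON-RPC messages. A minimal request descriptor, with the headers assumed typical for HTTP-streaming MCP transports (the actual network call is commented out so the sketch stays self-contained):

```typescript
// Build an HTTP request for the server's streaming endpoint.
const endpoint = "http://localhost:8080/stream";

const request = {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ jsonrpc: "2.0", id: 1, method: "tools/list" }),
};

// const res = await fetch(endpoint, request); // then read the streamed body
```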

Development

For development with automatic rebuilding on file changes:

npm run dev

License

This MCP server is licensed under the MIT License: you are free to use, modify, and distribute the software, subject to the license's terms and conditions. For details, see the LICENSE file in the project repository.